WO2017169752A1 - 車両の姿勢検出装置、画像処理システム、車両、および車両の姿勢検出方法 - Google Patents
- Publication number: WO2017169752A1 (PCT/JP2017/010266)
- Authority: WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
- B60W40/11—Pitch movement
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
- B60W40/112—Roll movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/97—Determining parameters from multiple pictures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- Embodiments of the present disclosure relate to a vehicle attitude detection device, an image processing system, a vehicle, and a vehicle attitude detection method.
- Patent Document 1 discloses a configuration in which, in accordance with a change in vehicle inclination, a region to be extracted from a surrounding image is determined, or the position of an index to be superimposed on the surrounding image for driving support is changed.
- a vehicle attitude detection device includes an input / output interface and a processor.
- the input / output interface acquires a captured image obtained by capturing the wheels of the vehicle.
- the processor detects the posture of the vehicle based on the position of the wheel in the captured image.
- An image processing system includes an imaging device, a posture detection device, and an image processing device.
- an imaging device generates a captured image obtained by imaging a wheel of the vehicle.
- the posture detection device detects the posture of the vehicle based on the position of the wheel in the captured image.
- the image processing device determines at least one of a position where the image is superimposed on the captured image and an image processing range in the captured image based on the detected posture.
- a vehicle includes a wheel, an imaging device, a posture detection device, and an image processing device.
- the imaging device generates a captured image obtained by imaging the wheel.
- the attitude detection device detects the attitude of the vehicle based on the position of the wheel in the captured image.
- the image processing device determines at least one of a position where the image is superimposed on the captured image and an image processing range in the captured image based on the detected posture.
- a vehicle attitude detection method is a vehicle attitude detection method executed by an attitude detection device, which acquires a captured image obtained by imaging a wheel of a vehicle and detects the attitude of the vehicle based on the position of the wheel in the captured image.
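The claimed detection flow can be sketched as a minimal pipeline. Everything below (the detector callable, the lookup table, the sample values) is a hypothetical stand-in for the components described later in the specification, not the patent's implementation:

```python
def detect_attitude(captured_image, wheel_detector, correspondence):
    """Hypothetical sketch of the claimed method: acquire a captured image,
    locate the wheel in it, and map the wheel position to a vehicle attitude."""
    wheel_position = wheel_detector(captured_image)   # pixel coordinates of the wheel
    return correspondence[wheel_position]             # attitude looked up from that position

# Toy stand-ins, for illustration only.
image = "frame-0"
attitude = detect_attitude(
    image,
    wheel_detector=lambda img: (120, 340),            # pretend the wheel sits at this pixel
    correspondence={(120, 340): {"theta_deg": 1.5}},  # pretend correspondence information
)
print(attitude)  # {'theta_deg': 1.5}
```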
- the image processing system 10 includes one or more imaging devices 12, a posture detection device 13, and an image processing device 14.
- the imaging device 12, the posture detection device 13, and the image processing device 14 are connected to be communicable with each other.
- the vehicle 11 is an automobile including, for example, four wheels, but may be provided with one or more wheels.
- the imaging device 12 is, for example, a vehicle-mounted camera capable of wide-angle shooting.
- the imaging device 12 is installed in the vehicle 11, for example.
- three imaging devices 12 are installed in the vehicle 11.
- the three imaging devices 12 include, for example, a left side camera 12a, a right side camera 12b, and a rear camera 12c.
- hereinafter, the left side camera 12a, the right side camera 12b, and the rear camera 12c are collectively referred to as the imaging devices 12 when they need not be distinguished.
- the left side camera 12a is installed on the left side of the vehicle 11 so as to face vertically downward.
- the left side camera 12a may be installed on a left side mirror, for example.
- the left side camera 12a is installed so as to be able to simultaneously image the external space on the left side of the vehicle 11, the left front wheel of the vehicle 11, the left rear wheel, and a part of the vehicle 11.
- the part of the vehicle 11 may include at least a part of the left side surface of the vehicle body, for example.
- Two left side cameras 12a may be installed so as to be able to image the left front wheel and the left rear wheel of the vehicle 11, respectively.
- the right side camera 12b is installed on the right side of the vehicle 11 so as to face vertically downward.
- the right side camera 12b may be installed on a right side mirror, for example.
- the right side camera 12b is installed so that the external space on the right side of the vehicle 11, the right front wheel of the vehicle 11, the right rear wheel, and a part of the vehicle 11 can be imaged simultaneously.
- the part of the vehicle 11 may include at least a part of the right side surface of the vehicle body, for example.
- Two right side cameras 12b may be installed so as to be able to image the right front wheel and the right rear wheel of the vehicle 11, respectively.
- the left side mirror and the right side mirror may include an optical mirror.
- the left side mirror and the right side mirror may include an electronic mirror using a display, for example.
- the rear camera 12c is installed at the rear of the vehicle 11 so as to face rearward.
- the rear camera 12c may be installed, for example, below the bumper behind the vehicle 11.
- the rear camera 12c is installed so as to be able to image the external space behind the vehicle 11.
- the number and arrangement of the imaging devices 12 installed in the vehicle 11 are not limited to the configuration described above.
- the additional imaging device 12 may be installed in the vehicle 11.
- a front camera may be installed so as to be able to image an external space in front of the vehicle 11.
- the attitude detection device 13 is installed at an arbitrary position of the vehicle 11.
- the attitude detection device 13 detects the attitude of the vehicle 11 based on the captured images respectively generated by the left side camera 12a and the right side camera 12b.
- the posture of the vehicle 11 includes at least one of an inclination θ in the front-rear direction of the vehicle 11, an inclination φ in the left-right direction of the vehicle 11, and a height h of the vehicle 11.
- the front-rear direction of the vehicle 11 is also referred to as a first direction.
- the left-right direction of the vehicle 11 is also referred to as a second direction.
- the first direction and the second direction are not limited to the front-rear direction and the left-right direction of the vehicle 11, and may be different directions. A specific operation in which the attitude detection device 13 detects the attitude of the vehicle 11 will be described later.
- the posture detection device 13 outputs information indicating the detected posture of the vehicle 11 to the image processing device 14.
- information indicating the attitude of the vehicle 11 is also referred to as attitude information.
- the detected attitude of the vehicle 11 is used in image processing executed by the image processing device 14 as will be described later.
- the image processing device 14 is installed at an arbitrary position of the vehicle 11.
- the image processing device 14 performs predetermined image processing on the captured image generated by the imaging device 12.
- the predetermined image processing differs depending on a desired function using the captured image.
- the image processing device 14 performs predetermined image processing on the captured image of the rear camera 12c.
- the image processing device 14 outputs the captured image on which image processing has been performed to the vehicle 11 side, and displays the captured image on a display device provided in the vehicle 11, for example.
- the predetermined image processing includes, for example, processing for determining an image processing range in the captured image, trimming processing for cutting out the image processing range, and processing for superimposing a support image for supporting the driver on the cut-out image.
- the support image may include a guide line that virtually indicates the course of the vehicle 11, for example.
- the image processing device 14 performs predetermined image processing on an image captured by the right side camera 12b.
- the image processing device 14 detects other vehicles in the peripheral region from the right side to the right rear side of the vehicle 11. When another vehicle is detected, the image processing device 14 notifies the vehicle 11 of the presence of the other vehicle, and causes a speaker provided in the vehicle 11 to output a warning sound, for example.
- the predetermined image processing includes processing for determining an image processing range in the captured image, object recognition processing for detecting another vehicle within the image processing range, and the like.
- the image processing device 14 determines, based on the posture information acquired from the posture detection device 13, at least one of the position where the support image is superimposed on the captured image and the image processing range in the captured image.
- the position on the captured image on which the support image is superimposed is also referred to as a superimposed position.
- the image processing apparatus 14 can determine the superposition position and the image processing range according to the attitude of the vehicle 11 as described below.
- when the attitude of the vehicle 11 changes, the imaging range of the imaging device 12 and the subject captured within it change.
- the imaging range of the imaging device 12 and the imaged subject change according to the inclination and height of the vehicle 11. Therefore, in a configuration in which the superimposed position or the image processing range in the captured image is fixed, a change in the attitude of the vehicle 11 causes the image processing range, in which trimming is performed on the captured image of the rear camera 12c, to shift relative to the subject. As a result, the support image superimposed on the subject may also shift, reducing the convenience of the driving support function.
- the image processing apparatus 14 moves, deforms, or rotates the image processing range on the captured image in accordance with the posture of the vehicle 11, for example. For this reason, the image processing device 14 can reduce the positional deviation of the image processing range with respect to the subject due to a change in the posture of the vehicle 11.
- the image processing device 14 is not limited to the functions described above, and may have various functions using captured images. For example, the image processing device 14 cuts out an image processing range determined based on the posture of the vehicle 11 from a plurality of captured images. The image processing device 14 performs viewpoint conversion processing on the plurality of captured images that have been cut out. The image processing device 14 may have a function of combining the plurality of captured images and generating, for example, an overhead image of the external space over the entire circumference of the vehicle 11.
- each component of the image processing system 10 will be specifically described.
- the imaging device 12 includes an imaging optical system 15, an imaging element 16, an input / output interface 17, a memory 18, and a processor 19.
- the imaging optical system 15 includes an optical member.
- the optical member may include, for example, one or more lenses and a diaphragm.
- the imaging optical system 15 forms a subject image on the light receiving surface of the imaging element 16.
- the imaging optical system 15 functions as, for example, a fisheye lens and has a relatively wide angle of view.
- the image sensor 16 includes, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide Semiconductor) image sensor. A plurality of pixels are arranged on the light receiving surface of the image sensor 16. The image sensor 16 captures a subject image formed on the light receiving surface and generates a captured image.
- the input / output interface 17 is an interface for inputting / outputting information to / from an external device via the network 20.
- the network 20 may include, for example, a wired network, a wireless network, or a CAN (Controller Area Network) mounted on the vehicle 11.
- the external devices include a posture detection device 13 and an image processing device 14.
- the external device may include, for example, an ECU (Electronic Control Unit) provided in the vehicle 11, a display, a navigation device, and the like.
- the memory 18 includes, for example, a primary storage device and a secondary storage device.
- the memory 18 stores various information and programs necessary for the operation of the imaging device 12.
- the processor 19 includes a dedicated processor such as a DSP (Digital Signal Processor) and a general-purpose processor such as a CPU (Central Processing Unit).
- the processor 19 controls the overall operation of the imaging device 12.
- the processor 19 controls the operation of the image sensor 16 and periodically generates a captured image at 30 fps, for example.
- the processor 19 performs predetermined image processing on the captured image generated by the image sensor 16.
- the predetermined image processing may include white balance adjustment processing, exposure adjustment processing, gamma correction processing, and the like.
- the predetermined image processing may be performed on the captured image of the current frame, or may be performed on the captured images of the next and subsequent frames.
- the processor 19 outputs the captured image on which the predetermined image processing has been performed to the attitude detection device 13 and the image processing device 14 via the input / output interface 17.
- the processor 19 may be capable of transmitting / receiving a synchronization signal to / from another imaging device 12 via the input / output interface 17.
- the synchronization signal can synchronize the imaging timing of one imaging device 12 with that of one or more other imaging devices 12.
- the imaging timings of the left side camera 12a, the right side camera 12b, and the rear camera 12c are synchronized.
- (Configuration of attitude detection device) The attitude detection device 13 will be described.
- the attitude detection device 13 includes an input / output interface 21, a memory 22, and a processor 23.
- the input / output interface 21 is an interface for inputting / outputting information to / from an external device via the network 20.
- the external device includes the imaging device 12 and the image processing device 14.
- the external device may include, for example, an ECU, a display, and a navigation device provided in the vehicle 11.
- the memory 22 includes, for example, a primary storage device and a secondary storage device.
- the memory 22 stores various information and programs necessary for the operation of the attitude detection device 13.
- the memory 22 stores first correspondence information described later. Details of the information stored in the memory 22 will be described later.
- the processor 23 includes a dedicated processor such as a DSP and a general-purpose processor such as a CPU.
- the processor 23 controls the overall operation of the attitude detection device 13.
- the processor 23 acquires a captured image from at least one of the left side camera 12a and the right side camera 12b via the input / output interface 21.
- the processor 23 acquires a set of captured images from the left side camera 12a and the right side camera 12b.
- a set of captured images consists of images captured at substantially the same timing by the left side camera 12a and the right side camera 12b, whose imaging timings are synchronized.
- the processor 23 detects the attitude of the vehicle 11 based on the acquired captured image. This will be specifically described below.
- the distance L1 indicates the distance between the left side camera 12a and the left front wheel 24.
- a distance H1 indicates a component of the distance L1 in the optical axis direction of the left side camera 12a.
- a distance W1 indicates a component of the distance L1 in a direction perpendicular to the optical axis of the left side camera 12a. Therefore, the distances L1, H1, and W1 satisfy the following expression (1).
- L1^2 = H1^2 + W1^2 (1)
- the distance L2 indicates the distance between the left side camera 12a and the left rear wheel 25.
- a distance H2 indicates a component of the distance L2 in the optical axis direction of the left side camera 12a.
- a distance W2 indicates a component of the distance L2 in a direction perpendicular to the optical axis of the left side camera 12a. Therefore, the distances L2, H2, and W2 satisfy the following expression (2).
- L2^2 = H2^2 + W2^2 (2)
- in the reference state, H1 = H2.
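Expressions (1) and (2) decompose each camera-to-wheel distance into its optical-axis and perpendicular components. Expression (3), which the text later references when computing tan θ, is not reproduced in this extract; the sketch below assumes it has the form tan θ = (H1 − H2) / (W1 + W2), a geometric guess consistent with the surrounding definitions rather than the patent's stated formula. All numeric values are invented:

```python
import math

def pitch_from_distances(L1, L2, W1, W2):
    # Recover the optical-axis components from expressions (1) and (2):
    # L1^2 = H1^2 + W1^2  and  L2^2 = H2^2 + W2^2.
    H1 = math.sqrt(L1**2 - W1**2)
    H2 = math.sqrt(L2**2 - W2**2)
    # Assumed form of the unreproduced expression (3): pitch follows from the
    # front/rear height difference over the longitudinal wheel separation.
    tan_theta = (H1 - H2) / (W1 + W2)
    return math.degrees(math.atan(tan_theta))

# In the reference state H1 == H2, so the pitch angle is zero.
print(pitch_from_distances(L1=1.3, L2=1.3, W1=1.2, W2=1.2))  # 0.0
```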
- FIG. 4 shows a state in which the vehicle 11 decelerates or stops while traveling forward.
- the front of the vehicle body sinks and the rear of the vehicle body rises.
- each wheel is assumed to be in contact with the road surface. While each wheel remains in contact with the road surface, the suspension between each wheel and the vehicle body deforms, so that the vehicle body inclines relative to the road surface and the wheels.
- the captured image of the left side camera 12a will be specifically described.
- a part 26 of the vehicle 11, the left front wheel 24, and the left rear wheel 25 are captured in the captured image.
- the part 26 of the vehicle 11 may include at least a part of the left side surface of the vehicle body, for example.
- while the position of the part 26 of the vehicle 11 in the captured image does not change, the positions in the captured image of the left front wheel 24 and the left rear wheel 25, which move relative to the vehicle body, do change.
- the positions of the left front wheel 24 and the left rear wheel 25 in the captured image change according to the inclination θ.
- the distance W1 and the distance W2 can be regarded as constants because their change due to the inclination θ is very small.
- the memory 22 stores first correspondence information indicating a correspondence relationship between the position of each wheel in the captured image and a parameter used to detect the attitude of the vehicle 11.
- the position of each wheel in the captured image may be the absolute position of the wheel in the captured image, or may be the relative position of the wheel with respect to the part 26 of the vehicle 11 in the captured image.
- the first correspondence information includes, for example, first correspondence information for the left side camera 12a and first correspondence information for the right side camera 12b.
- the first correspondence information for the left side camera 12a indicates the correspondence between the positions of the left front wheel 24 and the left rear wheel 25 in the captured image and the distances L1 and L2. Since the first correspondence information for the right side camera 12b is similar to that for the left side camera 12a, its description is omitted.
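A minimal sketch of how such first correspondence information might be held and queried, assuming a calibration table keyed by the wheel's pixel row with linear interpolation between samples. The table values and the single-coordinate key are invented for illustration; the patent only states that the correspondence is determined in advance by experiment or simulation:

```python
from bisect import bisect_left

# Hypothetical calibration table for the left front wheel: vertical pixel
# coordinate of the wheel center -> distance L1 (values invented).
CALIBRATION = [(300, 1.40), (320, 1.35), (340, 1.30), (360, 1.25)]

def distance_from_wheel_row(row):
    """Linearly interpolate L1 from the wheel's row in the captured image,
    clamping to the first/last calibration sample outside the table."""
    rows = [r for r, _ in CALIBRATION]
    i = bisect_left(rows, row)
    if i == 0:
        return CALIBRATION[0][1]
    if i == len(rows):
        return CALIBRATION[-1][1]
    (r0, d0), (r1, d1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (row - r0) / (r1 - r0)
    return d0 + t * (d1 - d0)

print(round(distance_from_wheel_row(330), 3))  # 1.325, halfway between samples
```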
- the first correspondence information can be determined in advance by, for example, experiments or simulations.
- the processor 23 of the posture detection device 13 detects the position of each wheel in the captured image acquired from the left side camera 12a. For example, the processor 23 detects the positions of the left front wheel 24 and the left rear wheel 25 in the captured image.
- for example, an algorithm that performs edge detection on the captured image and detects an elliptical shape on the captured image as a wheel may be employed, but any algorithm can be employed.
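As a stand-in for the edge-detection and ellipse-detection algorithm mentioned above, the toy locator below finds a wheel-like blob as the centroid of foreground pixels in a synthetic binary image. A real implementation would run edge detection and ellipse fitting on camera frames; this sketch only illustrates the "wheel position in the captured image" output the processor 23 consumes:

```python
def wheel_centroid(image):
    """Toy wheel locator: centroid of foreground (1) pixels in a binary image,
    standing in for the edge-detection / ellipse-detection step."""
    points = [(x, y) for y, row in enumerate(image) for x, v in enumerate(row) if v]
    if not points:
        return None  # no wheel-like region found
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return (cx, cy)

# Synthetic 5x5 frame with a 2x2 "wheel" blob in the lower right corner.
frame = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
]
print(wheel_centroid(frame))  # (3.5, 3.5)
```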
- the processor 23 extracts the distance L1 and the distance L2 corresponding to the detected positions of the left front wheel 24 and the left rear wheel 25.
- the processor 23 calculates the inclination θ from tan θ, obtained by the above-described expressions (1) to (3), using the extracted distances L1 and L2 and the distances W1 and W2 stored in the memory 22.
- the first correspondence information may be information indicating a correspondence relationship between the positions on the images of the left front wheel 24 and the left rear wheel 25 in the captured image and the distances W1, W2, H1, and H2.
- the processor 23 extracts a parameter corresponding to the position of each wheel of the vehicle 11, and calculates the inclination θ, which is the calculation result of the algorithm described above, using the extracted parameter.
- the first correspondence information may be information indicating a correspondence relationship between the position on the image of each of the left front wheel 24 and the left rear wheel 25 in the captured image and the inclination θ. In such a case, the processor 23 extracts the inclination θ corresponding to the position of each wheel of the vehicle 11.
- the processor 23 detects the inclination θ of the vehicle 11 in the front-rear direction based on the captured image of the left side camera 12a.
- as in the case of using the captured image of the left side camera 12a, the processor 23 calculates the distances L10, L20, H10, H20, W10, and W20 based on the captured image of the right side camera 12b in order to detect the inclination φ described later.
- the distance L10 indicates the distance between the right side camera 12b and the right front wheel.
- a distance H10 indicates a component of the distance L10 in the optical axis direction of the right side camera 12b.
- a distance W10 indicates a component of the distance L10 in a direction perpendicular to the optical axis of the right side camera 12b.
- a distance L20 indicates a distance between the right side camera 12b and the right rear wheel.
- a distance H20 indicates a component of the distance L20 in the optical axis direction of the right side camera 12b.
- a distance W20 indicates a component of the distance L20 in a direction perpendicular to the optical axis of the right side camera 12b.
- the distance Q indicates the distance between the left side camera 12a and the right side camera 12b in the left-right direction of the vehicle 11.
- the memory 22 of the posture detection device 13 stores the distance Q in advance.
- the processor 23 of the posture detection device 13 calculates the inclination φ from sin φ, obtained by the above-described expression (4), using the distance H1 and the distance H10 calculated as described above and the distance Q stored in the memory 22.
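Expression (4) is referenced but not reproduced in this extract. The sketch below assumes it has the form sin φ = (H1 − H10) / Q, i.e. the left/right height difference over the lateral camera separation Q; this is an assumption consistent with the surrounding definitions, not the patent's stated formula, and all numeric values are invented:

```python
import math

def roll_from_heights(H1, H10, Q):
    # Assumed form of the unreproduced expression (4): roll follows from the
    # left/right height difference over the lateral camera separation Q.
    return math.degrees(math.asin((H1 - H10) / Q))

# Left side slightly higher than the right side: small positive roll angle.
print(round(roll_from_heights(H1=0.55, H10=0.50, Q=1.8), 2))
```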
- the processor 23 may calculate the height h of the vehicle 11 based on the plurality of parameters calculated as described above.
- the plurality of parameters includes the inclination θ, the inclination φ, and the distances L1, L2, H1, H2, W1, W2, L10, L20, H10, H20, W10, and W20.
- the height h may be, for example, the height from the road surface to an arbitrary position of the vehicle 11. Further, the height h may be calculated further based on an arbitrary vehicle constant.
- the vehicle constant may include, for example, the dimensions of the vehicle 11.
- the processor 23 detects the inclination θ, the inclination φ, and the height h calculated as described above as the posture of the vehicle 11.
- the processor 23 may detect, as the posture of the vehicle 11, the respective average values of the inclination θ, the inclination φ, and the height h calculated over the current frame and a predetermined number of past frames.
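The averaging over the current frame and a predetermined number of past frames can be sketched with a bounded history. The window length and the three-component (θ, φ, h) tuple below are illustrative choices, not values from the patent:

```python
from collections import deque

class AttitudeSmoother:
    """Average the detected attitude over the current frame and the
    N most recent past frames (N is a free parameter)."""
    def __init__(self, past_frames=4):
        self.history = deque(maxlen=past_frames + 1)  # current frame + N past frames

    def update(self, theta, phi, h):
        self.history.append((theta, phi, h))
        n = len(self.history)
        # Component-wise mean over whatever history has accumulated so far.
        return tuple(sum(sample[i] for sample in self.history) / n for i in range(3))

smoother = AttitudeSmoother(past_frames=1)
smoother.update(2.0, 0.0, 0.5)
print(smoother.update(4.0, 2.0, 0.7))  # (3.0, 1.0, 0.6)
```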
- the processor 23 outputs attitude information indicating the attitude of the vehicle 11 to the image processing device 14.
- the posture information may include information indicating the inclination θ, the inclination φ, and the height h.
- (Configuration of image processing apparatus) The image processing apparatus 14 will be described with reference to FIG.
- the image processing device 14 includes an input / output interface 28, a memory 29, and a processor 30.
- the input / output interface 28 is an interface for inputting / outputting information to / from an external device via the network 20.
- the external device includes an imaging device 12 and a posture detection device 13.
- the external device may include, for example, an ECU, a display, and a navigation device provided in the vehicle 11.
- the memory 29 includes, for example, a primary storage device and a secondary storage device.
- the memory 29 stores various information and programs necessary for the operation of the image processing apparatus 14.
- the processor 30 includes a dedicated processor such as a DSP and a general-purpose processor such as a CPU.
- the processor 30 controls the operation of the entire image processing apparatus 14.
- the processor 30 acquires captured images from one or more imaging devices 12 via the input / output interface 28 and acquires posture information from the posture detection device 13.
- the processor 30 determines, according to the attitude of the vehicle 11, at least one of the superimposed position where the support image is superimposed on the captured image and the image processing range in the captured image. When the attitude of the vehicle 11 changes, the processor 30 moves the superimposed position, and moves, deforms, or rotates the image processing range.
- the processor 30 determines the superimposed position and the image processing range according to the attitude of the vehicle 11 so as to maintain the relative positional relationship of the image processing range in the reference state with respect to the subject.
- the processor 30 determines the superimposed position and the image processing range in the captured image so as to reduce the positional deviation of the subject in the captured image due to the change in the posture of the vehicle 11. With this configuration, it is possible to reduce the displacement of the superimposed position and the image processing range with respect to the subject.
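A sketch of moving the image processing range with the detected pitch angle. The linear pixels-per-degree gain is invented; in practice the mapping from attitude to pixel offset depends on the camera's focal length, distortion, and mounting, which the patent leaves to the implementation:

```python
def shift_rectangle(rect, theta_deg, pixels_per_degree=12):
    """Move a crop rectangle (x, y, w, h) vertically to follow the pitch angle.
    The pixels_per_degree gain is a hypothetical stand-in for a calibrated
    attitude-to-pixel mapping."""
    x, y, w, h = rect
    dy = round(theta_deg * pixels_per_degree)
    return (x, y + dy, w, h)

# A nose-down pitch of 2 degrees shifts the trimming range down by 24 pixels.
print(shift_rectangle((100, 200, 640, 360), theta_deg=2.0))  # (100, 224, 640, 360)
```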
- the processor 30 cuts out an image processing range from the captured image of the rear camera 12c and superimposes the support image on the superimposition position, for example, when executing the driving support function when the vehicle 11 moves backward.
- the processor 30 outputs the captured image on which the support image is superimposed to the vehicle 11 side.
- the output captured image is displayed on a display device provided in the vehicle 11, for example.
- the processor 30 may perform object recognition processing for detecting other vehicles within the image processing range of the captured image of the right side camera 12b, for example, when executing a driving support function for merging onto an expressway.
- when another vehicle is detected, the processor 30 notifies the vehicle 11 side of the presence of the other vehicle.
- a warning sound is output from a speaker provided in the vehicle 11.
- This operation may be performed for each frame in which a captured image is generated, for example, or may be performed periodically with a predetermined interval.
- Step S100 The processor 23 of the posture detection device 13 acquires a set of captured images from the left side camera 12a and the right side camera 12b.
- Step S101 The processor 23 detects the posture of the vehicle 11 based on the set of captured images acquired in step S100 and the first correspondence information stored in the memory 22. For example, the processor 23 detects at least one of the inclination α, the inclination β, and the height h as the posture of the vehicle 11. The processor 23 may detect, as the posture of the vehicle 11, the respective average values of the inclination α, the inclination β, and the height h calculated over the current frame and a predetermined number of past frames.
- Step S102 The processor 23 outputs attitude information indicating the attitude of the vehicle 11 detected in step S101 to the image processing device 14.
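The frame averaging mentioned in step S101 might be sketched as follows; the class name, the window length, and the tuple layout (alpha, beta, h) are assumptions for illustration:

```python
from collections import deque

class PoseAverager:
    """Average the per-frame attitude estimates (alpha, beta, h) over the
    current frame and a fixed number of past frames, as in step S101."""

    def __init__(self, num_frames=5):
        # keep only the most recent `num_frames` estimates
        self.history = deque(maxlen=num_frames)

    def update(self, alpha, beta, h):
        """Record the current frame's estimate and return the averaged pose."""
        self.history.append((alpha, beta, h))
        n = len(self.history)
        return tuple(sum(v[i] for v in self.history) / n for i in range(3))
```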
- This operation may be performed for each frame in which a captured image is generated, for example, or may be performed periodically with a predetermined interval.
- Step S200 The processor 30 of the image processing apparatus 14 acquires a captured image from the rear camera 12c.
- Step S201 The processor 30 acquires attitude information indicating the attitude of the vehicle 11 from the attitude detection device 13.
- Step S202 The processor 30 determines at least one of the superimposed position and the image processing range according to the attitude of the vehicle 11 indicated by the attitude information acquired in step S201.
- the processor 30 may store the superimposed position or image processing range determined for each captured image in the memory 29 in association with the imaging device 12 that generated the captured image.
- in the following description, it is assumed that both the superimposed position and the image processing range are determined.
- Step S203 The processor 30 cuts out an image processing range from the captured image.
- Step S204 The processor 30 superimposes the support image on the superimposed position of the cut out captured image.
- Step S205 The processor 30 outputs the captured image on which the support image is superimposed in step S204 to the vehicle 11 side.
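Steps S200 through S205 can be sketched as a single per-frame function; the toy pitch-to-offset model, the field names, and the callback used to draw the support image are all assumptions:

```python
def process_frame(frame, attitude, support_image_drawer):
    """One pass of steps S200-S205: crop the attitude-dependent image
    processing range, then draw the support image into the cropped image.
    `frame` is a 2-D list of pixel rows (a stand-in for real image data)."""
    # S202: determine the image processing range from the attitude (toy model)
    dy = int(attitude["pitch_deg"] * 2)
    top, height = 10 + dy, 4
    # S203: cut out the image processing range (copy rows so the input is untouched)
    cropped = [row[:] for row in frame[top:top + height]]
    # S204: superimpose the support image via the supplied callback
    support_image_drawer(cropped)
    # S205: return the composited image for output to the vehicle side
    return cropped
```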
- conventionally, a tilt sensor is provided in the imaging device in order to detect the tilt of the vehicle.
- however, providing an additional component such as a tilt sensor in the imaging apparatus can increase the size and cost of the imaging apparatus. Therefore, there is room for improvement in the configuration for detecting the posture of the vehicle, such as the inclination of the vehicle.
- in the present embodiment, the attitude detection device 13 acquires a captured image obtained by imaging a wheel provided on the vehicle 11, and detects the attitude of the vehicle 11 based on the position of the wheel in the captured image.
- for example, the posture detection device 13 acquires a captured image of the left side camera 12a and detects the inclination α as the posture of the vehicle 11 based on the position of the wheel in the captured image. Therefore, the posture of the vehicle 11 can be detected without adding a component such as a tilt sensor to the imaging device 12.
- the attitude detection device 13 may acquire a plurality of captured images and calculate the attitude of the vehicle 11 based on the positions of a plurality of wheels in the plurality of captured images.
- for example, the inclination α, the inclination β, and the height h can be detected as the posture of the vehicle 11 based on the positions of the four wheels in the captured image of the left side camera 12a and the captured image of the right side camera 12b. Therefore, the detection accuracy of the posture of the vehicle 11 is improved.
- the posture detection device 13 may store first correspondence information indicating a correspondence relationship between the position of the wheel in the captured image and a parameter used for detecting the posture of the vehicle 11.
- the parameter may include, for example, the distance L1 and the distance L2.
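A minimal sketch of such first correspondence information, assuming a lookup table from the wheel's vertical pixel position to the distance parameter L1, with linear interpolation between entries (all table values are invented):

```python
import bisect

# Hypothetical first correspondence information: vertical pixel position of
# a wheel in the captured image -> distance parameter L1 (values invented).
WHEEL_Y_PX = [100, 150, 200, 250]       # sorted wheel positions (pixels)
PARAM_L1_M = [1.80, 1.75, 1.70, 1.65]   # corresponding L1 values (metres)

def lookup_L1(wheel_y):
    """Linearly interpolate L1 for a wheel position between table entries,
    clamping to the table's end values outside the tabulated range."""
    i = bisect.bisect_left(WHEEL_Y_PX, wheel_y)
    if i == 0:
        return PARAM_L1_M[0]
    if i == len(WHEEL_Y_PX):
        return PARAM_L1_M[-1]
    y0, y1 = WHEEL_Y_PX[i - 1], WHEEL_Y_PX[i]
    p0, p1 = PARAM_L1_M[i - 1], PARAM_L1_M[i]
    return p0 + (p1 - p0) * (wheel_y - y0) / (y1 - y0)
```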
- some or all of the functions of the processor 23 of the posture detection device 13 may be provided in another device that can communicate with the posture detection device 13, such as the imaging device 12.
- the posture detection device 13 itself may be included in one imaging device 12.
- some or all of the functions of the processor 30 of the image processing device 14 may be provided in another device that can communicate with the image processing device 14, for example, the imaging device 12.
- the image processing device 14 itself may be included in one imaging device 12.
- the configuration in which the posture detection device 13 detects the posture of the vehicle 11 based on the first correspondence information has been described, but the configuration for detecting the posture of the vehicle 11 is not limited thereto.
- the memory 22 of the posture detection device 13 may store the second correspondence information instead of the first correspondence information.
- the second correspondence information is information indicating a correspondence relationship between the positions of the plurality of wheels in the plurality of captured images and the posture of the vehicle 11.
- for example, the second correspondence information is information indicating a correspondence relationship between the positions of the left front wheel, the left rear wheel, the right front wheel, and the right rear wheel in the captured images of the left side camera 12a and the right side camera 12b, and the inclination α, the inclination β, and the height h.
- the second correspondence information may be information indicating a correspondence relationship between the positions of the front wheel and the rear wheel in the captured image of the left side camera 12a or the right side camera 12b, and the inclination α.
- alternatively, the second correspondence information may be information indicating a correspondence relationship between the positions of three wheels in the captured image of the left side camera 12a and the captured image of the right side camera 12b, and the inclination α, the inclination β, and the height h.
- the three wheels may include, for example, the left front wheel 24, the left rear wheel 25, and the right front wheel 27.
- with this configuration, once the positions of the plurality of wheels in one or more captured images are detected, the posture detection device 13 can extract the posture of the vehicle 11 directly from the second correspondence information. For this reason, the processing load can be reduced and the processing speed can be improved.
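A minimal sketch of this direct lookup, assuming the detected wheel positions are quantised to match precomputed table keys (the table contents, quantisation step, and sign convention are invented):

```python
# Hypothetical second correspondence information: quantised pixel rows of the
# front and rear wheel -> front-rear inclination in degrees (values invented).
SECOND_LUT = {
    (200, 200): 0.0,    # both wheels at the same image height: level
    (190, 210): 2.0,    # front wheel higher in the image (sign convention assumed)
    (210, 190): -2.0,
}

def lookup_attitude(front_y, rear_y, quantum=10):
    """Quantise the detected wheel positions and read the attitude straight
    from the table, with no per-frame geometric computation."""
    key = (round(front_y / quantum) * quantum, round(rear_y / quantum) * quantum)
    return SECOND_LUT.get(key)  # None if outside the tabulated range
```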
- the posture detection device 13 may stop detecting the posture of the vehicle 11 when the vehicle 11 is in a specific state, for example, a state in which the detection accuracy of the posture of the vehicle 11 may be lowered. For example, when the vehicle 11 is traveling on an uneven road surface such as off-road, there may be moments when one or more wheels are not in contact with the road surface, and in such cases the detection accuracy of the posture of the vehicle 11 may be lowered. Specifically, the posture detection device 13 acquires various vehicle information indicating the state of the vehicle 11 from the vehicle 11 side via the network 20. The vehicle information may include, for example, information indicating the vibration of the vehicle 11.
- the posture detection device 13 stops detecting the posture of the vehicle 11 when it is determined that the vibration amplitude of the vehicle 11 is equal to or greater than a predetermined threshold.
- the vehicle information may include information indicating the state of the road surface on which the vehicle 11 is traveling. In such a case, the posture detection device 13 stops detecting the posture of the vehicle 11 when it is determined that the road surface is not flat or when it is determined that the road surface is off-road.
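A sketch of this gating logic, assuming vehicle information arrives as a dictionary with illustrative field names and an arbitrary amplitude threshold:

```python
VIBRATION_THRESHOLD = 0.5  # assumed amplitude threshold (units arbitrary)

def should_detect(vehicle_info):
    """Return False (skip attitude detection) when the reported vibration
    amplitude is at or above the threshold, or the road surface is flagged
    as not flat / off-road. Field names are illustrative assumptions."""
    if vehicle_info.get("vibration_amplitude", 0.0) >= VIBRATION_THRESHOLD:
        return False
    if vehicle_info.get("road_surface") in ("uneven", "offroad"):
        return False
    return True
```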
- the operation performed by the image processing system 10 using the left side camera 12a may be performed using the right side camera 12b instead of the left side camera 12a.
- words such as “left side camera 12a” and “left side” may be read as “right side camera 12b” and “right side”, respectively.
- the configuration in which the front-rear direction inclination α of the vehicle 11 is detected as the posture of the vehicle 11 based on the positions of the left front wheel 24 and the left rear wheel 25 in the captured image of the left side camera 12a has been described.
- the method for detecting the attitude of the vehicle 11 is not limited to this.
- the inclination in the first direction can be calculated based on the positions, in one or more captured images, of any two of the four wheels included in the vehicle 11.
- similarly, the inclination in the first direction, the inclination in the second direction, and the height of the vehicle 11 can be calculated based on the positions, in one or more captured images, of any three of the four wheels included in the vehicle 11.
- Each component of the image processing system 10 may be realized as an information processing apparatus such as a mobile phone or a smartphone, and may be connected to the vehicle 11 by wire or wirelessly.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Automation & Control Theory (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
- Closed-Circuit Television Systems (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
- Image Processing (AREA)
Abstract
Description
The imaging device 12 will be described. The imaging device 12 includes an imaging optical system 15, an image sensor 16, an input/output interface 17, a memory 18, and a processor 19.
The posture detection device 13 will be described. The posture detection device 13 includes an input/output interface 21, a memory 22, and a processor 23.
With reference to FIG. 3, the positional relationship among the left side camera 12a, the left front wheel 24, and the left rear wheel 25 when the front-rear direction inclination α of the vehicle 11 is zero will be described. The positional relationship is considered as projected onto a plane that is parallel to the front-rear direction of the vehicle 11 and perpendicular to the road surface.
L1^2=H1^2+W1^2 (1)
L2^2=H2^2+W2^2 (2)
Here, when the inclination α = 0, H1 = H2 holds.
tanα=(H2-H1)/(W1+W2) (3)
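Equations (1) through (3) can be combined numerically: given the stored distances L1 and L2 and the measured heights H1 and H2, W1 and W2 follow from (1) and (2), and α from (3). A sketch (the function name and example values are illustrative):

```python
import math

def front_rear_inclination(L1, L2, H1, H2):
    """Compute the front-rear inclination alpha (radians) from equations
    (1)-(3): W1 and W2 are recovered from L1, L2 and the heights H1, H2."""
    W1 = math.sqrt(L1**2 - H1**2)  # from (1): L1^2 = H1^2 + W1^2
    W2 = math.sqrt(L2**2 - H2**2)  # from (2): L2^2 = H2^2 + W2^2
    return math.atan((H2 - H1) / (W1 + W2))  # from (3)
```

With H1 = H2, the function returns zero, matching the statement that α = 0 when the two wheel heights coincide.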
With reference to FIG. 8, the positional relationship among the left side camera 12a, the right side camera 12b, the left front wheel 24, and the right front wheel 27 when the left-right direction inclination β of the vehicle 11 is nonzero will be described. The positional relationship is considered as projected onto a plane that is parallel to the left-right direction of the vehicle 11 and perpendicular to the road surface.
sinβ=(H10-H1)/Q (4)
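Equation (4) can likewise be evaluated directly; here Q is the lateral distance appearing in FIG. 8, treated as a given input:

```python
import math

def left_right_inclination(H1, H10, Q):
    """Compute the left-right inclination beta (radians) from equation (4):
    sin(beta) = (H10 - H1) / Q."""
    return math.asin((H10 - H1) / Q)
```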
The processor 23 may calculate the height h of the vehicle 11 based on the plurality of parameters calculated as described above. In the present embodiment, the plurality of parameters include the inclination α, the inclination β, and the distances L1, L2, H1, H2, W1, W2, L10, L20, H10, H20, W10, and W20. The height h may be, for example, the height from the road surface to an arbitrary position on the vehicle 11. The height h may further be calculated based on arbitrary vehicle constants, which may include, for example, the dimensions of the vehicle 11.
With reference to FIG. 2, the image processing device 14 will be described. The image processing device 14 includes an input/output interface 28, a memory 29, and a processor 30.
11 vehicle
12 imaging device
12a left side camera
12b right side camera
12c rear camera
13 attitude detection device
14 image processing device
15 imaging optical system
16 image sensor
17 input/output interface
18 memory
19 processor
20 network
21 input/output interface
22 memory
23 processor
24 left front wheel
25 left rear wheel
26 part of the vehicle
27 right front wheel
28 input/output interface
29 memory
30 processor
Claims (10)
- An attitude detection apparatus for a vehicle, comprising:
an input/output interface configured to acquire a captured image obtained by imaging a wheel of a vehicle; and
a processor configured to detect an attitude of the vehicle based on a position of the wheel in the captured image.
- The attitude detection apparatus according to claim 1, wherein
the processor detects the attitude of the vehicle based on an absolute position of the wheel in the captured image.
- The attitude detection apparatus according to claim 1, wherein
the captured image is an image obtained by imaging the wheel and a part of the vehicle, and
the processor detects the attitude of the vehicle based on a relative position of the wheel with respect to the part of the vehicle in the captured image.
- The attitude detection apparatus according to any one of claims 1 to 3, wherein
the input/output interface acquires a plurality of captured images obtained by imaging a plurality of the wheels, and
the processor detects the attitude of the vehicle based on positions of the plurality of wheels in the plurality of captured images.
- The attitude detection apparatus according to any one of claims 1 to 4, wherein
the processor detects, as the attitude of the vehicle, at least one of an inclination of the vehicle in a first direction, an inclination of the vehicle in a second direction, and a height of the vehicle.
- The attitude detection apparatus according to any one of claims 1 to 5, further comprising
a memory configured to store first correspondence information indicating a correspondence relationship between the position of the wheel in the captured image and a parameter used for detecting the attitude of the vehicle, wherein
the processor extracts, based on the first correspondence information, the parameter corresponding to the position of the wheel in the captured image, and detects a result of a calculation using the parameter as the attitude of the vehicle.
- The attitude detection apparatus according to any one of claims 1 to 5, further comprising
a memory configured to store second correspondence information indicating a correspondence relationship between the position of the wheel in the captured image and attitude information indicating the attitude of the vehicle, wherein
the processor acquires, based on the second correspondence information, the attitude information corresponding to the position of the wheel in the captured image, and detects the attitude indicated by the attitude information as the attitude of the vehicle.
- An image processing system comprising:
an imaging device configured to generate a captured image obtained by imaging a wheel of a vehicle;
an attitude detection apparatus configured to detect an attitude of the vehicle based on a position of the wheel in the captured image; and
an image processing apparatus configured to determine, based on the detected attitude, at least one of a position at which an image is superimposed on the captured image and an image processing range in the captured image.
- A vehicle comprising:
a wheel;
an imaging device configured to generate a captured image obtained by imaging the wheel;
an attitude detection apparatus configured to detect an attitude of the vehicle based on a position of the wheel in the captured image; and
an image processing apparatus configured to determine, based on the detected attitude, at least one of a position at which an image is superimposed on the captured image and an image processing range in the captured image.
- An attitude detection method for a vehicle executed by an attitude detection apparatus, the method comprising:
acquiring a captured image obtained by imaging a wheel of a vehicle; and
detecting an attitude of the vehicle based on a position of the wheel in the captured image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018508978A JPWO2017169752A1 (ja) | 2016-03-29 | 2017-03-14 | 車両の姿勢検出装置、画像処理システム、車両、および車両の姿勢検出方法 |
EP17774296.2A EP3437948A4 (en) | 2016-03-29 | 2017-03-14 | DEVICE FOR DETECTING THE ORIENTATION OF A VEHICLE, IMAGE PROCESSING SYSTEM, VEHICLE AND METHOD FOR DETECTING THE EQUIPMENT OF A VEHICLE |
US16/088,493 US20200111227A1 (en) | 2016-03-29 | 2017-03-14 | Orientation detection apparatus for vehicle, image processing system, vehicle, and orientation detection method for vehicle |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016065962 | 2016-03-29 | ||
JP2016-065962 | 2016-03-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017169752A1 true WO2017169752A1 (ja) | 2017-10-05 |
Family
ID=59965227
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/010266 WO2017169752A1 (ja) | 2016-03-29 | 2017-03-14 | 車両の姿勢検出装置、画像処理システム、車両、および車両の姿勢検出方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20200111227A1 (ja) |
EP (1) | EP3437948A4 (ja) |
JP (1) | JPWO2017169752A1 (ja) |
WO (1) | WO2017169752A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20200059475A (ko) * | 2018-11-21 | 2020-05-29 | 현대모비스 주식회사 | 초음파 센서의 높이 보상장치 및 그 방법 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020090512A1 (ja) * | 2018-10-31 | 2020-05-07 | ソニー株式会社 | 撮影装置、制御方法、及び、プログラム |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003309844A (ja) * | 2002-04-18 | 2003-10-31 | Nissan Motor Co Ltd | 車両用表示装置 |
WO2013145015A1 (ja) * | 2012-03-29 | 2013-10-03 | トヨタ自動車株式会社 | 路面状態推定装置 |
JP2013203186A (ja) * | 2012-03-28 | 2013-10-07 | Honda Motor Co Ltd | 先行車姿勢検出装置 |
JP2016021653A (ja) * | 2014-07-14 | 2016-02-04 | アイシン精機株式会社 | 周辺監視装置、及びプログラム |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
BE1014606A3 (nl) * | 2002-02-05 | 2004-01-13 | Krypton Electronic Eng Nv | Werkwijze voor het dynamisch meten van de positie en orientatie van een wiel. |
US20070296961A1 (en) * | 2004-11-26 | 2007-12-27 | Keita Sekine | Vehicle Lamp Inspection Equipment and Inspection Method |
JP6361382B2 (ja) * | 2014-08-29 | 2018-07-25 | アイシン精機株式会社 | 車両の制御装置 |
2017
- 2017-03-14 WO PCT/JP2017/010266 patent/WO2017169752A1/ja active Application Filing
- 2017-03-14 EP EP17774296.2A patent/EP3437948A4/en not_active Withdrawn
- 2017-03-14 US US16/088,493 patent/US20200111227A1/en not_active Abandoned
- 2017-03-14 JP JP2018508978A patent/JPWO2017169752A1/ja active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003309844A (ja) * | 2002-04-18 | 2003-10-31 | Nissan Motor Co Ltd | 車両用表示装置 |
JP2013203186A (ja) * | 2012-03-28 | 2013-10-07 | Honda Motor Co Ltd | 先行車姿勢検出装置 |
WO2013145015A1 (ja) * | 2012-03-29 | 2013-10-03 | トヨタ自動車株式会社 | 路面状態推定装置 |
JP2016021653A (ja) * | 2014-07-14 | 2016-02-04 | アイシン精機株式会社 | 周辺監視装置、及びプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3437948A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20200059475A (ko) * | 2018-11-21 | 2020-05-29 | 현대모비스 주식회사 | 초음파 센서의 높이 보상장치 및 그 방법 |
KR102601353B1 (ko) * | 2018-11-21 | 2023-11-14 | 현대모비스 주식회사 | 초음파 센서의 높이 보상장치 및 그 방법 |
Also Published As
Publication number | Publication date |
---|---|
EP3437948A1 (en) | 2019-02-06 |
US20200111227A1 (en) | 2020-04-09 |
EP3437948A4 (en) | 2019-11-27 |
JPWO2017169752A1 (ja) | 2019-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7266165B2 (ja) | 撮像装置、撮像システム、および表示システム | |
US20150042800A1 (en) | Apparatus and method for providing avm image | |
JP6565188B2 (ja) | 視差値導出装置、機器制御システム、移動体、ロボット、視差値導出方法、およびプログラム | |
US10704957B2 (en) | Imaging device and imaging method | |
EP3002727B1 (en) | Periphery monitoring apparatus and periphery monitoring system | |
WO2016063545A1 (ja) | ステレオカメラ装置及びステレオカメラ装置を備える車両 | |
US11968462B2 (en) | Solid-state image pickup apparatus, correction method, and electronic apparatus | |
JPWO2017169365A1 (ja) | 路面変位検出装置およびサスペンション制御方法 | |
WO2018016150A1 (ja) | 画像処理装置と画像処理方法 | |
WO2018016151A1 (ja) | 画像処理装置と画像処理方法 | |
WO2017169752A1 (ja) | 車両の姿勢検出装置、画像処理システム、車両、および車両の姿勢検出方法 | |
JP6407596B2 (ja) | 画像処理装置、及び、運転支援システム | |
US11368620B2 (en) | Image processing apparatus, image processing method, and electronic device | |
WO2016121406A1 (ja) | 画像処理装置、画像処理システム、車両、撮像装置、および画像処理方法 | |
JP7170167B2 (ja) | 撮像装置、表示システム、および撮像システム | |
KR20150009763A (ko) | 카메라 시스템 및 카메라 시스템의 제어 방법 | |
WO2015115103A1 (ja) | 画像処理装置、カメラシステム、および画像処理方法 | |
JP2018006889A (ja) | 画像処理装置、車載カメラ装置、および画像処理方法 | |
JP2011180962A (ja) | 運転支援装置、運転支援方法およびプログラム | |
WO2018198833A1 (ja) | 制御装置、方法、およびプログラム | |
WO2018179624A1 (ja) | 撮像装置、撮像システム、および、撮像装置の制御方法 | |
WO2023026626A1 (ja) | 信号処理装置、信号処理システム、信号処理方法及びプログラム | |
WO2023067867A1 (ja) | 車載制御装置、および、3次元情報取得方法 | |
US20230412923A1 (en) | Signal processing device, imaging device, and signal processing method | |
JP2018132338A (ja) | 距離導出装置および距離導出方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2018508978 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017774296 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017774296 Country of ref document: EP Effective date: 20181029 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17774296 Country of ref document: EP Kind code of ref document: A1 |