US11087145B2 - Gradient estimation device, gradient estimation method, computer program product, and controlling system - Google Patents
- Publication number
- US11087145B2 (application US16/125,312)
- Authority
- US
- United States
- Prior art keywords
- pixel
- image
- gradient
- gradient magnitude
- matching
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G06K9/00798—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
- B60W40/076—Slope angle of the road
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G06K9/4609—
-
- G06K9/6202—
-
- G06K9/6215—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/15—Road slope, i.e. the inclination of a road segment in the longitudinal direction
-
- G05D2201/0213—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- Embodiments described herein relate generally to a gradient estimation device, a gradient estimation method, a computer program product, and a controlling system.
- a gradient of a plane such as a road surface is estimated by using a plurality of images.
- a gradient is estimated from a transformation parameter by which one of two images is transformed such that a luminance difference between the two images is eliminated in a set road surface candidate region.
- FIG. 1 is a view illustrating an exemplary moving body according to an embodiment
- FIG. 2 is a diagram illustrating an exemplary functional configuration of a moving body according to a first embodiment
- FIG. 3 is a view illustrating exemplary acquired images
- FIG. 4 is a view to describe a method of determining a gradient direction of a plane
- FIG. 5 is a view to describe a method of determining a gradient direction of a plane
- FIG. 6 is a view to describe a method of determining a gradient direction of a plane
- FIG. 7 is a view illustrating exemplary pixels to be extracted
- FIG. 8 is a view illustrating exemplary image transformation
- FIG. 9 is a view illustrating exemplary image transformation
- FIG. 10 is a flowchart of gradient estimation processing according to the first embodiment
- FIG. 11 is a diagram illustrating an exemplary functional configuration of a moving body according to a second embodiment
- FIG. 12 is a view illustrating an exemplary candidate region to be set
- FIG. 13 is a view illustrating an exemplary candidate region to be set
- FIG. 14 is a view illustrating an exemplary candidate region to be set
- FIG. 15 is a view illustrating an exemplary candidate region to be set
- FIG. 16 is a view illustrating an exemplary candidate region to be set
- FIG. 17 is a view illustrating an example of setting a plurality of candidate regions
- FIG. 18 is a flowchart of gradient estimation processing according to the second embodiment.
- FIG. 19 is a hardware configuration diagram of a gradient estimation device according to the first or second embodiment.
- a gradient estimation device includes one or more processors.
- the processors acquire a first image and a second image obtained by photographing a surface of an object.
- the processors extract, from the first image, a first pixel indicating a boundary of an object in a direction different from a gradient direction of the surface, and extract a second pixel corresponding to the first pixel from the second image.
- the processors calculate gradient magnitude of the surface by transforming at least one of the first image and the second image such that a matching degree between the first pixel and the second pixel is increased.
- a gradient estimation device estimates a gradient (gradient magnitude) of a plane (a surface of an object) by extracting, from two images obtained by photographing the plane, pixels aligned in a direction different from a gradient direction of the plane, and transforming the images such that the extracted pixels are more matched. Consequently, influence of a white line extending in the gradient direction can be reduced and gradient estimation accuracy can be improved, for example.
- FIG. 1 is a view illustrating an exemplary moving body 10 according to the present embodiment.
- the moving body 10 includes a gradient estimation device 20 , an output unit 10 A, a sensor 10 B, an input device 10 C, a power output controller 10 G, and a power output unit 10 H.
- a gradient estimation device 20 estimates a gradient of a photographed plane in an image.
- the plane includes, for example, a road surface, a ground surface, a water surface, a floor surface, a wall surface, a ceiling surface, or the like existing in a moving direction of the moving body 10 .
- the gradient estimation device 20 is, for example, a dedicated or general purpose computer. In the present embodiment, an exemplary case where the gradient estimation device 20 is mounted on the moving body 10 will be described.
- the moving body 10 is a movable object.
- the moving body 10 is, for example, a vehicle (a motorcycle, an automatic four-wheel vehicle, or a bicycle), a carriage, a robot, a ship, or a flying object (such as an airplane or an unmanned aerial vehicle (UAV)).
- the moving body 10 is, for example, a moving body that travels by drive operation by a person or a moving body that can automatically travel (autonomously travel) without involving drive operation by a person.
- the moving body capable of automatically traveling is, for example, an automatic driving vehicle.
- the description will be provided for an example in which the moving body 10 of the present embodiment is a vehicle that can perform autonomous travel.
- the gradient estimation device 20 is not limited to a form mounted on the moving body 10 .
- the gradient estimation device 20 may be mounted on a stationary object.
- the stationary object is an object that cannot be moved or an object that is in a stationary state with respect to the ground.
- the stationary object includes, for example, a guardrail, a pole, a parked vehicle, a road sign, and the like.
- the gradient estimation device 20 may be mounted on a cloud server that executes processing on the cloud.
- the output unit 10 A outputs various kinds of information.
- the output unit 10 A outputs output information from various kinds of processing.
- the output unit 10 A includes, for example, a communication function to transmit output information, a display function to display output information, a sound output function to output a sound to indicate output information, and the like.
- the output unit 10 A includes a communication unit 10 D, a display 10 E, and a speaker 10 F.
- the communication unit 10 D communicates with an external device.
- the communication unit 10 D is a VICS (registered trademark) communication circuit or a dynamic map communication circuit.
- the communication unit 10 D transmits output information to an external device. Additionally, the communication unit 10 D receives road information and the like from an external device.
- the road information includes a signal, a sign, a surrounding building, a road width of each lane, a lane center line, and the like.
- the road information may be stored in the storage 20 B.
- the display 10 E displays output information.
- the display 10 E is, for example, a known liquid crystal display (LCD), a projection device, a light, or the like.
- the speaker 10 F outputs a sound to indicate output information.
- the sensor 10 B is a sensor to acquire a travel environment of the moving body 10 .
- the travel environment includes, for example, observation information of the moving body 10 and peripheral information of the moving body 10 .
- the sensor 10 B is, for example, an external sensor or an internal sensor.
- the internal sensor is a sensor to observe observation information.
- the observation information includes, for example, an acceleration rate of the moving body 10 , a speed of the moving body 10 , and an angular velocity of the moving body 10 .
- the internal sensor is, for example, an inertial measurement unit (IMU), an acceleration sensor, a speed sensor, a rotary encoder, or the like.
- the IMU observes observation information including a triaxial acceleration rate and a triaxial angular velocity of the moving body 10 .
- the external sensor observes peripheral information of the moving body 10 .
- the external sensor may be mounted on the moving body 10 or may be mounted on the outside of the moving body 10 (for example, another moving body, an external device, or the like).
- the peripheral information is information to indicate a situation in the periphery of the moving body 10 .
- the periphery of the moving body 10 is a region within a predetermined range from the moving body 10 .
- the range is a range that can be observed by the external sensor.
- the range may be preliminarily set.
- the peripheral information includes, for example, a photographed image and distance information of the periphery of the moving body 10 .
- the peripheral information may include positional information of the moving body 10 .
- a photographed image is image data obtained by photographing (hereinafter may be simply referred to as a photographed image).
- the distance information is information indicating a distance from the moving body 10 to an object.
- the object is anything in the external world that can be observed by the external sensor.
- the positional information may be a relative position or an absolute position.
- the external sensor includes, for example, a photographing device to obtain a photographed image by photographing, a distance sensor (a millimeter wave radar, a laser sensor, or a distance image sensor), a position sensor (a global navigation satellite system (GNSS), a global positioning system (GPS), or a radio communication device), and the like.
- a photographed image includes digital image data in which a pixel value is defined per pixel, a depth map in which a distance from the sensor 10 B is defined per pixel, and the like.
- the laser sensor is, for example, a two-dimensional laser imaging detection and ranging (LIDAR) sensor or a three-dimensional LIDAR sensor installed parallel to a horizontal plane.
- the input device 10 C receives various kinds of commands and information input from a user.
- the input device 10 C is, for example, a pointing device such as a mouse or a track ball, or an input device such as a keyboard. Additionally, the input device 10 C may be an input function on a touch panel integrally provided with the display 10 E.
- the power output controller 10 G controls the power output unit 10 H.
- the power output unit 10 H is a device mounted on the moving body 10 and used for driving.
- the power output unit 10 H includes, for example, an engine, a motor, a wheel, and the like.
- the power output unit 10 H is driven under the control of the power output controller 10 G.
- the power output controller 10 G determines a peripheral situation on the basis of output information generated by the gradient estimation device 20 , information obtained from the sensor 10 B, and the like, and controls an acceleration amount, a brake amount, a steering angle, and the like.
- the power output controller 10 G adjusts the acceleration amount and the brake amount in accordance with gradient magnitude estimated by the gradient estimation device 20 such that the moving body 10 is moved at a desired speed.
- FIG. 2 is a block diagram illustrating an exemplary configuration of the moving body 10 .
- the moving body 10 includes the gradient estimation device 20 , output unit 10 A, sensor 10 B, input device 10 C, power output controller 10 G, and power output unit 10 H.
- the output unit 10 A includes the communication unit 10 D, display 10 E, and speaker 10 F.
- the gradient estimation device 20 , output unit 10 A, sensor 10 B, input device 10 C, and power output controller 10 G are connected via a bus 10 I.
- the power output unit 10 H is connected to the power output controller 10 G.
- the gradient estimation device 20 includes the storage 20 B and a processor 200 .
- the output unit 10 A, sensor 10 B, input device 10 C, power output controller 10 G, processor 200 , and storage 20 B are connected via the bus 10 I.
- At least one of the storage 20 B, output unit 10 A (communication unit 10 D, display 10 E, and speaker 10 F), sensor 10 B, input device 10 C, and power output controller 10 G is connected to the processor 200 by a wire or radio. Additionally, at least one of the storage 20 B, output unit 10 A (communication unit 10 D, display 10 E, and speaker 10 F), sensor 10 B, input device 10 C, and power output controller 10 G is connected to the processor 200 via a network.
- the storage 20 B stores various kinds of data.
- the storage 20 B is, for example, a random access memory (RAM), a semiconductor memory device such as a flash memory, a hard disk, an optical disk, or the like.
- the storage 20 B may also be provided outside the gradient estimation device 20 .
- the storage 20 B may also be provided outside the moving body 10 .
- the storage 20 B may also be arranged in a server device installed on a cloud.
- the storage 20 B may also be a storage medium.
- the storage medium may be the one that downloads or temporarily stores a program and various kinds of information via a local area network (LAN), the Internet, or the like.
- the storage 20 B may be formed of a plurality of storage media.
- the processor 200 includes an acquisition unit 201 , a determination unit 202 , an extraction unit 203 , and a calculation unit 204 .
- Each of the processing functions in the processor 200 is stored in the storage 20 B in a form of a program that can be executed by a computer.
- the processor 200 is a processor that implements a functional unit corresponding to each program by reading and executing the program from the storage 20 B.
- the processor 200 having read each program has each of the functional units illustrated in the processor 200 of FIG. 2 .
- In FIG. 2 , a description will be provided assuming that the acquisition unit 201 , determination unit 202 , extraction unit 203 , and calculation unit 204 are implemented by the single processor 200 .
- the processor 200 may be formed by combining a plurality of independent processors in order to implement each of the functions.
- each processor implements each function by executing a program.
- there may be a case where each processing function is formed as a program and one processing circuit executes each program, or a case where a specific function is installed in a dedicated independent program execution circuit.
- the term "processor" used in the present embodiment represents circuits of, for example, a central processing unit (CPU), a graphical processing unit (GPU), an application specific integrated circuit (ASIC), or a programmable logic device (for example, a simple programmable logic device (SPLD), a complex programmable logic device (CPLD), or a field programmable gate array (FPGA)).
- the processor implements a function by reading and executing a program stored in the storage 20 B.
- a program may also be directly incorporated inside a circuit of the processor instead of storing a program in the storage 20 B.
- the processor implements a function by reading and executing the program incorporated in the circuit.
- the acquisition unit 201 acquires images (first image and second image) obtained by photographing a plane.
- the acquisition unit 201 acquires at least two images obtained by photographing the plane of a region for which a gradient is measured, and each image is provided with a position and a posture of the photographing device (camera) that photographed it.
- the acquisition unit 201 may directly acquire an image photographed by the camera from the camera, or may acquire an image stored in the storage medium or the like. Meanwhile, in a case where a position and a posture of the photographing device can be calculated (estimated) by, for example, analyzing an image, the acquisition unit 201 may acquire an image not provided with a position and a posture of the photographing device.
- FIG. 3 is a diagram illustrating exemplary acquired images.
- FIG. 3 illustrates exemplary images obtained by photographing a region 421 for which a gradient is estimated while a camera is installed at a windshield of an automobile (example of the moving body 10 ) in a manner facing forward.
- An image 411 represents, for example, an image photographed from an automobile 401 at a time (t ⁇ 1).
- An image 412 represents, for example, an image photographed from an automobile 402 at a time t.
- the automobile 402 corresponds, for example, to the automobile 401 that has been moved to a gradient having magnitude 423 .
- the image 412 includes a region 422 corresponding to the region 421 for which the gradient is estimated.
- two images photographed at different times are used, but the two images to be processed are not limited thereto.
- any image can be used.
- two images photographed at the same time by a stereo camera or the like may also be used.
- the determination unit 202 determines a gradient direction of a plane on an acquired image.
- FIG. 4 is a view to describe a method of determining a gradient direction of a plane. For example, as illustrated in FIG. 4 , the determination unit 202 determines, as a gradient direction of a plane on an image, a direction 502 obtained by projecting, on an image, a direction 501 of a straight line directed from an image acquisition position toward a region 421 for which a gradient is estimated on a three-dimensional space of the real world.
- the method of determining a gradient direction of a plane is not limited thereto, and any method may be used.
- the determination unit 202 may determine, as a gradient direction of a plane, a direction 602 directed from one side of the image (for example, a lower end portion of an image) to a vanishing point 601 .
- the determination unit 202 may determine, as a gradient direction of a plane, an advancing direction 701 of the moving body 10 on an image.
- a gradient direction of a plane can also be preliminarily fixed and set.
- the determination unit 202 can preliminarily set, as a gradient direction, a direction from a lower side to an upper side of an image.
- the extraction unit 203 extracts, from an image, pixels (a pixel column) in a direction different from the determined gradient direction. For example, the extraction unit 203 extracts, from one (first image) of two images, pixels (first pixels) indicating a boundary in the direction different from the gradient direction of the plane out of boundaries of objects photographed in this image. Additionally, the extraction unit 203 extracts, from the other one (second image) of the two images, pixels (second pixels) corresponding to the first pixels. For example, the extraction unit 203 may extract, from the second image, pixels (second pixels) indicating a boundary in a direction different from the gradient direction of the plane out of the boundaries of the objects photographed in this image.
- the extraction unit 203 may take, as a third pixel, the pixel in the second image located at the same coordinate as the first pixel, search the periphery of the third pixel, and extract the second pixel matched with the first pixel as the corresponding pixel. Specifically, the search can be made by a method such as block matching using pixels in the periphery of the first pixel. Besides, the extraction unit 203 may extract a point having large luminance change as a feature point on the second image, and may extract, as the second pixel corresponding to the first pixel, the center pixel of the feature point most similar to a feature amount in the periphery of the first pixel.
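The correspondence search described above can be sketched as a minimal block-matching routine. The function names, the window radius, and the sum-of-absolute-differences cost are illustrative assumptions; the embodiment only requires that the second pixel be found by searching the periphery of the pixel at the same coordinate as the first pixel.

```python
def sad(patch_a, patch_b):
    """Sum of absolute differences between two equally sized patches."""
    return sum(abs(a - b) for row_a, row_b in zip(patch_a, patch_b)
               for a, b in zip(row_a, row_b))

def patch(img, y, x, r):
    """(2r+1)x(2r+1) patch centred on (y, x); assumes it fits inside img."""
    return [row[x - r:x + r + 1] for row in img[y - r:y + r + 1]]

def find_second_pixel(img1, img2, y1, x1, r=1, search=2):
    """Start from the 'third pixel' (same coordinate in img2 as the first
    pixel in img1) and search its periphery for the position whose
    neighbourhood best matches the first pixel's neighbourhood."""
    template = patch(img1, y1, x1, r)
    best, best_pos = None, (y1, x1)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y2, x2 = y1 + dy, x1 + dx
            cost = sad(template, patch(img2, y2, x2, r))
            if best is None or cost < best:
                best, best_pos = cost, (y2, x2)
    return best_pos
```

Images are assumed to be grayscale, represented as lists of rows; a real implementation would also guard against patches falling outside the image.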
- a direction different from the gradient direction may be preset as a direction of a range in which a pixel that influences gradient estimation accuracy can be excluded, for example.
- a direction within a preset range from the direction perpendicular to the gradient direction can be set as a direction different from the gradient direction. Consequently, it is possible to exclude a boundary extending in the gradient direction (or a direction close to the gradient direction), such as a white line painted on a road surface in the advancing direction.
- a weight can be added in order to reduce the influence such that the closer to the perpendicular direction, the more the influence on a matching degree described below is increased.
- for example, a maximum value and a minimum value of the weight can be preliminarily set, and the weight can be varied linearly in accordance with the angle: the maximum value is applied to a boundary perpendicular to the gradient direction and the minimum value to a boundary parallel to the gradient direction.
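The linear weighting just described can be sketched as follows; the function name and the default weight limits are illustrative assumptions.

```python
def edge_weight(edge_angle_deg, gradient_angle_deg, w_min=0.1, w_max=1.0):
    """Weight an edge pixel by how close its direction is to being
    perpendicular to the gradient direction.

    0 deg difference (parallel to the gradient direction)  -> w_min
    90 deg difference (perpendicular)                      -> w_max
    Linear in between, as described in the text.
    """
    # Smallest angle between the two directions, folded into [0, 90].
    diff = abs(edge_angle_deg - gradient_angle_deg) % 180.0
    if diff > 90.0:
        diff = 180.0 - diff
    return w_min + (w_max - w_min) * (diff / 90.0)
```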
- the boundary of the object in the direction different from the gradient direction is, for example, a boundary in a direction different from the advancing direction out of boundaries of objects such as a crack generated on a road surface, a line (stop line or the like) other than the line indicating the advancing direction, and a broken line painted in the advancing direction (a dashed center line or the like).
- the boundary of the object can be detected by edge detection processing, for example.
- the extraction unit 203 extracts, as a pixel in a direction different from a gradient direction, a pixel located on an edge in a direction different from the gradient direction.
- by edge detection processing, a pixel having intense luminance change in the gradient direction can be extracted.
- FIG. 7 is a view illustrating exemplary pixels to be extracted.
- FIG. 7 illustrates exemplary pixels (edge) 801 to 804 having intense luminance change and extracted in a case where a gradient direction is a vertical direction of an image.
- the pixels 801 and 802 are exemplary pixels detected as boundaries of objects corresponding to parts of a dashed center line.
- the pixels 803 and 804 are exemplary pixels detected as boundaries of objects corresponding to a crack made on a road surface.
- the number of pixels (pixel column) to be extracted may be plural or may be one.
- pixels on an edge in a lateral direction perpendicular to the vertical direction are pixels having more intense luminance change.
- the extraction unit 203 detects a horizontal edge whose response is larger than a threshold value by using an edge detection filter, such as a Sobel filter, that detects horizontal edges, and then extracts a pixel on the edge as a pixel in the direction different from the gradient direction.
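A minimal sketch of this extraction step, assuming grayscale images represented as lists of rows and the standard vertical Sobel kernel (which responds to luminance change along the vertical gradient direction, i.e. to horizontal edges):

```python
# Sobel kernel responding to luminance change in the vertical
# (gradient) direction, i.e. to horizontal edges.
SOBEL_Y = [[-1, -2, -1],
           [ 0,  0,  0],
           [ 1,  2,  1]]

def horizontal_edge_pixels(img, threshold):
    """Return (y, x) coordinates of pixels whose vertical Sobel response
    exceeds the threshold -- candidate pixels on boundaries crossing the
    (vertical) gradient direction."""
    h, w = len(img), len(img[0])
    edges = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            g = sum(SOBEL_Y[j][i] * img[y - 1 + j][x - 1 + i]
                    for j in range(3) for i in range(3))
            if abs(g) > threshold:
                edges.append((y, x))
    return edges
```

The threshold value and the list-of-rows image format are assumptions for illustration; the embodiment only requires some horizontal-edge filter with a threshold.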
- the method of extracting pixels in the direction different from the gradient direction is not limited thereto.
- the extraction unit 203 extracts, as pixels in the direction different from the gradient direction, edges that are other than the white lines corresponding to a lane detected by the lane detection function from among the edges detected from an image.
- the pixels in the direction different from the gradient direction may also be obtained from an acquired image by using a learning model subjected to learning in advance so as to extract and output pixels in the direction different from the gradient direction from an input image.
- the learning model may be any type of model such as a neural network model.
- the calculation unit 204 calculates gradient magnitude of a plane (gradient angle) by using the extracted pixels. For example, the calculation unit 204 transforms at least one of two images such that a matching degree between a pixel (first pixel) extracted from one (first image) of the two images and a pixel (second pixel) extracted from the other one (second image) becomes higher than that before transformation. The calculation unit 204 outputs, as an estimation value of the gradient magnitude, the gradient magnitude of the plane when the matching degree becomes higher.
- the calculation unit 204 transforms one of the two images by using a homography matrix H(p) that can be calculated from: a certain gradient angle p; a translation vector T indicating a relative positional change between images; and a rotation matrix R indicating a change of posture.
- the calculation unit 204 estimates, as a gradient angle (gradient magnitude), a gradient angle p at which the matching degree between the transformed image and the other image becomes higher.
- the homography matrix H(p) can be expressed similarly to the homography matrix P (Expression (1)) of "Efficient Plane-parameter Estimation Using Stereo Images" by Shigeki Sugimoto et al., Information Processing Society of Japan Transactions, Vol. 48, No. SIG1, pp. 24-34, February 2007 (non-patent document), for example.
- a normal vector n, a rotation matrix R, and a translation vector t on a plane of the non-patent document correspond to the gradient angle p, rotation matrix R, and translation vector T of the present embodiment.
- the rotation matrix R and the translation vector T can be acquired from, for example, a position and a posture of a photographing device provided to an image.
- a distance to a plane can be set as a fixed value in accordance with an installation height of the photographing device, for example.
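Under these assumptions, building the homography can be sketched as below, following the plane-induced relation H(p) = R + T·n(p)ᵀ/d used in the cited non-patent document. The specific parametrization of the road normal n(p) (a tilt of the downward normal about the camera x-axis) is an illustrative assumption; the patent only states that H(p) is computable from p, R, and T.

```python
import math

def homography_from_gradient(p, R, T, d):
    """Plane-induced homography H(p) = R + T * n(p)^T / d.

    p : gradient angle in radians (assumed to tilt the road normal
        about the camera x-axis -- an illustrative parametrization)
    R : 3x3 rotation between the two camera poses
    T : translation between the poses (length-3 list)
    d : distance from the camera to the plane (e.g. a fixed value
        from the installation height of the photographing device)
    """
    # Road normal in camera coordinates (x right, y down, z forward),
    # tilted by the gradient angle p.
    n = [0.0, -math.cos(p), math.sin(p)]
    return [[R[i][j] + T[i] * n[j] / d for j in range(3)]
            for i in range(3)]
```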
- FIGS. 8 and 9 are views illustrating exemplary image transformation.
- FIG. 8 illustrates an example in a case where, out of the image 411 and the image 412 shown in FIG. 3 , the image 411 is transformed to generate an image 912 .
- FIG. 8 illustrates an example in which there is no error in the translation vector T and the rotation matrix R and an estimated gradient angle p is equal to a true value.
- in this case, the images of the estimated plane regions, for example, the region 422 in the image 412 and a region 922 in the image 912 after transformation, are matched.
- FIG. 9 illustrates an example in which there are errors in the translation vector T and the rotation matrix R and an estimated gradient angle p is not equal to a true value.
- Solid lines in an image 1012 - 1 represent an image transformed by the homography matrix H(p 1 ) calculated from p 1 that differs from a true value of the gradient angle p. Dashed lines represent an image 412 displayed in order to easily view a matching result. Note that the image 412 to be compared is also displayed in a superimposed manner in FIG. 9 .
- Each of arrows 1021 - 1 represents a matching result between a pixel in the direction different from the gradient direction in the image 1012 - 1 and a pixel in the direction different from the gradient direction in the image 412 .
- An image 1012 - 2 represents an image transformed by a homography matrix H(p 2 ) calculated from p 2 close to the true value of the gradient angle p.
- Each of arrows 1021 - 2 represents a matching result between a pixel in the direction different from the gradient direction in the image 1012 - 2 and a pixel in the direction different from the gradient direction in the image 412 .
- the calculation unit 204 performs matching for a pixel in a direction different from the gradient direction between images in order to calculate a matching degree. For example, the calculation unit 204 performs matching between regions having similar feature amounts in a periphery of a pixel on the image.
- the periphery of a pixel is, for example, a region in a predetermined range including the pixel.
- the feature amount may be luminance of an image or may be a feature amount such as a zero-mean normalized cross-correlation (ZNCC) calculated from luminance arrangement in the periphery of a pixel, or accelerated KAZE (AKAZE).
- an error is usually present in the translation vector T and the rotation matrix R; therefore, the pixels do not match completely even when the gradient angle p is changed, as with p 1 and p 2 .
- the calculation unit 204 determines, as the image having the highest matching degree, the image transformed by the gradient angle p at which the total distance between the pixels to be matched on the image is shortest, from among the images obtained by varying the gradient angle p.
- the calculation unit 204 outputs, as an estimation value of gradient magnitude, the gradient angle p that transforms the image to have the highest matching degree.
- the calculation unit 204 transforms one image by using a plurality of estimation values indicating gradient angles (gradient magnitude), and calculates a matching degree with the other image.
- the calculation unit 204 outputs, from among the plurality of estimation values, an estimation value having the matching degree higher than other estimation values.
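The search described above can be sketched as follows. `transform` and `matching_degree` are hypothetical placeholders standing in for the homography-based image transformation and the matching-degree calculation; this is a schematic of the candidate search, not the patented implementation:

```python
def estimate_gradient(image, reference, candidates, transform, matching_degree):
    # Transform one image with each candidate gradient angle p, compare the
    # result against the reference (comparison) image, and keep the candidate
    # whose transformed image has the highest matching degree.
    best_p, best_score = None, float("-inf")
    for p in candidates:
        warped = transform(image, p)
        score = matching_degree(warped, reference)
        if score > best_score:
            best_p, best_score = p, score
    return best_p
```

The returned candidate is the estimation value of the gradient magnitude in the sense of the description above: the angle whose transformation yields the highest matching degree among those tried.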
- the gradient magnitude estimation method is not limited thereto. Any method may be applied as long as it can calculate the gradient magnitude at which the matching degree between the transformed image and the comparison image becomes higher.
- the calculation unit 204 may weight the distance such that the smaller the distance between the feature amounts used for matching is (that is, the more similar the feature amounts are), the shorter the weighted distance becomes.
- the calculation unit 204 may also use a matching degree whose value becomes larger as the similarity of the feature amounts between the pixels to be matched becomes higher.
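One way to realize such a similarity-weighted matching degree is sketched below. The specific weighting formula is an assumption for illustration, not the patent's exact definition:

```python
def weighted_matching_degree(pairs):
    # pairs: (pixel_distance, feature_similarity) per matched pixel pair,
    # with similarity in [0, 1]. Higher similarity shortens the effective
    # distance, and the matching degree grows as the total weighted
    # distance shrinks (negated so that larger means a better match).
    total = sum(d * (1.0 - s) for d, s in pairs)
    return -total
```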
- the calculation unit 204 may use a method of calculating a plane parameter by first obtaining a transformation parameter (homography matrix H(p)) for image transformation and then decomposing it.
- the calculation unit 204 outputs, as an estimation value of gradient magnitude, the gradient angle p 2 at which the matching degree becomes higher, for example.
- FIG. 10 is a flowchart illustrating exemplary gradient estimation processing according to the first embodiment.
- the acquisition unit 201 acquires two images obtained by photographing a plane (step S 101 ).
- the determination unit 202 determines a gradient direction of the plane on the acquired images (step S 102 ).
- the extraction unit 203 extracts, from each of the two images, pixels in a direction different from the determined gradient direction (step S 103 ). For example, the extraction unit 203 extracts an edge extending in the direction different from the gradient direction in each of the images.
- the calculation unit 204 transforms at least one of the two images (step S 104 ).
- the calculation unit 204 calculates gradient magnitude on the basis of an image having a higher matching degree with the other image (step S 105 ).
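The steps S 101 to S 105 above can be sketched as a single pipeline. All function arguments here are hypothetical placeholders for the units described in the embodiment:

```python
def gradient_estimation(image_pair, candidates, determine_direction,
                        extract_edges, transform, matching_degree):
    # Step S101: two images obtained by photographing the plane.
    img_a, img_b = image_pair
    # Step S102: determine the gradient direction of the plane on the images.
    direction = determine_direction(img_a, img_b)
    # Step S103: extract pixels (edges) in a direction different from
    # the determined gradient direction, in each image.
    edges_a = extract_edges(img_a, direction)
    edges_b = extract_edges(img_b, direction)
    # Steps S104-S105: transform one image per candidate gradient angle and
    # keep the angle whose transformed edges best match the other image's.
    return max(candidates,
               key=lambda p: matching_degree(transform(edges_a, p), edges_b))
```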
- the estimated gradient magnitude may be used to control movement of the moving body 10 , for example.
- the power output controller 10 G controls an acceleration rate and the like of the moving body 10 in accordance with the gradient magnitude.
- operation of a function such as obstacle detection may also be controlled in accordance with the estimated gradient magnitude.
- for example, whether to apply the obstacle detection function may be determined in accordance with the estimated gradient magnitude.
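As an illustration of such gradient-dependent control, a controller might scale the commanded acceleration with the estimated gradient. The control law and constants below are purely hypothetical; the patent does not specify how the power output controller uses the gradient:

```python
def adjust_acceleration(base_accel, gradient_deg, gain=0.05):
    # Increase commanded acceleration on an uphill gradient (positive angle)
    # and reduce it downhill, so the moving body holds its intended speed.
    # gain is an illustrative tuning constant, not a value from the patent.
    return base_accel * (1.0 + gain * gradient_deg)
```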
- gradient magnitude of a plane is estimated by determining a matching degree between pixels aligned in a direction different from the gradient direction of the plane. Consequently, influence of a white line extending in the gradient direction can be reduced and gradient estimation accuracy can be improved, for example.
- a gradient estimation device enables setting for a region in an image used to estimate gradient magnitude.
- FIG. 11 is a block diagram illustrating an exemplary configuration of the moving body 10 - 2 according to the second embodiment.
- a moving body 10 - 2 includes a gradient estimation device 20 - 2 , an output unit 10 A, a sensor 10 B, an input device 10 C, a power output controller 10 G, and a power output unit 10 H.
- the gradient estimation device 20 - 2 includes storage 20 B and a processor 200 - 2 .
- the second embodiment differs from the first embodiment in that a setting unit 205 - 2 is added to the processor 200 - 2 and the function of an extraction unit 203 - 2 is different. Other configurations and functions are similar to those in FIG. 1 , which is a block diagram of the gradient estimation device 20 according to the first embodiment; the same reference signs are therefore assigned thereto and a description thereof will be omitted.
- the setting unit 205 - 2 sets a region in an image.
- a region to be set is, for example, a candidate region to be a candidate for a region including a plane.
- the number of candidate regions to be set may be one or plural.
- the setting unit 205 - 2 may set a plurality of regions aligned in a gradient direction as the candidate regions.
- the setting unit 205 - 2 outputs the set candidate regions to the extraction unit 203 - 2 .
- the setting unit 205 - 2 may set a preset region on an image as a candidate region, or may set a candidate region at a changed candidate position when the candidate position of a desired plane is changed. For example, the distance to the position where the gradient is desired to be acquired may be changed, and the position of the candidate region on the image may be moved vertically in accordance with the moving speed of the moving body 10 - 2 .
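Moving the candidate region vertically with speed can be sketched as follows. The function name and tuning constants are assumptions for illustration only:

```python
def candidate_region_top(image_height, speed, base_top=0.8, gain=0.02):
    # Returns the row (counted from the image top, row 0) where the candidate
    # region starts. A faster moving body needs the gradient farther ahead,
    # so the region moves upward (smaller row) as speed increases. base_top
    # and gain are illustrative constants, and the region is clamped so it
    # never rises above the vertical middle of the image.
    top = base_top - gain * speed
    return int(image_height * max(0.5, min(base_top, top)))
```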
- FIGS. 12 and 13 are views illustrating exemplary candidate regions to be set.
- FIG. 12 illustrates an example in which a candidate region 1201 is set at a bottom portion of an image in order to estimate gradient at a position closer to the moving body 10 - 2 .
- FIG. 13 illustrates an example in which a candidate region 1301 is moved upward in the image in order to estimate gradient at a position farther than in FIG. 12 .
- the setting unit 205 - 2 may set a candidate region by enlarging or reducing the candidate region.
- FIG. 14 is a diagram illustrating an exemplary candidate region 1401 set by reducing the candidate region. As illustrated in FIG. 14 , the setting unit 205 - 2 may set the candidate region by reducing it in a case where the candidate region is moved in the upward direction of the gradient.
- the setting unit 205 - 2 may detect a road surface from an image by a method to perform collation with a road surface pattern or the like, and may set a candidate region only within a detected road surface.
- FIGS. 15 and 16 are views each illustrating an exemplary candidate region set in a road surface.
- a candidate region 1501 in FIG. 15 illustrates an exemplary candidate region set in a trapezoidal shape along a road surface shape at a bottom portion of an image in order to estimate gradient at a position closer to the moving body 10 - 2 .
- a candidate region 1601 in FIG. 16 illustrates an exemplary candidate region moved upward in the image and set in a trapezoidal shape along the road surface shape in order to estimate gradient at a position farther than in FIG. 15 .
- FIG. 17 is a view illustrating an example of setting a plurality of candidate regions.
- FIG. 17 illustrates the example in which two candidate regions 1701 and 1702 are set.
- the extraction unit 203 - 2 differs from the extraction unit 203 of the first embodiment in that it extracts, from the region in each image set by the setting unit 205 - 2 , pixels in a direction different from the gradient direction.
- FIG. 18 is a flowchart illustrating exemplary gradient estimation processing according to the second embodiment.
- Since the processing in steps S 201 to S 202 is similar to that in steps S 101 to S 102 in the gradient estimation device 20 according to the first embodiment, the description thereof will be omitted.
- the setting unit 205 - 2 sets a candidate region of a plane in an acquired image (step S 203 ).
- the extraction unit 203 - 2 performs edge extraction processing while setting the set candidate region as a target (step S 204 ).
- Since the processing in steps S 205 to S 206 is similar to that in steps S 104 to S 105 in the gradient estimation device 20 according to the first embodiment, the description thereof will be omitted.
- a calculation unit 204 estimates gradient magnitude for each of the regions and outputs each estimation value. For example, the calculation unit 204 outputs, without change, the gradient magnitude estimated for each of the regions. The calculation unit 204 also calculates a difference in gradient magnitude between the respective regions (a change in the gradient magnitude), and in a case where the difference is larger than a threshold value, outputs output information indicating that there is a change point of the gradient magnitude. A distance to the change point may also be output, derived from the position, on the image, of the region having such a change point.
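The change-point check over multiple regions can be sketched as follows; the function name and the per-region input format are assumptions for illustration:

```python
def detect_gradient_change(region_gradients, threshold):
    # region_gradients: estimated gradient magnitude per candidate region,
    # ordered from near to far along the gradient direction. Report the
    # index of the first region whose gradient differs from the previous
    # region's by more than the threshold, i.e. a change point; None if
    # no such change point exists.
    for i in range(len(region_gradients) - 1):
        if abs(region_gradients[i + 1] - region_gradients[i]) > threshold:
            return i + 1
    return None
```

The returned region index could then be mapped back to a distance from the moving body, as the description suggests, via that region's position on the image.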
- a region in an image used to estimate gradient magnitude is set. Consequently, estimation accuracy for gradient magnitude of a plane can be further improved.
- a camera is installed horizontally with respect to a plane in a manner facing forward.
- the horizontal, perpendicular, vertical, and lateral directions may be changed in accordance with the installation direction.
- characteristic portions in the gradient direction can be matched by utilizing an image transformation result in which pixels in a direction different from the gradient direction of the plane are matched.
- estimation accuracy for the gradient magnitude of a plane can be improved, and gradient estimation can be executed robustly.
- FIG. 19 is an explanatory diagram illustrating an exemplary hardware configuration of the gradient estimation devices according to the first or second embodiment.
- the gradient estimation device includes a control device such as a central processing unit (CPU) 51 , storage devices such as a read only memory (ROM) 52 and a random access memory (RAM) 53 , a network communication I/F 54 to perform communication by being connected to a network, and a bus 61 to connect the respective units.
- a program executed by the gradient estimation device according to the first or second embodiment is provided by being incorporated in advance in the ROM 52 or the like.
- the program executed by the gradient estimation device may be recorded as a file in a computer readable storage medium, such as a compact disk read only memory (CD-ROM), a flexible disk (FD), a compact disk recordable (CD-R), or a digital versatile disk (DVD), in an installable format or an executable format, and may be provided as a computer program product.
- a program may be stored on a computer connected to a network such as the Internet, and may be provided by being downloaded via the network. Furthermore, the program executed by the gradient estimation device according to the first or second embodiment may be provided or distributed via a network such as the Internet.
- the program executed by the gradient estimation device according to the first or second embodiment may be capable of causing a computer to function as each unit of the above-described gradient estimation device.
- in this case, the computer reads the program from the computer readable storage medium onto a main storage device, and the CPU 51 executes it.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Mechanical Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Electromagnetism (AREA)
- Transportation (AREA)
- Geometry (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims (10)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017235717A JP6901386B2 (en) | 2017-12-08 | 2017-12-08 | Gradient Estimator, Gradient Estimator, Program and Control System |
| JP2017-235717 | 2017-12-08 | ||
| JPJP2017-235717 | 2017-12-08 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20190180116A1 US20190180116A1 (en) | 2019-06-13 |
| US11087145B2 true US11087145B2 (en) | 2021-08-10 |
Family
ID=63528600
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/125,312 Active 2038-11-29 US11087145B2 (en) | 2017-12-08 | 2018-09-07 | Gradient estimation device, gradient estimation method, computer program product, and controlling system |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11087145B2 (en) |
| EP (1) | EP3496040A1 (en) |
| JP (1) | JP6901386B2 (en) |
| KR (1) | KR102117313B1 (en) |
| CN (1) | CN109900245B (en) |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102017217005A1 (en) * | 2017-09-26 | 2019-03-28 | Robert Bosch Gmbh | Method for determining the slope of a roadway |
| CN112184792B (en) * | 2020-08-28 | 2023-05-26 | 辽宁石油化工大学 | Road gradient calculation method and device based on vision |
| CN114155304B (en) * | 2020-09-04 | 2024-12-13 | 株式会社理光 | Spatial plane image comparison method, device and computer readable storage medium |
| CN112660125B (en) * | 2020-12-26 | 2023-04-07 | 江铃汽车股份有限公司 | Vehicle cruise control method and device, storage medium and vehicle |
| JP7739020B2 (en) * | 2021-03-30 | 2025-09-16 | キヤノン株式会社 | Image processing device, image processing method, mobile device, and computer program |
| CN117532624B (en) * | 2024-01-10 | 2024-03-26 | 南京东奇智能制造研究院有限公司 | Automatic positioning and aligning method and system for guardrail plate installation |
Citations (33)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5550937A (en) * | 1992-11-23 | 1996-08-27 | Harris Corporation | Mechanism for registering digital images obtained from multiple sensors having diverse image collection geometries |
| US6408105B1 (en) * | 1998-05-12 | 2002-06-18 | Advantest Corporation | Method for detecting slope of image data utilizing hough-transform |
| US6408109B1 (en) * | 1996-10-07 | 2002-06-18 | Cognex Corporation | Apparatus and method for detecting and sub-pixel location of edges in a digital image |
| US6519372B1 (en) * | 1999-08-31 | 2003-02-11 | Lockheed Martin Corporation | Normalized crosscorrelation of complex gradients for image autoregistration |
| US20030223615A1 (en) * | 2002-06-04 | 2003-12-04 | Keaton Patricia A. | Digital image edge detection and road network tracking method and system |
| US20050018043A1 (en) * | 2000-02-29 | 2005-01-27 | Kabushiki Kaisha Toshiba | Obstacle detection apparatus and method |
| US20060013439A1 (en) * | 2001-05-23 | 2006-01-19 | Kabushiki Kaisha Toshiba | System and method for detecting obstacle |
| US20060210116A1 (en) * | 2005-03-18 | 2006-09-21 | Honda Elesys Co., Ltd. | Lane recognition apparatus |
| US20070070365A1 (en) * | 2005-09-26 | 2007-03-29 | Honeywell International Inc. | Content-based image retrieval based on color difference and gradient information |
| US20070110319A1 (en) * | 2005-11-15 | 2007-05-17 | Kabushiki Kaisha Toshiba | Image processor, method, and program |
| US20080063300A1 (en) * | 2006-09-11 | 2008-03-13 | Porikli Fatih M | Image registration using joint spatial gradient maximization |
| US20080253606A1 (en) * | 2004-08-11 | 2008-10-16 | Tokyo Institute Of Technology | Plane Detector and Detecting Method |
| US20090041337A1 (en) * | 2007-08-07 | 2009-02-12 | Kabushiki Kaisha Toshiba | Image processing apparatus and method |
| US20090123070A1 (en) * | 2007-11-14 | 2009-05-14 | Itt Manufacturing Enterprises Inc. | Segmentation-based image processing system |
| EP2177425A1 (en) | 2007-08-10 | 2010-04-21 | Equos Research Co., Ltd. | Vehicle |
| US20100134444A1 (en) * | 2007-03-30 | 2010-06-03 | Yoichiro Yahata | Image processing device, control program, computer-readable storage medium, electronic apparatus, and image processing device control method |
| US20110293190A1 (en) * | 2006-07-17 | 2011-12-01 | Mitsubishi Denki Kabushiki Kaisha | Image processing for change detection |
| US20120093420A1 (en) * | 2009-05-20 | 2012-04-19 | Sony Corporation | Method and device for classifying image |
| US20120173083A1 (en) * | 2010-12-31 | 2012-07-05 | Automotive Research & Test Center | Vehicle roll over prevention safety driving system and method |
| US20130182896A1 (en) * | 2011-11-02 | 2013-07-18 | Honda Elesys Co., Ltd. | Gradient estimation apparatus, gradient estimation method, and gradient estimation program |
| US20140218525A1 (en) | 2011-05-31 | 2014-08-07 | Stefan Sellhusen | Method for determining a pitch of a camera installed in a vehicle and method for controling a light emission of at least one headlight of a vehicle. |
| EP2793184A1 (en) | 2011-12-13 | 2014-10-22 | Panasonic Intellectual Property Corporation of America | Measurement-target-selecting device, face-shape-estimating device, method for selecting measurement target, and method for estimating face shape |
| US20150003741A1 (en) * | 2013-07-01 | 2015-01-01 | Here Global B.V. | Occlusion Resistant Image Template Matching Using Distance Transform |
| US9013284B2 (en) * | 2011-03-22 | 2015-04-21 | Denso Corporation | Vehicle presence notification apparatus |
| EP2913998A1 (en) | 2014-02-28 | 2015-09-02 | Ricoh Company, Ltd. | Disparity value deriving device, equipment control system, movable apparatus, robot, disparity value deriving method, and computer-readable storage medium |
| WO2016079117A1 (en) * | 2014-11-20 | 2016-05-26 | Autoliv Development Ab | Gradient detection based on perspective-transformed image |
| US20170069072A1 (en) * | 2015-09-03 | 2017-03-09 | Kabushiki Kaisha Toshiba | Image processing apparatus, image processing system, and image processing method |
| US20170078551A1 (en) * | 2015-09-10 | 2017-03-16 | Korea Advanced Institute Of Science And Technology | Apparatus and method for adjusting camera exposure |
| US20170222612A1 (en) * | 2016-01-28 | 2017-08-03 | Harman Becker Automotive Systems Gmbh | System and method for external sound synthesis of a vehicle |
| EP3246205A1 (en) | 2015-01-14 | 2017-11-22 | Koito Manufacturing Co., Ltd. | Control device for vehicular lamp, and vehicular lamp system |
| US9868323B2 (en) * | 2013-05-16 | 2018-01-16 | Anden Co., Ltd. | Vehicle approach alert device |
| US9984277B2 (en) * | 2014-11-24 | 2018-05-29 | Massachusetts Institute Of Technology | Systems, apparatus, and methods for analyzing blood cell dynamics |
| US20190213746A1 (en) * | 2018-01-05 | 2019-07-11 | Panasonic Intellectual Property Management Co., Ltd. | Disparity estimation device, disparity estimation method, and program |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1950844A (en) * | 2004-03-03 | 2007-04-18 | 日本电气株式会社 | Object posture estimation/correlation system, object posture estimation/correlation method, and program for the same |
| JP2015056057A (en) * | 2013-09-12 | 2015-03-23 | トヨタ自動車株式会社 | Posture estimation method and robot |
| JP6515039B2 (en) * | 2016-01-08 | 2019-05-15 | Kddi株式会社 | Program, apparatus and method for calculating a normal vector of a planar object to be reflected in a continuous captured image |
- 2017-12-08 JP JP2017235717A patent/JP6901386B2/en active Active
- 2018-09-07 EP EP18193180.9A patent/EP3496040A1/en active Pending
- 2018-09-07 US US16/125,312 patent/US11087145B2/en active Active
- 2018-09-10 CN CN201811048936.6A patent/CN109900245B/en active Active
- 2018-09-10 KR KR1020180107761A patent/KR102117313B1/en active Active
Patent Citations (34)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5550937A (en) * | 1992-11-23 | 1996-08-27 | Harris Corporation | Mechanism for registering digital images obtained from multiple sensors having diverse image collection geometries |
| US6690842B1 (en) * | 1996-10-07 | 2004-02-10 | Cognex Corporation | Apparatus and method for detection and sub-pixel location of edges in a digital image |
| US6408109B1 (en) * | 1996-10-07 | 2002-06-18 | Cognex Corporation | Apparatus and method for detecting and sub-pixel location of edges in a digital image |
| US6408105B1 (en) * | 1998-05-12 | 2002-06-18 | Advantest Corporation | Method for detecting slope of image data utilizing hough-transform |
| US6519372B1 (en) * | 1999-08-31 | 2003-02-11 | Lockheed Martin Corporation | Normalized crosscorrelation of complex gradients for image autoregistration |
| US20050018043A1 (en) * | 2000-02-29 | 2005-01-27 | Kabushiki Kaisha Toshiba | Obstacle detection apparatus and method |
| US20060013439A1 (en) * | 2001-05-23 | 2006-01-19 | Kabushiki Kaisha Toshiba | System and method for detecting obstacle |
| US20030223615A1 (en) * | 2002-06-04 | 2003-12-04 | Keaton Patricia A. | Digital image edge detection and road network tracking method and system |
| US20080253606A1 (en) * | 2004-08-11 | 2008-10-16 | Tokyo Institute Of Technology | Plane Detector and Detecting Method |
| US20060210116A1 (en) * | 2005-03-18 | 2006-09-21 | Honda Elesys Co., Ltd. | Lane recognition apparatus |
| US20070070365A1 (en) * | 2005-09-26 | 2007-03-29 | Honeywell International Inc. | Content-based image retrieval based on color difference and gradient information |
| US20070110319A1 (en) * | 2005-11-15 | 2007-05-17 | Kabushiki Kaisha Toshiba | Image processor, method, and program |
| US20110293190A1 (en) * | 2006-07-17 | 2011-12-01 | Mitsubishi Denki Kabushiki Kaisha | Image processing for change detection |
| US20080063300A1 (en) * | 2006-09-11 | 2008-03-13 | Porikli Fatih M | Image registration using joint spatial gradient maximization |
| US20100134444A1 (en) * | 2007-03-30 | 2010-06-03 | Yoichiro Yahata | Image processing device, control program, computer-readable storage medium, electronic apparatus, and image processing device control method |
| US20090041337A1 (en) * | 2007-08-07 | 2009-02-12 | Kabushiki Kaisha Toshiba | Image processing apparatus and method |
| EP2177425A1 (en) | 2007-08-10 | 2010-04-21 | Equos Research Co., Ltd. | Vehicle |
| US20090123070A1 (en) * | 2007-11-14 | 2009-05-14 | Itt Manufacturing Enterprises Inc. | Segmentation-based image processing system |
| US20120093420A1 (en) * | 2009-05-20 | 2012-04-19 | Sony Corporation | Method and device for classifying image |
| US20120173083A1 (en) * | 2010-12-31 | 2012-07-05 | Automotive Research & Test Center | Vehicle roll over prevention safety driving system and method |
| US9013284B2 (en) * | 2011-03-22 | 2015-04-21 | Denso Corporation | Vehicle presence notification apparatus |
| US20140218525A1 (en) | 2011-05-31 | 2014-08-07 | Stefan Sellhusen | Method for determining a pitch of a camera installed in a vehicle and method for controling a light emission of at least one headlight of a vehicle. |
| US20130182896A1 (en) * | 2011-11-02 | 2013-07-18 | Honda Elesys Co., Ltd. | Gradient estimation apparatus, gradient estimation method, and gradient estimation program |
| EP2793184A1 (en) | 2011-12-13 | 2014-10-22 | Panasonic Intellectual Property Corporation of America | Measurement-target-selecting device, face-shape-estimating device, method for selecting measurement target, and method for estimating face shape |
| US9868323B2 (en) * | 2013-05-16 | 2018-01-16 | Anden Co., Ltd. | Vehicle approach alert device |
| US20150003741A1 (en) * | 2013-07-01 | 2015-01-01 | Here Global B.V. | Occlusion Resistant Image Template Matching Using Distance Transform |
| EP2913998A1 (en) | 2014-02-28 | 2015-09-02 | Ricoh Company, Ltd. | Disparity value deriving device, equipment control system, movable apparatus, robot, disparity value deriving method, and computer-readable storage medium |
| WO2016079117A1 (en) * | 2014-11-20 | 2016-05-26 | Autoliv Development Ab | Gradient detection based on perspective-transformed image |
| US9984277B2 (en) * | 2014-11-24 | 2018-05-29 | Massachusetts Institute Of Technology | Systems, apparatus, and methods for analyzing blood cell dynamics |
| EP3246205A1 (en) | 2015-01-14 | 2017-11-22 | Koito Manufacturing Co., Ltd. | Control device for vehicular lamp, and vehicular lamp system |
| US20170069072A1 (en) * | 2015-09-03 | 2017-03-09 | Kabushiki Kaisha Toshiba | Image processing apparatus, image processing system, and image processing method |
| US20170078551A1 (en) * | 2015-09-10 | 2017-03-16 | Korea Advanced Institute Of Science And Technology | Apparatus and method for adjusting camera exposure |
| US20170222612A1 (en) * | 2016-01-28 | 2017-08-03 | Harman Becker Automotive Systems Gmbh | System and method for external sound synthesis of a vehicle |
| US20190213746A1 (en) * | 2018-01-05 | 2019-07-11 | Panasonic Intellectual Property Management Co., Ltd. | Disparity estimation device, disparity estimation method, and program |
Non-Patent Citations (5)
| Title |
|---|
| Shigeki Sugimoto, et al., "Efficient Plane-parameter Estimation Using Stereo Images", vol. 48, No. SIG (CVIM)., Feb. 2007, 32 pages (with Machine Generated English Translation). |
| Shigeki Sugimoto, et al., "Fast Plane Parameter Estimation From Stereo Images", Proceedings of the IAPR Conference on Machine Vision Applications (MVA2007), pp. 567-570, May 16, 2007. |
| Westerhoff, J. et al., "Development and Comparison of Homography based Estimation Techniques for Camera to Road Surface Orientation", 2016 IEEE Intelligent Vehicles Symposium (IV), XP032939094, Jun. 19, 2016, pp. 1034-1040. |
| Zhou, J. et al., "Homography-based Ground Detection for A Mobile Robot Platform Using a Single Camera", IEEE International Conference on Robotics and Automation, May 2006, pp. 4100-4105. |
Also Published As
| Publication number | Publication date |
|---|---|
| KR20190068409A (en) | 2019-06-18 |
| CN109900245A (en) | 2019-06-18 |
| JP2019102007A (en) | 2019-06-24 |
| EP3496040A1 (en) | 2019-06-12 |
| CN109900245B (en) | 2021-10-08 |
| US20190180116A1 (en) | 2019-06-13 |
| JP6901386B2 (en) | 2021-07-14 |
| KR102117313B1 (en) | 2020-06-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11087145B2 (en) | Gradient estimation device, gradient estimation method, computer program product, and controlling system | |
| US10909411B2 (en) | Information processing apparatus, information processing method, and computer program product | |
| US12164030B2 (en) | Local sensing based autonomous navigation, and associated systems and methods | |
| US11200432B2 (en) | Method and apparatus for determining driving information | |
| JP7082545B2 (en) | Information processing methods, information processing equipment and programs | |
| EP3245614B1 (en) | Object detection using location data and scale space representations of image data | |
| EP3349143B1 (en) | Information processing device, information processing method, and computer-readable medium | |
| US20180137376A1 (en) | State estimating method and apparatus | |
| KR20200042760A (en) | Vehicle localization method and vehicle localization apparatus | |
| US10789488B2 (en) | Information processing device, learned model, information processing method, and computer program product | |
| US20170344844A1 (en) | Information processing apparatus and information processing method | |
| CN110945379A (en) | Determining yaw error from map data, lasers and cameras | |
| US10885353B2 (en) | Information processing apparatus, moving object, information processing method, and computer program product | |
| US11204610B2 (en) | Information processing apparatus, vehicle, and information processing method using correlation between attributes | |
| KR102056147B1 (en) | Registration method of distance data and 3D scan data for autonomous vehicle and method thereof | |
| JP2018048949A (en) | Object identification device | |
| JP2018189463A (en) | Vehicle position estimating device and program | |
| US20250069256A1 (en) | Multi-person 3d pose estimation | |
| RU2775817C2 (en) | Method and system for training machine learning algorithm for detecting objects at a distance | |
| WO2018212286A1 (en) | Measurement device, measurement method and program | |
| KR20190134905A (en) | Apparatus and method for determining location of vehicle and computer recordable medium storing computer program thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| AS | Assignment |
Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TASAKI, TSUYOSHI;MIYAMOTO, TAKUYA;SUGIURA, TAKAYUKI;SIGNING DATES FROM 20181004 TO 20181016;REEL/FRAME:047274/0606 Owner name: TOSHIBA ELECTRONIC DEVICES & STORAGE CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TASAKI, TSUYOSHI;MIYAMOTO, TAKUYA;SUGIURA, TAKAYUKI;SIGNING DATES FROM 20181004 TO 20181016;REEL/FRAME:047274/0606
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP, ISSUE FEE PAYMENT VERIFIED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |