CN113341942A - Vehicle device and acceleration correction method - Google Patents


Info

Publication number
CN113341942A
Authority
CN
China
Prior art keywords
vehicle
acceleration
reference point
information
offset
Prior art date
Legal status (assumed; not a legal conclusion)
Pending
Application number
CN202010092184.4A
Other languages
Chinese (zh)
Inventor
三浦直树
Current Assignee (the listed assignee may be inaccurate)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Priority to CN202010092184.4A
Publication of CN113341942A


Classifications

    • G05D1/0251: Control of position or course in two dimensions specially adapted to land vehicles, using optical position detecting means, namely a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations (stereo vision)
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated (dead reckoning)
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01P21/00: Testing or calibrating of apparatus or devices covered by the preceding groups
    • G05D1/0221: Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory involving a learning process
    • G05D1/0223: Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0257: Control of position or course in two dimensions specially adapted to land vehicles, using a radar
    • G05D1/0276: Control of position or course in two dimensions specially adapted to land vehicles, using signals provided by a source external to the vehicle
    • G05D1/0278: Control of position or course in two dimensions specially adapted to land vehicles, using signals provided by a source external to the vehicle, namely satellite positioning signals, e.g. GPS

Abstract

The present invention relates to a vehicle device and an acceleration correction method. A vehicle device (10) obtains acceleration information by correcting detection information from an acceleration sensor (14) mounted on a vehicle (12) on the basis of an acceleration reference point (AO). The vehicle device includes an imaging device (40) that images the outside of the vehicle, a vanishing point calculation unit (64) that calculates a vanishing point (VP) from imaging information (PI) captured by the imaging device, and an offset amount calculation unit (66) that calculates an offset amount (α) of the vanishing point with respect to a vanishing point reference value (VO). A reference point setting unit (70) sets the acceleration reference point (AO) by learning the detection information detected by the acceleration sensor (14) on the basis of the offset amount when the current position of the vehicle is at a predetermined gradient. This makes it possible to correct the detection information detected by the acceleration sensor with high accuracy.

Description

Vehicle device and acceleration correction method
Technical Field
The present invention relates to a vehicle device that corrects detection information detected by an acceleration sensor mounted on a vehicle, an acceleration correction method, and a storage medium storing a correction program.
Background
A vehicle such as a four-wheel vehicle is equipped with an acceleration sensor that detects acceleration acting on the vehicle during traveling, parking, and the like. Acceleration information detected by the acceleration sensor is used to recognize various states such as the posture of the vehicle.
For example, Japanese Patent Nos. 4761703 and 4074598 disclose navigation devices having an acceleration sensor that detects the acceleration of a vehicle. The navigation device calculates the amount of change in the vehicle's posture in the up-down direction from the acceleration information in the up-down direction (gravity direction), and determines, for example, whether the vehicle is entering or exiting an expressway near its entrances and exits, using the gradient data of the map information together with the calculated change in vertical posture.
Disclosure of Invention
Since every sensor has its own error and the mounting angle differs from vehicle to vehicle, the acceleration sensor must be calibrated by learning, with the vehicle in a level posture, an acceleration reference point (the so-called zero point) used to correct the detection information. In particular, because the vehicle may stop at a place having a slope, where the reference point cannot reliably be set from a single measurement, an acceleration sensor mounted on a vehicle conventionally sets the acceleration reference point by cumulatively learning detection information gathered at a plurality of places.
However, the vehicle's angle (its level state) also changes when a heavy object is loaded into the compartment (including the trunk). When detection information is accumulated and learned as described above, the acceleration reference point therefore cannot immediately follow a posture change caused by loading or unloading such a weight. Moreover, when the detection information is learned over a long period with a heavy object loaded, the accuracy of the acceleration reference point deteriorates. As a result, the likelihood of an erroneous determination increases greatly when the vehicle device judges entry to or exit from an expressway based on the acceleration information.
The present invention has been made to solve the above-described problems, and an object of the present invention is to provide a vehicle device, an acceleration correction method, and a storage medium storing a correction program, which can correct detection information detected by an acceleration sensor with high accuracy by setting an acceleration reference point more appropriately.
In order to achieve the above object, a first aspect of the present invention is a vehicle device that includes an acceleration sensor mounted on a vehicle and detecting at least the acceleration in the vertical direction, and that obtains acceleration information by correcting the detection information detected by the acceleration sensor based on an acceleration reference point, the vehicle device including an imaging device mounted on the vehicle for imaging the outside of the vehicle, a vanishing point calculating unit, an offset amount calculating unit, and a reference point setting unit; the vanishing point calculating unit calculates a vanishing point from imaging information captured by the imaging device; the offset amount calculating unit calculates an offset amount of the vanishing point with respect to a vanishing point reference value; and the reference point setting unit sets the acceleration reference point by learning the detection information detected by the acceleration sensor based on the offset amount when gradient information of the current position of the vehicle is acquired and the current position of the vehicle is at a predetermined gradient.
In order to achieve the above object, a second aspect of the present invention is an acceleration correction method for obtaining acceleration information by detecting at least the acceleration in the vertical direction with an acceleration sensor mounted on a vehicle and correcting the detection information based on an acceleration reference point, the acceleration correction method including a vanishing point calculating step of calculating a vanishing point based on imaging information from an imaging device that is mounted on the vehicle and images the external environment of the vehicle, an offset amount calculating step, and a reference point setting step; the offset amount calculating step calculates an offset amount of the vanishing point with respect to a vanishing point reference value; and the reference point setting step sets the acceleration reference point by learning the detection information detected by the acceleration sensor based on the offset amount when gradient information of the current position of the vehicle is acquired and the current position of the vehicle is at a predetermined gradient.
In order to achieve the above object, a third aspect of the present invention is a storage medium storing an acceleration correction program for obtaining acceleration information by detecting at least the acceleration in the vertical direction with an acceleration sensor mounted on a vehicle and correcting the detection information based on an acceleration reference point, the correction program executing a reference point setting step of setting the acceleration reference point, the reference point setting step including: setting the acceleration reference point by learning the detection information detected by the acceleration sensor based on the offset amount when the current position of the vehicle is at a predetermined gradient.
The above-described vehicle device, acceleration correction method, and storage medium storing the correction program can set the acceleration reference point more appropriately, and can thereby correct the detection information detected by the acceleration sensor with high accuracy.
The above objects, features and advantages should be readily understood from the following description of the embodiments with reference to the accompanying drawings.
Drawings
Fig. 1A is a side view schematically showing a running state of a vehicle on which a vehicle device according to an embodiment of the present invention is mounted. Fig. 1B is a side view schematically showing a traveling state of a vehicle on which a heavy object is mounted.
Fig. 2 is a block diagram showing the structure of the vehicle device.
Fig. 3A is an explanatory diagram showing imaging information in a case where the posture of the vehicle is horizontal.
Fig. 3B is an explanatory diagram showing imaging information in a case where the posture of the vehicle is inclined.
Fig. 4 is a block diagram showing functions of an image processing apparatus and a control apparatus constituting a device for a vehicle.
Fig. 5A is an explanatory diagram showing a relationship between the amount of displacement of the vanishing point and the detection information in the case where the posture of the vehicle is in the horizontal direction. Fig. 5B is an explanatory diagram showing a relationship between the amount of displacement of the vanishing point and the detection information in the case where the posture of the vehicle is inclined.
Fig. 6 is a flowchart showing calculation of the offset amount of the vanishing point in the acceleration correction method.
Fig. 7 is a first flowchart showing the setting of the acceleration reference point in the acceleration correction method.
Fig. 8 is a second flowchart showing the setting of the acceleration reference point in the acceleration correction method.
Fig. 9 is a flowchart showing the setting of the travel path in the acceleration correction method.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings by way of examples of preferred embodiments.
As shown in fig. 1A, a vehicle device 10 according to an embodiment of the present invention is mounted on a vehicle 12 such as a four-wheeled automobile, and performs appropriate processing such as recognizing the posture of the vehicle 12 from the acceleration acting on the vehicle 12. Therefore, the vehicle device 10 includes the acceleration sensor 14 and the navigation device 16 (processing unit) that processes the detection information detected by the acceleration sensor 14.
For example, the vehicle device 10 calculates a pitch angle, which represents the current attitude of the vehicle 12, from the vertical-direction acceleration information of the acceleration sensor 14. Accordingly, near the exit of the expressway HR, the vehicle device 10 can detect the posture in which the front portion 12a of the vehicle 12 is inclined downward in the direction of gravity by a predetermined degree or more, and can determine from that posture that the vehicle is leaving the expressway HR. Conversely, the vehicle device 10 may determine that the vehicle is entering the expressway HR by detecting, in the vicinity of the entrance, a posture in which the front portion 12a of the vehicle 12 is inclined upward in the direction of gravity by a predetermined degree or more.
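As a rough illustration of the pitch-angle idea above, the following sketch estimates pitch from the gravity components measured along the sensor axes and applies a simple threshold for the downward tilt near an expressway exit. The function names and the threshold value are illustrative assumptions, not taken from the patent.

```python
import math

def estimate_pitch_deg(a_vertical, a_longitudinal):
    """Estimate vehicle pitch from the gravity components measured at rest.

    Hypothetical sketch: when the vehicle is stationary on a slope, gravity
    projects onto the longitudinal sensor axis, so the ratio of the
    longitudinal to the vertical component yields the pitch angle.
    """
    return math.degrees(math.atan2(a_longitudinal, a_vertical))

def is_exiting_expressway(pitch_deg, threshold_deg=-3.0):
    """Front of the vehicle tilted downward beyond an assumed threshold."""
    return pitch_deg <= threshold_deg
```

A level vehicle (gravity entirely on the vertical axis) yields a pitch of zero, so no exit is detected.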
Here, the vehicle device 10 calculates acceleration information from a difference between the detection information of the acceleration sensor 14 and the acceleration reference point AO using the acceleration reference point AO (zero point: see fig. 4) when acquiring the acceleration information of the acceleration sensor 14. The acceleration reference point AO is basically a reference point set based on the gravitational acceleration acting in the vertical direction in the posture of the vehicle 12 in the horizontal direction. Therefore, the vehicle device 10 calculates and sets the acceleration reference point AO by using the detection information of the acceleration sensor 14 at a plurality of points when the vehicle 12 moves.
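The zero-point correction described above amounts to subtracting the learned reference from each raw reading. Below is a minimal sketch, with all names hypothetical, in which the reference point is learned as a simple mean of readings taken at multiple points:

```python
class AccelerationZeroPoint:
    """Minimal sketch of zero-point (acceleration reference point AO) handling.

    All names are illustrative, not from the patent. The reference point is
    learned as the mean of raw readings taken at points believed to be level;
    corrected acceleration is the raw reading minus that reference.
    """

    def __init__(self):
        self._samples = []

    def learn(self, raw_vertical_accel):
        """Accumulate one raw reading taken at a (believed) level point."""
        self._samples.append(raw_vertical_accel)

    @property
    def reference(self):
        # Until something has been learned, fall back to an uncorrected zero.
        if not self._samples:
            return 0.0
        return sum(self._samples) / len(self._samples)

    def correct(self, raw_vertical_accel):
        """Acceleration information = detection information - reference point."""
        return raw_vertical_accel - self.reference
```

Averaging over many points is exactly why, as the next paragraphs explain, the reference cannot react immediately when a heavy load changes the vehicle's posture.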
As shown in fig. 1B, when a load (heavy object W) having a certain weight is loaded in the trunk of the rear portion 12B of the vehicle 12, the front portion 12a of the vehicle 12 is inclined upward in the direction of gravity. Therefore, if the detected information is corrected using the acceleration reference point AO in a state where the weight W is not loaded in a state where the weight W is loaded, the acceleration information reflecting the current posture of the vehicle 12 is not obtained. Therefore, the vehicle device 10 according to the present embodiment recognizes the change in the posture of the vehicle 12 and appropriately calculates the acceleration reference point AO, thereby obtaining highly accurate acceleration information. Next, a detailed configuration of the vehicle device 10 will be described with reference to fig. 2.
The acceleration sensor 14 of the vehicle device 10 is a 3-axis acceleration sensor capable of detecting accelerations in 3 directions (vertical direction (vehicle height direction), front-rear direction (vehicle length direction), and left-right direction (vehicle width direction)) of the vehicle 12. The acceleration sensor 14 is a small sensor, and is configured as a MEMS (micro electro mechanical system) device to be attached to a control device 24 (base) of the navigation device 16.
The acceleration sensor 14 is not particularly limited, and various configurations can be applied as long as it is a sensor that detects at least the acceleration in the vertical direction. For example, the acceleration sensor 14 may also have a function of a gyro sensor that detects an angular velocity. The acceleration sensor 14 may be provided independently of the navigation device 16. As another example, a sensor used for performing automatic driving (including driving assistance) in which the vehicle 12 itself controls the running, a sensor used for detecting an impact applied to the vehicle 12, or the like can be applied to the acceleration sensor 14.
The navigation device 16 calculates the current position of the vehicle 12 by a known positioning mechanism, and displays information indicating the current position on the owned map information MI (see fig. 4). In addition, the navigation device 16 calculates a travel route from the current position of the vehicle 12 and the destination set by the user, and displays the travel route on the map information MI.
Specifically, the navigation device 16 includes, in addition to the acceleration sensor 14, an input/output device 18, a GNSS reception module 20, a sensor group 22, and a control device 24 connected to these devices. The control device 24 is configured as a computer having an input/output interface 26, a processor 28, a memory 30 (storage medium), a timer 32, a communication module 34, and the like.
The input/output device 18 outputs information to the user of the vehicle 12 (the driver or another occupant) and inputs information based on user operations to the control device 24; any apparatus that performs input/output by image or voice can be applied. For example, the input/output device 18 can be configured by appropriately combining (or selecting from) the display 18a, the touch panel 18b, physical buttons, a camera, a speaker, a microphone, and the like.
The GNSS reception module 20 is a device that receives radio waves from a plurality of satellites constituting a satellite positioning system (GNSS), and the navigation device 16 calculates the current position of the vehicle 12 from these radio waves. The navigation device 16 may be configured to calculate the current position of the vehicle 12 using the detection information of the sensor group 22 in addition to the satellite positioning system.
The sensor group 22 is constituted by 1 or more detection devices that detect information used for processing by the navigation device 16. The sensor group 22 includes, for example, a vehicle speed sensor 22a, a geomagnetic sensor 22b, and the like in addition to the acceleration sensor 14 described above.
The control device 24 generates and outputs information required for navigation by the processor 28 executing the program 36 stored in the memory 30. For example, the control device 24 processes various information transmitted from the input-output device 18, the GNSS reception module 20, and the sensor group 22 through the input-output interface 26, and calculates the current position and orientation of the vehicle 12, a travel route to a destination, a travel time, and the like. In addition, the control device 24 stores map information MI necessary for navigation in the memory 30, and superimposes (performs matching) the calculated various information on the map information MI to output to the input-output device 18.
Then, the vehicle device 10 associates the gradient information GI with the plurality of coordinates of the map information MI stored in the memory 30. The gradient information GI is data indicating the degree of inclination (inclination angle) of the road with respect to the horizontal direction orthogonal to the direction of gravity. For example, the gradient information GI of one coordinate is composed of an inclination direction and an inclination angle (value) with respect to a coordinate axis (longitude, latitude). As will be described later, in the present embodiment, since it is sufficient to know the point where the vehicle 12 is in the horizontal state, the gradient information GI may be information from which the coordinates of the horizontal position can be extracted. Alternatively, the vehicle device 10 may be configured to receive the gradient information GI based on the current position from the outside of the vehicle 12 (a server connected to a network) using the communication module 34.
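One simple way to realize gradient information GI from which level coordinates can be extracted is a lookup table keyed by rounded coordinates. The coordinates, grid resolution, and tolerance below are invented for illustration only:

```python
# Hypothetical gradient table keyed by coordinates rounded to a 0.01-degree
# grid; the coordinates and inclination angles below are invented examples.
GRADIENT_INFO = {
    (139.70, 35.69): 0.0,  # level road
    (139.71, 35.69): 4.5,  # 4.5-degree incline
}

def is_level_position(lon, lat, tolerance_deg=0.5):
    """True when the stored gradient at this coordinate is close to flat."""
    grade = GRADIENT_INFO.get((round(lon, 2), round(lat, 2)))
    return grade is not None and abs(grade) <= tolerance_deg
```

An unknown coordinate conservatively reports "not level", so no learning would occur there.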
The vehicle device 10 includes an imaging device 40 that images the outside of the vehicle 12 in addition to the acceleration sensor 14 and the navigation device 16. That is, the vehicle device 10 appropriately sets (corrects) the acceleration reference point AO using the imaging information PI imaged by the imaging device 40, thereby improving the detection accuracy of the acceleration information.
The imaging device 40 includes: a camera 42 fixed to an upper side of a front windshield of the vehicle 12, for capturing an image of the area in front of the vehicle 12; and an image processing device 44 that processes the imaging information PI captured by the camera 42. The camera 42 and the image processing device 44 may be housed in a single casing, or may be located at positions separate from each other and communicably connected. As a means for acquiring the external-environment information of the vehicle 12, the imaging device 40 may also be configured by combining the camera 42 with a radar, an infrared sensor, and the like (not shown). Needless to say, the imaging device 40 may be a known imaging device provided for automatic driving of the vehicle 12 or for collision avoidance.
The camera 42 of the imaging device 40 includes a CMOS sensor or the like in which a plurality of imaging elements (pixels) are arranged, and optically receives external light in front of the vehicle 12. Accordingly, as shown in fig. 3A, the camera 42 generates the imaging information PI obtained by converting the three-dimensional external world into a two-dimensional image substantially composed of perspective projection (one-point perspective projection). The captured image information PI includes various information that affects the traveling, such as a traveling lane ahead of the vehicle 12 and other vehicles. The camera 42 acquires the shooting information PI at a prescribed frame rate, and sends it to the image processing apparatus 44.
Returning to fig. 2, the image processing device 44 processes the shooting information PI, and supplies information (offset amount α of vanishing point VP described later) for setting the acceleration reference point AO by the control device 24 of the navigation apparatus 16 to the control device 24. The image processing device 44 is configured as a computer having an input/output interface 46, a processor 48, a memory 50, a timer 52, and the like, and further has a communication module 54 connected (or capable of wireless communication) to a communication line 56 such as an in-vehicle LAN. That is, the image processing apparatus 44 is connected to the control apparatus 24 through the communication line 56. In addition, the control device 24 and the image processing device 44 may be provided in the same cradle (processing section).
The image processing apparatus 44 constructs a functional section that processes the imaging information PI imaged by the camera 42 by the processor 48 executing the program 58 stored in the memory 50 during startup. Specifically, as shown in fig. 4, the image processing device 44 includes an image correction unit 60, an extraction unit 62, a vanishing point calculation unit 64, and a shift amount calculation unit 66.
The image correction unit 60 performs image processing for sufficiently extracting information of the object included in the imaging information PI by performing correction of the imaging information PI and the like. For example, known processing methods such as color tone correction, sharpness correction, luminance correction, and shading correction can be applied as the image processing of the imaging information PI. Preferably, the image correction unit 60 performs appropriate correction according to the performance of the imaging device 40, the imaging environment, and the like.
The extraction unit 62 extracts the lane lines (a pair of boundary lines SL: see fig. 3A) of the traveling lane on which the vehicle 12 travels from the corrected imaging information PI, and also extracts stationary objects (obstacles, signs, etc.) and dynamic objects (other vehicles, and living bodies such as pedestrians) existing in the vicinity of the lane lines, together with information on these objects. Preferably, the extraction unit 62 appropriately performs correction to interpolate the pair of boundary lines SL when an actual lane line disappears or cannot be read sufficiently. Any known processing means for extraction from the imaging information PI may be used; examples include processing that calculates feature amounts from color tones to extract boundaries (shapes), and processing that calculates differences from past imaging information PI.
The vanishing point calculating unit 64 calculates the vanishing point VP of the imaging information PI from the extracted extraction target. In the two-dimensional image formed by perspective projection, 2 or more extended lines of the extraction object extending parallel to the imaging direction of the camera 42 are generated, and a point at which these extended lines intersect is the vanishing point VP. The method of calculating the vanishing point VP is not particularly limited, and a known processing method can be applied.
For example, as shown in fig. 3A, the vanishing point calculating unit 64 virtually extends a pair of boundary lines SL of the extracted travel lane, and calculates a point where the pair of extended lines intersect as a vanishing point VP. In addition, the actual travel lane on which the vehicle 12 travels is rarely a straight line, and there may be another vehicle in front of the vehicle 12. Therefore, the vanishing point calculating unit 64 may calculate tangent lines from a pair of boundary lines SL (lines extending within a predetermined range on the lower side of the imaging information PI) at positions near the vehicle 12, and calculate a point where the tangent lines of the pair of boundary lines SL intersect with each other as the vanishing point VP.
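The intersection of the two extended boundary lines can be computed with the standard two-line intersection formula. This is a generic geometric sketch, not the patent's implementation; each line is given by two image-space points:

```python
def vanishing_point(line_a, line_b):
    """Intersect two image-space lines, each given as ((x1, y1), (x2, y2)).

    Uses the determinant form of the two-line intersection formula and
    returns None for (near-)parallel lines, where no vanishing point exists.
    """
    (x1, y1), (x2, y2) = line_a
    (x3, y3), (x4, y4) = line_b
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-9:
        return None  # parallel boundary lines: no intersection
    det_a = x1 * y2 - y1 * x2
    det_b = x3 * y4 - y3 * x4
    px = (det_a * (x3 - x4) - (x1 - x2) * det_b) / denom
    py = (det_a * (y3 - y4) - (y1 - y2) * det_b) / denom
    return (px, py)
```

For symmetric lane boundaries in a 640 x 480 image, the intersection lands on the image center line, which matches the intuition of one-point perspective.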
Alternatively, if the camera 42 is configured to generate the imaging information PI formed by 1-point perspective projection, the vanishing point VP can be calculated without using a pair of boundary lines SL of the driving lane. For example, the vanishing point calculating unit 64 may generate extended lines of a plurality of lines extending along the extracted other stationary objects (fence, guardrail, curb, wall of building, etc. parallel to the traveling lane), and set a point at which the extended lines intersect as the vanishing point VP. The vanishing point calculating unit 64 may calculate the vanishing points VP from the imaging information PI continuously captured by the camera 42, and appropriately correct the current vanishing point VP using the calculation result of the past vanishing points VP (for example, the moving average is taken).
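The moving-average correction mentioned above could look like the following sketch, where the class name and window length are arbitrary assumptions:

```python
from collections import deque

class VanishingPointSmoother:
    """Fixed-window moving average over per-frame vanishing points.

    A sketch under the assumption that simple averaging of the last few
    frames is an acceptable correction of the current vanishing point.
    """

    def __init__(self, window=5):
        self._points = deque(maxlen=window)  # oldest points drop out

    def update(self, vp):
        """Add the latest (x, y) vanishing point and return the smoothed one."""
        self._points.append(vp)
        n = len(self._points)
        return (sum(p[0] for p in self._points) / n,
                sum(p[1] for p in self._points) / n)
```

Smoothing suppresses frame-to-frame jitter from road texture and vibration before the offset amount is computed.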
Returning to fig. 4, the offset amount calculation unit 66 calculates the offset amount α of the vanishing point VP using the vanishing point VP calculated by the vanishing point calculation unit 64. Here, if the posture of the vehicle 12 is horizontal and the road ahead of the vehicle 12 captured by the camera 42 extends horizontally from the vehicle 12, the camera 42 fixed to the vehicle 12 always calculates the vanishing point VP at the same position in the imaging information PI. Therefore, the offset amount calculation unit 66 stores in the memory 50 the vanishing point reference value VO obtained when the posture of the vehicle 12 is horizontal, and calculates the offset amount α by taking the difference between the vanishing point reference value VO and the vanishing point VP calculated by the vanishing point calculation unit 64. Then, the offset amount calculation unit 66 outputs the calculated offset amount α of the vanishing point VP to the control device 24 of the navigation apparatus 16.
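The offset amount α is then simply the difference between the current vanishing point VP and the stored reference value VO. In this hedged sketch only the vertical image component is used, since a pitch change moves the vanishing point up or down; the reference coordinates are illustrative, not from the patent:

```python
# Illustrative pixel coordinates for the vanishing point reference value VO,
# stored when the vehicle posture was known to be level.
VO = (320.0, 240.0)

def vanishing_point_offset(vp, vo=VO):
    """Offset alpha of the current vanishing point relative to the reference.

    Only the vertical (row) component is returned here, because a change in
    pitch shifts the vanishing point up or down in the image.
    """
    return vp[1] - vo[1]
```

A positive offset (vanishing point below the reference) would correspond to the nose of the vehicle pitching up, consistent with fig. 3B.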
For example, as shown in fig. 3A, if the calculated vanishing point VP and the vanishing point reference value VO substantially match, it can be considered that the camera 42 fixed to the vehicle 12 is capturing images in the direction of the vanishing point reference value VO. In other words, it can be estimated from the imaging information PI of the imaging device 40 that the posture of the vehicle 12 is horizontal.
On the other hand, as shown in fig. 3B, if the calculated vanishing point VP is separated from the vanishing point reference value VO, it can be considered that the camera 42 fixed to the vehicle 12 is capturing images in a direction deviating from the vanishing point reference value VO. In other words, it can be estimated from the imaging information PI of the imaging device 40 that the posture of the vehicle 12 is not horizontal.
Therefore, the control device 24 (navigation device 16) of the vehicle device 10 can obtain information on the posture of the vehicle 12 estimated from the imaging information PI by receiving the offset amount α of the vanishing point VP from the imaging device 40. Therefore, the control device 24 forms the reference point setting portion 70 shown in fig. 4 by executing the program 36 (correction program) stored in the memory 30 by the processor 28, and sets the acceleration reference point AO using the offset amount α of the vanishing point VP. Specifically, the reference point setting unit 70 includes a positioning information acquiring unit 72, a current position specifying unit 74, a gradient determining unit 76, a vehicle state determining unit 78, an offset amount determining unit 80, a detection information acquiring unit 82, and a reference point learning unit 84.
The positioning information acquiring unit 72 acquires the current position of the vehicle 12 from the GNSS receiver module 20. The positioning information acquiring unit 72 includes a reliability determining unit 72a that verifies the reliability of the positioning by the GNSS receiver module 20 and determines whether the reliability of the obtained positioning information is sufficient. That is, depending on the radio-wave reception state, the GNSS receiver module 20 may calculate positioning information that deviates greatly from the actual current position of the vehicle 12. Therefore, the control device 24 is configured to evaluate the positioning information using an appropriate evaluation method to determine its reliability, and to learn the detection information of the acceleration sensor 14 only when the positioning information (the current position of the vehicle 12) has high reliability.
The current position determining unit 74 matches the current position of the vehicle 12 acquired by the positioning information acquiring unit 72 with the map information MI stored in the memory 30. At this time, in order to set the acceleration reference point AO, the current position specifying unit 74 determines whether or not the current position differs from the point at which the acceleration was last learned. This is because, if the acceleration reference point AO is learned a plurality of times at a point with a slight gradient, the accuracy of the acceleration reference point AO is lowered.
The gradient determination unit 76 reads gradient information GI based on the current position of the vehicle 12 on the map information MI specified by the current position specification unit 74 from the memory 30, and determines whether or not the current position is substantially horizontal (a gradient within a predetermined range with respect to the horizontal direction). In this way, by using the gradient information GI corresponding to the current position of the vehicle 12, the control apparatus 24 can recognize whether the posture of the vehicle 12 is actually horizontal.
The vehicle state determination unit 78 determines whether the vehicle 12 is in a traveling state or a stopped state based on speed information from the vehicle speed sensor 22a and the like. In this case, it is preferable that the vehicle state determination unit 78 measures, with the timer 32, the time elapsed since the vehicle 12 stopped, and determines that the vehicle 12 is actually stopped only after a predetermined period or more has elapsed. That is, the vehicle device 10 according to the present embodiment is configured to learn the acceleration for setting the acceleration reference point AO while the vehicle 12 is stopped. Accordingly, the detection information of the acceleration sensor 14 can be reliably obtained in a state where no acceleration other than gravity acts on the vehicle 12, which improves the learning accuracy of the acceleration reference point AO. Note that straight travel at a constant speed (zero acceleration in the front-rear and left-right directions) does not hinder learning of the vertical acceleration, so the vehicle device 10 may also be configured to learn the acceleration while traveling through a horizontal place.
The offset amount determination unit 80 determines the method of learning the detection information detected by the acceleration sensor 14, based on the offset amount α of the vanishing point VP acquired from the imaging device 40. For this purpose, the offset amount determination unit 80 includes an offset amount acquisition unit 80a, which acquires the offset amount α of the vanishing point VP from the image processing apparatus 44 and temporarily stores it in the memory 30, and holds (stores) an offset amount threshold Th in advance. The offset amount determination unit 80 then determines that cumulative learning is applicable when the offset amount α of the vanishing point VP is equal to or less than the offset amount threshold Th, and that cumulative learning is not applicable when the offset amount α exceeds the offset amount threshold Th. Alternatively, the vehicle device 10 may be configured such that the offset amount acquisition unit 80a (the control device 24) acquires the imaging information and calculates the offset amount α of the vanishing point VP itself (including calculating the vanishing point VP).
Here, the significance of using the offset amount α of the vanishing point VP in learning the acceleration reference point AO will be described with reference to figs. 5A and 5B. As described above, the gradient information GI of the current position of the vehicle 12 is checked in the learning for setting the acceleration reference point AO, so the control device 24 can recognize that the vehicle 12 is located at a horizontal point. However, when a weight W is loaded on the rear portion 12b of the vehicle 12, the vehicle 12 assumes an inclined posture with the front portion 12a raised even though the ground is horizontal. In this state, even if the detection information of the acceleration sensor 14 is learned in order to set the acceleration reference point AO, the acceleration reference point AO cannot be obtained with high accuracy.
Therefore, the vehicle device 10 determines the offset amount α of the vanishing point VP (the posture of the vehicle 12), thereby improving the learning accuracy of the detection information of the acceleration sensor 14. That is, as shown in fig. 5A, if the weight W is not mounted on the vehicle 12 located in a horizontal position, the offset amount α of the vanishing point VP is zero (or sufficiently small), and therefore the offset amount α is equal to or smaller than the offset amount threshold Th, and it can be said that the posture of the vehicle 12 is not inclined. Therefore, it can be determined that the detection information of the acceleration sensor 14 can be used for the accumulation learning.
On the other hand, as shown in fig. 5B, if a heavy object W is loaded on the vehicle 12 located at a horizontal place, the offset amount α of the vanishing point VP may exceed the offset amount threshold Th. When the offset amount α of the vanishing point VP exceeds the offset amount threshold Th in this way, the posture of the vehicle 12 can be said to be inclined, and therefore it is better not to use the detection information of the acceleration sensor 14 for the cumulative learning of detection information obtained at horizontal places.
However, the user may use the vehicle 12 for a long time with the weight W loaded. Therefore, the vehicle device 10 is configured to also learn the detection information of the acceleration sensor 14, by a separate method, when the offset amount α of the vanishing point VP exceeds the offset amount threshold Th.
In fig. 4, the offset threshold Th is represented as a range having an upper limit value and a lower limit value, and the offset amount α exceeding the offset threshold Th means exceeding the upper limit value or falling below the lower limit value. When the offset amount α of the vanishing point VP is calculated as an absolute value, the offset amount threshold Th may also be an absolute value. The offset amount threshold Th may be set arbitrarily, but in order to reliably capture changes in the posture of the vehicle 12, it is preferably set to a sufficiently small value (for example, one tenth or less of the vertical height of the imaging information PI).
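The offset and range-style threshold check described above can be sketched as follows. The reduction of the offset to the vertical image component, the 480-pixel image height, and the one-tenth threshold value are illustrative assumptions consistent with, but not mandated by, the text.

```python
def offset_alpha(vp, vo):
    """Offset amount of the vanishing point VP from the reference value VO
    (reduced here to the vertical image component, an assumption)."""
    return vp[1] - vo[1]

def exceeds_threshold(alpha, lower, upper):
    """Th as a range: 'exceeding' means above the upper or below the lower limit."""
    return alpha > upper or alpha < lower

# Example threshold per the one-tenth guideline, for an assumed 480-pixel-high image
IMAGE_HEIGHT = 480
TH = IMAGE_HEIGHT / 10  # i.e. a band of +/- 48 pixels around the reference value
```

A vanishing point 60 pixels below the reference would then exceed the threshold, while one 30 pixels above it would not.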
Returning to fig. 4, the detection information acquisition unit 82 acquires the detection information of the acceleration sensor 14 in order to learn the acceleration reference point AO, temporarily stores the acquired detection information in the memory 30, and outputs the acquired detection information to the reference point learning unit 84.
The reference point learning unit 84 learns the detection information of the acceleration sensor 14 by a plurality of learning methods (algorithms) based on the offset amount α of the vanishing point VP, and therefore includes a 1 st learning unit 86 (low offset amount learning unit) and a 2 nd learning unit 88 (high offset amount learning unit). The 1 st learning unit 86 learns the detection information of the acceleration sensor 14 in which the offset amount α is equal to or smaller than the offset amount threshold Th (within the range of the upper limit value and the lower limit value). The 2 nd learning unit 88 learns the detection information of the acceleration sensor 14 in which the shift amount α of the vanishing point VP exceeds the shift amount threshold Th.
Specifically, when the offset amount α of the vanishing point VP is equal to or less than the offset amount threshold Th, the 1 st learning unit 86 stores the detection information of the acceleration sensor 14 in the memory 30 as the low offset amount acceleration LA. Then, the 1 st learning unit 86 cumulatively learns the plurality of stored low offset amount accelerations LA: a cumulative average or a cumulative standard deviation is calculated from them, and the calculation result is set as the 1 st reference point O1 (low offset amount reference point). By using the 1 st reference point O1, the vehicle device 10 can accurately calculate (correct) acceleration information from the detection results of the acceleration sensor 14 during traveling and the like.
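The cumulative learning of the 1 st learning unit can be sketched as an incrementally updated average over all stored low-offset accelerations. The incremental-mean form is an implementation choice for illustration; the source specifies only that a cumulative average (or cumulative standard deviation) is calculated.

```python
class LowOffsetLearner:
    """Sketch of the 1 st learning unit: cumulative average of the
    low-offset-amount accelerations LA, used as the 1 st reference point O1."""
    def __init__(self):
        self.count = 0
        self.o1 = 0.0

    def learn(self, la):
        self.count += 1
        # Incremental form of the cumulative average over all LA values so far
        self.o1 += (la - self.o1) / self.count
        return self.o1
```

Each stop at a horizontal, unloaded position contributes one sample, so O1 converges as learning opportunities accumulate.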
On the other hand, the 2 nd learning unit 88 stores the detection information of the acceleration sensor 14 from the time point when the shift amount α of the vanishing point VP exceeds the shift amount threshold Th in the memory 30 as the high-shift-amount acceleration HA. This process continues until the offset amount α of the vanishing point VP becomes equal to or smaller than the offset threshold Th. That is, a state in which the offset amount α of the vanishing point VP continuously exceeds the offset amount threshold Th can be regarded as a state in which the same weight W is loaded in the vehicle 12 continuously. Therefore, the 2 nd learning unit 88 performs learning different from the 1 st learning unit 86 using the plurality of stored high offset amount accelerations HA.
For example, the 2 nd learning unit 88 calculates a weighted moving average of the plurality of high offset amount accelerations HA and sets the calculation result as the 2 nd reference point O2 (high offset amount reference point). In the weighted moving average, the most recently detected acceleration is given a high weight, and accelerations detected earlier are given stepwise smaller weights. Accordingly, even when the weight W has been loaded only for a short period, the vehicle device 10 can set an appropriate 2 nd reference point O2.
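The weighted moving average with stepwise decreasing weights for older samples can be sketched as below; the geometric decay factor is an assumed parameter, since the source only states that older samples receive stepwise smaller weights.

```python
def weighted_moving_average(ha_samples, decay=0.5):
    """Sketch of the 2 nd learning unit's calculation: the most recent
    high-offset-amount acceleration HA gets weight 1, and each older sample
    gets a stepwise smaller weight (decay factor is an assumption)."""
    n = len(ha_samples)  # samples ordered oldest to newest
    weights = [decay ** (n - 1 - i) for i in range(n)]
    return sum(w * s for w, s in zip(weights, ha_samples)) / sum(weights)
```

With `decay=0.5` and two samples, the newer sample carries twice the weight of the older one, so the result sits two-thirds of the way toward the most recent reading.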
It can be said that, when the offset amount α of the vanishing point VP exceeding the offset amount threshold Th becomes equal to or less than the offset amount threshold Th, the weight W is removed from the vehicle 12. Therefore, the 2 nd learning unit 88 deletes the detection information (high offset acceleration HA) stored in the memory 30 up to this point. When the offset amount α of the vanishing point VP exceeds the offset amount threshold Th again, the 2 nd learning unit 88 newly stores the detection information from the time point when the offset amount threshold Th is exceeded, and calculates the 2 nd reference point O2 from the new high offset acceleration HA.
The reference point setting unit 70 of the control device 24 can appropriately set the acceleration reference point AO by the above-described functional units. As shown in fig. 4, the control device 24 also constitutes a functional unit for performing processing using the set acceleration reference point AO, in addition to the reference point setting unit 70. Specifically, an acceleration calculation section 90 that calculates acceleration information, a posture calculation section 92 that determines the posture of the vehicle 12 from the acceleration information, and a route providing section 94 that generates a travel route of the vehicle 12 and provides it to the user are formed in the control device 24.
The acceleration calculation unit 90 acquires detection information detected by the acceleration sensor 14 during traveling of the vehicle 12, and calculates acceleration information by correcting the detection information using the acceleration reference point AO set by the reference point setting unit 70. The attitude calculation unit 92 calculates the pitch angle of the vehicle 12, that is, how much the front portion 12a of the vehicle 12 is inclined in the vertical direction, based on the acceleration information.
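One way the attitude calculation unit might derive a pitch angle from the corrected vertical acceleration is via the gravity component along the sensor's vertical axis. The formula below is a sketch under the assumption that the vehicle is not otherwise accelerating; the patent does not prescribe this particular relation.

```python
import math

G = 9.80665  # standard gravity, m/s^2

def pitch_angle_deg(vertical_accel):
    """When the sensor's vertical axis tilts by the pitch angle, the gravity
    component measured along it becomes G * cos(pitch); invert that relation.
    Assumes no acceleration other than gravity acts on the vehicle."""
    ratio = max(-1.0, min(1.0, vertical_accel / G))  # clamp against sensor noise
    return math.degrees(math.acos(ratio))
```

A reading equal to full gravity corresponds to a level vehicle; smaller readings indicate increasing tilt.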
The route providing section 94 calculates a travel route from the current position of the vehicle 12 and the destination set by the user, and outputs it to the input-output device 18. The route providing unit 94 also continues to monitor the current position of the vehicle 12 during traveling, and calculates a new travel route corresponding to the current position when the current position deviates from the travel route.
The route providing unit 94 determines, holds, or recalculates the travel route based on the posture of the vehicle 12 calculated by the posture calculating unit 92. For example, when the vehicle 12 traveling on the expressway HR approaches an exit, if the information obtained by the posture calculating portion 92 indicates that the front portion 12a is inclined downward by a predetermined degree or more, the route providing portion 94 determines that the vehicle has exited the expressway HR. Conversely, when the vehicle 12 traveling on an ordinary road approaches an entrance of the expressway HR, if the information indicates that the front portion 12a is inclined upward by a predetermined degree or more, the route providing portion 94 determines that the vehicle has entered the expressway HR. Accordingly, the route providing unit 94 can provide the user with an appropriate travel route.
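The exit/entrance judgment above can be condensed into a small decision sketch. The 3-degree value stands in for the unspecified "predetermined degree", and the string labels are illustrative names, not part of the source.

```python
def route_decision(pitch_deg, near, threshold_deg=3.0):
    """Sketch of the route providing unit's judgment near a predetermined
    position. pitch_deg < 0 means the front portion is inclined downward;
    near is 'exit' or 'entrance' of the expressway (assumed encoding)."""
    if near == "exit" and pitch_deg <= -threshold_deg:
        return "exited expressway"    # front inclined downward at an exit ramp
    if near == "entrance" and pitch_deg >= threshold_deg:
        return "entered expressway"   # front inclined upward at an entrance ramp
    return "route unchanged"
```

A near-horizontal pitch near an exit leaves the current travel route in place, matching the "continuing on the expressway" case.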
The vehicle device 10 according to the present embodiment is basically configured as described above, and the operation (method of correcting acceleration) thereof will be described below.
The vehicle device 10 calculates the offset α of the vanishing point VP in the image capturing device 40, for example, according to the processing flow shown in fig. 6. The image processing apparatus 44 of the photographing device 40 operates the camera 42 at an appropriate time when the vehicle 12 is parked (or under the instruction of the navigation device 16), photographs the outside in front of the vehicle 12, and generates photographing information PI (step S1). Then, when the image processing apparatus 44 acquires the photographing information PI from the camera 42, image correction of the photographing information PI is performed by the image correction section 60 (step S2).
Next, the extraction unit 62 of the image processing apparatus 44 extracts the extraction target object included in the shooting information PI from the corrected shooting information PI (step S3). Then, the vanishing point calculating unit 64 calculates the vanishing point VP of the captured image information PI using a line (a pair of boundary lines SL of the driving lane, etc.) extending parallel to the image capturing direction of the image capturing device 40 in the extraction target (step S4: vanishing point calculating step).
After that, the shift amount calculation section 66 of the image processing apparatus 44 calculates the shift amount α of the vanishing point VP using the calculated vanishing point VP and the saved vanishing point reference value VO (step S5: shift amount calculation step). When the offset amount α of the vanishing point VP is calculated, the offset amount calculation unit 66 transmits the information (offset amount α) to the control device 24 of the navigation device 16 (step S6).
On the other hand, the control device 24 of the navigation device 16 implements the processing flow shown in fig. 7 and 8 in the method of correcting the acceleration (reference point setting step). Specifically, the control device 24 operates the GNSS receiver module 20 to acquire the positioning information of the current position of the vehicle 12 by the positioning information acquiring unit 72 (step S11). Then, the current position determining section 74 places the current position of the vehicle 12 based on the positioning information on the map information MI. At this time, the current position determination unit 74 determines whether or not the current position of the vehicle 12 is located at a predetermined position (road, parking lot, etc.) on the map information MI (in other words, whether or not the current position of the vehicle 12 matches the map) (step S12).
In the case where the current position of the vehicle 12 matches the map (step S12: YES), the flow proceeds to the next step S13. On the other hand, when the current position of the vehicle 12 does not match the map (NO in step S12), the process returns to step S11 to perform the same process.
In step S13, the current position determination unit 74 determines whether or not the vehicle 12 has left the point at which the detection information of the acceleration sensor 14 was last learned. When the vehicle 12 leaves the previous spot (yes in step S13), the process proceeds to step S14. On the other hand, when the vehicle 12 is located at the same spot as the previous spot (no in step S13), the process returns to step S11 to perform the same process or to end the learning of the detection information this time.
In step S14, the gradient determination unit 76 acquires gradient information GI of the current position of the vehicle 12, and determines whether the vehicle 12 is located at a horizontal position (whether the gradient is a predetermined gradient). If the vehicle 12 is located in a horizontal position (yes in step S14), the process proceeds to the next step S15. On the other hand, when the vehicle 12 is located at a place inclined by a predetermined degree or more (place having a slope) with respect to the horizontal direction (no in step S14), the process returns to step S11 to perform the same process or to end the learning of the current detection information.
In step S15, the vehicle state determination unit 78 determines whether or not a predetermined time has elapsed after the vehicle 12 is stopped. When the predetermined time has elapsed after the vehicle has stopped (step S15: YES), the process proceeds to the next step S16, and when the predetermined time has not elapsed after the vehicle has stopped (step S15: NO), the process returns to step S11 to perform the same process.
In step S16, the control device 24 determines, by the reliability determination section 72a, whether the reliability of the positioning information is sufficient. If the reliability of the positioning information is sufficient (yes in step S16), the process proceeds to step S17. On the other hand, if the reliability of the positioning information is insufficient (no in step S16), the process returns to step S11 to perform the same process, or the learning of the current detection information is ended.
In step S17, the control device 24 acquires the detection information detected by the acceleration sensor 14 through the detection information acquisition section 82.
The offset amount determination unit 80 acquires the offset amount of the vanishing point VP from the imaging device 40 (step S18), and determines whether or not the offset amount α is equal to or less than the offset threshold Th using the acquired offset amount α of the vanishing point VP (step S19). When the shift amount α of the vanishing point VP is equal to or smaller than the shift amount threshold Th (yes in step S19), the process proceeds to step S20 and the process of the 1 st learning unit 86 is performed. On the other hand, when the shift amount α of the vanishing point VP exceeds the shift amount threshold Th (no in step S19), the process proceeds to step S21 and the process of the 2 nd learning unit 88 is performed.
In step S20, the 1 st learning section 86 adds a label of the low offset acceleration LA to the detection information detected this time and stores it in the memory 30, and calculates the 1 st reference point O1 (acceleration reference point AO) by cumulative learning of the stored low offset acceleration LA. The 1 st learning unit 86 stores the calculated 1 st reference point O1 in the memory 30.
In step S21, the 2 nd learning unit 88 determines whether or not the state in which the offset amount α of the vanishing point VP exceeds the offset amount threshold Th is continuing (i.e., whether or not learning has been performed by the 2 nd learning unit 88 in the previous learning). If the offset of vanishing point VP is continuing (yes in step S21), the process proceeds to step S22, and if the offset of vanishing point VP is a new offset (no in step S21), the process proceeds to step S23.
In step S22, the 2 nd learning unit 88 adds a label of the high offset acceleration HA to the detection information detected this time and stores it in the memory 30, and calculates the 2 nd reference point O2 (acceleration reference point AO) by performing weighted moving average of the plurality of stored high offset accelerations HA. The 2 nd learning unit 88 stores the calculated 2 nd reference point O2 in the memory 30.
On the other hand, in step S23, the 2 nd learning unit 88 deletes the high offset acceleration HA stored in the previous learning, stores the detection information (high offset acceleration HA) detected this time in the memory 30, and calculates the 2 nd reference point O2 using the detection information.
The control device 24 (reference point setting unit 70) of the vehicle device 10 can obtain the plurality of acceleration reference points AO (1 st reference point O1, 2 nd reference point O2) by the above processing. Therefore, by appropriately selecting these acceleration reference points AO while the vehicle 12 is traveling, the detection information of the acceleration sensor 14 can be corrected with high accuracy.
The processing flow for setting the acceleration reference point AO is not limited to the processing procedure of each step, and the processing procedure of steps S11 to S18 may be arbitrarily designed. In addition, the above-described steps may be appropriately omitted in the processing flow for setting the acceleration reference point AO without departing from the gist of the present invention.
Next, a method of using the acceleration reference point AO set by the reference point setting unit 70 will be described with reference to fig. 9. Before or during traveling, the vehicle device 10 calculates the vanishing point VP from the imaging information PI of the imaging device 40, and determines whether the vanishing point VP is not shifted, that is, whether the shift amount α of the vanishing point VP is equal to or less than the shift amount threshold Th (step S31). When the vanishing point VP does not deviate (yes in step S31), the 1 st reference point O1 learned by the 1 st learning unit 86 is selected in step S32. On the other hand, when the vanishing point VP is shifted (no in step S31), the 2 nd reference point O2 learned by the 2 nd learning unit 88 is selected in step S33.
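The selection-then-correction flow of steps S31 through S35 can be condensed into a short sketch. The function names and the zero-point-subtraction model of the correction are assumptions for illustration; the source only states that the detection information is corrected using the selected reference point.

```python
def select_reference_point(alpha, th, o1, o2):
    """Steps S31-S33: pick the 1 st or 2 nd reference point by the
    vanishing-point offset amount alpha against the threshold th."""
    return o1 if abs(alpha) <= th else o2

def corrected_acceleration(raw, ao):
    """Step S35: correct raw detection information with the chosen acceleration
    reference point AO (modeled here as a zero-point subtraction)."""
    return raw - ao
```

A small offset selects O1 (unloaded learning); a large offset selects O2 (loaded learning), after which the same correction step is applied to each raw sensor sample.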
When the acceleration calculation portion 90 acquires the detection information of the acceleration sensor 14 while the vehicle 12 is running (step S34), the control device 24 of the navigation device 16 corrects the detection information using the set acceleration reference point AO to calculate acceleration information (step S35). Then, the attitude calculation unit 92 calculates the attitude (pitch angle) of the vehicle 12 from the calculated acceleration information (step S36).
Then, the route providing unit 94 determines whether the vehicle 12 is approaching a predetermined position on the road (for example, the exit or entrance of the expressway HR) based on the positioning information while the vehicle 12 is traveling (step S37). When the vehicle is leaving the predetermined position on the road (NO in step S37), the flow returns to step S34 and the same processing flow is performed.
When the vehicle 12 is approaching the predetermined position on the road (step S37: YES), the travel route of the vehicle 12 is maintained or reset based on the information on the predetermined position and the calculated posture of the vehicle 12 (step S38). For example, when the front portion 12a of the vehicle 12 is inclined downward near the exit (predetermined position) of the expressway HR, it is determined that the vehicle is exiting the expressway HR, and a travel route corresponding to having exited the expressway HR is set. On the other hand, when the front portion 12a of the vehicle 12 is substantially horizontal near the exit (predetermined position) of the expressway HR, it is determined that the vehicle is continuing on the expressway HR, and a travel route corresponding to continued travel on the expressway HR is set.
The present invention is not limited to the above embodiments, and various modifications can be made in accordance with the gist of the present invention. For example, the vehicle device 10 may be applied not only to the imaging device 40 that images the front of the vehicle 12, but also to a device that images the rear of the vehicle 12 to obtain the imaging information PI.
For example, the functional units of the imaging device 40 (image processing device 44) and the navigation device 16 (control device 24) are not limited to those shown in fig. 4. For example, the vehicle device 10 may include the vanishing point calculating unit 64 and the offset amount calculating unit 66 of the imaging device 40 in the control device 24.
Alternatively, the vehicle device 10 may be configured not to use the navigation device 16, and may have, for example, a dedicated control unit that calculates acceleration information by correcting the detection information of the acceleration sensor 14 using the acceleration reference point AO. The control unit may be integrated with the acceleration sensor 14.
The technical ideas and effects that can be grasped by the above embodiments are described below.
The 1 st aspect of the present invention is a vehicle device 10 mounted on a vehicle 12, including an acceleration sensor 14 that detects at least an acceleration in a vertical direction, and obtaining acceleration information by correcting detection information detected by the acceleration sensor 14 based on an acceleration reference point AO, the vehicle device 10 including an imaging device 40, a vanishing point calculating unit 64, an offset calculating unit 66, and a reference point setting unit 70, wherein the imaging device 40 is mounted on the vehicle 12 and images an outside of the vehicle 12; the vanishing point calculating part 64 calculates the vanishing point VP from the photographing information PI photographed by the photographing device 40; the offset amount calculation unit 66 calculates an offset amount α of the vanishing point VP with respect to the vanishing point reference value VO; the reference point setting unit 70 sets the acceleration reference point AO, and when the gradient information GI of the current position of the vehicle 12 is acquired and the current position of the vehicle 12 has a predetermined gradient, the reference point setting unit 70 learns the detection information detected by the acceleration sensor 14 based on the offset amount α, thereby setting the acceleration reference point AO.
Accordingly, the vehicle device 10 can set the acceleration reference point AO more reliably by learning the detection information of the acceleration sensor 14 based on the offset amount α of the vanishing point VP calculated from the imaging information PI of the imaging device 40. That is, when the vehicle 12 is located at a substantially horizontal place (a place with the predetermined gradient), a shift of the vanishing point VP can be attributed to the posture of the vehicle 12. For example, when the front of the vehicle 12 is tilted obliquely upward by a loaded weight W, the offset amount α of the vanishing point VP increases. The vehicle device 10 can therefore decide appropriately, according to the offset amount α, how to use the detection information of the acceleration sensor 14 for setting the acceleration reference point AO. By using the acceleration reference point AO set in this way, the vehicle device 10 can correct the detection information detected by the acceleration sensor 14 with high accuracy and obtain accurate acceleration information.
The vehicle device 10 further includes a shift amount determination unit 80 that determines whether or not the shift amount α is equal to or smaller than a predetermined shift amount threshold Th, and the reference point setting unit 70 includes a low shift amount learning unit (1 st learning unit 86) that stores detection information that the shift amount α is equal to or smaller than the shift amount threshold Th as the low shift amount acceleration LA, and performs cumulative learning on the plurality of stored low shift amount accelerations LA to calculate the low shift amount reference point (1 st reference point O1) as the acceleration reference point AO. In this way, the vehicle device 10 can set the acceleration reference point AO with higher accuracy by using the detection information that the offset amount α is equal to or smaller than the offset amount threshold Th.
The reference point setting unit 70 includes a high offset amount learning unit (2 nd learning unit 88) that stores detection information for which the offset amount α exceeds the offset amount threshold Th as the high offset amount acceleration HA, learns the stored high offset amount accelerations HA by a method other than cumulative learning, and calculates a high offset amount reference point (2 nd reference point O2) as the acceleration reference point AO. In this way, by using the detection information for which the offset amount α exceeds the offset amount threshold Th, the vehicle device 10 can promptly set an acceleration reference point AO corresponding to the case where the weight W is loaded on the vehicle 12.
The high offset amount learning unit (2nd learning unit 88) calculates a weighted moving average of the plurality of high offset amount accelerations HA as the high offset amount reference point (2nd reference point O2). Accordingly, the vehicle device 10 can set the acceleration reference point AO with sufficient accuracy even when the vehicle 12 is loaded with a heavy object W.
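Assuming the weighted moving average gives newer samples larger weights, so that the 2nd reference point O2 tracks a newly loaded heavy object W quickly, a sketch might look like this; the weighting scheme is illustrative.

```python
def weighted_moving_average(samples, weights):
    """Weighted moving average over recent high offset amount accelerations HA.

    weights[i] pairs with samples[i]; giving the newest sample the largest
    weight (assumed here) lets the 2nd reference point O2 react quickly
    when the vehicle posture changes under load.
    """
    total_weight = float(sum(weights))
    return sum(s * w for s, w in zip(samples, weights)) / total_weight
```

With weights [1, 2, 3], the newest of three samples contributes half of the result, which is why this converges far faster than the cumulative mean used for O1.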
In addition, the vehicle device 10 calculates the acceleration information using the low offset amount reference point (1st reference point O1) when the offset amount α is equal to or less than the offset amount threshold Th, and calculates the acceleration information using the high offset amount reference point (2nd reference point O2) when the offset amount α exceeds the offset amount threshold Th. Accordingly, the vehicle device 10 can obtain more accurate acceleration information by using the acceleration reference point AO appropriate to the state (posture, etc.) of the vehicle 12.
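The selection rule above can be condensed into a small helper; the function name and the subtraction-based correction are assumptions made for illustration.

```python
def corrected_acceleration(raw: float, alpha: float, threshold: float,
                           ref_low: float, ref_high: float) -> float:
    """Correct a raw reading of the acceleration sensor 14.

    Chooses the 1st reference point O1 (ref_low) or the 2nd reference
    point O2 (ref_high) by comparing the offset amount α to the
    threshold Th, then removes the reference bias (assumed additive).
    """
    reference = ref_low if alpha <= threshold else ref_high
    return raw - reference
```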
When learning the acceleration reference point AO, the reference point setting unit 70 uses the detection information detected by the acceleration sensor 14 while the vehicle 12 is stopped. Accordingly, the vehicle device 10 learns from detection information that is free of the accelerations experienced while the vehicle 12 is traveling, which further improves the accuracy of the acceleration reference point AO.
The vehicle device 10 processes the acceleration information calculated from the acceleration reference point AO to calculate a pitch angle of the vehicle 12, and recognizes the traveling state of the vehicle 12 from the calculated pitch angle. Because the pitch angle (posture) of the vehicle 12 is calculated from acceleration information based on the acceleration reference point AO, the vehicle device 10 can calculate it accurately and reliably recognize the traveling state (for example, traveling on a slope inclined with respect to the horizontal).
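One plausible formulation (not necessarily the patented one) recovers the pitch from the gravity component seen on the sensor's vertical axis: while the vehicle is neither accelerating nor braking, that axis measures g·cos(pitch), so the deviation from g encodes the angle.

```python
import math

G = 9.80665  # standard gravity [m/s^2]

def pitch_angle_deg(vertical_accel: float) -> float:
    """Estimate the pitch angle from the corrected vertical acceleration.

    Assumption: the vehicle is not accelerating, so the vertical axis of
    the acceleration sensor 14 reads g*cos(pitch). The ratio is clamped
    to guard the acos domain against sensor noise.
    """
    ratio = max(-1.0, min(1.0, vertical_accel / G))
    return math.degrees(math.acos(ratio))
```

On flat ground the reading equals g and the estimate is 0°; on a 5° slope the reading drops to g·cos(5°) and the 5° pitch is recovered.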
Further, when the vehicle 12 is traveling near an exit or an entrance of the expressway HR, the vehicle device 10 determines from the calculated pitch angle whether the vehicle 12 is entering or exiting the expressway HR. Accordingly, the vehicle device 10 can determine entry to and exit from the expressway HR with higher accuracy.
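Assuming an elevated expressway HR, where entering means climbing a ramp and exiting means descending, the determination might be sketched as below; the pitch threshold and the sign convention are hypothetical.

```python
RAMP_PITCH_DEG = 2.0  # illustrative threshold for a meaningful ramp slope

def judge_highway_transition(near_ramp: bool, pitch_deg: float) -> str:
    """Judge expressway entry/exit only near a known ramp location.

    Hypothetical convention: positive pitch = nose up = climbing onto an
    elevated expressway; negative pitch = descending off it. Outside the
    ramp vicinity, or on small pitch, no judgment is made.
    """
    if not near_ramp or abs(pitch_deg) < RAMP_PITCH_DEG:
        return "none"
    return "entering" if pitch_deg > 0 else "exiting"
```

Restricting the judgment to the ramp vicinity (from map position data) is what keeps ordinary hills from being misread as expressway transitions.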
Further, a 2nd aspect of the present invention is an acceleration correction method that obtains acceleration information by detecting at least an acceleration in a vertical direction with an acceleration sensor 14 mounted on a vehicle 12 and correcting the detection information based on an acceleration reference point AO. The acceleration correction method includes a vanishing point calculation step of calculating a vanishing point VP based on imaging information PI of an imaging device 40 that is mounted on the vehicle 12 and images the outside of the vehicle 12; an offset amount calculation step of calculating an offset amount α of the vanishing point VP with respect to a vanishing point reference value VO; and a reference point setting step of setting the acceleration reference point AO. In the reference point setting step, gradient information GI of the current position of the vehicle 12 is acquired, and when the current position of the vehicle 12 is at the predetermined gradient, the detection information detected by the acceleration sensor 14 is learned based on the offset amount α to set the acceleration reference point AO. Accordingly, the acceleration correction method can set the acceleration reference point AO appropriately and correct the detection information detected by the acceleration sensor 14 with high accuracy.
A 3rd aspect of the present invention is a storage medium (memory 30) storing an acceleration correction program (program 36) that obtains acceleration information by detecting at least an acceleration in a vertical direction with an acceleration sensor 14 mounted on a vehicle 12 and correcting the detection information based on an acceleration reference point AO. The correction program executes a reference point setting step of setting the acceleration reference point AO. In the reference point setting step, gradient information of the current position of the vehicle 12 is acquired to determine whether or not the current position of the vehicle 12 is at the predetermined gradient, and an offset amount α of a vanishing point VP, calculated based on imaging information PI of an imaging device 40 that images the outside of the vehicle 12, with respect to a vanishing point reference value VO is acquired. When the current position of the vehicle 12 is at the predetermined gradient, the acceleration reference point AO is set by learning the detection information detected by the acceleration sensor 14 based on the offset amount α.

Claims (10)

1. A vehicle device having an acceleration sensor that is mounted on a vehicle and detects at least an acceleration in a vertical direction, the vehicle device obtaining acceleration information by correcting detection information detected by the acceleration sensor based on an acceleration reference point,
comprising an imaging device, a vanishing point calculation unit, an offset amount calculation unit, and a reference point setting unit, wherein
the imaging device is mounted on the vehicle and images the outside of the vehicle;
the vanishing point calculation unit calculates a vanishing point based on imaging information captured by the imaging device;
the offset amount calculation unit calculates an offset amount of the vanishing point with respect to a vanishing point reference value;
the reference point setting unit sets the acceleration reference point,
the reference point setting unit sets the acceleration reference point by learning the detection information detected by the acceleration sensor based on the offset amount, when gradient information of a current position of the vehicle is acquired and the current position of the vehicle is a predetermined gradient.
2. The device for a vehicle according to claim 1,
further comprising an offset amount determination unit that determines whether or not the offset amount is equal to or less than a predetermined offset amount threshold, wherein
the reference point setting unit includes a low offset learning unit that stores the detection information having the offset equal to or less than the offset threshold as a low offset acceleration, and performs cumulative learning on the plurality of stored low offset accelerations to calculate a low offset reference point as the acceleration reference point.
3. The device for a vehicle according to claim 2,
the reference point setting unit includes a high offset learning unit that stores the detection information in which the offset exceeds the offset threshold as a high offset acceleration, and learns the stored high offset acceleration by a method other than the cumulative learning to calculate a high offset reference point as the acceleration reference point.
4. The device for a vehicle according to claim 3,
the high offset learning unit calculates a weighted moving average as the high offset reference point using a plurality of the high offset accelerations.
5. The device for a vehicle according to claim 3 or 4,
the vehicle device calculates the acceleration information using the low offset reference point when the offset amount is equal to or less than the offset amount threshold, and calculates the acceleration information using the high offset reference point when the offset amount exceeds the offset amount threshold.
6. The device for a vehicle according to claim 1,
the reference point setting unit uses the detection information detected by the acceleration sensor in a stopped state of the vehicle when learning the acceleration reference point.
7. The device for a vehicle according to claim 1,
the vehicle device processes the acceleration information calculated from the acceleration reference point to calculate a pitch angle of the vehicle, and recognizes a driving state of the vehicle from the calculated pitch angle of the vehicle.
8. The device for a vehicle according to claim 7,
when the vehicle is traveling near an exit or near an entrance of an expressway, the vehicle apparatus determines that the vehicle enters the expressway or that the vehicle exits the expressway, based on the calculated pitch angle.
9. An acceleration correction method for obtaining acceleration information by detecting at least an acceleration in a vertical direction with an acceleration sensor mounted on a vehicle and correcting the detected information based on an acceleration reference point,
having a vanishing point calculating step, an offset calculating step, and a reference point setting step, wherein,
the vanishing point calculating step is a step of calculating a vanishing point based on imaging information of an imaging device mounted on the vehicle and imaging an outside of the vehicle;
the offset amount calculating step is a step of calculating an offset amount of the vanishing point with respect to a vanishing point reference value;
the reference point setting step is a step of setting the acceleration reference point,
in the reference point setting step, when gradient information of a current position of the vehicle is acquired and the current position of the vehicle is a predetermined gradient, the acceleration reference point is set by learning the detection information detected by the acceleration sensor based on the offset amount.
10. A storage medium storing an acceleration correction program for obtaining acceleration information by detecting at least acceleration in a vertical direction by an acceleration sensor mounted on a vehicle and correcting the detected information based on an acceleration reference point,
the correction program performs a reference point setting step of setting the acceleration reference point,
in the reference point setting step:
obtaining gradient information of a current position of the vehicle to determine whether the current position of the vehicle is a prescribed gradient,
acquiring an offset amount, with respect to a vanishing point reference value, of a vanishing point calculated from imaging information of an imaging device that images the outside of the vehicle, and
the acceleration reference point is set by learning the detection information detected by the acceleration sensor based on the offset amount when the current position of the vehicle is a predetermined gradient.
CN202010092184.4A 2020-02-14 2020-02-14 Vehicle device and acceleration correction method Pending CN113341942A (en)

Publications (1)

Publication Number Publication Date
CN113341942A (en) 2021-09-03

Family

ID=77466940

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000255319A (en) * 1999-03-10 2000-09-19 Fuji Heavy Ind Ltd Vehicle running direction recognizing device
JP2017090159A (en) * 2015-11-06 2017-05-25 株式会社日本自動車部品総合研究所 Vehicle pitch angle estimation device
JP2019090615A (en) * 2017-11-10 2019-06-13 株式会社Soken Attitude detection device and attitude detection program
WO2020021842A1 (en) * 2018-07-25 2020-01-30 株式会社デンソー Vehicle display control device, vehicle display control method, control program, and persistent tangible computer-readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination