CN110312639A - Vehicle driving assistance device, vehicle, and information processing method - Google Patents
Vehicle driving assistance device, vehicle, and information processing method
- Publication number
- CN110312639A (application number CN201880011498.8A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- range
- camera
- distance
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0841—Registering performance data
- G07C5/085—Registering performance data using electronic data carriers
- G07C5/0866—Registering performance data using electronic data carriers the electronic data carrier being a digital video recorder in combination with video camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
Abstract
A vehicle driving assistance device (10) includes a multi-view camera (12) and an information processing system (14). The multi-view camera (12) captures images of the scene within a target angular range to the left of, to the right of, or behind a vehicle (1). The information processing system (14) calculates, from the multi-view images captured by the multi-view camera (12), the current distance between the vehicle and objects within the target angular range. The vehicle driving assistance device broadens the usage scenarios of vehicle driving assistance devices. A vehicle and an information processing method are also provided.
Description
Copyright notice
The disclosure of this patent document contains material which is subject to copyright protection. The copyright belongs to the copyright owner. The copyright owner has no objection to the reproduction by anyone of the patent document or the patent disclosure as it appears in the official records and files of the Patent and Trademark Office.
Technical field
This application relates to the field of vehicles and, more specifically, to a vehicle driving assistance device, a vehicle, and an information processing method.
Background art
With the widespread use of vehicles, vehicle driving assistance technology has attracted increasing attention. Such technology can provide a driver with information about the vehicle's surroundings (for example, video and/or acoustic information) to assist driving or to provide evidence when an accident occurs. The information that traditional vehicle driving assistance devices can provide is limited, which restricts their usage scenarios.
Summary of the invention
This application provides a vehicle driving assistance device, a vehicle, and an information processing method that can enrich the functions of the vehicle driving assistance device and broaden its usage scenarios.
In a first aspect, a vehicle driving assistance device is provided, including: a multi-view camera for capturing images of the scene within a target angular range to the left of, to the right of, or behind a vehicle; and an information processing system for obtaining the multi-view images captured by the multi-view camera and calculating, from the multi-view images, the current distance between the vehicle and objects within the target angular range.
In a second aspect, a vehicle is provided, including the vehicle driving assistance device of the first aspect.
In a third aspect, an information processing method is provided. The method is applied to a vehicle driving assistance device of a vehicle, the device including a multi-view camera for capturing images of the scene within a target angular range to the left of, to the right of, or behind the vehicle. The method includes: obtaining the multi-view images captured by the multi-view camera; and calculating, from the multi-view images, the current distance between the vehicle and objects within the target angular range.
By adopting a multi-view camera scheme at the rear, left, or right of the vehicle, the vehicle driving assistance device can provide distance information for the rear, left, or right of the vehicle, which enriches the functions of the vehicle driving assistance device and broadens its usage scenarios.
Brief description of the drawings
Fig. 1 is an exemplary diagram of the installation position of a vehicle driving assistance device in a vehicle according to an embodiment of the application.
Fig. 2 is a schematic flowchart of an information processing method according to an embodiment of the application.
Fig. 3 is a schematic flowchart of one possible implementation of step S24 in Fig. 2.
Fig. 4 is a schematic flowchart of another possible implementation of step S24 in Fig. 2.
Fig. 5 is an exemplary diagram of collision warning information according to an embodiment of the application.
Fig. 6 is a schematic flowchart of an information processing method according to another embodiment of the application.
Detailed description
Many types of vehicle driving assistance devices are installed on vehicles. Devices such as reversing radars and dash cameras can provide a driver with image information about the vehicle's surroundings. However, the image information these devices can provide is limited, usually only a single kind of image information, which restricts their usage scenarios. The dash camera is taken as an example below.
A dash camera is commonly used to record images while the vehicle is travelling, so as to provide evidence for traffic accidents. Panoramic dash cameras, which can record images over a 360° range, are increasingly becoming the first choice of many car owners.
A traditional dash camera usually records images of the vehicle's surroundings with a monocular camera. Taking a panoramic dash camera as an example, such a recorder usually requires one camera in each of the four directions (front, rear, left, and right) of the vehicle. The images captured by the four cameras can be stitched together and displayed on the center console or on the owner's mobile phone, showing a 360° view around the vehicle. However, because a traditional dash camera captures images around the vehicle with monocular cameras, it can obtain only single-view image information.
The vehicle driving assistance device provided by the embodiments of the application is described in detail below with reference to Fig. 1. As shown in Fig. 1, a vehicle driving assistance device 10 is installed on a vehicle 1. The vehicle driving assistance device 10 may include a multi-view camera 12 (cameras 12a and 12b in Fig. 1) and an information processing system 14.
The multi-view camera 12 may be used to capture images of the scene within a target angular range to the left of, to the right of, or behind the vehicle 1.
The multi-view camera 12 may include two or more cameras. As one example, the multi-view camera 12 may include two cameras for capturing color images (i.e., a binocular camera). As another example, the multi-view camera 12 may include two cameras for capturing color images and one camera for capturing grayscale images (i.e., a trinocular camera).
As shown in Fig. 1, the multi-view camera 12 may include binocular cameras 12a and 12b, which can be used to capture binocular images (including a left-eye image and a right-eye image).
The multi-view camera 12 may be located at the rear, left, or right of the vehicle 1. Taking a rear-mounted multi-view camera as an example, as shown in Fig. 1, the multi-view camera 12 may be mounted at the rear window of the vehicle 1, for example at the top of the rear window. Alternatively, the multi-view camera 12 may be mounted at the license plate of the vehicle, for example near the middle of the license plate or at its top. Because the binocular cameras 12a and 12b in Fig. 1 are mounted at the rear of the vehicle, they may also be called rear-view binocular cameras.
The multi-view camera 12 may be used to capture images of the scene within the target range. The embodiments of the application do not specifically limit the size of the target angular range; it can be determined by factors such as the installation position and the field of view of the multi-view camera 12. The target angular range may, for example, be a 90-degree angular range or a 135-degree angular range.
The information processing system 14 may be integrated with the multi-view camera 12, or the two may be separate from each other (as shown in Fig. 1), as long as the information processing system 14 and the multi-view camera 12 are communicatively connected.
Fig. 2 is a schematic flowchart of an information processing method provided by an embodiment of the application. The method of Fig. 2 can be executed by the information processing system 14 in Fig. 1 and may include steps S22 and S24.
In step S22, the multi-view images captured by the multi-view camera are obtained.
In step S24, the current distance between the vehicle and objects within the target angular range is calculated from the multi-view images.
Because of the different installation positions, there is parallax between the multi-view images. A disparity map can therefore be generated by matching the multi-view images, and from it the current distance between the vehicle and objects within the target angular range is obtained.
The embodiments of the application adopt a multi-view camera scheme at the rear, left, or right of the vehicle, so that the vehicle driving assistance device can provide distance information for the rear, left, or right of the vehicle, broadening its usage scenarios. For example, the distance information provided by the vehicle driving assistance device can be used for collision warning, making the vehicle's travel safer.
The multi-view images captured by the multi-view camera may be matched to obtain distance information about the objects in the scene within the target angular range. Matching of multi-view images may also be called registration of multi-view images. Taking binocular images (a left-eye image and a right-eye image) as an example, a disparity map of the scene can be calculated from the left-eye and right-eye images, and from it a depth map of the scene. When the multi-view images include three or more images, depth maps can be calculated from the images pairwise and finally combined into a depth map of the entire scene.
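As an illustrative sketch (not part of the claimed embodiments), the conversion from a matched disparity value to a depth value for a rectified binocular pair can be written as Z = f·B/d, where f is the focal length in pixels, B is the baseline between the two cameras, and d is the disparity; the focal length, baseline, and disparity below are hypothetical values:

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth Z = f * B / d for one pixel of a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A hypothetical rear-view binocular camera: 800 px focal length,
# 12 cm baseline; a point matched with 16 px of disparity lies at
# 800 * 0.12 / 16 = 6 m behind the vehicle.
depth = disparity_to_depth(16.0, 800.0, 0.12)
```

Applying the same conversion to every matched pixel of the disparity map yields the depth map of the scene.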
The matching quality of the multi-view images directly affects the accuracy of the distance information calculated by the information processing system 14, and it is usually related to the environment of the vehicle. For example, in a low-light scene or a scene with little texture, the matching result of the multi-view images may be inaccurate, and the distance information calculated by the information processing system 14 is then also inaccurate.
One possible implementation of step S24 that improves the accuracy of the distance information calculated by the information processing system 14 is given below with reference to Fig. 3.
In step S32, first distance information output by a rangefinder module is received.
The first distance information may indicate the current distance between the vehicle and an object in a target angular direction within the target angular range.
The rangefinder module may, for example, be a radar sensor installed on the vehicle (such as a reversing radar). The rangefinder module may be an ultrasound-based rangefinder module, a frequency-modulated continuous wave (FMCW) rangefinder module, or a laser-based rangefinder module such as a light detection and ranging (Lidar) system.
The target angular direction may be one or more angular directions within the target angular range. Taking a reversing radar as an example, the target angular directions measured by the rangefinder module may include the left-rear, straight-rear, and right-rear directions of the vehicle.
In step S34, the multi-view images are matched with the first distance information as a reference to obtain a depth map, such that the difference between second distance information in the depth map and the first distance information is less than a preset threshold.
Like the first distance information, the second distance information can indicate the current distance between the vehicle and the object in the target angular direction. The difference between them is that the first distance information is measured by the rangefinder module, whereas the second distance information is obtained by matching the multi-view images.
It should be noted that the rangefinder module and the multi-view camera are typically mounted at different positions on the vehicle. To facilitate comparison of the distance information, the first distance information output by the rangefinder module can be coordinate-transformed and corrected (for example, transformed into the camera coordinate system of the multi-view camera) so that the first distance information and the second distance information share the same reference. Of course, if the rangefinder module and the multi-view camera are installed very close to each other, the first distance information and the second distance information can also be regarded as distance information collected under the same reference.
Step S34 can be understood as a fusion of the distance information measured by the rangefinder module with the distance information in the depth map, the purpose being to correct the matching result of the multi-view images to some degree. Step S34 can be implemented in many ways. As one example, the multi-view images are matched first to obtain an initial depth map; when the difference between the second distance information in the initial depth map and the first distance information is greater than the preset threshold, the matching relationship of the pixels in the multi-view images is readjusted until the difference between the second distance information in the calculated depth map and the first distance information is less than the preset threshold. Other fusion modes may of course be used, for example directly replacing the second distance information in the depth map with the first distance information.
In some embodiments, when the environment of the vehicle satisfies a preset condition, the multi-view images may be matched directly without using the first distance information as a reference, and the first distance information is used as a reference for matching only when the environment of the vehicle does not satisfy the preset condition. The environment not satisfying the preset condition may mean that the vehicle is in a low-light environment or an environment with weak texture. This implementation improves the flexibility of the algorithm and reduces its computational load.
In step S36, the current distance between the vehicle and objects within the target angular range is calculated from the depth map.
Because the rangefinder module is less affected by the vehicle's environment, the distance it measures is usually relatively accurate. The embodiments of the application use the distance measured by the rangefinder module as a reference to correct the matching result of the multi-view images, which can improve the accuracy of the distance information.
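As an illustrative sketch of the simplest fusion mode mentioned above (directly replacing the second distance information in the depth map with the first distance information), the following hypothetical example substitutes a rangefinder measurement for the stereo estimate wherever the two differ by more than the preset threshold; the data layout, pixel indexing, and threshold are assumptions for illustration only:

```python
def fuse_depths(stereo_depths, radar_depths, threshold_m):
    """stereo_depths: per-pixel distances from multi-view matching (m).
    radar_depths: pixel index -> rangefinder distance (m), already
    transformed into the camera coordinate system."""
    fused = list(stereo_depths)
    for idx, radar_d in radar_depths.items():
        if abs(fused[idx] - radar_d) > threshold_m:
            fused[idx] = radar_d  # trust the rangefinder in this direction
    return fused

# Pixel 1 agrees with the radar within the 0.5 m threshold and is kept;
# pixel 2 differs by 0.8 m and is overwritten.
fused = fuse_depths([5.0, 9.8, 2.1], {1: 10.0, 2: 2.9}, 0.5)
```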
The embodiment shown in Fig. 3 gives one possible implementation of step S24. Another possible implementation of step S24 is given below with reference to Fig. 4, which includes steps S42 to S48. These steps are described in detail below.
In step S42, first distance information output by a rangefinder module is received.
The first distance information may indicate the current distance between the vehicle and objects within the target angular range.
The rangefinder module may, for example, be a radar sensor installed on the vehicle (such as a reversing radar). The rangefinder module may be an ultrasound-based rangefinder module, an FMCW rangefinder module, or a laser-based rangefinder module such as a Lidar system.
In step S44, objects within the target angular range are recognized from the multi-view images.
There are many ways to recognize objects in images: a traditional image recognition algorithm based on support vector machines may be used, or a pre-trained neural network model; the embodiments of the application do not limit this.
In step S46, the first distance information is used as the distance information of the objects within the target angular range to form a depth map.
In other words, step S46 labels or marks the distances of the objects in the multi-view images with the first distance information, thereby generating a depth map.
In step S48, the current distance between the vehicle and the objects within the target angular range is calculated from the depth map.
Because the rangefinder module is less affected by the vehicle's environment, the distance it measures is usually relatively accurate. The embodiments of the application directly use the distance measured by the rangefinder module as the distance information of the objects in the multi-view images, which can improve the ranging accuracy.
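Steps S44 and S46 can be sketched as follows: each object recognized in the multi-view images is assigned the rangefinder reading whose angular direction is closest to the object's direction. The angles, distances, and nearest-angle matching rule are hypothetical illustrations, not the claimed method:

```python
def label_objects(object_angles_deg, radar_readings):
    """radar_readings: list of (angle_deg, distance_m) pairs from the
    rangefinder; each recognized object is labelled with the reading
    closest to it in angle."""
    labelled = {}
    for obj_angle in object_angles_deg:
        _, dist = min(radar_readings,
                      key=lambda reading: abs(reading[0] - obj_angle))
        labelled[obj_angle] = dist
    return labelled

# Two objects recognized at 170 deg and 95 deg; three radar readings.
labels = label_objects([170.0, 95.0],
                       [(90.0, 3.2), (135.0, 7.5), (180.0, 1.1)])
```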
As indicated above, the multi-view camera scheme enables the vehicle driving assistance device to provide distance information about the vehicle's surroundings, which broadens its usage scenarios. The functions and usage scenarios of the vehicle driving assistance device provided by the embodiments of the application are illustrated in detail below.
As one example, the vehicle driving assistance device provided by the embodiments of the application can be applied to collision warning. For example, the information processing system 14 can generate collision warning information from the current distance between the vehicle and objects within the target angular range.
A conventional vehicle usually uses a reversing radar to provide collision warning information, but a reversing radar can provide warnings only in a limited number of directions, depending on the number and installation positions of its sensors (such as ultrasonic sensors). For example, a traditional reversing radar can typically provide collision warning information only for the left-rear, straight-rear, and right-rear directions of the vehicle. Unlike a reversing radar, the embodiments of the application provide distance information using a multi-view camera. Because the distance information provided by the multi-view camera may include the current distance in every angular direction within the target angular range, the collision warning information provided by the embodiments of the application may also include warning information corresponding to every angle within the target angular range, which improves the warning effect.
Optionally, in some embodiments, the collision warning information provided by the embodiments of the application may be obtained by fusing the distance information provided by the multi-view camera with the distance information provided by a reversing radar, to further improve the warning effect.
Optionally, in some embodiments, the information processing system 14 can also be used to control a display screen to show a warning graphic representing the collision warning information.
The display screen may, for example, be the screen of the owner's mobile phone, or a screen on the center console of the vehicle 1.
As shown in Fig. 5, the warning graphic may include at least one arc. The arc may include points corresponding to the individual angular directions within the target angular range, and the color of the arc can represent the current distance between the vehicle and objects within the target angular range. (The colors of the arc are not shown in Fig. 5; in practice, one arc may have several colors. For example, the two ends of an arc may be red and the middle green, with a gradual transition through other colors between red and green. Red can indicate that an object is close to the vehicle, to alert the driver; green can indicate that the object is relatively far away.)
Unlike a traditional radar warning graphic, the warning graphic provided by the embodiments of the application can warn in all directions around the vehicle.
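The red-to-green coding of the arc can be sketched as a simple mapping from distance to color: red below a near threshold, green above a far threshold, and a linear blend in between. The thresholds and RGB blend below are hypothetical examples, not the disclosed warning graphic:

```python
def arc_color(distance_m, near_m=1.0, far_m=5.0):
    """Return an (r, g, b) tuple for one point of the warning arc."""
    if distance_m <= near_m:
        return (255, 0, 0)          # very close: solid red
    if distance_m >= far_m:
        return (0, 255, 0)          # far away: solid green
    t = (distance_m - near_m) / (far_m - near_m)
    return (int(round(255 * (1 - t))), int(round(255 * t)), 0)

# Colors for three angular directions at 0.5 m, 3.0 m, and 6.0 m.
colors = [arc_color(d) for d in (0.5, 3.0, 6.0)]
```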
Optionally, in some embodiments, objects within the target angular range may also be recognized from one or more of the multi-view images, and then, in combination with the distance information within the target angular range (which may be provided by the multi-view camera, by a reversing radar, or by a fusion of the two), objects that are too close to the vehicle may be marked in the image shown on the display screen. For example, when a pedestrian passes behind the vehicle and is too close to it, the pedestrian can be marked in the image in some way, for example by coloring the pedestrian or by using another warning label.
The vehicle driving assistance device described above may be a dash camera, for example a dash camera to which the function of acquiring distance information has been added, so that the dash camera can be applied to a wider range of scenarios. The dash camera may be an ordinary dash camera or a panoramic dash camera; the embodiments of the application do not limit this.
The above describes the application of the vehicle driving assistance device provided by the embodiments of the application in collision warning. Its application in collision recording is provided below.
A conventional vehicle driving assistance device (such as a dash camera) has a collision recording function, but it starts the video recording function only when it detects that the vehicle shakes and determines that a collision has occurred. The collision recording function of a conventional vehicle driving assistance device can therefore be understood as a passive collision recording function. The vehicle driving assistance device provided by the embodiments of the application has an active collision recording function: it predicts whether the vehicle may collide and, if a collision is possible, starts the video recording function. The execution flow of the active collision recording function is described in detail below with reference to Fig. 6.
The method of Fig. 6 can be executed by the information processing system 14 in the vehicle driving assistance device 10 and includes steps S62 and S64.
In step S62, the possibility of a collision between the vehicle and an object within the target angular range is determined from the current distance between the object and the vehicle and from how the historical distance between them has changed.
Step S62 can be implemented in many ways.
As one example, when the current distance is less than the historical distance and the current distance is less than a preset threshold, it is determined that the vehicle may collide with the object within the target angular range. A current distance smaller than the historical distance indicates that the object is approaching the vehicle, and a current distance below the preset threshold indicates that the object and the vehicle are very close; at this point it can be determined that there is a possibility of a collision between the object and the vehicle.
As another example, it can be determined whether the difference between the historical distance and the current distance is greater than some threshold; if so, it can be determined that there is a possibility of a collision between the object and the vehicle. Because the distance between the vehicle and the object is usually sampled at fixed intervals, a difference between the historical distance and the current distance that exceeds the threshold indicates that the object is rapidly approaching the vehicle, and at this point it can be determined that there is a possibility of a collision.
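The two example judgments above can be combined in a single sketch: a collision is considered possible when the object is approaching and either it is already very close, or it is closing faster than some threshold per sampling interval. All threshold values are hypothetical illustrations:

```python
def collision_possible(history_m, current_m,
                       close_threshold_m=2.0, closing_threshold_m=1.5):
    """True when the object is approaching and is either already
    closer than close_threshold_m or has closed more than
    closing_threshold_m since the previous sampling instant."""
    approaching = current_m < history_m
    too_close = current_m < close_threshold_m
    closing_fast = (history_m - current_m) > closing_threshold_m
    return approaching and (too_close or closing_fast)

# An object at 1.5 m that was at 3.0 m triggers a warning; one that
# drifted slowly from 5.0 m to 4.5 m does not.
alerts = [collision_possible(3.0, 1.5), collision_possible(5.0, 4.5)]
```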
In step S64, when it is determined that the vehicle may collide with an object within the target angular range, video is recorded with the multi-view camera.
The current distance between the vehicle and an object within the target angular range may be the distance information collected by the information processing system 14 at the current sampling instant, and the historical distance may be the distance information collected by the information processing system 14 at one or more previous sampling instants. Step S62 can, for example, be implemented as follows. First, the current distance is compared with the historical distance. If an object within the target angular range is approaching the vehicle and the distance between the object and the vehicle is less than a preset threshold, it is determined that a collision between the vehicle and the object within the target angular range is possible, the collision recording function of the vehicle driving assistance device is activated, and video is recorded with the multi-view camera.
In addition, in some cases the recording function of the multi-view camera may be off; for example, the driver may have turned off the collision recording function of the vehicle driving assistance device. Therefore, when it is determined that the vehicle may collide with an object within the target angular range and the recording function of the multi-view camera is off, the recording function of the multi-view camera can be forcibly turned on and video recorded with the multi-view camera.
The vehicle driving assistance device provided by the application can also be applied to recording special events while the vehicle is parked. For example, when the vehicle is parked, the information processing system 14 can also judge from the multi-view images whether a person or object is approaching the vehicle; when it judges that a person or object is approaching, it turns on the recording function of the multi-view camera and records video with the multi-view camera, improving the safety of the vehicle while parked.
An embodiment of the application also provides a vehicle, which may be the vehicle 1 shown in Fig. 1. The vehicle 1 includes the vehicle driving assistance device 10. Optionally, in some embodiments, the multi-view camera 12 of the vehicle driving assistance device 10 may be mounted on the rear window of the vehicle 1, on the license plate, or at a position around the license plate.
An embodiment of the application also provides an information processing method. The method can be applied to a vehicle driving assistance device of a vehicle, the device including a multi-view camera for capturing images of the scene within a target angular range to the left of, to the right of, or behind the vehicle. The method may include steps S22 and S24 shown in Fig. 2.
Optionally, the method may also include the steps shown in Fig. 3.
Optionally, the method may also include the steps shown in Fig. 4.
Optionally, the information processing method may also include: generating collision warning information from the current distance between the vehicle and objects within the target angular range.
Optionally, the collision warning information may include warning information corresponding to every angle within the target angular range.
Optionally, the information processing method may also include: controlling a display screen to show a warning graphic representing the collision warning information, the warning graphic including at least one arc, the arc including points corresponding to the individual angles within the target angular range, and the color of the arc representing the current distance between the vehicle and objects within the target angular range.
Optionally, the method further includes the step shown in Fig. 5.
The above embodiments may be implemented, in whole or in part, by software, hardware, firmware, or any combination thereof. When implemented in software, they may be realized wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., digital video disc (DVD)), a semiconductor medium (e.g., solid state disk (SSD)), or the like.
A person of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered to exceed the scope of the present application.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other ways. For example, the device embodiments described above are merely illustrative. The division into units is only one kind of logical-function division; in actual implementation there may be other ways of dividing, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiment's solution. In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto. Any change or substitution readily conceivable by a person familiar with the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (31)
1. A vehicle driver-assistance device, comprising:
a multi-lens camera, configured to capture images of the scene within a target viewing-angle range to the left, right, or rear of a vehicle; and
an information processing system, configured to obtain the multi-view images captured by the multi-lens camera and to calculate, from the multi-view images, the current distance between an object within the target viewing-angle range and the vehicle.
2. The vehicle driver-assistance device according to claim 1, wherein the information processing system is further configured to receive first distance information output by a ranging module, the first distance information indicating the current distance between an object in a target angular direction within the target viewing-angle range and the vehicle;
and wherein calculating, from the multi-view images, the current distance between the object within the target viewing-angle range and the vehicle comprises:
matching the multi-view images with the first distance information as a reference to obtain a depth map, such that the difference between second distance information in the depth map and the first distance information is less than a preset threshold, the second distance information indicating the current distance between the object in the target angular direction and the vehicle; and
calculating, from the depth map, the current distance between the object within the target viewing-angle range and the vehicle.
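Claim 2's idea of using the radar distance as a reference during stereo matching can be sketched as follows. The focal length, baseline, and error threshold are hypothetical values, and a real matcher would score image patches rather than choose from a list of candidate disparities:

```python
from typing import Iterable, Optional

def depth_from_disparity(disparity_px: float, focal_px: float,
                         baseline_m: float) -> float:
    """Standard pinhole-stereo relation: depth = f * B / disparity."""
    return focal_px * baseline_m / disparity_px

def match_with_radar_prior(candidate_disparities: Iterable[float],
                           radar_distance_m: float,
                           focal_px: float = 800.0,
                           baseline_m: float = 0.12,
                           max_error_m: float = 0.5) -> Optional[float]:
    """Keep the candidate disparity whose implied depth agrees with the radar
    ('first distance information') to within max_error_m, the claim's
    'preset threshold'. Returns None if no candidate is consistent."""
    best = None
    for d in candidate_disparities:
        err = abs(depth_from_disparity(d, focal_px, baseline_m) - radar_distance_m)
        if err < max_error_m and (best is None or err < best[1]):
            best = (d, err)
    return best[0] if best else None
```

The radar reading thus disambiguates between otherwise plausible stereo matches, which is useful when texture is poor or lighting is bad.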
3. The vehicle driver-assistance device according to claim 2, wherein matching the multi-view images with the first distance information as a reference comprises:
matching the multi-view images with the first distance information as a reference when the environment of the vehicle does not satisfy a preset condition;
and wherein the information processing system is further configured to:
match the multi-view images directly when the environment of the vehicle satisfies the preset condition.
4. The vehicle driver-assistance device according to claim 1, wherein the information processing system is further configured to receive first distance information output by a ranging module, the first distance information indicating the current distance between an object within the target viewing-angle range and the vehicle;
and wherein calculating, from the multi-view images, the current distance between the object within the target viewing-angle range and the vehicle comprises:
recognizing the object within the target viewing-angle range using the multi-view images;
using the first distance information as the distance information of the object within the target viewing-angle range to form a depth map; and
calculating, from the depth map, the current distance between the object within the target viewing-angle range and the vehicle.
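A minimal sketch of the claim-4 approach, in which the radar distance is assigned to every pixel recognized as belonging to the object to form a sparse depth map. The boolean-mask representation of the recognized object is an assumption for illustration:

```python
from typing import List, Optional

def build_sparse_depth(object_mask: List[List[bool]],
                       radar_distance_m: float) -> List[List[Optional[float]]]:
    """Assign the ranging module's distance to all pixels of the recognized
    object; pixels outside the object get no depth (None)."""
    return [[radar_distance_m if inside else None for inside in row]
            for row in object_mask]
```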
5. The vehicle driver-assistance device according to any one of claims 2-4, wherein the ranging module is a radar sensor installed on the vehicle.
6. The vehicle driver-assistance device according to any one of claims 1-5, wherein the information processing system is further configured to generate anti-collision warning information according to the current distance between the object within the target viewing-angle range and the vehicle.
7. The vehicle driver-assistance device according to claim 6, wherein the anti-collision warning information includes anti-collision warning information corresponding to each angle within the target viewing-angle range.
8. The vehicle driver-assistance device according to claim 6 or 7, wherein the information processing system is further configured to control a display screen to show a warning graphic representing the anti-collision warning information, the warning graphic including at least one arc, the arc containing a point corresponding to each angle within the target viewing-angle range, and the color of the arc representing the current distance between the object within the target viewing-angle range and the vehicle.
9. The vehicle driver-assistance device according to any one of claims 1-8, wherein the information processing system is further configured to determine the possibility of a collision between the vehicle and the object within the target viewing-angle range according to the current distance between the object within the target viewing-angle range and the vehicle and the change in the historical distance between them; and, upon determining that the vehicle and the object within the target viewing-angle range may collide, to record video using the multi-lens camera.
10. The vehicle driver-assistance device according to claim 9, wherein recording video using the multi-lens camera upon determining that the vehicle and the object within the target viewing-angle range may collide comprises:
when it is determined that the vehicle and the object within the target viewing-angle range may collide and the recording function of the multi-lens camera is off, forcibly enabling the recording function of the multi-lens camera and recording video with it.
11. The vehicle driver-assistance device according to claim 9 or 10, wherein determining the possibility of a collision between the vehicle and the object within the target viewing-angle range according to the current distance between the object within the target viewing-angle range and the vehicle and the change in the historical distance between them comprises:
determining that the vehicle and the object within the target viewing-angle range may collide when the current distance is less than the historical distance and the current distance is less than a preset threshold.
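The collision condition of claim 11 reduces to a two-part test: the object is getting closer and is already near. The threshold value below is illustrative:

```python
def may_collide(current_m: float, history_m: float,
                threshold_m: float = 2.0) -> bool:
    """Claim-11 rule: the object is approaching (current distance is less
    than the historical distance) AND the current distance is below the
    preset threshold."""
    return current_m < history_m and current_m < threshold_m
```

When this returns True, the device would proceed to force-enable recording as described in claim 10.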
12. The vehicle driver-assistance device according to any one of claims 1-11, wherein the information processing system is further configured to determine, from the multi-view images, whether a person or object is approaching the vehicle when the vehicle is in a parked state, and, upon determining that a person or object is approaching the vehicle, to enable the recording function of the multi-lens camera and record video with it.
13. The vehicle driver-assistance device according to any one of claims 1-12, wherein the multi-lens camera includes two cameras for capturing color images, or includes a camera for capturing color images and two cameras for capturing grayscale images.
14. The vehicle driver-assistance device according to any one of claims 1-13, wherein the vehicle driver-assistance device is a driving recorder (dashcam).
15. The vehicle driver-assistance device according to claim 14, wherein the driving recorder is a panoramic driving recorder.
16. A vehicle, comprising the vehicle driver-assistance device according to any one of claims 1-15.
17. The vehicle according to claim 16, wherein the multi-lens camera is mounted on the rear window of the vehicle, on the license plate, or at a position around the license plate.
18. An information processing method, applied to a vehicle driver-assistance device, the vehicle driver-assistance device including a multi-lens camera for capturing images of the scene within a target viewing-angle range to the left, right, or rear of a vehicle;
the method comprising:
obtaining the multi-view images captured by the multi-lens camera; and
calculating, from the multi-view images, the current distance between an object within the target viewing-angle range and the vehicle.
19. The method according to claim 18, further comprising:
receiving first distance information output by a ranging module, the first distance information indicating the current distance between an object in a target angular direction within the target viewing-angle range and the vehicle;
wherein calculating, from the multi-view images, the current distance between the object within the target viewing-angle range and the vehicle comprises:
matching the multi-view images with the first distance information as a reference to obtain a depth map, such that the difference between second distance information in the depth map and the first distance information is less than a preset threshold, the second distance information indicating the current distance between the object in the target angular direction and the vehicle; and
calculating, from the depth map, the current distance between the object within the target viewing-angle range and the vehicle.
20. The method according to claim 19, wherein matching the multi-view images with the first distance information as a reference comprises:
matching the multi-view images with the first distance information as a reference when the environment of the vehicle does not satisfy a preset condition;
and wherein the method further comprises:
matching the multi-view images directly when the environment of the vehicle satisfies the preset condition.
21. The method according to claim 18, further comprising:
receiving first distance information output by a ranging module, the first distance information indicating the current distance between an object within the target viewing-angle range and the vehicle;
wherein calculating, from the multi-view images, the current distance between the object within the target viewing-angle range and the vehicle comprises:
recognizing the object within the target viewing-angle range using the multi-view images;
using the first distance information as the distance information of the object within the target viewing-angle range to form a depth map; and
calculating, from the depth map, the current distance between the object within the target viewing-angle range and the vehicle.
22. The method according to any one of claims 19-21, wherein the ranging module is a radar sensor installed on the vehicle.
23. The method according to any one of claims 18-22, further comprising:
generating anti-collision warning information according to the current distance between the object within the target viewing-angle range and the vehicle.
24. The method according to claim 23, wherein the anti-collision warning information includes anti-collision warning information corresponding to each angle within the target viewing-angle range.
25. The method according to claim 23 or 24, further comprising:
controlling a display screen to show a warning graphic representing the anti-collision warning information, the warning graphic including at least one arc, the arc containing a point corresponding to each angle within the target viewing-angle range, and the color of the arc representing the current distance between the object within the target viewing-angle range and the vehicle.
26. The method according to any one of claims 18-25, further comprising:
determining the possibility of a collision between the vehicle and the object within the target viewing-angle range according to the current distance between the object within the target viewing-angle range and the vehicle and the change in the historical distance between them; and
recording video using the multi-lens camera upon determining that the vehicle and the object within the target viewing-angle range may collide.
27. The method according to claim 26, wherein recording video using the multi-lens camera upon determining that the vehicle and the object within the target viewing-angle range may collide comprises:
when it is determined that the vehicle and the object within the target viewing-angle range may collide and the recording function of the multi-lens camera is off, forcibly enabling the recording function of the multi-lens camera and recording video with it.
28. The method according to claim 26 or 27, wherein determining the possibility of a collision between the vehicle and the object within the target viewing-angle range according to the current distance between the object within the target viewing-angle range and the vehicle and the change in the historical distance between them comprises:
determining that the vehicle and the object within the target viewing-angle range may collide when the current distance is less than the historical distance and the current distance is less than a preset threshold.
29. The method according to any one of claims 18-28, further comprising:
determining, from the multi-view images, whether a person or object is approaching the vehicle when the vehicle is in a parked state; and
upon determining that a person or object is approaching the vehicle, enabling the recording function of the multi-lens camera and recording video with it.
30. The method according to any one of claims 18-29, wherein the multi-lens camera includes two cameras for capturing color images, or includes a camera for capturing color images and two cameras for capturing grayscale images.
31. The method according to any one of claims 18-30, wherein the vehicle driver-assistance device is a driving recorder.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2018/107517 WO2020061794A1 (en) | 2018-09-26 | 2018-09-26 | Vehicle driver assistance device, vehicle and information processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110312639A true CN110312639A (en) | 2019-10-08 |
Family
ID=68074282
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201880011498.8A Pending CN110312639A (en) | 2018-09-26 | 2018-09-26 | Vehicle assistant drive device, vehicle and information processing method |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110312639A (en) |
WO (1) | WO2020061794A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111986248A (en) * | 2020-08-18 | 2020-11-24 | 东软睿驰汽车技术(沈阳)有限公司 | Multi-view visual perception method and device and automatic driving automobile |
CN112184949A (en) * | 2020-09-29 | 2021-01-05 | 广州星凯跃实业有限公司 | Automobile image monitoring method, device, equipment and storage medium |
CN112937486A (en) * | 2021-03-16 | 2021-06-11 | 吉林大学 | Vehicle-mounted online monitoring and driving assistance system and method for road accumulated water |
CN113727064A (en) * | 2020-05-26 | 2021-11-30 | 北京罗克维尔斯科技有限公司 | Method and device for determining field angle of camera |
CN114523957A (en) * | 2020-10-30 | 2022-05-24 | 丰田自动车株式会社 | Driving support system, driving support method, and storage medium |
CN114913626A (en) * | 2022-05-07 | 2022-08-16 | 中汽创智科技有限公司 | Data processing method, device, equipment and storage medium |
CN115331483A (en) * | 2021-05-11 | 2022-11-11 | 宗盈国际科技股份有限公司 | Intelligent locomotive warning device and system |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113609945B (en) * | 2021-07-27 | 2023-06-13 | 圆周率科技(常州)有限公司 | Image detection method and vehicle |
CN113805566B (en) * | 2021-09-17 | 2023-09-29 | 南斗六星系统集成有限公司 | Detection method and system for integrated auxiliary driving system controller |
CN114407928A (en) * | 2022-01-24 | 2022-04-29 | 中国第一汽车股份有限公司 | Vehicle avoidance control method and vehicle avoidance control device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102881058A (en) * | 2012-06-19 | 2013-01-16 | 浙江吉利汽车研究院有限公司杭州分公司 | System for pre-warning scraping of automobiles and recording evidences |
CN106355675A (en) * | 2016-08-31 | 2017-01-25 | 重庆市朗信智能科技开发有限公司 | OBD hidden type automobile driving record equipment |
CN106846902A (en) * | 2015-12-03 | 2017-06-13 | 财团法人资讯工业策进会 | Vehicle collision avoidance system and method |
CN108108680A (en) * | 2017-12-13 | 2018-06-01 | 长安大学 | A kind of front vehicle identification and distance measuring method based on binocular vision |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4193886B2 (en) * | 2006-07-26 | 2008-12-10 | トヨタ自動車株式会社 | Image display device |
US7728879B2 (en) * | 2006-08-21 | 2010-06-01 | Sanyo Electric Co., Ltd. | Image processor and visual field support device |
CN101763640B (en) * | 2009-12-31 | 2011-10-19 | 无锡易斯科电子技术有限公司 | Online calibration processing method for vehicle-mounted multi-view camera viewing system |
CN106225764A (en) * | 2016-07-01 | 2016-12-14 | 北京小米移动软件有限公司 | Based on the distance-finding method of binocular camera in terminal and terminal |
CN107146247A (en) * | 2017-05-31 | 2017-09-08 | 西安科技大学 | Automobile assistant driving system and method based on binocular camera |
2018
- 2018-09-26 CN CN201880011498.8A patent/CN110312639A/en active Pending
- 2018-09-26 WO PCT/CN2018/107517 patent/WO2020061794A1/en active Application Filing
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113727064A (en) * | 2020-05-26 | 2021-11-30 | 北京罗克维尔斯科技有限公司 | Method and device for determining field angle of camera |
CN113727064B (en) * | 2020-05-26 | 2024-03-22 | 北京罗克维尔斯科技有限公司 | Method and device for determining camera field angle |
CN111986248A (en) * | 2020-08-18 | 2020-11-24 | 东软睿驰汽车技术(沈阳)有限公司 | Multi-view visual perception method and device and automatic driving automobile |
CN111986248B (en) * | 2020-08-18 | 2024-02-09 | 东软睿驰汽车技术(沈阳)有限公司 | Multi-vision sensing method and device and automatic driving automobile |
CN112184949A (en) * | 2020-09-29 | 2021-01-05 | 广州星凯跃实业有限公司 | Automobile image monitoring method, device, equipment and storage medium |
CN114523957A (en) * | 2020-10-30 | 2022-05-24 | 丰田自动车株式会社 | Driving support system, driving support method, and storage medium |
CN114523957B (en) * | 2020-10-30 | 2024-02-09 | 丰田自动车株式会社 | Driving support system, driving support method, and storage medium |
CN112937486A (en) * | 2021-03-16 | 2021-06-11 | 吉林大学 | Vehicle-mounted online monitoring and driving assistance system and method for road accumulated water |
CN112937486B (en) * | 2021-03-16 | 2022-09-02 | 吉林大学 | Vehicle-mounted online monitoring and driving assistance system and method for road accumulated water |
CN115331483A (en) * | 2021-05-11 | 2022-11-11 | 宗盈国际科技股份有限公司 | Intelligent locomotive warning device and system |
CN114913626A (en) * | 2022-05-07 | 2022-08-16 | 中汽创智科技有限公司 | Data processing method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2020061794A1 (en) | 2020-04-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110312639A (en) | Vehicle assistant drive device, vehicle and information processing method | |
US10713490B2 (en) | Traffic monitoring and reporting system and method | |
Garcia et al. | Data fusion for overtaking vehicle detection based on radar and optical flow | |
EP1892149B1 (en) | Method for imaging the surrounding of a vehicle and system therefor | |
US20110182473A1 (en) | System and method for video signal sensing using traffic enforcement cameras | |
CN103786644B (en) | Apparatus and method for following the trail of peripheral vehicle location | |
JP2010198552A (en) | Driving state monitoring device | |
CN109541583A (en) | A kind of leading vehicle distance detection method and system | |
CN107169418A (en) | A kind of obstacle detection method and device | |
CN109070882B (en) | Utilize the driving information providing method and device of camera image | |
CN106503622A (en) | A kind of vehicle antitracking method and device | |
CN110254349A (en) | A kind of vehicle collision prewarning method, system, vehicle and storage medium | |
CN106740476A (en) | A kind of car steering environment control method, device and automobile | |
CN208376628U (en) | A kind of safe driving assistant system | |
CN106203381A (en) | Obstacle detection method and device in a kind of driving | |
CN106809214A (en) | A kind of rear-end collision method for early warning, device and electronic equipment | |
CN115877343B (en) | Man-car matching method and device based on radar target tracking and electronic equipment | |
CN107826092A (en) | Advanced drive assist system and method, equipment, program and medium | |
CN114925747A (en) | Vehicle abnormal running detection method, electronic device, and storage medium | |
CN109703555A (en) | Method and apparatus for detecting object shielded in road traffic | |
CN109901194A (en) | Onboard system, method, equipment and the storage medium of anticollision | |
CN108432242A (en) | Display apparatus, vehicle display methods and program | |
CN107757472A (en) | Door opening alarm method, driving door alarm device and Vehicular system | |
JP2013069045A (en) | Image recognition device, image recognition method, and image recognition program | |
EP3975042B1 (en) | Method and apparatus for determining running region information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20191008 |