CN116946114A - Parking control method, device, vehicle and storage medium - Google Patents

Parking control method, device, vehicle and storage medium

Info

Publication number
CN116946114A
CN116946114A
Authority
CN
China
Prior art keywords
vehicle
obstacle
suspended
determining
parking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210417995.6A
Other languages
Chinese (zh)
Inventor
李梅
刘学良
李小龙
黄泽谦
陈小伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Rockwell Technology Co Ltd
Original Assignee
Beijing Rockwell Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Rockwell Technology Co Ltd filed Critical Beijing Rockwell Technology Co Ltd
Priority to CN202210417995.6A
Publication of CN116946114A
Legal status: Pending


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/06: Automatic manoeuvring for parking
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001: Planning or execution of driving tasks
    • B60W60/0011: Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D: MOTOR VEHICLES; TRAILERS
    • B62D15/00: Steering not otherwise provided for
    • B62D15/02: Steering position indicators; Steering position determination; Steering aids
    • B62D15/027: Parking aids, e.g. instruction means
    • B62D15/0285: Parking performed automatically
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10: Details of viewing arrangements characterised by the type of camera system used
    • B60R2300/102: Details of viewing arrangements using a 360 degree surveillance camera system
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30: Details of viewing arrangements characterised by the type of image processing
    • B60R2300/301: Details of viewing arrangements combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60: Details of viewing arrangements characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/607: Details of viewing arrangements from a bird's eye viewpoint
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80: Details of viewing arrangements characterised by the intended use of the viewing arrangement
    • B60R2300/806: Details of viewing arrangements for aiding parking
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00: Input parameters relating to infrastructure
    • B60W2552/50: Barriers
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00: Input parameters relating to infrastructure
    • B60W2552/53: Road markings, e.g. lane marker or crosswalk
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00: Input parameters relating to objects
    • B60W2554/80: Spatial relation or speed relative to objects

Abstract

The embodiments of the disclosure provide a parking control method and apparatus, a vehicle, and a storage medium. The parking control method includes: determining whether a suspended obstacle exists around the vehicle according to a vehicle periphery image captured by a surround-view camera; in the case that a suspended obstacle exists around the vehicle, determining the projection position of the suspended obstacle on the ground according to the vehicle periphery image; and controlling parking of the vehicle according to the projection position of the suspended obstacle on the ground. Because the projection position of the suspended obstacle on the ground is determined with the surround-view camera and parking is controlled based on that projection position, the method avoids the problem that the parking ultrasonic radar cannot detect a suspended obstacle, so that the vehicle may scrape or collide with the suspended obstacle when it parks automatically according to the obstacles determined by the parking ultrasonic radar.

Description

Parking control method, device, vehicle and storage medium
Technical Field
The disclosure relates to the technical field of automatic driving, and in particular to a parking control method and apparatus, a vehicle, and a storage medium.
Background
Currently, to ease parking in small parking spaces, some vehicles are equipped with an automatic parking (Auto Parking Assist, APK) system that controls parking of the vehicle automatically. Sensors in widely used automatic parking systems include parking ultrasonic radars and surround-view cameras. The controller in the automatic parking system determines the distance between the corresponding position of the vehicle body (that is, the position where a parking ultrasonic radar is installed) and a nearby detected obstacle according to the signal generated by the ultrasonic radar, determines the parking space marking lines on the ground according to the wide-angle images captured by the surround-view cameras, and then plans an automatic parking path according to the obstacle distances, the parking space marking lines, and the position of the vehicle.
However, ultrasonic radars in current vehicles are mostly mounted about 55 cm above the ground on the vehicle body, and their detection angle in the vertical direction is less than 20 degrees (known in the industry as the β angle). Because of this mounting height, the ultrasonic radar cannot detect suspended obstacles at higher positions around the parking space, so the controller cannot steer the vehicle around such obstacles, and the vehicle may collide with or scrape against them during automatic parking.
Disclosure of Invention
In order to solve the above technical problems, embodiments of the present disclosure provide a parking control method and apparatus, a vehicle, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a parking control method, including:
determining whether a suspended obstacle exists around the vehicle according to a vehicle periphery image captured by a surround-view camera;
in the case that a suspended obstacle exists around the vehicle, determining the projection position of the suspended obstacle on the ground according to the vehicle periphery image;
and controlling parking of the vehicle according to the projection position of the suspended obstacle on the ground.
Optionally, the determining whether a suspended obstacle exists around the vehicle according to the vehicle periphery image captured by the surround-view camera includes:
processing the vehicle periphery image with a pre-trained suspended obstacle recognition model to determine whether a suspended obstacle exists around the vehicle, where the suspended obstacle recognition model is trained on suspended obstacle samples.
Optionally, the determining whether a suspended obstacle exists around the vehicle according to the vehicle periphery image captured by the surround-view camera includes:
performing distortion correction on the vehicle periphery image to obtain a corrected vehicle periphery image;
performing edge detection on the corrected vehicle periphery image to obtain a detection frame;
calculating a similarity between the pixel image within the detection frame and an image of a pre-stored sample suspended obstacle;
and in the case that the similarity is greater than a preset similarity, determining that a suspended obstacle exists around the vehicle.
Optionally, before determining whether a suspended obstacle exists around the vehicle according to the vehicle periphery image captured by the surround-view camera, the method further includes:
judging whether a base obstacle exists around the vehicle;
and the determining whether a suspended obstacle exists around the vehicle according to the vehicle periphery image captured by the surround-view camera includes:
in the case that a base obstacle exists around the vehicle, determining whether a suspended obstacle exists around the vehicle according to the vehicle periphery image captured by the surround-view camera.
Optionally, the determining the projection position of the suspended obstacle on the ground according to the vehicle periphery image includes:
acquiring vehicle periphery images formed by capturing the suspended obstacle with the surround-view camera at a plurality of positions and/or from two angles;
determining the position of the suspended obstacle relative to the vehicle according to the vehicle periphery images corresponding to the positions and the position offset between the positions and/or the angle offset between the two angles;
and determining the projection position of the suspended obstacle on the ground according to the position of the suspended obstacle relative to the vehicle.
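As a sketch of the multi-position idea above: when the vehicle moves a known distance between two shots, the bearing of the suspended obstacle in each corrected image can be triangulated into a position relative to the vehicle. The function below is an illustrative geometric sketch only, not the patent's actual algorithm; extracting the bearings from the corrected images is assumed to have happened already.

```python
import math

def triangulate_obstacle(baseline_m, bearing1_rad, bearing2_rad):
    """Estimate the obstacle position relative to the second camera pose.

    The vehicle moves forward by baseline_m between the two shots;
    bearing1_rad and bearing2_rad are the angles to the obstacle measured
    from the direction of travel at each position.
    """
    # Triangle formed by the two camera positions and the obstacle:
    alpha = bearing1_rad           # interior angle at the first position
    beta = math.pi - bearing2_rad  # interior angle at the second position
    gamma = math.pi - alpha - beta # angle at the obstacle
    # Law of sines gives the range from the second position.
    r2 = baseline_m * math.sin(alpha) / math.sin(gamma)
    # Convert to x (forward) / y (lateral) in the second pose's frame.
    return r2 * math.cos(bearing2_rad), r2 * math.sin(bearing2_rad)
```

For example, an obstacle seen at bearing atan2(1, 2) from the first position and atan2(1, 1) after driving 1 m forward lies 1 m ahead and 1 m to the side of the second position.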
Optionally, the method further includes:
determining a measured distance between the vehicle and a base obstacle from which the suspended obstacle protrudes, according to a distance signal generated by the parking ultrasonic radar;
and the determining the projection position of the suspended obstacle on the ground according to the vehicle periphery image includes:
determining an estimated overhang distance of the suspended obstacle relative to the base obstacle;
and determining the projection position of the suspended obstacle on the ground according to the measured distance and the estimated overhang distance.
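A minimal numeric sketch of this combination (the function name and the clamping to zero are my own assumptions): the radar gives the range to the base obstacle, the image gives how far the suspended part protrudes toward the vehicle, and subtracting the two yields the distance to the ground projection of the overhang's near edge.

```python
def projected_edge_distance(measured_base_m, estimated_overhang_m):
    """Range to the ground projection of the suspended obstacle's near
    edge: radar range to the base obstacle minus the estimated overhang.
    Clamped at zero in case the overhang estimate exceeds the range."""
    return max(0.0, measured_base_m - estimated_overhang_m)
```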
Optionally, the method further includes:
determining a measured distance between the vehicle and a base obstacle from which the suspended obstacle protrudes, according to a distance signal generated by the parking ultrasonic radar;
and the determining the projection position of the suspended obstacle on the ground according to the vehicle periphery image includes:
determining the type of the suspended obstacle according to the vehicle periphery image;
determining a matched preset overhang distance according to the type of the suspended obstacle;
and determining the projection position of the suspended obstacle on the ground according to the measured distance and the preset overhang distance.
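The type-based variant can be sketched as a simple lookup table. The obstacle types and overhang values below are illustrative assumptions, not values from the patent.

```python
# Illustrative preset overhang distances per obstacle type, in metres.
PRESET_OVERHANG_M = {
    "fire_pipe": 0.15,
    "electricity_meter_box": 0.25,
    "billboard": 0.50,
}

def ground_projection_distance(measured_base_m, obstacle_type,
                               default_overhang_m=0.5):
    """Radar range to the base obstacle minus the preset overhang matched
    to the recognized obstacle type; unknown types fall back to a
    conservative default."""
    overhang = PRESET_OVERHANG_M.get(obstacle_type, default_overhang_m)
    return max(0.0, measured_base_m - overhang)
```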
Optionally, before planning a parking path according to the projection position of the suspended obstacle on the ground, the method further includes:
determining parking space marking lines according to the vehicle periphery image captured by the surround-view camera;
and the planning a parking path according to the projection position of the suspended obstacle on the ground includes:
planning a parking path according to the projection position of the suspended obstacle on the ground and the parking space marking lines.
In a second aspect, an embodiment of the present disclosure further provides a parking control apparatus, including:
a suspended obstacle recognition unit, configured to determine whether a suspended obstacle exists around the vehicle according to the vehicle periphery image captured by the surround-view camera;
a projection position determining unit, configured to determine the projection position of the suspended obstacle on the ground according to the vehicle periphery image in the case that a suspended obstacle is determined to exist around the vehicle;
and a parking path planning unit, configured to control parking of the vehicle according to the projection position of the suspended obstacle on the ground.
In a third aspect, embodiments of the present disclosure provide a vehicle including a surround-view camera and a vehicle controller;
the surround-view camera is configured to capture a vehicle periphery image;
and the vehicle controller acquires the vehicle periphery image captured by the surround-view camera and executes the parking control method according to any one of claims 1 to 8.
In a fourth aspect, the disclosed embodiments provide a computer-readable storage medium storing executable instructions that, when executed by a vehicle controller, cause the vehicle controller to implement the parking control method described above.
Compared with the prior art, the technical scheme provided by the embodiment of the disclosure has the following advantages:
With the scheme provided by the embodiments of the disclosure, in the case that a suspended obstacle is determined to exist around the vehicle from the vehicle periphery image captured by the surround-view camera, the projection position of the suspended obstacle on the ground can be determined according to the vehicle periphery image, and parking of the vehicle is controlled according to that projection position. Specifically, the vehicle controller may determine the area the vehicle may sweep from the projection of the suspended obstacle on the ground, plan a parking path according to that area and the steering mechanism of the vehicle, and control parking of the vehicle according to the parking path. Because the projection position of the suspended obstacle on the ground is determined with the surround-view camera and parking is controlled based on that position, the method avoids the problem that the parking ultrasonic radar cannot detect a suspended obstacle, so the vehicle might otherwise scrape or collide with it when parking automatically according to obstacles determined by the parking ultrasonic radar.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments of the present disclosure or the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious to those skilled in the art that other figures can be obtained from these figures without inventive effort. In the drawings:
FIG. 1 is a flow chart of a parking control method provided by an embodiment of the present disclosure;
FIG. 2 is a flow chart of a parking control method provided by other embodiments of the present disclosure;
FIG. 3 is a flow chart of a method for determining the projection position of a suspended obstacle on the ground from a vehicle periphery image, provided by some embodiments of the present disclosure;
FIG. 4 is a schematic diagram of a method for determining an estimated overhang distance using a geometric processing method, provided by some embodiments of the present disclosure;
FIG. 5 is a schematic structural diagram of a parking control apparatus provided by an embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of a vehicle controller provided by some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions of other terms are given in the description below. It should be noted that the terms "first", "second", and the like in this disclosure are merely used to distinguish between different devices, modules, or units and do not define an order or interdependence of the functions performed by them.
It should be noted that the modifiers "a", "an", and "a plurality" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
Before describing the parking control method provided by the embodiments of the disclosure, a vehicle equipped with an automatic parking system is first described, so that the parking control method below is easier to understand. In an embodiment of the present disclosure, in a vehicle equipped with an automatic parking system, the hardware for implementing automatic parking includes a parking ultrasonic radar and a surround-view camera.
The parking ultrasonic radar is a radar that detects obstacles with ultrasonic waves, and may include ultrasonic parking assist (Ultrasonic Parking Assistance, UPA) radar and automatic parking assist (Automatic Parking Assistance, APA) radar. The UPA radar measures front and rear obstacles and is usually mounted on the front and rear bumpers of the vehicle. In conventional applications, UPA radars mostly adopt a front-4/rear-4 or a front-6/rear-6 arrangement. The APA radar measures obstacles on the left and right sides and is mostly installed at the side of a bumper or on the welcome pedal at the lower portion of the center pillar of the vehicle. Both the UPA radar and the APA radar have a certain detection width and range, where the detection width is the lateral coverage of the ultrasonic radar and the range is the distance the ultrasonic radar can detect within that coverage.
The surround-view camera is a camera that photographs the environment around the vehicle to obtain images of the vehicle's surroundings. In practice, the surround-view camera system generally comprises a front camera, a rear camera, and two side-view cameras. The front camera is installed in the middle of the front grille, the rear camera is installed in the middle of the tailgate or trunk lid, and the two side-view cameras are installed on the undersides of the two rearview mirrors. In practice, each surround-view camera is a fisheye camera so as to achieve the largest possible field of view.
From the installation positions of the surround-view cameras and the parking ultrasonic radars, it can be seen that the installation height of the surround-view cameras on the vehicle is greater than that of the parking ultrasonic radars.
In addition, a vehicle equipped with the automatic parking system further comprises a vehicle controller, which is communicatively connected to the parking ultrasonic radar and the surround-view camera and can receive the detection distance signals sent by the ultrasonic radar and the image signals captured by the surround-view camera. From the detection distance signal, the vehicle controller can determine the distance from the parking ultrasonic radar at the corresponding position to an obstacle; from the image signals captured by the surround-view camera, the vehicle controller can determine the parking space markings on the ground and determine the pose of the vehicle relative to the parking space according to the direction of the parking space markings.
After determining the pose of the vehicle relative to the parking space and a reasonable safe distance from surrounding obstacles, the vehicle controller can formulate an automatic parking control strategy to control the vehicle's power and brake devices to park the vehicle automatically.
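An illustrative fragment of the pose estimation just described (the frame conventions are my assumption): the heading of a detected parking space marking line, expressed in the vehicle frame, gives the yaw of the vehicle relative to the slot.

```python
import math

def yaw_relative_to_slot(line_start, line_end):
    """Yaw of the vehicle relative to a parking space marking line, given
    two points of the line in the vehicle frame (x forward, y left).
    Zero means the vehicle is aligned with the line."""
    dx = line_end[0] - line_start[0]
    dy = line_end[1] - line_start[1]
    return math.atan2(dy, dx)
```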
Fig. 1 is a flowchart of a parking control method provided in an embodiment of the present disclosure. The parking control method provided by the embodiment of the disclosure is executed by the vehicle controller. As shown in fig. 1, the parking control method includes steps S110 to S130.
Step S110: determining whether a suspended obstacle exists around the vehicle according to the vehicle periphery image captured by the surround-view camera.
When the vehicle enters the automatic parking state, the vehicle controller starts the surround-view cameras so that they capture vehicle periphery images, which are then sent to the vehicle controller. In the embodiment of the disclosure, the vehicle controller can simultaneously receive the vehicle periphery images captured by the surround-view cameras in all directions, or it can determine which cameras' images to receive according to the gear state of the vehicle. For example, when the vehicle is in D (i.e., forward) gear, the vehicle controller may receive only the images captured by the front camera and the side-view cameras; when the vehicle is in R (i.e., reverse) gear, the vehicle controller may receive only the images captured by the rear camera and the side-view cameras.
After receiving the vehicle periphery image, the vehicle controller can judge whether a suspended obstacle exists around the vehicle according to the image. A suspended obstacle is an obstacle in a suspended state, that is, an obstacle with a gap between its bottom surface and the ground. The suspended obstacle may or may not be detectable by the vehicle-mounted ultrasonic radar; the embodiment of the disclosure does not limit this.
In the embodiment of the disclosure, after receiving the vehicle periphery image, the vehicle controller may process it with a predetermined processing method to determine whether a suspended obstacle exists around the vehicle.
In some embodiments, step S110 performed by the vehicle controller may include step S111.
Step S111: processing the vehicle periphery image with a pre-trained suspended obstacle recognition model to determine whether a suspended obstacle exists around the vehicle.
Specifically, in some embodiments of the present disclosure, a pre-trained suspended obstacle recognition model is configured in the vehicle controller. The suspended obstacle recognition model is a model dedicated to detecting suspended obstacles. Specifically, the model may be trained using vehicle periphery images containing marked sample suspended obstacle locations and corresponding suspended obstacle labels. For example, if a sample obstacle is a fire pipe on a garage pillar, the corresponding vehicle periphery image is an image including the marked fire pipe, and the corresponding label is "fire pipe". For another example, if a sample obstacle is an electricity meter box on a garage wall, the corresponding vehicle periphery image is an image including the marked electricity meter box, and the corresponding label is "electricity meter box".
In still other embodiments, step S110 performed by the vehicle controller may include steps S112 to S115.
Step S112: performing distortion correction on the vehicle periphery image to obtain a corrected vehicle periphery image.
In practice, most surround-view cameras fitted to vehicles are wide-angle or fisheye cameras, so the periphery images they capture are severely distorted. In order to determine whether there is a suspended obstacle around the vehicle in the subsequent steps S113 to S114, it is preferable to correct the distortion of the vehicle periphery image first.
In a specific embodiment, a correction matrix determined from the characteristics of the surround-view camera is configured in the vehicle controller. After the vehicle periphery image is obtained, it is corrected with this matrix to obtain the corrected vehicle periphery image.
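The correction step can be illustrated with a single-coefficient radial distortion model. This is a simplification of real fisheye models, and both the coefficient k1 and the fixed-point inversion are my assumptions, not the patent's correction matrix: a distorted normalized image point is mapped back to its undistorted position by iteratively dividing out the distortion factor.

```python
def undistort_point(xd, yd, k1, iterations=8):
    """Invert the radial distortion model x_d = x_u * (1 + k1 * r_u^2)
    for one normalized image point by fixed-point iteration."""
    xu, yu = xd, yd
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        factor = 1.0 + k1 * r2
        xu, yu = xd / factor, yd / factor
    return xu, yu
```

Distorting a known point with the forward model and undistorting it recovers the original coordinates to within the iteration tolerance.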
Step S113: performing edge detection on the corrected vehicle periphery image to obtain a detection frame.
In the embodiment of the disclosure, edge detection is performed on the corrected vehicle periphery image to obtain a detection frame. A detection frame is a border in the vehicle periphery image where the contrast changes sharply; in the vehicle periphery image, most such borders are the edges of suspended obstacles or of the patterns on them. Of course, in some embodiments of the present disclosure, if a base obstacle carries a pattern in the vehicle periphery image, the edges of that pattern and the lines in its interior region may also be identified as detection borders.
In the embodiment of the disclosure, the vehicle controller may determine the detection frame using existing methods such as Gaussian filtering followed by edge extraction.
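A toy version of this edge-then-frame step, using plain gradient thresholding instead of Gaussian filtering plus a full edge detector (the threshold and the single bounding box over all edge pixels are illustrative assumptions; a real system would group edges into one frame per candidate region):

```python
import numpy as np

def detect_boxes(gray, grad_thresh=50):
    """Toy edge-based detection frame: threshold horizontal and vertical
    intensity gradients, then return the bounding box (xmin, ymin, xmax,
    ymax) of all edge pixels, or None if no edges are found."""
    gx = np.abs(np.diff(gray.astype(int), axis=1))  # horizontal gradient
    gy = np.abs(np.diff(gray.astype(int), axis=0))  # vertical gradient
    edges = np.zeros(gray.shape, dtype=bool)
    edges[:, :-1] |= gx > grad_thresh
    edges[:-1, :] |= gy > grad_thresh
    ys, xs = np.nonzero(edges)
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```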
Step S114: calculating a similarity between the pixel image within the detection frame and the image of a pre-stored sample suspended obstacle.
After the detection frame is determined, the vehicle controller can use a similarity detection algorithm to calculate the similarity between the pixel image within the detection frame and the image of the sample suspended obstacle. The sample suspended obstacle may be, for example, a hydrant, a fire pipe, or a billboard.
Step S115: in the case that the similarity is greater than the preset similarity, determining that a suspended obstacle exists around the vehicle.
After the similarity is calculated, if it is greater than the preset similarity, it is determined that an obstacle of the same kind as the sample suspended obstacle most likely appears within the detection frame, and therefore that a suspended obstacle exists around the vehicle.
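One common similarity measure that fits this thresholding step is zero-mean normalized cross-correlation; the patent does not name a specific algorithm, so both the metric and the threshold value below are assumptions for illustration.

```python
import numpy as np

def patch_similarity(patch, template):
    """Zero-mean normalized cross-correlation in [-1, 1] between a
    same-shaped patch and a sample-obstacle template."""
    a = patch.astype(float).ravel()
    b = template.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:  # constant patches carry no structure to match
        return 0.0
    return float(np.dot(a, b) / denom)

SIMILARITY_THRESHOLD = 0.8  # illustrative preset similarity

def is_suspended_obstacle(patch, template):
    return patch_similarity(patch, template) > SIMILARITY_THRESHOLD
```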
Step S120: in the case that a suspended obstacle exists around the vehicle, determining the projection position of the suspended obstacle on the ground according to the vehicle periphery image.
The projection position of the suspended obstacle on the ground is the position obtained by projecting the suspended obstacle vertically onto the ground. How to determine this projection position from the vehicle periphery image is described in detail later.
Step S130: and controlling the parking of the vehicle according to the projection position of the suspended obstacle on the ground.
After the projection position of the suspended obstacle on the ground is obtained, the vehicle controller can plan a parking path according to the projection position of the suspended obstacle on the ground, and operate the steering mechanism and the power mechanism of the vehicle according to the planned parking path to park the vehicle. Specifically, the vehicle controller may determine the area that the vehicle may occupy according to the projection of the suspended obstacle on the ground, plan a parking path according to that area and the steering mechanism of the vehicle, and control the parking of the vehicle according to the parking path. In some embodiments, after determining the projection of the suspended obstacle on the ground, the vehicle controller may attempt to park the vehicle in a conventional manner and avoid the projected area of the suspended obstacle on the ground during parking until the vehicle is parked in place.
By adopting the parking control method provided by the embodiment of the disclosure, the vehicle controller can, in the case that a suspended obstacle is determined to exist around the vehicle according to the vehicle periphery image shot by the looking-around camera, determine the projection position of the suspended obstacle on the ground according to the vehicle periphery image, and control the parking of the vehicle according to that projection position. Because the projection position of the suspended obstacle on the ground is determined using the looking-around camera and the parking of the vehicle is controlled based on that projection position, the parking control method provided by the embodiment of the disclosure avoids the problem that the parking ultrasonic radar cannot detect a suspended obstacle, so that the vehicle may scrape against the suspended obstacle when parking automatically based only on obstacles determined by the parking ultrasonic radar.
Fig. 2 is a flow chart of a parking control method provided in further embodiments of the present disclosure. As shown in fig. 2, a parking control method provided in other embodiments of the present disclosure includes steps S210 to S240.
Step S210: judging whether a basic obstacle exists around the vehicle; if yes, go to step S220.
In an embodiment of the present disclosure, the base obstacle is the obstacle from which the suspended obstacle hangs. The type of base obstacle may vary depending on the particular scene. For example, in the case where the parking space is a garage parking space, the base obstacle may be a wall, a pillar, or the like.
It should be noted that the underlying obstacle is an obstacle that is in direct contact with the ground. A parking ultrasonic radar in a vehicle may detect a basic obstacle. That is, the parking ultrasonic radar may determine the basic obstacle from the emitted ultrasonic waves and the ultrasonic waves reflected by the basic obstacle.
In the embodiment of the disclosure, during automatic parking of the vehicle, the vehicle controller may determine whether a basic obstacle exists around the vehicle according to the received signal.
In some embodiments, the vehicle controller may determine whether there is a basic obstacle around the vehicle based on the distance signal transmitted from the parking ultrasonic radar. If the parking ultrasonic radar sends a specific distance signal (for example, indicating an obstacle distance of 1 m) to the vehicle controller during parking of the vehicle, the vehicle controller determines that there is a basic obstacle around the vehicle.
In other embodiments, the vehicle controller may determine whether there is a basic obstacle around the vehicle based on the images captured by the looking-around camera. For example, during parking of the vehicle, the vehicle controller processes the vehicle periphery image captured by the looking-around camera, identifies whether an obvious boundary line exists at a specific pixel position in the vehicle periphery image, and determines that a basic obstacle exists around the vehicle if such a boundary line exists. For another example, in a garage or similar scene where the ground color and the wall surface (pillar) color differ markedly, the vehicle controller may determine whether there is a basic obstacle around the vehicle based on the colors of the ground and the wall surface. For another example, the vehicle controller may process the vehicle periphery image using a pre-trained deep learning model to determine whether there is a basic obstacle around the vehicle. The deep learning model is obtained by training on sample images including basic obstacle pixel information.
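The colour-based check above can be sketched as follows: scan one image column from the bottom (near the vehicle) upward and look for a sharp colour change between the ground colour and a wall or pillar colour. The RGB values and the distance threshold are illustrative assumptions, not values from this disclosure.

```python
# Minimal sketch: find the ground/wall colour boundary in one image column.
def color_dist(c1, c2):
    """Euclidean distance between two RGB triples."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def find_boundary_row(column, threshold=80.0):
    """Return the first row (scanning bottom-up) where the colour jumps, else None."""
    for row in range(len(column) - 1, 0, -1):          # bottom -> top
        if color_dist(column[row], column[row - 1]) > threshold:
            return row - 1                              # first wall-coloured row
    return None

ground = (90, 90, 90)       # grey floor (assumed)
wall = (200, 190, 170)      # beige wall (assumed)
column = [wall] * 3 + [ground] * 5                     # rows 0..2 wall, 3..7 ground
print(find_boundary_row(column))  # 2
```

A column with a detected boundary suggests a basic obstacle; a uniformly ground-coloured column suggests none.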
Step S220: and determining whether a suspended obstacle exists around the vehicle according to the vehicle periphery image shot by the looking-around camera.
Step S230: and under the condition that the suspended obstacle exists around the vehicle, determining the projection position of the suspended obstacle on the ground according to the vehicle periphery image.
Step S240: and controlling the parking of the vehicle according to the projection position of the suspended obstacle on the ground.
Steps S220 to S240 are identical to steps S110 to S130 in the foregoing embodiment and are not repeated here; reference is made to the foregoing description.
In the embodiment of the disclosure, the vehicle controller first detects whether a basic obstacle exists around the vehicle, and performs the step of determining whether a suspended obstacle exists around the vehicle only if a basic obstacle exists. According to practical experience, a suspended obstacle is supported by, and hangs from, a basic obstacle such as a wall; that is to say, wherever there is a suspended obstacle, there is certainly a basic obstacle. Because the step of determining, according to the vehicle periphery image shot by the looking-around camera, whether a suspended obstacle exists around the vehicle is executed only after it is judged that a basic obstacle exists around the vehicle, detecting for suspended obstacles in scenes where neither a basic obstacle nor a suspended obstacle exists is avoided, and the computational cost of the vehicle controller is reduced.
As described above, the vehicle controller determines the projection position of the overhead obstacle on the ground according to the vehicle-surrounding image during the execution of step S120 or step S230. Fig. 3 is a flow chart providing a method for determining a projected position of a suspended obstacle on the ground from a vehicle-surrounding image according to some embodiments of the present disclosure. As shown in fig. 3, in an embodiment of the present disclosure, determining, by a vehicle controller, a projection position of a suspended obstacle on the ground according to a vehicle-surrounding image may include steps S310 to S330.
Step S310: and acquiring the vehicle periphery images formed by shooting the suspended obstacle at a plurality of positions by using the looking-around camera.
As is known from optics, spatial positioning of the suspended obstacle cannot be achieved with a single vehicle periphery image captured by one looking-around camera at one position. In order to position the suspended obstacle in the manner of binocular vision, in the embodiment of the disclosure, vehicle periphery images formed by shooting the suspended obstacle at a plurality of positions can be obtained by moving or rotating the looking-around camera. Specifically, the vehicle can be driven to move a certain distance, thereby changing the position of the looking-around camera, and vehicle periphery images including the suspended obstacle are shot before and after the position change respectively.
Step S320: and determining the position of the suspended obstacle relative to the vehicle according to the vehicle periphery images corresponding to the positions and the position deviation of the positions.
After the vehicle-surrounding images corresponding to the plurality of positions are obtained, the vehicle controller can correct the distortion of the vehicle-surrounding images, and the corrected vehicle-surrounding images are obtained. After the corrected vehicle periphery image is obtained, the vehicle controller determines the position of the suspended obstacle relative to the vehicle by adopting a binocular vision algorithm according to the vehicle periphery images corresponding to the positions and the position deviation of the positions. The position of the overhead obstacle relative to the vehicle includes a position relative to the vehicle in a horizontal direction and a position relative to the vehicle in a height direction.
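The binocular-style positioning above can be sketched with the classic rectified-stereo relation: if the camera is translated sideways by a known baseline and the same point on the suspended obstacle is located in both distortion-corrected images, depth follows from the pixel disparity. The pinhole model and all numbers here are illustrative assumptions, not the disclosure's exact algorithm.

```python
# Minimal sketch of motion-stereo ranging between two camera positions.
def depth_from_disparity(baseline_m, focal_px, x_first_px, x_second_px):
    """Classic stereo relation Z = f * B / disparity (pinhole, rectified views)."""
    disparity = x_first_px - x_second_px
    if disparity <= 0:
        raise ValueError("point must shift between the two views")
    return focal_px * baseline_m / disparity

# Camera moved 0.2 m; the obstacle edge moved from pixel 640 to pixel 600.
z = depth_from_disparity(0.2, 800.0, 640.0, 600.0)
print(z)  # 4.0: the point is 4 m from the camera
```

With the depth and the pixel's bearing, the position of the suspended obstacle relative to the vehicle in both the horizontal and height directions follows by straightforward trigonometry.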
Step S330: and determining the projection position of the suspended obstacle on the ground according to the position of the suspended obstacle relative to the vehicle.
In an embodiment of the disclosure, the vehicle controller determines the projection position of the suspended obstacle on the ground according to the position of the suspended obstacle relative to the vehicle; that is, it takes the horizontal-direction position of the suspended obstacle relative to the vehicle as the projection position of the suspended obstacle on the ground.
Optionally, in some embodiments of the present disclosure, the vehicle controller may further perform step S140 before performing the aforementioned step S120 or step S230.
Step S140: and determining the measured distance between the basic obstacle suspending the suspended obstacle and the vehicle according to the distance signal generated by the parking ultrasonic radar.
In a specific embodiment, the vehicle controller can directly analyze the distance signal sent by the parking ultrasonic radar to determine the measured distance between the parking ultrasonic radar and the basic obstacle.
Optionally, in the case that the measured distance between the base obstacle and the vehicle can be determined, the determining, by the vehicle controller, the projection position of the suspended obstacle on the ground according to the vehicle periphery image in the embodiment of the disclosure may include steps S410 to S420.
Step S410: an estimated projected distance of the overhead obstacle relative to the underlying obstacle is determined.
The projected distance is the distance of the outer facade of the suspended obstacle relative to the outer facade of the base obstacle. The estimated projected distance is a distance obtained by estimating this projected distance.
In some embodiments of the present disclosure, the vehicle controller may process the vehicle periphery image using an image processing model to determine an estimated projected distance of the overhead obstacle relative to the underlying obstacle.
In still other embodiments of the disclosure, the vehicle controller may determine the estimated projected distance using a geometric method. Fig. 4 is a schematic diagram of a method for determining the estimated projected distance using a geometric method provided by some embodiments of the present disclosure. As shown in fig. 4, assuming that the vehicle is parked in a parking lot with horizontal ground, when the vehicle is at two different distances from the base obstacle 11, the corresponding distances may be determined using the parking ultrasonic radar; in this case the distance from the looking-around camera 13 to the base obstacle 11 can also be determined, and the two distances from the looking-around camera 13 to the base obstacle 11 are denoted by a and b. The looking-around camera 13 is fixed on the vehicle, so its height above the ground is fixed and is denoted by c. Assume that the suspended obstacle 12 has a ground clearance x and an estimated projected distance y. As shown in fig. 4, the sight lines from the camera to the lower edge of the suspended obstacle 12 satisfy tan α = (c − x)/(a − y) and tan β = (c − x)/(b − y). The angles α and β can be determined according to the pixel where the lower edge of the suspended obstacle 12 (i.e., the edge of the suspended obstacle 12 close to the ground, indicated by the dotted-line triangle in the figure) is located, and the estimated projected distance y can then be obtained by combining the two formulas.
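With the notation of fig. 4 (camera height c, camera-to-wall distances a and b, ground clearance x, projected distance y), the two sight lines give tan α = (c − x)/(a − y) and tan β = (c − x)/(b − y), a pair of equations in the two unknowns x and y. A sketch of the solution, with illustrative numbers:

```python
# Minimal sketch: solve the fig. 4 geometry for clearance x and projection y.
import math

def solve_overhang(a, b, c, alpha, beta):
    """Solve tan(alpha)=(c-x)/(a-y), tan(beta)=(c-x)/(b-y) for (x, y)."""
    ta, tb = math.tan(alpha), math.tan(beta)
    y = (a * ta - b * tb) / (ta - tb)   # projected distance of the obstacle
    x = c - (a - y) * ta                # ground clearance of its lower edge
    return x, y

# Synthetic check: true x = 0.3 m, y = 0.2 m, c = 0.6 m, a = 2.0 m, b = 1.0 m.
alpha = math.atan2(0.6 - 0.3, 2.0 - 0.2)
beta = math.atan2(0.6 - 0.3, 1.0 - 0.2)
x, y = solve_overhang(2.0, 1.0, 0.6, alpha, beta)
print(round(x, 6), round(y, 6))  # 0.3 0.2
```

The recovered (x, y) reproduce the values used to synthesize the angles, confirming the algebra.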
Step S420: and determining the projection position of the suspended obstacle on the ground according to the measured distance and the estimated projected distance.
Determining the projection position of the suspended obstacle on the ground according to the measured distance and the estimated projected distance means that the relative distance between the suspended obstacle and the vehicle can be obtained by subtracting the estimated projected distance from the measured distance.
Optionally, in the case that the measured distance between the base obstacle and the vehicle can be determined, the determining, by the vehicle controller, the projection position of the suspended obstacle on the ground according to the vehicle periphery image in the embodiment of the disclosure may include steps S510 to S530.
Step S510: and determining the type of the suspended obstacle according to the vehicle periphery image.
Step S520: and determining a matched preset exploration distance according to the type of the suspended obstacle.
In an actual scene, the suspended obstacles are mainly obstacles such as fire-fighting pipelines, fire hydrants, gas pipelines, electric meter boxes, and advertising light boxes. The various types of suspended obstacles described above have specific dimensional standards or corresponding dimensional specifications, as well as corresponding installation specifications. The projected distance of the suspended obstacle relative to the base obstacle can generally be determined using the aforementioned dimensional specifications and installation specifications. Based on this, a matched preset projected distance may be determined according to the type of the suspended obstacle.
In some embodiments of the present disclosure, a suspended obstacle type recognition model is configured in the vehicle controller, the model being obtained by training on a large number of sample suspended obstacles. After a vehicle periphery image including the suspended obstacle is acquired, the vehicle periphery image is processed by the suspended obstacle type recognition model, so that the type of the suspended obstacle can be determined.
After determining the type of the suspended obstacle, the vehicle controller may query a preset lookup table to determine the corresponding preset projected distance.
Step S530: and determining the projection position of the suspended obstacle on the ground according to the measured distance and the preset projected distance.
Determining the projection position of the suspended obstacle on the ground according to the measured distance and the preset projected distance means that the relative distance between the suspended obstacle and the vehicle can be obtained by subtracting the preset projected distance from the measured distance.
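Steps S510 to S530 can be sketched as a table lookup followed by a subtraction. The table entries, the fallback value for unrecognized types, and the measured distance are all illustrative assumptions; a real table would be derived from the dimensional and installation specifications mentioned above.

```python
# Minimal sketch: type -> preset projected distance, then relative distance.
PRESET_PROJECTED_DISTANCE_M = {   # illustrative values, not real specifications
    "fire_hydrant": 0.30,
    "fire_pipe": 0.15,
    "gas_pipe": 0.10,
    "meter_box": 0.25,
    "ad_light_box": 0.20,
}

def preset_projected_distance(obstacle_type, default=0.35):
    """Return the table entry, or a conservative default for unknown types."""
    return PRESET_PROJECTED_DISTANCE_M.get(obstacle_type, default)

measured = 1.00                                   # ultrasonic distance to the wall
rel = measured - preset_projected_distance("fire_hydrant")
print(rel)  # 0.7: distance between the vehicle and the suspended obstacle
```

Using a conservative (large) default for unknown types errs on the side of keeping the vehicle farther from the obstacle.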
Optionally, in some embodiments of the present disclosure, step S150 may also be performed before the vehicle controller performs step S130 or step S240 described above.
Step S150: and determining a parking space marking line according to the vehicle periphery image shot by the looking-around camera.
Correspondingly, the controlling of the parking of the vehicle according to the projection position of the suspended obstacle on the ground in step S130 and step S240 may include: planning a parking path according to the projection position of the suspended obstacle on the ground and the parking space marking line.
That is, at parking locations having a parking space marking line, a parking path is planned according to the projection position of the suspended obstacle on the ground and the parking space marking line, so that the vehicle can park into the parking space. In other embodiments of the present disclosure, where the parking location has no parking space marking line, the parking path may be planned based only on the projection position of the suspended obstacle.
It should be noted that the aforementioned planning of the parking path may also take into account other factors, such as the distances to other, non-suspended obstacles as determined by the parking ultrasonic radar.
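One simple way to use the ground projection during planning is to treat it as a keep-out region and reject any candidate path whose vehicle footprint overlaps it. The axis-aligned rectangles and coordinates below are simplifying, illustrative assumptions; a real planner would handle rotated footprints and the steering geometry.

```python
# Minimal sketch: reject parking paths that clip the obstacle's ground projection.
def rects_overlap(r1, r2):
    """Rectangles as (xmin, ymin, xmax, ymax) in the ground plane; edges touching
    do not count as overlap."""
    return not (r1[2] <= r2[0] or r2[2] <= r1[0] or
                r1[3] <= r2[1] or r2[3] <= r1[1])

def path_is_safe(footprints, keep_out):
    """True if no vehicle footprint along the path touches the projection."""
    return all(not rects_overlap(f, keep_out) for f in footprints)

projection = (4.0, 1.0, 4.4, 2.5)      # suspended obstacle projected on the ground
path_a = [(0, 0, 2, 4.5), (1, 0, 3, 4.5), (2, 0, 4, 4.5)]       # stays clear
path_b = [(0, 0, 2, 4.5), (2, 0, 4, 4.5), (3.5, 0, 5.5, 4.5)]   # clips the projection
print(path_is_safe(path_a, projection), path_is_safe(path_b, projection))  # True False
```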
Optionally, in some embodiments of the present disclosure, the parking control method may further perform step S160 as follows before performing the aforementioned step S130 or S240.
Step S160: outputting information of the estimated suspended obstacle and receiving a feedback control instruction.
In a specific embodiment, the vehicle controller may output information of the estimated suspended obstacle after determining that a suspended obstacle exists around the vehicle. The output suspended obstacle information may include the type of the suspended obstacle and the projected distance (which may be the aforementioned estimated projected distance or a preset projected distance determined based on the type of the suspended obstacle). The vehicle controller can output the obstacle information through the vehicle-mounted central control display screen. By outputting the suspended obstacle information, the driver can learn the information of the suspended obstacle and decide whether to confirm it.
If the driver determines that the obstacle information is correct, the driver can directly confirm it, generating a feedback control instruction that confirms the suspended obstacle information. If the driver determines that the obstacle information is incorrect, a feedback control instruction for changing the obstacle information may be generated, so that the vehicle controller changes the information of the obstacle according to that feedback control instruction.
On the basis of performing the aforementioned step S160, the vehicle controller may further perform step S170: in the case where it is determined according to the feedback control instruction that the suspended obstacle information is wrong, determining the projection of the obstacle on the ground according to the correct obstacle information in the feedback control instruction.
After performing step S160, the vehicle controller may perform step S130 or step S240 described previously.
In addition to providing the foregoing parking control method, embodiments of the present disclosure also provide a parking control apparatus. Fig. 5 is a schematic structural view of a parking control apparatus provided in an embodiment of the present disclosure. As shown in fig. 5, a parking control apparatus 500 provided by an embodiment of the present disclosure includes a floating obstacle identifying unit 501, a projected position determining unit 502, and a parking path planning unit 503.
The suspended obstacle recognition unit 501 is used for determining whether a suspended obstacle exists around the vehicle according to the vehicle periphery image shot by the looking-around camera.

The projection position determining unit 502 is used for determining, in the case that it is determined that the suspended obstacle exists around the vehicle, the projection position of the suspended obstacle on the ground according to the vehicle periphery image.

The parking path planning unit 503 is used for controlling the parking of the vehicle according to the projection position of the suspended obstacle on the ground.
In some embodiments of the present disclosure, the suspended obstacle recognition unit 501 processes the vehicle periphery image with a pre-trained suspended obstacle recognition model to determine whether the suspended obstacle is present in the vehicle periphery, where the suspended obstacle recognition model is obtained by training a suspended obstacle sample.
In some embodiments of the present disclosure, the suspended obstacle recognition unit 501 includes an image correction subunit, a detection frame recognition subunit, a similarity calculation subunit, and a suspended obstacle recognition subunit.
The image correction subunit is used for carrying out distortion correction on the vehicle periphery image to obtain a corrected vehicle periphery image. The detection frame recognition subunit is used for performing edge detection on the corrected vehicle periphery image to obtain a detection frame. The similarity calculation subunit is used for calculating the similarity according to the pixel image in the detection frame and the image of the pre-stored sample suspended obstacle. The suspended obstacle recognition subunit is used for determining that a suspended obstacle exists around the vehicle under the condition that the similarity is larger than the preset similarity.
In some embodiments of the present disclosure, the parking control apparatus 500 further includes a basic obstacle recognition unit. The basic obstacle recognition unit is used for judging whether basic obstacles exist around the vehicle. In the case where the basic obstacle recognition unit recognizes that there is a basic obstacle, the suspended obstacle recognition unit 501 determines whether there is the suspended obstacle in the vehicle periphery according to the vehicle periphery image captured by the looking-around camera.
In some embodiments of the present disclosure, the projection position determination unit 502 includes a vehicle periphery image acquisition subunit, a relative position determination subunit, and a projection position determination subunit.
The vehicle periphery image acquisition subunit is used for acquiring vehicle periphery images formed by shooting the suspended obstacle at a plurality of positions and/or two angles by the looking-around camera;
the relative position determining subunit is used for determining the position of the suspended obstacle relative to the vehicle according to the vehicle periphery images corresponding to the positions and the position deviation of the positions and/or the angle deviation of the two angles;
the projection position determining subunit is used for determining the projection position of the suspended obstacle on the ground according to the position of the suspended obstacle relative to the vehicle.
In some embodiments of the present disclosure, the parking control apparatus 500 further includes a distance measurement unit. The distance measurement unit is used for determining, according to the distance signal generated by the parking ultrasonic radar, the measured distance between the vehicle and the base obstacle from which the suspended obstacle hangs. Correspondingly, the projection position determining unit includes an estimated projected distance calculation subunit and a first projection position determining subunit. The estimated projected distance calculation subunit is used for determining the estimated projected distance of the suspended obstacle relative to the base obstacle. The first projection position determining subunit is used for determining the projection position of the suspended obstacle on the ground according to the measured distance and the estimated projected distance.
In some embodiments of the present disclosure, in the case where the parking control apparatus 500 includes the distance measurement unit, the projection position determining unit may further include an obstacle type recognition subunit, a preset projected distance determining subunit, and a second projection position determining subunit. The obstacle type recognition subunit is used for determining the type of the suspended obstacle according to the vehicle periphery image. The preset projected distance determining subunit is used for determining a matched preset projected distance according to the type of the suspended obstacle. The second projection position determining subunit is used for determining the projection position of the suspended obstacle on the ground according to the measured distance and the preset projected distance.
In some embodiments of the present disclosure, the parking control apparatus 500 may further include a parking space marking line determination unit. And the parking space mark line determining unit is used for determining a parking space mark line according to the vehicle periphery image shot by the looking-around camera. Correspondingly, the parking path planning unit 503 plans a parking path according to the projection position of the suspended obstacle on the ground and the parking space marking line.
The embodiment of the disclosure also provides a vehicle. The vehicle includes a plurality of parking ultrasonic radars, a plurality of looking-around cameras, and a vehicle controller. The parking ultrasonic radar is arranged on the periphery of the vehicle body and is used for generating a distance signal based on an ultrasonic signal and sending the distance signal to the vehicle controller. The looking-around camera is arranged on the periphery of the vehicle body and is used for shooting a vehicle periphery image when the vehicle is in a parking state and sending the vehicle periphery image to the vehicle controller. The vehicle controller is used for determining whether a suspended obstacle exists around the vehicle according to the vehicle periphery image shot by the looking-around camera; determining, in the case that the suspended obstacle exists around the vehicle, the projection position of the suspended obstacle on the ground according to the vehicle periphery image; and planning a parking path according to the projection position of the suspended obstacle on the ground.
Of course, the vehicle provided in the embodiments of the present disclosure further includes various mechanisms such as a driving mechanism, a braking mechanism, and a steering mechanism that can implement driving control of the vehicle, and the embodiments of the present disclosure are not specifically described.
The embodiment of the disclosure also provides a vehicle controller, which includes a processor and a memory, wherein the memory stores a computer program, and the parking control method of any of the foregoing embodiments can be implemented when the computer program is executed by the processor. Specifically, after the vehicle controller loads the computer program stored in the memory, each functional unit in the parking control apparatus as described above may be instantiated, and the corresponding operations are then performed by the functional units to complete the parking control of the vehicle. In specific implementations, the vehicle controller may be a dedicated controller, or may be a general controller that integrates various control functions of the vehicle; the embodiments of the disclosure are not particularly limited in this respect.
Fig. 6 is a schematic structural diagram of a vehicle controller provided by some embodiments of the present disclosure. Referring now in particular to fig. 6, a schematic diagram of a vehicle controller 600 suitable for use in implementing embodiments of the present disclosure is shown. The vehicle controller shown in fig. 6 is merely one example and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 6, the vehicle controller 600 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601 that may perform various appropriate actions and processes according to programs stored in a read-only memory (ROM) 602 or programs loaded from a storage device 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data required for the operation of the vehicle controller 600. The processing device 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
In general, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, a touchpad, a camera, a microphone, an accelerometer, a gyroscope, and the like; an output device 607 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, and the like; a storage device 608 including a hard disk and the like; and a communication device 609. The communication device 609 may allow the vehicle controller 600 to communicate wirelessly or by wire with other devices to exchange data. While fig. 6 shows a vehicle controller 600 having various devices, it should be understood that not all of the illustrated devices are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via communication means 609, or from storage means 608, or from ROM 602. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 601.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The computer readable medium may be contained in the vehicle controller; or may exist alone without being incorporated into the vehicle controller.
The computer readable medium carries one or more programs that, when executed by the vehicle controller, cause the vehicle controller to: determine whether a suspended obstacle exists around the vehicle according to the vehicle periphery image shot by the looking-around camera; determine, in the case that the suspended obstacle exists around the vehicle, the projection position of the suspended obstacle on the ground according to the vehicle periphery image; and plan a parking path according to the projection position of the suspended obstacle on the ground. Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by software or by hardware. In some cases, the name of a unit does not constitute a limitation of the unit itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection according to one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The embodiments of the present disclosure further provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method of any of the foregoing method embodiments; the implementation and beneficial effects are similar and are not repeated here.
It should be noted that, in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a/an ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above are merely specific embodiments of the disclosure, presented to enable those skilled in the art to understand or practice it. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (11)

1. A parking control method, characterized by comprising:
determining whether a suspended obstacle exists around a vehicle based on a vehicle-surrounding image captured by a surround-view camera;
when it is determined that a suspended obstacle exists around the vehicle, determining a projected position of the suspended obstacle on the ground based on the vehicle-surrounding image; and
controlling parking of the vehicle according to the projected position of the suspended obstacle on the ground.
2. The method of claim 1, wherein determining whether a suspended obstacle exists around the vehicle based on the vehicle-surrounding image captured by the surround-view camera comprises:
processing the vehicle-surrounding image with a pre-trained suspended-obstacle recognition model to determine whether a suspended obstacle exists around the vehicle, wherein the suspended-obstacle recognition model is trained on suspended-obstacle samples.
3. The method of claim 1, wherein determining whether a suspended obstacle exists around the vehicle based on the vehicle-surrounding image captured by the surround-view camera comprises:
performing distortion correction on the vehicle-surrounding image to obtain a corrected vehicle-surrounding image;
performing detection on the corrected vehicle-surrounding image to obtain a detection frame;
calculating a similarity between the pixel image within the detection frame and a pre-stored image of a sample suspended obstacle; and
determining that a suspended obstacle exists around the vehicle when the similarity is greater than a preset similarity.
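One way to read claim 3's similarity step: after distortion correction and detection-frame extraction, compare the pixels inside the frame against a stored sample. The sketch below uses normalized cross-correlation as the similarity measure and 0.8 as the preset threshold; both are assumptions, since the patent specifies neither the metric nor the value, and the distortion-correction step (typically done with the camera's intrinsic parameters) is omitted.

```python
import numpy as np

def similarity(patch: np.ndarray, sample: np.ndarray) -> float:
    """Normalized cross-correlation between the detection-frame patch and a
    pre-stored sample image of a suspended obstacle (illustrative metric)."""
    a = patch.astype(float).ravel()
    b = sample.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

PRESET_SIMILARITY = 0.8  # the claim's "preset similarity"; value assumed

sample = np.array([[0, 255], [255, 0]], dtype=np.uint8)   # stored sample
patch = np.array([[10, 240], [240, 10]], dtype=np.uint8)  # detection frame
suspended_obstacle_present = similarity(patch, sample) > PRESET_SIMILARITY
```

Mean subtraction makes the score insensitive to overall brightness, which matters for underground-garage lighting; a production system would likely use a learned embedding instead.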
4. The method of claim 1, further comprising, before determining whether a suspended obstacle exists around the vehicle based on the vehicle-surrounding image captured by the surround-view camera:
determining whether a base obstacle exists around the vehicle;
wherein determining whether a suspended obstacle exists around the vehicle based on the vehicle-surrounding image captured by the surround-view camera comprises:
when a base obstacle exists around the vehicle, determining whether a suspended obstacle exists around the vehicle based on the vehicle-surrounding image captured by the surround-view camera.
5. The method of claim 1, wherein determining the projected position of the suspended obstacle on the ground based on the vehicle-surrounding image comprises:
acquiring vehicle-surrounding images captured by the surround-view camera of the suspended obstacle from a plurality of positions and/or two angles;
determining a position of the suspended obstacle relative to the vehicle according to the vehicle-surrounding images corresponding to the respective positions, the positional deviation between the positions, and/or the angular deviation between the two angles; and
determining the projected position of the suspended obstacle on the ground according to the position of the suspended obstacle relative to the vehicle.
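Claim 5's multi-view step amounts to triangulation: observing the same suspended obstacle from two viewpoints with a known positional offset gives two bearings whose intersection fixes the obstacle's position relative to the vehicle, and dropping the height yields the ground projection. A minimal two-view sketch under assumed geometry (both viewpoints look along the vehicle's +y axis, bearings measured from +y toward +x); the patent does not give this formulation.

```python
import math

def ground_projection_two_views(bearing1_deg: float, bearing2_deg: float,
                                baseline_m: float) -> tuple:
    """Triangulate the ground-plane point (x, y) seen from two viewpoints
    separated by baseline_m along the x axis. Illustrative geometry only."""
    t1 = math.tan(math.radians(bearing1_deg))
    t2 = math.tan(math.radians(bearing2_deg))
    # Ray 1: x = y * t1.  Ray 2: x = baseline_m + y * t2.  Intersect the rays:
    y = baseline_m / (t1 - t2)
    x = y * t1
    return x, y
```

For a point 1 m right of and 2 m ahead of the first viewpoint, with the second viewpoint 1 m to the right, the bearings are about 26.57 and 0 degrees and the function recovers (1.0, 2.0). In practice the "plurality of positions" would come from vehicle odometry as the car moves.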
6. The method of claim 1, further comprising:
determining, according to a distance signal generated by a parking ultrasonic radar, a measured distance between the vehicle and a base obstacle from which the suspended obstacle protrudes;
wherein determining the projected position of the suspended obstacle on the ground based on the vehicle-surrounding image comprises:
determining an estimated protrusion distance of the suspended obstacle relative to the base obstacle; and
determining the projected position of the suspended obstacle on the ground according to the measured distance and the estimated protrusion distance.
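Reading claim 6's estimated distance as the horizontal amount by which the suspended part protrudes from the base obstacle toward the vehicle, the projection computation reduces to a subtraction: the ultrasonic radar measures the base obstacle (e.g. a wall), and the overhang's ground projection lies that much closer to the vehicle. A sketch of that reading; the `can_park` helper and its 0.3 m safety margin are added assumptions.

```python
def suspended_projection_distance(measured_m: float, protrusion_m: float) -> float:
    """Distance from the vehicle to the ground projection of the suspended
    part: radar-measured distance to the base obstacle minus the estimated
    protrusion of the overhang toward the vehicle (one reading of claim 6)."""
    return measured_m - protrusion_m

def can_park(depth_needed_m: float, measured_m: float, protrusion_m: float,
             margin_m: float = 0.3) -> bool:
    """Usable slot depth is limited by the overhang's ground projection,
    not by the radar reading alone; margin_m is an assumed safety buffer."""
    return suspended_projection_distance(measured_m, protrusion_m) - margin_m >= depth_needed_m
```

This captures why the fusion matters: planning against the raw 2.5 m radar reading would overstate the usable depth when a 0.4 m ledge hangs over the slot.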
7. The method of claim 1, further comprising:
determining, according to a distance signal generated by a parking ultrasonic radar, a measured distance between the vehicle and a base obstacle from which the suspended obstacle protrudes;
wherein determining the projected position of the suspended obstacle on the ground based on the vehicle-surrounding image comprises:
determining a type of the suspended obstacle based on the vehicle-surrounding image;
determining a matching preset protrusion distance according to the type of the suspended obstacle; and
determining the projected position of the suspended obstacle on the ground according to the measured distance and the preset protrusion distance.
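Claim 7 replaces claim 6's per-instance estimate with a lookup: classify the suspended obstacle from the image, then apply a preset protrusion distance for that type. The obstacle types and distances below are invented for illustration; the patent lists none.

```python
# Preset protrusion distance per obstacle type, sketching claim 7.
# Types and values are illustrative assumptions.
PRESET_PROTRUSION_M = {
    "air_conditioner_unit": 0.40,
    "fire_hydrant_cabinet": 0.25,
    "wall_mounted_pipe": 0.15,
}
DEFAULT_PROTRUSION_M = 0.50  # conservative fallback for unrecognized types

def projection_distance(measured_m: float, obstacle_type: str) -> float:
    """Radar distance to the base obstacle minus the type-matched preset
    protrusion gives the suspended part's ground-projection distance."""
    protrusion = PRESET_PROTRUSION_M.get(obstacle_type, DEFAULT_PROTRUSION_M)
    return measured_m - protrusion
```

Choosing a conservative default for unknown types errs toward a larger clearance, which is the safer failure mode for a parking planner.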
8. The method of any one of claims 1-7, further comprising, before planning a parking path based on the projected position of the suspended obstacle on the ground:
determining a parking-space marking line based on the vehicle-surrounding image captured by the surround-view camera;
wherein planning the parking path based on the projected position of the suspended obstacle on the ground comprises:
planning the parking path according to the projected position of the suspended obstacle on the ground and the parking-space marking line.
9. A parking control apparatus, characterized by comprising:
a suspended-obstacle recognition unit configured to determine whether a suspended obstacle exists around a vehicle based on a vehicle-surrounding image captured by a surround-view camera;
a projected-position determining unit configured to determine, when it is determined that a suspended obstacle exists around the vehicle, a projected position of the suspended obstacle on the ground based on the vehicle-surrounding image; and
a parking-path planning unit configured to plan a parking path according to the projected position of the suspended obstacle on the ground.
10. A vehicle, comprising a surround-view camera and a vehicle controller;
wherein the surround-view camera is configured to capture a vehicle-surrounding image; and
the vehicle controller acquires the vehicle-surrounding image captured by the surround-view camera and executes the parking control method of any one of claims 1 to 8.
11. A computer-readable storage medium storing executable instructions that, when executed by a vehicle controller, cause the vehicle controller to implement the parking control method of any one of claims 1-8.
CN202210417995.6A 2022-04-20 2022-04-20 Parking control method, device, vehicle and storage medium Pending CN116946114A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210417995.6A CN116946114A (en) 2022-04-20 2022-04-20 Parking control method, device, vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210417995.6A CN116946114A (en) 2022-04-20 2022-04-20 Parking control method, device, vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN116946114A true CN116946114A (en) 2023-10-27

Family

ID=88460674

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210417995.6A Pending CN116946114A (en) 2022-04-20 2022-04-20 Parking control method, device, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN116946114A (en)

Similar Documents

Publication Publication Date Title
US11216972B2 (en) Vehicle localization using cameras
CN107554430B (en) Vehicle blind area visualization method, device, terminal, system and vehicle
CN110316182B (en) Automatic parking system and method
US10112585B1 (en) Vehicle cleanliness detection systems and methods
US10346694B2 (en) Vehicle start support device
JP6926976B2 (en) Parking assistance device and computer program
JP4556742B2 (en) Vehicle direct image display control apparatus and vehicle direct image display control program
JP6678605B2 (en) Information processing apparatus, information processing method, and information processing program
US20110169957A1 (en) Vehicle Image Processing Method
CN110126820A (en) Automated parking system, method of parking and vehicle
JP2004056763A (en) Monitoring apparatus, monitoring method, and program for monitor
CN110831818B (en) Parking assist method and parking assist device
JP2009211624A (en) Driving support device, driving support method, and computer program
WO2022227656A1 (en) Vehicle rearview mirror control method and apparatus, and electronic device and storage medium
CN110341621B (en) Obstacle detection method and device
TW201724052A (en) Vehicle monitoring system and method thereof
CN112092809A (en) Auxiliary reversing method, device and system and vehicle
CN104228683A (en) Method for operating a driver assist system for maneuvering and/or parking a motor vehicle
CN106295553B (en) Interactive parking method and system based on image recognition and vehicle
US20210327129A1 (en) Method for a sensor-based and memory-based representation of a surroundings, display device and vehicle having the display device
JP2006160193A (en) Vehicular drive supporting device
CN116946114A (en) Parking control method, device, vehicle and storage medium
CN112026751A (en) Vehicle and control method thereof
CN112912895B (en) Detection method and device and vehicle
CN116946115A (en) Parking control method, device, vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination