CN111429469A - Parking position determining method and device, electronic equipment and storage medium - Google Patents

Parking position determining method and device, electronic equipment and storage medium

Info

Publication number
CN111429469A
CN111429469A (application CN201910309691.6A)
Authority
CN
China
Prior art keywords
berth
boundary
berths
endpoint
monitoring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910309691.6A
Other languages
Chinese (zh)
Other versions
CN111429469B (en)
Inventor
王志海 (Wang Zhihai)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201910309691.6A priority Critical patent/CN111429469B/en
Publication of CN111429469A publication Critical patent/CN111429469A/en
Application granted granted Critical
Publication of CN111429469B publication Critical patent/CN111429469B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/14 Traffic control systems for road vehicles indicating individual free spaces in parking areas

Abstract

Embodiments of the present application provide a method and an apparatus for determining a berth position, an electronic device, and a storage medium, applied to the technical field of image processing. The method for determining a berth position includes: acquiring target image data of a monitoring scene of a designated monitoring device and displaying the target image data; acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data; and dividing the region defined by the first berth boundary and the second berth boundary into a plurality of berth positions according to the number of berths. The method of the embodiments of the present application determines the berth positions in the monitoring scene of the monitoring device and, compared with having the user mark the four edges of each berth separately, can improve the efficiency of marking berth positions.

Description

Parking position determining method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for determining a parking position, an electronic device, and a storage medium.
Background
With the popularization of monitoring equipment, parking lots, roadside parking areas and the like are gradually being covered by surveillance. In the related art, surveillance video data is generally used only for subsequent evidence collection; in practice, however, it can also be used for berth statistics and intelligent berth management.
For monitoring equipment such as a smart camera, parking conditions, for example whether a berth is occupied, can be counted from determined berth positions by using computer vision technology. Such statistics depend on determining the berth positions, but because factors such as mounting height and mounting angle differ between monitoring devices, the berth positions in a monitoring device's image are difficult to determine. It is therefore desirable to be able to determine the berth positions in the monitoring scene of a monitoring device.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for determining a parking position, an electronic device, and a storage medium, so as to determine a parking position in a monitoring scene of a monitoring device. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a method for determining a parking position, where the method includes:
acquiring target image data of a monitoring scene of a designated monitoring device, and displaying the target image data;
acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data;
and dividing the region determined by the first berth boundary and the second berth boundary into a plurality of berth positions according to the berth number.
Optionally, after the target image data of the monitoring scene of the designated monitoring device is obtained and displayed, the method further includes:
judging whether the positions of all the berths in the monitoring scene of the specified monitoring equipment are marked or not;
the acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data includes:
and if the positions of the berths in the monitoring scene of the designated monitoring equipment are not marked, acquiring a first berth boundary, a second berth boundary and the number of the berths in the target image data.
Optionally, after the determining whether the berth positions in the monitoring scene of the designated monitoring device have been marked, the method further includes:
and if the positions of the berths in the monitoring scene of the specified monitoring equipment are marked, displaying the positions of the berths in the monitoring scene of the specified monitoring equipment.
Optionally, the acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data includes:
acquiring first berth boundary drawing information and second berth boundary drawing information input by a user, drawing the first berth boundary according to the first berth boundary drawing information, and drawing the second berth boundary according to the second berth boundary drawing information;
acquiring the number information of the berths input by a user, and determining the number of the berths according to the number information of the berths.
Optionally, dividing the region defined by the first berth boundary and the second berth boundary into a plurality of berth positions according to the number of berths includes:
determining each berth endpoint of the first berth boundary according to the first berth boundary and the berth number, and determining each berth endpoint of the second berth boundary according to the second berth boundary and the berth number;
and connecting the berth end points of the first berth boundary with the berth end points of the second berth boundary in the same sequence to obtain the berth positions in the monitoring scene of the specified monitoring equipment.
Optionally, the method for determining a parking position according to the embodiment of the present application further includes:
acquiring berth endpoint adjustment information input by a user;
adjusting the position of the corresponding berth endpoint in the first berth boundary and/or the second berth boundary according to the berth endpoint adjustment information;
and correcting the corresponding berth position in the monitoring scene of the specified monitoring equipment according to the adjusted berth end point position.
Optionally, the method for determining a parking position according to the embodiment of the present application further includes:
acquiring berth boundary adjustment information input by a user;
adjusting the first berth boundary and/or the second berth boundary according to the berth boundary adjustment information;
and correcting the corresponding berthing position in the monitoring scene of the appointed monitoring equipment according to the current first berthing boundary and the second berthing boundary.
Optionally, the method for determining a parking position according to the embodiment of the present application further includes:
and storing each berth position in the monitoring scene of the specified monitoring equipment.
In a second aspect, an embodiment of the present application provides a berth position determining apparatus, including:
the image data display module is used for acquiring target image data of a monitoring scene of the appointed monitoring equipment and displaying the target image data;
the calculation parameter acquisition module is used for acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data;
and the berthing position determining module is used for dividing the region determined by the first berthing boundary and the second berthing boundary into a plurality of berthing positions according to the number of berths.
Optionally, the berth position determining apparatus according to an embodiment of the present application further includes:
the berth marking judging module is used for judging whether each berth position in the monitoring scene of the specified monitoring equipment is marked;
the calculation parameter obtaining module is specifically configured to:
and if the positions of the berths in the monitoring scene of the designated monitoring equipment are not marked, acquiring a first berth boundary, a second berth boundary and the number of the berths in the target image data.
Optionally, the berth position determining apparatus according to an embodiment of the present application further includes:
and the berth position display module is used for displaying each berth position in the monitoring scene of the specified monitoring equipment if each berth position in the monitoring scene of the specified monitoring equipment is marked.
Optionally, the calculation parameter obtaining module includes:
the system comprises a drawing information acquisition submodule, a drawing information acquisition submodule and a drawing information extraction submodule, wherein the drawing information acquisition submodule is used for acquiring first berth boundary drawing information and second berth boundary drawing information input by a user, drawing a first berth boundary according to the first berth boundary drawing information, and drawing a second berth boundary according to the second berth boundary drawing information;
and the number information acquisition submodule is used for acquiring the number information of the berths input by the user and determining the number of the berths according to the number information of the berths.
Optionally, the berthing position determining module includes:
a berth endpoint determination submodule, configured to determine each berth endpoint of the first berth boundary according to the first berth boundary and the number of berths, and determine each berth endpoint of the second berth boundary according to the second berth boundary and the number of berths;
and the berth position marking submodule is used for connecting each berth endpoint of the first berth boundary with the berth endpoint in the same sequence in each berth endpoint of the second berth boundary to obtain each berth position in the monitoring scene of the specified monitoring equipment.
Optionally, the berth position determining apparatus according to an embodiment of the present application further includes:
the system comprises an endpoint adjustment information acquisition module, a position acquisition module and a position acquisition module, wherein the endpoint adjustment information acquisition module is used for acquiring berth endpoint adjustment information input by a user;
a berth endpoint correction module, configured to adjust positions of corresponding berth endpoints in the first berth boundary and/or the second berth boundary according to the berth endpoint adjustment information;
and the berth position correction module is used for correcting the corresponding berth position in the monitoring scene of the specified monitoring equipment according to the adjusted position of the berth endpoint.
Optionally, the berth position determining apparatus according to an embodiment of the present application further includes:
the boundary adjustment information acquisition module is used for acquiring berth boundary adjustment information input by a user;
a berth boundary correction module, configured to adjust the first berth boundary and/or the second berth boundary according to the berth boundary adjustment information;
and the berth position correcting module is used for correcting the corresponding berth position in the monitoring scene of the specified monitoring equipment according to the current first berth boundary and the second berth boundary.
Optionally, the berth position determining apparatus according to an embodiment of the present application further includes:
and the berth position storage module is used for storing each berth position in the monitoring scene of the specified monitoring equipment.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory;
the memory is used for storing a computer program;
the processor is configured to implement the method for determining a parking position according to any one of the first aspect described above when executing the program stored in the memory.
In a fourth aspect, a computer-readable storage medium has stored therein a computer program which, when executed by a processor, implements the method for determining a berth position according to any one of the above-mentioned first aspects.
With the method, apparatus, electronic device and storage medium for determining a berth position provided by the embodiments of the present application, target image data of a monitoring scene of a designated monitoring device is acquired and displayed; a first berth boundary, a second berth boundary and the number of berths in the target image data are acquired; and the region defined by the first berth boundary and the second berth boundary is divided into a plurality of berth positions according to the number of berths. This determines the berth positions in the monitoring scene of the monitoring device and, compared with having the user mark the four edges of each berth separately, can improve the efficiency of marking berth positions. Of course, not all of the advantages described above need to be achieved at the same time when practicing any one product or method of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a first schematic diagram of a berth position determining method according to an embodiment of the present application;
FIG. 2a is a first schematic view of a first berth boundary and a second berth boundary according to an embodiment of the present application;
FIG. 2b is a second schematic view of a first berth boundary and a second berth boundary according to an embodiment of the present application;
fig. 3 is a second schematic diagram of a berth position determining method according to an embodiment of the present application;
FIG. 4 is a schematic view of berth endpoints according to an embodiment of the present application;
FIG. 5 is a third schematic diagram of a berth-position determining method according to an embodiment of the present application;
FIG. 6 is a schematic view of a berth-position determining apparatus according to an embodiment of the present application;
fig. 7 is a schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
Berth position: refers to a parking space in a parking place such as a roadside parking area or a garage. The embodiment of the application provides a method for determining a berth position; referring to fig. 1, the method includes the following steps:
s101, acquiring target image data of a monitoring scene of the appointed monitoring equipment, and displaying the target image data.
The method for determining the parking position according to the embodiment of the present application may be implemented by a client device, where the client device includes a processor and a memory, where the memory stores a computer program, and the processor executes the computer program stored in the memory to implement the method for determining the parking position according to the embodiment of the present application.
The client device obtains image data of the monitoring scene acquired by the designated monitoring device, i.e. the target image data. The target image data may be a video stream or individual video frames. After obtaining the target image data, the client device displays it. The client device may have a built-in or externally connected display screen on which the target image data is displayed.
S102, acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data.
The electronic device acquires the first berth boundary, the second berth boundary and the number of berths in the target image data. The first berth boundary and the second berth boundary are two opposite berth boundary lines in the actual monitoring scene, and they pass through the berths whose positions are to be marked; for example, they may be as shown in fig. 2a and 2b, where the positions of the first berth boundary and the second berth boundary may be interchanged. In fig. 2a and 2b, the lighter and thicker lines represent the berth boundaries, the darker and thinner lines represent the berth lines, and 001, 002 and 003 are the labels of the respective berths. In a possible implementation manner, the target image data includes the berths to be marked, and the electronic device may analyze the target image data through a computer vision technique, such as a pre-trained convolutional neural network, to obtain the first berth boundary, the second berth boundary and the number of berths in the target image data.
In one possible embodiment, the first berth boundary, the second berth boundary and the number of berths may be manually input in order to increase the accuracy of the berth position marking. Optionally, referring to fig. 3, the acquiring of the first berth boundary, the second berth boundary and the number of berths in the target image data includes:
s301, obtaining first berth boundary drawing information and second berth boundary drawing information input by a user, drawing the first berth boundary according to the first berth boundary drawing information, and drawing the second berth boundary according to the second berth boundary drawing information.
The client device obtains first berth boundary drawing information and second berth boundary drawing information input by a user, draws the first berth boundary in the target image data according to the first berth boundary drawing information, and draws the second berth boundary in the target image data according to the second berth boundary drawing information. For example, the user may input two line segments on the target image data by means of a mouse or a touch screen, and the client device draws the first berth boundary and the second berth boundary according to the line segments input by the user.
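By way of a non-limiting illustrative sketch of this step (the function and field names below are assumptions for illustration only, not part of the claimed method), the boundary drawing information may be modeled as the press and release coordinates of a mouse drag, or of two taps on a touch screen:

```python
from typing import Dict, Tuple

Point = Tuple[float, float]

def boundary_from_drawing_info(drawing_info: Dict[str, Point]) -> Tuple[Point, Point]:
    """Interpret one piece of berth boundary drawing information as a line
    segment given by its two user-input endpoints (e.g. press/release of a
    mouse drag); 'press' and 'release' are illustrative key names."""
    return drawing_info["press"], drawing_info["release"]

# Example: two drags yield the first and second berth boundaries (image coordinates).
first_boundary = boundary_from_drawing_info({"press": (120.0, 80.0), "release": (130.0, 460.0)})
second_boundary = boundary_from_drawing_info({"press": (420.0, 70.0), "release": (440.0, 455.0)})
```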
S302, acquiring the number information of the berths input by the user, and determining the number of the berths according to the number information of the berths.
The client device acquires the berth number information input by the user and thereby determines the number of berths. For example, if the user inputs berth number information of 4, the number of berths is determined to be 4. The embodiment of the present application does not limit the order of S301 and S302: S301 may be executed before S302, S302 may be executed before S301, or S301 and S302 may be executed simultaneously.
S103, dividing the region defined by the first and second berth boundaries into a plurality of berth positions according to the number of berths.
The client device determines a quadrilateral area according to the first berth boundary and the second berth boundary, and divides the quadrilateral area into the same number of sub-areas as the number of berths, each sub-area being one berth position. For example, the quadrilateral area may be divided equally into as many sub-areas as there are berths.
In one possible embodiment, the dividing the region defined by the first and second berth boundaries into a plurality of berth positions according to the number of berths includes:
step one, determining each berth end point of the first berth boundary according to the first berth boundary and the berth number, and determining each berth end point of the second berth boundary according to the second berth boundary and the berth number.
The client device divides the two berth boundary lines according to a preset rule and determines the berth endpoints on each boundary line. The preset rule can be set according to the actual situation; for example, the client device may divide the two boundary lines at equal distances, or the client device may determine, according to the angle of view and the shooting distance of the monitoring device, the ratio by which equal distances in the actual scene shrink in the target image data as the shooting distance increases from near to far, and divide the two boundary lines proportionally according to that ratio. The number of berth endpoints on each boundary line is one more than the number of berths. For example, if the number of berths is 5, the number of berth endpoints on each boundary line is 6.
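As a hedged sketch only (function and parameter names are illustrative and not part of the claims), dividing one boundary line into berth endpoints at equal distances, or proportionally to approximate the perspective reduction described above, could look like the following:

```python
from typing import List, Sequence, Tuple

Point = Tuple[float, float]

def berth_endpoints(p_start: Point, p_end: Point, berth_count: int) -> List[Point]:
    """Divide one berth boundary line, given by its two image-coordinate end
    points, into berth_count equal segments; returns berth_count + 1 berth
    endpoints, including both ends."""
    (x0, y0), (x1, y1) = p_start, p_end
    return [(x0 + (x1 - x0) * i / berth_count,
             y0 + (y1 - y0) * i / berth_count)
            for i in range(berth_count + 1)]

def berth_endpoints_proportional(p_start: Point, p_end: Point,
                                 ratios: Sequence[float]) -> List[Point]:
    """Divide one boundary line proportionally: ratios holds one length
    fraction per berth (summing to 1), e.g. shrinking from near to far to
    approximate perspective foreshortening."""
    (x0, y0), (x1, y1) = p_start, p_end
    points, t = [(x0, y0)], 0.0
    for r in ratios:
        t += r
        points.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
    return points
```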
Step two, connecting the berth endpoints of the first berth boundary with the berth endpoints of the second berth boundary in the same order, to obtain each berth position in the monitoring scene of the designated monitoring device.
According to the order of the berth endpoints, the client device connects each berth endpoint of the first berth boundary with the berth endpoint of the second berth boundary that has the same order. For example, as shown in fig. 4, the 1st berth endpoint of the first berth boundary is connected with the 1st berth endpoint of the second berth boundary, the 2nd berth endpoint of the first berth boundary with the 2nd berth endpoint of the second berth boundary, and so on, thereby obtaining each berth position in the monitoring scene of the designated monitoring device. The order of the berth endpoints can be set according to the actual implementation, but the first berth boundary and the second berth boundary must use the same rule for ordering their berth endpoints. For example, when the boundary lines run vertically, the endpoints are ordered from top to bottom, the topmost endpoint being the first berth endpoint; when the boundary lines run horizontally, the endpoints are ordered from left to right, the leftmost endpoint being the first berth endpoint. In fig. 4, the lighter and thicker lines represent the berth boundaries, the darker and thinner lines represent the berth lines, and 001, 002 and 003 are the labels of the respective berths.
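Continuing the same illustrative sketch (names are assumptions, not the claimed implementation), connecting same-order endpoints of the two boundaries yields one quadrilateral per berth:

```python
from typing import List, Tuple

Point = Tuple[float, float]
Quad = List[Point]

def berth_positions(first_pts: List[Point], second_pts: List[Point]) -> List[Quad]:
    """Connect same-order berth endpoints of the two boundaries; berth i is
    the quadrilateral formed by endpoints i and i+1 of each boundary."""
    assert len(first_pts) == len(second_pts), "both boundaries need the same endpoint count"
    return [[first_pts[i], first_pts[i + 1], second_pts[i + 1], second_pts[i]]
            for i in range(len(first_pts) - 1)]

# Example usage with the endpoint sketch above: 3 berths between two boundaries.
first_pts = berth_endpoints((120.0, 80.0), (130.0, 460.0), 3)
second_pts = berth_endpoints((420.0, 70.0), (440.0, 455.0), 3)
berths = berth_positions(first_pts, second_pts)   # three quadrilaterals, e.g. berths 001-003
```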
After the client device determines each berth position in the monitoring scene of the designated monitoring device, it may also configure each berth position to the designated monitoring device; for example, the client device may access the designated monitoring device through a browser by means of a Web control, thereby configuring the berth positions of the designated monitoring device.
In the embodiment of the application, the berth positions in the monitoring scene of the monitoring device are determined; compared with having the user mark the four edges of each berth separately, this can improve the efficiency of marking berth positions.
Optionally, referring to fig. 5, after the target image data of the monitoring scene of the designated monitoring device is obtained and displayed, the method further includes:
s501, judging whether the positions of the berths in the monitoring scene of the specified monitoring equipment are marked or not.
The client device judges whether each berth position in the monitoring scene of the designated monitoring device has been marked. For example, the client device determines whether a designated storage location stores the berth positions in the monitoring scene of the designated monitoring device; if so, it determines that the berth positions have been marked, and if not, that they have not been marked. The designated storage location depends on where berth positions are actually configured to be saved, for example local storage of the client device, cloud storage, or storage of the designated monitoring device.
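As a minimal sketch of this check, assuming the designated storage location is a local JSON file keyed by device (cloud or on-device storage would follow the same pattern; all names below are illustrative):

```python
import json
from pathlib import Path
from typing import List, Optional

def stored_berth_positions(storage_path: str, device_id: str) -> Optional[List]:
    """Return the saved berth positions for the designated monitoring device
    if they have already been marked, otherwise None."""
    path = Path(storage_path)
    if not path.exists():
        return None
    data = json.loads(path.read_text())
    return data.get(device_id, {}).get("berth_positions")

# If this returns None, the boundary drawing and berth count input (S301/S302)
# are carried out; otherwise the stored berth positions are displayed.
```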
The acquiring of the first berth boundary, the second berth boundary and the number of berths in the target image data includes:
and if the positions of the berths in the monitoring scene of the appointed monitoring equipment are not marked, acquiring a first berth boundary, a second berth boundary and the number of the berths in the target image data.
If the positions of the berths in the monitoring scene of the appointed monitoring equipment are not marked, the client equipment acquires a first berth boundary, a second berth boundary and the number of the berths in the target image data so as to carry out a subsequent berth position calibration process.
Optionally, after determining whether each berth position in the monitoring scene of the specified monitoring device has been marked, the method further includes:
and S502, if the positions of the berths in the monitoring scene of the specified monitoring equipment are marked, displaying the positions of the berths in the monitoring scene of the specified monitoring equipment.
And if the positions of the berths in the monitoring scene of the specified monitoring equipment are marked, the client equipment acquires and displays the positions of the berths in the monitoring scene of the specified monitoring equipment in the target image data.
In practice, the automatic marking of the berth endpoints may be inaccurate, so manual revision may be required. Optionally, the method for determining a berth position according to the embodiment of the present application further includes:
and S503, acquiring the berth endpoint adjustment information input by the user.
The client device obtains the berth endpoint adjustment information input by the user; the berth endpoint adjustment information indicates the berth endpoint to be adjusted and its adjusted position. For example, the user may directly drag the designated berth endpoint by means of a touch screen or a mouse, and the client device obtains the user's adjustment input, i.e. the berth endpoint adjustment information.
S504, adjusting the position of the corresponding parking endpoint in the first parking boundary and/or the second parking boundary according to the parking endpoint adjustment information.
When the berth endpoint adjustment information indicates adjusting berth endpoints on the first berth boundary, the client device adjusts the positions of the corresponding berth endpoints on the first berth boundary. When it indicates adjusting berth endpoints on the second berth boundary, the client device adjusts the positions of the corresponding berth endpoints on the second berth boundary. When it indicates adjusting berth endpoints on both the first berth boundary and the second berth boundary, the client device adjusts the positions of the corresponding berth endpoints on both boundaries.
And S505, correcting the corresponding berth position in the monitoring scene of the specified monitoring equipment according to the adjusted berth endpoint position.
According to the adjusted berth endpoint positions, the client device corrects those berth positions in the monitoring scene of the designated monitoring device whose berth endpoints were adjusted, and does not change berth positions whose berth endpoints were not adjusted.
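A possible sketch of this local correction, reusing the illustrative representation above (an endpoint index on one boundary; at most the two berths sharing that endpoint are rebuilt, the rest are untouched):

```python
from typing import List, Tuple

Point = Tuple[float, float]

def adjust_berth_endpoint(first_pts: List[Point], second_pts: List[Point],
                          berths: List[List[Point]], boundary: str,
                          index: int, new_xy: Point) -> None:
    """Move one berth endpoint ('first' or 'second' boundary, position index)
    and rebuild only the at most two berth positions that share it; all other
    berth positions are left unchanged."""
    pts = first_pts if boundary == "first" else second_pts
    pts[index] = new_xy
    for k in (index - 1, index):          # the berths sharing this endpoint
        if 0 <= k < len(berths):
            berths[k] = [first_pts[k], first_pts[k + 1],
                         second_pts[k + 1], second_pts[k]]
```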
Optionally, the method for determining a parking position according to the embodiment of the present application further includes:
s506, storing each berth position in the monitoring scene of the specified monitoring equipment.
The client device converts each berth position in the monitoring scene of the designated monitoring device into coordinate data and stores the coordinate data. The client device may store the coordinate data of each berth position locally, in cloud storage, or in the designated monitoring device.
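For illustration only (the storage format is not specified by the application; the local JSON file and field names below are assumptions consistent with the earlier check sketch), the coordinate data could be saved as follows:

```python
import json
from pathlib import Path
from typing import List, Tuple

Point = Tuple[float, float]

def save_berth_positions(storage_path: str, device_id: str,
                         berths: List[List[Point]]) -> None:
    """Serialize each berth position as four [x, y] image-coordinate corners
    and store it under the designated storage location, keyed by device."""
    path = Path(storage_path)
    data = json.loads(path.read_text()) if path.exists() else {}
    data[device_id] = {
        "berth_positions": [[[float(x), float(y)] for (x, y) in quad]
                            for quad in berths]
    }
    path.write_text(json.dumps(data, indent=2))
```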
In the embodiment of the application, the berth positions in the monitoring scene of the designated monitoring device are stored, so that they can be conveniently retrieved later, which facilitates their subsequent use and correction.
In order to improve the accuracy of the parking position mark, in a possible implementation manner, the method for determining a parking position according to the embodiment of the present application further includes:
step one, acquiring berth boundary adjustment information input by a user.
The client device obtains the berth boundary adjustment information input by the user; the berth boundary adjustment information indicates the berth boundary to be adjusted and its adjusted position. For example, the user may directly drag the designated berth boundary by means of a touch screen or a mouse, and the client device obtains the user's adjustment input, i.e. the berth boundary adjustment information.
And step two, adjusting the first berth boundary and/or the second berth boundary according to the berth boundary adjustment information.
When the berth boundary adjustment information indicates adjusting the position of the first berth boundary, the position of the first berth boundary is adjusted; when it indicates adjusting the position of the second berth boundary, the position of the second berth boundary is adjusted; and when it indicates adjusting the positions of both the first berth boundary and the second berth boundary, the positions of both boundaries are adjusted.
And step three, correcting the corresponding berth position in the monitoring scene of the appointed monitoring equipment according to the current first berth boundary and the second berth boundary.
The berth positions are re-determined according to the adjusted, current first berth boundary and second berth boundary, so as to correct the berth positions.
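Under the same illustrative assumptions as the sketches above, this correction amounts to re-deriving the berth positions from the current boundaries:

```python
# Re-derive all berth positions from the adjusted boundaries (illustrative;
# first_boundary / second_boundary are the current, user-adjusted segments
# and the berth count of 3 is just an example value).
first_pts = berth_endpoints(*first_boundary, berth_count=3)
second_pts = berth_endpoints(*second_boundary, berth_count=3)
berths = berth_positions(first_pts, second_pts)
```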
An embodiment of the present application further provides a berth position determining apparatus, referring to fig. 6, the apparatus includes:
the image data display module 601 is configured to obtain target image data of a monitoring scene of a designated monitoring device, and display the target image data;
a calculation parameter obtaining module 602, configured to obtain a first berth boundary, a second berth boundary, and the number of berths in the target image data;
a berth position determining module 603, configured to divide an area determined by the first berth boundary and the second berth boundary into a plurality of berth positions according to the number of berths.
In the embodiment of the application, the berth positions in the monitoring scene of the monitoring device are determined; compared with having the user mark the four edges of each berth separately, this can improve the efficiency of marking berth positions.
Optionally, the berth position determining apparatus according to the embodiment of the present application further includes:
the berth marking judging module is used for judging whether each berth position in the monitoring scene of the specified monitoring equipment is marked;
the calculation parameter obtaining module 602 is specifically configured to:
and if the positions of the berths in the monitoring scene of the appointed monitoring equipment are not marked, acquiring a first berth boundary, a second berth boundary and the number of the berths in the target image data.
Optionally, the berth position determining apparatus according to the embodiment of the present application further includes:
and the berth position display module is used for displaying each berth position in the monitoring scene of the specified monitoring equipment if each berth position in the monitoring scene of the specified monitoring equipment is marked.
Optionally, the calculation parameter obtaining module 602 includes:
the system comprises a drawing information acquisition submodule, a drawing information acquisition submodule and a drawing information extraction submodule, wherein the drawing information acquisition submodule is used for acquiring first berth boundary drawing information and second berth boundary drawing information input by a user, drawing the first berth boundary according to the first berth boundary drawing information and drawing the second berth boundary according to the second berth boundary drawing information;
and the number information acquisition submodule is used for acquiring the number information of the berths input by the user and determining the number of the berths according to the number information of the berths.
Optionally, the berth position determining module 603 includes:
A berth endpoint determination submodule, configured to determine each berth endpoint of the first berth boundary according to the first berth boundary and the number of berths, and determine each berth endpoint of the second berth boundary according to the second berth boundary and the number of berths;
and the berth position marking submodule is used for connecting each berth endpoint of the first berth boundary with the berth endpoint of the second berth boundary in the same sequence to obtain each berth position in the monitoring scene of the specified monitoring equipment.
Optionally, the berth position determining apparatus according to the embodiment of the present application further includes:
the adjustment information acquisition module is used for acquiring the berth endpoint adjustment information input by a user;
a berth endpoint correction module, configured to adjust positions of corresponding berth endpoints in the first berth boundary and/or the second berth boundary according to the berth endpoint adjustment information;
and the berth position correction module is used for correcting the corresponding berth position in the monitoring scene of the specified monitoring equipment according to the adjusted position of the berth endpoint.
Optionally, the berth position determining apparatus according to the embodiment of the present application further includes:
and the berth position storage module is used for storing each berth position in the monitoring scene of the specified monitoring equipment.
Optionally, the berth position determining apparatus according to an embodiment of the present application further includes:
the boundary adjustment information acquisition module is used for acquiring berth boundary adjustment information input by a user;
a berth boundary correction module, configured to adjust the first berth boundary and/or the second berth boundary according to the berth boundary adjustment information;
and the berth position correcting module is used for correcting the corresponding berth position in the monitoring scene of the specified monitoring equipment according to the current first berth boundary and the second berth boundary.
The embodiment of the application also provides an electronic device, which comprises a processor and a memory;
the memory is used for storing computer programs;
the processor is configured to implement the following steps when executing the program stored in the memory:
acquiring target image data of a monitoring scene of a designated monitoring device, and displaying the target image data, wherein the target image data comprises a berth to be marked;
acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data;
and dividing the region defined by the first berth boundary and the second berth boundary into a plurality of berth positions according to the number of berths.
In the embodiment of the application, the berth positions in the monitoring scene of the monitoring device are determined; compared with having the user mark the four edges of each berth separately, this can improve the efficiency of marking berth positions.
Optionally, when executing the program stored in the memory, the processor can further implement any of the above-described berth position determining methods.
Optionally, as shown in fig. 7, the electronic device according to the embodiment of the present application may further include a communication interface 702 and a communication bus 704, where the processor 701, the communication interface 702, and the memory 703 complete mutual communication through the communication bus 704.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), for example at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the following steps:
acquiring target image data of a monitoring scene of a designated monitoring device, and displaying the target image data, wherein the target image data comprises a berth to be marked;
acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data;
and dividing the region defined by the first berth boundary and the second berth boundary into a plurality of berth positions according to the number of berths.
In the embodiment of the application, the berth positions in the monitoring scene of the monitoring device are determined; compared with having the user mark the four edges of each berth separately, this can improve the efficiency of marking berth positions.
Optionally, the computer program, when executed by a processor, is further capable of implementing any of the above-described methods for determining a parking position.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the embodiments of the apparatus, the electronic device, and the storage medium, since they are substantially similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
The above description is only for the preferred embodiment of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application are included in the protection scope of the present application.

Claims (16)

1. A method for determining a berthing position, the method comprising:
acquiring target image data of a monitoring scene of a designated monitoring device, and displaying the target image data;
acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data;
and dividing the region determined by the first berth boundary and the second berth boundary into a plurality of berth positions according to the berth number.
2. The method of claim 1, wherein after the obtaining target image data of the scene monitored by the designated monitoring device and presenting the target image data, the method further comprises:
judging whether the positions of all the berths in the monitoring scene of the specified monitoring equipment are marked or not;
the acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data includes:
and if the positions of the berths in the monitoring scene of the designated monitoring equipment are not marked, acquiring a first berth boundary, a second berth boundary and the number of the berths in the target image data.
3. The method of claim 2, wherein after said determining whether each berth position in the designated monitoring device monitoring scene has been marked, the method further comprises:
and if the positions of the berths in the monitoring scene of the specified monitoring equipment are marked, displaying the positions of the berths in the monitoring scene of the specified monitoring equipment.
4. The method of claim 1, wherein the acquiring the first berth boundary, the second berth boundary, and the number of berths in the target image data comprises:
acquiring first berth boundary drawing information and second berth boundary drawing information input by a user, drawing the first berth boundary according to the first berth boundary drawing information, and drawing the second berth boundary according to the second berth boundary drawing information;
acquiring the number information of the berths input by a user, and determining the number of the berths according to the number information of the berths.
5. The method of claim 1, wherein said dividing the area defined by the first and second berth boundaries into a plurality of berth positions according to the number of berths comprises:
determining each berth endpoint of the first berth boundary according to the first berth boundary and the berth number, and determining each berth endpoint of the second berth boundary according to the second berth boundary and the berth number;
and connecting the berth end points of the first berth boundary with the berth end points of the second berth boundary in the same sequence to obtain the berth positions in the monitoring scene of the specified monitoring equipment.
6. The method of claim 5, further comprising:
acquiring berth endpoint adjustment information input by a user;
adjusting the position of the corresponding berth endpoint in the first berth boundary and/or the second berth boundary according to the berth endpoint adjustment information;
and correcting the corresponding berth position in the monitoring scene of the specified monitoring equipment according to the adjusted berth end point position.
7. The method of claim 1, further comprising:
acquiring berth boundary adjustment information input by a user;
adjusting the first berth boundary and/or the second berth boundary according to the berth boundary adjustment information;
and correcting the corresponding berthing position in the monitoring scene of the appointed monitoring equipment according to the current first berthing boundary and the second berthing boundary.
8. The method according to any one of claims 1-7, further comprising:
and storing each berth position in the monitoring scene of the specified monitoring equipment.
9. A berthing position determining apparatus, comprising:
the image data display module is used for acquiring target image data of a monitoring scene of the appointed monitoring equipment and displaying the target image data;
the calculation parameter acquisition module is used for acquiring a first berth boundary, a second berth boundary and the number of berths in the target image data;
and the berthing position determining module is used for dividing the region determined by the first berthing boundary and the second berthing boundary into a plurality of berthing positions according to the number of berths.
10. The apparatus of claim 9, further comprising:
the berth marking judging module is used for judging whether each berth position in the monitoring scene of the specified monitoring equipment is marked;
the calculation parameter obtaining module is specifically configured to:
and if the positions of the berths in the monitoring scene of the designated monitoring equipment are not marked, acquiring a first berth boundary, a second berth boundary and the number of the berths in the target image data.
11. The apparatus of claim 10, further comprising:
and the berth position display module is used for displaying each berth position in the monitoring scene of the specified monitoring equipment if each berth position in the monitoring scene of the specified monitoring equipment is marked.
12. The apparatus of claim 9, wherein the calculation parameter obtaining module comprises:
the drawing information acquisition submodule is used for acquiring first berth boundary drawing information and second berth boundary drawing information input by a user, drawing the first berth boundary according to the first berth boundary drawing information, and drawing the second berth boundary according to the second berth boundary drawing information;
and the number information acquisition submodule is used for acquiring the number information of the berths input by the user and determining the number of the berths according to the number information of the berths.
13. The apparatus of claim 9, wherein the berthing position determining module comprises:
a berth endpoint determination submodule, configured to determine each berth endpoint of the first berth boundary according to the first berth boundary and the number of berths, and determine each berth endpoint of the second berth boundary according to the second berth boundary and the number of berths;
and the berth position marking submodule is used for connecting each berth endpoint of the first berth boundary with the berth endpoint in the same sequence in each berth endpoint of the second berth boundary to obtain each berth position in the monitoring scene of the specified monitoring equipment.
14. The apparatus of claim 13, further comprising:
the endpoint adjustment information acquisition module is used for acquiring berth endpoint adjustment information input by a user;
a berth endpoint correction module, configured to adjust positions of corresponding berth endpoints in the first berth boundary and/or the second berth boundary according to the berth endpoint adjustment information;
and the berth position correction module is used for correcting the corresponding berth position in the monitoring scene of the specified monitoring equipment according to the adjusted position of the berth endpoint.
15. The apparatus of claim 9, further comprising:
the boundary adjustment information acquisition module is used for acquiring berth boundary adjustment information input by a user;
a berth boundary correction module, configured to adjust the first berth boundary and/or the second berth boundary according to the berth boundary adjustment information;
and the berth position correcting module is used for correcting the corresponding berth position in the monitoring scene of the specified monitoring equipment according to the current first berth boundary and the second berth boundary.
16. The apparatus of any of claims 9-15, further comprising:
and the berth position storage module is used for storing each berth position in the monitoring scene of the specified monitoring equipment.
CN201910309691.6A 2019-04-17 2019-04-17 Berth position determining method and device, electronic equipment and storage medium Active CN111429469B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910309691.6A CN111429469B (en) 2019-04-17 2019-04-17 Berth position determining method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910309691.6A CN111429469B (en) 2019-04-17 2019-04-17 Berth position determining method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111429469A (en) 2020-07-17
CN111429469B (en) 2023-11-03

Family

ID=71546745

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910309691.6A Active CN111429469B (en) 2019-04-17 2019-04-17 Berth position determining method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111429469B (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2138653A2 (en) * 2008-06-23 2009-12-30 PMS GmbH Parking area management
CN102005133A (en) * 2010-11-04 2011-04-06 任杰 Recognizable mark-based parking position detecting method
US20120112929A1 (en) * 2010-11-09 2012-05-10 International Business Machines Corporation Smart spacing allocation
WO2013136592A1 (en) * 2012-03-14 2013-09-19 オムロン株式会社 Area designating method and area designating device
CN103473950A (en) * 2012-06-06 2013-12-25 刘鉵 Parking lot parking space monitoring method
US20150339535A1 (en) * 2012-11-27 2015-11-26 Clarion Co., Ltd. On-vehicle image processor
JP2015075966A (en) * 2013-10-09 2015-04-20 富士通株式会社 Image processing device, image processing method and program
US20150124093A1 (en) * 2013-11-04 2015-05-07 Xerox Corporation Method for object size calibration to aid vehicle detection for video-based on-street parking technology
CN104112370A (en) * 2014-07-30 2014-10-22 哈尔滨工业大学深圳研究生院 Monitoring image based intelligent parking lot parking place identification method and system
CN108351958A (en) * 2015-10-22 2018-07-31 日产自动车株式会社 The detection method and device of the wire on parking stall
US20180128638A1 (en) * 2015-10-30 2018-05-10 Chongqing University Of Posts And Telecommunications Parking space navigation method, parking space management method, mobile terminal, and server
US20180099661A1 (en) * 2016-10-12 2018-04-12 Lg Electronics Inc. Parking assistance apparatus and vehicle having the same
KR101806066B1 (en) * 2017-03-24 2018-01-11 주식회사 넥스쿼드 Camera module with function of parking guidance
US20180364063A1 (en) * 2017-06-14 2018-12-20 Here Global B.V. Mapping system and method for identifying a parking lot from probe data
CN107248309A (en) * 2017-06-19 2017-10-13 深圳市盛路物联通讯技术有限公司 A kind of intelligentized parking charging method and system
CN109472184A (en) * 2017-09-08 2019-03-15 深圳市金溢科技股份有限公司 The condition detection method in berth, system and its data processing equipment in road
CN107738612A (en) * 2017-09-22 2018-02-27 西安电子科技大学 The detection of automatic parking parking stall and identifying system based on panoramic vision accessory system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115457760A (en) * 2022-08-03 2022-12-09 北京云星宇交通科技股份有限公司 Method and device for recognizing and binding license plate and parking space of vehicle
CN115457760B (en) * 2022-08-03 2023-09-01 北京云星宇交通科技股份有限公司 Method and device for identifying and binding license plate and berth of vehicle

Also Published As

Publication number Publication date
CN111429469B (en) 2023-11-03

Similar Documents

Publication Publication Date Title
JP6739517B2 (en) Lane recognition modeling method, device, storage medium and device, and lane recognition method, device, storage medium and device
US20180129856A1 (en) Systems and methods for adaptive scanning based on calculated shadows
US11107246B2 (en) Method and device for capturing target object and video monitoring device
WO2018204552A1 (en) Gps offset calibration for uavs
CN109426788B (en) Queuing length detection method and device and server
LU502288B1 (en) Method and system for detecting position relation between vehicle and lane line, and storage medium
CN112798811B (en) Speed measurement method, device and equipment
CN113112480B (en) Video scene change detection method, storage medium and electronic device
CN110135278B (en) Obstacle detection method and device and electronic equipment
CN103581562A (en) Panoramic shooting method and panoramic shooting device
CN111275765B (en) Method and device for determining target GPS and camera
EP4170601A1 (en) Traffic marker detection method and training method for traffic marker detection model
CN111429469A (en) Parking position determining method and device, electronic equipment and storage medium
CN109598746A (en) A kind of method and device tracking image template generation
CN113497897B (en) Vehicle-road cooperative roadside camera installation parameter adjusting method and device and electronic equipment
CN104899854A (en) Detection method and detection device of grain piling height line
CN110309330A (en) The treating method and apparatus of vision map
CN111212260A (en) Method and device for drawing lane line based on surveillance video
CN108055456B (en) Texture acquisition method and device
CN114782555A (en) Map mapping method, apparatus, and storage medium
CN113112551B (en) Camera parameter determining method and device, road side equipment and cloud control platform
CN110764526A (en) Unmanned aerial vehicle flight control method and device
CN110874814A (en) Image processing method, image processing device and terminal equipment
US10223592B2 (en) Method and associated apparatus for performing cooperative counting with aid of multiple cameras
CN107688427B (en) Image display method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant