CN117496756B - Garage management method and system, computer readable storage medium and electronic device - Google Patents


Info

Publication number
CN117496756B
CN117496756B (application CN202311827910.2A)
Authority
CN
China
Prior art keywords
information
agent
vehicle
target object
garage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311827910.2A
Other languages
Chinese (zh)
Other versions
CN117496756A (en)
Inventor
张钦满
石雪丽
李立凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LeiShen Intelligent System Co Ltd
Original Assignee
LeiShen Intelligent System Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LeiShen Intelligent System Co Ltd filed Critical LeiShen Intelligent System Co Ltd
Priority to CN202311827910.2A priority Critical patent/CN117496756B/en
Publication of CN117496756A publication Critical patent/CN117496756A/en
Application granted granted Critical
Publication of CN117496756B publication Critical patent/CN117496756B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175 Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • G08G1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G08G1/14 Traffic control systems for road vehicles indicating individual free spaces in parking areas
    • G08G1/145 Traffic control systems for road vehicles indicating individual free spaces in parking areas where the indication depends on the parking areas
    • G08G1/149 Traffic control systems for road vehicles indicating individual free spaces in parking areas coupled to means for restricting the access to the parking space, e.g. authorization, access barriers, indicative lights

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application provides a garage management method and system, a computer readable storage medium and an electronic device. The method comprises: when a vehicle is at a gate of a garage, acquiring first related information of the vehicle through a sensor, wherein the first related information comprises license plate number information and the sensor comprises a plurality of laser radars; after the vehicle is parked in the garage, obtaining the position information of the vehicle through the sensor and including it in the first related information; after receiving authentication information, obtaining the position information of the agent based on the sensor; and, according to the license plate number information provided by the agent, including the position information of the agent in the first related information of the vehicle, thereby matching the vehicle with the agent in the garage. After the agent has been matched with the vehicle in the garage, the position information of the agent and the position information of the vehicle are determined from the point cloud data, carrying position information, that are collected by the laser radars, so that the agent can quickly locate the vehicle in the garage.

Description

Garage management method and system, computer readable storage medium and electronic device
Technical Field
The application relates to the technical field of intelligent parking management, and in particular to a garage management method and system, a computer readable storage medium and an electronic device.
Background
In daily life, it often happens that people park in a garage and then cannot accurately find their vehicle in it. For this reason, such garages can be divided into several areas, with each parking space numbered, so that the agent can locate the vehicle more easily. However, this requires the agent to remember the area and number of the parking space before leaving the garage; otherwise, the agent may not find the vehicle. In addition, some agents have a poor sense of direction, and even if they have recorded the area and number, they still cannot quickly locate the vehicle.
Disclosure of Invention
In view of this, the present application provides a garage management method and system, a computer readable storage medium and an electronic device, which can effectively solve the problem that an agent cannot quickly locate a vehicle.
In a first aspect, the present application provides a garage management method, including:
when a vehicle is at a gate of a garage, acquiring first related information of the vehicle through a sensor, wherein the first related information comprises license plate number information, and the sensor comprises a plurality of laser radars;
after the vehicle is parked in the garage, the position information of the vehicle is obtained through the sensor, and the position information of the vehicle is included in the first related information;
after receiving the authentication information, obtaining the position information of the agent based on the sensor;
and according to license plate number information provided by the agent, the position information of the agent is included in the first related information of the vehicle, so that the matching between the vehicle and the agent in the garage is realized.
In an alternative embodiment, the authentication information includes the following:
receiving an activation signal from the agent, and determining the scanning range of which sensor in the garage the agent is located in;
and receiving license plate number information which is sent by an agent and needs to be matched.
In an alternative embodiment, the authentication information includes the following:
receiving license plate number information which is sent by an agent and needs to be matched;
based on the license plate number information which is sent by the agent and needs to be matched, a corresponding specific action signal is sent to the agent;
and determining, based on the data of the sensors in the garage, the scanning range of which sensor in the garage the agent is located in.
In an optional embodiment, after the matching between the vehicle and the agent in the garage is achieved, the method further includes:
updating the position information of the agent in the first related information in real time based on the sensor;
and sending the first related information updated in real time to the agent.
In an optional embodiment, the updating, based on the sensor, of the location information of the agent in the first related information in real time includes real-time tracking of the target object based on the sensor; or,
the obtaining, through the sensor, of the position information of the vehicle after the vehicle is parked in the garage includes real-time tracking of the target object based on the sensor;
wherein the real-time tracking of the target object based on the sensor comprises:
the sensor comprises a plurality of cameras;
defining a laser radar of an acquisition area where a target object is located as a reference laser radar, and obtaining parameter information of the target object based on point cloud data of the reference laser radar, wherein the parameter information comprises contour information and motion state parameters;
defining a camera of which the target object is in an acquisition area as a reference camera, and obtaining color information of the target object based on image data of the reference camera;
if the target object leaves the acquisition area of the reference laser radar, defining the adjacent laser radars of the reference laser radar as first-class laser radars;
screening the first type of laser radars based on the motion state parameters of the target object, and defining a screening result as a third type of laser radars;
and obtaining a tracking object of the target object based on the contour information and the color information of the target object of the third-class laser radar.
In an optional embodiment, the obtaining the tracking object of the target object based on the contour information and the color information of the target object of the third type of laser radar includes:
if the contour information and the color information of the target object of the third type of laser radar meet the following two conditions at the same time, defining the target object of the third type of laser radar as a tracking object of the target object:
if the difference value between the contour information of the target object of the third type of laser radar and the contour information of the target object is within a preset value;
if the difference value between the color information of the target object of the third type of laser radar and the color information of the target object is within a preset value.
In an optional embodiment, the screening the first type of lidar based on the motion state parameter of the target object, and defining a screening result as a third type of lidar includes:
the motion state parameters comprise course angle information and speed information;
screening the first type of laser radar based on the course angle information, and defining a screening result as a second type of laser radar;
and screening the second-class laser radar by combining the speed information of the target object based on the distance between the reference laser radar and the second-class radar, and defining a screening result as a third-class laser radar.
In an optional implementation manner, the screening of the second class lidar based on the distance between the reference lidar and the second class lidar and the scanning range between the reference lidar and the second class lidar in combination with the speed information of the target object, and defining a screening result as a third class lidar includes:
based on the distance between the reference laser radar and the second class radar, combining the scanning ranges of the reference laser radar and the second class radar to obtain the distance between the blind areas of the reference laser radar and the second class radar, wherein the distance is defined as the blind area distance;
based on the speed information of the target object leaving the acquisition area of the reference laser radar, combining the blind area distance to obtain the time of the target object completing the blind areas of the reference laser radar and the second type of radar, wherein the time is defined as the blind area time;
obtaining the number of frames of the target object entering an acquisition area of the second type radar based on the blind area time and the frame rate of the second type radar, wherein the number is defined as the blind area frame number;
and screening the second type of laser radars based on the blind area frame number, and defining a screening result as a third type of laser radars.
In a second aspect, the present application provides a garage management method, including:
the agent sends license plate number information and authentication information based on a sensor, wherein the sensor comprises a plurality of laser radars;
the agent receives the first related information; the first related information comprises position information of an agent, license plate number information and position information of a vehicle;
the agent receives the agent's location information in the first related information, which is updated in real time.
In an alternative embodiment, the agent sends license plate number information and authentication information based on the sensor, including:
the agent sends out an activation signal, so as to indicate the scanning range of which sensor in the garage the agent is located in;
the agent sends out license plate number information to be matched.
In an alternative embodiment, the agent sends license plate number information and authentication information based on the sensor, including:
the agent sends license plate number information to be matched;
the agent receives a specific action signal corresponding to the license plate number;
within the scanning range of any sensor, the agent makes a specific action based on a specific action signal.
In a third aspect, the present application provides a garage management system, comprising:
the sensor module is used for acquiring license plate number information of the vehicle when the vehicle is at a gate of the garage; the sensor module is also used for acquiring the position information of the vehicle after the vehicle is parked in the garage; the sensor module is also used for acquiring the position information of the agent;
the signal processing module is used for generating first related information, wherein the first related information comprises license plate number information of a vehicle, position information of the vehicle and position information of an agent, and matching of the vehicle and the agent in a garage is realized;
and the authentication module is used for receiving authentication information of the agent and informing the sensor to acquire the position information of the agent.
In a fourth aspect, the present application provides a computer readable storage medium storing a computer program which when executed by a processor performs the steps of the method of the first or second aspect of the present application.
In a fifth aspect, the present application provides an electronic device comprising one or more processors and a memory associated with the one or more processors, the memory storing program instructions which, when read and executed by the one or more processors, perform the steps of the method of the first or second aspect of the present application.
The embodiment of the application has the following beneficial effects:
after the agent is matched with the vehicle in the garage, the position information of the agent and the position information of the vehicle can be determined through the point cloud data with the position information collected by the laser radar, so that the agent can rapidly position the vehicle in the garage.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting the scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 shows a schematic view of a prior art garage of the present application;
FIG. 2 is a schematic diagram of a garage management system according to embodiment 1 of the present application;
fig. 3 shows a flowchart of a garage management method according to embodiment 2 of the present application;
FIG. 4 shows a flowchart of S300 in embodiment 2 of the present application;
FIG. 5 shows another flowchart of S300 in embodiment 2 of the present application;
FIG. 6 shows a flow chart of sensor-based real-time tracking of a target object in embodiment 2 of the present application;
FIG. 7 shows a flow chart of T500 in example 2 of the present application;
FIG. 8 shows a flow chart of T400 in example 2 of the present application;
FIG. 9 shows a flow chart of T430 in example 2 of the present application;
fig. 10 shows a flowchart of a garage management method according to embodiment 3 of the present application;
FIG. 11 shows a flow chart of S'100 in example 3 of the present application;
FIG. 12 shows another flow chart of S'100 in example 3 of the present application;
description of main reference numerals:
100-sensor module, 200-signal processing module, 300-authentication module.
Description of the embodiments
The following description of the embodiments of the present application will be made clearly and completely with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments.
The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
As used in the various embodiments of the present application, the terms "comprises", "comprising", "having" and their cognates are intended only to refer to a particular feature, number, step, operation, element, component, or combination of the foregoing, and should not be interpreted as excluding the existence of, or the possibility of adding, one or more other features, numbers, steps, operations, elements, components, or combinations of the foregoing. Furthermore, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the various embodiments of this application belong. Terms such as those defined in commonly used dictionaries will be interpreted as having a meaning consistent with their contextual meaning in the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein in connection with the various embodiments.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The embodiments described below and features of the embodiments may be combined with each other without conflict.
For ease of understanding, the background art will now be explained in detail.
In daily life, it often happens that people park in a garage and then cannot accurately find their vehicle in it. Because the layout of a garage is simple and largely consists of the same basic design repeated many times (as shown in fig. 1), the agent (especially an agent with a poor sense of direction) can easily become lost and unable to determine where the vehicle is. For this reason, existing garages are generally divided into numbered areas, and the agent can locate the vehicle relatively easily by means of the area and the number. However, this requires the agent to remember the area and number of the parking space before leaving the garage; otherwise, the agent may not find the vehicle. In addition, even if the area and number of the vehicle have been recorded, the agent still lacks knowledge of his or her own position in the garage, and where the zoning and numbering of a garage are arranged unreasonably, the agent cannot accurately know the exact positions of the agent and the vehicle in the garage, or may lose his or her bearings, and therefore cannot quickly locate the vehicle.
Example 1
Exemplary, embodiment 1 provides a garage management system, as shown in fig. 2, including:
a sensor module 100, a signal processing module 200, an authentication module 300;
the sensor module 100 comprises several lidars, preferably a 905nm or 1550nm lidar. Lidars may be installed in locations 2 to 10 meters high (based on indoor or outdoor garages) depending on the field requirements. Above-mentioned laser radar sets up in the garage, and the gate in garage sets up at least one laser radar, and all parking stalls in the coverage garage are covered to the scanning scope of other laser radars. The sensor module 100 is used for acquiring license plate number information of a vehicle when the vehicle is at a gate of a garage; the sensor module 100 is further configured to obtain position information of the vehicle in the parking space after the vehicle finishes parking in the garage; the sensor module 100 is also used to obtain location information of an agent.
The signal processing module 200 is configured to generate first related information, where the first related information includes license plate number information of a vehicle, position information of the vehicle and position information of an agent, so as to implement matching between the vehicle and the agent in the garage;
the authentication module 300, the authentication module 300 is used for receiving authentication information of the agent and notifying the sensor to obtain the position information of the agent.
In embodiment 1, after the agent is matched with the vehicle in the garage, the position information of the agent and the position information of the vehicle are determined from the point cloud data with position information collected by the lidars. This avoids the situation in which the agent cannot accurately determine the position of the agent and of the vehicle in the garage, so that the agent can quickly locate the vehicle in the garage.
The garage management method is described below in connection with specific embodiments.
Example 2
Embodiment 2 is a specific implementation, on the system side, of embodiment 1. As shown in fig. 3, exemplary embodiment 2 provides a garage management method, which includes:
S100: when a vehicle is at a gate of a garage, acquiring first related information of the vehicle through a sensor, wherein the first related information comprises license plate number information, and the sensor comprises a plurality of laser radars;
specifically, a unified coordinate system is required to be set for each laser radar in the garage, point cloud data obtained by each laser radar are converted into the unified coordinate system, and regional point cloud data obtained by scanning each laser radar are fused to obtain complete point cloud data.
In the lidar arrangement of embodiment 2, at least one laser radar is provided at the gate of the garage for acquiring the license plate number information of the current vehicle. In addition, the sensor may also comprise a camera, and the license plate number information of the current vehicle may be acquired through the camera. The scanning ranges of the remaining laser radars need to cover all parking spaces of the garage, so as to acquire the position information of all parked vehicles in the garage.
S200: after the vehicle is parked in the garage, the position information of the vehicle is obtained through the sensor, and the position information of the vehicle is included in the first related information;
specifically, when the garage is at a gate, license plate number information of the vehicle is acquired, and the laser radar is required to continuously track the vehicle at the moment so as to ensure that position information of the vehicle after parking is obtained and avoid confusion with position information of other vehicles. At this time, there are two cases for tracking vehicles, one is that the scanning range of each laser radar in the garage already covers all traffic roads and parking spaces in the garage, and at this time, because the communication roads and the parking spaces do not have scanning blind areas of the laser radars, the vehicles are directly tracked in real time through each laser radar. Another situation is that the scanning range of each laser radar in the garage cannot fully cover the traffic road and the parking space in the garage, and at the moment, the traffic road and the parking space have the scanning blind areas of the laser radar. Therefore, further processing is required for the scanning blind area where the lidar exists.
S300: after receiving the authentication information, obtaining the position information of the agent based on the sensor;
in one embodiment, as shown in fig. 4, the authentication information in S300 includes the following:
S301: receiving an activation signal from the agent, and determining the scanning range of which sensor in the garage the agent is located in;
S302: and receiving the license plate number information, sent by the agent, that needs to be matched.
Specifically, the above embodiment mainly shows that, after the system receives the authentication information sent by the agent, it can determine the scanning area of which sensor in the garage the agent is currently located in and which object in that scanning area is the agent, thereby determining the location information of the current agent. The agent sends the license plate number information to be matched to the background through a terminal (such as a mobile phone). Because the position information of the current agent and the position information and license plate number information of all vehicles parked in the garage have been acquired, the agent to be matched can be associated with the vehicle to be matched once the license plate number information sent by the agent is received. The authentication information may be an operation by which the agent indicates which sensor's scanning range the agent is currently in, for example by pressing a button (each button corresponding to one laser radar) or scanning an identification code (each identification code corresponding to one laser radar); this application places no limit on the form. In addition, S301 and S302 may be performed in either order, which is not limited in this application.
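The association itself can be as simple as a lookup keyed by license plate. The following sketch is illustrative only; the FirstRelatedInfo structure and the match_agent helper are assumed names for this example, not terminology defined by this application.

```python
from dataclasses import dataclass
from typing import Optional, Tuple, Dict

@dataclass
class FirstRelatedInfo:
    plate: str                                       # license plate number information
    vehicle_pos: Tuple[float, float]                 # vehicle position in the unified garage frame
    agent_pos: Optional[Tuple[float, float]] = None  # filled in once the agent is matched

# Records built up at the gate (S100) and after parking (S200), keyed by plate.
records: Dict[str, FirstRelatedInfo] = {}

def match_agent(plate: str, agent_pos: Tuple[float, float]) -> Optional[FirstRelatedInfo]:
    """S400 sketch: attach the agent's position to the record of the vehicle
    whose license plate the agent supplied, and return the matched record."""
    record = records.get(plate)
    if record is not None:
        record.agent_pos = agent_pos
    return record
```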
In another embodiment, as shown in fig. 5, the authentication information in S300 includes the following:
S311: receiving the license plate number information which is sent by the agent and needs to be matched;
S312: based on the license plate number information which is sent by the agent and needs to be matched, sending a corresponding specific action signal to the agent;
S313: determining, based on the data of the sensors in the garage, the scanning range of which sensor in the garage the agent is located in.
Specifically, the above embodiment mainly shows that the system first receives, from a terminal (such as a mobile phone), the license plate number information that the agent needs to match, and then generates a corresponding unique, non-repeating specific action from that license plate number information. By judging, across all the laser radars, whether any object is performing this specific action, the system can determine the position information of the current agent. Once the position of the current agent has been determined, and since the position information and license plate number information of all vehicles parked in the garage have already been acquired, the agent to be matched can be associated with the vehicle to be matched. The specific action may be an action that does not normally occur in a garage, such as turning around in place several times or rotating an arm several times while facing the centre of the laser radar; this application places no limit on the form.
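One conceivable way to realise the unique specific action is to derive it deterministically from the license plate and then ask every lidar whether an object in its scanning range is performing it. The action catalogue, the observation format and the helper names below are assumptions made for this sketch only.

```python
import hashlib
from typing import Dict, Optional

ACTIONS = ["spin_in_place_3_turns", "rotate_arm_3_turns", "spin_in_place_5_turns"]

def action_for_plate(plate: str) -> str:
    """Deterministically assign one specific action signal to a license plate."""
    digest = int(hashlib.sha256(plate.encode("utf-8")).hexdigest(), 16)
    return ACTIONS[digest % len(ACTIONS)]

def locate_agent(observed_actions: Dict[str, Optional[str]], expected: str) -> Optional[str]:
    """Return the id of the lidar in whose scanning range an object is performing
    the expected specific action, or None if no lidar observes it."""
    for lidar_id, action in observed_actions.items():
        if action == expected:
            return lidar_id
    return None
```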
S400: and according to license plate number information provided by the agent, the position information of the agent is included in the first related information of the vehicle, so that the matching between the vehicle and the agent in the garage is realized.
In one embodiment, the method further comprises, after S400:
S500: updating the position information of the agent in the first related information in real time based on the sensor;
S600: and sending the first related information updated in real time to the agent.
Specifically, the agent locates the vehicle according to the first related information. Because the position of the agent changes continuously while searching, the agent must be tracked continuously, so that real-time position information of the agent is obtained and is not confused with the position information of other agents in the garage. As with vehicle tracking, two cases arise. In the first case, the scanning ranges of the laser radars already cover all traffic roads and parking spaces in the garage; since there are then no lidar scanning blind areas, the agent is tracked in real time directly by the laser radars. In the second case, the scanning ranges cannot fully cover the traffic roads and parking spaces, so scanning blind areas exist and further processing is required for them.
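Reusing the FirstRelatedInfo sketch above, S500 and S600 can be pictured as a simple refresh-and-push loop; track_agent() and send_to_terminal() stand for the lidar tracking pipeline and the channel to the agent's terminal, both of which are assumed interfaces rather than components defined by this application, and the period and arrival radius are illustrative values.

```python
import math
import time

def update_and_push(record, track_agent, send_to_terminal,
                    period_s: float = 0.5, arrive_radius_m: float = 2.0) -> None:
    """S500/S600 sketch: refresh the agent's position in the first related
    information and push the updated record until the agent reaches the vehicle."""
    while True:
        record.agent_pos = track_agent()      # latest agent position from the lidars (S500)
        send_to_terminal(record)              # push the updated first related information (S600)
        dx = record.agent_pos[0] - record.vehicle_pos[0]
        dy = record.agent_pos[1] - record.vehicle_pos[1]
        if math.hypot(dx, dy) <= arrive_radius_m:
            break                             # agent is within a couple of metres of the vehicle
        time.sleep(period_s)
```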
In an embodiment, the updating of the location information of the agent in the first related information in real time based on the sensor includes real-time tracking of the target object based on the sensor; or alternatively,
the obtaining of the position information of the vehicle through the sensor after the vehicle is parked in the garage includes real-time tracking of the target object based on the sensor;
as shown in fig. 6, the real-time tracking of the target object based on the sensor includes:
the sensor comprises a plurality of cameras;
T100: defining a laser radar of an acquisition area where a target object is located as a reference laser radar, and obtaining parameter information of the target object based on point cloud data of the reference laser radar, wherein the parameter information comprises contour information and motion state parameters;
T200: defining a camera of which the target object is in an acquisition area as a reference camera, and obtaining color information of the target object based on image data of the reference camera;
T300: if the target object leaves the acquisition area of the reference laser radar, defining the adjacent laser radars of the reference laser radar as first-class laser radars;
T400: screening the first type of laser radars based on the motion state parameters of the target object, and defining a screening result as a third type of laser radars;
T500: and obtaining a tracking object of the target object based on the contour information and the color information of the target object of the third-class laser radar.
Specifically, whether the lidar-based real-time tracking is of the vehicle or of the agent, it means that the individual laser radars must track the target object in real time. As described above, if the traffic roads and parking spaces have no lidar scanning blind areas, the target object (agent/vehicle) is tracked in real time directly by the laser radars. If the traffic roads and parking spaces do have scanning blind areas, then after the target object leaves the current laser radar, the laser radars adjacent to it are screened and estimated. The screening is mainly based on the motion state parameters of the target object and yields the third type of laser radar. The tracking object can then be found within the third type of laser radar by exploiting the fact that the contour information and the color information of the target object do not change when it crosses between different lidar scanning areas. In addition, T100 and T200 may be performed in parallel or sequentially, which is not limited in this application.
In one embodiment, as shown in fig. 7, T500 further includes the substeps of:
T510: if the contour information and the color information of the target object of the third type of laser radar meet the following two conditions at the same time, defining the target object of the third type of laser radar as a tracking object of the target object:
T520: if the difference value between the contour information of the target object of the third type of laser radar and the contour information of the target object is within a preset value;
T530: if the difference value between the color information of the target object of the third type of laser radar and the color information of the target object is within a preset value.
The contour information of the vehicle itself (in particular its shape parameters) does not change when the vehicle crosses between different lidar scanning areas. Therefore, the difference between the contour information of the target object seen by the third type of laser radar and the contour information of the vehicle can be obtained directly from the shape parameters in the contour information. For an agent, however, the shape parameters may vary because of the walking motion while crossing between different lidar scanning areas, whereas the length of the contour line itself generally does not change. Therefore, the agent is first judged on the shape parameters in the contour information; if these differ, the judgement falls back to the length information of the contour line, and the difference between the contour information of the object seen by the third type of laser radar and the contour information of the agent is finally obtained.
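The two-condition check in T510 to T530 can be summarised as follows; here the contour information is reduced to a single scalar (for example the contour-line length) and the color information to an RGB triple, and the preset values are placeholders, all of which are simplifying assumptions for this sketch.

```python
from typing import Tuple

def is_tracking_object(candidate_contour: float, candidate_color: Tuple[int, int, int],
                       target_contour: float, target_color: Tuple[int, int, int],
                       contour_preset: float = 0.2, color_preset: float = 30.0) -> bool:
    """T510-T530 sketch: accept a candidate seen by a third-type lidar as the tracking
    object only if both its contour difference and its color difference from the
    target object fall within the preset values."""
    contour_ok = abs(candidate_contour - target_contour) <= contour_preset
    color_ok = all(abs(c - t) <= color_preset for c, t in zip(candidate_color, target_color))
    return contour_ok and color_ok
```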
In one embodiment, as shown in fig. 8, T400 further includes the substeps of:
T410: the motion state parameters comprise course angle information and speed information;
T420: screening the first type of laser radar based on the course angle information, and defining a screening result as a second type of laser radar;
T430: and screening the second class of laser radars based on the distance between the reference laser radar and the second class radar, in combination with the speed information of the target object, and defining a screening result as a third class of laser radars.
Specifically, the first type of laser radars are coarsely screened by means of the course angle information of the target object (all laser radars that do not lie along the corresponding course angle are removed), so as to obtain the second type of laser radars; the speed information of the target object is then used in the subsequent fine screening.
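A minimal sketch of this coarse screening, assuming that each lidar's position is known in the unified garage frame and that a neighbour is kept when its bearing from the reference lidar lies within an angular tolerance of the target's course angle; the tolerance value is an assumption of the example, not a value given by this application.

```python
import math
from typing import Dict, List, Tuple

def coarse_screen_by_heading(reference_pos: Tuple[float, float],
                             first_type: Dict[str, Tuple[float, float]],
                             course_angle_rad: float,
                             tolerance_rad: float = math.pi / 4) -> List[str]:
    """T420 sketch: keep only the first-type lidars whose bearing from the reference
    lidar is within tolerance_rad of the target object's course angle."""
    second_type = []
    for lidar_id, (x, y) in first_type.items():
        bearing = math.atan2(y - reference_pos[1], x - reference_pos[0])
        diff = (bearing - course_angle_rad + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
        if abs(diff) <= tolerance_rad:
            second_type.append(lidar_id)
    return second_type
```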
In one embodiment, as shown in fig. 9, T430 further includes the following:
T431: based on the distance between the reference laser radar and the second class radar, combining the scanning ranges of the reference laser radar and the second class radar to obtain the distance between the blind areas of the reference laser radar and the second class radar, wherein the distance is defined as the blind area distance;
T432: based on the speed information of the target object leaving the acquisition area of the reference laser radar, combining the blind area distance to obtain the time taken by the target object to cross the blind areas of the reference laser radar and the second type radar, wherein the time is defined as the blind area time;
T433: obtaining the number of frames after which the target object enters an acquisition area of the second type radar based on the blind area time and the frame rate of the second type radar, wherein the number is defined as the blind area frame number;
T434: and screening the second type of laser radars based on the blind area frame number, and defining a screening result as a third type of laser radars.
Specifically, since the blind area distance between each second-type lidar and the reference lidar is known, it can be combined with the speed at which the target object left the reference lidar to estimate when the target object will have crossed the blind area (the blind area time) and will enter the sampling area of the second-type lidar. Because the frame rate of a laser radar is essentially fixed, the blind area time and the frame rate together indicate, for each second-type lidar, after approximately how many frames the target object should appear in that lidar's data after leaving the reference lidar. The second type of laser radars can therefore be finely screened, yielding the third type of laser radars.
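The arithmetic behind T431 to T433 is straightforward; the following sketch uses illustrative numbers (a 6 m blind area, an exit speed of 1.5 m/s and a 10 Hz second-type lidar) that are assumptions for the example and are not taken from this application.

```python
def blind_area_frames(blind_area_distance_m: float, exit_speed_mps: float,
                      frame_rate_hz: float) -> float:
    """T431-T433 sketch: estimate after how many frames of a second-type lidar the
    target object should reappear, given the blind area distance, the speed at which
    the target left the reference lidar, and the second-type lidar's frame rate."""
    blind_area_time_s = blind_area_distance_m / exit_speed_mps   # T432: blind area time
    return blind_area_time_s * frame_rate_hz                     # T433: blind area frame number

# Example: 6 m / 1.5 m/s = 4 s blind area time; 4 s * 10 Hz = about 40 frames, so a
# second-type lidar in which the object appears near frame 40 passes the fine screen (T434).
frames = blind_area_frames(6.0, 1.5, 10.0)
```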
In embodiment 2, the license plate number information of the vehicle and its position after parking are obtained by a sensor comprising laser radars; the current position information of the agent is determined by receiving the authentication information of the agent; and the agent is matched with the vehicle by receiving the license plate number information provided by the agent, so that the agent can accurately know the exact positions of the agent and the vehicle in the garage and can quickly locate the vehicle.
Example 3
Specifically, embodiment 3 is a specific implementation, on the agent side, of embodiment 1. Unless otherwise indicated, the definitions of terms in embodiment 3 follow the previous embodiments. The details of each step, the technical effects, the technical problems to be solved and other relevant information in embodiment 3 can likewise be found in the foregoing embodiments unless otherwise specified. The options or preferences described in the foregoing embodiments also apply to embodiment 3, provided that they do not violate the design objectives or the laws of physics.
As shown in fig. 10, exemplary embodiment 3 provides a garage management method, which includes:
S'100: the agent sends license plate number information and authentication information based on a sensor, wherein the sensor comprises a plurality of laser radars;
S'200: the agent receives the first related information; the first related information comprises the position information of the agent, the license plate number information and the position information of the vehicle;
S'300: the agent receives the agent's location information in the first related information, which is updated in real time.
In one embodiment, as shown in fig. 11, S'100 includes the steps of:
S'110: the agent sends out an activation signal, so as to indicate the scanning range of which sensor in the garage the agent is located in;
S'120: the agent sends out the license plate number information to be matched.
Specifically, the agent informs the system of the scanning range of which laser radar the agent is currently in, for example by pressing a button (each button corresponding to one laser radar) or scanning an identification code (each identification code corresponding to one laser radar), and the agent's position information is then obtained based on that laser radar. The agent sends the license plate number information to be matched to the background through a terminal (such as a mobile phone). Because the position information of the current agent and the position information and license plate number information of all vehicles parked in the garage have been acquired, the agent to be matched can be associated with the vehicle to be matched once the license plate number information sent by the agent is received. In addition, S'110 and S'120 may be performed in either order, which is not limited in this application.
In another embodiment, as shown in fig. 12, S'100 includes the steps of:
S'101: the agent sends the license plate number information to be matched;
S'102: the agent receives a specific action signal corresponding to the license plate number;
S'103: within the scanning range of any sensor, the agent makes the specific action based on the specific action signal.
Specifically, the above embodiment mainly shows that the agent first sends, through a terminal (such as a mobile phone), the license plate number information to be matched, and then receives the unique, non-repeating specific action corresponding to that license plate number. The agent only needs to select any laser radar (preferably the one closest to the agent) and perform the specific action; once the laser radar recognizes the action, the agent's current position information is determined, and so is the position information of the vehicle to be matched.
In embodiment 3, the authentication information sent by the agent determines the agent's current position; the agent is matched with the vehicle through the license plate number information the agent provides, so that the agent can accurately know the exact positions of the agent and the vehicle in the garage and can quickly locate the vehicle.
Example 4
Embodiment 4 also provides an electronic device including one or more processors; and a memory associated with the one or more processors, the memory configured to store program instructions that, when read and executed by the one or more processors, perform the functions of the above-described garage management method or the above-described respective modules in the garage management system.
The processor may be an integrated circuit chip with signal processing capabilities. The processor may be a general purpose processor, including at least one of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU) and a Network Processor (NP), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like that can implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application.
The Memory may be, but is not limited to, random access Memory (Random Access Memory, RAM), read Only Memory (ROM), programmable Read Only Memory (Programmable Read-Only Memory, PROM), erasable Read Only Memory (Erasable Programmable Read-Only Memory, EPROM), electrically erasable Read Only Memory (Electric Erasable Programmable Read-Only Memory, EEPROM), etc. The memory is used for storing a computer program, and the processor can correspondingly execute the computer program after receiving the execution instruction.
Example 5
Embodiment 5 also provides a readable storage medium storing the computer program used by the above-described electronic device.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners as well. The apparatus embodiments described above are merely illustrative, for example, of the flow diagrams and block diagrams in the figures, which illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules or units in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a smart phone, a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application.

Claims (10)

1. A method of garage management comprising:
when a vehicle is at a gate of a garage, acquiring first related information of the vehicle through a sensor, wherein the first related information comprises license plate number information, and the sensor comprises a plurality of laser radars;
after the vehicle is parked in the garage, the position information of the vehicle is obtained through the sensor, and the position information of the vehicle is included in the first related information;
after receiving the authentication information, obtaining the position information of the agent based on the sensor;
according to license plate number information provided by an agent, position information of the agent is included in first related information of the vehicle, and matching between the vehicle and the agent in the garage is achieved;
the authentication information comprises the following contents:
receiving license plate number information which is sent by an agent and needs to be matched;
based on the license plate number information which is sent by the agent and needs to be matched, a corresponding specific action signal is sent to the agent;
and judging, through all the laser radars, whether an object makes the specific action, so as to determine the scanning range of which sensor in the garage the agent is located in.
2. The garage management method according to claim 1, further comprising, after the matching of the vehicle and the agent in the garage is achieved:
updating the position information of the agent in the first related information in real time based on the sensor;
and sending the first related information updated in real time to the agent.
3. The garage management method according to claim 2, wherein the updating, based on the sensor, of the location information of the agent in the first related information in real time includes real-time tracking of the target object based on the sensor; or the obtaining, through the sensor, of the position information of the vehicle after the vehicle is parked in the garage includes real-time tracking of the target object based on the sensor;
wherein the real-time tracking of the target object based on the sensor comprises:
the sensor comprises a plurality of cameras;
defining a laser radar of an acquisition area where a target object is located as a reference laser radar, and obtaining parameter information of the target object based on point cloud data of the reference laser radar, wherein the parameter information comprises contour information and motion state parameters;
defining a camera of which the target object is in an acquisition area as a reference camera, and obtaining color information of the target object based on image data of the reference camera;
if the target object leaves the acquisition area of the reference laser radar, defining the adjacent laser radars of the reference laser radar as first-class laser radars;
screening the first type of laser radars based on the motion state parameters of the target object, and defining a screening result as a third type of laser radars;
and obtaining a tracking object of the target object based on the contour information and the color information of the target object of the third-class laser radar.
4. A garage management method according to claim 3, wherein the obtaining the tracking object of the target object based on the contour information and the color information of the target object of the third type of lidar includes:
if the contour information and the color information of the target object of the third type of laser radar meet the following two conditions at the same time, defining the target object of the third type of laser radar as a tracking object of the target object:
if the difference value between the contour information of the target object of the third type of laser radar and the contour information of the target object is within a preset value;
if the difference value between the color information of the target object of the third type of laser radar and the color information of the target object is within a preset value.
5. The garage management method according to claim 3 or 4, wherein the screening of the first type of lidar based on the motion state parameter of the target object, defining the screening result as a third type of lidar, includes:
the motion state parameters comprise course angle information and speed information;
screening the first type of laser radar based on the course angle information, and defining a screening result as a second type of laser radar;
and screening the second-class laser radar by combining the speed information of the target object based on the distance between the reference laser radar and the second-class radar, and defining a screening result as a third-class laser radar.
6. The garage management method according to claim 5, wherein the step of screening the second class lidar based on the distance between the reference lidar and the second class lidar and the scanning range between the reference lidar and the second class lidar in combination with the speed information of the target object, and defining the screening result as a third class lidar includes:
based on the distance between the reference laser radar and the second class radar, combining the scanning ranges of the reference laser radar and the second class radar to obtain the distance between the blind areas of the reference laser radar and the second class radar, wherein the distance is defined as the blind area distance;
based on the speed information of the target object leaving the acquisition area of the reference laser radar, combining the blind area distance to obtain the time of the target object completing the blind areas of the reference laser radar and the second type of radar, wherein the time is defined as the blind area time;
obtaining the number of frames of the target object entering an acquisition area of the second type radar based on the blind area time and the frame rate of the second type radar, wherein the number is defined as the blind area frame number;
and screening the second type of laser radars based on the blind area frame number, and defining a screening result as a third type of laser radars.
7. A method of garage management comprising:
the agent sends license plate number information and authentication information based on a sensor, wherein the sensor comprises a plurality of laser radars;
the agent receives first related information, wherein the first related information comprises the position information of the agent, the license plate number information and the position information of the vehicle, and the position information of the agent is obtained by judging, through all the laser radars, whether the agent makes a specific action;
the agent receives the agent's position information in the first related information, which is updated in real time;
the agent sends license plate number information and authentication information based on a sensor, and the method comprises the following steps:
the agent sends license plate number information to be matched;
the agent receives a specific action signal corresponding to the license plate number;
within the scanning range of any sensor, the agent makes a specific action based on a specific action signal.
8. A garage management system, comprising:
the sensor module comprises a plurality of laser radars, and is used for acquiring license plate number information of a vehicle when the vehicle is at a gate of a garage; the sensor module is also used for acquiring the position information of the vehicle after the vehicle is parked in the garage; the sensor module is also used for acquiring the position information of the agent;
the signal processing module is used for generating first related information, wherein the first related information comprises license plate number information of a vehicle, position information of the vehicle and position information of an agent, and matching of the vehicle and the agent in a garage is realized;
the authentication module is used for receiving authentication information of the agent and informing the sensor to acquire the position information of the agent;
the authentication information comprises the following contents:
receiving license plate number information which is sent by an agent and needs to be matched;
based on the license plate number information which is sent by the agent and needs to be matched, a corresponding specific action signal is sent to the agent;
and judging, through all the laser radars, whether an object makes the specific action, so as to determine the scanning range of which sensor in the garage the agent is located in.
9. A computer-readable storage medium comprising,
a computer program is stored which, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
10. An electronic device, comprising:
one or more processors;
and a memory associated with the one or more processors, the memory for storing program instructions that, when read for execution by the one or more processors, perform the steps of the method of any of claims 1 to 7.
CN202311827910.2A 2023-12-28 2023-12-28 Garage management method and system, computer readable storage medium and electronic device Active CN117496756B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311827910.2A CN117496756B (en) 2023-12-28 2023-12-28 Garage management method and system, computer readable storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311827910.2A CN117496756B (en) 2023-12-28 2023-12-28 Garage management method and system, computer readable storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN117496756A (en) 2024-02-02
CN117496756B (en) 2024-03-22

Family

ID=89678583

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311827910.2A Active CN117496756B (en) 2023-12-28 2023-12-28 Garage management method and system, computer readable storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN117496756B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170033075A (en) * 2015-09-16 2017-03-24 김용묵 parking guidance image camera, system for parking guidance using the same and method for parking guidance by parking guidance system
CN107819667A (en) * 2016-09-14 2018-03-20 张政 A kind of shifting car management system and shifting car management method based on wechat public platform

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9317983B2 (en) * 2012-03-14 2016-04-19 Autoconnect Holdings Llc Automatic communication of damage and health in detected vehicle incidents
US20190050634A1 (en) * 2012-08-06 2019-02-14 Cloudparc, Inc. Tolling with vehicle tracking


Also Published As

Publication number Publication date
CN117496756A (en) 2024-02-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant