CN113963332A - Detection method and device, electronic equipment and storage medium - Google Patents

Detection method and device, electronic equipment and storage medium

Info

Publication number
CN113963332A
CN113963332A (application CN202111272939.XA)
Authority
CN
China
Prior art keywords
electric vehicle
target object
target
information
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202111272939.XA
Other languages
Chinese (zh)
Inventor
邬征宇
熊梓云
李娟
王天课
侯敏
姚玉姣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN202111272939.XA
Publication of CN113963332A
Legal status: Withdrawn (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30264 Parking

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a detection method and apparatus, an electronic device and a storage medium. A target image of a target area is acquired, the target area being a forbidden area for electric vehicles; whether an electric vehicle and a target object are present in the target image is detected; and, where both an electric vehicle and a target object are detected in the target image, it is determined that a target event has occurred, the target event being an event in which the electric vehicle enters the forbidden area.

Description

Detection method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a detection method and apparatus, an electronic device, and a storage medium.
Background
In some cases, a user may take an electric vehicle into a forbidden area for convenience. Because an electric vehicle may leak current, spontaneously combust or otherwise fail, this can create safety hazards for people and property in the forbidden area. There is currently no effective means of monitoring situations in which a user takes an electric vehicle into a forbidden area.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present application provide a detection method and apparatus, an electronic device, and a storage medium.
The embodiment of the application provides a detection method, which comprises the following steps:
acquiring a target image of a target area; the target area is a forbidden area of the electric vehicle;
detecting whether an electric vehicle and a target object exist in the target image;
determining that a target event occurs under the condition that the electric vehicle and the target object are detected to exist in the target image; the target event refers to an event that the electric vehicle enters the forbidden area.
In an optional embodiment of the present application, at least one target object is detected in the target image, and the method further includes:
determining whether a target object having an association relationship with the electric vehicle exists among at least one target object included in the target image in a case where it is determined that the target event occurs;
and acquiring the attribute characteristics of the target object having the association relation with the electric vehicle when the target object having the association relation with the electric vehicle exists in the at least one target object.
In an optional embodiment of the present application, the determining whether there is a target object having an association relationship with the electric vehicle among at least one target object included in the target image includes:
determining a mutual positional relationship between each of the at least one target object and the electric vehicle;
and determining whether a target object having an association relation with the electric vehicle exists in the at least one target object according to the mutual position relation.
In an optional embodiment of the present application, the determining a mutual position relationship between each target object of the at least one target object and the electric vehicle includes:
obtaining pose information of the electric vehicle and pose information of each target object of the at least one target object based on the target image;
determining a mutual positional relationship of each of the at least one target object and the electric vehicle based on the pose information of the electric vehicle and the pose information of each of the at least one target object; the mutual position relation between the target object and the electric vehicle is used for representing whether the target object pushes or rides the electric vehicle.
In an optional embodiment of the present application, the determining a mutual position relationship between each of the at least one target object and the electric vehicle based on the pose information of the electric vehicle and the pose information of each of the at least one target object includes:
determining, for each target object of the at least one target object, position information and orientation information of the target object, and determining position information and orientation information of the electric vehicle;
determining whether the target object is pushing or riding the electric vehicle based on the position information and orientation information of the electric vehicle and the position information and orientation information of the target object.
In an optional embodiment of the present application, the attribute feature includes at least one of: a human body feature and a human face feature, and the method further includes:
retrieving and/or recording personal information of the target object having the association relationship with the electric vehicle according to at least one of the human body feature and the human face feature of that target object.
In an optional embodiment of the present application, the method further comprises:
and performing deployment control on the target object having the association relationship with the electric vehicle based on the personal information.
An embodiment of the present application further provides a detection device, the device includes:
a first acquisition unit configured to acquire a target image of a target area; the target area is a forbidden area of the electric vehicle;
a detection unit configured to detect whether an electric vehicle and a target object exist in the target image;
a first determination unit configured to determine that a target event occurs in a case where it is detected that an electric vehicle and a target object exist in the target image; the target event refers to an event that the electric vehicle enters a forbidden area.
In an optional embodiment of the present application, at least one target object is detected in the target image, and the apparatus further includes:
a second determination unit configured to determine whether there is a target object having an association relationship with the electric vehicle among at least one target object included in the target image, in a case where it is determined that the target event occurs;
a second obtaining unit, configured to obtain an attribute feature of a target object having an association relationship with the electric vehicle, if there is a target object having an association relationship with the electric vehicle, among the at least one target object.
In an optional embodiment of the present application, the second determining unit is specifically configured to: determining a mutual positional relationship between each of the at least one target object and the electric vehicle; and determining whether a target object having an association relation with the electric vehicle exists in the at least one target object according to the mutual position relation.
In an optional embodiment of the present application, the second determining unit is specifically configured to: obtaining pose information of the electric vehicle and pose information of each target object of the at least one target object based on the target image; determining a mutual positional relationship of each of the at least one target object and the electric vehicle based on the pose information of the electric vehicle and the pose information of each of the at least one target object; the mutual position relation between the target object and the electric vehicle is used for representing whether the target object pushes or rides the electric vehicle.
In an optional embodiment of the present application, the second determining unit is specifically configured to: determine, for each target object of the at least one target object, position information and orientation information of the target object, and determine position information and orientation information of the electric vehicle; and determine whether the target object is pushing or riding the electric vehicle based on the position information and orientation information of the electric vehicle and the position information and orientation information of the target object.
In an optional embodiment of the present application, the attribute feature includes at least one of: a human body feature and a human face feature, and the device further includes:
a recording unit, configured to retrieve and/or record personal information of the target object having the association relationship with the electric vehicle according to at least one of the human body feature and the human face feature of that target object.
In an optional embodiment of the present application, the apparatus further comprises:
and a control unit, configured to perform deployment control on the target object having the association relationship with the electric vehicle based on the personal information.
An embodiment of the present application further provides an electronic device, where the electronic device includes a memory and a processor, where the memory stores computer-executable instructions, and the processor executes the computer-executable instructions on the memory to implement the method steps in the foregoing embodiments.
The embodiment of the present application further provides a computer storage medium, where executable instructions are stored on the storage medium, and when executed by a processor, the executable instructions implement the method steps described in the above embodiment.
According to the technical solution of the embodiments of the application, a target image of a target area is acquired, the target area being a forbidden area for electric vehicles; whether an electric vehicle and a target object are present in the target image is detected; and, where both the electric vehicle and the target object are detected in the target image, it is determined that a target event has occurred, the target event being an event in which the electric vehicle enters the forbidden area. In this way, the forbidden area can be monitored in real time, and the electric vehicle is determined to have entered the forbidden area when an electric vehicle and a person are detected in the forbidden area at the same time. Situations in which a user takes an electric vehicle into the forbidden area are thereby monitored effectively, and the safety hazards posed by electric vehicles in the forbidden area are reduced.
Drawings
Fig. 1 is a schematic flow chart of a detection method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a target image provided by an embodiment of the present application;
fig. 3 is a schematic structural component diagram of a detection apparatus provided in the embodiment of the present application;
fig. 4 is a schematic structural component diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Various exemplary embodiments of the present application will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the application, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Embodiments of the application are applicable to electronic devices such as computer systems/servers and the like, which are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with electronic devices, such as computer systems/servers, include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set top boxes, programmable consumer electronics, network pcs, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above systems, and the like. The electronic device, such as a computer system/server, may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
The electric vehicle in the embodiments of the present application may be any of various types of electric vehicle, including but not limited to electric bicycles, electric motorcycles, scooters, balance cars and the like. The forbidden area in the embodiments of the present application may likewise be set flexibly according to the circumstances; for example, the forbidden area may be a residential building, a factory workshop, or another area in which parking of electric vehicles is expressly prohibited. One or more image acquisition devices may be arranged in each of a plurality of sub-areas of the forbidden area. For example, if the forbidden area is a residential building, image acquisition devices may be installed in the corridors, the lobby and the elevators of the residential building.
Taking the case in which the electric vehicle is an electric bicycle and the forbidden area is a residential building as an example: the electric bicycle is a practical and inexpensive vehicle with a high penetration rate in cities. However, the battery used by an electric bicycle carries a risk of spontaneous combustion during charging. To eliminate this fire hazard in communities, urban management requires electric bicycles to be charged outdoors at designated points rather than being brought into residential buildings. Residents who take electric bicycles into residential buildings need to be identified and educated by urban management staff. In some schemes, after a person pushes an electric bicycle into a residential building, only the electric bicycle itself is detected, which may lead to false alarms of the event that an electric bicycle has entered a forbidden area.
Based on the above scenario, the technical solution of the embodiments of the application is described taking the electric vehicle being an electric bicycle and the forbidden area being a residential building as an example. Applied to this scenario, the technical solution of the application uses computer vision technology to address an urban-management problem: it monitors electric bicycles entering forbidden areas at low cost and with high efficiency, helps to eliminate hidden fire-protection hazards in cities, safeguards citizens, and improves supervision efficiency for urban management departments.
Fig. 1 is a schematic flow chart of a detection method provided in an embodiment of the present application, and as shown in fig. 1, the method includes the following steps:
step 101: acquiring a target image of a target area; the target area is a forbidden area of the electric vehicle.
In the embodiments of the present application, the target area is a forbidden area in which electric vehicles are not allowed to park or enter. The forbidden area may be set flexibly according to the circumstances; for example, it may be a residential building, a factory workshop, or another area in which parking of electric vehicles is expressly prohibited. One or more image acquisition devices may be arranged in each of a plurality of sub-areas of the forbidden area. For example, if the forbidden area is a residential building, image acquisition devices may be installed in the corridors, the lobby and the elevators of the residential building; the one or more acquisition devices arranged in the forbidden area can capture images of the forbidden area in real time or at regular intervals, thereby monitoring electric vehicles and target objects entering the forbidden area.
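As a purely illustrative sketch of step 101, the loop below grabs frames from one image acquisition device at a fixed interval using OpenCV; the patent does not specify any implementation, so the stream URL, the sampling interval and the downstream handle_frame callback are assumptions made only for illustration.

```python
import time
import cv2  # OpenCV, assumed available


def monitor_area(stream_url: str, handle_frame, interval_s: float = 1.0) -> None:
    """Capture frames from one image acquisition device and hand each frame to the detector."""
    cap = cv2.VideoCapture(stream_url)      # e.g. an RTSP URL of a corridor camera (assumption)
    try:
        while True:
            ok, frame = cap.read()          # grab the current frame
            if ok:
                handle_frame(frame)         # e.g. run the detection of steps 102-103
            time.sleep(interval_s)          # sample in (near) real time or periodically
    finally:
        cap.release()
```

In practice, handle_frame would wrap the detection of steps 102 and 103 described below.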
Step 102: and detecting whether the electric vehicle and the target object exist in the target image.
Here, the target object is generally a person. Taking the forbidden area being a residential building as an example, one or more image acquisition devices installed in a corridor of the residential building capture one or more images of the corridor, from which one or more target images are obtained. Taking a single target image as an example, after the target image captured by the image acquisition device is obtained, the image data can be read and analysed to detect whether an electric vehicle and a person are present in the target image at the same time.
Step 103: and determining that a target event occurs in the case that the electric vehicle and the target object are detected to exist in the target image.
In the embodiment of the present application, the target event refers to an event that the electric vehicle enters the prohibited area.
Specifically, when both a person and an electric vehicle are detected in the target image, it can be determined that an event in which an electric vehicle enters the forbidden area has occurred. For example, when an image captured by the image acquisition device in the corridor of a residential building shows an electric vehicle and a person at the same time, it is determined that an event in which an electric vehicle enters the building has occurred.
Here, in a case where only an electric vehicle is present in the target image and no target object is present, position information and identification information of the electric vehicle are output directly. The position information allows the relevant management personnel to locate the electric vehicle promptly and move it to a non-forbidden area in time. As an optional implementation, by recording the identification information of the electric vehicle, the owner of the electric vehicle can be determined from the identification information where a correspondence between identification information of electric vehicles and their owners is stored in advance.
Here, when detecting the target image, it may first be detected whether an electric vehicle is present in the target image; if no electric vehicle is present, no further identification of the target image is required. If an electric vehicle is present in the target image, it is further detected whether a person is present in the target image.
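Purely as an illustration of the flow of steps 101 to 103, the following minimal Python sketch stages the detection as just described (vehicle first, then person); the Detection structure, the detector callables and the event payload are not taken from the patent and are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple


@dataclass
class Detection:
    box: Tuple[int, int, int, int]   # (x1, y1, x2, y2) bounding box in pixels
    score: float                     # detector confidence


Detector = Callable[[object], List[Detection]]   # image in, detections out


def check_target_event(image,
                       detect_vehicles: Detector,
                       detect_persons: Detector) -> Optional[dict]:
    """Steps 101-103: report the target event only if a vehicle AND a person are present."""
    vehicles = detect_vehicles(image)
    if not vehicles:
        return None                                  # nothing to report for this frame

    persons = detect_persons(image)
    if not persons:
        # Only a vehicle is present: output its position (and, if available,
        # identification information) so staff can move it out of the area.
        return {"event": "vehicle_only",
                "vehicle_boxes": [v.box for v in vehicles]}

    # Both an electric vehicle and at least one person are present:
    # the target event "electric vehicle entered the forbidden area" occurs.
    return {"event": "target_event",
            "vehicle_boxes": [v.box for v in vehicles],
            "person_boxes": [p.box for p in persons]}
```

The staged check mirrors the optimisation just described: frames without an electric vehicle are discarded without running the person detector.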
With the technical solution of the embodiments of the application, the forbidden area for electric vehicles can be monitored in real time, and the electric vehicle is determined to have entered the forbidden area when an electric vehicle and a person are detected in the forbidden area at the same time. Situations in which a user takes an electric vehicle into the forbidden area are thereby monitored effectively, and the safety hazards posed by electric vehicles in the forbidden area are reduced. The application uses computer vision technology to address an urban-management problem: it monitors electric vehicles entering forbidden areas at low cost and with high efficiency, helps to eliminate hidden fire-protection hazards in cities, safeguards citizens, and improves supervision efficiency for urban management departments.
In an optional embodiment of the present application, where at least one target object is detected in the target image, the following steps may further be performed when the target event is determined to have occurred:
determining whether a target object having an association relationship with the electric vehicle exists among the at least one target object included in the target image;
and acquiring an attribute feature of the target object having the association relationship with the electric vehicle in a case where such a target object exists among the at least one target object.
In the embodiment of the application, the target object having an association relationship with the electric vehicle may specifically refer to a vehicle owner of the electric vehicle.
In the embodiments of the application, after the target image is obtained, the image data can be read and analysed to detect whether an electric vehicle and a person are present in the target image; the target image may contain several electric vehicles and several persons at the same time. Taking the target image shown in Fig. 2 as an example, Fig. 2 contains one electric vehicle and four target objects (target object 1, target object 2, target object 3 and target object 4).
Among the four target objects in Fig. 2, one target object may have an association relationship with the electric vehicle, that is, one of the four target objects may be the owner of the electric vehicle; of course, it is also possible that none of the four target objects is the owner of the electric vehicle.
In the embodiments of the application, where an electric vehicle and target objects are present in the target image at the same time but no target object has an association relationship with the electric vehicle, the position information and identification information of the electric vehicle can be output directly. The position information allows the relevant management personnel to locate the electric vehicle promptly and move it to a non-forbidden area in time. As an optional implementation, by recording the identification information of the electric vehicle, the owner of the electric vehicle can be determined from the identification information where a correspondence between identification information of electric vehicles and their owners is stored in advance.
In the solution of the embodiments of the application, when the electric vehicle is determined to have entered the forbidden area, the target object having an association relationship with the electric vehicle is determined based on the target image, and an attribute feature of that target object is acquired. This makes it convenient to obtain the personal information of the target object from the attribute feature, so that the target object can be placed under control and tracked.
In an optional embodiment of the present application, the attribute feature of the target object having the association relationship with the electric vehicle includes at least one of: a human body feature and a human face feature, and the method further includes:
retrieving and/or recording personal information of the target object having the association relationship with the electric vehicle according to at least one of the human body feature and the human face feature of that target object.
Specifically, after the target object in the target image that has an association relationship with the electric vehicle is determined from the target image, for example after target object 4 in Fig. 2 is determined to be the owner of the electric vehicle, the human body and/or human face features of target object 4 are extracted and recognised, and the personal information of target object 4, such as identity card information, name, contact information and home address, can be retrieved or recorded according to the recognition result.
With the solution of the embodiments of the application, the personal information of the target object having the association relationship with the electric vehicle is retrieved from the detected attribute feature of that target object. Obtaining the personal information from the attribute feature makes it convenient to place the target object under control and to track the target object having the association relationship with the electric vehicle.
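Purely as an illustration of how personal information might be retrieved from a detected face or body feature, the sketch below matches an extracted feature vector against a hypothetical pre-registered database using cosine similarity; the database layout, the similarity threshold and the record fields are assumptions, not details given in the application.

```python
import numpy as np
from typing import Dict, Optional, Tuple

# Hypothetical pre-registered database: person_id -> (face/body feature vector, personal record).
FEATURE_DB: Dict[str, Tuple[np.ndarray, dict]] = {
    "resident_0421": (np.random.rand(512), {"name": "...", "contact": "...", "address": "..."}),
}


def retrieve_personal_info(query_feature: np.ndarray,
                           threshold: float = 0.6) -> Optional[dict]:
    """Return the stored record whose feature best matches the query, if the match is good enough."""
    best_id, best_sim, best_record = None, -1.0, None
    q = query_feature / (np.linalg.norm(query_feature) + 1e-12)
    for person_id, (feature, record) in FEATURE_DB.items():
        f = feature / (np.linalg.norm(feature) + 1e-12)
        sim = float(np.dot(q, f))                    # cosine similarity in [-1, 1]
        if sim > best_sim:
            best_id, best_sim, best_record = person_id, sim, record
    if best_record is not None and best_sim >= threshold:
        return {"person_id": best_id, "similarity": best_sim, **best_record}
    return None                                      # no sufficiently similar registration
```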
In an optional embodiment of the present application, after the personal information of the target object having the association relationship with the electric vehicle is retrieved, that target object may be placed under deployment control based on the personal information.
Specifically, after the personal information of target object 4 is determined, target object 4 can be controlled based on that personal information: the movement of target object 4 can be tracked, target object 4 can be notified in time to move the electric vehicle out of the forbidden area, and the relevant management personnel can conveniently educate target object 4.
In this solution of the embodiments of the application, tracking of the target object having the association relationship with the electric vehicle is achieved through control based on that target object; after the target object is tracked, it can be notified in time to move the electric vehicle out of the forbidden area, and the relevant management personnel can educate the target object in a timely manner.
In an optional embodiment of the present application, the step of determining whether a target object having an association relationship with the electric vehicle exists in at least one target object included in the target image may specifically be implemented by:
determining a mutual positional relationship between each of the at least one target object and the electric vehicle;
and determining whether a target object having an association relation with the electric vehicle exists in the at least one target object according to the mutual position relation.
Specifically, after the target image is acquired, the image data can be read and analysed to detect whether an electric vehicle and persons are present in the target image; the target image may contain several electric vehicles and several persons at the same time. Taking the target image shown in Fig. 2 as an example, Fig. 2 contains one electric vehicle and four target objects (i.e. four persons). After one electric vehicle and four persons are detected in the target image, the mutual positional relationship between each of the four persons and the electric vehicle is further determined. Here, the mutual positional relationship may specifically include: the person is riding the electric vehicle, the person is pushing the electric vehicle, the person is facing the electric vehicle, the person has their back to the electric vehicle, the person is close to the electric vehicle, the person is far from the electric vehicle, and the like.
In Fig. 2, whether any of the four persons has an association relationship with the electric vehicle can be determined from the mutual positional relationship between each person and the electric vehicle. For example, if one of the four persons is determined to be riding or pushing the electric vehicle, that person is determined to be the owner of the electric vehicle.
With the solution of the embodiments of the application, the target object having an association relationship with the electric vehicle can be determined based on the mutual positional relationship between each target object in the target image and the electric vehicle, and other objects having no association relationship with the electric vehicle can be excluded, so that the owner of the electric vehicle is determined accurately.
In an optional embodiment of the present application, the process of determining the mutual position relationship between each target object of the at least one target object and the electric vehicle may be implemented as follows:
obtaining pose information of the electric vehicle and pose information of each target object of the at least one target object based on the target image;
determining a mutual positional relationship of each of the at least one target object and the electric vehicle based on the pose information of the electric vehicle and the pose information of each of the at least one target object; the mutual position relation between the target object and the electric vehicle is used for representing whether the target object pushes or rides the electric vehicle.
In the embodiments of the application, the detection of the electric vehicle and the person in the target image can be implemented with neural network models: the electric vehicle and the person may be detected by the same neural network, or by different neural networks. If different neural networks are used, then the first deep neural network, which is used to detect the electric vehicle in the target image, can determine the pose information of the electric vehicle in addition to detecting whether an electric vehicle is present, and outputs no data if it does not detect an electric vehicle in the target image; the second deep neural network, which is used to detect the person in the target image, can detect the pose information of the person in addition to detecting whether a person is present.
This solution exploits the fact that an owner who is pushing or riding an electric vehicle has a characteristic position and posture relative to the vehicle: by determining the position, posture and other information between each target object in the target image and the electric vehicle, it is determined whether each target object is riding or pushing the electric vehicle, and thus the owner of the electric vehicle is determined.
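As a non-authoritative sketch of the two-network arrangement described above, the snippet below assumes two hypothetical model wrappers, one returning electric-vehicle poses and one returning person poses, each as a centre position plus an orientation angle; the application does not specify the networks' architectures or output formats, so these shapes are assumptions.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Pose:
    center: Tuple[float, float]   # (x, y) centre of the detection in image coordinates
    orientation_deg: float        # facing direction in degrees


def extract_poses(image, vehicle_net, person_net) -> Tuple[List[Pose], List[Pose]]:
    """Run the first network (vehicles) and, only if a vehicle is found, the second (persons).

    vehicle_net and person_net are hypothetical callables that each return a
    list of (center, orientation_deg) tuples for their respective class.
    """
    vehicle_poses = [Pose(c, o) for c, o in vehicle_net(image)]
    if not vehicle_poses:
        return [], []                     # first network outputs nothing: stop here
    person_poses = [Pose(c, o) for c, o in person_net(image)]
    return vehicle_poses, person_poses
```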
In an optional embodiment of the present application, the step of determining the mutual positional relationship between each of the at least one target object and the electric vehicle based on the pose information of the electric vehicle and the pose information of each of the at least one target object may specifically be implemented as follows:
determining, for each target object of the at least one target object, position information and orientation information of the target object, and determining position information and orientation information of the electric vehicle;
determining whether the target object is pushing or riding the electric vehicle based on the position information and orientation information of the electric vehicle and the position information and orientation information of the target object.
Specifically, the pose information of the electric vehicle and the person detected by using the first deep neural network and the second deep neural network may include: location information and orientation information.
After the position and/or orientation information of the electric vehicle and of the one or more persons in the target image are determined, the mutual positional relationship between each person and the electric vehicle can be determined. For example, as shown in Fig. 2, for the electric vehicle and target object 1 in the image, the distance between the two (including the horizontal distance and/or the vertical distance) may be determined from their position information, and their relative orientation (such as the person facing the electric vehicle or the person facing away from the electric vehicle) may be determined from their orientations. Whether the person is pushing or riding the electric vehicle is then determined from the distance and relative orientation between the person and the electric vehicle.
Here, after the mutual positional relationship between the one or more persons in the target image and the electric vehicle is determined, in particular from the orientation and position information of the electric vehicle and of each person, it can be determined whether the electric vehicle is being pushed or ridden. As shown in Fig. 2, if the distance between target object 4 and the electric vehicle is less than 20 cm and target object 4 is facing the electric vehicle, target object 4 may be determined to be a person pushing (or riding) the electric vehicle; target object 4 can therefore be determined to be a target object having an association relationship with the electric vehicle, that is, the owner of the electric vehicle.
With the solution of the embodiments of the application, by determining the pose information of the person and of the vehicle, and in particular the distance and relative orientation between the person and the electric vehicle, it can be determined whether the person is pushing or riding the electric vehicle, and thus whether the person has an association relationship with the electric vehicle, that is, whether the person is the owner of the electric vehicle.
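To make the distance-and-orientation rule concrete, here is a small sketch that classifies a person as pushing or riding the electric vehicle when the two are close enough and the person is roughly facing the vehicle; the 20 cm distance comes from the example above, while the angular threshold and the helper functions are illustrative assumptions rather than part of the application.

```python
import math
from typing import Tuple

Point = Tuple[float, float]


def distance(a: Point, b: Point) -> float:
    """Euclidean distance between two positions (assumed to be in metres)."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def is_facing(person_pos: Point, person_orientation_deg: float,
              vehicle_pos: Point, max_angle_deg: float = 60.0) -> bool:
    """True if the person's facing direction points roughly towards the vehicle."""
    to_vehicle = math.degrees(math.atan2(vehicle_pos[1] - person_pos[1],
                                         vehicle_pos[0] - person_pos[0]))
    diff = abs((person_orientation_deg - to_vehicle + 180.0) % 360.0 - 180.0)
    return diff <= max_angle_deg


def has_association(person_pos: Point, person_orientation_deg: float,
                    vehicle_pos: Point, max_distance_m: float = 0.2) -> bool:
    """Person is treated as pushing/riding (i.e. as the likely owner) if close and facing."""
    close_enough = distance(person_pos, vehicle_pos) < max_distance_m   # 20 cm example from the text
    return close_enough and is_facing(person_pos, person_orientation_deg, vehicle_pos)
```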
According to the technical solution described above, the forbidden area for electric vehicles can be monitored in real time, and the electric vehicle is determined to have entered the forbidden area when an electric vehicle and a person are detected in the forbidden area at the same time, so that situations in which a user takes an electric vehicle into the forbidden area are monitored effectively and the safety hazards posed by electric vehicles in the forbidden area are reduced. In addition, by detecting the attribute feature of the target object having the association relationship with the electric vehicle, the owner of the electric vehicle is determined, which makes it convenient to notify and educate the owner in a timely manner afterwards.
Fig. 3 is a schematic structural composition diagram of a detection apparatus provided in an embodiment of the present application, and as shown in fig. 3, the detection apparatus includes:
a first acquisition unit 301 configured to acquire a target image of a target area; the target area is a forbidden area of the electric vehicle;
a detection unit 302 for detecting whether there are an electric vehicle and a target object in the target image;
a first determination unit 303 configured to determine that a target event occurs in a case where it is detected that an electric vehicle and a target object are present in the target image; the target event refers to an event that the electric vehicle enters a forbidden area.
In an optional embodiment of the present application, at least one target object is detected in the target image, and the apparatus further includes:
a second determination unit 304 configured to determine whether there is a target object having an association relationship with the electric vehicle among at least one target object included in the target image, in a case where it is determined that the target event occurs;
a second obtaining unit 305, configured to obtain an attribute feature of a target object having an association relationship with the electric vehicle if there is a target object having an association relationship with the electric vehicle among the at least one target object.
In an optional embodiment of the application, the second determining unit 304 is specifically configured to: determining a mutual positional relationship between each of the at least one target object and the electric vehicle; and determining whether a target object having an association relation with the electric vehicle exists in the at least one target object according to the mutual position relation.
In an optional embodiment of the application, the second determining unit 304 is specifically configured to: obtaining pose information of the electric vehicle and pose information of each target object of the at least one target object based on the target image;
determining a mutual positional relationship of each of the at least one target object and the electric vehicle based on the pose information of the electric vehicle and the pose information of each of the at least one target object; the mutual position relation between the target object and the electric vehicle is used for representing whether the target object pushes or rides the electric vehicle.
In an optional embodiment of the application, the second determining unit 304 is specifically configured to: determine, for each target object of the at least one target object, position information and orientation information of the target object, and determine position information and orientation information of the electric vehicle; and
determine whether the target object is pushing or riding the electric vehicle based on the position information and orientation information of the electric vehicle and the position information and orientation information of the target object.
In an optional embodiment of the present application, the attribute feature includes at least one of: human body characteristics, human face characteristics, the device still includes:
a recording unit 306, configured to retrieve and/or record personal information of the target object having the association relationship with the electric vehicle according to at least one of the human body feature and the human face feature of that target object.
In an optional embodiment of the present application, the apparatus further comprises:
and a control unit 307, configured to perform deployment control on the target object having the association relationship with the electric vehicle based on the personal information.
Those skilled in the art will understand that the functions implemented by the units in the detection device shown in fig. 3 can be understood by referring to the related description of the detection method. The functions of the units in the detection apparatus shown in fig. 3 may be implemented by a program running on a processor, or may be implemented by specific logic circuits.
Fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure. As shown in Fig. 4, the electronic device 400 may include one or more processors 401 (only one is shown in the figure; the processor 401 may include, but is not limited to, a processing device such as a microcontroller unit (MCU) or a field-programmable gate array (FPGA)), a memory 402 for storing data, and a transmission device 403 for communication functions. It will be understood by those skilled in the art that the structure shown in Fig. 4 is only illustrative and does not limit the structure of the electronic device. For example, the electronic device 400 may also include more or fewer components than shown in Fig. 4, or have a different configuration from that shown in Fig. 4.
The memory 402 can be used for storing software programs and modules of application software, such as program instructions/modules corresponding to the methods in the embodiments of the present application, and the processor 401 executes various functional applications and data processing by running the software programs and modules stored in the memory 402, so as to implement the methods described above. The memory 402 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 402 may further include memory located remotely from the processor 401, which may be connected to the electronic device 400 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission means 403 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the electronic device 400. In one example, the transmission device 403 includes a Network adapter (NIC) that can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 403 may be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
The detection device in the embodiment of the present application, if implemented in the form of a software functional module and sold or used as an independent product, may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or portions thereof that contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for enabling an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
The technical solutions described in the embodiments of the present application can be arbitrarily combined without conflict.
In the several embodiments provided in the present application, it should be understood that the disclosed method and intelligent device may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may serve as a separate unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application.

Claims (10)

1. A method of detection, the method comprising:
acquiring a target image of a target area; the target area is a forbidden area of the electric vehicle;
detecting whether an electric vehicle and a target object exist in the target image;
determining that a target event occurs under the condition that the electric vehicle and the target object are detected to exist in the target image; the target event refers to an event that the electric vehicle enters the forbidden area.
2. The method according to claim 1, wherein at least one target object is detected in the target image, and the method further comprises:
determining whether a target object having an association relationship with the electric vehicle exists among at least one target object included in the target image in a case where it is determined that the target event occurs;
and acquiring the attribute characteristics of the target object having the association relation with the electric vehicle when the target object having the association relation with the electric vehicle exists in the at least one target object.
3. The method according to claim 2, wherein the determining whether there is a target object having an association relationship with the electric vehicle among at least one target object included in the target image comprises:
determining a mutual positional relationship between each of the at least one target object and the electric vehicle;
and determining whether a target object having an association relation with the electric vehicle exists in the at least one target object according to the mutual position relation.
4. The method of claim 3, wherein the determining a mutual positional relationship between each of the at least one target object and the electric vehicle comprises:
obtaining pose information of the electric vehicle and pose information of each target object of the at least one target object based on the target image;
determining a mutual positional relationship of each of the at least one target object and the electric vehicle based on the pose information of the electric vehicle and the pose information of each of the at least one target object; the mutual position relation between the target object and the electric vehicle is used for representing whether the target object pushes or rides the electric vehicle.
5. The method of claim 4, wherein determining a mutual positional relationship of each of the at least one target object and the electric vehicle based on the pose information of the electric vehicle and the pose information of each of the at least one target object comprises:
determining, for each target object of the at least one target object, position information and orientation information of the target object, and determining position information and orientation information of the electric vehicle;
determining whether the target object is pushing or riding the electric vehicle based on the position information and orientation information of the electric vehicle and the position information and orientation information of the target object.
6. The method of claim 2, wherein the attribute feature comprises at least one of: a human body feature and a human face feature, and the method further comprises:
retrieving and/or recording personal information of the target object having the association relationship with the electric vehicle according to at least one of the human body feature and the human face feature of that target object.
7. The method of claim 6, further comprising:
and performing deployment control on the target object having the association relationship with the electric vehicle based on the personal information.
8. A detection device, the device comprising:
a first acquisition unit configured to acquire a target image of a target area; the target area is a forbidden area of the electric vehicle;
a detection unit configured to detect whether an electric vehicle and a target object exist in the target image;
a first determination unit configured to determine that a target event occurs in a case where it is detected that an electric vehicle and a target object exist in the target image; the target event refers to an event that the electric vehicle enters a forbidden area.
9. An electronic device, comprising a memory having computer-executable instructions stored thereon and a processor, wherein the processor, when executing the computer-executable instructions on the memory, is configured to perform the method steps of any of claims 1-7.
10. A computer storage medium having stored thereon executable instructions which, when executed by a processor, carry out the method steps of any of claims 1 to 7.
CN202111272939.XA 2021-10-29 2021-10-29 Detection method and device, electronic equipment and storage medium Withdrawn CN113963332A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111272939.XA CN113963332A (en) 2021-10-29 2021-10-29 Detection method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111272939.XA CN113963332A (en) 2021-10-29 2021-10-29 Detection method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113963332A 2022-01-21

Family

ID=79468565

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111272939.XA Withdrawn CN113963332A (en) 2021-10-29 2021-10-29 Detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113963332A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114627612A (en) * 2022-03-10 2022-06-14 阿里云计算有限公司 Early warning processing method, device and system


Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 2022-01-21)