CN111288998A - Map drawing method and device, storage medium and electronic device - Google Patents
- Publication number
- CN111288998A (application number CN201811493492.7A)
- Authority
- CN
- China
- Prior art keywords
- user
- map
- monitoring equipment
- acquiring
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3863—Structures of map data
- G01C21/387—Organisation of map data, e.g. version management or database structures
- G01C21/3878—Hierarchical structures, e.g. layering
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3848—Data obtained from both position sensors and additional sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Abstract
The invention provides a map drawing method and device, a storage medium and an electronic device. The method comprises: acquiring a motion track of a user through monitoring equipment, wherein the motion track indicates the correspondence between the user's position and time during movement; performing face recognition on the user through the monitoring equipment, and marking the user's identity information according to the recognition result; and drawing the user's motion track and mark into a map. The invention thereby solves the problem in the related art that only the current position of a mobile device, obtained through GPS positioning, can be displayed on a map, and improves the user experience.
Description
Technical Field
The invention relates to the field of communications, and in particular to a map drawing method and device, a storage medium and an electronic device.
Background
In the prior art, the Global Positioning System (GPS) is generally used to update the location information of a mobile device at a certain frequency, and that information is then displayed on the relevant data layer of a map. However, this display method depends heavily on GPS positioning accuracy: it is difficult to achieve high-accuracy monitoring in places with extremely high pedestrian density or in indoor places, and only the current location of the mobile device can be shown on the map.
In view of the above problems in the related art, no effective solution exists at present.
Disclosure of Invention
The embodiment of the invention provides a map drawing method and device, a storage medium and an electronic device, which are used for at least solving the problem that the current position information of a mobile device can only be displayed on a map through GPS positioning in the related art.
According to an aspect of the present invention, there is provided a map drawing method including: the method comprises the steps that a motion track of a user is obtained through monitoring equipment, wherein the motion track is used for indicating the corresponding relation between the position and the time of the user in the moving process; carrying out face recognition on the user through the monitoring equipment, and marking identity information of the user according to a recognition result; and drawing the motion trail of the user and the mark of the user into a map.
According to another aspect of the present invention, there is provided a map drawing apparatus comprising: a first acquisition module, configured to acquire a motion track of a user through monitoring equipment, wherein the motion track indicates the correspondence between the user's position and time during movement; a processing module, configured to perform face recognition on the user through the monitoring equipment and to mark the user's identity information according to the recognition result; and a first drawing module, configured to draw the user's motion track and mark into a map.
According to a further embodiment of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
By the above method and device, the user's motion track can be drawn on the map in real time, and the monitoring equipment can perform face recognition to identify the user on the map. This solves the problem in the related art that only the current position of a mobile device, obtained through GPS positioning, can be displayed on a map, and improves the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a block diagram of a hardware configuration of a terminal device of a map drawing method according to an embodiment of the present invention;
fig. 2 is a flowchart of a map drawing method according to an embodiment of the present invention;
figs. 3a-3c are schematic diagrams of a drawn map according to an embodiment of the invention;
fig. 4 is a schematic structural diagram of a map drawing apparatus according to an embodiment of the present invention;
fig. 5 is a first diagram illustrating an alternative configuration of a map drawing apparatus according to an embodiment of the present invention;
fig. 6 is a second diagram illustrating an alternative configuration of a map drawing apparatus according to an embodiment of the present invention.
Detailed Description
The invention will be described in detail hereinafter with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Example 1
The method provided in the first embodiment of the present application may be executed on a terminal device, a computer terminal, or a similar computing device. Taking execution on a terminal device as an example, fig. 1 is a block diagram of the hardware structure of a terminal device for the map drawing method according to an embodiment of the present invention. As shown in fig. 1, the terminal device 10 may include one or more processors 102 (only one is shown in fig. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data, and optionally a transmission device 106 for communication functions and an input/output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only illustrative and does not limit the structure of the terminal device. For example, the terminal device 10 may include more or fewer components than shown in fig. 1, or have a different configuration.
The memory 104 may be used to store a computer program, for example, a software program and a module of application software, such as a computer program corresponding to the map drawing method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, so as to implement the method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, memory 104 may further include memory located remotely from processor 102, which may be connected to terminal device 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the terminal device 10. In one example, the transmission device 106 includes a Network adapter (NIC), which can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
In this embodiment, a method for drawing a map running on the terminal device is provided, and fig. 2 is a flowchart of the method for drawing a map according to the embodiment of the present invention, as shown in fig. 2, the flowchart includes the following steps:
step S202, acquiring a motion track of a user through monitoring equipment, wherein the motion track is used for indicating the corresponding relation between the position and the time of the user in the moving process;
in a particular application scenario, for example, a user walks on a commercial street moving from location a to location B, where the corresponding time for location a is 9 am and the time to location B is 9:30 as described above. The correspondence is therefore that between 9 o' clock and 9:30, the position change (from a to B position) and the time change.
Step S204, carrying out face recognition on the user through the monitoring equipment, and marking the identity information of the user according to the recognition result;
and step S206, drawing the motion trail of the user and the mark of the user into a map.
Through steps S202 to S206, the user's motion track can be drawn on the map in real time, and face recognition can be performed on the user through the monitoring equipment so as to identify the user on the map. This solves the problem in the related art that only the current position of a mobile device, obtained through GPS positioning, can be displayed on a map, and improves the user experience.
In an optional implementation of this embodiment, the acquisition of the user's motion track through the monitoring equipment in step S202 may be implemented as follows:
step S202-1, acquiring corresponding pixel points in a picture of monitoring equipment in the moving process of a user, wherein the mapping relation between the pixel points and actual spatial positions is stored in the monitoring equipment;
step S202-2, determining an actual spatial position corresponding to the pixel point according to the mapping relation, and acquiring the time of the user at the actual spatial position;
and step S202-3, acquiring the motion trail of the user according to the actual space position and the time of the user at the actual space position.
Regarding steps S202-1 to S202-3, it should be noted that the mapping between pixel points in the monitoring picture and actual spatial positions is set in advance; that is, the actual spatial positions of people and objects in the picture can be obtained by analysing the pixel points of the picture captured by the monitoring equipment. For example, when a user appears in the picture of the monitoring equipment, the user's actual location is determined from the corresponding pixel point. After the user's current position is determined, it is paired with the time currently displayed by the monitoring equipment; as the user moves, a series of such position-time pairs is recorded, from which the user's motion track can be determined.
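The pixel-to-position lookup and trajectory accumulation described above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: representing the pre-stored pixel-to-space mapping as a planar homography, and the calibration values in `H`, are assumptions; a real deployment would derive the mapping from surveyed reference points in the camera's view.

```python
# Sketch of steps S202-1 to S202-3: map a pixel in the camera picture to a
# ground-plane position via a pre-stored mapping (here a planar homography),
# then accumulate (position, time) pairs into a motion track.

def pixel_to_world(H, u, v):
    """Apply homography H (3x3, row-major nested lists) to pixel (u, v)."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)

class TrajectoryRecorder:
    """Accumulates the position-time correspondence of the moving user."""
    def __init__(self, homography):
        self.H = homography
        self.track = []          # list of ((x, y), t) samples

    def observe(self, pixel, timestamp):
        pos = pixel_to_world(self.H, *pixel)
        self.track.append((pos, timestamp))
        return pos

# Hypothetical calibration: pixel (u, v) -> spatial (0.5*u + 2, 0.5*v + 1)
H = [[0.5, 0.0, 2.0],
     [0.0, 0.5, 1.0],
     [0.0, 0.0, 1.0]]

rec = TrajectoryRecorder(H)
rec.observe((150, 300), "T1")
rec.observe((450, 300), "T2")
print(rec.track)   # [((77.0, 151.0), 'T1'), ((227.0, 151.0), 'T2')]
```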
In an optional implementation of this embodiment, before the motion track of the user is acquired through the monitoring equipment in step S202, the method may further include: acquiring environmental information within a predetermined range of the user's position, wherein the environmental information includes road shapes and building locations; and drawing the environmental information into the map as a fixed information layer of the map.
On the premise that the environmental information has been obtained and drawn as the fixed information layer of the map, step S206 of drawing the user's motion track and mark into the map comprises: taking the user's motion track and mark as a dynamic information layer of the map, and adding the dynamic information layer to the map that already comprises the fixed information layer.
That is, the map drawn in this embodiment comprises two different display layers. Figs. 3a-3c are schematic diagrams of the drawn map according to an embodiment of the present invention: fig. 3a is the fixed information layer, which displays "fixed" basic information such as road shapes and building locations; fig. 3b is the dynamic information layer, which displays the user's dynamic motion track; and fig. 3c shows the real-time state of pedestrians in the map area after the two layers of figs. 3a and 3b are superimposed. A prior-art map displays only current positions, not the motion tracks of pedestrians over a period of time.
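The two-layer structure of figs. 3a-3c can be sketched as a simple compositor. The dict-based drawable items below are placeholders invented for illustration; the patent does not specify a data model for layer contents.

```python
# Sketch of the layered map: a fixed information layer (roads, buildings)
# plus a dynamic information layer (tracks, marks), composited so that
# dynamic marks are drawn on top of the fixed base (the fig. 3c overlay).

class LayeredMap:
    def __init__(self):
        self.fixed_layer = []     # drawn once: road shapes, building outlines
        self.dynamic_layer = []   # redrawn as user tracks update

    def draw_fixed(self, item):
        self.fixed_layer.append(item)

    def draw_dynamic(self, item):
        self.dynamic_layer.append(item)

    def render(self):
        # Fixed features first, dynamic marks on top.
        return self.fixed_layer + self.dynamic_layer

m = LayeredMap()
m.draw_fixed({"type": "road", "shape": [(0, 0), (10, 0)]})
m.draw_fixed({"type": "building", "at": (3, 2)})
m.draw_dynamic({"type": "track", "user": "UID-001", "points": [(1, 0), (2, 0)]})
print([item["type"] for item in m.render()])   # ['road', 'building', 'track']
```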
In the present application, the dynamic information layer records whether there is a pedestrian at a given position point; in other words, for the video frame captured at a given moment, where each pedestrian is located is recorded in a data table, thereby obtaining the correspondence between position and time during the user's movement, as shown in table 1:
Pedestrian position | Time
(x1, y1) | T1
(x2, y2) | T2
(x3, y3) | T3
…… | ……

Table 1
Then, on the map at a given time point, for example T1, the pedestrian is represented by a bright spot displayed on the dynamic information layer; as time advances, the bright spot for the next moment is displayed. Through these bright spots, superimposed on the basic information layer, the pedestrian tracks within a specific area can be determined visually and the direction of travel predicted.
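The bright-spot playback described above can be sketched as a lookup over the rows of table 1. The exact-match time query is an illustrative simplification; a real renderer would likely window or interpolate over timestamps.

```python
# Sketch of dynamic-layer playback: given the (position, time) rows of
# table 1, return the bright spots to light at a given playback time.

def spots_at(track, t):
    """Positions of pedestrians recorded at exactly time t."""
    return [pos for pos, when in track if when == t]

track = [((1.0, 1.0), "T1"), ((2.0, 1.0), "T2"), ((3.0, 1.5), "T3")]
print(spots_at(track, "T1"))   # [(1.0, 1.0)]
print(spots_at(track, "T2"))   # [(2.0, 1.0)]
```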
It should be noted that, to obtain accurate position information, the positions in table 1 may be derived by analysing pictures captured by the camera. The camera responsible for monitoring is configured in advance so that each position (pixel point) in its picture corresponds to an actual spatial position (x, y). After a person's position in the picture is recognised, it is mapped to the actual spatial position and stored.
In another optional implementation of this embodiment, the face recognition performed on the user by the monitoring equipment, and the marking of the user's identity information according to the recognition result, may be implemented as follows:
step S204-1, carrying out face recognition on a user through monitoring equipment, and comparing the recognized data with data in a database;
step S204-2, under the condition that the data matched with the recognized data is stored in the database, acquiring the identity information of the user from the database;
and step S204-3, marking correspondingly according to different identity information.
Regarding steps S204-1 to S204-3, in this embodiment the monitoring equipment may also perform face recognition on the user and compare the recognition result with a local database, which may be a public security system database or another database. The marking is performed according to the identity information obtained from that data. For example, persons may be marked by age group (children, young, middle-aged and elderly), with a different colour for each group; age groups may of course be subdivided further, the above being merely illustrative. Alternatively, persons may be marked by role, such as government official, entrepreneur, scientist or doctor. In short, there are many marking schemes, which can be configured according to actual requirements.
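The marking of step S204-3 can be sketched as a mapping from identity information to a marker colour. The age bands, role names and colours below are illustrative assumptions of this sketch, not choices made by the patent.

```python
# Sketch of step S204-3: choose a map marker colour from identity information
# obtained after face recognition. Bands and colours are hypothetical.

AGE_COLOURS = [(0, 12, "green"), (13, 35, "blue"),
               (36, 59, "orange"), (60, 200, "purple")]

ROLE_COLOURS = {"doctor": "white", "official": "grey"}

def marker_for(identity):
    """Pick a colour by age band; fall back to role, then to a default."""
    age = identity.get("age")
    if age is not None:
        for lo, hi, colour in AGE_COLOURS:
            if lo <= age <= hi:
                return colour
    return ROLE_COLOURS.get(identity.get("role"), "yellow")

print(marker_for({"age": 8}))           # green
print(marker_for({"role": "doctor"}))   # white
print(marker_for({}))                   # yellow
```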
For steps S204-1 to S204-3, in a specific application scenario of this embodiment, the procedure may be as follows: the positions of persons in the video are recognised, and face recognition is performed to obtain the identity information of the persons in the picture. After face recognition, the information shown in table 2 is obtained:
UID | Position | Time
XXXXXX | (x, y) | T1
…… | …… | ……

Table 2
The UID is a unique serial number assigned to a person after face recognition. To determine the user's identity, the recognition result may be compared with identity-document information in the public security system, so as to obtain the person's detailed information. Once the face recognition data layer is added, the user's identity information can be displayed when the map is shown; according to that identity information, it can be decided whether to anchor a given bright spot (i.e. a person) on the map and to track the anchored person's UID.
Further, after public security data is introduced and face recognition is performed, the real identity of persons in the video can be obtained. By comparison with the public security system database, it can be determined whether a person is under key monitoring, for example a fugitive or someone with a criminal record such as theft. When such a person appears, the alarm data layer is activated, and that person's bright spot on the map is distinguished from ordinary bright spots: for example, ordinary persons are shown as yellow spots, while key-monitoring persons are marked red. When monitoring personnel select a key-monitoring person on the dynamic map, the relevant data is retrieved and the person's identity information is displayed directly on the map.
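The alarm-layer behaviour described above can be sketched as a watchlist check. The yellow/red colour coding follows the example in the text; the UID values and watchlist contents are hypothetical.

```python
# Sketch of the alarm data layer: ordinary persons get yellow bright spots,
# watch-listed (key-monitoring) persons get red ones plus an alarm flag.

WATCHLIST = {"UID-7731"}          # hypothetical key-monitoring UIDs

def classify_spot(uid):
    """Return (colour, alarm_active) for a recognised person's bright spot."""
    if uid in WATCHLIST:
        return ("red", True)
    return ("yellow", False)

print(classify_spot("UID-0001"))  # ('yellow', False)
print(classify_spot("UID-7731"))  # ('red', True)
```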
It should be noted that, the method of this embodiment may further include:
in step S208, if the database does not store data matching the identified data, the identified data is saved.
It can be seen that if the recognition result cannot be matched in the database, it is saved; that is, the above method of this embodiment can also enrich the database.
In another optional implementation manner of this embodiment, the method of this embodiment may further include:
step S210, selecting a designated user from the map, and tracking the movement trajectory of the designated user.
It can be seen that a specific user can be tracked. The selection may be made according to the face recognition result, for example an important or key person, or some other designated user, as actual requirements dictate. In a specific application scenario, a monitoring device recognises a wanted fugitive, marks the fugitive on the map, and displays the mark's motion track on the map, which assists in the arrest. This approach may of course be applied to other scenarios.
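The tracking of a designated user in step S210 can be sketched as filtering the recognised observations of table 2 by UID. The observation tuples and UID labels below are illustrative.

```python
# Sketch of step S210: extract one designated user's motion track from the
# (UID, position, time) observations and return it in time order.

def trail_of(observations, uid):
    """Time-ordered positions for one UID."""
    rows = [(t, pos) for u, pos, t in observations if u == uid]
    return [pos for t, pos in sorted(rows)]

obs = [("UID-1", (1, 1), "T1"), ("UID-2", (5, 5), "T1"),
       ("UID-1", (2, 1), "T2"), ("UID-1", (3, 2), "T3")]
print(trail_of(obs, "UID-1"))   # [(1, 1), (2, 1), (3, 2)]
```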
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
In this embodiment a map drawing apparatus is also provided. The apparatus is used to implement the above embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" may be a combination of software and/or hardware implementing a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 4 is a schematic structural diagram of a map drawing apparatus according to an embodiment of the present invention. As shown in fig. 4, the apparatus includes: a first obtaining module 42, configured to obtain the user's motion track through the monitoring equipment, where the motion track indicates the correspondence between the user's position and time during movement; a processing module 44, coupled to the first obtaining module 42 and configured to perform face recognition on the user through the monitoring equipment and mark the user's identity information according to the recognition result; and a first drawing module 46, coupled to the processing module 44 and configured to draw the user's motion track and mark into the map.
Optionally, the first obtaining module 42 involved in this embodiment includes: the first acquisition unit is used for acquiring corresponding pixel points in a picture of the monitoring equipment in the moving process of a user, wherein the mapping relation between the pixel points and the actual spatial position is stored in the monitoring equipment; the second acquisition unit is coupled with the first acquisition unit and used for determining the actual spatial position corresponding to the pixel point according to the mapping relation and acquiring the time of the user at the actual spatial position; and the third acquisition unit is coupled with the second acquisition unit and used for acquiring the motion trail of the user according to the actual spatial position and the time of the user at the actual spatial position.
Optionally, the processing module 44 involved in this embodiment includes: the processing unit is used for carrying out face recognition on the user through the monitoring equipment and comparing the recognized data with the data in the database; the fourth acquisition unit is coupled with the processing unit and used for acquiring the identity information of the user from the database under the condition that the database stores data matched with the identified data; the marking unit is coupled with the fourth acquisition unit and used for correspondingly marking according to different identity information; and the storage unit is coupled with the fourth acquisition unit and used for storing the identified data under the condition that the data matched with the identified data is not stored in the database.
Fig. 5 is a schematic diagram of an alternative structure of a map drawing device according to an embodiment of the present invention, as shown in fig. 5, on the basis of fig. 4, the device further includes: and the tracking module 52 is coupled with the drawing module 46 and is used for selecting the specified user from the map and tracking the motion trail of the specified user.
Fig. 6 is a schematic diagram of an alternative structure of a map drawing apparatus according to an embodiment of the present invention, as shown in fig. 6, and on the basis of fig. 4, the apparatus further includes: the second obtaining module 62 is configured to obtain environmental information within a predetermined range of a location where a user is located before obtaining a motion trajectory of the user through the monitoring device; wherein the environment information includes: road shape, building location; a second drawing module 64, coupled to the second obtaining module 62, for drawing the environmental information into the map as a fixed information layer of the map;
based on the second obtaining module 62 and the second drawing module 64, the first drawing module 42 in this embodiment is further configured to use the motion trail of the user and the mark of the user as a dynamic information layer of the map, and add the dynamic information layer to the map already including the fixed information layer.
It should be noted that, the above modules may be implemented by software or hardware, and for the latter, the following may be implemented, but not limited to: the modules are all positioned in the same processor; alternatively, the modules are respectively located in different processors in any combination.
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
step S1, acquiring a motion trail of the user through the monitoring equipment, wherein the motion trail is used for indicating the corresponding relation between the position and the time of the user in the moving process;
step S2, recognizing the face of the user through the monitoring equipment, and marking the identity information of the user according to the recognition result;
in step S3, the motion trail of the user and the mark of the user are drawn into a map.
Optionally, the storage medium is further arranged to store a computer program for performing the steps of:
s1, acquiring corresponding pixel points in the picture of the monitoring equipment in the moving process of the user, wherein the mapping relation between the pixel points and the actual spatial positions is stored in the monitoring equipment;
s2, determining the actual spatial position corresponding to the pixel point according to the mapping relation, and acquiring the time of the user at the actual spatial position;
and S3, acquiring the motion trail of the user according to the actual space position and the time when the user is at the actual space position.
Optionally, the storage medium is further arranged to store a computer program for performing the steps of:
s1, performing face recognition on the user through the monitoring equipment, and comparing the recognized data with the data in the database;
s2, acquiring the identity information of the user from the database under the condition that the database stores the data matched with the identified data;
and S3, marking according to different identity information.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
step S1, acquiring a motion trail of the user through the monitoring equipment, wherein the motion trail is used for indicating the corresponding relation between the position and the time of the user in the moving process;
step S2, recognizing the face of the user through the monitoring equipment, and marking the identity information of the user according to the recognition result;
in step S3, the motion trail of the user and the mark of the user are drawn into a map.
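The three processor steps can be sketched end to end as a small pipeline that attaches the identity mark to the acquired trail and draws both into the map, using the fixed-layer/dynamic-layer split of claim 6. The `Map` structure and all names here are hypothetical, chosen only to illustrate the flow.

```python
# Sketch: draw the user's motion trail and identity mark into a map that keeps
# a fixed information layer (roads, buildings) and a dynamic information layer
# (trails and marks). Structure and names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Map:
    fixed_layer: list = field(default_factory=list)    # roads, building locations
    dynamic_layer: list = field(default_factory=list)  # trails + identity marks

def draw_user(map_, trajectory, identity):
    """Step S3: add the user's motion trail and identity mark to the map."""
    map_.dynamic_layer.append({"identity": identity, "trail": trajectory})

m = Map(fixed_layer=["road:main_street", "building:mall"])
draw_user(m, [((0.0, 0.0), 0.0), ((1.0, 0.5), 1.0)], "user_001")
```

Keeping the environment in a separate fixed layer means only the dynamic layer needs redrawing as users move, which matches the layering described in claims 6 and 12.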
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments and optional implementations; details are not repeated here.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general-purpose computing device, and may be centralized on a single computing device or distributed across a network of computing devices. Alternatively, they may be implemented as program code executable by a computing device, stored in a storage device, and executed by that device; in some cases, the steps shown or described may be performed in an order different from that described herein. They may also be fabricated as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description covers only preferred embodiments of the present invention and is not intended to limit it; those skilled in the art may make various modifications and changes. Any modification, equivalent replacement, or improvement made within the principle of the present invention shall fall within its protection scope.
Claims (14)
1. A method of mapping, comprising:
the method comprises the steps that a motion track of a user is obtained through monitoring equipment, wherein the motion track is used for indicating the corresponding relation between the position and the time of the user in the moving process;
carrying out face recognition on the user through the monitoring equipment, and marking identity information of the user according to a recognition result;
and drawing the motion trail of the user and the mark of the user into a map.
2. The method of claim 1, wherein the obtaining of the motion profile of the user through the monitoring device comprises:
acquiring corresponding pixel points in the picture of the monitoring equipment in the moving process of the user, wherein the mapping relation between the pixel points and the actual spatial position is stored in the monitoring equipment;
determining an actual spatial position corresponding to the pixel point according to the mapping relation, and acquiring the time of the user at the actual spatial position;
and acquiring the motion trail of the user according to the actual space position and the time of the user at the actual space position.
3. The method according to claim 1 or 2, wherein the face recognition of the user by the monitoring device and the marking of the identity information of the user according to the recognition result comprise:
carrying out face recognition on the user through the monitoring equipment, and comparing the recognized data with data in a database;
under the condition that the database stores data matched with the identified data, acquiring the identity information of the user from the database;
and correspondingly marking according to different identity information.
4. The method of claim 3,
and under the condition that the data matched with the identified data is not stored in the database, storing the identified data in the database.
5. The method according to claim 1 or 2, characterized in that the method further comprises:
and selecting a specified user from the map, and tracking the motion trail of the specified user.
6. The method of claim 1,
before the motion trail of the user is obtained through the monitoring device, the method further comprises the following steps: acquiring environmental information within a preset range of the position where the user is located; wherein the environment information includes: road shape, building location; drawing the environment information into the map as a fixed information layer of the map;
mapping the motion trajectory of the user and the indicia of the user to a map comprises: and taking the motion trail of the user and the mark of the user as a dynamic information layer of the map, and adding the dynamic information layer to the map which already comprises the fixed information layer.
7. An apparatus for drawing a map, comprising:
the monitoring device comprises a first acquisition module, a second acquisition module and a control module, wherein the first acquisition module is used for acquiring a motion track of a user through the monitoring device, and the motion track is used for indicating the corresponding relation between the position and the time of the user in the moving process;
the processing module is used for carrying out face recognition on the user through the monitoring equipment and marking the identity information of the user according to a recognition result;
and the first drawing module is used for drawing the motion trail of the user and the mark of the user into a map.
8. The apparatus of claim 7, wherein the first obtaining module comprises:
the first obtaining unit is used for obtaining corresponding pixel points in the picture of the monitoring equipment in the moving process of the user, wherein the mapping relation between the pixel points and the actual spatial position is stored in the monitoring equipment;
the second acquisition unit is used for determining the actual spatial position corresponding to the pixel point according to the mapping relation and acquiring the time of the user at the actual spatial position;
and the third acquisition unit is used for acquiring the motion trail of the user according to the actual space position and the time of the user at the actual space position.
9. The apparatus of claim 7 or 8, wherein the processing module comprises:
the processing unit is used for carrying out face recognition on the user through the monitoring equipment and comparing the recognized data with data in a database;
a fourth obtaining unit, configured to obtain, from the database, the identity information of the user in a case where data matching the identified data is stored in the database;
and the marking unit is used for correspondingly marking according to different identity information.
10. The apparatus of claim 9, wherein the processing module further comprises:
a saving unit configured to save the identified data when the data matching the identified data is not stored in the database.
11. The apparatus of claim 7 or 8, further comprising:
and the tracking module is used for selecting a specified user from the map and tracking the motion trail of the specified user.
12. The apparatus of claim 7,
the device further comprises: the second acquisition module is used for acquiring the environmental information in the preset range of the position where the user is located before the movement track of the user is acquired through the monitoring equipment; wherein the environment information includes: road shape, building location; the second drawing module is used for drawing the environment information into the map as a fixed information layer of the map;
the first drawing module is further configured to use the motion trail of the user and the mark of the user as a dynamic information layer of the map, and add the dynamic information layer to the map already including the fixed information layer.
13. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 6 when executed.
14. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 6.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811493492.7A CN111288998A (en) | 2018-12-06 | 2018-12-06 | Map drawing method and device, storage medium and electronic device |
PCT/CN2019/112656 WO2020114128A1 (en) | 2018-12-06 | 2019-10-23 | Map drawing method and apparatus, storage medium and electronic apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811493492.7A CN111288998A (en) | 2018-12-06 | 2018-12-06 | Map drawing method and device, storage medium and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111288998A true CN111288998A (en) | 2020-06-16 |
Family
ID=70974908
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811493492.7A Pending CN111288998A (en) | 2018-12-06 | 2018-12-06 | Map drawing method and device, storage medium and electronic device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111288998A (en) |
WO (1) | WO2020114128A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112383751A (en) * | 2020-11-05 | 2021-02-19 | 佳都新太科技股份有限公司 | Monitoring video data processing method and device, terminal equipment and storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101604448A (en) * | 2009-03-16 | 2009-12-16 | 北京中星微电子有限公司 | A kind of speed-measuring method of moving target and system |
CN105933650A (en) * | 2016-04-25 | 2016-09-07 | 北京旷视科技有限公司 | Video monitoring system and method |
CN106650652A (en) * | 2016-12-14 | 2017-05-10 | 黄先开 | Trajectory tracking system and method based on face recognition technology |
CN106998444A (en) * | 2017-02-14 | 2017-08-01 | 北京中科天云科技有限公司 | A kind of big data face monitoring system and device |
CN107314769A (en) * | 2017-06-19 | 2017-11-03 | 成都领创先科技有限公司 | The strong indoor occupant locating system of security |
CN207231497U (en) * | 2017-06-19 | 2018-04-13 | 成都领创先科技有限公司 | A kind of security positioning system based on recognition of face |
CN108197565A (en) * | 2017-12-29 | 2018-06-22 | 深圳英飞拓科技股份有限公司 | Target based on recognition of face seeks track method and system |
CN108805140A (en) * | 2018-05-23 | 2018-11-13 | 国政通科技股份有限公司 | A kind of feature rapid extracting method and face identification system based on LBP |
- 2018-12-06: CN application CN201811493492.7A filed; published as CN111288998A, status Pending
- 2019-10-23: PCT application PCT/CN2019/112656 filed; published as WO2020114128A1, Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2020114128A1 (en) | 2020-06-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10812761B2 (en) | Complex hardware-based system for video surveillance tracking | |
CN107527075B (en) | Method and device for establishing corresponding relation between RFID (radio frequency identification) tag and personnel and tracking track | |
CN108446015B (en) | Exhibition method and exhibition system based on mixed reality | |
DE202015009183U1 (en) | Analyzing semantic locations and related data from a variety of location data reports | |
CN108234927A (en) | Video frequency tracking method and system | |
CN104936283A (en) | Indoor positioning method, server and system | |
CN109961458B (en) | Target object tracking method and device and computer readable storage medium | |
EP2988473B1 | Augmented reality content screening method, apparatus, and system |
CN105095991A (en) | Method and device for crowd risk early warning | |
CN108830180A (en) | Electronic check-in method, device and electronic equipment | |
CN103984716A (en) | Method and system for precisely positioning space address based on geographical information | |
CN114862946B (en) | Location prediction method, system, device, and medium | |
CN107358314A (en) | The processing method and system of a kind of thermodynamic chart | |
CN112016363A (en) | Personnel monitoring method and device, computer device and readable storage medium | |
DE102020209054A1 (en) | DEVICE AND METHOD FOR PERSONAL RECOGNITION, TRACKING AND IDENTIFICATION USING WIRELESS SIGNALS AND IMAGES | |
CN111209446A (en) | Method and device for presenting personnel retrieval information and electronic equipment | |
CN110990514A (en) | Behavior track display method, display device and readable storage medium | |
CN111586367A (en) | Method, system and terminal equipment for positioning and tracking personnel in space area in real time | |
CN111288998A (en) | Map drawing method and device, storage medium and electronic device | |
CN109190466A (en) | A kind of method and apparatus that personnel position in real time | |
CN108616919A (en) | A kind of public domain stream of people monitoring method and device | |
CN103268643A (en) | Attendance data acquisition apparatus and information processing system | |
Liebig et al. | Methods for analysis of spatio-temporal bluetooth tracking data | |
CN110222561A (en) | Missing crew's facial images match method, apparatus and storage medium | |
KR20160055587A (en) | Providing location information system in video display using rfid system |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
20221206 | TA01 | Transfer of patent application right | Applicant after: Xi'an Guangqi Intelligent Technology Co.,Ltd. (second floor, building B3, yunhuigu, No. 156, Tiangu 8th Road, software new town, high tech Zone, Xi'an, Shaanxi, 710000). Applicant before: Xi'an Guangqi Future Technology Research Institute (2nd floor, B3, yunhuigu, 156 Tiangu 8th Road, software new town, Xi'an City, Shaanxi Province, 710003).
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20200616