CN115221349A - Target positioning method and system - Google Patents

Target positioning method and system

Info

Publication number
CN115221349A
Authority
CN
China
Prior art keywords
target object
goods
cargo
real
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210415911.5A
Other languages
Chinese (zh)
Inventor
丁臣臣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Terminus Technology Group Co Ltd
Original Assignee
Terminus Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Terminus Technology Group Co Ltd filed Critical Terminus Technology Group Co Ltd
Priority to CN202210415911.5A
Publication of CN115221349A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/587 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using geographical or spatial information, e.g. location
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Alarm Systems (AREA)

Abstract

Embodiments of the present application disclose a target positioning method and system. The method includes: retrieving, from a database, a three-dimensional reference mesh map corresponding to a target location and a simulated map of a target object; monitoring the target location through a monitoring device to obtain state information corresponding to the target object; and determining a real-time scene position of the target object based on the three-dimensional reference mesh map, the simulated map of the target object, and the state information. Efficient and convenient goods management and personnel management are thereby achieved.

Description

Target positioning method and system
Technical Field
Embodiments of the present application relate to the technical field of position detection, and in particular to a target positioning method and system.
Background
With the development of information technology, enterprises have increasingly high security requirements for office premises, supervision of employees' work is gradually increasing, and indoor personnel positioning systems are gradually being included in the information management plans of various enterprises. Integrating the latest Internet of Things positioning technology into personnel management and positioning people in office scenes, especially inside buildings, makes it possible to quickly realize functions such as intelligent attendance, real-time position tracking, historical track review, electronic-fence early warning, fixed-asset inventory checking and visitor management, which is of great significance for building intelligent, AI-enabled offices.
In the prior art, personnel flow inside a building is managed and monitored by area: by matching radio-frequency cards worn by different persons with locators, personnel position and state information can be displayed in real time and accurately resolved to floors, rooms, corridors, and so on. Capturing personnel information with video surveillance is also a common approach. However, these methods mainly rely on dedicated equipment such as RFID cards, signal-source capture devices and artificial-intelligence cameras, which is costly.
Disclosure of Invention
Therefore, embodiments of the present application provide a target positioning method and system that achieve efficient and convenient goods management and personnel management.
In order to achieve the above object, the embodiments of the present application provide the following technical solutions:
According to a first aspect of the embodiments of the present application, there is provided a target positioning method, the method including:
retrieving, from a database, a three-dimensional reference mesh map corresponding to a target location and a simulated map of a target object;
monitoring the target location through a monitoring device to obtain state information corresponding to the target object;
determining a real-time scene position of the target object based on the three-dimensional reference mesh map, the simulated map of the target object, and the state information.
Optionally, if the target object is goods, the method further includes:
determining a goods state based on the real-time scene position of the goods and the simulated map of the goods;
if the real-time scene position of the goods is not within the imaging area of the shelf in the simulated map of the goods, determining that the goods are in an out-of-stock state;
if the real-time scene position of the goods is within the imaging area of the shelf in the simulated map of the goods, determining that the goods are in an in-stock state.
Optionally, after determining that the goods are in the out-of-stock state, the method further includes: sending a replenishment reminder message.
Optionally, if the goods state is the in-stock state, the method further includes:
determining whether the real-time scene position of the goods is the same as the position of the simulated map of the goods in the three-dimensional reference mesh map; if so, determining that the goods are placed without deviation; if not, determining that the goods are placed with a deviation.
Optionally, after determining that the goods are placed with a deviation, the method further includes: sending a goods placement adjustment reminder message.
Optionally, if the target object is goods, the simulated map of the goods is located in the three-dimensional reference mesh map;
if the target object is a person, the simulated map of the person is located in the three-dimensional reference mesh map.
Optionally, if the target object is goods, the state information corresponding to the target object includes a goods position, a goods identifier and a timestamp;
if the target object is a person, the state information corresponding to the target object includes the person's fixed position, movement track, behavior characteristics and a timestamp.
According to a second aspect of the embodiments of the present application, there is provided a target positioning system, the system including:
a data retrieval module, configured to retrieve, from a database, a three-dimensional reference mesh map corresponding to a target location and a simulated map of a target object;
a state information acquisition module, configured to monitor the target location through a monitoring device and obtain state information corresponding to the target object;
a real-time scene position detection module, configured to determine a real-time scene position of the target object based on the three-dimensional reference mesh map, the simulated map of the target object, and the state information.
According to a third aspect of the embodiments of the present application, there is provided an electronic device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor executes the computer program to implement the method of the first aspect.
According to a fourth aspect of the embodiments of the present application, there is provided a computer-readable storage medium having computer-readable instructions stored thereon, the computer-readable instructions being executable by a processor to implement the method of the first aspect.
In summary, the embodiments of the present application provide a target positioning method and system, which retrieve, from a database, a three-dimensional reference mesh map corresponding to a target location and a simulated map of a target object; monitor the target location through a monitoring device to obtain state information corresponding to the target object; and determine a real-time scene position of the target object based on the three-dimensional reference mesh map, the simulated map of the target object, and the state information. Efficient and convenient goods management and personnel management are thereby achieved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It should be apparent that the drawings in the following description are merely exemplary, and that other drawings can be derived from them by those of ordinary skill in the art without inventive effort.
The structures, proportions, sizes and the like shown in this specification are intended only to complement the content disclosed in the specification, so that those skilled in the art can understand and read it; they do not limit the conditions under which the present invention can be implemented. Any structural modification, change of proportional relationship or adjustment of size that does not affect the efficacy and achievable purpose of the present invention shall still fall within the scope covered by the technical content disclosed herein.
Fig. 1 is a schematic flowchart of a target positioning method provided by an embodiment of the present application;
Fig. 2 is a block diagram of a target positioning system provided by an embodiment of the present application;
Fig. 3 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
Fig. 4 is a schematic diagram of a computer-readable storage medium provided by an embodiment of the present application.
Detailed Description
The present invention is described below in terms of particular embodiments, and other advantages and effects of the present invention will become readily apparent to those skilled in the art from the disclosure in this specification. It is to be understood that the described embodiments are merely some, and not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
Fig. 1 illustrates a target positioning method provided in an embodiment of the present application, the method including:
Step 101: retrieving, from a database, a three-dimensional reference mesh map corresponding to a target location and a simulated map of a target object;
Step 102: monitoring the target location through a monitoring device to obtain state information corresponding to the target object;
Step 103: determining a real-time scene position of the target object based on the three-dimensional reference mesh map, the simulated map of the target object, and the state information.
A minimal code sketch of these three steps is given below.
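The following Python sketch only illustrates the flow of steps 101-103; all names (the database and monitoring-device interfaces, `get_reference_mesh`, `project`, and so on) are hypothetical placeholders, since the embodiment does not prescribe a concrete API.

```python
# Minimal sketch of steps 101-103. All class and function names are
# hypothetical; the embodiment does not define a concrete interface.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class StateInfo:
    position: Tuple[float, float, float]  # raw position reported by the monitoring device
    object_id: str                        # goods identifier or person identifier
    timestamp: float

def locate_target(db, monitor, site_id: str, target_id: str):
    # Step 101: retrieve the 3D reference mesh map of the target location
    # and the simulated map of the target object from the database.
    mesh_map = db.get_reference_mesh(site_id)
    simulated_map = db.get_simulated_map(target_id)

    # Step 102: monitor the target location to obtain state information.
    state = monitor.observe(target_id)

    # Step 103: project the observed position into the reference mesh to
    # obtain the real-time scene position of the target object.
    real_time_position = mesh_map.project(state.position, simulated_map)
    return real_time_position, state
```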
In a possible implementation, if the target object is goods, the method further includes: determining the goods state based on the real-time scene position of the goods and the simulated map of the goods; if the real-time scene position of the goods is not within the imaging area of the shelf in the simulated map of the goods, determining that the goods are in an out-of-stock state; and if the real-time scene position of the goods is within the imaging area of the shelf in the simulated map of the goods, determining that the goods are in an in-stock state.
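A minimal sketch of this goods-state judgment follows, assuming the shelf imaging area in the simulated map is available as an axis-aligned rectangle ((x_min, y_min), (x_max, y_max)); that representation is an illustrative assumption, not part of the embodiment.

```python
# Sketch of the in-stock / out-of-stock judgment, assuming a rectangular
# shelf imaging area. The region representation is an assumption.
def judge_goods_state(real_time_position, shelf_area) -> str:
    x, y = real_time_position[0], real_time_position[1]
    (x_min, y_min), (x_max, y_max) = shelf_area
    in_shelf = x_min <= x <= x_max and y_min <= y <= y_max
    return "in_stock" if in_shelf else "out_of_stock"
```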
In one possible embodiment, after determining that the goods are in the out-of-stock state, the method further includes: sending a replenishment reminder message.
In one possible embodiment, if the goods state is the in-stock state, the method further includes: determining whether the real-time scene position of the goods is the same as the position of the simulated map of the goods in the three-dimensional reference mesh map; if so, determining that the goods are placed without deviation; if not, determining that the goods are placed with a deviation.
In one possible embodiment, after determining that the goods are placed with a deviation, the method further includes: sending a goods placement adjustment reminder message.
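A sketch of the placement-deviation check and the adjustment reminder is given below. The tolerance threshold and the notification channel are illustrative assumptions; the embodiment only speaks of positions being "the same" and a reminder message being sent.

```python
# Sketch of the placement-deviation check for goods already in stock.
# Tolerance and notification channel are illustrative assumptions.
def send_reminder(message: str) -> None:
    # Placeholder for whatever notification channel a deployment uses.
    print(f"[reminder] {message}")

def check_placement(real_time_position, expected_position, tolerance: float = 0.05) -> bool:
    # Compare the observed position with the simulated-map position.
    deviation = max(abs(a - b) for a, b in zip(real_time_position, expected_position))
    if deviation > tolerance:
        send_reminder("goods placement adjustment needed")
        return False
    return True
```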
In a possible implementation, if the target object is goods, the state information corresponding to the target object includes a goods position, a goods identifier and a timestamp; if the target object is a person, the state information corresponding to the target object includes the person's fixed position, movement track, behavior characteristics and a timestamp.
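The two kinds of state information can be pictured with the following sketch; the field names and types are illustrative, since the embodiment only enumerates the contents.

```python
# Sketch of the state information for goods and for persons.
# Field names and types are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]

@dataclass
class GoodsStateInfo:
    goods_position: Point
    goods_id: str
    timestamp: float

@dataclass
class PersonStateInfo:
    fixed_position: Point
    movement_track: List[Point] = field(default_factory=list)
    behavior_features: Dict[str, float] = field(default_factory=dict)
    timestamp: float = 0.0
```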
In a possible embodiment, if the target object is goods, the simulated map of the goods is located in the three-dimensional reference mesh map; if the target object is a person, the simulated map of the person is located in the three-dimensional reference mesh map.
The three-dimensional reference mesh map is drawn by connecting the set position points of the scene location. For example, a three-dimensional reference mesh is drawn for objects such as desks in an office, and the simulated map of the target object is contained in the three-dimensional reference mesh map. In practical applications, the state of the target object is further determined from the real-time scene position of the target object and the simulated map of the target object in the three-dimensional reference mesh map.
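One way to picture "connecting set position points" is the sketch below, which stores the mesh as points plus edges and connects points that lie within a chosen distance; both the data layout and the distance-based connection rule are assumptions for illustration only.

```python
# Sketch of building a 3D reference mesh from set position points.
# The points-plus-edges layout and the distance rule are assumptions.
from itertools import combinations
from math import dist

def build_reference_mesh(position_points, max_edge_length: float = 2.0) -> dict:
    edges = []
    for (i, p), (j, q) in combinations(enumerate(position_points), 2):
        # Connect two set position points when they are close enough together.
        if dist(p, q) <= max_edge_length:
            edges.append((i, j))
    return {"points": list(position_points), "edges": edges}
```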
In one possible embodiment, the method further includes: parsing the target scene into partitioned imaging areas using a scene parsing mechanism and storing them in the database; and parsing the feature data and position data of persons into the imaging areas using a person parsing mechanism and storing them in the database.
In one possible implementation, the database includes, but is not limited to, an ElasticSearch database or an HBase database.
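As an example of the storage step, the sketch below indexes one partitioned imaging area into Elasticsearch (one of the databases named above) using the official Python client with 8.x keyword arguments; the index name and document fields are assumptions for illustration.

```python
# Sketch of storing a partitioned imaging area in Elasticsearch.
# Index name and document fields are illustrative assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

def store_imaging_area(area_id: str, scene_id: str, boundary, parsed_objects) -> None:
    document = {
        "scene_id": scene_id,
        "boundary": boundary,        # outline of the partitioned imaging area
        "objects": parsed_objects,   # feature/position data parsed into this area
    }
    es.index(index="imaging_areas", id=area_id, document=document)
```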
Therefore, in application scenarios of the positioning method provided by the embodiments of the present application, the state of goods can be determined so that replenishment or position adjustment can be performed, and persons can be positioned so that the position information of a target person can be obtained more stereoscopically and intuitively. In addition, when a visitor enters the monitored location and the identified target object has no related data stored in the database, a visitor reminder message is sent.
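The visitor case can be sketched as follows; the database lookup and the reminder output are hypothetical stand-ins for whatever person store and notification channel a deployment actually uses.

```python
# Sketch of the visitor reminder: an identified person with no related
# record in the database is treated as a visitor. The lookup is hypothetical.
def handle_identified_person(db, person_id: str) -> None:
    record = db.find_person(person_id)   # assumed to return None when unknown
    if record is None:
        print(f"[reminder] visitor detected in monitored location: {person_id}")
```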
In summary, the embodiments of the present application provide a target positioning method, which retrieves, from a database, a three-dimensional reference mesh map corresponding to a target location and a simulated map of a target object; monitors the target location through a monitoring device to obtain state information corresponding to the target object; and determines a real-time scene position of the target object based on the three-dimensional reference mesh map, the simulated map of the target object, and the state information. Efficient and convenient goods management and personnel management are thereby achieved.
Based on the same technical concept, an embodiment of the present application further provides a target positioning system, as shown in Fig. 2, the system including:
a data retrieval module 201, configured to retrieve, from a database, a three-dimensional reference mesh map corresponding to a target location and a simulated map of a target object;
a state information acquisition module 202, configured to monitor the target location through a monitoring device and obtain state information corresponding to the target object;
a real-time scene position detection module 203, configured to determine the real-time scene position of the target object based on the three-dimensional reference mesh map, the simulated map of the target object, and the state information.
A minimal wiring sketch of these three modules is given below.
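The sketch below only shows how the three modules of Fig. 2 could be composed; the module interfaces (`fetch`, `observe`, `detect`) are hypothetical placeholders and not defined by the embodiment.

```python
# Wiring sketch for the three modules of Fig. 2. Module interfaces are
# hypothetical placeholders.
class TargetPositioningSystem:
    def __init__(self, data_retrieval, state_acquisition, position_detection):
        self.data_retrieval = data_retrieval          # module 201
        self.state_acquisition = state_acquisition    # module 202
        self.position_detection = position_detection  # module 203

    def locate(self, site_id: str, target_id: str):
        mesh_map, simulated_map = self.data_retrieval.fetch(site_id, target_id)
        state = self.state_acquisition.observe(target_id)
        return self.position_detection.detect(mesh_map, simulated_map, state)
```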
An embodiment of the present application also provides an electronic device corresponding to the method provided by the foregoing embodiments. Please refer to Fig. 3, which illustrates a schematic diagram of an electronic device according to some embodiments of the present application. The electronic device 20 may include: a processor 200, a memory 201, a bus 202 and a communication interface 203, where the processor 200, the communication interface 203 and the memory 201 are connected through the bus 202; the memory 201 stores a computer program that can run on the processor 200, and when the processor 200 runs the computer program, the method provided by any of the foregoing embodiments of the present application is performed.
The memory 201 may include a high-speed Random Access Memory (RAM) and may further include a non-volatile memory, such as at least one disk memory. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 203 (which may be wired or wireless), and the Internet, a wide area network, a local area network, a metropolitan area network, or the like may be used.
The bus 202 may be an ISA bus, a PCI bus, an EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. The memory 201 is configured to store a program, and the processor 200 executes the program after receiving an execution instruction; the method disclosed in any of the foregoing embodiments of the present application may be applied to, or implemented by, the processor 200.
The processor 200 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware or by instructions in the form of software in the processor 200. The processor 200 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, registers, or other storage media well known in the art. The storage medium is located in the memory 201, and the processor 200 reads the information in the memory 201 and completes the steps of the above method in combination with its hardware.
The electronic device provided by the embodiment of the application and the method provided by the embodiment of the application have the same inventive concept and have the same beneficial effects as the method adopted, operated or realized by the electronic device.
Referring to Fig. 4, the computer-readable storage medium is shown as an optical disc 30 on which a computer program (i.e., a program product) is stored; when the computer program is run by a processor, it performs the method provided by any of the foregoing embodiments.
It should be noted that examples of the computer-readable storage medium may also include, but are not limited to, a phase change memory (PRAM), a Static Random Access Memory (SRAM), a Dynamic Random Access Memory (DRAM), other types of Random Access Memories (RAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a flash memory, or other optical and magnetic storage media, which are not described in detail herein.
The computer-readable storage medium provided by the above-mentioned embodiments of the present application and the method provided by the embodiments of the present application have the same advantages as the method adopted, executed or implemented by the application program stored in the computer-readable storage medium.
It should be noted that:
the algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose devices may be used with the teachings herein. The required structure for constructing such a device will be apparent from the description above. In addition, this application is not directed to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the present application as described herein, and any descriptions of specific languages are provided above to disclose the best modes of the present application.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, this method of disclosure is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
Those skilled in the art will appreciate that the modules in the devices in an embodiment may be adaptively changed and arranged in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Moreover, those of skill in the art will understand that although some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in the creation apparatus of a virtual machine according to embodiments of the present application. The present application may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, and so on does not indicate any ordering; these words may be interpreted as names.
The above description is only for the preferred embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A target positioning method, characterized in that the method comprises:
retrieving, from a database, a three-dimensional reference mesh map corresponding to a target location and a simulated map of a target object;
monitoring the target location through a monitoring device to obtain state information corresponding to the target object;
determining a real-time scene position of the target object based on the three-dimensional reference mesh map, the simulated map of the target object, and the state information.
2. The method according to claim 1, wherein if the target object is goods, the method further comprises:
determining a goods state based on the real-time scene position of the goods and the simulated map of the goods;
if the real-time scene position of the goods is not within the imaging area of the shelf in the simulated map of the goods, determining that the goods are in an out-of-stock state;
if the real-time scene position of the goods is within the imaging area of the shelf in the simulated map of the goods, determining that the goods are in an in-stock state.
3. The method according to claim 2, wherein after determining that the goods are in the out-of-stock state, the method further comprises: sending a replenishment reminder message.
4. The method according to claim 2, wherein if the goods state is the in-stock state, the method further comprises:
determining whether the real-time scene position of the goods is the same as the position of the simulated map of the goods in the three-dimensional reference mesh map; if so, determining that the goods are placed without deviation; if not, determining that the goods are placed with a deviation.
5. The method according to claim 4, wherein after determining that the goods are placed with a deviation, the method further comprises: sending a goods placement adjustment reminder message.
6. The method according to claim 1, wherein if the target object is goods, the simulated map of the goods is located in the three-dimensional reference mesh map;
if the target object is a person, the simulated map of the person is located in the three-dimensional reference mesh map.
7. The method according to claim 1, wherein if the target object is goods, the state information corresponding to the target object comprises a goods position, a goods identifier and a timestamp;
if the target object is a person, the state information corresponding to the target object comprises the person's fixed position, movement track, behavior characteristics and a timestamp.
8. A target positioning system, characterized in that the system comprises:
a data retrieval module, configured to retrieve, from a database, a three-dimensional reference mesh map corresponding to a target location and a simulated map of a target object;
a state information acquisition module, configured to monitor the target location through a monitoring device and obtain state information corresponding to the target object;
a real-time scene position detection module, configured to determine a real-time scene position of the target object based on the three-dimensional reference mesh map, the simulated map of the target object, and the state information.
9. An electronic device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method according to any one of claims 1-7.
10. A computer-readable storage medium having computer-readable instructions stored thereon, the computer-readable instructions being executable by a processor to implement the method of any one of claims 1-7.
CN202210415911.5A 2022-04-20 2022-04-20 Target positioning method and system Pending CN115221349A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210415911.5A CN115221349A (en) 2022-04-20 2022-04-20 Target positioning method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210415911.5A CN115221349A (en) 2022-04-20 2022-04-20 Target positioning method and system

Publications (1)

Publication Number Publication Date
CN115221349A 2022-10-21

Family

ID=83606710

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210415911.5A Pending CN115221349A (en) 2022-04-20 2022-04-20 Target positioning method and system

Country Status (1)

Country Link
CN (1) CN115221349A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116596449A (en) * 2023-07-17 2023-08-15 北京中科智易科技股份有限公司 Positioning method and system for instruments in military warehouse

Similar Documents

Publication Publication Date Title
CN108197658B (en) Image annotation information processing method, device, server and system
US10620084B2 (en) System for hierarchical actions based upon monitored building conditions
US8731241B2 (en) Activity mapping system
CN113938647B (en) Intelligent tower crane operation panoramic monitoring and restoring method and system for intelligent construction site
CN109728940A (en) Method for inspecting, device and storage medium
CN113111144A (en) Room marking method and device and robot movement method
CN112037477A (en) Indoor electronic fence positioning method and system based on RFID
CN115221349A (en) Target positioning method and system
CN112929602A (en) Data monitoring method and device based on image processing and related equipment
CN111967438A (en) Garbage collection and transportation monitoring method, device, equipment and computer readable storage medium
CN113734981A (en) Intelligent tower crane material transportation path setting method and device
CN109168173A (en) Base station operation management method, apparatus and electronic equipment
CN113139427A (en) Steam pipe network intelligent monitoring method, system and equipment based on deep learning
CN112201044A (en) Road violation vehicle identification method and system, storage medium and terminal
CN115983766A (en) Object position detection method and device, electronic equipment and readable storage medium
CN115767439A (en) Object position display method and device, storage medium and electronic equipment
CN113911918B (en) Fault emergency dispatch control method and system for intelligent tower crane cluster
CN112734968A (en) Method and device for polling data equipment and computer storage medium
CN112967414A (en) Patrol system
CN111597954A (en) Method and system for identifying vehicle position in monitoring video
CN112153341A (en) Task supervision method, device and system, electronic equipment and storage medium
CN115249051A (en) Equipment management system, method and device
CN110796044A (en) Target area security monitoring method and device
CN110705333A (en) Identity judgment method and device and positioning system
CN113896109B (en) Camera shooting monitoring method and system for intelligent tower crane background remote control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination