CN113129334A - Object tracking method and device, storage medium and wearable electronic equipment - Google Patents

Object tracking method and device, storage medium and wearable electronic equipment

Info

Publication number
CN113129334A
CN113129334A (application CN202110264232.8A)
Authority
CN
China
Prior art keywords
target
thermal image
route
contour
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110264232.8A
Other languages
Chinese (zh)
Inventor
陈诏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Original Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yulong Computer Telecommunication Scientific Shenzhen Co Ltd filed Critical Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority to CN202110264232.8A priority Critical patent/CN113129334A/en
Publication of CN113129334A publication Critical patent/CN113129334A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/13 - Edge detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/20 - Scenes; Scene-specific elements in augmented reality scenes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G06V40/166 - Detection; Localisation; Normalisation using acquisition arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10048 - Infrared image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/178 - Human faces, e.g. facial parts, sketches or expressions estimating age from face image; using age information for improving recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Human Computer Interaction (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an object tracking method, an object tracking device, a storage medium and a wearable electronic device. The method comprises: acquiring target contour features and storing the target contour features; acquiring radiation signals by using an infrared sensor, and generating a target thermal image corresponding to the target contour features based on the radiation signals; and displaying the target thermal image and target position information corresponding to the target thermal image. In the embodiments of the application, no worn positioning device is needed: the thermal image corresponding to the contour of a child or an elderly person is collected and generated by the infrared sensor, so that the child or elderly person can be tracked and located accurately, and the safety against the child or elderly person becoming lost is improved.

Description

Object tracking method and device, storage medium and wearable electronic equipment
Technical Field
The present disclosure relates to the field of positioning and tracking technologies, and in particular, to an object tracking method and apparatus, a storage medium, and a wearable electronic device.
Background
In order to ensure the personal safety of children or elderly people when they go out, a positioning device based on GPS (Global Positioning System) is generally worn by the child or elderly person, and position information is obtained through GPS to prevent the child or elderly person from becoming lost.
However, such a positioning device depends on the signals of positioning satellites, and positioning becomes inaccurate when the signals are blocked by obstacles such as walls, buildings, skyscrapers and trees; when the device fails, no position can be obtained at all, and the position information of the child or elderly person cannot be tracked accurately.
Disclosure of Invention
The embodiments of the application provide an object tracking method, an object tracking device, a storage medium and a wearable electronic device, which can avoid the drawbacks of the conventional positioning approach and accurately track the position information of an object.
In a first aspect, an embodiment of the present application provides an object tracking method, where the method is applied to a wearable device, where the wearable device is provided with an infrared thermal sensor, and the method includes:
acquiring target contour features and storing the target contour features;
acquiring radiation signals by using an infrared sensor, and generating a target thermal image corresponding to the target contour characteristics based on the radiation signals;
and displaying the target thermal image and target position information corresponding to the target thermal image.
In one alternative of the first aspect, collecting the radiation signal with an infrared sensor comprises:
collecting radiation signals within a preset range by using an infrared sensor;
generating a target thermal image corresponding to the target profile features based on the radiation signals, comprising:
generating an initial thermal image based on the radiation signal;
determining an initial profile feature from the initial thermal image;
and matching the target contour features with the initial contour features to obtain a target thermal image corresponding to the target contour features.
In yet another alternative of the first aspect, matching the target contour feature with the initial contour feature to obtain a target thermal image corresponding to the target contour feature includes:
matching the target contour features with the initial contour features according to the similarity;
and taking the initial thermal image corresponding to the initial contour features with the similarity greater than or equal to the preset similarity as the target thermal image.
In yet another alternative of the first aspect, the method further comprises:
acquiring current position information;
determining a target route based on the current position information and the target position information;
and displaying the target route.
In yet another alternative of the first aspect, determining the target route based on the current location information and the target location information includes:
acquiring a historical travelling route;
matching the current position information and the target position information with the historical travelling route;
and if the historical travel route comprises the current position information and the target position information, taking the route from the current position information to the target position information in the historical travel route as the target route.
In yet another alternative of the first aspect, after obtaining the current location information, the method further includes:
when detecting that the current position information and the target position information meet the preset route distance, displaying prompt information; the prompt message is used for representing that the target is at risk of being lost.
In a second aspect, an embodiment of the present application provides an object tracking apparatus, where the apparatus is applied to a wearable device, and the wearable device is provided with an infrared thermal sensor, including:
the first processing module is used for acquiring the target contour characteristics and storing the target contour characteristics;
the second processing module is used for acquiring the radiation signals by using the infrared sensor and generating a target thermal image corresponding to the target contour characteristics based on the radiation signals;
the first display module is used for displaying the target thermal image and target position information corresponding to the target thermal image.
In an alternative of the second aspect, the second processing module comprises:
a first processing unit for generating an initial thermal image based on the radiation signal;
a second processing unit for determining an initial profile feature from the initial thermal image;
and the third processing unit is used for matching the target contour features with the initial contour features to obtain a target thermal image corresponding to the target contour features.
In yet another alternative of the second aspect, the third processing unit specifically includes:
matching the target contour features with the initial contour features according to the similarity;
and taking the initial thermal image corresponding to the initial contour features with the similarity greater than or equal to the preset similarity as the target thermal image.
In yet another alternative of the second aspect, the apparatus further comprises:
the acquisition module is used for acquiring current position information;
the third processing module is used for determining a target route based on the current position information and the target position information;
and the second display module is used for displaying the target route.
In yet another alternative of the second aspect, the third processing module includes:
an acquisition unit configured to acquire a historical travel route;
the fourth processing unit is used for matching the current position information and the target position information with the historical travelling route;
and the fifth processing unit is used for taking a route from the current position information to the target position information in the historical travelling route as the target route if the historical travelling route comprises the current position information and the target position information.
In yet another alternative of the second aspect, the obtaining unit further includes:
the third display module is used for displaying prompt information when the current position information and the target position information meet the preset route distance; the prompt message is used for representing that the target is at risk of being lost.
In a third aspect, an embodiment of the present application further provides an object tracking apparatus, including a processor and a memory;
the processor is connected with the memory;
a memory for storing executable program code;
the processor executes a program corresponding to the executable program code by reading the executable program code stored in the memory, for executing the object tracking method provided by the first aspect.
In a fourth aspect, embodiments of the present application further provide a wearable electronic device, which includes an infrared thermal sensor and the object tracking apparatus provided in the third aspect.
In a fifth aspect, the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the object tracking method according to the first aspect.
In the embodiments of the application, the target contour features can be acquired and stored, the infrared sensor can be used to acquire radiation signals and generate a target thermal image corresponding to the target contour features based on the radiation signals, and the target thermal image is then displayed to the user. No worn positioning device is needed: the thermal image corresponding to the contour of a child or an elderly person is collected and generated by the infrared sensor, so that the child or elderly person can be tracked and located accurately, and the safety against the child or elderly person becoming lost is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings without creative efforts.
Fig. 1 is a schematic diagram of an object tracking system according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of an object tracking method according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of another object tracking method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an object tracking device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a wearable electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to the steps or elements listed, but may also include other steps or elements not listed or inherent to such a process, method, article, or apparatus.
Referring to fig. 1, fig. 1 is a schematic diagram illustrating an architecture of an object tracking system according to an embodiment of the present disclosure.
As shown in fig. 1, the object tracking system includes a wearable electronic device 101 and a server 102, wherein:
wearable electronic equipment 101 may be a pair of AR glasses provided with an infrared thermal sensor, which can accurately locate a target without any positioning device while still meeting the user's ordinary AR-glasses usage requirements. Specifically, after the user puts on the AR glasses, the AR glasses can be started through a power key arranged on the frame, and a system interface of the AR glasses is displayed on the side of the lenses facing the user's eyes. The system interface of the AR glasses of the present application may include, but is not limited to, applications such as a camera, a gallery, weather, video and a browser, as well as the date and time; updates to the applications and to the date and time may be obtained from the server 102 through the network, and new applications may also be obtained from the server 102 through the network.
Further, when a user needs to use the AR glasses to track a target object, the overall contour of the target object may be scanned by the camera disposed on one side of the AR glasses frame. When the target object can no longer be found within the user's visual range, the tracking application program is entered by operating the touch pad disposed on the AR glasses frame, and a thermal image corresponding to the contour of the target object is collected and generated by the infrared thermal sensor. The tracking application program stores the overall contour of the object scanned by the camera, and can also obtain, through the network, the overall contour of the object stored in the server database. After the thermal image corresponding to the contour of the target object is generated, the thermal image and the corresponding position can be displayed on the AR glasses lens, and the user can intuitively see the position of the target object, so that the target object can be found quickly. It is understood that the object mentioned in the present application can be, but is not limited to, an elderly person, a child, a pet dog, or the like.
The server 102 may be a server capable of providing multiple services, and may obtain a request instruction such as a networking request and an update request of the wearable electronic device 101 through a network, receive data such as an image, voice, and a file sent by the wearable electronic device 101 through the network, or send data such as an image, voice, and a file to the wearable electronic device 101 through the network. For example, the server 102 may receive and store, through the network, the outline of the target object scanned by the wearable electronic device 101 through the camera, and transmit, through the network, the outline of the target object to the wearable electronic device 101 when receiving a request instruction sent by the wearable electronic device 101 to obtain the outline of the target object. The server 102 may be, but is not limited to, a hardware server, a virtual server, a cloud server, and the like.
The network may be a medium that provides a communication link between wearable electronic device 101 and server 102, or may be the internet containing network devices and transmission media, without limitation. The transmission medium may be a wired link (such as, but not limited to, coaxial cable, fiber optic cable, and Digital Subscriber Line (DSL), etc.) or a wireless link (such as, but not limited to, wireless fidelity (WIFI), bluetooth, and mobile device network, etc.).
It is to be understood that the number of wearable electronic devices 101 and servers 102 in the object tracking system shown in fig. 1 is by way of example only, and that the object tracking system may include any number of wearable electronic devices 101 and servers 102 in a particular implementation. The embodiments of the present application do not limit this. For example, but not limiting of, server 102 may be a server cluster of multiple servers.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating an object tracking method according to an embodiment of the present disclosure.
As shown in fig. 2, the method is applicable to a wearable electronic device provided with an infrared thermal sensor, and includes:
step 201, obtaining the target contour feature, and storing the target contour feature.
Specifically, when the user normally uses the functions of the wearable electronic device, if the user needs to track the target object, the wearable electronic device may acquire the overall contour of the target object and store the overall contour of the target object.
For example, taking AR glasses as the wearable electronic device, the user can wear the AR glasses to realize functions such as video playing, voice sending and route planning. When the user needs to use the AR glasses to track a target object, the touch pad arranged on one side of the AR glasses frame can be operated to enter the thermal imaging tracking application program of the AR glasses, and the camera arranged on one side of the AR glasses frame is started to acquire the overall contour feature of the target object. The user can judge whether the target contour of the target object has been obtained by observing the contour effect displayed on the AR glasses lens. For example, while the lens displays the target object and identifies the contour line at its edge, if the edge of the target object is blocked so that the AR glasses cannot identify the contour line there, the AR glasses can generate prompt information to prompt the user that the current target object is blocked; the prompt information can be, but is not limited to, voice prompt information such as "an obstruction is present, please move it away". Possibly, when the whole of the target object has not been completely scanned by the camera, so that the AR glasses cannot recognize the complete contour line of the object, the AR glasses may generate prompt information to prompt the user that the current target object has not been completely scanned; the prompt information can be, but is not limited to, voice prompt information such as "please move away from the target object".
Further, after the AR glasses acquire the overall contour of the target object, the overall contour of the target object may be saved in a memory of the thermal imaging tracking application program, in a memory of the AR glasses, or in a database of the server through a network.
It should be noted that the target contour features mentioned in this embodiment are not limited to the overall contour of the target object; for example, the gender or age range of the target object may also be added through face recognition, that is, the overall contour, gender and age range of the target object are all used as target contour features to improve the accuracy of tracking the target object.
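By way of illustration and not limitation, the following Python sketch shows one way the stored target contour features described above could be represented; the TargetProfile class, its field names and the JSON file storage are assumptions of this sketch rather than requirements of the embodiment, and the contour is assumed to be an ordered list of (x, y) points.

from dataclasses import dataclass
from typing import List, Optional, Tuple
import json

@dataclass
class TargetProfile:
    """Hypothetical record for the stored target contour features."""
    name: str                           # label chosen by the user, e.g. "grandpa"
    contour: List[Tuple[float, float]]  # ordered (x, y) points of the overall contour
    gender: Optional[str] = None        # optional attribute from face recognition
    age_range: Optional[str] = None     # e.g. "child", "elderly"

    def save(self, path: str) -> None:
        # Persist locally (AR-glasses memory); a server database could be used instead.
        with open(path, "w", encoding="utf-8") as f:
            json.dump({"name": self.name,
                       "contour": self.contour,
                       "gender": self.gender,
                       "age_range": self.age_range}, f)

    @staticmethod
    def load(path: str) -> "TargetProfile":
        with open(path, encoding="utf-8") as f:
            d = json.load(f)
        return TargetProfile(d["name"], [tuple(p) for p in d["contour"]],
                             d.get("gender"), d.get("age_range"))

A profile saved in this way could later be loaded by the thermal imaging tracking application program when the user selects a stored target.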
It is also understood that the objects mentioned in the present application are not limited to the elderly or children or pet dogs.
Step 202, collecting radiation signals by using an infrared sensor, and generating a target thermal image corresponding to the target contour features based on the radiation signals.
Specifically, when the user finds that the target object is not in the visible range in the process of wearing the wearable electronic device, the infrared sensor arranged on the wearable electronic device can be used for acquiring surrounding radiation signals, and a thermal image corresponding to the outline of the target object is generated according to the radiation signals.
Taking AR glasses as an example of the wearable electronic device, when the user wears the AR glasses and finds that the target object does not appear within the visible range of the lenses, the touch pad arranged on one side of the AR glasses frame can be operated to enter the thermal imaging tracking application program of the AR glasses. Upon entering the thermal imaging tracking application, the saved overall contour of the target object may be selected as the search criterion in order to generate a thermal image corresponding to that overall contour.
After entering the thermal imaging tracking application program of the AR glasses, the user can select the function to be executed: when the contour features of a new target object need to be recorded, they can be scanned and recorded through the camera; when a thermal image of a target object needs to be acquired, the contour features of that target object can be selected from the stored contour features, and the thermal image corresponding to the selected contour features is collected and generated through the infrared sensor.
As an embodiment of the present application, collecting a radiation signal using an infrared sensor includes:
collecting radiation signals within a preset range by using an infrared sensor;
generating a target thermal image corresponding to the target profile features based on the radiation signals, comprising:
generating an initial thermal image based on the radiation signal;
determining an initial profile feature from the initial thermal image;
and matching the target contour features with the initial contour features to obtain a target thermal image corresponding to the target contour features.
Specifically, when the user finds that the target object is not within the visible range in the process of wearing the wearable electronic device, in order to ensure the accuracy of tracking the target object, the infrared sensor can be used for acquiring the radiation signal within the preset range. The preset range can be a circular area with the wearable electronic equipment as an origin and the preset distance as a radius, the preset distance can be set by a user on the wearable electronic equipment or set by default on the wearable electronic equipment, and the preferable preset distance can be set to be 5 kilometers. The wearable electronic device may then convert the radiation signals collected by the infrared sensor into an initial thermal image, determine a corresponding initial overall profile from the initial thermal image, and match the selected overall profile of the target with the initial overall profile to obtain a target thermal image corresponding to the overall profile of the target.
It is understood that the radiation signals collected by the wearable electronic device using the infrared sensor may be radiation signals of one or more objects, the initial thermal image generated from the radiation signals includes one or more thermal images, and the initial overall contour closest to the overall contour of the object may be screened out by matching the selected overall contour of the object with the initial overall contour, and the thermal image corresponding to the closest initial overall contour may be used as the target thermal image.
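By way of illustration and not limitation, the following Python sketch shows one possible way of turning a single thermal frame into candidate initial contours, assuming the radiation signal has already been converted into a two-dimensional temperature array; the OpenCV-based thresholding, the 30-40 degree Celsius body-temperature band and the minimum blob area are assumptions of this sketch, not limitations of the embodiment.

import numpy as np
import cv2

def extract_initial_contours(temperature_map: np.ndarray,
                             t_min: float = 30.0,
                             t_max: float = 40.0):
    """Return candidate (human-like) contours from one thermal frame.

    temperature_map : 2D array of temperatures in degrees Celsius, assumed to be
                      produced from the infrared radiation signal.
    t_min, t_max    : illustrative body-temperature band used to binarize the frame.
    """
    # Keep only pixels within the assumed body-temperature band.
    mask = ((temperature_map >= t_min) & (temperature_map <= t_max)).astype(np.uint8) * 255
    # Remove small speckles before contour extraction.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Discard tiny blobs that cannot correspond to a person.
    return [c for c in contours if cv2.contourArea(c) > 200.0]

Each returned contour can then be compared with the stored overall contour of the target object, and the thermal regions whose contours match are treated as candidate target thermal images.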
As another embodiment of the present application, matching the target contour feature with the initial contour feature to obtain a target thermal image corresponding to the target contour feature includes:
matching the target contour features with the initial contour features according to the similarity;
and taking the initial thermal image corresponding to the initial contour features with the similarity greater than or equal to the preset similarity as the target thermal image.
Specifically, the target overall contour and all the initial overall contours may be divided into a plurality of nodes connected in sequence, and the nodes constituting the target overall contour may be matched with the nodes constituting the initial overall contour. And when the similarity between the nodes forming the initial overall contour and the nodes forming the target overall contour is greater than or equal to the preset similarity, taking the initial thermal image corresponding to the initial overall contour corresponding to the node as the target thermal image. The preset similarity can be set by the user on the wearable electronic device or set by the wearable electronic device in a default mode.
It should be noted that, in this embodiment, the number of target thermal images corresponding to initial contour features whose similarity is greater than or equal to the preset similarity may be one or more. When there are multiple target thermal images, they may be displayed to the user simultaneously, and the user can then judge and select the closest target thermal image according to the form of each target thermal image.
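By way of illustration and not limitation, the following Python sketch shows one way the node-based matching described above could be carried out: each contour is divided into a fixed number of sequentially connected nodes, the node sets are compared after removing translation and scale, and the candidates whose similarity reaches the preset similarity are kept. The resampling scheme and the particular similarity measure are assumptions of this sketch rather than the exact method of the embodiment, and both contours are assumed to be available as ordered point lists.

import numpy as np

def resample_contour(points, n_nodes: int = 64) -> np.ndarray:
    """Split a closed contour into n_nodes roughly evenly spaced nodes along its perimeter."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(np.vstack([pts, pts[:1]]), axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, cum[-1], n_nodes, endpoint=False)
    idx = np.searchsorted(cum, targets, side="right") - 1
    return pts[np.clip(idx, 0, len(pts) - 1)]

def contour_similarity(target, candidate, n_nodes: int = 64) -> float:
    """Similarity in (0, 1]: higher means the node sets are closer after normalization."""
    a = resample_contour(target, n_nodes)
    b = resample_contour(candidate, n_nodes)
    # Normalize translation and scale so that only shape is compared.
    a = (a - a.mean(axis=0)) / (np.linalg.norm(a - a.mean(axis=0)) + 1e-9)
    b = (b - b.mean(axis=0)) / (np.linalg.norm(b - b.mean(axis=0)) + 1e-9)
    return 1.0 / (1.0 + np.linalg.norm(a - b))

def match_thermal_images(target_contour, candidates, preset_similarity: float = 0.8):
    """Return the candidate thermal images whose contour similarity meets the preset similarity."""
    hits = []
    for contour, thermal_image in candidates:
        if contour_similarity(target_contour, contour) >= preset_similarity:
            hits.append(thermal_image)
    return hits

When several candidates pass the preset similarity, all of them are returned, matching the behaviour described above in which multiple target thermal images may be shown to the user at once.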
Step 203, displaying the target thermal image and the target position information corresponding to the target thermal image.
Specifically, after the wearable electronic device generates the target thermal image, the target thermal image and corresponding location information may be displayed to the user.
For example, taking AR glasses as the wearable electronic device, after the AR glasses generate the target thermal image, the target thermal image may be displayed in red on the lenses of the AR glasses, and the position corresponding to the target thermal image is displayed at the same time. The position of the target thermal image can be acquired through a map application program carried by the AR glasses. It is understood that, once the user determines the orientation of the target thermal image, the user can also track the target according to that orientation.
In this embodiment, the target contour features are acquired and stored, the infrared sensor is used to acquire radiation signals and generate a target thermal image corresponding to the target contour features based on the radiation signals, and the target thermal image is then displayed to the user. No worn positioning device is needed: the thermal image corresponding to the contour of a child or an elderly person is collected and generated by the infrared sensor, so that the child or elderly person can be tracked and located accurately, and the safety against the child or elderly person becoming lost is improved.
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating another object tracking method according to an embodiment of the present disclosure.
As shown in fig. 3, the method is applicable to a wearable electronic device provided with an infrared thermal sensor, including:
and 301, acquiring the target contour features and storing the target contour features.
Specifically, step 301 is identical to step 201, and is not described herein again.
Step 302, collecting radiation signals by using an infrared sensor, and generating a target thermal image corresponding to the target contour features based on the radiation signals.
Specifically, step 302 is identical to step 202, and is not described herein again.
Step 303, displaying the target thermal image and the target position information corresponding to the target thermal image.
Specifically, step 303 is identical to step 203, and is not described herein again.
And step 304, acquiring current position information.
Specifically, after the wearable electronic device displays the target thermal image and the corresponding target position, it can also obtain the current position through a map application program carried by the wearable electronic device. When the wearable electronic device does not carry a map application program, the current position can be obtained by scanning the surroundings with the camera of the wearable electronic device, for example by scanning a distinctive building or a sign.
As another embodiment of the present application, after obtaining the current location information, the method further includes:
and when the current position information and the target position information are detected to meet the preset route distance, displaying prompt information.
Specifically, after acquiring the current position, the wearable electronic device may detect the linear distance between the current position and the target position, and display prompt information for indicating that the target is at risk of being lost when that distance satisfies a preset route distance, for example a voice and/or text prompt of "the target is far away" shown to the user. A distance satisfying the preset route distance can be a distance interval that is greater than a preset first threshold and smaller than a preset second threshold; when the linear distance between the current position and the target position is smaller than the preset first threshold, it indicates that the target object has not moved far away.
When the linear distance between the current position and the target position is greater than the preset second threshold, it indicates that the target object is beyond the detection range of the wearable electronic device; in this case the wearable electronic device can plan a search route for the user according to the previous traveling route of the target object, collect radiation signals with the infrared sensor in real time, and generate a target thermal image corresponding to the target contour features from the radiation signals. The preset first threshold and the preset second threshold may be set by the user on the wearable electronic device or be default settings of the wearable electronic device; the preferred preset first threshold may be 3 kilometers, and the preferred preset second threshold may be 5 kilometers.
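By way of illustration and not limitation, the following Python sketch shows how the threshold logic described above might be organized, using the preferred values of 3 kilometers and 5 kilometers from this embodiment; the use of the haversine formula to compute the linear distance between two latitude/longitude positions is an assumption of this sketch.

import math

PRESET_FIRST_THRESHOLD_KM = 3.0   # preferred first threshold in this embodiment
PRESET_SECOND_THRESHOLD_KM = 5.0  # preferred second threshold in this embodiment

def linear_distance_km(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance between the current and target positions (haversine)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_target_distance(current, target) -> str:
    """Map the distance between current and target positions to an action."""
    d = linear_distance_km(*current, *target)
    if d < PRESET_FIRST_THRESHOLD_KM:
        return "ok"                    # target has not moved far away
    if d <= PRESET_SECOND_THRESHOLD_KM:
        return "prompt_risk_of_loss"   # show "the target is far away" prompt
    return "replan_and_rescan"         # beyond detection range: plan search route, keep scanning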
Step 305, determining a target route based on the current position information and the target position information.
Specifically, the wearable electronic device can plan a target route for the user after acquiring the current position and the target position, so that the user can be guided to quickly track the target object.
As another embodiment of the present application, determining a target route based on current location information and target location information includes:
acquiring a historical travelling route;
matching the current position information and the target position information with the historical travelling route;
and if the historical travel route comprises the current position information and the target position information, taking the route from the current position information to the target position information in the historical travel route as the target route.
Specifically, the wearable electronic device may retrieve the travel routes recorded while the user has been using the device, for example, but not limited to, by obtaining historical travel routes from a map application program of the wearable electronic device; it then matches the obtained historical travel routes with the current position and the target position and detects whether a historical travel route includes both. If it is detected that a historical travel route includes the current position information and the target position information, the route from the current position information to the target position information within that historical travel route is taken as the target route, so that the user can follow the target route and quickly track the target object.
It should be noted that, in this embodiment, matching is not limited to searching a single historical travel route against the current position and the target position; matching a plurality of travel routes against the current position and the target position can increase the matching success rate, as shown in the sketch below. For example, if route A and route B among the historical travel routes are matched with the current position and the target position, and it is detected that route A includes the current position and route B includes the target position, then route A and route B may be combined, with the current position as the starting point and the target position as the end point, to obtain the target route.
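By way of illustration and not limitation, the following Python sketch shows one way the historical-route matching described above could be implemented, assuming each historical travel route is stored as an ordered list of (latitude, longitude) waypoints; the waypoint tolerance and the helper names are assumptions of this sketch.

from typing import List, Optional, Tuple

Point = Tuple[float, float]  # (latitude, longitude)

def _index_of(route: List[Point], point: Point, tol: float = 1e-4) -> Optional[int]:
    """Index of the first waypoint close enough to `point`, or None if the route misses it."""
    for i, p in enumerate(route):
        if abs(p[0] - point[0]) <= tol and abs(p[1] - point[1]) <= tol:
            return i
    return None

def target_route_from_history(history: List[List[Point]],
                              current: Point,
                              target: Point) -> Optional[List[Point]]:
    """Return the segment of the historical routes running from `current` to `target`."""
    # Case 1: a single historical route contains both positions.
    for route in history:
        i, j = _index_of(route, current), _index_of(route, target)
        if i is not None and j is not None:
            return route[i:j + 1] if i <= j else list(reversed(route[j:i + 1]))
    # Case 2: combine two routes, e.g. route A holds the current position
    # and route B holds the target position.
    for a in history:
        i = _index_of(a, current)
        if i is None:
            continue
        for b in history:
            j = _index_of(b, target)
            if b is not a and j is not None:
                return a[i:] + b[:j + 1]
    return None  # no match: fall back to ordinary route planning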
And step 306, displaying the target route.
In particular, upon determining the target route, the wearable electronic device may display the target route to the user in the form of navigation guidance. For example, taking AR glasses as the wearable electronic device, after the AR glasses determine the target route, the current traveling direction may be displayed alongside the target thermal image in the lenses, for example a text prompt of "go straight for 200 meters" together with a corresponding straight-ahead arrow, and corresponding voice prompt information such as "please go straight for 200 meters" may also be generated for the user, so that the user can quickly track the target object.
Referring to fig. 4, fig. 4 is a schematic structural diagram illustrating an object tracking device 400 according to an embodiment of the present disclosure, where the object tracking device 400 at least includes a first processing module 401, a second processing module 402, and a first display module 403. Wherein:
the first processing module 401 is configured to obtain a target contour feature and store the target contour feature;
a second processing module 402, configured to acquire a radiation signal by using an infrared sensor, and generate a target thermal image corresponding to the target contour feature based on the radiation signal;
the first display module 403 is configured to display the target thermal image and target position information corresponding to the target thermal image.
In this embodiment, the target contour features are acquired and stored, the infrared sensor is used to acquire radiation signals and generate a target thermal image corresponding to the target contour features based on the radiation signals, and the target thermal image is then displayed to the user. No worn positioning device is needed: the thermal image corresponding to the contour of a child or an elderly person is collected and generated by the infrared sensor, so that the child or elderly person can be tracked and located accurately, and the safety against the child or elderly person becoming lost is improved.
In some possible embodiments, the second processing module 402 includes:
a first processing unit for generating an initial thermal image based on the radiation signal;
a second processing unit for determining an initial profile feature from the initial thermal image;
and the third processing unit is used for matching the target contour features with the initial contour features to obtain a target thermal image corresponding to the target contour features.
In some possible embodiments, the third processing unit is specifically configured to:
matching the target contour features with the initial contour features according to the similarity;
and taking the initial thermal image corresponding to the initial contour features with the similarity greater than or equal to the preset similarity as the target thermal image.
In some possible embodiments, the apparatus further comprises:
the acquisition module is used for acquiring current position information;
the third processing module is used for determining a target route based on the current position information and the target position information;
and the second display module is used for displaying the target route.
In some possible embodiments, the third processing module comprises:
an acquisition unit configured to acquire a historical travel route;
the fourth processing unit is used for matching the current position information and the target position information with the historical travelling route;
and the fifth processing unit is used for taking a route from the current position information to the target position information in the historical travelling route as the target route if the historical travelling route comprises the current position information and the target position information.
In some possible embodiments, after the obtaining unit, the method further includes:
the third display module is used for displaying prompt information when the current position information and the target position information meet the preset route distance; the prompt message is used for representing that the target is at risk of being lost.
Referring to fig. 5, fig. 5 is a schematic structural diagram illustrating a wearable electronic device according to an embodiment of the present disclosure.
As shown in fig. 5, the wearable electronic device 500 may include: at least one processor 501, at least one network interface 504, a user interface 503, memory 505, infrared thermal sensor 506, and at least one communication bus 502.
The communication bus 502 can be used for realizing the connection communication of the above components.
The user interface 503 may include keys, and the optional user interface may also include a standard wired interface or a wireless interface.
The network interface 504 may optionally include a bluetooth module, an NFC module, a Wi-Fi module, or the like.
Processor 501 may include one or more processing cores. The processor 501 connects the various parts of the entire electronic device 500 using various interfaces and lines, and performs the various functions of the electronic device 500 and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 505 and invoking data stored in the memory 505. Optionally, the processor 501 may be implemented in at least one hardware form of DSP, FPGA and PLA. The processor 501 may integrate one or a combination of a CPU, a GPU, a modem and the like, wherein the CPU mainly handles the operating system, the user interface, application programs and the like; the GPU is used for rendering and drawing the content to be displayed on the display screen; and the modem is used to handle wireless communications. It is understood that the modem may also not be integrated into the processor 501 but be implemented by a separate chip.
The memory 505 may include a RAM or a ROM. Optionally, the memory 505 includes a non-transitory computer readable medium. The memory 505 may be used to store instructions, programs, code sets, or instruction sets. The memory 505 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like; the storage data area may store data and the like referred to in the above respective method embodiments. The memory 505 may alternatively be at least one memory device located remotely from the processor 501. As shown in fig. 5, memory 505, which is one type of computer storage medium, may include an operating system, a network communication module, a user interface module, and an object tracking application.
In particular, the processor 501 may be configured to invoke an object tracking application stored in the memory 505 and specifically perform the following operations:
acquiring target contour features and storing the target contour features;
acquiring radiation signals by using an infrared sensor, and generating a target thermal image corresponding to the target contour characteristics based on the radiation signals;
and displaying the target thermal image and target position information corresponding to the target thermal image.
In some possible embodiments, the processor 501 collects the radiation signal using an infrared sensor, and performs:
collecting radiation signals within a preset range by using an infrared sensor;
generating a target thermal image corresponding to the target profile features based on the radiation signals, comprising:
generating an initial thermal image based on the radiation signal;
determining an initial profile feature from the initial thermal image;
and matching the target contour features with the initial contour features to obtain a target thermal image corresponding to the target contour features.
In some possible embodiments, the processor 501 matches the target contour feature with the initial contour feature to obtain a target thermal image corresponding to the target contour feature, and performs:
matching the target contour features with the initial contour features according to the similarity;
and taking the initial thermal image corresponding to the initial contour features with the similarity greater than or equal to the preset similarity as the target thermal image.
In some possible embodiments, processor 501 further performs:
acquiring current position information;
determining a target route based on the current position information and the target position information;
and displaying the target route.
In some possible embodiments, the processor 501 determines the target route based on the current location information and the target location information, and performs:
acquiring a historical travelling route;
matching the current position information and the target position information with the historical travelling route;
and if the historical travel route comprises the current position information and the target position information, taking the route from the current position information to the target position information in the historical travel route as the target route.
In some possible embodiments, after the processor 501 obtains the current location information, it further performs:
when detecting that the current position information and the target position information meet the preset route distance, displaying prompt information; the prompt message is used for representing that the target is at risk of being lost.
Embodiments of the present application also provide a computer-readable storage medium, which stores instructions that, when executed on a computer or a processor, cause the computer or the processor to perform one or more of the steps in the embodiments shown in fig. 2 or fig. 3. The respective constituent modules of the above-described object tracking apparatus, if implemented in the form of software functional units and sold or used as independent products, may be stored in the computer-readable storage medium.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in or transmitted over a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., Digital Versatile Disc (DVD)), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. And the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks. The technical features in the present examples and embodiments may be arbitrarily combined without conflict.
The above-described embodiments are merely preferred embodiments of the present application, and are not intended to limit the scope of the present application, and various modifications and improvements made to the technical solutions of the present application by those skilled in the art without departing from the design spirit of the present application should fall within the protection scope defined by the claims of the present application.

Claims (10)

1. An object tracking method is applied to a wearable electronic device, wherein the wearable electronic device is provided with an infrared thermal sensor, and the method comprises the following steps:
acquiring target contour features and storing the target contour features;
acquiring radiation signals by using the infrared sensor, and generating a target thermal image corresponding to the target contour features based on the radiation signals;
displaying the target thermal image and target location information corresponding to the target thermal image.
2. The method of claim 1, wherein said collecting a radiation signal with said infrared sensor comprises:
collecting radiation signals within a preset range by using the infrared sensor;
the generating of the target thermal image corresponding to the target contour feature based on the radiation signal comprises:
generating an initial thermal image based on the radiation signal;
determining an initial profile feature from the initial thermal image;
and matching the target contour features with the initial contour features to obtain a target thermal image corresponding to the target contour features.
3. The method of claim 2, wherein said matching the target contour feature to the initial contour feature results in a target thermal image corresponding to the target contour feature, comprising:
matching the target contour features with the initial contour features according to similarity;
and taking the initial thermal image corresponding to the initial contour features with the similarity greater than or equal to the preset similarity as a target thermal image.
4. The method of claim 1, further comprising:
acquiring current position information;
determining a target route based on the current location information and the target location information;
and displaying the target route.
5. The method of claim 4, wherein determining a target route based on the current location information and the target location information comprises:
acquiring a historical travelling route;
matching the current location information and the target location information with the historical travel route;
and if the historical travelling route comprises the current position information and the target position information, taking a route from the current position information to the target position information in the historical travelling route as the target route.
6. The method of claim 4, wherein after obtaining the current location information, further comprising:
when detecting that the current position information and the target position information meet a preset route distance, displaying prompt information; the prompt information is used for representing the risk of the target being lost.
7. An object tracking device, wherein the device is applied to a wearable device provided with an infrared thermal sensor, comprising:
the first processing module is used for acquiring target contour characteristics and storing the target contour characteristics;
the second processing module is used for acquiring radiation signals by using the infrared sensor and generating a target thermal image corresponding to the target contour feature based on the radiation signals;
a first display module to display the target thermal image and target location information corresponding to the target thermal image.
8. An object tracking device comprising a processor and a memory;
the processor is connected with the memory;
the memory for storing executable program code;
the processor executes a program corresponding to the executable program code by reading the executable program code stored in the memory for performing the object tracking method according to any one of claims 1 to 6.
9. A wearable electronic device, comprising an infrared thermal sensor and the object tracking device of claim 8.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out an object tracking method according to any one of claims 1 to 6.
CN202110264232.8A 2021-03-11 2021-03-11 Object tracking method and device, storage medium and wearable electronic equipment Pending CN113129334A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110264232.8A CN113129334A (en) 2021-03-11 2021-03-11 Object tracking method and device, storage medium and wearable electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110264232.8A CN113129334A (en) 2021-03-11 2021-03-11 Object tracking method and device, storage medium and wearable electronic equipment

Publications (1)

Publication Number Publication Date
CN113129334A true CN113129334A (en) 2021-07-16

Family

ID=76773212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110264232.8A Pending CN113129334A (en) 2021-03-11 2021-03-11 Object tracking method and device, storage medium and wearable electronic equipment

Country Status (1)

Country Link
CN (1) CN113129334A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114373216A (en) * 2021-12-07 2022-04-19 图湃(北京)医疗科技有限公司 Eye movement tracking method, device, equipment and storage medium for anterior segment OCTA
CN114550222A (en) * 2022-04-24 2022-05-27 深圳市赛特标识牌设计制作有限公司 Dynamic hotel identification guidance system based on Internet of things

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103808326A (en) * 2012-11-07 2014-05-21 腾讯科技(深圳)有限公司 Navigation method and navigation system
US20180143006A1 (en) * 2016-11-18 2018-05-24 Child Mind Institute, Inc. Thermal sensor position detecting device
CN108806148A (en) * 2017-04-26 2018-11-13 佛山市顺德区美的电热电器制造有限公司 A kind of security alarm method, infra-red thermal imaging sensor and server
CN109117819A (en) * 2018-08-30 2019-01-01 Oppo广东移动通信有限公司 Object recognition methods, device, storage medium and wearable device
CN109784177A (en) * 2018-12-14 2019-05-21 深圳壹账通智能科技有限公司 Missing crew's method for rapidly positioning, device and medium based on images match
CN109977833A (en) * 2019-03-19 2019-07-05 网易(杭州)网络有限公司 Object tracking method, object tracking device, storage medium and electronic equipment
CN110070695A (en) * 2019-04-24 2019-07-30 河南书网教育科技股份有限公司 A kind of method for early warning, device, server and communication system
CN111601254A (en) * 2020-04-16 2020-08-28 深圳市优必选科技股份有限公司 Target tracking method and device, storage medium and intelligent equipment
CN111598062A (en) * 2020-07-21 2020-08-28 深圳市天和荣科技有限公司 Pet identification method, system, readable storage medium and computer equipment
CN112217992A (en) * 2020-09-29 2021-01-12 Oppo(重庆)智能科技有限公司 Image blurring method, image blurring device, mobile terminal, and storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103808326A (en) * 2012-11-07 2014-05-21 腾讯科技(深圳)有限公司 Navigation method and navigation system
US20180143006A1 (en) * 2016-11-18 2018-05-24 Child Mind Institute, Inc. Thermal sensor position detecting device
CN108806148A (en) * 2017-04-26 2018-11-13 佛山市顺德区美的电热电器制造有限公司 A kind of security alarm method, infra-red thermal imaging sensor and server
CN109117819A (en) * 2018-08-30 2019-01-01 Oppo广东移动通信有限公司 Object recognition methods, device, storage medium and wearable device
CN109784177A (en) * 2018-12-14 2019-05-21 深圳壹账通智能科技有限公司 Missing crew's method for rapidly positioning, device and medium based on images match
CN109977833A (en) * 2019-03-19 2019-07-05 网易(杭州)网络有限公司 Object tracking method, object tracking device, storage medium and electronic equipment
CN110070695A (en) * 2019-04-24 2019-07-30 河南书网教育科技股份有限公司 A kind of method for early warning, device, server and communication system
CN111601254A (en) * 2020-04-16 2020-08-28 深圳市优必选科技股份有限公司 Target tracking method and device, storage medium and intelligent equipment
CN111598062A (en) * 2020-07-21 2020-08-28 深圳市天和荣科技有限公司 Pet identification method, system, readable storage medium and computer equipment
CN112217992A (en) * 2020-09-29 2021-01-12 Oppo(重庆)智能科技有限公司 Image blurring method, image blurring device, mobile terminal, and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
吕云翔 et al. (eds.): "Python Deep Learning" (《Python深度学习》), 31 October 2020, Beijing: China Machine Press, pages 119-120 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114373216A (en) * 2021-12-07 2022-04-19 图湃(北京)医疗科技有限公司 Eye movement tracking method, device, equipment and storage medium for anterior segment OCTA
CN114550222A (en) * 2022-04-24 2022-05-27 深圳市赛特标识牌设计制作有限公司 Dynamic hotel identification guidance system based on Internet of things
CN114550222B (en) * 2022-04-24 2022-07-08 深圳市赛特标识牌设计制作有限公司 Dynamic hotel mark guidance system based on Internet of things

Similar Documents

Publication Publication Date Title
US9003030B2 (en) Detecting relative crowd density via client devices
CN107871114B (en) Method, device and system for pushing tracking information of target person
CN106662458B (en) Wearable sensor data for improving map and navigation data
CN107148636B (en) Navigation system, client terminal device, control method, and storage medium
US20140176348A1 (en) Location based parking management system
JP2019021019A (en) Human flow analysis method, human flow analysis device and human flow analysis system
US20150186426A1 (en) Searching information using smart glasses
US9971402B2 (en) Information processing system, mobile terminal, server apparatus, method for processing information, and non-transitory computer readable storage medium
CN110717918B (en) Pedestrian detection method and device
TW201944324A (en) Guidance system
JP2017228115A (en) Method for providing information, program for causing computer to execute the method, and device for providing information
JP7479496B2 System and method for identifying obstacles and hazards along a route
US20160298978A1 (en) WiFi-Based Indoor Positioning and Navigation as a New Mode in Multimodal Transit Applications
US11181381B2 (en) Portable pedestrian navigation system
CN109341693B (en) Entertainment place navigation method and system based on big data and deep learning
CN113129334A (en) Object tracking method and device, storage medium and wearable electronic equipment
CN114677848A (en) Perception early warning system, method, device and computer program product
CN114708545A (en) Image-based object detection method, device, equipment and storage medium
JP2021039485A (en) Collection method
JP2020193956A5 (en)
JP2020193956A (en) On-vehicle device, driving support method, and driving support system
CN113724454B (en) Interaction method of mobile equipment, device and storage medium
KR20110023255A (en) Apparatus and method for providing friend meeting serivce in portable terminal
KR102366773B1 (en) Electronic business card exchanging system using mobile terminal and method thereof
JP2021071967A (en) Response support system, method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination