CN111722240B - Electronic equipment, object tracking method and device - Google Patents

Electronic equipment, object tracking method and device

Info

Publication number
CN111722240B
CN111722240B CN202010603898.7A
Authority
CN
China
Prior art keywords
image
tof
target object
module
tof module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010603898.7A
Other languages
Chinese (zh)
Other versions
CN111722240A (en)
Inventor
成通
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202010603898.7A priority Critical patent/CN111722240B/en
Publication of CN111722240A publication Critical patent/CN111722240A/en
Application granted granted Critical
Publication of CN111722240B publication Critical patent/CN111722240B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/66Tracking systems using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/4865Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S7/4866Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak by fitting a model or function to the received signal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an electronic device, an object tracking method, and an object tracking apparatus, belongs to the technical field of communication, and solves the problems that the space in which existing electronic devices can acquire TOF depth images is limited and the acquisition effect is poor. The electronic device comprises a device body, a time-of-flight (TOF) module, a movable structure, and a TOF movable cavity. A TOF movable cavity is provided in the electronic device, the movable structure and the TOF module are arranged in the TOF movable cavity, and the TOF module is connected to the TOF movable cavity through the movable structure, so that the movable structure can drive the TOF module to move within the cavity. The acquisition region of the TOF depth image can therefore be adjusted according to the image-tracking situation. Compared with the conventional approach of fixing the TOF module directly in the electronic device, depth images can be acquired over a larger space, and the effect of real-time tracking image acquisition is improved.

Description

Electronic equipment, object tracking method and device
Technical Field
The application belongs to the technical field of communication, and particularly relates to electronic equipment, an object tracking method and an object tracking device.
Background
With the development of intelligent electronic devices, 3D sensing technology is becoming part of the standard hardware of such devices. Among these technologies, TOF (time-of-flight) is very widely used, for example in physical ranging, 3D modeling, and improving photographing performance. One important application is real-time tracking of the state of the human body.
In the prior art, the FOV (field of view) of a TOF module in an electronic device is generally small, so the space in which such a device can collect depth information is limited. In particular, when such devices are used to track a target object closely in real time, the target object and the device must stay within a small angular range, which greatly limits the use of such devices for object tracking and degrades the user experience.
Disclosure of Invention
The embodiments of the present application aim to provide an electronic device, an object tracking method, and an object tracking apparatus, which can solve the problems that the space in which existing electronic devices acquire TOF depth images is limited and the acquisition effect is poor.
In order to solve the technical problems, the application is realized as follows:
In a first aspect, an embodiment of the present application provides an electronic device, including a device body, a time-of-flight (TOF) module, a movable structure, and a TOF movable cavity; wherein:
the TOF movable cavity is arranged in the equipment main body;
the movable structure is arranged in the TOF movable cavity;
the TOF module is arranged in the TOF movable cavity and connected with the movable structure, and the movable structure drives the TOF module to move in the TOF movable cavity;
The TOF module comprises a TOF receiving assembly and a TOF transmitting assembly; the TOF transmitting assembly is configured to emit light toward a target object, and the TOF receiving assembly is configured to receive the emitted light reflected by the target object.
In a second aspect, an embodiment of the present application provides an object tracking method, which is applied to the electronic device in the first aspect, and includes:
acquiring a first image;
determining position information of the target object in the first image based on the first image;
based on the position information, if the image of the target object in the first image is not located in a first preset area, controlling the TOF module to move so that the image of the target object is located in the first preset area; and tracking the target object by utilizing the moved TOF module.
In a third aspect, an embodiment of the present application provides an object tracking apparatus, including:
the acquisition module is used for acquiring a first image;
a determining module, configured to determine, based on the first image, location information of the target object in the first image;
the control module is used for controlling the TOF module to move if the image of the target object in the first image is not located in a first preset area based on the position information so that the image of the target object is located in the first preset area; and tracking the target object by utilizing the moved TOF module.
In a fourth aspect, embodiments of the present application provide an electronic device comprising a processor, a memory, and a program or instruction stored on the memory and executable on the processor, the program or instruction when executed by the processor implementing the steps of the object tracking method according to the second aspect.
In a fifth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which, when executed by a processor, implement the steps of the object tracking method according to the second aspect.
In a sixth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the object tracking method according to the second aspect.
In the embodiment of the present application, a TOF movable cavity is provided in the electronic device, a movable structure and a TOF module are arranged in the TOF movable cavity, and the TOF module is connected to the TOF movable cavity through the movable structure, so that the movable structure can drive the TOF module to move within the cavity. The acquisition region of the TOF depth image can therefore be adjusted according to the image-tracking situation. Compared with the conventional approach of fixing the TOF module directly in the electronic device, depth images can be acquired over a larger space, and the effect of real-time tracking image acquisition is improved.
Further, in the object tracking method applied to the electronic device, a first image is acquired and the position information of the target object in the first image is determined from it. When that position information indicates that the image of the target object is not located in the first preset area, the TOF module is controlled to move so that the image of the target object falls within the first preset area, and the moved TOF module is used to track the target object. With this scheme, the movable TOF module can adjust the acquisition region of the TOF depth image according to the image-tracking situation; compared with the range of depth images that a TOF module fixed in the electronic device can acquire, depth images can be acquired over a larger space, and the effect of real-time tracking image acquisition is improved.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a TOF module according to an embodiment of the present application.
Fig. 3 is a cross-sectional view of a handset in one embodiment of the application.
Fig. 4 is a front view of a mobile phone in one embodiment of the present application.
Fig. 5 is a schematic diagram of a TOF module receiving and transmitting infrared light according to an embodiment of the present application.
Fig. 6 is a schematic diagram of a TOF module receiving and transmitting infrared light according to another embodiment of the present application.
Fig. 7 is a schematic flow chart diagram of an object tracking method in one embodiment of the present application.
FIG. 8 is a schematic representation of a zoning scheme in one embodiment of the present application.
Fig. 9 is a schematic flow chart diagram of an object tracking method in another embodiment of the present application.
Fig. 10 is a schematic structural diagram of an object tracking device in an embodiment of the present application.
Fig. 11 is a schematic hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The terms "first", "second", and the like in the description and in the claims are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the objects so used may be interchanged where appropriate, such that embodiments of the present application may be implemented in sequences other than those illustrated or described herein. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally means that the associated objects are in an "or" relationship.
The electronic device provided by the embodiment of the application is described in detail below through specific embodiments and application scenes thereof with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of an electronic device in an embodiment of the present application. As shown in fig. 1, the electronic device includes a device body 110, a TOF module 120, a movable structure 121, and a TOF movable cavity 122; wherein:
the TOF movable cavity 122 is disposed in the device body 110, the movable structure 121 is disposed in the TOF movable cavity 122, and the TOF module 120 is disposed in the TOF movable cavity 122 and connected to the movable structure 121, which drives the TOF module 120 to move within the TOF movable cavity 122. As shown in fig. 2, the TOF module 120 includes a TOF receiving assembly 1201 and a TOF transmitting assembly 1202; the TOF transmitting assembly 1202 is configured to emit light toward a target object, and the TOF receiving assembly 1201 is configured to receive the emitted light reflected by the target object.
The light emitted from the TOF module to the target object may be infrared light.
In the embodiment of the present application, a TOF movable cavity is provided in the electronic device, a movable structure and a TOF module are arranged in the TOF movable cavity, and the TOF module is connected to the TOF movable cavity through the movable structure, so that the movable structure can drive the TOF module to move within the cavity. The acquisition region of the TOF depth image can therefore be adjusted according to the image-tracking situation. Compared with the conventional approach of fixing the TOF module directly in the electronic device, depth images can be acquired over a larger space, and the effect of real-time tracking image acquisition is improved.
Because the TOF module operates with an infrared-band light source and is little affected by ambient light, it can support applications in a variety of scenarios, such as physical ranging, 3D modeling, improving photographing performance, and real-time target tracking. Accordingly, the electronic device also covers a variety of devices, such as mobile phones, computers, cameras, and smart measurement devices. The structure of the electronic device provided in the above embodiment is described below by taking as an example a mobile phone whose TOF module operates with an infrared-band light source and is disposed near the screen side of the phone.
In one embodiment, the electronic device is a mobile phone. Fig. 3 is a cross-sectional view of a mobile phone in one embodiment of the application. Fig. 4 is a front view of the mobile phone in one embodiment of the present application, with the TOF movable cavity 322 not shown. As shown in figs. 3-4, the mobile phone includes a phone body 310, a TOF module 320, a movable structure 321, and a TOF movable cavity 322; wherein:
the TOF movable cavity 322 is arranged in the phone body 310, the movable structure 321 is arranged in the TOF movable cavity 322, and the TOF module 320 is arranged in the TOF movable cavity 322 and connected with the movable structure 321, which drives the TOF module 320 to move within the TOF movable cavity 322. The TOF module 320 includes a TOF receiving assembly 3201 and a TOF transmitting assembly 3202; the TOF transmitting assembly 3202 is configured to emit infrared light toward a target object, and the TOF receiving assembly 3201 is configured to receive the infrared light reflected by the target object.
When the mobile phone tracks a target object with the TOF module 320, the TOF module 320 first acquires a TOF depth image of the target object. When the target object deviates noticeably in the TOF depth image, the movable structure 321 is controlled to move so that it drives the TOF module 320 within the TOF movable cavity. The acquisition region of the TOF depth image is thus adjusted according to the image-tracking situation, depth images can be acquired over a larger space, and the real-time tracking image acquisition effect is improved.
As shown in fig. 5, the region F bounded by the dashed lines is the region in which the TOF module can acquire a depth image when it has not moved, and the region F' bounded by the solid lines is the region it can cover after moving. In the state shown, the TOF module can acquire depth images anywhere between dashed line 1 and solid line 2. Compared with a module that, fixed in the phone, can only cover region F (fig. 6 schematically shows this region F alone), depth image acquisition over a larger space is achieved.
It should be noted that, when the mobile phone shown in fig. 5 tracks the target object, the region in which the TOF module can acquire depth images is not limited to the region between dashed line 1 and solid line 2 described above. Driven by the movable structure, the TOF module can move in several directions, so the region in which depth images can be acquired is wider still. For example, the TOF module may acquire depth images over a 180° spatial region when driven by the movable structure. This embodiment does not specifically limit the region in which the TOF module can acquire depth images.
In the embodiment of the present application, a TOF movable cavity is provided in the mobile phone, a movable structure and a TOF module are arranged in the TOF movable cavity, and the TOF module is connected to the TOF movable cavity through the movable structure, so that the movable structure can drive the TOF module to move within the cavity. The acquisition region of the TOF depth image can therefore be adjusted according to the image-tracking situation; compared with the conventional approach of fixing the TOF module directly in the phone, depth images can be acquired over a larger space, and the effect of real-time tracking image acquisition is improved.
As shown in fig. 3, the TOF receiving assembly 3201 and the TOF transmitting assembly 3202 are fixedly connected by a connecting piece 3203. Optionally, the connector 3203 is a substrate or a bracket. The TOF receiving assembly 3201 and the TOF transmitting assembly 3202 are fixedly arranged on a substrate, or the TOF receiving assembly 3201 and the TOF transmitting assembly 3202 are fixedly arranged on a bracket. Alternatively, the TOF receiving assembly 3201 and the TOF transmitting assembly 3202 may be fixedly disposed at two ends of one bracket.
Because the TOF receiving assembly and the TOF transmitting assembly are fixedly connected through the connecting piece, the moving directions of the assemblies in the TOF module stay consistent, and no software is needed to calibrate the assemblies in real time. This reduces the complexity of the phone and ensures consistent performance across the assemblies of the TOF module.
In one embodiment, the movable structure 321 is a rotatable structure, and the TOF module 320 is rotatably connected to the rotatable structure. Alternatively, the rotatable structure is a rotatable cylinder structure or a sphere structure. Alternatively, the rotatable structure may be rotated 180 ° in multiple directions.
Wherein the rotation angle of the rotatable structure may be accurate to 0.01 °.
In this embodiment, the rotatable structure connects the TOF module and the TOF movable cavity. Because the rotatable structure has good rotation performance and, compared with a merely movable structure, allows the TOF module a larger rotation angle, depth images can be acquired over a larger space.
Fig. 7 is a schematic flow chart diagram of an object tracking method in one embodiment of the present application. The method is applicable to any of the electronic devices shown in figs. 1-4. The method of fig. 7 may include:
s702, acquiring a first image.
Wherein the first image comprises image characteristic information of the target object to be tracked.
In one embodiment, the target object to be tracked may be determined by an AI (artificial intelligence) algorithm or specified by the user during the acquisition of the first image. The image characteristic information of the target object may include the color characteristics, texture characteristics, shape characteristics, and spatial-relationship characteristics of the target object's image in the first image, and the image of the target object can be determined from this information.
Determining the target object with an AI algorithm suits scenarios in which the electronic device performs an operation for which the object to be tracked is fixed in advance. For example, when some platforms perform payment verification, the tracked target object is set in advance to be the pupil according to product requirements; the electronic device can then determine via the AI algorithm that the target object in the first image is the pupil. The image characteristic information of the pupil includes the color, texture, and shape characteristics of the pupil's image and its spatial-relationship characteristics in the first image.
Having the user specify the target object during acquisition of the first image suits scenarios in which the electronic device is used for photographing; the user's specifying operation may be a focusing operation.
For example, when the user performs a focusing operation on the letter "F" while photographing using the electronic apparatus, it may be determined that the target object to be tracked is the letter "F". The image feature information of the letter "F" includes shape feature information of an image of the letter "F".
S704, determining location information of the target object in the first image based on the first image.
The position information of the target object in the first image can be determined according to the image characteristic information of the target object in the first image.
For example, the image characteristic information of the target object is spatial relationship characteristic information of the image of the target object in the first image. According to the spatial relationship characteristic information of the image of the target object in the first image, the position information of the target object in the first image can be determined.
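As an illustration of determining position information from such characteristic information, the sketch below computes the centroid of a hypothetical binary feature mask. The patent does not fix any representation for the matched features, so the mask format, function name, and returned tuple are all assumptions:

```python
def target_center(mask):
    """Centroid (x, y) of a binary feature mask given as a list of rows.

    The mask is a hypothetical stand-in for the pixels in the first
    image whose characteristics match the target object; the centroid
    then serves as the target's position information.
    """
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None  # target not found in the first image
    return (xs / n, ys / n)
```

For a mask with matched pixels at (1, 1) and (3, 1), the centroid is (2.0, 1.0), which could then be tested against the first preset area.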
S706, based on the position information, if the image of the target object in the first image is not located in the first preset area, controlling the TOF module to move so that the image of the target object is located in the first preset area; and tracking the target object by utilizing the moved TOF module.
If the image of the target object is located in the first preset area, the TOF module is not required to be controlled to move.
In one embodiment, the first image may be divided into at least 2 areas based on the distance of each area from the center position of the first image. For example, the first image may be divided into the following 2 areas: an edge area and a safety area. The first preset area may be the safety area.
For another example, the first image may be divided into 3 regions as shown in fig. 8 based on the distance between each region and the center position of the first image: an edge area, a guard area and a security area. The first preset area may be a safety area.
When the TOF module is controlled to move so that the image of the target object lies within the first preset area: if the first image is divided into the 2 areas above (an edge area and a safety area) based on the distance between each area and the center of the first image, and the first preset area is the safety area, then the TOF module is controlled to move when the image of the target object is determined to lie in the edge area, so that the image of the target object lies in the safety area.
If the first image is divided into the 3 areas shown in fig. 8 (an edge area, a guard area, and a safety area) based on the distance between each area and the center of the first image, and the first preset area is the safety area, then the TOF module is controlled to move when the image of the target object is determined to lie in the edge area, so that the image lies in the safety area; and likewise when the image of the target object is determined to lie in the guard area.
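The region test above can be sketched as follows. The concentric safety/guard/edge layout of fig. 8 is modeled by the normalized offset of a position from the image center; the 0.5 and 0.8 thresholds are illustrative placeholders, since the patent does not specify where the area boundaries lie:

```python
def classify_region(pos, size, safe=0.5, guard=0.8):
    """Classify a pixel position into the concentric areas of fig. 8.

    pos  : (x, y) position of the target's image
    size : (width, height) of the first image
    The offset from the center is normalized per axis so that 0 is the
    center and 1 is the image border; the safe/guard thresholds are
    assumptions, not values from the patent.
    """
    cx, cy = size[0] / 2, size[1] / 2
    # largest per-axis offset from the center, normalized to [0, 1]
    d = max(abs(pos[0] - cx) / cx, abs(pos[1] - cy) / cy)
    if d <= safe:
        return "safety"
    if d <= guard:
        return "guard"
    return "edge"
```

A position in the guard or edge area would trigger the module movement described above; a position in the safety area requires no movement.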
In the embodiment of the present application, the first image is acquired, and the position information of the target object in the first image is determined from it; when that position information indicates that the image of the target object is not located in the first preset area, the TOF module is controlled to move so that the image falls within the first preset area, and the moved TOF module is used to track the target object. With this scheme, the movable TOF module can adjust the acquisition region of the TOF depth image according to the image-tracking situation; compared with the range of depth images that a TOF module fixed in the electronic device can acquire, depth images can be acquired over a larger space, and the effect of real-time tracking image acquisition is improved.
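One iteration of the S702–S706 flow can be sketched as below. The four callables are hypothetical stand-ins for the camera, the feature-matching step, the preset-area test, and the movable-structure driver; the patent does not specify any of them as APIs:

```python
def track_step(acquire_image, locate_target, in_preset_region, move_module):
    """One pass of the object tracking method of fig. 7.

    acquire_image    : S702 - returns the first image
    locate_target    : S704 - returns the target's position in the
                       image, or None if the target is not found
    in_preset_region : S706 test - True if the target's image already
                       lies in the first preset area
    move_module      : S706 action - drives the movable structure so
                       the target falls back into the first preset area
    """
    image = acquire_image()
    pos = locate_target(image)
    if pos is not None and not in_preset_region(pos):
        move_module(pos)
    return pos
```

In a real device this step would run repeatedly, with the moved TOF module supplying the next depth image for tracking.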
In one embodiment, the first image includes image depth information. The TOF module may be controlled to move according to the following steps A1-A3, so that the image of the target object is located in the first preset area.
A1, determining, according to the position information corresponding to the target object, a first distance between the image of the target object and the first preset area in the first image and the deviation direction of the image of the target object relative to the first preset area.
The position information corresponding to the target object includes a center position of the target object, an edge position of the target object (i.e., a position where an edge contour line of the target object is located), and positions of other portions in the target object.
In this step, the distance between the center position of the image of the target object in the first image and the center position of the first preset area may be taken as the first distance. Alternatively, the distance the image must move so that every edge position of the target object's image in the first image lies within the first preset area can be taken as the first distance.
Continuing the example above, the first image is divided into the 3 areas shown in fig. 8 based on the distance between each area and the center of the first image: an edge area, a guard area, and a safety area. Assume the first preset area is the safety area and the target object is the letter "F" rotated 90° clockwise and lying on a horizontal surface. As shown in fig. 8, the image of the target object "F" lies in the guard area of the first image, and from the position information corresponding to "F" one can determine the first distance Δx by which the image must move so that every position on its edge contour lies within the safety area.
In this embodiment, the deviation direction of the image of the target object in the first image relative to the first preset area may be determined directly from the position information corresponding to the target object. For example, positions are commonly represented as coordinates; from the coordinates, the positional relationship between the image of the target object and the first preset area can be determined, and hence the deviation direction of the image relative to that area.
A2, determining the movement information of the TOF module according to a first distance between the image of the target object in the first image and a first preset area, the deviation direction of the image of the target object relative to the first preset area and the image depth information.
Wherein the movement information includes a movement distance, a movement direction, and the like. The moving distance is positively correlated with the first distance, and the moving distance is negatively correlated with a depth value corresponding to the image depth information.
The depth value corresponding to the image depth information may be a gray value corresponding to each pixel point in the first image.
In one embodiment, the second distance between the target object and the TOF module may be determined according to the image depth information, and further, the movement information of the TOF module may be determined according to the first distance between the image of the target object and the first preset area in the first image, the deviation direction of the image of the target object relative to the first preset area, and the second distance.
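A minimal sketch of deriving the second distance from the image depth information follows. The linear mapping from gray value to metres, the range constant, and the use of a median over the target's bounding box are illustrative assumptions added here, not specifics from the patent:

```python
# Hedged sketch: estimate the second distance between the target object and
# the TOF module from image depth information. Each pixel's gray value is
# assumed to map linearly onto the module's range; max_range_m and the
# median aggregation are hypothetical choices.

def second_distance(depth_gray, box, max_range_m=4.0):
    """depth_gray: 2-D list of gray values in [0, 255];
    box: (x_min, y_min, x_max, y_max) of the target object's image.
    Returns an estimated distance in metres."""
    x_min, y_min, x_max, y_max = box
    values = [depth_gray[y][x]
              for y in range(y_min, y_max)
              for x in range(x_min, x_max)]
    values.sort()
    median_gray = values[len(values) // 2]  # robust to stray pixels
    return median_gray / 255.0 * max_range_m
```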
Wherein the distance of movement is inversely related to the second distance.
For example, if the first distance is Δx and the second distance is L, the movement amount Δα of the TOF module may be taken as the ratio of the first distance Δx to the second distance L.
It should be noted that, because the positional relationship between the TOF module and the target object is more complex in practical applications, the above is only an exemplary way of calculating the moving distance; the correlations between the moving distance and the first and second distances, however, remain unchanged.
In this embodiment, if the movable structure is a rotatable structure, the movement information may be rotation information including a rotation angle, a rotation direction, and the like.
The moving direction can be determined according to the deviation direction of the image of the target object relative to the preset area. For example, if the image of the target object deviates to the left relative to the preset area, the moving direction of the TOF module is to the left.
A3, controlling the TOF module to move according to the movement information.
In this embodiment, according to the position information corresponding to the target object, a first distance between the image of the target object in the first image and the first preset area, and a deviation direction of the image of the target object relative to the first preset area, are determined. According to the first distance, the deviation direction and the image depth information, the movement information of the TOF module is determined, and the movement of the TOF module is then controlled according to the movement information. The TOF module can thereby be moved accurately so that the image of the target object is located in the first preset area, which improves the movement efficiency of the TOF module and facilitates real-time tracking of the target object.
In one embodiment, after the TOF module is controlled to move, a prompt message is sent out when the position information meets a preset condition. The prompting information is used for prompting a user to adjust the position of the electronic equipment. The preset conditions comprise: after the moving times of the TOF module are controlled to reach a preset threshold, the image of the target object is still located outside a second preset area in the first image.
Continuing the above example, the first image may instead be divided into the following 2 areas based on the distance between each area and the center position of the first image: an edge area and a safety area. The second preset area is the safety area. After the TOF module is controlled to move, if the number of movements reaches a preset threshold and the image of the target object is still located outside the safety area, prompt information prompting the user to adjust the position of the electronic device is sent.
For another example, the first image may be divided into the 3 areas shown in fig. 8 based on the distance between each area and the center position of the first image: an edge area, a warning area and a safety area. The second preset area is the safety area or the warning area. After the TOF module is controlled to move, if the number of movements reaches the preset threshold and the image of the target object is still located outside the warning area, prompt information prompting the user to adjust the position of the electronic device is sent; likewise, if the number of movements reaches the preset threshold and the image of the target object is still located outside the safety area, such prompt information is sent.
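The decision described above — keep moving the module until its movement budget is exhausted, then ask the user to reposition the device — can be sketched as follows. All names and the default threshold are illustrative assumptions:

```python
# Sketch of the prompt condition: after the TOF module has been moved a
# preset number of times, if the target's image still lies outside the
# second preset area, prompt the user instead of moving the module again.

def should_prompt_user(move_count, in_second_area, threshold=3):
    return move_count >= threshold and not in_second_area

def track_step(move_count, in_first_area, in_second_area, threshold=3):
    """Return the next action: 'hold', 'move', or 'prompt'."""
    if in_first_area:
        return "hold"                                   # target well placed
    if should_prompt_user(move_count, in_second_area, threshold):
        return "prompt"                                 # module budget spent
    return "move"                                       # move the module again
```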
In this embodiment, the prompt information prompting the user to adjust the position of the electronic device may include a plurality of items of information, such as a moving direction, a moving distance and a moving angle of the electronic device. The specific prompt content depends on the specific situation in practical applications and is not limited herein.
In one embodiment, the moving direction, moving distance or moving angle of the electronic device may be determined according to the distance between the image of the target object in the first image and the first preset area, the deviation direction of the image of the target object relative to the first preset area, and so on. For example, if the first distance between the image of the target object and the first preset area in the first image is x, and the deviation direction of the image of the target object relative to the first preset area is to the left, the movement information of the electronic device may be determined as: move the distance ax to the left so that the image of the target object is located within the first preset area, where a is a coefficient representing the positive correlation between the moving distance and the first distance.
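A sketch of assembling such a prompt follows; the value of the coefficient a and the message wording are hypothetical, since the text leaves both to the practical application:

```python
# Hedged sketch: turn the first distance (pixels) and deviation direction
# into a human-readable repositioning prompt. The coefficient `a`
# (pixels -> centimetres) is a hypothetical, device-specific constant.

def prompt_message(first_dist_px, deviation, a=0.1):
    """deviation: 'left', 'right', 'up' or 'down'."""
    distance_cm = a * first_dist_px            # moving distance = a * x
    return f"Please shift the device {distance_cm:.0f} cm to the {deviation}"
```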
The prompt information may take various forms, such as a pop-up prompt box, or a dynamic indication (such as a dynamic arrow) on the current display interface of the electronic device.
In this embodiment, after the TOF module is controlled to move, if the number of movements reaches the preset threshold and the image of the target object is still located outside the second preset area, prompt information prompting the user to adjust the position of the electronic device is sent. This effectively avoids failure to track the target object and facilitates real-time tracking of the target object.
In addition, after the TOF module is controlled to move so that the image of the target object is located in the second preset area, the moved TOF module may be used to track the target object, that is, the steps in S702-S706 are executed again to achieve real-time tracking of the target object. Tracking of the target object stops when an operation indicating to stop tracking is received (for example, clicking, long-pressing or double-clicking a designated key on the electronic device, or clicking, long-pressing or double-clicking a virtual key on the electronic device).
Fig. 9 is a schematic flow chart diagram of an object tracking method in another embodiment of the present application. In this embodiment, the object tracking method is applied to the mobile phone as shown in fig. 3-4, and the method of fig. 9 may include:
S901, acquiring a first image.
Wherein the first image comprises image characteristic information of the target object to be tracked. The image characteristic information of the target object may include information such as color characteristics, texture characteristics, shape characteristics, spatial relationship characteristics in the first image, and the like of the image of the target object, and the image of the target object may be determined according to the image characteristic information. The determination manner of the target object to be tracked is consistent with the determination manner of the target object to be tracked in S702, and will not be described herein.
S902, determining location information of the target object in the first image based on the first image.
The determining manner of the location information is identical to the determining manner of the location information in S704, and will not be described herein.
S903, based on the position information, it is determined whether the image of the target object in the first image is located within the first preset area. If yes, executing S904; if not, S906 is performed.
The division manner of the first preset area is identical to the division manner of the first preset area in S706, and will not be described herein.
S904, controlling the TOF module to be fixed at the current position.
S905, a second image is acquired, and S902 is performed.
In this step, the position of the TOF module from S904 is kept unchanged, a second image is acquired, and S902-S903 are executed again. Accordingly, S902 becomes determining the position information of the target object in the second image based on the second image, and S903 becomes determining, based on the position information, whether the image of the target object in the second image is located within the first preset area. The second image, third image, and so on may be acquired at a preset frequency, with S902-S903 executed in a polling manner to achieve real-time tracking of the target object. Tracking of the target object stops when an operation indicating to stop tracking is received (for example, clicking, long-pressing or double-clicking a designated key or a virtual key on the mobile phone).
S906, determining a first distance between the image of the target object and the first preset area in the first image, a deviation direction of the image of the target object with respect to the first preset area, and the like according to the position information corresponding to the target object.
The position information corresponding to the target object comprises the center position of the target object, the edge position of the target object and the positions of other parts in the target object.
In this step, the distance between the center position of the image of the target object in the first image and the center position of the first preset area may be taken as the first distance. Alternatively, the distance that must be moved so that all edge positions of the image of the target object in the first image fall within the first preset area may be taken as the first distance.
For example, the first image is divided into the 3 areas shown in fig. 8 based on the distance between each area and the center position of the first image: an edge area, a warning area and a safety area. Assume that the first preset area is the safety area and that the target object is the letter "F", placed on a horizontal plane after being rotated clockwise by 90°. According to the position information corresponding to the target object "F", the first distance Δx to be moved so that the edge contour of the image of "F" lies entirely within the safety area can be determined.
S907, determining a second distance between the target object and the TOF module according to the image depth information.
Wherein the second distance may be determined by a depth value corresponding to the image depth information. The depth value corresponding to the image depth information may be a gray value corresponding to each pixel point in the first image.
S908, determining the movement information of the TOF module according to the first distance, the deviation direction and the second distance.
Wherein the movement information includes a movement distance, a movement direction, and the like. The distance of movement is positively correlated with the first distance and the distance of movement is negatively correlated with the second distance.
The determination manner of the movement information of the TOF module is described in detail in the above embodiments, and is not described herein again.
S909, controlling the movement of the TOF module according to the movement information of the TOF module so as to enable the image of the target object to be located in a first preset area. After that, S905 is performed.
In addition, if, after the TOF module is controlled to move, the number of movements reaches the preset threshold and the image of the target object is still located outside the second preset area, prompt information prompting the user to adjust the position of the mobile phone is sent.
For example, after the TOF module is controlled to move, the number of movements reaches the preset threshold and the image of the target object is still located outside the second preset area; at this time, a prompt box reading "please shift the mobile phone 10 cm to the left" pops up on the mobile phone, prompting the user to adjust the position of the mobile phone.
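The S901-S909 flow can be summarized as a polling loop. The sketch below simulates that control flow with hypothetical per-frame zone labels ("first" = inside the first preset area, "second" = inside the second preset area only, "outside" = outside both); the labels, names and movement budget are illustrative assumptions, not part of the disclosure:

```python
# Simulation of the S901-S909 polling loop over a sequence of captured
# frames, each reduced to the zone the target's image falls in.

def track(zones, max_moves=3):
    """zones: iterable of 'first' | 'second' | 'outside' labels, one per
    captured frame. Returns 'prompt_user' if the module's movement budget
    is exhausted while the target is still outside the second preset
    area, else 'stopped' when the frame sequence ends."""
    moves = 0
    for zone in zones:
        if zone == "first":
            moves = 0                 # S904: hold the module, keep polling
        elif moves >= max_moves and zone != "second":
            return "prompt_user"      # budget spent, target still outside
        else:
            moves += 1                # S906-S909: move the TOF module once
    return "stopped"
```

A real implementation would loop until a stop-tracking operation is received; the finite frame sequence here stands in for that condition.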
In the embodiment of the application, the first image is acquired, and the position information of the target object in the first image is determined based on the first image. When it is determined from the position information that the image of the target object in the first image is not located in the first preset area, the TOF module is controlled to move so that the image of the target object is located in the first preset area, and the moved TOF module is used to track the target object. The technical scheme can therefore adjust the acquisition area of the TOF depth image using the movable TOF module according to the image tracking condition; compared with the range of depth images that a conventional TOF module fixed in the mobile phone can acquire, depth images can be acquired over a larger space, improving the effect of real-time tracking image acquisition.
It should be noted that, for the object tracking method provided in the embodiment of the present application, the execution body may be an object tracking device, or a control module in the object tracking device for executing the object tracking method. In the embodiment of the application, the object tracking method provided herein is described taking an object tracking device executing the object tracking method as an example.
Fig. 10 is a schematic structural diagram of an object tracking device in an embodiment of the present application. Referring to fig. 10, an object tracking apparatus includes:
an acquisition module 1010 for acquiring a first image;
a determining module 1020 for determining location information of the target object in the first image based on the first image;
the control module 1030 is configured to control the TOF module to move so that the image of the target object is located in the first preset area if the image of the target object in the first image is not located in the first preset area based on the position information; and tracking the target object by utilizing the moved TOF module.
In one embodiment, the first image includes image depth information; the control module 1030 includes:
a first determining unit, configured to determine, according to position information corresponding to the target object, at least one of a first distance between an image of the target object in the first image and a first preset area, and a deviation direction of the image of the target object with respect to the first preset area;
the second determining unit is used for determining the movement information of the TOF module according to at least one of the first distance, the deviation direction and the image depth information; the movement information comprises at least one of a movement distance and a movement direction; a positive correlation between the movement distance and the first distance; negative correlation between the moving distance and the depth value corresponding to the image depth information;
And the control unit is used for controlling the TOF module to move according to the movement information.
In an embodiment, the second determining unit is further configured to:
determining a second distance between the target object and the TOF module according to the image depth information;
and determining the movement information of the TOF module according to at least one of the first distance, the deviation direction and the second distance.
In one embodiment, the object tracking device further comprises:
the execution module is used for sending out prompt information under the condition that the position information meets the preset condition; the prompt information is used for prompting a user to adjust the position of the electronic equipment; the preset conditions comprise: after the moving times of the TOF module are controlled to reach a preset threshold, the image of the target object is still located outside a second preset area in the first image.
The object tracking device in the embodiment of the application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a cell phone, tablet computer, notebook computer, palm computer, vehicle-mounted electronic device, wearable device, ultra-mobile personal computer (UMPC), netbook or personal digital assistant (PDA), etc., and the non-mobile electronic device may be a server, network attached storage (NAS), personal computer (PC), television (TV), teller machine or self-service machine, etc.; the embodiments of the present application are not limited in this respect.
The object tracking device in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system, an ios operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present application.
The object tracking device provided in the embodiment of the present application can implement each process implemented by the object tracking device in the embodiment of the object tracking method in fig. 7 to 9, and in order to avoid repetition, a detailed description is omitted here.
In the embodiment of the application, the first image is acquired, and the position information of the target object in the first image is determined based on the first image. When it is determined from the position information that the image of the target object in the first image is not located in the first preset area, the TOF module is controlled to move so that the image of the target object is located in the first preset area, and the moved TOF module is used to track the target object. The device can therefore adjust the acquisition area of the TOF depth image using the movable TOF module according to the image tracking condition; compared with the range of depth images that a conventional TOF module fixed in an electronic device can acquire, depth images can be acquired over a larger space, improving the effect of real-time tracking image acquisition.
Optionally, the embodiment of the present application further provides an electronic device, including a processor, a memory, and a program or an instruction stored in the memory and capable of running on the processor, where the program or the instruction when executed by the processor implements each process of the embodiment of the object tracking method, and the process can achieve the same technical effect, so that repetition is avoided, and details are not repeated here.
It should be noted that, the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 11 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1100 includes, but is not limited to: radio frequency unit 1101, network module 1102, audio output unit 1103, input unit 1104, sensor 1105, display unit 1106, user input unit 1107, interface unit 1108, memory 1109, and processor 1110.
Those skilled in the art will appreciate that the electronic device 1100 may further include a power source (e.g., a battery) for powering the various components, which may be logically connected to the processor 1110 through a power management system, so that functions such as managing charging, discharging and power consumption are performed by the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than illustrated, may combine certain components, or may arrange the components differently, which is not described in detail herein.
Wherein, the radio frequency unit 1101 is configured to acquire a first image.
The processor 1110 is configured to determine, based on the first image, location information of the target object in the first image, control the movement of the TOF module to locate the image of the target object in the first preset area if the image of the target object in the first image is not located in the first preset area based on the location information, and track the target object using the moved TOF module.
Optionally, the processor 1110 is further configured to determine, according to the position information corresponding to the target object, at least one of a first distance between the image of the target object in the first image and the first preset area and a deviation direction of the image of the target object with respect to the first preset area, determine movement information of the TOF module according to at least one of the first distance, the deviation direction and the image depth information, and control movement of the TOF module according to the movement information. The moving information comprises at least one of a moving distance and a moving direction, the moving distance is positively correlated with the first distance, and the moving distance is negatively correlated with a depth value corresponding to the image depth information.
Optionally, the processor 1110 is further configured to determine a second distance between the target object and the TOF module according to the image depth information, and determine movement information of the TOF module according to at least one of the first distance, the deviation direction, and the second distance.
Optionally, the processor 1110 is further configured to send out a prompt message if the location information meets a preset condition. The prompt information is used for prompting a user to adjust the position of the electronic equipment. The preset conditions comprise: after the moving times of the TOF module are controlled to reach a preset threshold, the image of the target object is still located outside a second preset area in the first image.
In the embodiment of the application, the first image is acquired, and the position information of the target object in the first image is determined based on the first image. When it is determined from the position information that the image of the target object in the first image is not located in the first preset area, the TOF module is controlled to move so that the image of the target object is located in the first preset area, and the moved TOF module is used to track the target object. The electronic device can therefore adjust the acquisition area of the TOF depth image using the movable TOF module according to the image tracking condition; compared with the range of depth images that a conventional TOF module fixed in an electronic device can acquire, depth images can be acquired over a larger space, improving the effect of real-time tracking image acquisition.
It should be appreciated that in embodiments of the present application, the input unit 1104 may include a graphics processor (Graphics Processing Unit, GPU) 11041 and a microphone 11042, the graphics processor 11041 processing image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 1106 may include a display panel 11061, and the display panel 11061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1107 includes a touch panel 11071 and other input devices 11072. The touch panel 11071 is also referred to as a touch screen. The touch panel 11071 may include two parts, a touch detection device and a touch controller. Other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein. Memory 1109 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 1110 may integrate an application processor that primarily processes operating systems, user interfaces, applications, etc., with a modem processor that primarily processes wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 1110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the processes of the embodiment of the object tracking method are implemented, and the same technical effects can be achieved, so that repetition is avoided, and no further description is given here.
Wherein the processor is a processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.
The embodiment of the application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled with the processor, and the processor is used for running a program or an instruction, so as to implement each process of the above embodiment of the object tracking method, and achieve the same technical effect, so that repetition is avoided, and no redundant description is provided herein.
It should be understood that the chips referred to in the embodiments of the present application may also be referred to as system-on-chip chips, chip systems, or system-on-chip chips, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, but may also include performing the functions in a substantially simultaneous manner or in an opposite order depending on the functions involved; e.g., the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those of ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are also within the protection of the present application.

Claims (9)

1. An electronic device is characterized by comprising a device main body, a time-of-flight TOF module, a movable structure and a TOF movable cavity; wherein,
the TOF movable cavity is arranged in the equipment main body;
the movable structure is arranged in the TOF movable cavity;
the TOF module is arranged in the TOF movable cavity and connected with the movable structure, and the movable structure drives the TOF module to move in the TOF movable cavity;
the TOF module comprises a TOF receiving assembly and a TOF transmitting assembly; the TOF emission component is used for emitting light rays to a target object; the TOF receiving component is used for receiving the reflected light of the emitted light of the target object; the TOF receiving component and the TOF transmitting component are fixedly arranged at two ends of a bracket.
2. The electronic device of claim 1, wherein the movable structure is a rotatable structure; the TOF module is rotatably connected with the rotatable structure.
3. The electronic device of claim 2, wherein the rotatable structure is a rotatable cylinder structure or a sphere structure.
4. An object tracking method, applied to the electronic device as claimed in any one of claims 1 to 3, comprising:
Acquiring a first image;
determining position information of the target object in the first image based on the first image;
based on the position information, if the image of the target object in the first image is not located in a first preset area, controlling the TOF module to move so that the image of the target object is located in the first preset area; and tracking the target object by utilizing the moved TOF module.
5. The method of claim 4, wherein the first image comprises image depth information, and
the controlling the TOF module to move so that the image of the target object is located in the first preset area comprises:
determining, according to the position information corresponding to the target object, at least one of a first distance between the image of the target object and the first preset area in the first image and a deviation direction of the image of the target object relative to the first preset area;
determining movement information of the TOF module according to at least one of the first distance, the deviation direction, and the image depth information, wherein the movement information comprises at least one of a movement distance and a movement direction, the movement distance is positively correlated with the first distance, and the movement distance is inversely correlated with a depth value corresponding to the image depth information; and
controlling the TOF module to move according to the movement information.
6. The method of claim 5, wherein the determining movement information of the TOF module according to at least one of the first distance, the deviation direction, and the image depth information comprises:
determining a second distance between the target object and the TOF module according to the image depth information; and
determining the movement information of the TOF module according to at least one of the first distance, the deviation direction, and the second distance.
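One way to satisfy the correlations recited in claim 5 (movement distance positively correlated with the first distance, inversely correlated with the depth value) is sketched below. The specific formula and the `gain` parameter are assumptions for illustration; the claims only require the stated correlations, not this particular mapping.

```python
import math


def movement_info(first_distance: float,
                  deviation_dir: tuple,
                  depth_value: float,
                  gain: float = 1.0):
    """Illustrative movement-information computation.

    The movement distance grows with the in-image offset (positive
    correlation with first_distance) and shrinks as the target's depth
    increases (inverse correlation), since a distant target needs a
    smaller module correction for the same pixel offset.
    """
    move_distance = gain * first_distance / max(depth_value, 1e-6)
    # Normalize the deviation direction into a unit movement direction.
    norm = math.hypot(deviation_dir[0], deviation_dir[1]) or 1.0
    move_direction = (deviation_dir[0] / norm, deviation_dir[1] / norm)
    return move_distance, move_direction
```

Claim 6's variant simply derives the second distance (target-to-module distance) from the depth information before applying a mapping of this kind.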
7. The method of claim 4, wherein, after the controlling the TOF module to move, the method further comprises:
sending out prompt information in a case that the position information meets a preset condition, wherein the prompt information is used for prompting a user to adjust the position of the electronic device, and the preset condition comprises: after the number of times the TOF module has been controlled to move reaches a preset threshold, the image of the target object is still located outside a second preset area in the first image.
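The preset condition of claim 7 reduces to a simple predicate. The threshold value below is an assumed example, not a value from the claims:

```python
def should_prompt(move_count: int,
                  target_in_second_region: bool,
                  threshold: int = 5) -> bool:
    """Claim 7's prompt condition (threshold value is an assumption):
    prompt the user to reposition the device once the module has been
    moved at least `threshold` times while the target's image is still
    outside the second preset area of the first image."""
    return move_count >= threshold and not target_in_second_region
```

This lets the device distinguish "the module can still correct by itself" from "the target has left the module's reachable range, so the user must move the device".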
8. An object tracking device, applied to the electronic device of any one of claims 1 to 3, comprising:
an acquisition module, configured to acquire a first image;
a determining module, configured to determine, based on the first image, position information of the target object in the first image; and
a control module, configured to control, based on the position information, the TOF module to move if the image of the target object in the first image is not located in a first preset area, so that the image of the target object is located in the first preset area, and to track the target object using the moved TOF module.
9. An electronic device comprising a processor, a memory, and a program or instructions stored in the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the object tracking method of any one of claims 4 to 7.
CN202010603898.7A 2020-06-29 2020-06-29 Electronic equipment, object tracking method and device Active CN111722240B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010603898.7A CN111722240B (en) 2020-06-29 2020-06-29 Electronic equipment, object tracking method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010603898.7A CN111722240B (en) 2020-06-29 2020-06-29 Electronic equipment, object tracking method and device

Publications (2)

Publication Number Publication Date
CN111722240A CN111722240A (en) 2020-09-29
CN111722240B true CN111722240B (en) 2023-07-21

Family

ID=72569624

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010603898.7A Active CN111722240B (en) 2020-06-29 2020-06-29 Electronic equipment, object tracking method and device

Country Status (1)

Country Link
CN (1) CN111722240B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2200670A1 (en) * 1997-03-21 1998-09-21 David F. Sorrells Tracking system and method for controlling the field of view of a camera
US7620150B1 (en) * 2007-01-30 2009-11-17 Martin Annis X-ray backscatter system for imaging at shallow depths
WO2016089430A1 (en) * 2014-12-03 2016-06-09 Faro Technologies, Inc. Using two-dimensional camera images to speed registration of three-dimensional scans
CN106657781A (en) * 2016-12-19 2017-05-10 北京小米移动软件有限公司 Target object photographing method and target object photographing device
CN206410678U (en) * 2016-12-30 2017-08-15 武汉海达数云技术有限公司 Three-dimensional laser scanning device
EP3209007A1 (en) * 2016-02-16 2017-08-23 ABB Schweiz AG An image scanner and a component feeder comprising an image scanner
CN109154657A (en) * 2017-11-29 2019-01-04 深圳市大疆创新科技有限公司 Detecting devices and moveable platform
CN109343079A (en) * 2018-10-30 2019-02-15 合肥泰禾光电科技股份有限公司 A kind of ranging barrier-avoiding method and obstacle avoidance apparatus
CN109862275A (en) * 2019-03-28 2019-06-07 Oppo广东移动通信有限公司 Electronic equipment and mobile platform
CN109983468A (en) * 2016-12-01 2019-07-05 深圳市大疆创新科技有限公司 Use the method and system of characteristic point detection and tracking object
CN110246188A (en) * 2019-05-20 2019-09-17 歌尔股份有限公司 Internal reference scaling method, device and camera for TOF camera

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10142798B2 (en) * 2016-08-09 2018-11-27 Symbol Technologies, Llc Arrangement for, and method of, locating a mobile device in a venue by inferring transit timer values of ranging signals received by the mobile device in a time difference of arrival (TDOA)-based ultrasonic locationing system
CN108769476B (en) * 2018-06-06 2019-07-19 Oppo广东移动通信有限公司 Image acquisition method and device, image acquisition apparatus, computer equipment and readable storage medium
CN109274957A (en) * 2018-10-31 2019-01-25 维沃移动通信有限公司 A kind of depth image image pickup method and mobile terminal
CN110398748B (en) * 2019-07-19 2022-05-31 Oppo广东移动通信有限公司 Distance measuring device, equipment and method
CN111062969B (en) * 2019-12-06 2023-05-30 Oppo广东移动通信有限公司 Target tracking method and related product

Also Published As

Publication number Publication date
CN111722240A (en) 2020-09-29

Similar Documents

Publication Publication Date Title
CN109325967B (en) Target tracking method, device, medium, and apparatus
CN109947886B (en) Image processing method, image processing device, electronic equipment and storage medium
CN109785368B (en) Target tracking method and device
US20160127715A1 (en) Model fitting from raw time-of-flight images
US20200116893A1 (en) Use of thermopiles to detect human location
CN109165606B (en) Vehicle information acquisition method and device and storage medium
RU2656711C2 (en) Method and system for detecting and tracking of moving objects based on three-dimensional sensor data
US20230057965A1 (en) Robot and control method therefor
CN113205549B (en) Depth estimation method and device, electronic equipment and storage medium
WO2022110614A1 (en) Gesture recognition method and apparatus, electronic device, and storage medium
EP2996067A1 (en) Method and device for generating motion signature on the basis of motion signature information
CN113194253B (en) Shooting method and device for removing reflection of image and electronic equipment
CN116030512B (en) Gaze point detection method and device
CN112291473A (en) Focusing method and device and electronic equipment
Amamra et al. GPU-based real-time RGBD data filtering
CN112333439B (en) Face cleaning equipment control method and device and electronic equipment
CN112954153B (en) Camera device, electronic equipment, depth of field detection method and depth of field detection device
CN108307031B (en) Screen processing method, device and storage medium
CN111722240B (en) Electronic equipment, object tracking method and device
CN112543284A (en) Focusing system, method and device
CN109032354B (en) Electronic device, gesture recognition method thereof and computer-readable storage medium
CN113672193B (en) Audio data playing method and device
CN115902882A (en) Collected data processing method and device, storage medium and electronic equipment
EP4078089B1 (en) Localization using sensors that are tranportable with a device
CN111310526B (en) Parameter determination method and device for target tracking model and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant