WO2017201663A1 - Method for monitoring a moving object, wearable device and server - Google Patents

Method for monitoring a moving object, wearable device and server

Info

Publication number
WO2017201663A1
WO2017201663A1 (PCT/CN2016/083095, CN2016083095W)
Authority
WO
WIPO (PCT)
Prior art keywords
moving target
trackable
module
wearable device
image sensors
Prior art date
Application number
PCT/CN2016/083095
Other languages
English (en)
Chinese (zh)
Inventor
方骏
张景嵩
蔡世光
Original Assignee
英华达(上海)科技有限公司
英华达(上海)电子有限公司
英华达股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 英华达(上海)科技有限公司, 英华达(上海)电子有限公司, 英华达股份有限公司
Priority to PCT/CN2016/083095
Priority to CN201680001393.5A
Publication of WO2017201663A1

Classifications

    • A HUMAN NECESSITIES
    • A42 HEADWEAR
    • A42B HATS; HEAD COVERINGS
    • A42B3/00 Helmets; Helmet covers; Other protective head coverings
    • A42B3/04 Parts, details or accessories of helmets
    • A42B3/30 Mounting radio sets or communication systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12 Systems for determining distance or velocity not using reflection or reradiation, using electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to monitoring technologies, and in particular, to a method for monitoring a moving target, a wearable device, and a server.
  • Wearable electronic devices are gradually being used in people's daily lives.
  • the wearable device can be worn directly on the user's body or integrated into the user's clothing or accessories.
  • wearable devices also implement numerous functions through software processing and data interaction, for example, for medical health, outdoor sports, information reminders, and the like.
  • When monitoring moving targets, current wearable devices mostly use infrared sensors or lasers to detect their distance from the moving targets.
  • Because the infrared sensor is based on the principle of signal reflection, its ranging accuracy is low, the distance at which a moving target can be detected is relatively short, and the directivity is poor.
  • The disadvantage of the laser ranging method is that it is easily interfered with by smoke, dust and raindrops. Therefore, current wearable devices using infrared or laser are not accurate enough in monitoring moving targets, and their ranging accuracy is low.
  • the embodiments of the present invention provide a method for monitoring a moving target, a wearable device, and a server, which can improve the monitoring accuracy of the moving target and the resource utilization rate of the wearable device.
  • the invention provides a method for monitoring a moving target, comprising:
  • the present invention also provides a wearable device, wherein the wearable device includes a plurality of image sensors, and the wearable device further includes:
  • An acquiring module configured to acquire the images captured by the plurality of image sensors;
  • a searching module configured to search for a trackable moving target from the images captured by the plurality of image sensors acquired by the acquiring module
  • a determining module configured to determine, from the plurality of image sensors, two first image sensors for the trackable moving target found by the searching module, where the images captured by the first image sensors include the trackable moving target;
  • a calculating module configured to calculate a first distance between the trackable moving target and the wearable device according to the images captured by the two first image sensors determined by the determining module;
  • a sending module configured to send the first distance calculated by the calculating module to the monitoring device for display.
  • the invention further provides a server comprising:
  • An acquisition module configured to acquire, from the wearable device, the images captured by the plurality of image sensors in the wearable device;
  • a searching module configured to search for a trackable moving target from the images captured by the plurality of image sensors acquired by the acquiring module
  • a determining module configured to determine, from the plurality of image sensors, two first image sensors for the trackable moving target found by the searching module, where the images captured by the first image sensors include the trackable moving target;
  • a calculating module configured to calculate a first distance between the trackable moving target and the wearable device according to the images captured by the two first image sensors determined by the determining module.
  • The method provided by the embodiments of the present invention calculates the distance between the wearable device and the moving target based on the images captured by two image sensors, using the principle of binocular vision imaging, without transmitting any signal to the moving target. Compared with the prior art, this improves the accuracy of ranging, makes the monitoring of the moving target more accurate, and improves the resource utilization of the wearable device.
  • FIG. 1a is a schematic diagram of an implementation environment according to an embodiment of the present invention.
  • FIG. 1b is a schematic diagram of an implementation environment according to another embodiment of the present invention.
  • FIG. 2 is a schematic structural view of a wearable device according to an embodiment of the present invention.
  • FIG. 3 is a schematic flow chart of a method for monitoring a moving target according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a list of trackable moving target data pools according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a method for calculating a first distance according to an embodiment of the present invention.
  • FIG. 6 is a schematic flow chart of a method for monitoring a moving target according to another embodiment of the present invention.
  • FIG. 7a is a schematic diagram of determining direction information in an embodiment of the present invention.
  • FIG. 7b is a schematic diagram of coordinates of determining location information according to an embodiment of the present invention.
  • FIG. 8 is a schematic flowchart diagram of a method for monitoring a moving target according to an embodiment of the present invention.
  • FIG. 9a is a schematic diagram of a movement trajectory according to an embodiment of the present invention.
  • FIG. 9b is a schematic diagram of a movement trajectory according to another embodiment of the present invention.
  • FIG. 10 is a schematic flowchart diagram of a method for monitoring a moving target according to another embodiment of the present invention.
  • FIG. 11 is a schematic flow chart of a method for monitoring a moving target according to still another embodiment of the present invention.
  • FIG. 12a is a schematic flow chart of a method for monitoring a moving target according to an embodiment of the present invention.
  • FIG. 12b is a schematic flow chart of a method for monitoring a moving target according to another embodiment of the present invention.
  • FIG. 13 is a schematic structural diagram of a wearable device according to an embodiment of the present invention.
  • FIG. 14 is a schematic structural diagram of a wearable device according to another embodiment of the present invention.
  • FIG. 15 is a schematic structural view of a wearable device according to still another embodiment of the present invention.
  • FIG. 16 is a schematic structural view of a moving target monitoring device according to an embodiment of the present invention.
  • FIG. 17 is a schematic structural view of a moving target monitoring device according to another embodiment of the present invention.
  • FIG. 18 is a schematic structural diagram of a server according to an embodiment of the present invention.
  • FIG. 19 is a schematic structural diagram of a server according to another embodiment of the present invention.
  • FIG. 1a is a schematic diagram of an implementation environment according to an embodiment of the present invention.
  • the moving target monitoring system 101 includes a wearable device 110, moving targets 1 to N, a monitoring device 120, and a moving target monitoring device 130.
  • the moving target monitoring device 130 is located in the wearable device 110, and the wearable device 110 includes a plurality of image sensors.
  • the moving target monitoring device 130 provides the functions of monitoring moving target 1 ... moving target N, data processing, and information sending. It captures images of the moving targets in the surrounding environment through the plurality of image sensors in the wearable device 110, finds trackable moving targets from the captured images, and tracks them continuously; when a certain trigger condition is reached, monitoring information may be sent to the monitoring device 120 in a wireless transmission manner as a reminder or an early warning.
  • FIG. 1b is a schematic diagram of an implementation environment according to another embodiment of the present invention.
  • the moving target monitoring system 102 includes a wearable device 110, moving targets 1 to N, a monitoring device 120, a moving target monitoring device 130, and a server 140.
  • In this embodiment, the moving target monitoring device 130 is located in the server 140.
  • the server 140 can be a single server, a server cluster consisting of several servers, or a cloud computing service center.
  • the wearable device 110 includes a plurality of image sensors for capturing images of the surrounding environment; the captured images are transmitted wirelessly to the server 140, from which the moving target monitoring device 130 acquires them for further processing.
  • the monitoring information may be sent to the monitoring device 120 by means of wireless transmission.
  • the monitoring device 120 can be a monitoring server and/or an intelligent terminal. After receiving the monitoring information sent by the moving target monitoring device 130, it displays the monitoring information to the user for the purpose of reminding or warning.
  • When the monitoring device 120 is a monitoring server, the monitoring server can be a sub-server in the server 140.
  • In this case, the moving target monitoring device 130 can send the monitoring information to the monitoring device 120 by means of internal data transmission.
  • Alternatively, the monitoring server may be independent of the server 140, in which case the moving target monitoring device 130 can transmit the monitoring information to the monitoring device 120 by wire or wirelessly.
  • the triggering condition may be finding a moving target, being tracked by the moving target, or a possible collision with the moving target.
  • the monitoring information sent includes primary warning information and advanced warning information. The primary warning information informs the user of the distance and direction information of a moving target when one is discovered; the advanced warning information refers to the current position information, pictures, movement trajectory, and the like of the moving target when it is determined that the user is being tracked by the moving target or may collide with it.
  • the moving target 2 is a car at the left front of the wearable device 110, moving obliquely toward it; the moving target 3 is at the left rear of the wearable device 110, moving obliquely toward it; the moving target 4 is a car directly behind the wearable device 110, moving in the same direction as it.
  • the present invention does not limit the specific form of the moving target; besides a pedestrian or a car, it may be any object in motion, such as a bicycle, an electric vehicle, or an animal.
  • the angles of view of the adjacent two image sensors may overlap, and the sum of the angles of view of all the image sensors can satisfy 360° full coverage.
  • the image sensor may be a charge coupled device (CCD) image sensor.
  • FIG. 2 is a schematic structural diagram of a wearable device 200 according to an embodiment of the present invention.
  • five CCD image sensors 201 to 205 of the same model are mounted in the wearable device 200, arranged in a regular pentagon.
  • the angle of view of each CCD image sensor is related to the focal length of the lens used by the CCD and the size of the CCD.
  • the CCD image sensors 201 to 205 may use a 1/3" CCD lens with a focal length of 2.8 mm; the angle of view of each CCD image sensor is then 86.3°, and the angles of view of adjacent CCD image sensors may overlap.
  • the sum of the viewing angles of the five CCD image sensors exceeds 360°, meeting the requirements of full coverage.
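As a quick arithmetic check (the angle-of-view relation is the standard pinhole-model formula with sensor dimension d and focal length f; d is not stated in this text, so the 86.3° figure is taken as given):

```latex
\theta = 2\arctan\!\left(\frac{d}{2f}\right), \qquad
\sum_{i=1}^{5}\theta_i = 5 \times 86.3^\circ = 431.5^\circ > 360^\circ
```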
  • the motion state of the wearable device, or of the user using the wearable device, is not specifically limited: it may be stationary, or in motion at different moving speeds.
  • FIG. 3 is a schematic flow chart of a method for monitoring a moving target according to an embodiment of the present invention. The method includes the following steps.
  • Step 301 Acquire images captured by multiple image sensors.
  • the plurality of image sensors are located in a wearable device, and the plurality of image sensors include at least two CCD image sensors.
  • the CCD image sensor uses a high-sensitivity semiconductor material that converts light into electric charge, which is then converted into a digital signal by an analog-to-digital converter chip.
  • the digital signal can be compressed and stored in the wearable device for processing; the storage of the image data is not limited to the wearable device.
  • Step 302 Find a trackable moving target from the images captured by the plurality of image sensors.
  • the search operation can be repeated periodically. Specifically, at every first predetermined time interval, for each image sensor, feature point analysis is performed on a plurality of consecutive images captured by that image sensor to obtain the feature points corresponding to that image sensor.
  • the precision of the feature points that can be analyzed depends on the capture distance of the image sensor; for example, the capture distance may be 1 meter.
  • the distance that the image sensor can capture is related to the hardware condition of the image sensor itself.
  • feature points corresponding to at least two image sensors may be determined as the above-described trackable moving target.
  • the time point at which the trackable moving target is first found is recorded as the first timestamp of the trackable moving target, and the time point at which it is currently found is recorded as the current timestamp of the trackable moving target.
  • the continuous tracking time of the trackable moving target may be determined according to the first time stamp and the current time stamp, for example, the difference between the first time stamp and the current time stamp is used as the continuous tracking time of the trackable moving target.
  • Optionally, the feature points corresponding to at least two image sensors are determined as candidate moving targets, a trackable moving target data pool is then set up, and data elimination processing is performed on the candidate moving targets. Specifically:
  • Step 3022: Add the candidate moving target to the trackable moving target data pool, and record the time point at which the candidate moving target was determined as the current timestamp of the candidate moving target. If the candidate moving target is already stored in the trackable moving target data pool, the previously recorded timestamp of the candidate moving target is updated to the current timestamp.
  • the time point at which the trackable moving target is first determined may be recorded as the first time stamp of the trackable moving target, and the difference between the first time stamp and the current time stamp may be used as the continuous tracking of the trackable moving target. time.
  • Step 3023: Arrange the current timestamps of all candidate moving targets in the trackable moving target data pool in chronological order, and according to the arrangement select the L candidate moving targets corresponding to the first L timestamps as the trackable moving targets.
  • the trackable moving target data pool includes, for each trackable moving target, a serial number, a first timestamp, a current timestamp, and the currently captured picture.
  • the first timestamp and the current timestamp can be used to determine the continuous tracking time of the trackable moving target.
  • a trackable moving target with sequence number 1 has a first timestamp of 8:20 and a current timestamp of 8:46.
  • the currently captured picture shows a person in black walking on a crosswalk.
  • the trackable moving target with serial number 2 has only a first timestamp of 8:45, indicating that this moving target has been captured for the first time; from the currently captured picture it is known to be a moving car.
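As a concrete illustration of steps 3022 and 3023, the following Python sketch implements the timestamp bookkeeping and top-L selection; all identifiers are illustrative, and the "first L timestamps" rule is read here as keeping the L most recently updated targets (the text is ambiguous on the sort direction):

```python
import time

L = 30  # maximum number of trackable moving targets kept (example value)

class TrackablePool:
    """Sketch of the trackable moving target data pool (steps 3022-3023)."""

    def __init__(self):
        # serial number -> {first_ts, current_ts, picture}
        self.pool = {}

    def add_candidate(self, serial, picture=None):
        now = time.time()
        if serial in self.pool:
            # already stored: only refresh the current timestamp
            self.pool[serial]["current_ts"] = now
        else:
            # first detection: record first timestamp and captured picture
            self.pool[serial] = {"first_ts": now, "current_ts": now,
                                 "picture": picture}

    def trackable_targets(self):
        # order by current timestamp and keep the first L entries
        ranked = sorted(self.pool.items(),
                        key=lambda kv: kv[1]["current_ts"], reverse=True)
        return ranked[:L]

    def tracking_time(self, serial):
        # continuous tracking time = current timestamp - first timestamp
        entry = self.pool[serial]
        return entry["current_ts"] - entry["first_ts"]
```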
  • Step 303: Determine two first image sensors for the trackable moving target from the plurality of image sensors, where the images captured by the first image sensors include the trackable moving target.
  • In step 302, when the trackable moving target is found, the two first CCD image sensors whose captured images include the trackable moving target may be determined at the same time.
  • Step 304 Calculate a first distance between the trackable moving target and the wearable device according to the images captured by the two first image sensors, and send the first distance to the monitoring device for display.
  • each two adjacent CCD image sensors constitute a binocular CCD camera model, mimicking the way human binocular vision senses distance: the two cameras image the same target from different positions, the resulting stereo image pair is matched to corresponding image points by various algorithms, and the first distance is then calculated by triangulation.
  • Specifically, for each first CCD image sensor, the two-dimensional position information of the trackable moving target in the image captured by that sensor is acquired; the two-dimensional position information includes a lateral coordinate and a longitudinal coordinate. The difference between the lateral coordinates of the two pieces of two-dimensional position information is taken as the second distance, also referred to as the parallax distance; the third distance between the two first CCD image sensors in the wearable device, also referred to as the baseline distance, is acquired; the first distance is then calculated from the second distance and the third distance.
  • FIG. 5 is a schematic diagram of a method for calculating a first distance according to an embodiment of the present invention.
  • the two first CCD image sensors provide a left CCD lens and a right CCD lens, respectively.
  • the imaging points of the tracked moving target on the left CCD image plane and the right CCD image plane are T_left(X_left, Y_left) and T_right(X_right, Y_right), respectively.
  • f is the focal length of the CCD lenses; here the left CCD lens and the right CCD lens have the same focal length.
  • In this embodiment, the first distance between the trackable moving target and the wearable device is calculated according to the images captured by the two first CCD image sensors and sent to the monitoring device. The distance between the wearable device and the moving target is calculated based on the dual CCD image sensors using the principle of binocular vision imaging, without transmitting any signal to the moving target. Compared with the prior art, this improves the accuracy of ranging, makes the monitoring of the moving target more accurate, and improves the resource utilization of the wearable device.
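A minimal numeric sketch of this binocular calculation (Python; the function name, the 6 cm baseline, and the pixel size are illustrative assumptions, not values from the patent):

```python
def first_distance(x_left, x_right, baseline_m, focal_m, pixel_size_m):
    """Binocular triangulation sketch. x_left/x_right are the lateral pixel
    coordinates of the target on the left/right CCD image planes, baseline_m
    is the third (baseline) distance between the two sensors, focal_m the
    shared focal length f, and pixel_size_m the pixel size p."""
    disparity_px = abs(x_left - x_right)   # second (parallax) distance, in pixels
    if disparity_px == 0:
        return float("inf")                # target effectively at infinity
    # similar triangles: Z = f * B / (p * disparity)
    return focal_m * baseline_m / (pixel_size_m * disparity_px)

# e.g. f = 2.8 mm, B = 6 cm, p = 4.65 um, 40 px of disparity -> ~0.9 m
print(first_distance(520, 480, 0.06, 2.8e-3, 4.65e-6))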
  • FIG. 6 is a schematic flow chart of a method for monitoring a moving target according to another embodiment of the present invention. The method includes the following steps.
  • Step 601 Acquire an orientation of each CCD image sensor.
  • a plurality of CCD image sensors are included in the wearable device. After the wearable device is activated, it first enters a correction mode in which the orientation (ie, the directional position) of each CCD image sensor relative to the user is set in advance; the orientation of each CCD image sensor is then acquired from the wearable device.
  • the front of the user, the back of the user, the left-hand side of the user, and the right-hand side of the user are the basic orientations.
  • the orientation of the CCD image sensor 201 is directly in front of the user.
  • the directions between two basic orientations are intermediate orientations, such as the user's right front (ie, the orientation of the CCD image sensor 202), the user's right rear (ie, the orientation of the CCD image sensor 203), the user's left front (ie, the orientation of the CCD image sensor 205), and the user's left rear (ie, the orientation of the CCD image sensor 204).
  • For example, the user takes five photos with the five CCD image sensors, matches the image in each photo to the actual scene to obtain the orientation corresponding to that image, and then associates the orientation with the CCD image sensor that captured the image, thereby determining the orientation of each CCD image sensor.
  • Alternatively, the wearable device prompts the user to perform a series of movements in the directions front, back, left, right, left front, right front, left rear, and right rear, and each CCD image sensor takes at least one photo before and after each movement.
  • Thus each CCD image sensor captures at least two photos, and the same target can be found in the photos taken before and after the movement (refer to the search method of step 302).
  • the first distance between the same target and the wearable device is measured (refer to the calculation method of step 304), and the orientation is determined from the change of the first distance before and after the movement. For example, if the user moves forward and the measurements show that the first distance to the same target is decreasing, the corresponding CCD image sensor is determined to face directly in front of the user.
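A sketch of that automatic correction rule (Python; sensor ids and distance values are illustrative): when the user walks forward, the sensor whose measured first distance to the same target decreases the most is taken to face directly in front of the user.

```python
def infer_front_sensor(distances_before, distances_after):
    """Correction-mode sketch: inputs map sensor_id -> first distance (metres)
    to the same target, measured before and after the user moves forward."""
    deltas = {sid: distances_after[sid] - distances_before[sid]
              for sid in distances_before}
    # the largest decrease indicates the forward-facing sensor
    return min(deltas, key=deltas.get)

# e.g. sensor 201 sees the target get ~1 m closer while the others do not
print(infer_front_sensor({201: 5.0, 202: 5.1, 203: 5.2},
                         {201: 4.0, 202: 5.0, 203: 5.6}))  # -> 201
```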
  • Step 602 Acquire images captured by a plurality of CCD image sensors, and search for traceable moving targets from images captured by the plurality of CCD image sensors.
  • the trackable moving target can be found based on the method of feature point analysis.
  • the method of feature point analysis may employ the continuously adaptive mean shift (CamShift) algorithm.
  • the CamShift algorithm performs a mean shift operation on every frame of the image, using the result of the previous frame (ie, the center and size of the search window) as the initial search window for the next frame, and iterates the mean shift operation over each frame of data to find one or more feature points from multiple consecutive images.
  • the advantage of the CamShift algorithm is that when the size of the moving target changes, the search window can be adaptively adjusted so that the target area continues to be tracked.
  • Optionally, the determination of the feature points may also be assisted by the contours in the consecutive images or the color of the moving target.
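A minimal OpenCV sketch of such CamShift tracking (the capture source, the initial window, and the hue-histogram model are assumptions for illustration, not part of the patent):

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
track_window = (300, 200, 100, 100)  # initial search window (x, y, w, h)

# hue histogram of the initial target region serves as the tracking model
x, y, w, h = track_window
hsv_roi = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

# stop after 10 iterations or when the window centre moves less than 1 px
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    # the previous frame's window is the initial window for this frame;
    # the returned rotated rect adapts as the target grows or shrinks
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, criteria)
```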
  • Optionally, the method of feature point analysis may employ a Kalman filter algorithm. Specifically, estimates of the state variables and of the output signal are obtained without considering the influence of the input signal and the observation noise; the state variable estimates are then corrected with the estimation error of the output signal, weighted such that the mean square error of the state variable estimates is minimized.
  • the Kalman filter is an optimized autoregressive data processing algorithm, making the recognition of moving targets more accurate.
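A minimal sketch of smoothing a feature point's position with OpenCV's Kalman filter, assuming a constant-velocity model with state (x, y, vx, vy) and measured position (x, y); the noise covariances and sample measurements are illustrative:

```python
import cv2
import numpy as np

kf = cv2.KalmanFilter(4, 2)  # 4 state variables, 2 measured variables
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

for zx, zy in [(10, 12), (11, 14), (13, 15)]:   # observed feature positions
    prediction = kf.predict()                    # a-priori state estimate
    kf.correct(np.array([[zx], [zy]], np.float32))  # weighted correction
    print(prediction[:2].ravel())                # predicted (x, y)
```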
  • Step 603: Determine two first CCD image sensors for the trackable moving target from the plurality of CCD image sensors, and calculate the first distance between the trackable moving target and the wearable device according to the images captured by the two first CCD image sensors.
  • This step is the same as the operations of steps 303 and 304, and will not be described again.
  • Step 604: Determine, from the plurality of CCD image sensors, at least one second CCD image sensor for the trackable moving target, where the images captured by the second CCD image sensors include the trackable moving target, and determine the direction information of the trackable moving target according to the orientation and viewing angle range of the at least one second CCD image sensor.
  • If the determination is made based on one second CCD image sensor, the orientation of that second CCD image sensor is determined as the direction information corresponding to the trackable moving target. If the determination is made based on the orientations and viewing angle ranges of two second CCD image sensors, the two second CCD image sensors may be the same as the two first CCD image sensors in step 303.
  • In that case, the trackable moving target is in the common view area of the two second CCD image sensors, and the combined orientations of the two second CCD image sensors are determined as the direction information corresponding to the trackable moving target.
  • That is, any CCD image sensor whose captured image includes the trackable moving target can serve as a basis for judging by its orientation and viewing angle range.
  • FIG. 7a is a schematic diagram of determining direction information in an embodiment of the present invention.
  • the angle of view of the CCD image sensor 201 is as shown in 701.
  • the range of viewing angles of the CCD image sensor 202 is as shown at 702, the common viewing angle area of 701 and 702 is 703, and a trackable moving target 704 is found within 703.
  • If in step 604 the second CCD image sensor is only the CCD image sensor 201, the direction information of the trackable moving target 704 is the orientation of the CCD image sensor 201, that is, directly in front of the user.
  • If the second CCD image sensor is only the CCD image sensor 202, the direction information of the trackable moving target 704 is the orientation of the CCD image sensor 202, that is, the user's right front. If the second CCD image sensors include both 201 and 202, the direction information of the trackable moving target 704 is the combined orientation of the CCD image sensors 201 and 202, that is, the user's front + right front.
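The lookup logic of this example can be sketched as follows (Python; the orientation table mirrors the pentagon layout of FIG. 2, and the identifiers are illustrative):

```python
# orientation of each CCD image sensor relative to the user (from FIG. 2)
ORIENTATION = {201: "front", 202: "right front", 203: "right rear",
               204: "left rear", 205: "left front"}

def direction_info(second_sensor_ids):
    """One second image sensor -> its orientation; two (target in their
    common view area) -> the combined orientation."""
    return " + ".join(ORIENTATION[sid] for sid in second_sensor_ids)

print(direction_info([201]))       # front
print(direction_info([201, 202]))  # front + right front
```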
  • Step 605 Send the first distance and direction information to the monitoring device for display.
  • For example, the monitoring device is another smart terminal of the user who wears the wearable device; the smart terminal receives the first distance and the direction information and displays them, notifying the user that a moving target has been found, together with its specific distance and direction information.
  • Here the wearable device provides a primary warning, reminding the user that a suspicious moving target has been found.
  • Optionally, the moving target monitoring method can also notify the user of whether a collision may occur, whether the user is being tracked, and the like.
  • The following method embodiments further provide moving target monitoring methods that provide advanced warning information.
  • FIG. 8 is a schematic flow chart of a method for monitoring a moving target according to an embodiment of the present invention, which can determine whether a collision occurs.
  • the wearable device includes a position locating module, such as a global positioning system (GPS) sensor, in addition to a plurality of CCD image sensors.
  • Step 801 Obtain location information of the wearable device.
  • the location information may be two-dimensional GPS coordinates captured by GPS, expressed as I(x_I, y_I), where x_I represents longitude and y_I represents latitude; alternatively, it may be three-dimensional GPS coordinates, expressed as I(x_I, y_I, z_I), where z_I represents height.
  • Step 802 Acquire images captured by a plurality of CCD image sensors every first predetermined time interval.
  • Step 803 Find whether there is a trackable moving target from the images captured by the plurality of CCD image sensors.
  • If so, step 804 is performed; otherwise, the flow returns to step 802.
  • Step 804 Determine two first CCD image sensors for the trackable moving target from the plurality of CCD image sensors.
  • Step 805 Calculate a first distance according to images captured by the two first CCD image sensors every second predetermined time interval, and obtain relative coordinates of the trackable moving target relative to the wearable device.
  • the relative coordinates may be two-dimensional coordinates (x_r, y_r) or three-dimensional coordinates (x_r, y_r, z_r).
  • p is the pixel size of the CCD lens; here the pixel sizes of the left CCD lens and the right CCD lens are the same.
  • the second predetermined time interval may be the same as or different from the first predetermined time interval in step 802.
  • the second predetermined time interval is twice the first predetermined time interval.
  • the first predetermined time interval determines the period for finding trackable moving targets, while the second predetermined time interval determines the period for sending warning information to the monitoring device.
  • Step 806 Send the first distance calculated by the current time to the monitoring device for display.
  • Here the wearable device first sends primary warning information to the monitoring device.
  • Step 807: In each second predetermined time interval, calculate the position information of the trackable moving target according to the calculated relative coordinates and the position information of the wearable device, and connect the position information calculated in a plurality of second predetermined time intervals to obtain the movement trajectory of the trackable moving target.
  • the position information of the trackable moving target can be determined from the association between the relative coordinate system (X_r, Y_r) and the coordinate system (X_L, Y_L).
  • the association between the two coordinate systems can be characterized by the rotation angle θ_r of Y_r with respect to Y_L.
  • the two-dimensional GPS coordinates (x_T, y_T) of the trackable moving target can then be calculated by the following formula:
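The formula itself does not survive in this text. A plausible reconstruction, assuming the relative coordinates (x_r, y_r) are rotated counterclockwise by θ_r into the (X_L, Y_L) frame and then translated by the wearable device's coordinates (x_I, y_I), with any metre-to-degree conversion omitted for brevity:

```latex
x_T = x_I + x_r\cos\theta_r - y_r\sin\theta_r, \qquad
y_T = y_I + x_r\sin\theta_r + y_r\cos\theta_r
```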
  • FIG. 7b is a schematic diagram of coordinates for determining position information in an embodiment of the present invention.
  • For example, for the two first CCD image sensors 201 and 202, their midpoint is shown as 2012; connecting 201 and 202 gives the horizontal axis X_r, and the perpendicular drawn outward from the midpoint 2012 gives the vertical axis Y_r.
  • Similarly, the horizontal axis X_r and the vertical axis Y_r are obtained in the CCD coordinate system composed of the two first CCD image sensors 203 and 204, whose midpoint is shown as 2034, and in the CCD coordinate system formed by the first CCD image sensors 204 and 205, whose midpoint is shown as 2045.
  • the vertical axis Y_r is rotated counterclockwise with respect to the vertical axis Y_L (shown by the broken line Y_L' in the figure) to obtain the rotation angle θ_r, where 0 ≤ θ_r < 2π.
  • the reference coordinate axis "directly in front of the user" shown in FIG. 2 is introduced in FIG. 7b and denoted Y_U.
  • Define the rotation angle of the vertical axis Y_r with respect to the reference coordinate axis Y_U (or its parallel line Y_U') as θ_1, and the rotation angle of the reference coordinate axis Y_U with respect to the vertical axis Y_L as θ_2; then θ_r = θ_1 + θ_2.
  • Since the reference coordinate axis Y_U and the position of each CCD image sensor are uniquely determined, the rotation angle θ_1 of the vertical axis Y_r with respect to the reference coordinate axis Y_U in the CCD coordinate system formed by any two first CCD image sensors is a fixed value, and the value of θ_1 can be uniquely determined from the coordinates of each CCD image sensor in the coordinate system shown in FIG. 7b.
  • the rotation angle ⁇ 2 of the reference coordinate axis Y U (or using its parallel line Y U ') with respect to the vertical axis Y L changes with the movement of the user (ie, the movement of the wearable device), and can be worn by the wearable
  • the GPS coordinates (x I , y I ) of the device determine the value of ⁇ 2 . Further, for the current position information of the user or the wearable device, the value of ⁇ 2 is the same for every two first CCD image sensors.
  • FIG. 9a is a schematic illustration of the movement trajectory of a trackable moving target in accordance with an embodiment of the present invention. As shown in FIG. 9a, within a two-dimensional coordinate system consisting of longitude and latitude, the movement trajectory is shown by curve 910.
  • Step 808: Calculate the moving speed of the trackable moving target from the difference between the first distance calculated at the current time and the first distance calculated at the previous time, and multiply the moving speed by a preset human body reaction time to obtain the safe distance.
  • the moving speed v of the trackable moving target can be calculated as:
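The formula is missing from this text; from the description of step 808 it is presumably the difference of successive first distances over the second predetermined time interval Δt₂, with the safe distance following from the preset reaction time:

```latex
v = \frac{\left|D_{\text{current}} - D_{\text{previous}}\right|}{\Delta t_{2}},
\qquad d_{\text{safe}} = v \cdot t_{\text{react}}
```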
  • the human body reaction time (also referred to as safety time) can be preset to be 10 seconds.
  • Step 809 Determine whether the first distance calculated at the current time is less than a safety distance.
  • If so, go to step 810; otherwise, go back to step 805.
  • Step 810 Send a picture of the trackable moving target, and/or the position information calculated at the current time, and/or the moving track to the monitoring device for display.
  • In step 803, when a trackable moving target is found, a picture of the trackable moving target is saved; the picture may be captured by either of the two first CCD image sensors.
  • Note that step 807 is optional: it may be skipped, and step 808 performed directly after step 805 (and step 806). If step 807 is not performed, then in step 810 only the picture of the trackable moving target is sent to the monitoring device for display.
  • Similarly, calculating the relative coordinates of the trackable moving target with respect to the wearable device is also optional; that is, the relative coordinates may not be calculated in step 805 but instead calculated when step 807 is performed.
  • Suppose the user wears a wearable device using the method described in the above embodiments and walks in a public place, for example, crossing the road. If a car is approaching from the side (moving target 2 as shown in FIG. 1a), the CCD image sensors on the wearable device first find the moving target 2 and report the distance (and the azimuth) to the user; the moving speed is then measured, and if the distance is less than the safe distance, the user is warned that a collision may occur.
  • FIG. 10 is a schematic flow chart of a method for monitoring a moving target according to another embodiment of the present invention, which can determine whether the user is being tracked. Based on the method shown in FIG. 8, after steps 801 through 807 are performed, the method continues with steps 1001 and 1002. Specifically:
  • Step 1001: Find the number of inflection points appearing on the movement trajectory of the trackable moving target.
  • the so-called inflection point refers to a point at which the curve of the movement trajectory changes direction, upward or downward.
  • the search method may be: calculate the slope between every two adjacent pieces of position information in the movement trajectory, and when a significant change in slope is found, take one of the two adjacent positions, or the midpoint between them, as an inflection point. For example, three inflection points are found in the movement trajectory 910 shown in FIG. 9a: inflection point 1, inflection point 2, and inflection point 3; that is, the number of inflection points is 3.
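A sketch of that inflection point count (Python; the slope-change threshold is an illustrative parameter, since the text only says "a significant change in the slope"):

```python
def count_inflection_points(track, slope_jump=1.0):
    """Count points where the slope between consecutive position samples
    changes significantly. `track` is a list of (x, y) positions."""
    slopes = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        dx = x1 - x0
        slopes.append(float("inf") if dx == 0 else (y1 - y0) / dx)
    inflections = 0
    for s0, s1 in zip(slopes, slopes[1:]):
        if abs(s1 - s0) > slope_jump:   # "significant change in the slope"
            inflections += 1
    return inflections

zigzag = [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0)]
print(count_inflection_points(zigzag))  # -> 3 inflection points
```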
  • Step 1002 Determine whether the number of inflection points is greater than a preset inflection point threshold.
  • If so, go to step 810; otherwise, go back to step 805.
  • When the number of inflection points is greater than the preset inflection point threshold, step 810 is executed, that is, the position information calculated at the current time and/or the movement trajectory is sent to the monitoring device for display.
  • Optionally, after determining the inflection points in the movement trajectory of the trackable moving target, it may also be determined whether those inflection points appear in the movement trajectory of the user, thereby deciding whether to send the advanced warning information to the monitoring device.
  • the movement trajectory of the user is consistent with the movement trajectory of the wearable device and can be obtained by connecting a plurality of pieces of position information of the wearable device.
  • FIG. 9b is a schematic diagram of movement trajectories according to another embodiment of the present invention, in which the movement trajectory of the trackable moving target is shown by curve 921 and the movement trajectory of the user is shown by curve 922.
  • After finding the three inflection points in the movement trajectory 921 in step 1001 (inflection point 1, inflection point 2, and inflection point 3), it is judged whether these three inflection points also appear in the movement trajectory of the user. If the inflection points of the movement trajectory of the trackable moving target also appear in the movement trajectory of the user, step 810 is executed to send the advanced warning information to the monitoring device.
  • Suppose the user wears a wearable device using the method described in the above embodiments and walks in a public place, for example, on the street. If there is a bicycle behind the user, the CCD image sensors on the wearable device first find the moving target and report the distance (and the azimuth) to the user; the movement trajectory of the moving target is then recorded with the assistance of GPS, and if the number of inflection points found in the movement trajectory exceeds the warning value, the user is alerted that he or she is likely being tracked.
  • FIG. 11 is a schematic flow chart of a method for monitoring a moving target according to still another embodiment of the present invention, and the method can also determine whether a user is tracked. Based on the method shown in FIG. 8, after steps 801 through 807 are performed, steps 1101 and 1102 are continued. Specifically,
  • Step 1101: Record the time point at which the trackable moving target was first determined as the first timestamp of the trackable moving target, and take the difference between the first timestamp and the current timestamp as the continuous tracking time of the trackable moving target.
  • As described in step 302, in steps 802 and 803 a search for trackable moving targets is performed every first predetermined time interval. When a trackable moving target is found for the first time, that time point is recorded as the first timestamp. The trackable moving target is then confirmed cyclically at the first predetermined time interval, and the current timestamp is updated, so that the trackable moving target is tracked continuously.
  • Step 1102: Determine whether the continuous tracking time is greater than a preset tracking time threshold. If so, step 810 is performed; otherwise, the flow returns to step 805.
  • the preset tracking time threshold is 5 minutes.
  • Here, too, step 807 is optional: it may be skipped, and step 1101 performed directly after step 805 (and step 806). If step 807 is not performed, then in step 810 only the picture of the trackable moving target is sent to the monitoring device for display.
  • Similarly, calculating the relative coordinates of the trackable moving target with respect to the wearable device is also optional; that is, the relative coordinates may not be calculated in step 805 but instead calculated when step 807 is performed.
  • Suppose the user wears a wearable device using the method described in the above embodiments and walks in a public place, for example, on the street. If there is a pedestrian behind (moving target 3 as shown in FIG. 1a) following the user, the CCD image sensors on the wearable device first find the moving target 3 and report the distance (and orientation) to the user. If the calculated continuous tracking time exceeds the warning value, for example 5 minutes, the user is warned that he or she is being tracked.
  • FIG. 12a is a schematic flow chart of a method for monitoring a moving target according to an embodiment of the present invention. This method can determine both whether a collision will occur and whether the user is being tracked.
  • the wearable device includes a plurality of CCD image sensors. The method includes the following steps.
  • Step 1201 Acquire position information of the wearable device, and set an orientation of each CCD image sensor.
  • This step is an operation performed by the wearable device at initialization; refer to the detailed descriptions of steps 601 and 801.
  • Step 1202 Acquire images captured by a plurality of CCD image sensors every first predetermined time interval.
  • Step 1203 Find out whether there is a trackable moving target from the images captured by the plurality of CCD image sensors.
  • If so, step 1204 is performed; otherwise, the flow returns to step 1202.
  • The operations of steps 1202 and 1203 can be found in the detailed descriptions of steps 802 and 803, respectively.
  • Step 1204: For each trackable moving target, determine two first CCD image sensors for the trackable moving target from the plurality of CCD image sensors, and determine the direction information corresponding to the common view area in which the trackable moving target is located according to the orientations and viewing angle ranges of the two first CCD image sensors.
  • A plurality of trackable moving targets may be found in step 1203; as described in step 302, up to 30 trackable moving targets can be saved in the trackable moving target data pool. In the subsequent processing, for each trackable moving target, the first distance is calculated, the direction information is determined, and it is decided whether to send advanced warning information to the monitoring device.
  • Unlike step 604, here the two first CCD image sensors are used directly to determine the direction information corresponding to the trackable moving target, because the trackable moving target is in the common viewing angle area of the two first CCD image sensors.
  • Step 1205 Calculate a first distance according to images captured by the two first CCD image sensors every second predetermined time interval, and obtain relative coordinates of the trackable moving target relative to the wearable device.
  • The operation of this step can be found in the detailed description of step 805.
  • Step 1206: Send the direction information obtained in step 1204 and/or the first distance calculated at the current time in step 1205 to the monitoring device for display.
  • In this way, the primary warning information is presented to the user on the monitoring device, that is, a suspicious moving target has been found in a certain direction relative to the wearable device, at the first distance away.
  • Step 1207 Calculate position information of the trackable moving target according to the calculated relative coordinate and the position information of the wearable device in each second predetermined time interval, and calculate the position information calculated in the plurality of second predetermined time intervals. Wire the line to get the trajectory of the trackable moving target.
  • Step 1208 Calculate a moving speed of the trackable moving target by using a difference between the first distance calculated by the current time and the first distance calculated by the previous time, and multiply the moving speed by a preset human body reaction time to obtain a moving speed. safe distance.
  • The operations of steps 1207 and 1208 can be found in the detailed descriptions of steps 807 and 808, respectively.
  • Step 1209 Determine whether the first distance calculated at the current time is less than a safety distance. If yes, go to step 1210; otherwise, go to step 1211.
  • Step 1210 Send a picture of the trackable moving target, and/or the position information calculated at the current time, and/or the moving track to the monitoring device for display.
  • In step 1203, when a trackable moving target is found, a picture of the trackable moving target is saved; the picture may be captured by either of the two first CCD image sensors.
  • In this way, the advanced warning information is displayed to the user on the monitoring device, that is, a trackable moving target at a specific location is shown, together with its picture and movement trajectory.
  • Step 1211: Find the number of inflection points of the trackable moving target from the movement trajectory.
  • Step 1212 Determine whether the number of inflection points is greater than a preset inflection point threshold. If yes, go to step 1210; otherwise, go to step 1213.
  • The operations of steps 1211 and 1212 can be found in the detailed descriptions of steps 1001 and 1002, respectively.
  • Step 1213: Record the time point at which the trackable moving target was first determined as the first timestamp of the trackable moving target; the difference between the first timestamp and the current timestamp is taken as the continuous tracking time of the trackable moving target.
  • Step 1214 Determine whether the continuous tracking time is greater than a preset tracking time threshold. If yes, go to step 1210; otherwise, go back to step 1205.
  • The operations of steps 1213 and 1214 can be found in the detailed descriptions of steps 1101 and 1102, respectively.
  • Among the above steps, steps 1208 and 1209 implement the safe distance trigger condition, steps 1211 and 1212 the inflection point trigger condition, and steps 1213 and 1214 the continuous tracking time trigger condition.
  • Step 1210 may be performed as long as any one of the trigger conditions is satisfied; therefore, the order in which the three trigger conditions are executed is variable, and by traversal there are a total of six execution orders. The execution sequence shown in FIG. 12a is only an example; the method may also be executed in the other five orders, for example, steps 1213+1214, then 1211+1212, then 1208+1209 in sequence.
  • Suppose the user wears a wearable device using the method described in the above embodiments and walks in a public place, for example, on the street. If a car at the side (moving target 2 as shown in FIG. 1a) is heading toward the user, and a pedestrian behind (moving target 3 as shown in FIG. 1a) is following the user, the CCD image sensors on the wearable device first find the two moving targets and report their respective directions and distances to the user. If the current distance calculated for the moving target 2 is less than the safe distance, a warning is issued to alert the user that a collision may occur; and if the continuous tracking time for the moving target 3 exceeds the warning value, the user is also warned that he or she may be tracked.
  • FIG. 12b is a schematic flow chart of a method for monitoring a moving target according to another embodiment of the present invention. This method can determine at the same time whether a collision will occur and whether the user is being tracked. The steps in the methods shown in FIG. 12b and FIG. 12a are the same, but the order of execution differs.
  • In the method shown in FIG. 12b, steps 1208+1209 (trigger condition one), steps 1207+1211+1212 (trigger condition two), and steps 1213+1214 (trigger condition three) may be performed simultaneously; that is, the three trigger conditions are executed in parallel.
  • When it is determined in step 1209 that the first distance calculated at the current time is less than the safe distance, step 1210 is performed; otherwise, step 1205 is performed. When it is determined in step 1212 that the number of inflection points is greater than the preset inflection point threshold, step 1210 is performed; otherwise, step 1205 is performed. When it is determined in step 1214 that the continuous tracking time is greater than the preset tracking time threshold, step 1210 is performed; otherwise, step 1205 is performed.
  • That is, if any one of the trigger conditions is met, the advanced warning information is sent to the monitoring device; furthermore, if any two of the above trigger conditions, or all three, are met simultaneously, the advanced warning information is also sent to the monitoring device, for example as a double warning of a possible collision and of being tracked at the same time.
  • FIG. 13 is a schematic structural diagram of a wearable device 1300 according to an embodiment of the present invention, wherein the wearable device 1300 includes a plurality of image sensors 1301 to 130M, that is, image sensor 1 ... image sensor M, where M is a positive integer greater than 1.
  • the wearable device 1300 further includes:
  • the acquiring module 1310 is configured to acquire the images captured by the plurality of image sensors 1301 to 130M;
  • the searching module 1320 is configured to search for a trackable moving target from the images captured by the plurality of image sensors acquired by the obtaining module 1310;
  • a determining module 1330 configured to determine, from the plurality of image sensors, two first image sensors for the trackable moving target found by the searching module 1320, where the images captured by the first image sensors include the trackable moving target;
  • a calculation module 1340 configured to calculate a first distance between the trackable moving target and the wearable device 1300 according to the images captured by the two first image sensors determined by the determining module 1330;
  • the sending module 1350 is configured to send the first distance calculated by the calculating module 1340 to the monitoring device for display.
  • the searching module 1320 is configured to: set up a trackable moving target data pool; every first predetermined time interval, for each image sensor, perform feature point analysis on a plurality of consecutive images captured by that image sensor to obtain the feature points corresponding to that image sensor; determine feature points corresponding to at least two image sensors as a candidate moving target, add the candidate moving target to the trackable moving target data pool, and record the time point at which the candidate moving target was determined as the current timestamp of the candidate moving target; if the candidate moving target is already stored in the trackable moving target data pool, update the previously recorded timestamp of the candidate moving target to the current timestamp; arrange the current timestamps of all candidate moving targets in the trackable moving target data pool in chronological order, and according to the arrangement select the L candidate moving targets corresponding to the first L timestamps as the trackable moving targets, where L is a positive integer greater than 1.
  • the calculating module 1340 is configured to: acquire, for each first image sensor, the two-dimensional position information of the trackable moving target in the image captured by that first image sensor, where the two-dimensional position information includes a lateral coordinate and a longitudinal coordinate; take the difference between the lateral coordinates of the two pieces of two-dimensional position information as the second distance; acquire the third distance between the two first image sensors in the wearable device; and calculate the first distance from the second distance and the third distance.
  • FIG. 14 is a schematic structural diagram of a wearable device 1400 according to an embodiment of the present invention. Based on the wearable device 1300 shown in FIG. 13, the wearable device 1400 further includes a position locating module 1401 and a setting module 1410.
  • a setting module 1410 is configured to set an orientation of each image sensor
  • the determining module 1330 is further configured to: determine, from the plurality of image sensors, at least one second image sensor for the trackable moving target, where the images captured by the second image sensors include the trackable moving target; and determine the direction information corresponding to the trackable moving target according to the orientation and viewing angle range of the at least one second image sensor set by the setting module 1410;
  • the sending module 1350 is further configured to send the direction information determined by the determining module 1330 to the monitoring device for display.
  • the calculating module 1340 is configured to calculate a first distance according to images captured by the two first image sensors every second predetermined time interval;
  • the calculating module 1340 is further configured to calculate the moving speed of the trackable moving target from the difference between the first distance calculated at the current time and the first distance calculated at the previous time, and to multiply the moving speed by the preset human body reaction time to obtain the safe distance;
  • the sending module 1350 is further configured to: if the first distance calculated by the current time is less than the safety distance calculated by the calculating module 1340, send the picture of the trackable moving target to the monitoring device for display.
  • the calculating module 1340 is configured to obtain relative coordinates of the trackable moving target relative to the wearable device when calculating the first distance every second predetermined time interval;
  • the location locating module 1401 is configured to acquire location information of the wearable device.
  • the calculating module 1340 is further configured to calculate, in each second predetermined time interval, the position information of the trackable moving target according to the relative coordinates and the position information of the wearable device acquired by the position locating module 1401, and to connect the position information calculated in a plurality of second predetermined time intervals to obtain the movement trajectory of the trackable moving target;
  • the sending module 1350 is further configured to: if the first distance calculated at the current time is less than the calculation
  • the safety distance calculated by the module 1340 sends the position information calculated at the current time, and/or the movement trajectory to the monitoring device for display.
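  • A short sketch of how such a track can be assembled (names invented): at each second predetermined time interval, the wearable device's own position fix is offset by the target's relative coordinates, and the resulting points are connected in order.

    def track_point(wearable_xy, relative_xy):
        # Absolute target position = wearable position + relative offset.
        return (wearable_xy[0] + relative_xy[0],
                wearable_xy[1] + relative_xy[1])

    # One point per second predetermined time interval; connecting them
    # in order yields the movement track of the trackable moving target.
    track = [track_point((0.0, 0.0), (2.0, 1.0)),
             track_point((0.5, 0.0), (2.5, 1.0)),
             track_point((1.0, 0.0), (3.0, 1.0))]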
  • the calculating module 1340 is configured to obtain relative coordinates of the trackable moving target relative to the wearable device according to the images captured by the two first image sensors every second predetermined time interval;
  • the location locating module 1401 is configured to acquire location information of the wearable device.
  • the calculating module 1340 is further configured to: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the position information of the wearable device acquired by the position locating module 1401, connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target, and count the number of inflection points of the trackable moving target in the movement track;
  • the sending module 1350 is further configured to: if the number of inflection points calculated by the calculating module 1340 is greater than a preset inflection point threshold, send the picture of the trackable moving target, and/or the position information calculated at the current time, and/or the movement track, to the monitoring device for display (an inflection-counting sketch follows below).
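  • The patent counts inflection points in the movement track without defining them; the sketch below makes the common assumption that an inflection point is a heading change sharper than an angular threshold between consecutive track segments.

    import math

    def count_inflections(track, turn_threshold_deg=45.0):
        """Count sharp turns along a polyline track (sketch; the threshold
        and the definition of an inflection point are assumptions)."""
        count = 0
        for (ax, ay), (bx, by), (cx, cy) in zip(track, track[1:], track[2:]):
            heading_in = math.atan2(by - ay, bx - ax)
            heading_out = math.atan2(cy - by, cx - bx)
            turn = abs(math.degrees(heading_out - heading_in)) % 360.0
            turn = min(turn, 360.0 - turn)  # fold into [0, 180]
            if turn > turn_threshold_deg:
                count += 1
        return count

  • A target pacing back and forth in front of the wearer produces many such turns, so a count above the preset inflection point threshold triggers the report described above.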
  • the calculating module 1340 is further configured to: record the time point at which the trackable moving target is first found as the first timestamp of the trackable moving target, record the time point at which the trackable moving target is currently found as the current timestamp of the trackable moving target, and use the difference between the first timestamp and the current timestamp as the continuous tracking time of the trackable moving target (a one-line sketch follows after the next item);
  • the sending module 1350 is further configured to: if the continuous tracking time calculated by the calculating module 1340 is greater than the preset tracking time threshold, send the picture of the trackable moving target to the monitoring device for display.
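  • The continuous tracking check reduces to a single subtraction; a one-line sketch, with the threshold value chosen only for illustration:

    def tracked_too_long(first_ts_s, current_ts_s, threshold_s=60.0):
        # Continuous tracking time = current timestamp - first timestamp.
        return (current_ts_s - first_ts_s) > threshold_s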
  • the calculating module 1340 is configured to obtain relative coordinates of the trackable moving target relative to the wearable device according to the images captured by the two first image sensors every second predetermined time interval;
  • the location locating module 1401 is configured to acquire location information of the wearable device.
  • the calculating module 1340 is further configured to: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the position information of the wearable device acquired by the position locating module 1401, and connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target;
  • the sending module 1350 is further configured to: if the continuous tracking time calculated by the calculating module 1340 is greater than the preset tracking time threshold, send the location information calculated at the current time, and/or the moving track to the monitoring device for display.
  • FIG. 15 is a schematic structural diagram of a wearable device 1500 according to still another embodiment of the present invention.
  • the wearable device 1500 includes a processor 1510, a memory 1520, a port 1530, and a bus 1540.
  • the processor 1510 and the memory 1520 are interconnected by a bus 1540.
  • the processor 1510 can receive and transmit data through the port 1530. Specifically:
  • the processor 1510 is configured to execute a machine readable instruction module stored by the memory 1520.
  • the memory 1520 stores a machine readable instruction module executable by the processor 1510.
  • the instruction modules executable by the processor 1510 include an image sensor module 1521, an acquiring module 1522, a searching module 1523, a determining module 1524, a calculation module 1525, and a sending module 1526. Specifically:
  • the image sensor module 1521 may be executed by the processor 1510 to: control M image sensors to capture images, where M is a positive integer greater than one;
  • the acquiring module 1522 may be executed by the processor 1510 to: acquire an image captured by the image sensor module 1521;
  • the searching module 1523 may be executed by the processor 1510 to: search for a trackable moving target from the images captured by the plurality of image sensors acquired by the obtaining module 1522;
  • the determining module 1524, when executed by the processor 1510, may: determine two first image sensors from the plurality of image sensors for the trackable moving target found by the searching module 1523, where the images captured by the first image sensors include the trackable moving target;
  • the calculation module 1525, when executed by the processor 1510, may: calculate a first distance between the trackable moving target and the wearable device according to the images captured by the two first image sensors determined by the determining module 1524;
  • the sending module 1526 is executed by the processor 1510 to: send the first distance calculated by the calculating module 1525 to the monitoring device for display.
  • the instruction modules executable by the processor 1510 may further include a location locating module 1527 and a setting module 1528. Specifically:
  • the setting module 1528 is executed by the processor 1510 to: set an orientation of each image sensor;
  • the determining module 1524 is further executed by the processor 1510 to: determine, from the plurality of image sensors, at least one second image sensor for the trackable moving target, where the image captured by the second image sensor includes the trackable moving target; and determine direction information corresponding to the trackable moving target according to the orientation set by the setting module 1528 and the viewing angle range of the at least one second image sensor;
  • the sending module 1526, when executed by the processor 1510, may: send the direction information determined by the determining module 1524 to the monitoring device for display.
  • the calculation module 1525, when executed by the processor 1510, may further: obtain relative coordinates of the trackable moving target relative to the wearable device according to the images captured by the two first image sensors every second predetermined time interval;
  • the location locating module 1527, when executed by the processor 1510, may: acquire location information of the wearable device;
  • the calculation module 1525 may further: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the location information of the wearable device acquired by the location locating module 1527, connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target, and count the number of inflection points of the trackable moving target in the movement track;
  • the sending module 1526 may further: if the number of inflection points calculated by the calculation module 1525 is greater than a preset inflection point threshold, send the picture of the trackable moving target, and/or the position information calculated at the current time, and/or the movement track, to the monitoring device for display.
  • FIG. 16 is a block diagram showing the structure of a moving target monitoring device 1600 according to an embodiment of the present invention. As shown in FIG. 16, the moving target monitoring device 1600 includes:
  • the acquiring module 1610 is configured to acquire images captured by multiple image sensors, where the plurality of image sensors are located in a wearable device;
  • the searching module 1620 is configured to search for a trackable moving target from the images captured by the plurality of image sensors acquired by the acquiring module 1610;
  • a determining module 1630, configured to determine two first image sensors from the plurality of image sensors for the trackable moving target found by the searching module 1620, where the images captured by the first image sensors include the trackable moving target;
  • the calculating module 1640 is configured to calculate, according to the image captured by the two first image sensors determined by the determining module 1630, a first distance between the trackable moving target and the wearable device;
  • the sending module 1650 is configured to send the first distance calculated by the calculating module 1640 to the monitoring device for display.
  • the searching module 1620 is configured to: set a trackable moving target data pool; for each image sensor, every first predetermined time interval, perform feature point analysis on a plurality of consecutive images captured by that image sensor to obtain feature points corresponding to the image sensor; determine a feature point simultaneously corresponding to at least two image sensors as a candidate moving target, add the candidate moving target to the trackable moving target data pool, and record the time point at which the candidate moving target is determined as the current timestamp of the candidate moving target; if the candidate moving target is already stored in the trackable moving target data pool, update the previously recorded timestamp of the candidate moving target to the current timestamp; arrange the current timestamps of all candidate moving targets in the trackable moving target data pool in chronological order, and select, according to this arrangement, the L candidate moving targets corresponding to the first L timestamps as trackable moving targets, where L is a positive integer greater than 1 (an illustrative feature-matching sketch follows below).
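  • The disclosure does not tie the feature point analysis to a particular detector; purely as an illustration, the sketch below uses OpenCV's ORB detector to extract feature points from two sensors' frames and keeps those matched across both as candidate moving targets. The detector choice and function names are assumptions, not part of the patent.

    import cv2  # OpenCV; the patent does not prescribe a library

    def candidate_feature_points(frame_a, frame_b):
        """Feature points seen by both sensors become candidate targets."""
        orb = cv2.ORB_create()
        kp_a, des_a = orb.detectAndCompute(frame_a, None)
        kp_b, des_b = orb.detectAndCompute(frame_b, None)
        if des_a is None or des_b is None:
            return []
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des_a, des_b)
        # A feature point corresponding to at least two image sensors
        # simultaneously is treated as a candidate moving target.
        return [(kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt) for m in matches]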
  • the calculating module 1640 is configured to: acquire, for each first image sensor, two-dimensional position information of the trackable moving target in the image captured by that first image sensor, the two-dimensional position information including a lateral coordinate and a longitudinal coordinate; take the difference between the lateral coordinates of the two pieces of two-dimensional position information as the second distance; acquire a third distance between the two first image sensors in the wearable device; and calculate the first distance according to the second distance and the third distance.
  • the acquiring module 1610 is further configured to acquire an orientation of each image sensor;
  • the determining module 1630 is further configured to: determine, from the plurality of image sensors, at least one second image sensor for the trackable moving target, where the image captured by the second image sensor includes the trackable moving target; and determine direction information corresponding to the trackable moving target according to the orientation acquired by the acquiring module 1610 and the viewing angle range of the at least one second image sensor;
  • the sending module 1650 is further configured to send the direction information determined by the determining module 1630 to the monitoring device for display.
  • the calculating module 1640 is configured to calculate a first distance according to images captured by the two first image sensors every second predetermined time interval;
  • the calculating module 1640 is further configured to: calculate a moving speed of the trackable moving target from the difference between the first distance calculated at the current time and the first distance calculated at the previous time, and multiply the moving speed by a preset human body reaction time to obtain a safety distance;
  • the sending module 1650 is further configured to: if the first distance calculated at the current time is less than the safety distance calculated by the calculating module 1640, send the picture of the trackable moving target to the monitoring device for display.
  • the calculating module 1640 is configured to obtain relative coordinates of the trackable moving target relative to the wearable device when calculating the first distance every second predetermined time interval;
  • the obtaining module 1610 is configured to acquire location information of the wearable device.
  • the calculating module 1640 is further configured to: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the position information of the wearable device acquired by the acquiring module 1610, and connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target;
  • the sending module 1650 is further configured to: if the first distance calculated at the current time is less than the safety distance calculated by the calculating module 1640, send the position information calculated at the current time, and/or the movement track, to the monitoring device for display.
  • the calculating module 1640 is configured to obtain relative coordinates of the trackable moving target relative to the wearable device according to the images captured by the two first image sensors every second predetermined time interval;
  • the acquiring module 1610 is further configured to acquire location information of the wearable device;
  • the calculating module 1640 is further configured to: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the position information of the wearable device acquired by the acquiring module 1610, connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target, and count the number of inflection points of the trackable moving target in the movement track;
  • the sending module 1650 is further configured to: if the number of inflection points calculated by the calculating module 1640 is greater than a preset inflection point threshold, send the picture of the trackable moving target, and/or the position information calculated at the current time, and/or the movement track, to the monitoring device for display.
  • the calculating module 1640 is further configured to: record the time point at which the trackable moving target is first found as the first timestamp of the trackable moving target, record the time point at which the trackable moving target is currently found as the current timestamp of the trackable moving target, and use the difference between the first timestamp and the current timestamp as the continuous tracking time of the trackable moving target;
  • the sending module 1650 is further configured to: if the continuous tracking time calculated by the calculating module 1640 is greater than the preset tracking time threshold, send the picture of the trackable moving target to the monitoring device for display.
  • the calculating module 1640 is configured to obtain relative coordinates of the trackable moving target relative to the wearable device according to the images captured by the two first image sensors every second predetermined time interval;
  • the obtaining module 1610 is configured to acquire location information of the wearable device.
  • the calculating module 1640 is further configured to: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the position information of the wearable device acquired by the acquiring module 1610, and connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target;
  • the sending module 1650 is further configured to: if the continuous tracking time calculated by the calculating module 1640 is greater than the preset tracking time threshold, send the location information calculated at the current time, and/or the moving track to the monitoring device for display.
  • the moving target monitoring device 1600 can be located in the wearable device or in a server.
  • FIG. 17 is a schematic structural diagram of a moving target monitoring device 1700 according to another embodiment of the present invention.
  • the moving target monitoring device 1700 includes a processor 1710, a memory 1720, a port 1730, and a bus 1740.
  • the processor 1710 and the memory 1720 are interconnected by a bus 1740.
  • the processor 1710 can receive and transmit data through the port 1730. Specifically:
  • the processor 1710 is configured to execute a machine readable instruction module stored by the memory 1720.
  • the memory 1720 stores machine readable instruction modules executable by the processor 1710.
  • the instruction modules executable by the processor 1710 include an acquiring module 1721, a searching module 1722, a determining module 1723, a calculation module 1724, and a sending module 1725. Specifically:
  • the acquiring module 1721 may be executed by the processor 1710 to: acquire images captured by multiple image sensors, where the plurality of image sensors are located in a wearable device;
  • the searching module 1722, when executed by the processor 1710, may: search for a trackable moving target from the images captured by the plurality of image sensors acquired by the acquiring module 1721;
  • the determining module 1723 may be executed by the processor 1710 to: determine two first image sensors from the plurality of image sensors for the trackable moving target found by the searching module 1722, where the images captured by the first image sensors include the trackable moving target;
  • the calculation module 1724, when executed by the processor 1710, may: calculate, according to the images captured by the two first image sensors determined by the determining module 1723, a first distance between the trackable moving target and the wearable device;
  • the sending module 1725, when executed by the processor 1710, may: send the first distance calculated by the calculation module 1724 to the monitoring device for display.
  • the calculation module 1724, when executed by the processor 1710, may further: obtain relative coordinates of the trackable moving target relative to the wearable device when calculating the first distance every second predetermined time interval;
  • the acquiring module 1721, when executed by the processor 1710, may further: acquire location information of the wearable device;
  • the calculation module 1724 may further: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the location information of the wearable device acquired by the acquiring module 1721, and connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target;
  • the sending module 1725 may further: if the first distance calculated at the current time is less than the safety distance calculated by the calculation module 1724, send the position information calculated at the current time, and/or the movement track, to the monitoring device for display.
  • FIG. 18 is a schematic structural diagram of a server 1800 according to an embodiment of the present invention. As shown in FIG. 18, the server 1800 includes:
  • An obtaining module 1810 configured to acquire, from the wearable device, an image captured by a plurality of image sensors in the wearable device;
  • the searching module 1820 is configured to search for a trackable moving target from the images captured by the plurality of image sensors acquired by the acquiring module 1810;
  • a determining module 1830, configured to determine two first image sensors from the plurality of image sensors for the trackable moving target found by the searching module 1820, where the images captured by the first image sensors include the trackable moving target;
  • the calculating module 1840 is configured to calculate a first distance between the trackable moving target and the wearable device according to the images captured by the two first image sensors determined by the determining module 1830.
  • the server 1800 further includes: a sending module 1850, configured to send the first distance calculated by the calculating module 1840 to the monitoring device for display.
  • the searching module 1820 is configured to: set a trackable moving target data pool; for each image sensor, every first predetermined time interval, perform feature point analysis on a plurality of consecutive images captured by that image sensor to obtain feature points corresponding to the image sensor; determine a feature point simultaneously corresponding to at least two image sensors as a candidate moving target, add the candidate moving target to the trackable moving target data pool, and record the time point at which the candidate moving target is determined as the current timestamp of the candidate moving target; if the candidate moving target is already stored in the trackable moving target data pool, update the previously recorded timestamp of the candidate moving target to the current timestamp; arrange the current timestamps of all candidate moving targets in the trackable moving target data pool in chronological order, and select, according to this arrangement, the L candidate moving targets corresponding to the first L timestamps as trackable moving targets, where L is a positive integer greater than one.
  • the obtaining module 1810 is further configured to acquire an orientation of each image sensor;
  • the determining module 1830 is further configured to: determine, from the plurality of image sensors, at least one second image sensor for the trackable moving target, where the image captured by the second image sensor includes the trackable moving target; and determine direction information corresponding to the trackable moving target according to the orientation acquired by the acquiring module 1810 and the viewing angle range of the at least one second image sensor;
  • the sending module 1850 is further configured to send the direction information determined by the determining module 1830 to the monitoring device for display.
  • the calculating module 1840 is configured to calculate a first distance according to images captured by the two first image sensors every second predetermined time interval;
  • the calculating module 1840 is further configured to: calculate a moving speed of the trackable moving target from the difference between the first distance calculated at the current time and the first distance calculated at the previous time, and multiply the moving speed by a preset human body reaction time to obtain a safety distance;
  • the sending module 1850 is configured to: if the first distance calculated at the current time is less than the safety distance calculated by the calculating module 1840, send the picture of the trackable moving target to the monitoring device for display.
  • the calculating module 1840 is configured to obtain relative coordinates of the trackable moving target relative to the wearable device when calculating the first distance every second predetermined time interval;
  • the obtaining module 1810 is further configured to acquire location information of the wearable device from the wearable device;
  • the calculating module 1840 is further configured to: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the position information of the wearable device acquired by the acquiring module 1810, and connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target;
  • the sending module 1850 is further configured to: if the first distance calculated at the current time is less than the safety distance calculated by the calculating module 1840, send the position information calculated at the current time, and/or the movement track, to the monitoring device for display.
  • the calculating module 1840 is configured to obtain relative coordinates of the trackable moving target relative to the wearable device according to the images captured by the two first image sensors every second predetermined time interval;
  • the obtaining module 1810 is further configured to acquire location information of the wearable device from the wearable device;
  • the calculating module 1840 is further configured to: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the position information of the wearable device acquired by the acquiring module 1810, connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target, and count the number of inflection points of the trackable moving target in the movement track;
  • the sending module 1850 is configured to: if the number of inflection points calculated by the calculating module 1840 is greater than a preset inflection point threshold, send the picture of the trackable moving target, and/or the position information calculated at the current time, and/or the movement track, to the monitoring device for display.
  • the calculating module 1840 is further configured to: record the time point at which the trackable moving target is first found as the first timestamp of the trackable moving target, record the time point at which the trackable moving target is currently found as the current timestamp of the trackable moving target, and use the difference between the first timestamp and the current timestamp as the continuous tracking time of the trackable moving target;
  • the sending module 1850 is configured to send the picture of the trackable moving target to the monitoring device for display if the continuous tracking time is greater than the preset tracking time threshold.
  • the calculating module 1840 is configured to obtain relative coordinates of the trackable moving target relative to the wearable device according to the images captured by the two first image sensors every second predetermined time interval;
  • the obtaining module 1810 is further configured to acquire location information of the wearable device from the wearable device;
  • the calculating module 1840 is further configured to: in each second predetermined time interval, calculate position information of the trackable moving target according to the relative coordinates and the position information of the wearable device acquired by the acquiring module 1810, and connect the position information calculated in a plurality of second predetermined time intervals to obtain a movement track of the trackable moving target;
  • the sending module 1850 is further configured to: if the continuous tracking time calculated by the calculating module 1840 is greater than the preset tracking time threshold, send the location information calculated at the current time, and/or the moving track to the monitoring device for display.
  • after the wearable device captures images through the plurality of image sensors, it sends the images to the server for subsequent processing. This makes use of the powerful processing capability of the server side and spares the wearable device those processing operations, thereby reducing the power consumption of the wearable device.
  • FIG. 19 is a schematic structural diagram of a server 1900 according to another embodiment of the present invention.
  • the server 1900 includes a processor 1910, a memory 1920, a port 1930, and a bus 1940.
  • Processor 1910 and memory 1920 are interconnected by a bus 1940.
  • the processor 1910 can receive and transmit data through the port 1930. Specifically:
  • the processor 1910 is configured to execute a machine readable instruction module stored by the memory 1920.
  • the memory 1920 stores machine readable instruction modules executable by the processor 1910.
  • the instruction modules executable by the processor 1910 include an acquiring module 1921, a searching module 1922, a determining module 1923, and a calculation module 1924. Specifically:
  • the acquiring module 1921 may be executed by the processor 1910 to: acquire images captured by multiple image sensors, where the plurality of image sensors are located in a wearable device;
  • the searching module 1922, when executed by the processor 1910, may: search for a trackable moving target from the images captured by the plurality of image sensors acquired by the acquiring module 1921;
  • the determining module 1923 may be executed by the processor 1910 to: determine two first image sensors from the plurality of image sensors for the trackable moving target found by the searching module 1922, where the images captured by the first image sensors include the trackable moving target;
  • the calculation module 1924 is executed by the processor 1910 to calculate a first distance between the trackable moving target and the wearable device according to the images captured by the two first image sensors determined by the determining module 1923.
  • the instruction module executable by the processor 1910 further includes: a sending module 1925.
  • the sending module 1925, when executed by the processor 1910, may: send the first distance calculated by the calculation module 1924 to the monitoring device for display.
  • each functional module in each embodiment of the present invention may be integrated into one processing unit, or each module may exist physically separately, or two or more modules may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • each of the embodiments of the present invention can be implemented by a data processing program executed by a data processing device such as a computer.
  • the data processing program constitutes the present invention.
  • a data processing program is usually stored in a storage medium and is executed either by reading the program directly out of the storage medium or by installing or copying the program to a storage device (such as a hard disk and/or a memory) of the data processing device. Therefore, such a storage medium also constitutes the present invention.
  • the storage medium can use any type of recording method, such as paper storage media (paper tape, etc.), magnetic storage media (floppy disk, hard disk, flash memory, etc.), optical storage media (CD-ROM, etc.), magneto-optical storage media (MO, etc.), and so on.
  • the present invention also discloses a storage medium in which is stored a data processing program for performing any of the above-described embodiments of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a moving target monitoring method, a wearable device and a server. The method comprises: acquiring images captured by a plurality of image sensors (301), the plurality of image sensors being located in a wearable device; searching for a trackable moving target from the images captured by the plurality of image sensors (302); determining, from the plurality of image sensors, two first image sensors for the trackable moving target (303), the images captured by these first image sensors including the trackable moving target; and calculating a first distance between the trackable moving target and the wearable device according to the images captured by the two first image sensors, and sending the first distance to a monitoring device for display (304). The method and apparatus improve the accuracy of monitoring a moving target and the resource utilization rate of the wearable device.
PCT/CN2016/083095 2016-05-24 2016-05-24 Procédé de surveillance d'objet mobile, appareil portable et serveur WO2017201663A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2016/083095 WO2017201663A1 (fr) 2016-05-24 2016-05-24 Procédé de surveillance d'objet mobile, appareil portable et serveur
CN201680001393.5A CN106605154B (zh) 2016-05-24 2016-05-24 一种运动目标的监测方法、穿戴式装置及服务器

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/083095 WO2017201663A1 (fr) 2016-05-24 2016-05-24 Procédé de surveillance d'objet mobile, appareil portable et serveur

Publications (1)

Publication Number Publication Date
WO2017201663A1 true WO2017201663A1 (fr) 2017-11-30

Family

ID=58583260

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/083095 WO2017201663A1 (fr) 2016-05-24 2016-05-24 Procédé de surveillance d'objet mobile, appareil portable et serveur

Country Status (2)

Country Link
CN (1) CN106605154B (fr)
WO (1) WO2017201663A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111638486A (zh) * 2019-03-01 2020-09-08 阿里巴巴集团控股有限公司 一种定位方法、系统和装置

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110120061A (zh) * 2018-02-05 2019-08-13 杭州萤石软件有限公司 一种运动对象监控方法、装置、系统及电子设备
CN110505437A (zh) * 2018-05-18 2019-11-26 杭州海康威视数字技术股份有限公司 一种物体提示的方法、装置及系统
CN111619803A (zh) * 2019-02-28 2020-09-04 上海博泰悦臻电子设备制造有限公司 跟随提醒方法、跟随提醒系统、车载终端及存储介质
CN109901171B (zh) * 2019-04-12 2023-08-18 河南理工大学 汽车防追尾预警方法
CN110113581B (zh) * 2019-06-13 2020-11-06 重庆人为本科技发展有限公司 一种智慧城市监控系统及方法
CN110940982B (zh) * 2019-11-29 2023-09-12 径卫视觉科技(上海)有限公司 一种车辆前方目标识别方法以及相应的设备
CN111665490B (zh) * 2020-06-02 2023-07-14 浙江大华技术股份有限公司 目标跟踪方法和装置、存储介质及电子装置
CN116953680B (zh) * 2023-09-15 2023-11-24 成都中轨轨道设备有限公司 一种基于图像的目标物实时测距方法及系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101155258A (zh) * 2006-09-27 2008-04-02 索尼株式会社 成像设备和成像方法
CN103353677A (zh) * 2013-06-28 2013-10-16 北京智谷睿拓技术服务有限公司 成像装置及方法
US20150243044A1 (en) * 2012-09-21 2015-08-27 The Schepens Eye Research Institute, Inc. Collision Prediction
US20150339861A1 (en) * 2014-05-26 2015-11-26 Samsung Electronics Co., Ltd. Method of processing image and electronic device thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101106700A (zh) * 2007-08-01 2008-01-16 大连海事大学 视频监控系统中的智能化目标细节捕获装置及方法
CN101320048A (zh) * 2008-06-30 2008-12-10 河海大学 扇形排列的多电荷耦合器件图像传感器大视场车辆测速装置
CN102175251A (zh) * 2011-03-25 2011-09-07 江南大学 双目智能导航系统
CN105574838B (zh) * 2014-10-15 2018-09-14 上海弘视通信技术有限公司 多目相机的图像配准和拼接方法及其装置


Also Published As

Publication number Publication date
CN106605154A (zh) 2017-04-26
CN106605154B (zh) 2019-05-24

Similar Documents

Publication Publication Date Title
WO2017201663A1 (fr) Procédé de surveillance d'objet mobile, appareil portable et serveur
CN105940429B (zh) 用于确定设备运动的估计的方法和系统
KR101972374B1 (ko) 컨텐츠 공유 시스템에서 관심점을 식별하기 위한 장치 및 방법
US20180211398A1 (en) System for 3d image filtering
US9165190B2 (en) 3D human pose and shape modeling
CN104103030B (zh) 图像分析方法、照相机装置、控制装置及控制方法
US20130163879A1 (en) Method and system for extracting three-dimensional information
CN106575437B (zh) 信息处理装置、信息处理方法以及程序
CN104966062B (zh) 视频监视方法和装置
US9990547B2 (en) Odometry feature matching
JP2006086591A (ja) 移動体追跡システム、撮影装置及び撮影方法
JP5554726B2 (ja) データ関連付けのための方法と装置
JP6383439B2 (ja) 認識されたオブジェクトを用いてセンサのキャリブレーションを行うための方法およびシステム
WO2019019819A1 (fr) Dispositif électronique mobile et procédé de traitement de tâches dans une région de tâche
Pundlik et al. Collision detection for visually impaired from a body-mounted camera
JPH07262375A (ja) 移動体検出装置
US9292963B2 (en) Three-dimensional object model determination using a beacon
CN112146620B (zh) 目标物体的测距方法及装置
KR101856151B1 (ko) 실내 측위 인프라 정보 수집을 위한 포터블 수집 장치
WO2019037517A1 (fr) Dispositif électronique mobile et procédé de traitement de tâche dans une zone de tâche
WO2024083010A1 (fr) Procédé de localisation visuelle et appareil associé
CN109974667B (zh) 一种室内人体定位方法
CN111968157A (zh) 一种应用于高智能机器人的视觉定位系统及方法
US20190147611A1 (en) Object sensing system, object sensing method, and recording medium storing program code
KR20240026061A (ko) 객체의 궤적을 추적하기 위한 방법

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16902653

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16902653

Country of ref document: EP

Kind code of ref document: A1