WO2021208029A1 - Method and apparatus for correlating a user and a user equipment


Info

Publication number
WO2021208029A1
Authority
WO
WIPO (PCT)
Prior art keywords
waveform
user
correlation
motion data
user equipment
Prior art date
Application number
PCT/CN2020/085160
Other languages
French (fr)
Inventor
Xiaobing Leng
Gang Shen
Original Assignee
Nokia Shanghai Bell Co., Ltd.
Nokia Solutions And Networks Oy
Priority date
Filing date
Publication date
Application filed by Nokia Shanghai Bell Co., Ltd. and Nokia Solutions And Networks Oy
Priority to PCT/CN2020/085160 (WO2021208029A1)
Priority to CN202080099859.6A (CN115461695A)
Publication of WO2021208029A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer

Definitions

  • Embodiments of the present disclosure generally relate to user monitoring and user equipment monitoring, and specifically to methods, apparatus and computer readable storage medium for correlating a user and a user equipment.
  • Data reflecting user activities can be collected by, for example, a communication network operator, for which monetizing such data for business value is attractive.
  • For example, an operator can attempt to monetize user motion data in a shopping mall by analyzing users’ shopping behavior and preferences according to data indicative of user motion.
  • An application example can illustrate the problems involved in data monetization. Assume a woman is shopping in a shopping mall. When she stands before a showcase and browses the goods in it for a long time (e.g. 3 seconds), it may mean that she is interested in those goods. Her activity can be captured by a camera, and the shop may obtain her appearance characteristics via image recognition, e.g. face, clothing colour, etc. But the shop does not know who she is, so it cannot push any commercial information, e.g. an advertisement related to the goods in the showcase, to her.
  • It would be beneficial to establish a correlation between her appearance characteristics and her UE. Then, many commercial applications could be delivered via her UE.
  • a method for correlation between a user and a user equipment comprises obtaining first motion data measured continuously with regard to a motion of a user within a period of time; obtaining second motion data measured continuously with regard to a motion of a user equipment within the period of time; performing a correlation match between a first waveform derived from the first motion data and a second waveform derived from the second motion data; and determining whether there is a correlation between the user and the user equipment based on the correlation match between the first waveform and the second waveform.
  • performing the correlation match can comprise determining a difference waveform corresponding to a difference between the first motion data and the second motion data; and comparing a waveform characteristic of the difference waveform with a threshold to determine whether there is a correlation match between the first waveform and the second waveform.
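  • The difference-waveform check above can be sketched as follows (a minimal Python illustration; the function names, sampling step and threshold value are assumptions, not taken from the disclosure):

```python
# Hedged sketch: correlation match via the integral of the absolute
# difference between a user's and a UE's velocity waveforms.
# Both waveforms are assumed sampled at the same time instants.

def difference_integral(user_v, ue_v, dt):
    """Rectangle-rule integral of |user - ue| over the match window."""
    return sum(abs(u - e) for u, e in zip(user_v, ue_v)) * dt

def waveforms_match(user_v, ue_v, dt=0.01, threshold=0.005):
    """Match when the difference waveform stays small over the window."""
    return difference_integral(user_v, ue_v, dt) <= threshold

# A UE carried by the user follows the user's velocity closely:
user = [0.1, 0.2, 0.3, 0.2, 0.1]
same_ue = [0.11, 0.19, 0.31, 0.20, 0.09]
other_ue = [0.5, 0.6, 0.4, 0.5, 0.6]
print(waveforms_match(user, same_ue))   # small integral: match
print(waveforms_match(user, other_ue))  # large integral: no match
```

The threshold here is arbitrary; in practice it would be tuned to the sampling rate and the expected measurement noise.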
  • performing the correlation match can comprise transforming the first waveform and the second waveform from time domain to frequency domain; and performing the correlation match in the frequency domain.
  • the correlation match can be performed based on at least one of the following waveform characteristics: amplitude, frequency, period, and phase.
  • the method can further comprise determining a waveform characteristic in frequency domain of the first waveform; and determining a corresponding waveform characteristic in frequency domain of the second waveform.
  • Performing the correlation match can comprise performing a match between the waveform characteristic in frequency domain of the first waveform and the corresponding waveform characteristic in frequency domain of the second waveform.
  • performing a match between waveform characteristics can comprise determining a difference between the waveform characteristic in frequency domain of the first waveform and the corresponding waveform characteristic in frequency domain of the second waveform; and comparing the difference between waveform characteristics with a threshold.
  • the waveform characteristic in frequency domain can comprise at least one of the following: amplitude, frequency, period and phase.
  • the method can further comprise transforming at least one of the first motion data and the second motion data to motion data in a common coordinate system.
  • the first motion data and the second motion data can comprise data indicative of a motion in at least one of the following dimensions: linear velocity, angular velocity, and a dimension defined by a variable derived from at least one of the linear velocity and the angular velocity.
  • obtaining the first motion data can comprise analyzing an image flow captured within the period of time to determine the first motion data of the user.
  • obtaining the second motion data can comprise receiving a continuous measurement report of a motion sensor inside the user equipment, to determine the second motion data of the user equipment.
  • the method can further comprise determining a set of user equipments; for each particular user equipment of the set, determining a correlation between the user and the particular user equipment in a match window; and excluding from the set, a user equipment which is determined to not have a correlation with the user.
  • a determination of a correlation between the user and the particular user equipment can comprise: obtaining motion data measured continuously with respect to a motion of the particular user equipment within a period of time of the match window, performing a correlation match between the first waveform and a waveform derived from the motion data of the particular user equipment, and determining whether there is a correlation between the user and the particular user equipment based on the correlation match between the first waveform and the waveform derived from the motion data of the particular user equipment.
  • the method can further comprise, in a case where more than one user equipment remains in the set, continuing the determination of a correlation between the user and each particular user equipment of the set, and the excluding, in a next match window.
  • the method can further comprise determining a set of users; for each particular user of the set, determining a correlation between the particular user and the user equipment in a match window; and excluding from the set, a user which is determined to not have a correlation with the user equipment.
  • a determination of a correlation between the particular user and the user equipment can comprise obtaining motion data measured continuously with respect to a motion of the particular user within a period of time of the match window, performing a correlation match between a waveform derived from the motion data of the particular user and the second waveform, and determining whether there is a correlation between the particular user and the user equipment based on the correlation match between the waveform derived from the motion data of the particular user and the second waveform.
  • the method can further comprise: in a case where more than one user remains in the set, continuing the determination of a correlation between the user equipment and each particular user of the set, and the excluding, in a next match window.
  • the method can further comprise: obtaining instantaneous motion data in the first motion data measured at a plurality of time instances, and instantaneous data in the second motion data measured at the same time instances; performing a match between the instantaneous motion data in the first motion data and corresponding instantaneous data in the second motion data; and determining, the correlation between the user and the user equipment based on the match between the instantaneous motion data in the first motion data and corresponding instantaneous data in the second motion data.
  • the method can further comprise: determining a difference between the instantaneous motion data in the first motion data and the instantaneous motion data in the second motion data; and comparing the difference between instantaneous motion data with a threshold.
  • the method can further comprise identifying an activity of the user or the user equipment for triggering said correlation match.
  • the method can further comprise extracting at least one appearance characteristic of the user.
  • the method can further comprise correlating the at least one appearance characteristic of the user with the user equipment, according to the correlation between the user and the user equipment.
  • the method can further comprise pushing a service associated with the user to the user equipment, according to the correlation between the user and the user equipment.
  • the method can further comprise providing a service associated with the user equipment to the user, according to the correlation between the user and the user equipment.
  • an apparatus comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured, with the at least one processor, to cause the apparatus at least to: obtain first motion data measured continuously with regard to a motion of a user within a period of time; obtain second motion data measured continuously with regard to a motion of a user equipment within the period of time; perform a correlation match between a first waveform derived from the first motion data and a second waveform derived from the second motion data; and determine whether there is a correlation between the user and the user equipment based on the correlation match between the first waveform and the second waveform.
  • an apparatus comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured, with the at least one processor, to cause the performance of the method according to the first aspect.
  • According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium on which instructions are stored; when executed by at least one processor, the instructions cause the at least one processor to perform the method according to the first aspect.
  • FIG. 1 illustrates an exemplary architecture of a system for implementing a correlation mechanism according to an embodiment of the present disclosure
  • FIG. 2 is a flowchart depicting a procedure for correlating a user with a user equipment according to an embodiment of the present disclosure
  • FIG. 3 illustrates an exemplary scenario for correlating a user with a user equipment according to an embodiment of the present disclosure
  • FIG. 4 illustrates an exemplary procedure for establishing correlation by using waveform match
  • FIG. 5 illustrates exemplary waveforms of angular velocity and acceleration measured from a user equipment
  • FIG. 6 illustrates an exemplary procedure for establishing correlation by using multiple instantaneous matches
  • FIG. 7 is a block diagram showing various function modules of a system for correlating a user with a user equipment according to an embodiment of the present disclosure
  • FIG. 8 illustrates an exemplary model for waveform correlation match according to an embodiment of the present disclosure
  • FIG. 9 illustrates an exemplary model for instantaneous correlation match according to an embodiment of the present disclosure
  • FIG. 10 illustrates a flowchart depicting a procedure for correlating a user with a user equipment according to an embodiment of the present disclosure
  • FIG. 11 shows a simplified block diagram of apparatus according to an embodiment of the present disclosure.
  • This disclosure addresses the problem mentioned above by correlating a user and a user equipment accurately and quickly.
  • Most existing solutions use position and trajectory (a series of positions) to establish a correlation between a user and a UE.
  • UE’s position is measured via radio-based localization solutions.
  • Because radio-based localization solutions are not accurate, it is not easy to establish a correlation based on a small movement. For example, if a user only turns around at the same position, the existing solutions cannot make a determination on the correlation.
  • Motion data indicative of the motion statuses of a user and a user equipment are utilized to establish a correlation between the user and the UE.
  • the motion statuses can be indicated with at least one of the following: linear velocities, angular velocities, and waveforms of linear velocities and angular velocities, and some waveform characteristics in frequency-domain of the waveforms of linear velocities and angular velocities, such as amplitude, frequency, period, and phase.
  • FIG. 1 illustrates an exemplary architecture of a system 100 for implementing a correlation mechanism according to an embodiment of the present disclosure.
  • the system 100 can use one or more cameras 104 to recognize one or more users or personal objects.
  • these cameras may be installed in a shopping mall.
  • These cameras 104 are connected to a video network 110. Data (e.g. image, or video) captured by the cameras 104 can be uploaded to the video network 110, and stored in a storage device (not shown) in the video network 110.
  • the users carry UEs with them.
  • the UE may be any component (or collection of components) capable of wireless communication and motion measurement.
  • the UE may be a smart phone, a smart watch, or a wearable device, and the like, or any combination thereof.
  • Motion status of each UE (e.g. UE1 103-1, UE2 103-2, UE3 103-3 in FIG. 1, commonly referred to as 103) can be measured via motion sensors (such as integrated IMU sensors) within the respective UE.
  • the measured data can be reported to the system 100 via a mobile network 120, such as a cellular network or a WiFi network.
  • a data processing device 101 can be communicably connected to the video network 110, and can retrieve data indicative of motion status of the users from the video network 110.
  • the data processing device 101 can be communicably connected to the mobile network 120, and retrieve the measured data indicative of motion status of the UEs from the mobile network 120. Based on the motion statuses of the users 102 and the motion statuses of the UEs 103, the data processing device 101 can determine a correlation between a user and a UE, by waveform match and multiple instantaneous matches.
  • the data processing device 101 may be an apparatus independent from the video network 110 and the mobile network 120.
  • the data processing device 101 may be an independent component in the mobile network 120 or included as part of any network component.
  • the data processing device 101 may be an application server.
  • FIG. 2 is a flowchart depicting a procedure 200 for correlating a user with a user equipment according to an embodiment of the present disclosure.
  • the data processing device 101 as described above can implement this procedure.
  • the procedure comprises obtaining first motion data measured continuously with regard to a motion of a user within a period of time.
  • the data processing device 101 can analyze an image flow captured within a period of time to determine a motion status of the user.
  • user appearance characteristics of the user can be captured and extracted by camera-based recognition mechanisms.
  • user appearance characteristics can comprise face, clothing colour, hair type, etc.
  • Some of user appearance characteristics are coarse user appearance characteristics, e.g. clothing colour.
  • Some of user appearance characteristics are fine user appearance characteristics, e.g. face.
  • the coarse user appearance characteristics can be used to define a user scope.
  • the fine user appearance characteristics can be used to uniquely identify a user.
  • the user appearance characteristics of one user can form a vector of user appearance characteristics, and can be used as a user identifier.
  • continuous and synchronous images from multiple cameras in different directions can be used to determine a user’s motion data indicative of motion status of the user.
  • the user can be identified from these images based on the extracted user appearance characteristics.
  • the data processing device 101 can calculate the user’s motion status in at least one of the dimensions including linear velocities along axes x, y, z, and angular velocities about axes x, y, z.
  • a camera can take 30 or more image frames per second.
  • Motion data indicative of a motion status of a user can be determined based on these images.
  • visual motion status measurement is a mature technology; e.g. visual odometry can be used for robot motion control.
  • the procedure comprises obtaining second motion data measured continuously with regard to a motion of a user equipment within the period of time.
  • the data processing device 101 can receive a continuous measurement report of a motion sensor inside the user equipment, to determine a motion status of the user equipment.
  • a UE, e.g. a smart phone, can include motion sensors, e.g. an IMU (Inertial Measurement Unit) sensor.
  • An IMU sensor combines a 3-axis gyroscope, a 3-axis accelerometer and a 3-axis magnetometer. It can measure acceleration along its axes, angular velocity around its axes, and magnetic field along its axes.
  • a UE can receive a request from the mobile network 120, and then respond to the request by reporting IMU messages (such as the IMU reports shown in FIG. 1) via wireless communications, e.g. cellular or Wi-Fi communications.
  • the IMU messages can comprise IMU measurement results.
  • the IMU measurement results can be directly used to calculate the UE’s linear velocities and angular velocities, from which velocity waveforms can be derived subsequently. Waveform characteristics in frequency domain of the velocity waveforms, such as amplitude, frequency, period and phase, can be extracted accordingly.
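  • As a sketch of how a linear-velocity waveform could be derived from IMU measurement results (illustrative only; a real pipeline would also handle gravity compensation and sensor drift), accelerometer samples can be numerically integrated, while angular velocity is read directly from the gyroscope:

```python
# Sketch: derive a linear-velocity waveform from accelerometer samples by
# cumulative trapezoidal integration (gravity removal and drift correction
# are omitted for brevity; names are illustrative).

def velocity_from_acceleration(accel, dt, v0=0.0):
    """Integrate acceleration samples (m/s^2) into a velocity waveform (m/s)."""
    v, waveform = v0, [v0]
    for a0, a1 in zip(accel, accel[1:]):
        v += 0.5 * (a0 + a1) * dt   # trapezoidal step
        waveform.append(v)
    return waveform

accel_x = [0.0, 1.0, 1.0, 0.0]      # samples at 100 Hz along axis x
vx_waveform = velocity_from_acceleration(accel_x, dt=0.01)
print(vx_waveform)                   # velocity builds up to 0.02 m/s
```

Waveform characteristics such as amplitude, frequency, period and phase would then be extracted from waveforms like `vx_waveform`.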
  • a UE can transmit its reports of IMU measurement to the mobile network 120 periodically according to application settings.
  • the procedure proceeds to perform a correlation match between a first waveform derived from the first motion data and a second waveform derived from the second motion data.
  • a waveform is defined as continuous outputs of some variables of motion data, such as linear velocity, angular velocity, acceleration, or other variables derived from the motion data.
  • linear velocities can define a target’s translation velocity and translation orientation.
  • Angular velocities can define a target’s rotations of roll, pitch and yaw.
  • a waveform of a user’s or UE’s linear velocities can represent the user’s or UE’s translation motion status over time.
  • a waveform of a user’s or UE’s angular velocities can represent the user’s or UE’s rotation status of roll, pitch and yaw over time.
  • the procedure proceeds to determine whether there is a correlation between the user and the user equipment based on the correlation match between the first waveform and the second waveform.
  • the correlation between a user and a UE can be established based on two facts:
  • the UE always has the same (or almost the same) motion status (such as linear velocities along axes x, y, z, angular velocities around axes x, y, z, velocity waveforms, and frequency, amplitude and phase of velocity changes, and the like) as the user, who is identified by the user’s appearance characteristics. That means they always have the same (or almost the same) translation velocity, translation orientation, rotation angles of roll, pitch and yaw, velocity waveforms, and frequency, amplitude and phase of velocity changes.
  • If the first waveform and the second waveform match each other well, it can be determined that there is a correlation between the user and the user equipment. Otherwise, for example, if the first waveform and the second waveform do not match each other, it can be determined that there is no correlation between the user and the user equipment.
  • FIG. 3 illustrates an exemplary scenario for correlating a user with a user equipment according to an embodiment of the present disclosure.
  • Cameras 304 can capture a user 302’s movement over frame-by-frame images, and calculate linear velocities (e.g. represented with Vx_U, Vy_U, Vz_U) and angular velocities (e.g. represented with ωx_U, ωy_U, ωz_U) of the user 302.
  • a data processing device 301 can obtain the measurements of the linear velocities and angular velocities of the user 302 from the cameras 304, as shown at 310.
  • the original measurements of the linear velocities and angular velocities can be filtered as shown at 330, for example to eliminate outliers.
  • the linear velocities and angular velocities of the user 302 can be transformed to a common coordinate system, as shown at 350. For example, they can be transformed from a body coordinate system to an earth coordinate system, based on position information of the cameras.
  • a UE 303 can be requested to report its IMU measurement results to the data processing device 301, as shown at 320.
  • the UE 303’s linear velocities (Vx_UE, Vy_UE, Vz_UE) and angular velocities (ωx_UE, ωy_UE, ωz_UE) can be obtained, e.g. via motion calculation.
  • the original measurements of the UE 303’s linear velocities and angular velocities can be filtered as shown at 340, for example to eliminate outliers.
  • the UE 303’s linear velocities and angular velocities can be transformed to the common coordinate system, as shown at 360.
  • they can be transformed from the UE coordinate system to the earth coordinate system.
  • the camera direction and the IMU magnetometer can be utilized for coordinate system transformation of a user’s and a UE’s motion status respectively, so that both are transformed into the same earth coordinate system.
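  • A minimal sketch of such a transformation (assuming the device orientation is available as a unit quaternion, e.g. from the IMU’s orientation filter on the UE side or the known camera direction on the video side; all names are illustrative):

```python
import math

# Sketch: rotate a body-frame velocity vector into the earth frame using a
# unit quaternion (w, x, y, z). The quaternion itself is assumed to come
# from an orientation filter; it is not computed here.

def quat_mul(a, b):
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate_to_earth(q, v):
    """Apply v' = q * (0, v) * conj(q)."""
    conj = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = quat_mul(quat_mul(q, (0.0,) + tuple(v)), conj)
    return (x, y, z)

# A 90-degree yaw about axis z maps the body x-axis onto the earth y-axis:
q_yaw90 = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
print(rotate_to_earth(q_yaw90, (1.0, 0.0, 0.0)))  # approximately (0, 1, 0)
```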
  • One correlation match scheme is to compare waveforms of linear velocities and angular velocities. This scheme is suitable for a situation in which the user’s motion status can be continuously monitored and is never hidden behind an obstacle.
  • Another match scheme is to compare instantaneous linear velocities and angular velocities.
  • multiple matches need to be performed, for example when multiple UEs simultaneously have the same motion status as a target user, especially when the target UE is static.
  • six thresholds for Vx, Vy, Vz, ωx, ωy, ωz can be used to determine whether a user and a UE have the same motion status. For example, only when all velocities meet their threshold requirements and just one UE remains is the correlation established.
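  • A sketch of this six-threshold check (the threshold values and the dictionary layout are illustrative assumptions, not from the disclosure):

```python
# Sketch: a user and a UE are considered to have the same motion status only
# when every one of the six velocity components differs by less than its
# threshold (values below are illustrative).

THRESHOLDS = {"vx": 0.1, "vy": 0.1, "vz": 0.1, "wx": 0.2, "wy": 0.2, "wz": 0.2}

def same_motion_status(user, ue, thresholds=THRESHOLDS):
    """user and ue map each component name to an instantaneous value."""
    return all(abs(user[k] - ue[k]) < thresholds[k] for k in thresholds)

user = {"vx": 1.0, "vy": 0.0, "vz": 0.0, "wx": 0.0, "wy": 0.0, "wz": 0.1}
nearby_ue = {"vx": 1.05, "vy": 0.02, "vz": 0.0, "wx": 0.0, "wy": 0.1, "wz": 0.05}
print(same_motion_status(user, nearby_ue))  # every component within threshold
```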
  • FIG. 4 illustrates an exemplary correlation match process for establishing correlation by using waveform match.
  • the correlation match process can be triggered by a user’s activities, e.g. a woman (denoted as a user 402 in FIG. 4) stands before a showcase and has browsed the showcase for more than 3 seconds.
  • an image flow can be captured by one or more cameras for the user 402.
  • a visual recognition system can extract the user 402’s characteristics and identify the user 402 with a vector of user appearance characteristics.
  • a visual motion monitor can track the target user 402 to measure the target user’s velocities.
  • the visual recognition system and the visual motion monitor can be arranged in a video network, such as the video network 110 as shown in FIG. 1.
  • the visual recognition system and the visual motion monitor can be arranged in a device separate from the video network, such as integrated into the data processing devices 101, 301 as shown in FIGs. 1 and 3.
  • a correlator for correlation match, e.g. in the data processing devices 101, 301 as shown in FIGs. 1 and 3, can request at least one UE near the user 402 to report its IMU information.
  • the at least one UE comprises UEs 403-1, 403-2, 403-3, 403-4, and 403-5 (commonly referred to as 403), which are being served by a base station or a WiFi access point near the user 402.
  • These adjacent UEs can constitute a set of potential correlation UEs.
  • linear velocities and angular velocities of both users and UEs need to be measured continuously for a period of time. For example, each user needs to be visible for continuous motion monitoring during the period of time, and the IMU of each UE needs to report a series of measurement results.
  • FIG. 5 illustrates exemplary waveforms of angular velocity and acceleration collected from a series of test IMU measurements of a mobile phone carried in a person’s pocket.
  • the sampling frequency is 128Hz.
  • FIG. 5 gives angular velocity and acceleration waveforms measured when the person walked several steps with the mobile phone.
  • the waveforms indicate the IMU/mobile phone’s motion characteristics when the person walks. For example, angular velocities indicate the phone roll, pitch and yaw.
  • the waveforms are associated with motion characteristics of the person’s body.
  • the waveform indicates any small pose change in walking. Different persons have different waveforms with particular characteristics in walking. Through these differences, persons can be distinguished from each other by comparing their moving waveforms of linear velocities and angular velocities. If only one UE has the same velocity waveforms as a user’s waveforms, the correlation will be established.
  • an integral of a velocity difference waveform can be used in the matching procedure.
  • FIG. 4 gives the correlation determination process by comparing waveforms of linear velocities along axis x (i.e. Vx) and axis y (i.e. Vy), and angular velocity about axis z (i.e. ωz), of the user 402 and the UEs 403.
  • a sliding match window can be adopted to make correlation determination.
  • the sliding match window can be 2 seconds.
  • difference waveforms of each UE with respect to the user 402 can be derived, e.g. by subtracting the respective UE’s velocities from the user’s velocities:
  • ΔVx = Vx_U − Vx_UE
  • ΔVy = Vy_U − Vy_UE
  • Δωz = ωz_U − ωz_UE
  • if one difference waveform of a UE fluctuates greatly, e.g. if an integral of the absolute difference waveform |ΔVx| over a sliding match window, written as ∫|ΔVx| dt, exceeds a threshold, the UE can be excluded from the set of potential correlation UEs.
  • the ΔVx waveform 413 and ΔVy waveform 423 of UE3 fluctuate greatly, so UE3 is excluded from the set of potential correlation UEs.
  • the Δωz waveform 435 of UE5 also fluctuates greatly, so it is also excluded. Thus, the set of potential correlation UEs, denoted as {UE1, UE2, UE4}, can be determined.
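  • The window-by-window elimination described above can be sketched as follows (the data layout, names and threshold are assumptions for illustration):

```python
# Sketch: per match window, exclude every candidate UE whose difference
# waveform against the user fluctuates too much; stop once at most one
# candidate remains (one-to-one correlation, or none).

def eliminate(user_windows, ue_windows, threshold=0.5):
    """user_windows: list of per-window user velocity waveforms.
    ue_windows: dict mapping UE id -> list of per-window UE waveforms.
    Returns the surviving candidate ids."""
    candidates = set(ue_windows)
    for w, user_wave in enumerate(user_windows):
        for ue in list(candidates):
            diff = sum(abs(u - e) for u, e in zip(user_wave, ue_windows[ue][w]))
            if diff > threshold:
                candidates.discard(ue)   # difference waveform fluctuates greatly
        if len(candidates) <= 1:
            break
    return candidates

user_windows = [[1.0, 1.0], [0.5, 0.5]]
ue_windows = {
    "UE1": [[1.0, 1.0], [0.5, 0.5]],   # matches the user in every window
    "UE2": [[1.0, 1.0], [2.0, 2.0]],   # diverges in the second window
    "UE3": [[3.0, 3.0], [0.5, 0.5]],   # diverges in the first window
}
print(eliminate(user_windows, ue_windows))  # only UE1 survives
```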
  • the correlator can continue to perform correlation match between waveforms of the user 402 and waveforms of the UE1 403-1, UE2 403-2 and UE4 403-4.
  • the correlator may find that the ΔVx waveform 414 and ΔVy waveform 424 of UE4 fluctuate greatly, as shown in FIG. 4. Then, UE4 can be excluded from the set of potential correlation UEs.
  • the correlator can exclude UE1 from the set of potential correlation UEs, due to the great waveform fluctuation in its ΔVx waveform 411 and ΔVy waveform 421. Finally, only one UE (i.e. UE2) is left in the set of potential correlation UEs. Consequently, a one-to-one correlation between a user and a UE can be established.
  • the waveforms of a user and UEs are not limited to waveforms of linear velocity and angular velocity, but can be any kind of waveform corresponding to motion statuses of a user and UEs.
  • the above waveform may comprise a waveform of acceleration, or a waveform of any variable derived from at least one of the linear velocity and the angular velocity.
  • the above waveform e.g. waveforms of linear velocity, angular velocity, and acceleration
  • may be periodic curves correspond to each human walk step. So these waveforms can be transformed from time domain to frequency domain, e.g. via Fast Fourier Transform (FFT) .
  • a person’s walking frequency can be determined from a waveform of the user, and then the frequencies of the waveforms of the potential correlation UEs can be compared against the user’s frequency, to perform an operation of correlation match.
  • a person’s roll phase may consistently differ from his or her pitch phase. This phase difference can be used to identify a specific person. Accordingly, the correlation match can be performed based on this phase difference. It can be appreciated that, by analyzing and comparing the waveforms of a user and UEs, many waveform characteristics reflecting characteristics of motion status can be utilized to determine the correlation between a user and a UE accurately.
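As a sketch of the walking-frequency comparison, the dominant frequency of each waveform could be extracted via FFT and compared; the sampling rate, tolerance, and synthetic signals below are assumed for illustration.

```python
import numpy as np

def dominant_frequency(waveform, fs):
    """Return the strongest non-DC frequency component via FFT."""
    spectrum = np.abs(np.fft.rfft(waveform))
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / fs)
    spectrum[0] = 0.0                      # ignore the DC component
    return freqs[np.argmax(spectrum)]

fs = 50.0                                  # 50 Hz sampling (assumed)
t = np.arange(0, 4, 1 / fs)
user_vx = np.sin(2 * np.pi * 1.8 * t)      # ~1.8 steps per second
ue_vx = 0.8 * np.sin(2 * np.pi * 1.8 * t + 0.3)  # same gait, different amplitude/phase

f_user = dominant_frequency(user_vx, fs)
f_ue = dominant_frequency(ue_vx, fs)
print(abs(f_user - f_ue) < 0.2)            # matching gait frequency -> potential correlation
```

A UE whose dominant frequency differs from the user’s by more than the tolerance would be excluded from the potential correlation set.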
  • FIG. 6 illustrates an exemplary procedure for establishing correlation by using multiple instantaneous matches.
  • Instantaneous correlation match synchronously measures the velocities of users and UEs. It often needs several matches at different time instances, as shown in FIG. 6.
  • For simplicity, a horizontal motion is assumed, so the comparison covers the user’s and UE’s linear velocities along axis x and axis y, and the angular velocity about axis z (yaw) . It is also assumed that those velocities Vx, Vy and ωz have been transformed to the earth coordinate system via quaternion calculation.
  • a visual motion monitor will track a target user 602 to measure velocities of the user 602, through the cameras 604.
  • several adjacent UEs 603-1, 603-2, 603-3, 603-4, 603-5 (which constitute a set of potential correlation UEs 603) report their IMU information of velocities.
  • once a correlator obtains the information of motion statuses of both the user 602 and the UEs 603, it can make a correlation determination by comparing instantaneous velocities of the user 602 with instantaneous velocities of each potential UE.
  • thresholds for Vx, Vy and ωz can be set as VTx, VTy and ωTz, respectively, for example.
  • a difference between an instantaneous velocity of the user 602 and a potential UE can be calculated and compared with the corresponding threshold. If the difference goes beyond the threshold, the potential UE can be excluded from the set of potential correlation UEs.
  • UE5 doesn’t make a translation motion; it stays at the same position, but it is making a rotation motion with an angular velocity ωz. UE5’s ωz is greater than the user’s angular velocity by more than the threshold ωTz.
  • UE5 603-5 is thus also determined not to correlate with the target user 602.
  • the other three UEs, i.e. UE1, UE2 and UE4 shown in FIG. 6, are static like the target user 602, so they can be determined as potential correlation UEs after the correlation process at time 1. As a one-to-one correlation between a user and a UE isn’t established at this time, the correlation process should proceed to make a further determination.
  • UE1 and UE2 have the same motion status.
  • UE4 makes a translation motion and has the same translation velocity as the user 602, but UE4 has a different orientation.
  • UE4’s Vx and Vy are greater than the user 602’s by more than the thresholds VTx and VTy, respectively. So UE4 603-4 can be excluded from the set of potential correlation UEs. Now, only two UEs are left in the set of potential correlation UEs and need to be distinguished.
  • UE1’s IMU report shows that it makes a translation motion and has the same angular velocity as the user 602. But its translation linear velocity is smaller than the user’s by more than the thresholds VTx and VTy. So UE1 603-1 can be excluded from the set of potential correlation UEs. Now, only UE2 603-2 is kept in the set of potential correlation UEs, and consequently a one-to-one correlation between a user and a UE is established.
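A minimal sketch of the instantaneous match at one time instance; the thresholds (VTx, VTy, ωTz) and velocity values are illustrative assumptions:

```python
def instantaneous_match(user_v, ue_v, thresholds):
    """True if every velocity component differs by less than its threshold
    (the AND of the per-component comparisons)."""
    return all(abs(u - e) < t for u, e, t in zip(user_v, ue_v, thresholds))

# thresholds (VTx, VTy, wTz) -- illustrative values
thresholds = (0.1, 0.1, 0.2)

user = (1.2, 0.0, 0.05)          # (Vx, Vy, wz) of the target user at one instance
ue2  = (1.18, 0.02, 0.06)        # carried UE: all components within thresholds
ue4  = (0.0, 1.2, 0.05)          # same speed, different orientation: excluded
ue5  = (0.0, 0.0, 1.5)           # rotating in place: excluded

print([instantaneous_match(user, v, thresholds) for v in (ue2, ue4, ue5)])
# -> [True, False, False]
```

Repeating the match at further time instances excludes the remaining non-correlating UEs, as in the FIG. 6 walk-through.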
  • UE-centralized correlation, i.e. finding a correlation user for a target UE.
  • a correlator can find out a correlation user from one or more potential correlation users through a similar correlation process. Then, the appearance characteristics of the correlation user can be provided to waiters, making it easy for them to find the owner of the order. In this case, only one UE reports an IMU measurement result.
  • a visual motion monitor can track multiple adjacent users, and measure their motion statuses.
  • the UE-centralized correlation process is similar to the user-centralized correlation processes shown in FIG. 4 and FIG. 6.
  • the correlation mechanism disclosed in this disclosure can be utilized to support commercial applications.
  • the user-centralized correlation can be used in a scenario in which it is triggered according to user activities, e.g. standing in front of a showcase for more than 3 seconds, to extract the user’s appearance characteristics, then to establish the user’s correlation UE and push some commercial information to the UE.
  • the UE-centralized correlation can be used in a scenario in which it is triggered according to a setting or operation on a UE (e.g. the UE is operated to order a meal in a restaurant) , to find out the appearance characteristics of the UE’s owner, then to inform the waiter of the owner’s appearance characteristics.
  • the user-centralized correlation finds a UE from a group of UEs for a target user, while the UE-centralized correlation finds a user from a group of users for a target UE. Both scenarios have similar correlation processes. Only the user-centralized correlation is described in detail here. However, it can be appreciated that the architecture, operation flow and mechanism can easily be extended to scenarios of the UE-centralized correlation.
  • FIG. 7 shows various function modules of a system 700 for correlating a user with a user equipment according to an embodiment of the present disclosure. It is contemplated that the functions of these modules may be combined in one or more modules or performed by other modules of equivalent functionality.
  • the system 700 may be implemented in a data processing device, such as the data processing device 101, 301 shown in FIG. 1 and FIG. 3, which can be coupled to a video network and a mobile network.
  • the system 700 may comprise a communication interface (not shown) to communicate with the video network, so as to obtain data related to the motion status of one or more personal users, such as an image stream captured by one or more cameras.
  • the system 700 may further comprise a communication interface (not shown) to communicate with the mobile network, so as to obtain data related to motion status of one or more UEs, such as IMU reports.
  • the system 700 may comprise a user motion determination module 701, a UE motion determination module 702, a waveform match module 703, an instantaneous match module 704, and a correlation database 705.
  • the user motion determination module 701 is provided to determine motion data measured continuously with regard to a motion of a user within a period of time.
  • the user motion determination module 701 can analyze an image stream to calculate a user’s linear velocities and angular velocities.
  • the system 700 can further comprise a filter for filtering the calculated velocities.
  • the system 700 can further comprise a coordinate transformer for obtaining motion data in the earth coordinate system.
  • the UE motion determination module 702 is provided to determine motion data measured continuously with regard to a motion of a UE within the period of time. For example, the UE motion determination module 702 can receive IMU reports from a UE to calculate linear velocities and angular velocities of the UE. In an embodiment, the calculated velocities can also be filtered by some filters, and be transformed to obtain motion data in the earth coordinate system.
  • the waveform match module 703 is provided to perform a correlation match between a waveform derived from the user’s motion data and a waveform derived from the UE’s motion data. In an embodiment, it can use a sliding window to perform waveform correlation match.
  • An exemplary model 800 for waveform correlation match is illustrated in FIG. 8. The model 800 compares the waveforms of a user and a UE. The UE’s waveform is subtracted from the user’s waveform to obtain a difference waveform. Then, an integration calculation can be done on each difference waveform over a sliding match window, as shown in FIG. 8. The waveform integration result of each difference waveform can be compared with a predefined threshold.
  • the waveform match module 703 can determine whether there is a correlation between the user and the UE based on the comparison, for example according to a predefined correlation condition.
  • the correlation condition can be set such that, if all waveform integration results are less than the related thresholds, i.e. the “&” operation at 810 is true, then the UE can be determined as a potential correlation UE of the user, or the user can be determined as a potential correlation user of the UE.
  • Continuous waveform match can be performed, until only one UE or one user meets the correlation condition.
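The model-800 condition, i.e. integrating each difference waveform over the sliding window and AND-ing the per-component threshold comparisons, might be sketched as follows. The component names, thresholds, and signals are illustrative assumptions.

```python
import numpy as np

def waveform_correlation(user_wf, ue_wf, thresholds, dt):
    """Model-800-style match: integrate each difference waveform over the
    sliding window, then AND the per-component threshold comparisons."""
    results = []
    for key, thresh in thresholds.items():
        diff = np.asarray(user_wf[key]) - np.asarray(ue_wf[key])
        results.append(np.sum(np.abs(diff)) * dt < thresh)
    return all(results)   # the '&' operation at 810

dt, n = 0.02, 100                      # 2-second window at 50 Hz (assumed)
zeros = np.zeros(n)
user = {"Vx": np.ones(n), "Vy": zeros, "wz": zeros}
ue_a = {"Vx": np.ones(n) * 1.01, "Vy": zeros, "wz": zeros}   # close match
ue_b = {"Vx": zeros, "Vy": zeros, "wz": zeros}               # static UE
thresholds = {"Vx": 0.5, "Vy": 0.5, "wz": 0.5}

print(waveform_correlation(user, ue_a, thresholds, dt))  # -> True
print(waveform_correlation(user, ue_b, thresholds, dt))  # -> False
```

In the full six-component case the dictionaries would carry Vx, Vy, Vz and the three angular velocities.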
  • the user motion determination module 701 and the UE motion determination module 702 are configured to further obtain the user’s waveform and the UE’s waveform in a next sliding match window, respectively. Accordingly, difference waveform for the next sliding match window can be obtained, and integration calculations can be done to each difference waveform for the next sliding match window.
  • the instantaneous match module 704 is provided to perform a correlation match between the user’s instantaneous motion data and the UE’s instantaneous motion data.
  • An exemplary model 900 for instantaneous correlation match is illustrated in FIG. 9.
  • the model 900 compares instantaneous motion statuses between a user and a UE.
  • the difference of each velocity can be compared with a predefined related threshold.
  • the instantaneous match module 704 can determine whether there is a correlation between the user and the UE based on the comparison, for example according to a predefined correlation condition.
  • the correlation condition can be set such that, if all velocities are matched, i.e. all of ΔVx, ΔVy, ΔVz, Δωx, Δωy, and Δωz are less than the related thresholds, i.e. the “&” operation at 910 is true, then the UE can be determined to be a potential correlation UE of the user, or the user can be determined as a potential correlation user of the UE.
  • the match can be performed for several time instances to exclude those UEs without similar motion status, until only one UE or one user which meets the correlation condition is left.
  • the model 900 can be configured to make a correlation match through a comparison of waveform characteristics in the frequency domain, such as amplitude, frequency and/or phase. More particularly, from each waveform of linear velocity and angular velocity, the amplitude, frequency and phase parameters of the waveform can be extracted. So, 18 frequency-domain parameters can be derived from the waveforms of Vx, Vy, Vz, ωx, ωy, ωz for correlation match. For example, the amplitude, frequency and phase of a user’s Vx waveform in a sliding window can be derived in the user motion determination module 701. The amplitude, frequency and phase of a UE’s Vx waveform in the same sliding window can be derived in the UE motion determination module 702.
  • the amplitude, frequency and phase of the user’s Vx waveform and the amplitude, frequency and phase of the UE’s Vx waveform can be input into the model 900, and be compared with each other, respectively.
  • differences between amplitude, frequency, and phase of the user’s Vx waveform and amplitude, frequency, and phase of the UE’s Vx waveform can be calculated, respectively.
  • the differences can be compared with corresponding thresholds, to determine whether the user’s Vx waveform matches the UE’s Vx waveform.
  • similar operations can be performed for correlation match.
  • Continuous waveform match based on waveform characteristics in frequency domain can be performed, until only one UE or one user meets the correlation condition.
  • the user motion determination module 701 and the UE motion determination module 702 are configured to further obtain the user’s waveform and the UE’s waveform in a next sliding match window, respectively. Accordingly, waveform characteristics in frequency domain of these waveforms in the next sliding match window, such as amplitude, frequency and phase can be obtained, and compared accordingly for the next sliding match window.
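The frequency-domain characteristic match can be sketched as follows: the amplitude, frequency, and phase of the dominant component are extracted from each waveform via FFT and compared component-wise. The tolerances, sampling rate, and signals are illustrative assumptions.

```python
import numpy as np

def waveform_characteristics(waveform, fs):
    """Extract (amplitude, frequency, phase) of the dominant component via FFT."""
    spectrum = np.fft.rfft(waveform)
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / fs)
    mag = np.abs(spectrum)
    mag[0] = 0.0                           # skip the DC component
    k = int(np.argmax(mag))
    amplitude = 2.0 * mag[k] / len(waveform)
    return amplitude, freqs[k], np.angle(spectrum[k])

def characteristics_match(user_wf, ue_wf, fs, tol=(0.2, 0.2, 0.5)):
    """Compare amplitude, frequency and phase of the dominant components."""
    u = waveform_characteristics(user_wf, fs)
    e = waveform_characteristics(ue_wf, fs)
    return all(abs(a - b) < t for a, b, t in zip(u, e, tol))

fs = 50.0                                  # 50 Hz sampling (assumed)
t = np.arange(0, 4, 1 / fs)
user_vx = np.sin(2 * np.pi * 2.0 * t)      # user's Vx waveform in the window
ue_vx = 0.95 * np.sin(2 * np.pi * 2.0 * t) # carried UE: slightly damped amplitude

print(characteristics_match(user_vx, ue_vx, fs))  # -> True
```

Applying the same extraction to all six velocity waveforms yields the 18 frequency-domain parameters mentioned above.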
  • the results of correlation determination can be stored in a database 705.
  • the database 705 may be any form of storage or storage system.
  • the database 705 may be a component outside the system 700, and coupled to the system 700. In other embodiments, the database 705 may be a part of the system 700.
  • An application module 709 can retrieve the correlation between a user and a UE from the database, to implement application services.
  • the correlation between the user and the UE can support many new application services. For example, in an application service, commercial messages can be pushed to a UE according to the user activities of its correlated user. In another application service, after a user orders a meal with his UE, the appearance characteristics of the user can be provided to a restaurant waiter, according to the correlation between the UE and the user.
  • the application module 709 may be a component outside the system 700, and coupled to the system 700. In other embodiments, the application module 709 may be a part of the system 700.
  • the system 700 may further comprise a target activities predefinition module 706 and a user activity identification module 707.
  • the target activities predefinition module 706 is provided to define some expected activities for user recognition.
  • the target activities are the triggering conditions of a correlation match process.
  • the user activity identification module 707 is provided to capture user activities, for example based on an image stream from a video network. Then the user activity identification module 707 matches the user activities against the predefined activities, so as to trigger a correlation match.
  • a target activity may be a woman who stands before a showcase and has browsed the showcase for more than 3 seconds.
  • when the user activity identification module 707 identifies that a user’s activity matches this target activity, it can trigger a correlation match process taking the user as a target user. In an example, identification of the user can be triggered accordingly.
  • the system 700 may further comprise a user characteristics extraction module 708, which is provided to extract the appearance characteristics of a user, for example from images or videos captured by cameras.
  • the target activities predefinition module 706 and the user activity identification module 707 and the user characteristics extraction module 708 may be components outside the system 700.
  • the target activities predefinition module 706, the user activity identification module 707, and the user characteristics extraction module 708 may be components in a video network or included as part of any network component.
  • the target activities predefinition module 706 and the user activity identification module 707 and the user characteristics extraction module 708 may be part of a visual motion monitor, a visual odometry, or the like.
  • FIG. 10 shows a flowchart of a method 1000 for correlating a user with a user equipment according to an embodiment of the present disclosure.
  • the method 1000 can be implemented by the system 700 of FIG. 7, or a data processing device such as the data processing device 101, 301.
  • the method 1000 may comprise monitoring user activities based on visual monitoring, at block 1010; and triggering a correlation match process based on the user activities, at block 1020. For example, if it is determined that a monitored activity of a user matches a predefined target activity, a correlation match process can be triggered to determine a correlation UE for the user.
  • the method 1000 may further comprise extracting appearance characteristics of the user, to identify a target user, at block 1030.
  • the method 1000 can comprise measuring the current motion status of the target user in the visual domain, such as linear velocities (Vx, Vy, Vz) and angular velocities (ωx, ωy, ωz) .
  • the method 1000 can comprise receiving the IMU measurement results of each UE of a set of potential correlation UEs. There may be more than one potential correlation UE in the set. The motion status of each of the potential correlation UEs, such as linear velocities (Vx, Vy, Vz) and angular velocities (ωx, ωy, ωz) , can be calculated or obtained from the IMU measurement reports.
  • although block 1050 is shown as following block 1040, it can be appreciated that their order can be changed.
  • block 1050 can be performed in parallel with block 1040.
  • the method may further comprise transforming coordinate systems of the target user and the adjacent UE to a common coordinate system, at block 1060.
  • a quaternion calculation can be performed on the motion status of the target user and the adjacent UE, to complete the transformation of coordinate system.
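The quaternion transformation to a common (earth) coordinate system can be illustrated as follows. The (w, x, y, z) quaternion convention and the example 90-degree rotation are assumptions for illustration, not details from the disclosure.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z): v' = q v q*."""
    w, x, y, z = q
    r = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return r @ np.asarray(v, dtype=float)

# Suppose the UE body frame is rotated 90 degrees about the z axis
# relative to the earth frame (an IMU would report this orientation).
theta = np.pi / 2
q = (np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2))

v_body = [1.0, 0.0, 0.0]            # velocity measured in the UE body frame
v_earth = quat_rotate(q, v_body)    # the same velocity in the earth frame
print(v_earth)                      # approximately [0, 1, 0]
```

After both the user's and the UE's velocities are expressed in the earth frame, the component-wise comparisons described above become meaningful.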
  • the method 1000 can proceed to perform correlation match over all velocities with the correlation match mechanism described above, such as the waveform correlation match described with reference to FIG. 4 and 8.
  • the method can comprise determining if there is only one correlation UE. If so, it can be determined that a one-to-one correlation between the target user and the correlation UE is established, at block 1090. Otherwise, the method can proceed to block 1100, to exclude any UE which is determined not to correlate with the target user from the set of potential correlation UEs. Then, the method returns to make a further correlation match in a next sliding match window.
  • multiple exclusion operations are needed to establish one-to-one correlation between a user and a UE. Meanwhile, the exclusion operation can reduce the number of tracked UEs.
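The repeated exclusion over successive sliding match windows could be sketched as an iterative loop; the data layout, threshold, and synthetic waveforms are illustrative assumptions.

```python
import numpy as np

def one_to_one_correlation(user_windows, ue_windows, thresh=0.5, dt=0.02):
    """Repeat the window match, excluding non-correlating UEs, until one is left.

    user_windows: list of per-window user waveforms (1-D arrays)
    ue_windows:   dict ue_id -> list of per-window UE waveforms
    """
    candidates = set(ue_windows)
    for w, user_wf in enumerate(user_windows):
        for ue_id in list(candidates):
            diff = np.abs(np.asarray(user_wf) - np.asarray(ue_windows[ue_id][w]))
            if diff.sum() * dt >= thresh:
                candidates.discard(ue_id)   # exclude the non-correlating UE
        if len(candidates) == 1:
            return candidates.pop()         # one-to-one correlation established
    return None                             # still ambiguous after all windows

n = 100
t = np.linspace(0, 2, n)
user = [np.sin(np.pi * t), np.cos(np.pi * t)]          # two successive windows
ues = {
    "UE1": [np.sin(np.pi * t), np.zeros(n)],           # matches window 0 only
    "UE2": [np.sin(np.pi * t), np.cos(np.pi * t)],     # matches both windows
}
print(one_to_one_correlation(user, ues))  # -> UE2
```

As the bullet above notes, each exclusion also shrinks the set of UEs that must be tracked in later windows.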
  • FIG. 11 illustrates a simplified block diagram of an apparatus 1100 that may be embodied in/as a data processing device (e.g., the data processing device 101, 301 shown in FIG. 1 and FIG. 3) .
  • the apparatus 1100 may comprise at least one processor 1101, such as a data processor (DP) and at least one memory (MEM) 1102 coupled to the at least one processor 1101.
  • the apparatus 1100 may further comprise one or more transmitters TX, one or more receivers RX 1103, or one or more transceivers coupled to the one or more processors 1101 to communicate with a wireless communication network (e.g. the mobile network 120 shown in FIG. 1) .
  • the apparatus 1100 may have one or more wireline communication means that connects the apparatus to a computer cloud network or system, such as the video network 110.
  • the MEM 1102 stores a program (PROG) 1104.
  • the PROG 1104 may include instructions that, when executed on the associated processor 1101, enable the apparatus 1100 to operate in accordance with the embodiments of the present disclosure, for example to perform one of the methods 200 and 1000.
  • a combination of the at least one processor 1101 and the at least one MEM 1102 may form processing circuitry or means 1105 adapted to implement various embodiments of the present disclosure.
  • Various embodiments of the present disclosure may be implemented by computer programs executable by one or more of the processors 1101, by software, firmware, hardware, or a combination thereof.
  • the MEMs 1102 may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, as non-limiting examples.
  • the processors 1101 may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors DSPs and processors based on multicore processor architecture, as non-limiting examples.
  • the various exemplary embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof.
  • some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto.
  • While various aspects of the exemplary embodiments of this invention may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
  • the exemplary embodiments of the inventions may be practiced in various components such as integrated circuit chips and modules. It should thus be appreciated that the exemplary embodiments of this invention may be realized in an apparatus that is embodied as an integrated circuit, where the integrated circuit may comprise circuitry (as well as possibly firmware) for embodying at least one or more of a data processor, a digital signal processor, baseband circuitry and radio frequency circuitry that are configurable so as to operate in accordance with the exemplary embodiments of this invention.
  • exemplary embodiments of the inventions may be embodied in computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other device.
  • the computer executable instructions may be stored on a computer readable medium, for example, non-transitory computer readable medium, such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc.
  • the function of the program modules may be combined or distributed as desired in various embodiments.
  • the function may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA) , and the like.
  • circuitry may refer to one or more or all of the following:
  • This definition of circuitry applies to all uses of this term in this application, including in any claims.
  • The term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or a portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware.
  • The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.

Abstract

Methods and apparatus for correlation between a user and a user equipment are provided. A method comprises: obtaining first motion data measured continuously with regard to a motion of a user within a period of time; obtaining second motion data measured continuously with regard to a motion of a user equipment within the period of time; performing a correlation match between a first waveform derived from the first motion data and a second waveform derived from the second motion data; and determining whether there is a correlation between the user and the user equipment based on the correlation match between the first waveform and the second waveform.

Description

METHOD AND APPARATUS FOR CORRELATING A USER AND A USER EQUIPMENT TECHNICAL FIELD
Embodiments of the present disclosure generally relate to user monitoring and user equipment monitoring, and specifically to methods, apparatus and computer readable storage medium for correlating a user and a user equipment.
BACKGROUND
With massive user activities, data reflecting those activities can be collected. For a communication network operator, for example, it is attractive to monetize such data for business value. For instance, an operator can try to monetize user motion data in a shopping mall by analyzing users’ business behaviors and preferences according to data indicative of user motion.
An application example can be used to illustrate problems involved in data monetization. Assume that in a shopping mall, a woman is shopping. When she stands before a showcase and browses the goods in the showcase for a long time (e.g. 3 seconds) , it may mean that she is interested in the goods in the showcase. Her activity can be captured by a camera. The shop may obtain her appearance characteristics via image recognition, e.g. face, clothing colour, etc. But the shop doesn’t know who she is. The shop cannot push any commercial information, e.g. an advertisement related to the goods in the showcase, to her.
It would be beneficial to establish a correlation between her appearance characteristics and her UE. Then, many commercial applications can be realized via her UE.
SUMMARY
The present disclosure aims to solve the aforementioned problems by correlating a user and a user equipment accurately and rapidly. Other features and advantages of embodiments of the present disclosure will also be understood from the following description of specific embodiments when read in conjunction with the accompanying drawings, which illustrate, by way of example, the principles of embodiments of the present disclosure.
According to a first aspect of the present disclosure, there is provided a method for correlation between a user and a user equipment. The method comprises obtaining first motion data measured continuously with regard to a motion of a user within a period of time; obtaining second motion data measured continuously with regard to a motion of a user equipment within the period of time; performing a correlation match between a first waveform derived from the first motion data and a second waveform derived from the second motion data; and determining whether there is a correlation between the user and the user equipment based on the correlation match between the first waveform and the second waveform.
In some embodiments, performing the correlation match can comprise determining a difference waveform corresponding to a difference between the first motion data and the second motion data; and comparing a waveform characteristic of the difference waveform with a threshold to determine whether there is a correlation match between the first waveform and the second waveform.
In some embodiments, performing the correlation match can comprise transforming the first waveform and the second waveform from time domain to frequency domain; and performing the correlation match in the frequency domain.
In some embodiments, the correlation match can be performed based on at least one of the following waveform characteristics: amplitude, frequency, period, and phase.
In some embodiments, the method can further comprise determining a waveform characteristic in frequency domain of the first waveform; and determining a corresponding waveform characteristic in frequency domain of the second waveform. Performing the correlation match can comprise performing a match between the waveform characteristic in frequency domain of the first waveform and the corresponding waveform characteristic in frequency domain of the second waveform. In some embodiments, performing a match between waveform characteristics can comprise determining a difference between the waveform characteristic in frequency domain of the first waveform and the corresponding waveform characteristic in frequency domain of the second waveform; and comparing the difference between waveform characteristics with a threshold. The waveform characteristic in frequency domain can comprise at least one of the following: amplitude, frequency, period and phase.
In some embodiments, the method can further comprise transforming at least one of the first motion data and the second motion data to motion data in a common coordinate system.
In some embodiments, the first motion data and the second motion data can comprise data indicative of a motion in at least one of the following dimensions: linear velocity, angular velocity, and a dimension defined by a variable derived from at least one of the linear velocity and the angular velocity.
In some embodiments, obtaining the first motion data can comprise analyzing an image flow captured within the period of time to determine the first motion data of the user.
In some embodiments, obtaining the second motion data can comprise receiving a continuous measurement report of a motion sensor inside the user equipment, to determine the second motion data of the user equipment.
In some embodiments, the method can further comprise determining a set of user equipments; for each particular user equipment of the set, determining a correlation between the user and the particular user equipment in a match window; and excluding from the set, a user equipment which is determined to not have a correlation with the user. A determination of a correlation between the user and the particular user equipment can comprise: obtaining motion data measured continuously with respect to a motion of the particular user equipment within a period of time of the match window, performing a correlation match between the first waveform and a waveform derived from the motion data of the particular user equipment, and determining whether there is a correlation between the user and the particular user equipment based on the correlation match between the first waveform and the waveform derived from the motion data of the particular user equipment. The method can further comprise in case where there are more than one user equipment in the set, continuing the determination of a correlation between the user and each particular user equipment of the set and the excluding in a next match window.
In some embodiments, the method can further comprise determining a set of users; for each particular user of the set, determining a correlation between the particular user and the user equipment in a match window; and excluding from the set, a user which is determined to not have a correlation with the user equipment. A determination of a correlation between the particular user and the user equipment can comprise obtaining motion data measured continuously with respect to a motion of the particular user within a period of time of the match window, performing a correlation match between a waveform derived from the motion data of the particular user and the second waveform, and determining whether there is a correlation between the particular user and the user equipment based on the correlation match between the waveform derived from the motion data of the particular user and the second waveform. In some embodiments, the method can further comprise: in case where there are more than one user in the set, continuing the determination of a correlation between the user equipment and each particular user of the set and the excluding in a next match window.
In some embodiments, the method can further comprise: obtaining instantaneous motion data in the first motion data measured at a plurality of time instances, and instantaneous data in the second motion data measured at the same time instances; performing a match between the instantaneous motion data in the first motion data and corresponding instantaneous data in the second motion data; and determining, the correlation between the user and the user equipment based on the match between the instantaneous motion data in the first motion data and corresponding instantaneous data in the second motion data. In some embodiments, the method can further comprise: determining a difference between the instantaneous motion data in the first motion data and the instantaneous motion data in the second motion data; and comparing the difference between instantaneous motion data with a threshold.
In some embodiments, the method can further comprise identifying an activity of the user or the user equipment for triggering said correlation match.
In some embodiments, the method can further comprise extracting at least one appearance characteristic of the user. The method can further comprise correlating the at least one appearance characteristic of the user with the user equipment, according to the correlation between the user and the user equipment.
In some embodiments, the method can further comprise pushing a service associated with the user to the user equipment, according to correlation between the user and the user equipment.
In some embodiments, the method can further comprise providing a service associated with the user equipment to the user, according to correlation between the user and the user equipment.
According to a second aspect of the present disclosure, there is provided an apparatus comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured, with the at least one processor, to cause the apparatus at least to: obtain first motion data measured continuously with regard to a motion of a user within a period of time; obtain second motion data measured continuously with regard to a motion of a user equipment within the period of time; perform a correlation match between a first waveform derived from the first motion data and a second waveform derived from the second motion data; and determine whether there is a correlation between the user and the user equipment based on the correlation match between the first waveform and the second waveform.
According to a third aspect of the present disclosure, there is provided an apparatus comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured, with the at least one processor, to cause the performance of the method according to the first aspect.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium on which instructions are stored, the instructions, when executed by at least one processor, causing the at least one processor to perform the method according to the first aspect.
BRIEF DESCRIPTION OF THE DRAWINGS
Some example embodiments will now be described with reference to the accompanying drawings in which:
FIG. 1 illustrates an exemplary architecture of a system for implementing a correlation mechanism according to an embodiment of the present disclosure;
FIG. 2 is a flowchart depicting a procedure for correlating a user with a user equipment according to an embodiment of the present disclosure;
FIG. 3 illustrates an exemplary scenario for correlating a user with a user equipment according to an embodiment of the present disclosure;
FIG. 4 illustrates an exemplary procedure for establishing correlation by using waveform match;
FIG. 5 illustrates exemplary waveforms of angular velocity and acceleration measured from a user equipment;
FIG. 6 illustrates an exemplary procedure for establishing correlation by using multiple instantaneous matches;
FIG. 7 is a block diagram showing various function modules of a system for correlating a user with a user equipment according to an embodiment of the present disclosure;
FIG. 8 illustrates an exemplary model for waveform correlation match according to an embodiment of the present disclosure;
FIG. 9 illustrates an exemplary model for instantaneous correlation match according to an embodiment of the present disclosure;
FIG. 10 illustrates a flowchart depicting a procedure for correlating a user with a user equipment according to an embodiment of the present disclosure; and
FIG. 11 shows a simplified block diagram of apparatus according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
The embodiments of the present disclosure are described in detail with reference to the accompanying drawings. It should be understood that these embodiments are discussed only for the purpose of enabling those skilled persons in the art to better understand and thus implement the present disclosure, rather than suggesting any limitations on the scope of the present disclosure. Reference throughout this specification to features, advantages, or similar language does not imply that all of the features and advantages that may be realized with the present disclosure should be or are in any single embodiment of the disclosure. Rather, language referring to the features and advantages is understood to mean that a specific feature, advantage, or characteristic described in connection with an embodiment is included in at least one embodiment of the present disclosure.
Furthermore, the described features, advantages, and characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. One skilled in the relevant art will recognize that the disclosure may be practiced without one or more of the specific features or advantages of a particular embodiment.  In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments of the disclosure.
As used herein, the terms "data, " "information" and similar terms may be used interchangeably to refer to data capable of being transmitted, received, stored and/or processed in accordance with embodiments of the present invention. Thus, use of any such terms should not be taken to limit the spirit and scope of embodiments of the present invention.
This disclosure addresses the problem mentioned above by correlating a user and a user equipment accurately and quickly. Most existing solutions use position and trajectory (a series of positions) to establish a correlation between a user and a UE, with the UE's position measured via radio-based localization solutions. As radio-based localization solutions are not accurate, it is difficult to establish a correlation based on a small movement. For example, if a user only turns around at the same position, the existing solutions cannot make a determination on the correlation.
According to embodiments of this disclosure, motion data indicative of motion statuses of a user and a user equipment (UE) are utilized to establish a correlation between the user and the UE. For example, the motion statuses can be indicated with at least one of the following: linear velocities, angular velocities, waveforms of linear velocities and angular velocities, and frequency-domain characteristics of those waveforms, such as amplitude, frequency, period, and phase. FIG. 1 illustrates an exemplary architecture of a system 100 for implementing a correlation mechanism according to an embodiment of the present disclosure. The system 100 can use one or more cameras 104 to recognize one or more users or personal objects (e.g. user1 102-1, user2 102-2, user3 102-3 in FIG. 1, commonly referred to as 102), and monitor activities and motion statuses of the users. For example, these cameras may be installed in a shopping mall. These cameras 104 are connected to a video network 110. Data (e.g. images, or video) captured by the cameras 104 can be uploaded to the video network 110, and stored in a storage device (not shown) in the video network 110.
The users carry UEs with them. The UE may be any component (or collection of components) capable of wireless communication and motion measurement. For example, the UE may be a smart phone, a smart watch, or a wearable device, and the like, or any combination thereof. Motion status of each UE  (e.g. UE1 103-1, UE2 103-2, UE3 103-3 in FIG. 1, and commonly referred to as 103) can be measured via motion sensors (such as integrated IMU sensors) within the respective UE. The measured data can be reported to the system 100 via a mobile network 120, such as a cellular network or a WiFi network.
A data processing device 101 can be communicably connected to the video network 110, and can retrieve data indicative of the motion statuses of the users from the video network 110. The data processing device 101 can also be communicably connected to the mobile network 120, and retrieve the measured data indicative of the motion statuses of the UEs from the mobile network 120. Based on the motion statuses of the users 102 and the motion statuses of the UEs 103, the data processing device 101 can determine a correlation between a user and a UE, by waveform matching and/or multiple instantaneous matches.
As shown in FIG. 1, the data processing device 101 may be an apparatus independent from the video network 110 and the mobile network 120. In other embodiments, the data processing device 101 may be an independent component in the mobile network 120 or included as part of any network component. For example, the data processing device 101 may be an application server.
FIG. 2 is a flowchart depicting a procedure 200 for correlating a user with a user equipment according to an embodiment of the present disclosure. The data processing device 101 as described above can implement this procedure. At block 210, the procedure comprises obtaining first motion data measured continuously with regard to a motion of a user within a period of time. In this regard, the data processing device 101 can analyze an image flow captured within a period of time to determine a motion status of the user.
In an embodiment, user appearance characteristics of the user can be captured and extracted by camera-based recognition mechanisms. For example, user appearance characteristics can comprise face, clothing colour, hair type, etc. Some user appearance characteristics are coarse, e.g. clothing colour. Others are fine, e.g. face. The coarse user appearance characteristics can be used to define a user scope. The fine user appearance characteristics can be used to uniquely identify a user. In an embodiment, the user appearance characteristics of one user can form a vector of user appearance characteristics, and can be used as a user identifier.
In an embodiment, continuous and synchronous images from multiple cameras in different directions can be used to determine a user's motion data indicative of the motion status of the user. The user can be identified from these images based on the extracted user appearance characteristics. For example, from the continuous and synchronous images, the data processing device 101 can calculate the user's motion status in at least one of the dimensions including linear velocities along axes x, y, z, and angular velocities about axes x, y, z. In general, a camera can take 30 or more image frames per second, so these continuous and synchronous images can capture even a small movement of a user. In an embodiment, by using geometry calculations and kinematics formulas, it is straightforward to obtain motion data indicative of a motion status of a user based on these images. Visual motion status measurement is a mature technology; for example, visual odometry can be used for robot motion control.
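As an illustrative sketch only (not part of the claimed subject matter), the per-frame velocity calculation can be approximated with finite differences over positions tracked across image frames. The function name, frame rate and input layout below are assumptions for illustration:

```python
import numpy as np

def user_velocities(positions, headings, dt):
    """Estimate a tracked user's linear velocities and yaw rate from
    per-frame positions (N x 3, earth frame, metres) and heading angles
    (N values, radians about the z axis), sampled every dt seconds.

    Returns (linear_velocities, yaw_rates), one row/value per frame interval.
    """
    positions = np.asarray(positions, dtype=float)
    headings = np.asarray(headings, dtype=float)
    # Finite differences between consecutive frames approximate velocity.
    v = np.diff(positions, axis=0) / dt
    # Unwrapping avoids a spurious jump when the heading crosses +/- pi.
    wz = np.diff(np.unwrap(headings)) / dt
    return v, wz

# A user walking along x at 1 m/s, filmed at 30 frames per second.
dt = 1.0 / 30.0
pos = np.array([[i * dt * 1.0, 0.0, 0.0] for i in range(5)])
head = np.zeros(5)
v, wz = user_velocities(pos, head, dt)
```

In practice the positions would come from multi-camera triangulation of the identified user, and the raw differences would be filtered before use.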
At block 220, the procedure comprises obtaining second motion data measured continuously with regard to a motion of a user equipment within the period of time. In this regard, the data processing device 101 can receive a continuous measurement report of a motion sensor inside the user equipment, to determine a motion status of the user equipment.
A UE can be a smart phone. Currently, almost all smart phones integrate motion sensors, e.g. an IMU (Inertial Measurement Unit) sensor. An IMU sensor combines a 3-axis gyroscope, a 3-axis accelerometer and a 3-axis magnetometer. It can measure acceleration along its axes, angular velocity around its axes, and magnetic field along its axes. In an embodiment, a UE can receive a request from the mobile network 120, and then respond to the request by reporting IMU messages (such as the IMU reports shown in FIG. 1) via wireless communication, e.g. cellular or Wi-Fi communication. The IMU messages can comprise IMU measurement results. The IMU measurement results can be directly used to calculate the UE's linear velocities and angular velocities, from which velocity waveforms can be derived subsequently. Frequency-domain characteristics of the velocity waveforms, such as amplitude, frequency, period and phase, can be extracted accordingly. In another embodiment, a UE can transmit its reports of IMU measurements to the mobile network 120 periodically according to application settings.
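As a hedged sketch of how reported IMU accelerations could be turned into linear velocities, the following integrates accelerometer samples with the trapezoid rule. It assumes (for illustration only) that gravity has already been removed and the samples are already in the earth frame; a real implementation would fuse gyroscope and magnetometer data as well:

```python
import numpy as np

def velocity_from_imu(accel, dt, v0=None):
    """Integrate accelerometer samples (N x 3, m/s^2, gravity removed,
    earth frame) into linear velocities using the cumulative trapezoid
    rule. v0 is the initial velocity (defaults to zero)."""
    accel = np.asarray(accel, dtype=float)
    if v0 is None:
        v0 = np.zeros(3)
    # Trapezoid rule: average adjacent samples, scale by the sample period.
    increments = 0.5 * (accel[1:] + accel[:-1]) * dt
    return v0 + np.cumsum(increments, axis=0)

# Constant 0.5 m/s^2 acceleration along x, sampled at 128 Hz for 1 second.
dt = 1.0 / 128.0
acc = np.tile([0.5, 0.0, 0.0], (129, 1))
v = velocity_from_imu(acc, dt)
```

After one second of integration the x velocity reaches 0.5 m/s, as expected from v = a·t.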
At block 230, the procedure proceeds to perform a correlation match between a first waveform derived from the first motion data and a second waveform derived from the second motion data. A waveform is defined as continuous outputs of some variables of motion data, such as linear velocity, angular velocity, acceleration, or other variables derived from the motion data. As the first motion data and the second motion data are continuously measured within the same period of time, one or more waveforms can be derived from each of them. Linear velocities define a target's translation velocity and translation orientation. Angular velocities define a target's rotations of roll, pitch and yaw. For example, a waveform of a user's or UE's linear velocities can represent the user's or UE's translation motion status over time. Similarly, a waveform of a user's or UE's angular velocities can represent the user's or UE's rotation status of roll, pitch and yaw over time. Key features, such as period, amplitude, frequency and phase, can be derived from these waveforms to characterize the motion of the user and the UE. The correlation match between the first waveform and the second waveform can be performed based on one or more of these waveform features.
At block 240, the procedure proceeds to determine whether there is a correlation between the user and the user equipment based on the correlation match between the first waveform and the second waveform. The correlation between a user and a UE can be established based on two facts:
(1) For a user carrying a UE, the UE always has the same (or almost the same) motion status (such as linear velocities along axes x, y, z, angular velocities around axes x, y, z, velocity waveforms, and frequency, amplitude and phase of velocity changes, and the like) as the user, who is identified by the user's appearance characteristics. That means they always have the same (or almost the same) translation velocity, translation orientation, rotation angles of roll, pitch and yaw, velocity waveforms, and frequency, amplitude and phase of velocity changes.
(2) It is not possible that different users always keep the same motion status. That means that, over time, different users have different translation velocities, translation orientations, rotation angles of roll, pitch and yaw, velocity waveforms, and frequencies, amplitudes and phases of velocity changes.
Accordingly, if the first waveform and the second waveform match each other well, it can be determined that there is a correlation between the user and the user equipment. Otherwise, for example if the first waveform and the second waveform do not match each other, it can be determined that there is no correlation between the user and the user equipment.
FIG. 3 illustrates an exemplary scenario for correlating a user with a user equipment according to an embodiment of the present disclosure. Cameras 304 can capture a user 302's movement over frame-by-frame images, and calculate linear velocities (e.g. represented as Vx^U, Vy^U, Vz^U) and angular velocities (e.g. represented as ωx^U, ωy^U, ωz^U) of the user 302.
A data processing device 301 can obtain the measurements of the linear velocities and angular velocities of the user 302 from the cameras 304, as shown at 310. The original measurements of the linear velocities and angular velocities can be filtered, as shown at 330, for example to eliminate outliers. In an embodiment, the linear velocities and angular velocities of the user 302 can be transformed to a common coordinate system, as shown at 350. For example, they can be transformed from a body coordinate system to an earth coordinate system, based on position information of the cameras.
UE 303 can be requested to report its IMU measurement results to the data processing device 301, as shown at 320. According to the IMU reports, the UE 303's linear velocities (Vx^UE, Vy^UE, Vz^UE) and angular velocities (ωx^UE, ωy^UE, ωz^UE) can be obtained, e.g. via motion calculation. The original measurements of the UE 303's linear velocities and angular velocities can be filtered, as shown at 340, for example to eliminate outliers. In an embodiment, after the filtering, the UE 303's linear velocities and angular velocities can be transformed to the common coordinate system, as shown at 360. For example, they can be transformed from the UE coordinate system to the earth coordinate system. In an example, the camera direction and the IMU magnetometer can be utilized for the coordinate system transformations of the user's and the UE's motion statuses respectively, so that both are transformed to the same earth coordinate system.
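A minimal sketch of the coordinate transformation step is shown below. It assumes, for illustration only, that the body frame differs from the earth frame by a single yaw rotation about z (e.g. obtained from the camera direction or the IMU magnetometer); a full implementation would use the complete quaternion attitude:

```python
import numpy as np

def to_earth_frame(vec, yaw):
    """Rotate a body-frame vector into the earth frame, assuming the two
    frames differ only by a yaw angle about the z axis."""
    c, s = np.cos(yaw), np.sin(yaw)
    # Standard 2D rotation about z, embedded in a 3x3 matrix.
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return R @ np.asarray(vec, dtype=float)

# A UE moving forward along its own x axis while facing 90 degrees from
# the earth x axis moves along the earth-frame y axis.
v_earth = to_earth_frame([1.0, 0.0, 0.0], np.pi / 2)
```

Applying the same transformation to the user's camera-derived velocities and the UE's IMU-derived velocities places both in the common earth frame before comparison.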
The above measurements of the motion status of the user 302 and the motion status of the UE 303 are performed continuously and are synchronized. When the UE's motion status and the user's motion status have been transformed to the same coordinate system, they can be compared in a correlation match process, as shown at 370.
One correlation match scheme is to compare waveforms of linear velocities and angular velocities. This scheme is suitable for a situation in which the user's motion status can be continuously monitored and is never hidden behind an obstacle. Another match scheme is to compare instantaneous linear velocities and angular velocities. In an embodiment, multiple matches need to be performed, for example when multiple UEs simultaneously have the same motion status as a target user, especially when the target UE is static. In an example, six thresholds for Vx, Vy, Vz, ωx, ωy, ωz can be used to determine whether a user and a UE have the same motion status. For example, only when all velocities meet their threshold requirements and just one UE remains is the correlation established.
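The six-threshold rule can be sketched as a simple component-wise comparison. The function name and threshold values below are illustrative assumptions, not values specified by this disclosure:

```python
def same_motion_status(user_v, ue_v, thresholds):
    """Compare instantaneous motion vectors (Vx, Vy, Vz, wx, wy, wz) of a
    user and a UE. Returns True only if every component difference stays
    within its corresponding threshold."""
    return all(abs(u - e) <= t for u, e, t in zip(user_v, ue_v, thresholds))

# Example thresholds for (Vx, Vy, Vz, wx, wy, wz) — illustrative values.
thresholds = (0.1, 0.1, 0.1, 0.05, 0.05, 0.05)
user = (1.0, 0.2, 0.0, 0.0, 0.0, 0.1)
ue_a = (0.95, 0.22, 0.0, 0.0, 0.0, 0.12)  # close on all six components
ue_b = (0.5, 0.2, 0.0, 0.0, 0.0, 0.1)     # Vx differs by more than 0.1
```

Here `ue_a` passes all six checks while `ue_b` fails on Vx; only a UE passing all checks stays in the candidate set.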
FIG. 4 illustrates an exemplary correlation match process for establishing correlation by using waveform matching. In an example, the correlation match process can be triggered by a user's activities, e.g. a woman (denoted as user 402 in FIG. 4) stands before a showcase and has browsed it for more than 3 seconds. As shown in FIG. 4, an image flow can be captured by one or more cameras for the user 402. From the image flow, a visual recognition system can extract the user 402's characteristics and identify the user 402 with a vector of user appearance characteristics. Then, a visual motion monitor can track the target user 402 to measure the target user's velocities. In an embodiment, the visual recognition system and the visual motion monitor can be arranged in a video network, such as the video network 110 shown in FIG. 1. In another embodiment, the visual recognition system and the visual motion monitor can be arranged in a device separate from the video network, for example integrated into the data processing devices 101, 301 shown in FIGs. 1 and 3.
At the same time, a correlator for correlation matching, e.g. in the data processing devices 101, 301 shown in FIGs. 1 and 3, can request at least one UE near the user 402 to report its IMU information. For example, the at least one UE can comprise UEs 403-1, 403-2, 403-3, 403-4, and 403-5 (commonly referred to as 403), which are being served by a base station or a WiFi access point near the user 402. These adjacent UEs constitute a set of potential correlation UEs. Once the correlator obtains data of the motion statuses of both the user 402 and the potential correlation UEs 403, it can make a correlation determination.
For waveform matching, linear velocities and angular velocities of both users and UEs need to be measured continuously for a period of time. For example, each user needs to be visible for continuous motion monitoring during the period of time, and the IMU of each UE needs to report a series of measurement results.
FIG. 5 illustrates exemplary waveforms of angular velocity and acceleration collected from a series of test IMU measurements of a mobile phone carried in a person's pocket. The sampling frequency is 128 Hz. FIG. 5 gives the angular velocity and acceleration waveforms measured when the person walked several steps with the mobile phone.
The waveforms indicate the IMU/mobile phone's motion characteristics when the person walks. For example, the angular velocities indicate the phone's roll, pitch and yaw. The waveforms are associated with motion characteristics of the person's body, and capture even small pose changes during walking. Different persons have different waveforms with special characteristics in walking. Through these differentiations, persons can be distinguished from one another by comparing their moving waveforms of linear velocities and angular velocities. If only one UE has the same velocity waveforms as a user's waveforms, the correlation will be established.
Note that, in the real world, those velocities and accelerations may change drastically during a walking period. Waveform comparison should therefore be carefully designed. In an embodiment, an integral of the velocity difference waveform can be used in the matching procedure. Returning to FIG. 4, which shows the correlation determination process of comparing waveforms of the linear velocities along axis x (i.e. Vx) and axis y (i.e. Vy), and the angular velocity about axis z (i.e. ωz), of the user 402 and the UEs 403. Here, only horizontal motion is considered for simplicity of description. It is also assumed that those velocities Vx, Vy and ωz have been transformed to the earth coordinate system, e.g. via quaternion calculation.
A sliding match window can be adopted to make the correlation determination. For example, the sliding match window can be 2 seconds. At first, difference waveforms of each UE with respect to the user 402 can be derived, e.g. by subtracting the respective UE's velocities from the user's velocities. For example, ΔVx = Vx^U − Vx^UE, ΔVy = Vy^U − Vy^UE, and Δωz = ωz^U − ωz^UE. Then, difference waveforms 411-415 of ΔVx, difference waveforms 421-425 of ΔVy, and difference waveforms 431-435 of Δωz can be derived for each UE, as shown in FIG. 4.
If one difference waveform of a UE fluctuates greatly, e.g. if an integral of the difference waveform ΔVx over a sliding match window, written as ∫|ΔVx| dt, is greater than a fluctuation threshold, then the UE can be excluded from the set of potential correlation UEs. For example, in the example of FIG. 4, in the current sliding match window, it is detected that the ΔVx waveform 413 and the ΔVy waveform 423 of UE3 fluctuate greatly, so UE3 is excluded from the set of potential correlation UEs. Meanwhile, the Δωz waveform 435 of UE5 also fluctuates greatly, so UE5 is also excluded. Thus, the set of potential correlation UEs, denoted as {UE1, UE2, UE4}, can be determined.
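The exclusion test above can be sketched as a windowed integral of the absolute difference waveform. The sampling rate, synthetic waveforms and threshold value below are assumptions for illustration only:

```python
import numpy as np

def excluded(user_wave, ue_wave, dt, fluct_threshold):
    """Integrate |user - UE| over the sliding match window (trapezoid
    rule) and compare against a fluctuation threshold. True means the UE
    should be dropped from the set of potential correlation UEs."""
    diff = np.abs(np.asarray(user_wave) - np.asarray(ue_wave))
    integral = np.sum(0.5 * (diff[1:] + diff[:-1])) * dt  # approximates ∫|ΔV| dt
    return bool(integral > fluct_threshold)

# A 2 s match window sampled at 128 Hz.
dt = 1.0 / 128.0
t = np.arange(0.0, 2.0, dt)
user_vx = np.sin(2 * np.pi * 1.0 * t)          # the user's Vx waveform
ue2_vx = user_vx + 0.01 * np.sin(50.0 * t)     # tracks the user closely
ue3_vx = np.sin(2 * np.pi * 1.8 * t)           # a different walking rhythm
drop_ue2 = excluded(user_vx, ue2_vx, dt, fluct_threshold=0.1)
drop_ue3 = excluded(user_vx, ue3_vx, dt, fluct_threshold=0.1)
```

UE2's small residual integrates well below the threshold, while UE3's mismatched rhythm accumulates a large integral and is excluded, mirroring the FIG. 4 procedure.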
In a next sliding match window of 2 seconds, the correlator can continue to perform correlation matching between the waveforms of the user 402 and the waveforms of UE1 403-1, UE2 403-2 and UE4 403-4. In this sliding match window, the correlator may find that the ΔVx waveform 414 and the ΔVy waveform 424 of UE4 fluctuate greatly, as shown in FIG. 4. Then, UE4 can be excluded from the set of potential correlation UEs.
In a third match window of 2 seconds, the correlator can exclude UE1 from the set of potential correlation UEs, due to great fluctuations in its ΔVx waveform 411 and ΔVy waveform 421. Finally, only one UE (i.e. UE2) is left in the set of potential correlation UEs. Consequently, a one-to-one correlation between a user and a UE can be established.
Please note that the waveforms of a user and UEs are not limited to waveforms of linear velocity and angular velocity, but can be any kind of waveform corresponding to the motion statuses of a user and UEs. For example, the above waveform may comprise a waveform of acceleration, or a waveform of any variable derived from at least one of the linear velocity and the angular velocity.
In an embodiment, the above waveforms, e.g. waveforms of linear velocity, angular velocity, and acceleration, may be periodic curves corresponding to each human walking step. So these waveforms can be transformed from the time domain to the frequency domain, e.g. via a Fast Fourier Transform (FFT). In the frequency domain, characteristics of a person's walking pose in frequency, amplitude and phase can be obtained from the waveform, for example based on waveform characteristics comprising amplitude, frequency, period, and phase, etc. Based on this information of waveform characteristics, different persons or different UEs can be distinguished. For example, a person's walking frequency can be determined from a waveform of the user, and then the frequencies of the waveforms of the potential correlation UEs can be compared against the user's frequency, to perform an operation of correlation matching. In another example, a person's roll phase may always have a difference with respect to its pitch phase. This phase difference can be used to indicate a specific person. Accordingly, the correlation match can be performed based on this phase difference. It can be appreciated that, through analyzing and comparing the waveforms of a user and UEs, many waveform characteristics reflecting characteristics of motion status can be utilized for accurately determining a correlation between a user and a UE.
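As a hedged sketch of the FFT-based extraction of a walking frequency, the following picks the strongest non-DC component of a velocity waveform. The synthetic 2 Hz step rhythm and the 128 Hz sampling rate (matching FIG. 5) are illustrative assumptions:

```python
import numpy as np

def dominant_frequency(waveform, fs):
    """Return the strongest non-DC frequency component of a waveform,
    e.g. a person's step frequency, via a real FFT."""
    spectrum = np.abs(np.fft.rfft(waveform))
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / fs)
    spectrum[0] = 0.0                # ignore the DC (mean) component
    return float(freqs[np.argmax(spectrum)])

# A synthetic 2 Hz walking rhythm plus a small 7 Hz harmonic, 128 Hz, 4 s.
fs = 128.0
t = np.arange(0.0, 4.0, 1.0 / fs)
wz = 0.8 * np.sin(2 * np.pi * 2.0 * t) + 0.05 * np.sin(2 * np.pi * 7.0 * t)
f_walk = dominant_frequency(wz, fs)
```

The same function applied to each candidate UE's waveform yields frequencies that can be compared against the user's walking frequency, as described above.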
In some embodiments, there is another correlation match scheme, in which instantaneous motion data are utilized for the correlation match. FIG. 6 illustrates an exemplary procedure for establishing correlation by using multiple instantaneous matches. Instantaneous correlation matching synchronously measures velocities of users and UEs. It often needs several matches at different time instances, as shown in FIG. 6. For simplicity of description, here only horizontal motion is considered, and the user's and UE's linear velocities along axis x and axis y, and angular velocity about axis z (yaw), are compared. It is also assumed that those velocities Vx, Vy and ωz have been transformed to the earth coordinate system via quaternion calculation.
A visual motion monitor tracks a target user 602 to measure velocities of the user 602 through the cameras 604. At the same time, several adjacent UEs 603-1, 603-2, 603-3, 603-4, 603-5 (which constitute a set of potential correlation UEs 603) report their IMU information of velocities. Once a correlator obtains the information of the motion statuses of both the user 602 and the UEs 603, it can make a correlation determination by comparing instantaneous velocities of the user 602 with instantaneous velocities of each potential UE. In an embodiment, thresholds for Vx, Vy and ωz can be set as VTx, VTy and ωTz, respectively, for example. Then, a difference between an instantaneous velocity of the user 602 and that of a potential UE can be calculated and compared with the corresponding threshold. If the difference goes beyond the threshold, the potential UE can be excluded from the set of potential correlation UEs.
For example, as shown in FIG. 6, at time 1, it is assumed that the target user 602 is static, with Vx^U=0, Vy^U=0, and ωz^U=0. An adjacent UE3 is making a translation motion; its Vx and Vy differ from the user's Vx and Vy by more than the thresholds VTx and VTy, respectively. Then, it can be determined that UE3 603-3 is not correlated with the target user 602. At time 1, UE5 does not make a translation motion and stays at the same position, but it is making a rotation motion with an angular velocity ωz. UE5's ωz differs from the user's angular velocity by more than the threshold ωTz. Then, it can be determined that UE5 603-5 is also not correlated with the target user 602. The other three UEs, i.e. UE1, UE2 and UE4 shown in FIG. 6, are static, like the target user 602, and can be determined as potential correlation UEs after the correlation process at time 1. As a one-to-one correlation between a user and a UE is not established at this time, the correlation process proceeds to make a further determination.
Next, at time 2, e.g. 5 seconds later, the three UEs remaining in the set of potential correlation UEs can be requested to report their respective IMU information. It is assumed that all three UEs are in movement. UE1 and UE2 have the same motion status as the user. UE4 makes a translation motion and has the same translation speed as the user 602, but UE4 has a different orientation; its Vx and Vy differ from the user 602's by more than the thresholds VTx and VTy, respectively. So UE4 603-4 can be excluded from the set of potential correlation UEs. Now, the two UEs remaining in the set of potential correlation UEs must still be distinguished.
Next, at time 3, e.g. another 5 seconds later, UE1's IMU report shows that it makes a translation motion and has the same angular velocity as the user 602, but its translation linear velocity differs from the user's by more than the thresholds VTx and VTy. So UE1 603-1 can be excluded from the set of potential correlation UEs. Now, only UE2 603-2 remains in the set of potential correlation UEs, and consequently a one-to-one correlation between a user and a UE is established.
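The multi-instance elimination of FIG. 6 can be sketched as a loop that shrinks the candidate set at each time instance until one UE remains. The (Vx, Vy, ωz) samples, UE names and thresholds below are illustrative assumptions modelled loosely on the scenario above:

```python
def correlate_over_instants(user_samples, ue_samples, thresholds):
    """Narrow the set of potential correlation UEs over successive time
    instances: at each instance, drop every UE whose instantaneous
    velocities differ from the user's beyond any threshold; stop once at
    most one UE remains."""
    candidates = set(ue_samples)
    for i, user_v in enumerate(user_samples):
        for name in list(candidates):
            ue_v = ue_samples[name][i]
            if any(abs(u - e) > t for u, e, t in zip(user_v, ue_v, thresholds)):
                candidates.discard(name)
        if len(candidates) <= 1:
            break
    return candidates

# Illustrative (Vx, Vy, ωz) samples at three instances; thresholds VTx, VTy, ωTz.
thresholds = (0.1, 0.1, 0.05)
user = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 0.0, 0.2)]
ues = {
    "UE1": [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.5, 0.0, 0.2)],  # too slow at time 3
    "UE2": [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 0.0, 0.2)],  # matches throughout
    "UE3": [(1.0, 1.0, 0.0), (1.0, 0.0, 0.0), (1.0, 0.0, 0.2)],  # moving at time 1
}
matched = correlate_over_instants(user, ues, thresholds)
```

UE3 is dropped at the first instance and UE1 at the third, leaving UE2 as the one-to-one correlation.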
The above correlation procedures are user-centralized correlation, i.e. finding a correlation UE for a target user. Embodiments of this disclosure can also support UE-centralized correlation, i.e. finding a correlation user for a target UE. For example, when a user orders a meal on his UE by himself in a restaurant, a correlator can find the correlation user from one or more potential correlation users through a similar correlation process. Then, the correlation user's appearance characteristics can be provided to waiters, making it easy for waiters to find the owner of the order. In this case, only one UE reports IMU measurement results, while a visual motion monitor tracks multiple adjacent users and measures their motion statuses. The UE-centralized correlation process is similar to the user-centralized correlation processes shown in FIG. 4 and FIG. 6.
The correlation mechanism disclosed in this disclosure can be utilized to support commercial applications. For example, the user-centralized correlation can be used in a scenario in which it is triggered according to user activities, e.g. standing in front of a showcase for more than 3 seconds, to extract user appearance characteristics, establish the user's correlation UE, and push some commercial information to the UE. The UE-centralized correlation can be used in a scenario in which it is triggered according to a setting or operation on a UE (e.g. the UE is operated to order a meal in a restaurant), to find the appearance characteristics of the UE's owner and then inform waiters of the owner's appearance characteristics. The user-centralized correlation finds a UE from a group of UEs for a target user, while the UE-centralized correlation finds a user from a group of users for a target UE. Both scenarios have similar correlation processes. Only the user-centralized correlation is described in detail here. However, it can be appreciated that the architecture, operation flow and mechanism can easily be extended to scenarios of the UE-centralized correlation.
Reference is now made to FIG. 7, which shows various function modules of a system 700 for correlating a user with a user equipment according to an embodiment of the present disclosure. It is contemplated that the functions of these modules may be combined in one or more modules or performed by other modules of equivalent functionality. The system 700 may be implemented in a data processing device, such as the data processing device 101, 301 shown in FIG. 1 and FIG. 3, which can be coupled to a video network and a mobile network. The system 700 may comprise a communication interface (not shown) to communicate with the video network, so as to obtain data related to the motion statuses of one or more personal users, such as image streams captured by one or more cameras. The system 700 may further comprise a communication interface (not shown) to communicate with the mobile network, so as to obtain data related to the motion statuses of one or more UEs, such as IMU reports.
The system 700 may comprise a user motion determination module 701, a UE motion determination module 702, a waveform match module 703, an instantaneous match module 704, and a correlation database 705. The user motion determination module 701 is provided to determine motion data measured continuously with regard to a motion of a user within a period of time. For example, the user motion determination module 701 can analyze an image stream to calculate a user's linear velocities and angular velocities. In an embodiment, the system 700 can further comprise a filter for filtering the calculated velocities. In an embodiment, the system 700 can further comprise a coordinate transformer for obtaining motion data in the earth coordinate system.
The UE motion determination module 702 is provided to determine motion data measured continuously with regard to a motion of a UE within the period of time. For example, the UE motion determination module 702 can receive IMU reports from a UE to calculate linear velocities and angular velocities of the UE. In an embodiment, the calculated velocities can also be filtered, and transformed to obtain motion data in the earth coordinate system.
The waveform match module 703 is provided to perform a correlation match between a waveform derived from the user's motion data and a waveform derived from the UE's motion data. In an embodiment, it can use a sliding window to perform the waveform correlation match. An exemplary model 800 for waveform correlation match is illustrated in FIG. 8. The model 800 compares waveforms between a user and a UE. The UE's waveform is subtracted from the user's waveform to obtain a difference waveform. Then, an integration calculation can be performed on each difference waveform for a sliding match window, as shown in FIG. 8. The waveform integration result of each difference waveform can be compared with a predefined threshold. The waveform match module 703 can determine whether there is a correlation between the user and the UE based on the comparison, for example according to a predefined correlation condition. In an embodiment, the correlation condition can be set such that, if all waveform integration results are less than their related thresholds, i.e. the "&" operation at 810 is true, then the UE can be determined to be a potential correlation UE of the user, or the user can be determined to be a potential correlation user of the UE. Continuous waveform matching can be performed until only one UE or one user meets the correlation condition. More particularly, the user motion determination module 701 and the UE motion determination module 702 are configured to further obtain the user's waveform and the UE's waveform in a next sliding match window, respectively. Accordingly, a difference waveform for the next sliding match window can be obtained, and an integration calculation can be performed on each difference waveform for the next sliding match window.
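The per-window check performed by the waveform match module 703 might be sketched as follows. The dict-based interface, the component names, and the use of a summed absolute difference as the discrete "integration calculation" are illustrative assumptions, not details fixed by the disclosure:

```python
import numpy as np

def waveform_window_match(user_window, ue_window, thresholds):
    """One sliding-match-window check (illustrative sketch).

    user_window / ue_window: dicts mapping a velocity component name
    (e.g. "Vx") to a 1-D array of samples inside the window.
    thresholds: per-component limits for the integration result.
    Returns True only if every integrated difference waveform stays
    below its threshold (the "&" operation at 810).
    """
    for comp, limit in thresholds.items():
        diff = user_window[comp] - ue_window[comp]   # difference waveform
        integral = np.abs(diff).sum()                # discrete integration over the window
        if integral >= limit:
            return False                             # one failing component rejects the pair
    return True
```

In the user-centralized case, this check would be repeated per candidate UE and per sliding match window until a single UE satisfies the correlation condition.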
The instantaneous match module 704 is provided to perform a correlation match between the user's instantaneous motion data and the UE's instantaneous motion data. An exemplary model 900 for instantaneous correlation match is illustrated in FIG. 9. The model 900 compares instantaneous motion statuses between a user and a UE. The UE's instantaneous velocities are subtracted from the user's instantaneous velocities to obtain instantaneous velocity differences. The difference of each velocity can be compared with a predefined related threshold. The instantaneous match module 704 can determine whether there is a correlation between the user and the UE based on the comparison, for example according to a predefined correlation condition. In an embodiment, the correlation condition can be set such that, if all velocities are matched, i.e. all of ΔVx, ΔVy, ΔVz, Δωx, Δωy, and Δωz are less than their related thresholds, i.e. the "&" operation at 910 is true, then the UE can be determined to be a potential correlation UE of the user, or the user can be determined to be a potential correlation user of the UE. The match can be performed for several time instances to exclude those UEs without a similar motion status, until only one UE or one user which meets the correlation condition is left.
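The per-instant check of the instantaneous match module 704 could look like the following sketch; the strict "all thresholds" condition mirrors the "&" operation at 910, while the dict-based interface and component names are assumptions made for illustration:

```python
def instantaneous_match(user_inst, ue_inst, thresholds):
    """True only if every |velocity difference| is below its related
    threshold, i.e. the "&" operation at 910 is true (illustrative)."""
    return all(abs(user_inst[c] - ue_inst[c]) < thresholds[c] for c in thresholds)

def match_over_instances(user_series, ue_series, thresholds):
    """A UE remains a potential correlation UE only if it matches the
    user at every sampled time instance."""
    return all(instantaneous_match(u, e, thresholds)
               for u, e in zip(user_series, ue_series))
```

Repeating the check over several time instances, as in `match_over_instances`, is what excludes UEs whose motion status only coincidentally matched at a single instant.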
In some embodiments, the model 900 can be configured to make a correlation match through a comparison of waveform characteristics in the frequency domain, such as amplitude, frequency and/or phase. More particularly, from each waveform of linear velocity and angular velocity, the amplitude, frequency and phase parameters of the waveform can be extracted. Thus, 18 frequency-domain parameters can be derived from the waveforms of Vx, Vy, Vz, ωx, ωy, and ωz for the correlation match. For example, the amplitude, frequency and phase of a user's Vx waveform in a sliding window can be derived in the user motion determination module 701. The amplitude, frequency and phase of a UE's Vx waveform in the same sliding window can be derived in the UE motion determination module 702. Then, the amplitude, frequency and phase of the user's Vx waveform and the amplitude, frequency and phase of the UE's Vx waveform can be input into the model 900 and compared with each other, respectively. In an example, the differences between the amplitude, frequency, and phase of the user's Vx waveform and the amplitude, frequency, and phase of the UE's Vx waveform can be calculated, respectively. Then the differences can be compared with corresponding thresholds, to determine whether the user's Vx waveform matches the UE's Vx waveform. For the waveforms of Vy, Vz, ωx, ωy, and ωz, similar operations can be performed for the correlation match. Continuous waveform matching based on waveform characteristics in the frequency domain can be performed until only one UE or one user meets the correlation condition. More particularly, the user motion determination module 701 and the UE motion determination module 702 are configured to further obtain the user's waveform and the UE's waveform in a next sliding match window, respectively. Accordingly, the waveform characteristics in the frequency domain of these waveforms in the next sliding match window, such as amplitude, frequency and phase, can be obtained and compared accordingly for the next sliding match window.
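One way to extract and compare such frequency-domain characteristics is sketched below: the dominant spectral line of each waveform is taken via an FFT and its amplitude, frequency and phase are compared against per-characteristic tolerances. The single-tone assumption, the FFT-based extraction, and the tolerance values are illustrative choices, not prescribed by the disclosure:

```python
import numpy as np

def dominant_component(samples, dt):
    """Amplitude, frequency and phase of the strongest spectral line of
    one velocity waveform in a sliding window (assumes a roughly
    single-tone waveform; illustrative only)."""
    n = len(samples)
    spectrum = np.fft.rfft(samples - np.mean(samples))  # drop DC before peak search
    freqs = np.fft.rfftfreq(n, d=dt)
    k = int(np.argmax(np.abs(spectrum)))
    amplitude = 2.0 * np.abs(spectrum[k]) / n
    return amplitude, freqs[k], float(np.angle(spectrum[k]))

def characteristics_match(user_wave, ue_wave, dt, tol=(0.1, 0.5, 0.3)):
    """Compare the (amplitude, frequency, phase) triples of two
    waveforms against per-characteristic tolerances (assumed values)."""
    u = dominant_component(user_wave, dt)
    e = dominant_component(ue_wave, dt)
    return all(abs(a - b) < t for a, b, t in zip(u, e, tol))
```

Running this comparison for each of Vx, Vy, Vz, ωx, ωy, and ωz yields the 18 frequency-domain parameter comparisons described above.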
The results of the correlation determination can be stored in a database 705. The database 705 may be any form of storage or storage system. The database 705 may be a component outside the system 700 and coupled to the system 700. In other embodiments, the database 705 may be a part of the system 700.
An application module 709 can retrieve the correlation between a user and a UE from the database, to implement application services. The correlation between the user and the UE can support many new application services. For example, in an application service, commercial messages can be pushed to a UE according to the user activities of its correlated user. In another application service, after a user orders a meal with his UE, the appearance characteristics of the user can be provided to a restaurant waiter according to the correlation between the UE and the user. The application module 709 may be a component outside the system 700 and coupled to the system 700. In other embodiments, the application module 709 may be a part of the system 700.
In an embodiment, the system 700 may further comprise a target activities predefinition module 706 and a user activity identification module 707. The target activities predefinition module 706 is provided to define some expected activities for user recognition. The target activities are the triggering conditions of a correlation match process. The user activity identification module 707 is provided to capture user activities, for example based on an image stream from a video network. Then the user activity identification module 707 matches the user activities with the predefined activities, so as to trigger a correlation match. For example, as mentioned in the above embodiments, a target activity may be a woman standing before a showcase who has browsed the showcase for more than 3 seconds. When the user activity identification module 707 identifies that a user's activity matches this target activity, it can trigger a correlation match process taking that user as the target user. In an example, identification of the user can be triggered accordingly.
In an embodiment, the system 700 may further comprise a user characteristics extraction module 708, which is provided to extract appearance characteristics of a user, for example from images or videos captured by cameras.
In an embodiment, the target activities predefinition module 706, the user activity identification module 707, and the user characteristics extraction module 708 may be components outside the system 700. In an example, these modules may be components in a video network or included as part of any network component. For example, the target activities predefinition module 706, the user activity identification module 707, and the user characteristics extraction module 708 may be part of a visual motion monitor, a visual odometry, or the like.
Reference is now made to FIG. 10, which shows a flowchart of a method 1000 for correlating a user with a user equipment according to an embodiment of the present disclosure. The method 1000 can be implemented by the system 700 of FIG. 7, or a data processing device such as the data processing device 101, 301. As shown in FIG. 10, the method 1000 may comprise monitoring user activities based on visual monitoring, at block 1010; and triggering a correlation match process based on the user activities, at block 1020. For example, if it is determined that a monitored activity of a user matches a predefined target activity, a correlation match process can be triggered to determine a correlation UE for the user. In an embodiment, the method 1000 may further comprise extracting appearance characteristics of the user, to identify a target user, at block 1030.
Then, a correlation match process can be started to determine a correlation UE for the target user. At block 1040, the method 1000 can comprise measuring the current motion status of the target user in the visual domain, such as linear velocities (Vx, Vy, Vz) and angular velocities (ωx, ωy, ωz). At block 1050, the method 1000 can comprise receiving IMU measurement results of each UE of a set of potential correlation UEs. There may be more than one potential correlation UE in the set. The motion status of each of the potential correlation UEs, such as linear velocities (Vx, Vy, Vz) and angular velocities (ωx, ωy, ωz), can be calculated or obtained from the IMU measurement reports. Although block 1050 is shown as following block 1040, it can be appreciated that their order can be changed. For example, block 1050 can be performed in parallel with block 1040. In an embodiment, the method may further comprise transforming the coordinate systems of the target user and the adjacent UE to a common coordinate system, at block 1060. For example, a quaternion calculation can be performed on the motion status of the target user and the adjacent UE, to complete the transformation of the coordinate system.
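The quaternion calculation at block 1060 can be sketched as a standard quaternion rotation of a velocity vector into the common (e.g. earth) frame; the orientation quaternion itself would come from the device's IMU sensor fusion, which is outside this illustrative sketch:

```python
import numpy as np

def rotate_to_common_frame(v, q):
    """Rotate a velocity vector v by the unit quaternion q = (w, x, y, z),
    i.e. compute v' = q v q* via the equivalent rotation matrix
    (standard quaternion rotation; illustrative sketch)."""
    w, x, y, z = q
    R = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])
    return R @ np.asarray(v, dtype=float)
```

For example, a 90-degree rotation about the z-axis (q with w = z = √2/2) maps a velocity along the x-axis onto the y-axis, so user and UE velocities expressed in different frames become directly comparable.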
Then, the method 1000 can proceed to perform a correlation match over all velocities with the correlation match mechanism described above, such as the waveform correlation match described with reference to FIGS. 4 and 8. Next, at block 1080, the method can comprise determining whether there is only one correlation UE. If so, it can be determined that a one-to-one correlation between the target user and the correlation UE is established, at block 1090. Otherwise, the method can proceed to block 1100, to exclude, from the set of potential correlation UEs, a UE which is determined not to be correlated with the target user. Then, the method returns to make a further correlation match in a next sliding match window. In practice, multiple exclusion operations may be needed to establish a one-to-one correlation between a user and a UE. Meanwhile, the exclusion operation can reduce the number of tracked UEs.
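This window-by-window exclusion loop might be sketched as below; the per-window data layout and the generic `match_fn` callback (e.g. a waveform match such as the one at block 1070) are assumptions made for illustration:

```python
def narrow_down(user_windows, candidate_windows, match_fn):
    """Exclude non-matching UEs window by window until at most one
    potential correlation UE remains (illustrative sketch).

    user_windows: list of the target user's per-window motion data.
    candidate_windows: dict ue_id -> list of that UE's per-window data.
    match_fn(user_window, ue_window) -> bool, e.g. a waveform match.
    """
    remaining = set(candidate_windows)
    for i, user_window in enumerate(user_windows):
        remaining = {ue for ue in remaining
                     if match_fn(user_window, candidate_windows[ue][i])}
        if len(remaining) <= 1:
            break  # one-to-one correlation established (or no candidate left)
    return remaining
```

Each pass both advances toward the one-to-one correlation and shrinks the set of tracked UEs, which matches the observation that the exclusion operation reduces tracking load.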
Now reference is made to FIG. 11, which illustrates a simplified block diagram of an apparatus 1100 that may be embodied in/as a data processing device (e.g., the data processing device 101, 301 shown in FIG. 1 and FIG. 3). The apparatus 1100 may comprise at least one processor 1101, such as a data processor (DP), and at least one memory (MEM) 1102 coupled to the at least one processor 1101. The apparatus 1100 may further comprise one or more transmitters TX, one or more receivers RX 1103, or one or more transceivers coupled to the one or more processors 1101 to communicate with a wireless communication network (e.g. the mobile network 120 shown in FIG. 1), for example by using wireless local communication network technologies, such as WLAN or UWB, and wireless telecommunication technologies, such as 2/3/4/5/6G (Generation), or any combinations thereof. Further, the apparatus 1100 may have one or more wireline communication means that connect the apparatus to a computer cloud network or system, such as the video network 110. The MEM 1102 stores a program (PROG) 1104. The PROG 1104 may include instructions that, when executed on the associated processor 1101, enable the apparatus 1100 to operate in accordance with the embodiments of the present disclosure, for example to perform one of the methods 200 and 1000. A combination of the at least one processor 1101 and the at least one MEM 1102 may form processing circuitry or means 1105 adapted to implement various embodiments of the present disclosure.
Various embodiments of the present disclosure may be implemented by a computer program executable by one or more of the processors 1101, by software, firmware, hardware, or a combination thereof.
The MEMs 1102 may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, as non-limiting examples.
The processors 1101 may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multicore processor architecture, as non-limiting examples.
In general, the various exemplary embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. For example, some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto. While various aspects of the exemplary embodiments of this invention may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
As such, it should be appreciated that at least some aspects of the exemplary embodiments of the inventions may be practiced in various components such as integrated circuit chips and modules. It should thus be appreciated that the exemplary embodiments of this invention may be realized in an apparatus that is embodied as an integrated circuit, where the integrated circuit may comprise circuitry (as well as possibly firmware) for embodying at least one or more of a data processor, a digital signal processor, baseband circuitry and radio frequency circuitry that are configurable so as to operate in accordance with the exemplary embodiments of this invention.
It should be appreciated that at least some aspects of the exemplary embodiments of the inventions may be embodied in computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other device. The computer-executable instructions may be stored on a computer readable medium, for example, a non-transitory computer readable medium, such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc. As will be appreciated by one of skill in the art, the function of the program modules may be combined or distributed as desired in various embodiments. In addition, the function may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGAs), and the like.
As used in this application, the term “circuitry” may refer to one or more or all of the following:
(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry) and
(b) combinations of hardware circuits and software, such as (as applicable):
(i) a combination of analog and/or digital hardware circuit(s) with software/firmware and
(ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and
(c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
This definition of "circuitry" applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term "circuitry" also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term "circuitry" also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.
The present invention includes any novel feature or combination of features disclosed herein either explicitly or any generalization thereof. Various modifications and adaptations to the foregoing exemplary embodiments of this invention may become apparent to those skilled in the relevant arts in view of the foregoing description, when read in conjunction with the accompanying drawings. However, any and all modifications will still fall within the scope of the non-limiting and exemplary embodiments of this invention.

Claims (45)

  1. A method for correlation between a user and a user equipment, comprising:
    obtaining first motion data measured continuously with regard to a motion of a user within a period of time;
    obtaining second motion data measured continuously with regard to a motion of a user equipment within the period of time;
    performing a correlation match between a first waveform derived from the first motion data and a second waveform derived from the second motion data; and
    determining whether there is a correlation between the user and the user equipment based on the correlation match between the first waveform and the second waveform.
  2. The method as claimed in claim 1, wherein performing the correlation match comprises:
    determining a difference waveform corresponding to a difference between the first motion data and the second motion data; and
    comparing a waveform characteristic of the difference waveform with a threshold to determine whether there is a correlation match between the first waveform and the second waveform.
  3. The method as claimed in claim 1 or 2, wherein performing the correlation match comprises:
    transforming the first waveform and the second waveform from time domain to frequency domain; and
    performing the correlation match in the frequency domain.
  4. The method as claimed in any one of claims 1-3, wherein the correlation match is performed based on at least one of the following waveform characteristics: amplitude, frequency, period, and phase.
  5. The method as claimed in claim 1, further comprising:
    determining a waveform characteristic in frequency domain of the first waveform; and
    determining a corresponding waveform characteristic in frequency domain of the second waveform;
    wherein performing the correlation match comprises performing a match between the waveform characteristic in frequency domain of the first waveform and the corresponding waveform characteristic in frequency domain of the second waveform.
  6. The method as claimed in claim 5, wherein performing the match between waveform characteristics comprises:
    determining a difference between the waveform characteristic in frequency domain of the first waveform and the corresponding waveform characteristic in frequency domain of the second waveform; and
    comparing the difference between waveform characteristics with a threshold.
  7. The method as claimed in any one of claims 5-6, wherein the waveform characteristic in frequency domain comprises at least one of the following: amplitude, frequency, period and phase.
  8. The method as claimed in claim 1, further comprising:
    transforming at least one of the first motion data and the second motion data to motion data in a common coordinate system.
  9. The method as claimed in claim 1, wherein the first motion data and the second motion data comprise data indicative of a motion in at least one of the following dimensions: linear velocity, angular velocity, and a dimension defined by a variable derived from at least one of the linear velocity and the angular velocity.
  10. The method as claimed in claim 1, wherein obtaining the first motion data comprises:
    analyzing an image flow captured within the period of time to determine the first motion data of the user.
  11. The method as claimed in claim 1, wherein obtaining the second motion data comprises:
    receiving a continuous measurement report of a motion sensor inside the user equipment, to determine the second motion data of the user equipment.
  12. The method as claimed in claim 1, further comprising:
    determining a set of user equipments;
    for each particular user equipment of the set, determining a correlation between the user and the particular user equipment in a match window, wherein a determination of a correlation between the user and the particular user equipment comprises:
    obtaining motion data measured continuously with respect to a motion of the particular user equipment within a period of time of the match window,
    performing a correlation match between the first waveform and a waveform derived from the motion data of the particular user equipment, and
    determining whether there is a correlation between the user and the particular user equipment based on the correlation match between the first waveform and the waveform derived from the motion data of the particular user equipment; and
    excluding from the set, a user equipment which is determined to not have a correlation with the user.
  13. The method as claimed in claim 12, further comprising:
    in case where there are more than one user equipment in the set, continuing the determination of a correlation between the user and each particular user equipment of the set and the excluding in a next match window.
  14. The method as claimed in claim 1, further comprising:
    determining a set of users;
    for each particular user of the set, determining a correlation between the particular user and the user equipment in a match window, wherein a determination of a correlation between the particular user and the user equipment comprises:
    obtaining motion data measured continuously with respect to a motion of the particular user within a period of time of the match window,
    performing a correlation match between a waveform derived from the motion data of the particular user and the second waveform, and
    determining whether there is a correlation between the particular user and the user equipment based on the correlation match between the waveform derived from the motion data of the particular user and the second waveform; and
    excluding from the set, a user which is determined to not have a correlation with the user equipment.
  15. The method as claimed in claim 14, further comprising:
    in case where there are more than one user in the set, continuing the determination of a correlation between the user equipment and each particular user of the set and the excluding in a next match window.
  16. The method as claimed in claim 1, further comprising:
    obtaining instantaneous motion data in the first motion data measured at a plurality of time instances, and instantaneous data in the second motion data measured at the same time instances;
    performing a match between the instantaneous motion data in the first motion data and corresponding instantaneous data in the second motion data; and
    determining, the correlation between the user and the user equipment based on the match between the instantaneous motion data in the first motion data and corresponding instantaneous data in the second motion data.
  17. The method as claimed in claim 16, further comprising:
    determining a difference between the instantaneous motion data in the first motion data and the instantaneous motion data in the second motion data; and
    comparing the difference between instantaneous motion data with a threshold.
  18. The method as claimed in claim 1, further comprising:
    identifying an activity of the user or the user equipment for triggering said correlation match.
  19. The method as claimed in claim 1, further comprising:
    extracting at least one appearance characteristic of the user.
  20. The method as claimed in claim 19, further comprising:
    correlating the at least one appearance characteristic of the user with the user equipment, according to the correlation between the user and the user equipment.
  21. The method as claimed in claim 1, further comprising:
    pushing a service associated with the user to the user equipment, according to correlation between the user and the user equipment.
  22. The method as claimed in claim 1, further comprising:
    providing a service associated with the user equipment to the user, according to correlation between the user and the user equipment.
  23. An apparatus, comprising:
    at least one processor; and
    at least one memory including computer program code, the memory and the computer program code configured to, with the processor, cause the apparatus at least to:
    obtain, first motion data measured continuously with regard to a motion of a user within a period of time;
    obtain, second motion data measured continuously with regard to a motion of a user equipment within the period of time; and
    perform a correlation match between a first waveform derived from the first motion data and a second waveform derived from the second motion data;
    determine whether there is a correlation between the user and the user equipment based on the correlation match between the first waveform and the second waveform.
  24. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to perform the correlation match by causing the apparatus to,
    determine a difference waveform corresponding to a difference between the first motion data and the second motion data; and
    compare a waveform characteristic of the difference waveform with a threshold to determine whether there is a correlation match between the first waveform and the second waveform.
  25. The apparatus as claimed in claim 23 or 24, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to perform the correlation match by causing the apparatus to,
    transform the first waveform and the second waveform from time domain to frequency domain; and
    perform the correlation match in the frequency domain.
  26. The apparatus as claimed in any one of claims 23-25, wherein the correlation match is performed based on at least one of the following waveform characteristics: amplitude, frequency, period, and phase.
  27. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to:
    determine a waveform characteristic in frequency domain of the first waveform; and
    determine a corresponding waveform characteristic in frequency domain of the second waveform;
    wherein the correlation match is performed by performing a match between the waveform characteristic in frequency domain of the first waveform and the corresponding waveform characteristic in frequency domain of the second waveform.
  28. The apparatus as claimed in claim 27, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to perform the match between waveform characteristics by causing the apparatus to:
    determine a difference between the waveform characteristic in frequency domain of the first waveform and the corresponding waveform characteristic in frequency domain of the second waveform; and
    compare the difference between waveform characteristics with a threshold.
  29. The apparatus as claimed in any one of claims 27-28, wherein the waveform characteristic in frequency domain comprises at least one of the following: amplitude, frequency, period and phase.
  30. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    transform at least one of the first motion data and the second motion data to motion data in a common coordinate system.
  31. The apparatus as claimed in claim 23, wherein the first motion data and the second motion data comprise data indicative of a motion in at least one of the following dimensions: linear velocity, angular velocity, and a dimension defined by a variable derived from at least one of the linear velocity and the angular velocity.
  32. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to obtain the first motion data, by causing the apparatus to:
    analyze an image flow captured within the period of time to determine the first motion data of the user.
  33. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to obtain the second motion data, by causing the apparatus to:
    receive a continuous measurement report of a motion sensor inside the user equipment, to determine the second motion data of the user equipment.
  34. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    determine a set of user equipments;
    for each particular user equipment of the set, determine a correlation between the user and the particular user equipment in a match window, wherein a determination of a correlation between the user and the particular user equipment comprises:
    obtaining motion data measured continuously with respect to a motion of the particular user equipment within a period of time of the match window,
    performing a correlation match between the first waveform and a waveform derived from the motion data of the particular user equipment, and
    determining whether there is a correlation between the user and the particular user equipment based on the correlation match between the first waveform and the waveform derived from the motion data of the particular user equipment; and
    exclude from the set, a user equipment which is determined to not have a correlation with the user.
  35. The apparatus as claimed in claim 34, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    continue the determination of a correlation between the user and each particular user equipment of the set and the excluding in a next match window, if there is more than one user equipment in the set.
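The elimination procedure of claims 34-35 (and its mirror image in claims 36-37) can be sketched as a loop over successive match windows that prunes candidates failing the per-window correlation match. This is an illustrative sketch, not the claimed apparatus; the callables, the window count, and the stopping rule are all assumptions:

```python
def identify_device(user_waveform_for, candidates, correlate, max_windows=10):
    """Narrow a set of candidate devices to the one(s) matching a tracked user.

    user_waveform_for -- callable(window) returning the user's waveform in that window
    candidates        -- dict: device id -> callable(window) returning its waveform
    correlate         -- callable(w1, w2) -> bool, the per-window correlation match
    """
    remaining = dict(candidates)
    for window in range(max_windows):
        user_waveform = user_waveform_for(window)
        # Exclude every device whose waveform fails the match in this window.
        remaining = {dev: wf for dev, wf in remaining.items()
                     if correlate(user_waveform, wf(window))}
        if len(remaining) <= 1:
            break  # unique match found (or no candidate survives)
    return set(remaining)
```

Continuing into the next match window only while more than one candidate remains mirrors claim 35.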
  36. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    determine a set of users;
    for each particular user of the set, determine a correlation between the particular user and the user equipment in a match window, wherein a determination of a correlation between the particular user and the user equipment comprises:
    obtaining motion data measured continuously with respect to a motion of the particular user within a period of time of the match window,
    performing a correlation match between a waveform derived from the motion data of the particular user and the second waveform, and
    determining whether there is a correlation between the particular user and the user equipment based on the correlation match between the waveform derived from the motion data of the particular user and the second waveform; and
    exclude from the set, a user which is determined to not have a correlation with the user equipment.
  37. The apparatus as claimed in claim 36, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    continue the determination of a correlation between the user equipment and each particular user of the set and the excluding in a next match window, if there is more than one user in the set.
  38. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    obtain instantaneous motion data in the first motion data measured at a plurality of time instances, and instantaneous motion data in the second motion data measured at the same time instances;
    perform a match between the instantaneous motion data in the first motion data and corresponding instantaneous motion data in the second motion data; and
    determine the correlation between the user and the user equipment based on the match between the instantaneous motion data in the first motion data and corresponding instantaneous motion data in the second motion data.
  39. The apparatus as claimed in claim 38, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    determine a difference between the instantaneous motion data in the first motion data and the instantaneous motion data in the second motion data; and
    compare the difference between instantaneous motion data with a threshold.
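The time-domain alternative of claims 38-39 — matching co-timed samples directly and thresholding their difference — might look like the following. As before, this is an illustrative sketch rather than the claimed apparatus, and the all-samples-must-match rule is an assumption:

```python
def instantaneous_match(samples_user, samples_device, threshold):
    """Claim 38/39 style match: every pair of samples taken at the same time
    instance must differ by less than the threshold."""
    return all(abs(u - d) < threshold
               for u, d in zip(samples_user, samples_device))
```

A more tolerant variant could instead require that, say, 90% of the sample pairs fall under the threshold.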
  40. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    identify an activity of the user or the user equipment for triggering said correlation match.
  41. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    extract at least one appearance characteristic of the user.
  42. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    correlate the at least one appearance characteristic of the user with the user equipment, according to the correlation between the user and the user equipment.
  43. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    push a service associated with the user to the user equipment, according to the correlation between the user and the user equipment.
  44. The apparatus as claimed in claim 23, wherein the memory and the computer program code is configured to, with the processor, further cause the apparatus to,
    provide a service associated with the user equipment to the user, according to the correlation between the user and the user equipment.
  45. A computer readable storage medium on which instructions are stored, the instructions, when executed by at least one processor, causing the at least one processor to perform the method according to any one of claims 1-22.
PCT/CN2020/085160 2020-04-16 2020-04-16 Method and apparatus for correlating a user and a user equipment WO2021208029A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/085160 WO2021208029A1 (en) 2020-04-16 2020-04-16 Method and apparatus for correlating a user and a user equipment
CN202080099859.6A CN115461695A (en) 2020-04-16 2020-04-16 Method and apparatus for correlating users and user devices

Publications (1)

Publication Number Publication Date
WO2021208029A1 (en) 2021-10-21

Family

ID=78083754

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085160 WO2021208029A1 (en) 2020-04-16 2020-04-16 Method and apparatus for correlating a user and a user equipment

Country Status (2)

Country Link
CN (1) CN115461695A (en)
WO (1) WO2021208029A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100217533A1 (en) * 2009-02-23 2010-08-26 Laburnum Networks, Inc. Identifying a Type of Motion of an Object
US20120052972A1 (en) * 2010-08-26 2012-03-01 Michael Bentley Wireless golf club motion capture apparatus
CN102778582A (en) * 2005-07-27 2012-11-14 讯宝科技公司 System and method for monitoring a mobile computing product/arrangement
US20140163393A1 (en) * 2009-05-20 2014-06-12 Sotera Wireless, Inc. Alarm system that processes both motion and vital signs using specific heuristic rules and thresholds
US20160023353A1 (en) * 2014-01-28 2016-01-28 Lam Research Corporation Wafer handling traction control system
WO2017143814A1 (en) * 2016-02-23 2017-08-31 深圳未网科技有限公司 Method, device and system for ball game data statistics, smart basketball and wrist band

Also Published As

Publication number Publication date
CN115461695A (en) 2022-12-09

Similar Documents

Publication Publication Date Title
US20170330031A1 (en) Fusing device and image motion for user identification, tracking and device association
US9303999B2 (en) Methods and systems for determining estimation of motion of a device
US9418279B2 (en) Detection of an object's varying features with a non-stationary device
US10142598B2 (en) Wearable terminal device, photographing system, and photographing method
JP2021536609A (en) Gaze point estimation method and system
WO2016040874A1 (en) Associating a user identity with a mobile device identity
Nguyen et al. IdentityLink: user-device linking through visual and RF-signal cues
Yuan Crowd monitoring using mobile phones
US9710708B1 (en) Method and apparatus for autonomously recognizing at least one object in an image
US10997474B2 (en) Apparatus and method for person detection, tracking, and identification utilizing wireless signals and images
Manos et al. Walking direction estimation using smartphone sensors: A deep network-based framework
Li et al. iPAC: Integrate pedestrian dead reckoning and computer vision for indoor localization and tracking
Tsai et al. Enabling identity-aware tracking via fusion of visual and inertial features
Chang et al. Eye on you: Fusing gesture data from depth camera and inertial sensors for person identification
Ikeda et al. Person identification by integrating wearable sensors and tracking results from environmental sensors
Zhai et al. Vm-tracking: Visual-motion sensing integration for real-time human tracking
WO2021208029A1 (en) Method and apparatus for correlating a user and a user equipment
Nowicki Wifi-guided visual loop closure for indoor navigation using mobile devices
Kornilova et al. Smartportraits: Depth powered handheld smartphone dataset of human portraits for state estimation, reconstruction and synthesis
Cao et al. Vitag: Online wifi fine time measurements aided vision-motion identity association in multi-person environments
Kempfle et al. Quaterni-On: Calibration-free Matching of Wearable IMU Data to Joint Estimates of Ambient Cameras
Wu et al. Qnalyzer: Queuing Recognition Using Accelerometer and Wi-Fi Signals
Amin et al. The Evolution of Wi-Fi Technology in Human Motion Recognition: Concepts, Techniques and Future Works
Zhu et al. Device-free intruder sensing leveraging fine-grained physical layer signatures
Dai et al. Interpersonal distance tracking with mmWave radar and IMUs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20930898

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20930898

Country of ref document: EP

Kind code of ref document: A1