US9270526B2 - Multi-terminal positioning method, and related device and system - Google Patents

Multi-terminal positioning method, and related device and system

Info

Publication number
US9270526B2
Authority
US
United States
Prior art keywords
terminals
terminal
trigger information
sensor
group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US14/573,531
Other languages
English (en)
Other versions
US20150188766A1 (en)
Inventor
Xingguang Song
Shiguo Lian
Wei Hu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Assigned to HUAWEI TECHNOLOGIES CO., LTD. reassignment HUAWEI TECHNOLOGIES CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HU, WEI, LIAN, SHIGUO, SONG, Xingguang
Publication of US20150188766A1 publication Critical patent/US20150188766A1/en
Application granted granted Critical
Publication of US9270526B2 publication Critical patent/US9270526B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W64/00 Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/08 Configuration management of networks or network elements
    • H04L41/0803 Configuration setting
    • H04L41/0813 Configuration setting characterised by the conditions triggering a change of settings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/10 Protocols in which an application is distributed across nodes in the network
    • H04L67/104 Peer-to-peer [P2P] networks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/14 Session management
    • H04L67/141 Setup of application sessions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/029 Location-based management or tracking services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W84/00 Network topologies
    • H04W84/18 Self-organising networks, e.g. ad-hoc networks or sensor networks
    • H04W84/20 Master-slave selection or change arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W88/00 Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02 Terminal devices

Definitions

  • the present invention relates to the field of intelligent terminal technologies, and in particular, to a multi-terminal positioning method, and a related device and system.
  • in the prior art, multi-terminal positioning mainly uses two terminals that simultaneously transmit an ultrasonic signal to each other; a distance from one terminal to the other terminal is calculated by using the ultrasonic signal received from the other terminal and the ultrasonic signal sent by the terminal itself.
  • this manner of multi-terminal positioning by transmitting ultrasound can be performed only between two terminals, and only the distance between the two terminals is calculated, so the positioning efficiency is low.
  • embodiments of the present invention mainly aim to provide a multi-terminal positioning method, and a related device and system, so as to resolve a problem of low efficiency of multi-terminal positioning in the prior art.
  • the present invention provides the following technical solutions:
  • the present invention provides a multi-terminal positioning method, where the method includes:
  • the primary terminal is a terminal among all terminals that meets a preset condition.
  • the sensor trigger information includes:
  • information about the sensor triggering type includes one or a combination of infrared triggering, light triggering and image triggering.
  • the calculating, by the primary terminal according to the recorded arrangement shape and demarcated gesture, and the sensor trigger information, an arrangement sequence of all terminals and relative positions between the terminals includes:
  • the method further includes:
  • the method further includes:
  • the method further includes:
  • the determining, by the primary terminal, whether an exception exists in the sensor trigger information includes:
  • the present invention provides a multi-terminal positioning method, where the method includes:
  • the primary terminal is a terminal among all terminals that meets a preset condition.
  • the sensor trigger information includes:
  • information about the sensor triggering type includes one or a combination of infrared triggering, light triggering and image triggering.
  • the present invention provides a primary terminal, where the primary terminal includes:
  • a first receiving unit configured to receive a collaboration request activation signal triggered by a user
  • a connecting unit configured to establish connections to secondary terminals in a wireless connection manner when the first receiving unit receives the collaboration request activation signal
  • a second receiving unit configured to receive device configuration parameters of the secondary terminals that have established connections to the primary terminal by using the connecting unit
  • a first calculating unit configured to calculate and display, according to the device configuration parameters received by the second receiving unit and the number of secondary terminals, at least one arrangement shape of all terminals and at least one demarcated gesture that matches each arrangement shape;
  • a recording unit configured to, after the first calculating unit calculates and displays the at least one arrangement shape of all terminals and the at least one demarcated gesture that matches each arrangement shape, record one arrangement shape selected by the user and one demarcated gesture that matches the arrangement shape;
  • a third receiving unit configured to receive sensor trigger information sent by at least one secondary terminal that has established a connection to the primary terminal by using the connecting unit;
  • a second calculating unit configured to calculate, according to the arrangement shape and the demarcated gesture that are recorded by the recording unit and the sensor trigger information received by the third receiving unit, an arrangement sequence of all terminals and relative positions between the terminals.
  • the primary terminal is a terminal among all terminals that meets a preset condition.
  • the sensor trigger information includes:
  • information about the sensor triggering type includes one or a combination of infrared triggering, light triggering and image triggering.
  • the second calculating unit includes:
  • an information determining subunit configured to determine, from at least one piece of sensor trigger information that is sent by each secondary terminal and received by the third receiving unit, sensor trigger information provided by a sensor of a highest priority
  • a sequence determining subunit configured to arrange the terminals in a sequence according to trigger time included in the sensor trigger information that is provided by the sensor of the highest priority in each terminal and determined by the information determining subunit, and determine the arrangement sequence of the terminals according to the arrangement shape and the demarcated gesture that are recorded by the recording unit;
  • a first position determining subunit configured to calculate, according to the trigger time included in the sensor trigger information that is provided by the sensor of the highest priority in each terminal and determined by the information determining subunit, a trigger time difference between successively triggered terminals as a relative position between the successively triggered terminals.
  • the second calculating unit further includes:
  • a second position determining subunit configured to calculate, according to an empirical speed value and the trigger time difference that is obtained by the first position determining subunit by means of calculation, a relative distance between the successively triggered terminals as the relative position between the successively triggered terminals.
  • the primary terminal further includes:
  • a failure prompting unit configured to display failure information
  • a first validity determining unit configured to determine whether the third receiving unit has received sensor trigger information detected by all the secondary terminals, where if a result of the determining of the first validity determining unit is yes, the second calculating unit calculates, according to the arrangement shape and the demarcated gesture that are recorded by the recording unit and the sensor trigger information received by the third receiving unit, the arrangement sequence of all terminals and the relative positions between the terminals; and if the result of the determining of the first validity determining unit is no, the failure prompting unit displays the failure information and the third receiving unit receives again sensor trigger information sent by at least one secondary terminal that has established a connection to the primary terminal by using the connecting unit.
  • the primary terminal further includes:
  • a failure prompting unit configured to display failure information
  • a second validity determining unit configured to determine whether an exception exists in the sensor trigger information received by the third receiving unit, where, if a result of the determining of the second validity determining unit is yes, the failure prompting unit displays the failure information, and the third receiving unit receives again the sensor trigger information sent by at least one secondary terminal that has established a connection to the primary terminal by using the connecting unit; and if the result of the determining of the second validity determining unit is no, the second calculating unit calculates, according to the arrangement shape and the demarcated gesture that are recorded by the recording unit and the sensor trigger information received by the third receiving unit, the arrangement sequence of all terminals and the relative positions between the terminals.
  • the second validity determining unit is specifically configured to:
  • the present invention provides a secondary terminal, where the secondary terminal includes:
  • a receiving unit configured to receive a collaboration request activation signal triggered by a user
  • a connecting unit configured to establish a connection to a primary terminal in a wireless connection manner when the receiving unit receives the collaboration request activation signal
  • a first sending unit configured to send a device configuration parameter to the primary terminal, so that the primary terminal calculates and displays, according to received device configuration parameters of the secondary terminals and the number of secondary terminals, at least one arrangement shape of all terminals and at least one demarcated gesture that matches each arrangement shape;
  • a second sending unit configured to, when it is detected that a sensor is triggered, send detected sensor trigger information to the primary terminal, so that the primary terminal calculates, according to the recorded arrangement shape and demarcated gesture, and the sensor trigger information, an arrangement sequence of all terminals and relative positions between the terminals.
  • the primary terminal is a terminal among all terminals that meets a preset condition.
  • the sensor trigger information includes:
  • information about the sensor triggering type includes one or a combination of infrared triggering, light triggering and image triggering.
  • a multi-terminal positioning system where the system includes:
  • the primary terminal is the primary terminal provided in the third aspect of the present invention.
  • the secondary terminal is the secondary terminal provided in the fourth aspect of the present invention.
  • a primary terminal informs a user of a possible arrangement shape of all terminals and a matching demarcated gesture.
  • the primary terminal may receive sensor trigger information detected by the terminals, and directly calculate an arrangement sequence of all terminals and relative positions of the terminals according to the sensor trigger information, a terminal arrangement shape selected by the user, and a corresponding demarcated gesture, so as to complete positioning of multiple terminals in only one demarcation process, thereby improving the positioning efficiency.
  • the sensor trigger information is obtained by using a gesture to trigger the sensors, which, with a strong anti-interference capability, is not easily subjected to interference from environmental noise.
  • FIG. 1 is a flowchart of Embodiment 1 of a multi-terminal positioning method in embodiments of the present invention
  • FIG. 2( a ) is a schematic diagram of a terminal arrangement shape according to an embodiment of the present invention.
  • FIG. 2( b ) is a schematic diagram of a terminal arrangement shape and a matching demarcated gesture according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a trigger manner according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of Embodiment 2 of the multi-terminal positioning method in the embodiments of the present invention.
  • FIG. 5 is a flowchart of Embodiment 3 of the multi-terminal positioning method in the embodiments of the present invention.
  • FIG. 6 is a schematic diagram of a terminal arrangement sequence according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of relative positions of terminals according to an embodiment of the present invention.
  • FIG. 8 is a flowchart of Embodiment 4 of the multi-terminal positioning method in the embodiments of the present invention.
  • FIG. 9 is a flowchart of an embodiment of a multi-terminal positioning system in the embodiments of the present invention.
  • FIG. 10 is a schematic diagram of Embodiment 1 of a primary terminal in the embodiments of the present invention.
  • FIG. 11 is a schematic diagram of Embodiment 2 of the primary terminal in the embodiments of the present invention.
  • FIG. 12 is a schematic diagram of Embodiment 3 of the primary terminal in the embodiments of the present invention.
  • FIG. 13 is a schematic diagram of Embodiment 4 of the primary terminal in the embodiments of the present invention.
  • FIG. 14 is a schematic diagram of Embodiment 1 of a secondary terminal in the embodiments of the present invention.
  • FIG. 15 is a schematic diagram of Embodiment 2 of the secondary terminal in the embodiments of the present invention.
  • a multi-terminal positioning method and a related device and system provided in the embodiments of the present invention can be applied to a multi-terminal collaboration scenario, where multi-terminal collaboration means that multiple terminal devices may be paired to perform various operations, and only after an arrangement sequence and relative positions between the terminals are determined can the terminals be used to perform collaborative operations, such as collaborative display and gesture recognition.
  • Embodiment 1 of a multi-terminal positioning method in the embodiments of the present invention may include the following steps, and a primary terminal is used as an execution body in the description of this embodiment:
  • Step 101 The primary terminal establishes, after receiving a collaboration request activation signal triggered by a user, connections to secondary terminals in a wireless connection manner.
  • a collaboration function needs to be activated for all terminals that need to participate in collaboration.
  • a user may trigger the collaboration request activation signal by starting a dedicated application (app).
  • the primary terminal establishes, after receiving the collaboration request activation signal triggered by the user, connections to the secondary terminals in a wireless connection manner, where the wireless connection manner may be any of various manners, including but not limited to Wi-Fi, Miracast, 3G (the third generation of mobile telecommunications technologies), Bluetooth, and so on.
  • Wi-Fi is a wireless compatibility certification and a technology that enables a personal computer, a handheld device, and other terminals to interconnect with each other in a wireless manner.
  • Miracast is a certification program established by the Wi-Fi Alliance and a wireless standard based on Wi-Fi Direct.
  • a terminal among all terminals that meets a preset condition may be determined as the primary terminal, and the other terminals are determined as the secondary terminals. That is, the primary terminal is a terminal among all terminals that meets a preset condition. For example, a terminal that first starts an app may be determined as the primary terminal, or a terminal whose processor has the highest clock frequency may be determined as the primary terminal.
  • the primary terminal can receive user operation information and information sent by the secondary terminals, and implement multi-terminal positioning.
  • the secondary terminals can send information such as a device configuration parameter or detected sensor trigger information to the primary terminal so that the primary terminal can implement multi-terminal positioning.
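By way of illustration only (this sketch is not part of the disclosed embodiments), the preset condition for electing the primary terminal could be evaluated as follows in Python; the fields app_start_time and cpu_frequency_hz are hypothetical stand-ins for "first starts an app" and "highest processor clock frequency".

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Terminal:
    terminal_id: str
    app_start_time: float    # hypothetical: when the collaboration app was started (epoch seconds)
    cpu_frequency_hz: float  # hypothetical: processor clock frequency

def elect_primary(terminals: List[Terminal]) -> Terminal:
    """Elect the primary terminal by the preset condition described above:
    earliest app start wins; ties go to the highest processor clock frequency."""
    return min(terminals, key=lambda t: (t.app_start_time, -t.cpu_frequency_hz))
```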
  • Step 102 The primary terminal calculates and displays, according to received device configuration parameters of the secondary terminals and the number of secondary terminals, at least one arrangement shape of all terminals and at least one demarcated gesture that matches each arrangement shape, and records one arrangement shape selected by the user and one demarcated gesture that matches the arrangement shape.
  • the secondary terminals may send device configuration parameters, for example, a terminal screen size parameter and a screen resolution, to the primary terminal.
  • the primary terminal may calculate, according to the device configuration parameters of the secondary terminals and the number of secondary terminals, the possible arrangement shapes of all terminals. For example, as shown in FIG. 2( a ), where three terminals are used as an example, an arrangement shape of the terminals may be a line, an L shape, an inverted L shape, or the like. Each arrangement shape may match multiple demarcated gestures, for example, a clockwise, counterclockwise, or ZigZag gesture.
  • As shown in FIG. 2( b ), a line-shaped arrangement may have two demarcated gestures: slide to the left and slide to the right; an L-shaped or inverted-L-shaped arrangement may have a clockwise or counterclockwise demarcated gesture.
  • the primary terminal may display the arrangement shape and the demarcated gesture in a visible manner for the user to select.
  • when the number of terminals is relatively small, all combinations of arrangement shapes and demarcated gestures may be displayed; and when the number of terminals is relatively large, a preferred combination of arrangement shapes and demarcated gestures may be displayed.
  • after the user selects the arrangement shape and the demarcated gesture, the primary terminal records the selected arrangement shape and the demarcated gesture that matches the arrangement shape.
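As a rough sketch under assumed layout names (only the three-terminal shapes of FIG. 2 are taken from the text), the enumeration of arrangement shapes and matching demarcated gestures that the primary terminal displays for selection might look like this:

```python
from typing import List, Tuple

def candidate_layouts(num_terminals: int) -> List[Tuple[str, List[str]]]:
    """Return (arrangement shape, matching demarcated gestures) pairs to display.

    Only the three-terminal case from FIG. 2 is spelled out here; other terminal
    counts would need their own tables or a generative rule.
    """
    if num_terminals == 3:
        return [
            ("line",       ["slide left", "slide right"]),
            ("L shape",    ["clockwise", "counterclockwise"]),
            ("inverted L", ["clockwise", "counterclockwise"]),
        ]
    return [("line", ["slide left", "slide right"])]  # fallback: a single row always works

options = candidate_layouts(num_terminals=3)
# The primary terminal displays the options and records the user's choice, e.g.:
selected_shape, selected_gesture = "line", "slide right"
```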
  • Step 103 The primary terminal receives sensor trigger information sent by at least one secondary terminal.
  • a sensor is triggered when the user slides on the terminal by using a demarcated gesture.
  • the sensor may include but is not limited to an infrared sensor, a light sensor, a terminal capacitive screen, an ultrasonic detector, or an image sensor. Therefore, the demarcated gesture used by the user may be an air gesture.
  • a trigger manner may be classified into slide triggering or press triggering.
  • the trigger manner is mainly determined according to a change time Δt of a sensor state S: If Δt is greater than a time constant K, it may be considered as press triggering; and if Δt is less than the time constant K, it is slide triggering.
  • the time constant K may be determined according to an empirical value and usually ranges from 1 s to 3 s.
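The slide/press decision above translates directly into code; the default K = 2 s is an assumed mid-range value within the 1 s to 3 s interval mentioned in the text.

```python
def classify_trigger(delta_t_seconds: float, k_seconds: float = 2.0) -> str:
    """Classify a sensor trigger from the duration Δt of the sensor state change:
    Δt greater than K means press triggering, otherwise slide triggering."""
    return "press" if delta_t_seconds > k_seconds else "slide"
```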
  • the primary terminal and the secondary terminals can detect a change of sensor signals, and terminals (including the primary terminal and the secondary terminals) send detected sensor trigger information to the primary terminal.
  • various sensors may be triggered in a gesture demarcation process, and in this case, various sensor trigger information is sent to the primary terminal altogether.
  • the sensor trigger information may include but is not limited to trigger time, a trigger manner, and a sensor triggering type.
  • the sensor triggering type may include but is not limited to one or a combination of infrared triggering, light triggering and image triggering. If the trigger manner is press triggering, the trigger time may be a boundary time when the triggering starts or ends.
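One possible way to model a single sensor trigger report is sketched below; the field names are illustrative and not claimed structure.

```python
from dataclasses import dataclass

@dataclass
class SensorTriggerInfo:
    terminal_id: str
    trigger_time: float   # for press triggering, a boundary time when triggering starts or ends
    trigger_manner: str   # "slide" or "press"
    sensor_type: str      # e.g. "infrared", "light", or "image"
```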
  • Step 104 The primary terminal calculates, according to the recorded arrangement shape and demarcated gesture, and the sensor trigger information, an arrangement sequence of all terminals and relative positions between the terminals.
  • the primary terminal may calculate the arrangement sequence of all terminals and the relative positions between the terminals according to the sensor trigger information detected by the terminals.
  • the arrangement sequence indicates horizontal, vertical and like arrangement sequences of the terminals, and a relative position indicates a specific position relationship, for example a distance, between terminals.
  • a primary terminal informs a user of a possible arrangement shape of all terminals and a matching demarcated gesture.
  • the primary terminal may receive sensor trigger information detected by the terminals, and directly calculate an arrangement sequence of all terminals and relative positions between the terminals according to the sensor trigger information, a terminal arrangement shape selected by the user, and a corresponding demarcated gesture, so as to complete positioning of multiple terminals in only one demarcation process, thereby improving the positioning efficiency.
  • the sensor trigger information is obtained by using a gesture to trigger the sensors, which, with a strong anti-interference capability, is not easily subjected to interference from environmental noise.
  • validity of the sensor trigger information may be determined in advance to improve accuracy of subsequent calculation of the arrangement sequence of all terminals and the relative positions of the terminals.
  • the multi-terminal positioning method further includes:
  • the multi-terminal positioning method further includes:
  • Embodiment 2 of the multi-terminal positioning method in the embodiments of the present invention may include the following steps:
  • Step 401 A primary terminal establishes, after receiving a collaboration request activation signal triggered by a user, connections to secondary terminals in a wireless connection manner.
  • Step 402 The primary terminal calculates and displays, according to received device configuration parameters of the secondary terminals and the number of secondary terminals, at least one arrangement shape of all terminals and at least one demarcated gesture that matches each arrangement shape, and records one arrangement shape selected by the user and one demarcated gesture that matches the arrangement shape.
  • Step 403 The primary terminal receives sensor trigger information sent by at least one secondary terminal.
  • Step 404 The primary terminal determines whether sensor trigger information sent by all the secondary terminals is received; if yes, goes to step 405 ; and if no, goes to step 406 .
  • the primary terminal receives the sensor trigger information of the terminals; if the primary terminal is a terminal that also needs to be positioned, this information includes the sensor trigger information of the primary terminal as well as the sensor trigger information of the secondary terminals.
  • the primary terminal determines whether the sensor trigger information of all terminals is received, so as to preliminarily determine validity of the information, and if yes, preliminarily determines that the information is valid and that further validity determination is needed; otherwise, the primary terminal displays failure information in a visible or another manner to inform the user that gesture demarcation is unsuccessful and a new demarcation is required.
  • Step 405 The primary terminal determines whether an exception exists in the sensor trigger information; if yes, goes to step 406 ; and if no, goes to step 407 .
  • after receiving the sensor trigger information of all terminals, the primary terminal needs to determine the validity of the trigger information and filter it.
  • the purpose of validity determination is to remove false trigger information caused by a system delay or a detection error, so as to improve system calculation accuracy, where common false trigger information includes abnormal trigger time, an abnormal sensor triggering type, or the like.
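The two validity checks could be combined as sketched below, reusing the SensorTriggerInfo record from the earlier sketch; the concrete outlier rule (a known sensor type and a trigger time inside the demarcation time window) is an assumption, since the embodiment does not fix a specific test.

```python
from typing import Dict, List

VALID_SENSOR_TYPES = {"infrared", "light", "image"}

def trigger_info_is_valid(reports: Dict[str, List[SensorTriggerInfo]],
                          expected_terminals: List[str],
                          window_start: float,
                          window_end: float) -> bool:
    """Check 1: every terminal taking part in positioning has reported.
    Check 2: no report is obviously false-triggered (unknown sensor type, or a
    trigger time outside the demarcation window caused by delay or detection error)."""
    if any(t not in reports or not reports[t] for t in expected_terminals):
        return False  # demarcation failed; prompt the user to demarcate again
    for infos in reports.values():
        for info in infos:
            if info.sensor_type not in VALID_SENSOR_TYPES:
                return False
            if not (window_start <= info.trigger_time <= window_end):
                return False
    return True
```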
  • an arrangement sequence and relative positions of the terminals may be obtained by means of calculation according to a combination of a terminal arrangement shape parameter and demarcated gesture information.
  • a specific implementation manner in which the primary terminal determines whether an exception exists in the sensor trigger information may include:
  • Step 406 The primary terminal displays failure information, and returns to step 403 .
  • Step 407 The primary terminal calculates, according to the recorded arrangement shape and demarcated gesture, and the sensor trigger information, an arrangement sequence of all terminals and relative positions between the terminals.
  • a primary terminal determines whether sensor trigger information detected by all terminals is received, determines whether an exception exists in the sensor trigger information, determines validity of the sensor trigger information, and notifies a user in time to perform multi-terminal positioning again when a gesture demarcation fails, thereby improving the positioning efficiency, avoiding inaccurate multi-terminal positioning caused by an abnormal sensor or other reasons, improving the calculation accuracy, and optimizing the implementation of multi-terminal positioning.
  • Embodiment 3 of the multi-terminal positioning method in the embodiments of the present invention may include the following steps.
  • a specific implementation manner in which a primary terminal calculates an arrangement sequence of all terminals and relative positions between the terminals according to a recorded arrangement shape and demarcated gesture, and sensor trigger information is mainly described.
  • Step 501 The primary terminal determines, from at least one piece of sensor trigger information sent by each secondary terminal, sensor trigger information provided by a sensor of a highest priority.
  • the primary terminal filters the received sensor trigger information and retains, for each terminal, one piece of the most significant sensor trigger information.
  • a sensor priority may be set, and infrared trigger information in multiple pieces of sensor trigger information of a terminal is determined as sensor trigger information provided by a sensor of a highest priority.
  • light trigger information may be used as the sensor trigger information provided by a sensor of a highest priority.
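Keeping one report per terminal by sensor priority might be sketched as follows, again reusing SensorTriggerInfo; the priority order (infrared before light before image) follows the examples above and is otherwise an assumption.

```python
from typing import Dict, List

SENSOR_PRIORITY = {"infrared": 0, "light": 1, "image": 2}  # lower value = higher priority

def best_report_per_terminal(reports: Dict[str, List[SensorTriggerInfo]]) -> Dict[str, SensorTriggerInfo]:
    """Retain, for each terminal, the trigger report from its highest-priority sensor."""
    return {
        terminal_id: min(infos, key=lambda info: SENSOR_PRIORITY.get(info.sensor_type, 99))
        for terminal_id, infos in reports.items()
    }
```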
  • Step 502 The primary terminal arranges the terminals in a sequence according to trigger time included in the sensor trigger information provided by the sensor of the highest priority in each terminal, and determines the arrangement sequence of the terminals according to the recorded arrangement shape and demarcated gesture.
  • the primary terminal arranges the sensor trigger information of the terminals according to a time sequence, and the arrangement shape of the terminals and a direction of the demarcated gesture are known; and therefore, the arrangement sequence of the terminals may be obtained.
  • the arrangement sequence may be described by using an arrangement matrix; however, a manner in which the arrangement sequence of the terminals is described is not limited to use of an arrangement matrix.
  • three terminals are used as an example.
  • the arrangement sequence of the terminals may be described by an arrangement matrix [T1 T2 T3] when the three terminals are arranged in a row from left to right.
  • when the terminals are arranged in another shape, for example in a column, the arrangement sequence of the terminals may be described by a correspondingly shaped arrangement matrix, for example a column matrix whose elements are T1, T2, and T3 from top to bottom.
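Continuing the sketch, ordering the terminals by the retained trigger times and writing them out along the recorded shape and gesture direction could look like this; how a shape and gesture map onto matrix coordinates is assumed for illustration, and only a left/right line is handled.

```python
from typing import Dict, List

def arrangement_sequence(best: Dict[str, SensorTriggerInfo], gesture: str) -> List[str]:
    """Return terminal IDs in left-to-right order for a line-shaped arrangement,
    e.g. ['T1', 'T2', 'T3'].  Terminals are sorted by trigger time; for a
    'slide left' gesture the first-triggered terminal is the rightmost one,
    so the time order is reversed before writing the arrangement matrix."""
    ordered = [tid for tid, info in sorted(best.items(), key=lambda kv: kv[1].trigger_time)]
    return list(reversed(ordered)) if gesture == "slide left" else ordered
```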
  • Step 503 The primary terminal calculates, according to the trigger time included in the sensor trigger information provided by the sensor of the highest priority in each terminal, a trigger time difference between successively triggered terminals as a relative position between the successively triggered terminals.
  • a relative position between terminals may refer to a distance between the terminals.
  • this distance mainly refers to a distance between sensors.
  • the distance may be measured in many manners, for example, a physical distance, or according to a time difference between triggered sensors in a gesture sliding process (herein it is assumed that a person's gesture sliding speed is basically constant in one scenario).
  • a time difference Δt may be used to describe a relative position between terminals.
  • the primary terminal calculates a trigger time difference between sensor trigger information of successively triggered sensors and obtains a relative position between the terminals.
  • the relative position may be described by using a position matrix; however, a manner in which relative positions between terminals are described is not limited to the position matrix.
  • for example, if a trigger time difference between T1 and T2 is Δt12 and a trigger time difference between T2 and T3 is Δt23, the relative positions between the terminals may be described by a position matrix [Δt12 Δt23].
  • for an arrangement described by a column arrangement matrix, the relative positions of the terminals may likewise be described by a column position matrix whose elements are Δt12 and Δt23.
  • the multi-terminal positioning method may further include:
  • the relative position may be described by using the trigger time difference, or the relative position may be described by calculating an approximate distance between terminals by using the empirical speed value, or a calculated sliding speed, and the trigger time difference.
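Converting the trigger time differences into approximate distances is then a single multiplication by the (assumed constant) sliding speed; the 0.3 m/s default below is a placeholder, not a value taken from the embodiments.

```python
from typing import Dict, List

def relative_positions(best: Dict[str, SensorTriggerInfo],
                       sequence: List[str],
                       sliding_speed_m_s: float = 0.3) -> List[float]:
    """Return approximate distances between successively triggered terminals:
    each entry is the trigger time difference between neighbours in the
    demarcation sequence multiplied by the empirical sliding speed,
    e.g. [d12, d23] for the sequence [T1, T2, T3]."""
    times = [best[tid].trigger_time for tid in sequence]
    return [(t2 - t1) * sliding_speed_m_s for t1, t2 in zip(times, times[1:])]
```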
  • a primary terminal implements calculation of an arrangement sequence of all terminals and relative positions between the terminals according to a recorded arrangement shape and demarcated gesture, and sensor trigger information, so as to complete positioning of multiple terminals in only one demarcation process, thereby improving efficiency of positioning multiple terminals.
  • Embodiment 4 of the multi-terminal positioning method in the embodiments of the present invention may include the following steps, and a secondary terminal is used as an execution body in the description of this embodiment:
  • Step 801 The secondary terminal establishes, after receiving a collaboration request activation signal triggered by a user, a connection to a primary terminal in a wireless connection manner.
  • the collaboration request activation signal may be sent by starting a dedicated app.
  • after receiving the collaboration request activation signal triggered by the user, the secondary terminal establishes a connection to the primary terminal in a wireless connection manner, where the wireless connection manner may be any of various manners, including but not limited to Wi-Fi, Miracast, 3G, Bluetooth, and so on.
  • a terminal among all terminals that meets a preset condition may be determined as the primary terminal, and other terminals are determined as secondary terminals. That is, the primary terminal is a terminal among all terminals that meets a preset condition.
  • the primary terminal can receive user operation information and information sent by the secondary terminals, and implement multi-terminal positioning.
  • the secondary terminals can send information such as a device configuration parameter or detected sensor trigger information to the primary terminal so that the primary terminal can implement multi-terminal positioning.
  • Step 802 The secondary terminal sends a device configuration parameter to the primary terminal, so that the primary terminal calculates and displays, according to received device configuration parameters of secondary terminals and the number of secondary terminals, at least one arrangement shape of all terminals and at least one demarcated gesture that matches each arrangement shape.
  • the secondary terminals may send device configuration parameters, for example, a terminal screen size parameter and a screen resolution, to the primary terminal, so that the primary terminal can calculate, according to the device configuration parameters of the secondary terminals and the number of secondary terminals, a possible arrangement shape of all terminals and multiple demarcated gestures that match each arrangement shape.
  • Step 803 When it is detected that a sensor is triggered, the secondary terminal sends detected sensor trigger information to the primary terminal, so that the primary terminal calculates, according to the recorded arrangement shape and demarcated gesture, and the sensor trigger information, an arrangement sequence of all terminals and relative positions between the terminals.
  • a sensor is triggered when the user slides on the terminal by using a demarcated gesture.
  • the sensor may include but is not limited to an infrared sensor, a light sensor, a terminal capacitive screen, an ultrasonic detector, or an image sensor. Therefore, the demarcated gesture used by the user may be an air gesture.
  • the sensors integrated in the terminals are basically switching sensors, and some sensors are further capable of detecting several discrete states. Therefore, trigger manners may be classified into slide triggering and press triggering.
  • the secondary terminal can detect a change of sensor signals and send detected sensor trigger information to the primary terminal.
  • various sensors may be triggered in a gesture demarcation process, and various sensor trigger information is sent to the primary terminal altogether.
  • the sensor trigger information may include but is not limited to trigger time, a trigger manner, and a sensor triggering type.
  • Information about the sensor triggering type may include but is not limited to one or a combination of infrared triggering, light triggering and image triggering. If the trigger manner is press triggering, information about the trigger time may be a boundary time when the triggering starts or ends.
  • when a user triggers sensors of a terminal by using a demarcated gesture, a secondary terminal sends detected sensor trigger information to a primary terminal, so that the primary terminal may directly calculate an arrangement sequence of all terminals and relative positions of the terminals according to the sensor trigger information, a terminal arrangement shape selected by the user, and a corresponding demarcated gesture, so as to complete positioning of multiple terminals in only one demarcation process, thereby improving the positioning efficiency.
  • the sensor trigger information is obtained by using a gesture to trigger the sensors, which, with a strong anti-interference capability, is not easily subjected to interference from environmental noise.
  • the present invention further provides an embodiment of a multi-terminal positioning system where the system includes a primary terminal 901 and several secondary terminals 902 .
  • a terminal among all terminals that meets a preset condition may be determined as a primary terminal, and other terminals are determined as secondary terminals. That is, the primary terminal is a terminal among all terminals that meets the preset condition.
  • Embodiment 1 of the primary terminal in the embodiments of the present invention may include:
  • a first receiving unit 1001 configured to receive a collaboration request activation signal triggered by a user
  • a connecting unit 1002 configured to establish connections to secondary terminals in a wireless connection manner when the first receiving unit receives the collaboration request activation signal
  • a second receiving unit 1003 configured to receive device configuration parameters of the secondary terminals that have established connections to the primary terminal by using the connecting unit;
  • a first calculating unit 1004 configured to calculate and display, according to the device configuration parameters received by the second receiving unit and the number of secondary terminals, at least one arrangement shape of all terminals and at least one demarcated gesture that matches each arrangement shape;
  • a recording unit 1005 configured to, after the first calculating unit calculates and displays the at least one arrangement shape of all terminals and the at least one demarcated gesture that matches each arrangement shape, record one arrangement shape selected by the user and one demarcated gesture that matches the arrangement shape;
  • a third receiving unit 1006 configured to receive sensor trigger information sent by at least one secondary terminal that has established a connection to the primary terminal by using the connecting unit.
  • the sensor trigger information may include: trigger time, a trigger manner, and a sensor triggering type.
  • Information about the sensor triggering type includes one or a combination of infrared triggering, light triggering and image triggering.
  • a second calculating unit 1007 is configured to calculate, according to the arrangement shape and the demarcated gesture that are recorded by the recording unit and the sensor trigger information received by the third receiving unit, an arrangement sequence of all terminals and relative positions between the terminals.
  • the primary terminal according to this embodiment of the present invention may further include:
  • a failure prompting unit configured to display failure information.
  • the primary terminal according to this embodiment of the present invention may further include:
  • a first validity determining unit configured to determine whether the third receiving unit has received sensor trigger information detected by all the secondary terminals, where if a result of the determining of the first validity determining unit is yes, the second calculating unit calculates, according to the arrangement shape and the demarcated gesture that are recorded by the recording unit and the sensor trigger information received by the third receiving unit, the arrangement sequence of all terminals and the relative positions between the terminals; and if the result of the determining of the first validity determining unit is no, the failure prompting unit displays the failure information, and the third receiving unit receives again the sensor trigger information sent by at least one secondary terminal that has established a connection to the primary terminal by using the connecting unit.
  • the primary terminal according to this embodiment of the present invention may further include:
  • a second validity determining unit configured to determine whether an exception exists in the sensor trigger information received by the third receiving unit, where if a result of the determining of the second validity determining unit is yes, the failure prompting unit displays the failure information, and the third receiving unit receives again the sensor trigger information sent by at least one secondary terminal that has established a connection to the primary terminal by using the connecting unit; and if the result of the determining of the second validity determining unit is no, the second calculating unit calculates, according to the arrangement shape and the demarcated gesture that are recorded by the recording unit and the sensor trigger information received by the third receiving unit, the arrangement sequence of all terminals and the relative positions between the terminals.
  • Embodiment 2 of the primary terminal in the embodiments of the present invention may include:
  • a first receiving unit 1101 configured to receive a collaboration request activation signal triggered by a user
  • a connecting unit 1102 configured to establish connections to secondary terminals in a wireless connection manner when the first receiving unit receives the collaboration request activation signal
  • a second receiving unit 1103 configured to receive device configuration parameters of the secondary terminals that have established connections to the primary terminal by using the connecting unit;
  • a first calculating unit 1104 configured to calculate and display, according to the device configuration parameters received by the second receiving unit and the number of secondary terminals, at least one arrangement shape of all terminals and at least one demarcated gesture that matches each arrangement shape;
  • a recording unit 1105 configured to, after the first calculating unit calculates and displays the at least one arrangement shape of all terminals and the at least one demarcated gesture that matches each arrangement shape, record one arrangement shape selected by the user and one demarcated gesture that matches the arrangement shape;
  • a third receiving unit 1106 configured to receive sensor trigger information sent by at least one secondary terminal that has established a connection to the primary terminal by using the connecting unit;
  • a first validity determining unit 1107 configured to determine whether the third receiving unit has received sensor trigger information detected by all the secondary terminals, where if a result of the determining of the first validity determining unit is yes, the second calculating unit calculates, according to the arrangement shape and the demarcated gesture that are recorded by the recording unit and the sensor trigger information received by the third receiving unit, an arrangement sequence of all terminals and relative positions between the terminals; and if the result of the determining of the first validity determining unit is no, a failure prompting unit displays the failure information, and the third receiving unit receives again sensor trigger information sent by at least one secondary terminal that has established a connection to the primary terminal by using the connecting unit; and
  • a second validity determining unit 1108 configured to determine whether an exception exists in the sensor trigger information received by the third receiving unit, where if a result of the determining of the second validity determining unit is yes, the failure prompting unit displays the failure information, and the third receiving unit receives again sensor trigger information sent by at least one secondary terminal that has established a connection to the primary terminal by using the connecting unit; and if the result of the determining of the second validity determining unit is no, the second calculating unit calculates, according to the arrangement shape and the demarcated gesture that are recorded by the recording unit and the sensor trigger information received by the third receiving unit, the arrangement sequence of all terminals and the relative positions between the terminals, where,
  • the second validity determining unit may be specifically configured to:
  • the failure prompting unit 1109 configured to display failure information
  • a second calculating unit 1110 configured to calculate, according to the arrangement shape and the demarcated gesture that are recorded by the recording unit and the sensor trigger information received by the third receiving unit, the arrangement sequence of all terminals and the relative positions between the terminals.
  • a primary terminal informs a user of a possible arrangement shape of all terminals and a matching demarcated gesture.
  • the primary terminal may receive sensor trigger information detected by the terminals, directly calculate an arrangement sequence of all terminals and relative positions of the terminals according to the sensor trigger information, a terminal arrangement shape selected by the user, and a corresponding demarcated gesture, so as to complete positioning of multiple terminals in only one demarcation process, thereby improving the positioning efficiency.
  • the sensor trigger information is obtained by using a gesture to trigger the sensors, which, with a strong anti-interference capability, is not easily subjected to interference from environmental noise.
  • the second calculating unit may include:
  • an information determining subunit 1201 configured to determine, from at least one piece of sensor trigger information that is sent by each secondary terminal and received by the third receiving unit, sensor trigger information provided by a sensor of a highest priority;
  • a sequence determining subunit 1202 configured to arrange the terminals in a sequence according to trigger time included in the sensor trigger information that is provided by the sensor of the highest priority in each terminal and determined by the information determining subunit, and determine the arrangement sequence of the terminals according to the arrangement shape and the demarcated gesture that are recorded by the recording unit;
  • a first position determining subunit 1203 configured to calculate, according to the trigger time included in the sensor trigger information that is provided by the sensor of the highest priority in each terminal and determined by the information determining subunit, a trigger time difference between successively triggered terminals as a relative position between the successively triggered terminals.
  • the second calculating unit may further include:
  • a second position determining subunit configured to calculate, according to an empirical speed value and the trigger time difference that is obtained by the first position determining subunit by means of calculation, a relative distance between the successively triggered terminals as the relative position between the successively triggered terminals.
  • a primary terminal implements calculation of an arrangement sequence of all terminals and relative positions between the terminals according to a recorded arrangement shape and demarcated gesture, and sensor trigger information, and completes positioning of multiple terminals in only one demarcation process, thereby improving efficiency of positioning multiple terminals.
  • Embodiment 4 of the primary terminal in the embodiments of the present invention may include:
  • an input apparatus 1301, an output apparatus 1302, a processor 1303, and a memory 1304 (there may be one or more processors 1303 in the primary terminal 1300, and one processor 1303 is used as an example in FIG. 13).
  • the input apparatus 1301, the output apparatus 1302, the processor 1303, and the memory 1304 may be connected by using a bus or in other manners. A connection by using a bus is used as an example in FIG. 13.
  • the processor 1303 is configured to perform the following steps:
  • the processor 1303 is further configured to perform the following steps:
  • the processor 1303 is further configured to perform the following step:
  • the processor 1303 is further configured to perform the following steps:
  • the processor 1303 is further configured to perform the following steps:
  • the processor 1303 is further configured to perform the following step:
  • a processor 1303 may receive sensor trigger information detected by the terminals, and directly calculate an arrangement sequence of all terminals and relative positions of the terminals according to the sensor trigger information, a terminal arrangement shape selected by the user, and a corresponding demarcated gesture, so as to complete positioning of multiple terminals in only one demarcation process, thereby improving the positioning efficiency.
  • the sensor trigger information is obtained by using a gesture to trigger the sensors, which, with a strong anti-interference capability, is not easily subjected to interference from environmental noise.
  • a terminal among all terminals that meets a preset condition may be determined as the primary terminal, and other terminals are determined as the secondary terminals. That is, the primary terminal is a terminal among all terminals that meets the preset condition.
  • an embodiment of a secondary terminal in the embodiments of the present invention may include:
  • a receiving unit 1401 configured to receive a collaboration request activation signal triggered by a user
  • a connecting unit 1402 configured to establish a connection to a primary terminal in a wireless connection manner when the receiving unit receives the collaboration request activation signal
  • a first sending unit 1403 configured to send a device configuration parameter to the primary terminal, so that the primary terminal calculates and displays, according to received device configuration parameters of secondary terminals and the number of secondary terminals, at least one arrangement shape of all terminals and at least one demarcated gesture that matches each arrangement shape;
  • a second sending unit 1404 configured to, when it is detected that a sensor is triggered, send detected sensor trigger information to the primary terminal, so that the primary terminal calculates, according to the recorded arrangement shape and demarcated gesture, and the sensor trigger information, an arrangement sequence of all terminals and relative positions between the terminals.
  • the sensor trigger information may include: trigger time, a trigger manner, and a sensor triggering type.
  • Information about the sensor triggering type includes one or a combination of infrared triggering, light triggering and image triggering.
  • when a user triggers sensors of a terminal by using a demarcated gesture, a secondary terminal sends detected sensor trigger information to a primary terminal, so that the primary terminal may directly calculate an arrangement sequence of all terminals and relative positions of the terminals according to the sensor trigger information, a terminal arrangement shape selected by the user, and a corresponding demarcated gesture, so as to complete positioning of multiple terminals in only one demarcation process, thereby improving the positioning efficiency.
  • the sensor trigger information is obtained by using a gesture to trigger the sensors, which, with a strong anti-interference capability, is not easily subjected to interference from environmental noise.
  • Embodiment 2 of the secondary terminal in the embodiments of the present invention may include:
  • an input apparatus 1501, an output apparatus 1502, a processor 1503, and a memory 1504 (there may be one or more processors 1503 in the secondary terminal 1500, and one processor 1503 is used as an example in FIG. 15).
  • the input apparatus 1501, the output apparatus 1502, the processor 1503, and the memory 1504 may be connected by using a bus or in other manners. A connection by using a bus is used as an example in FIG. 15.
  • the processor 1503 is configured to perform the following steps:
  • when a user triggers sensors of a terminal by using a demarcated gesture, the processor 1503 sends detected sensor trigger information to a primary terminal, so that the primary terminal may directly calculate an arrangement sequence of all terminals and relative positions of the terminals according to the sensor trigger information, a terminal arrangement shape selected by the user, and a corresponding demarcated gesture, so as to complete positioning of multiple terminals in only one demarcation process, thereby improving the positioning efficiency.
  • the sensor trigger information is obtained by using a gesture to trigger the sensors, which, with a strong anti-interference capability, is not easily subjected to interference from environmental noise.
  • the disclosed system, apparatus, and method may be implemented in other manners.
  • the described apparatus embodiment is merely exemplary.
  • the unit division is merely logical function division and may be other division in actual implementation.
  • a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces.
  • the indirect couplings or communication connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
  • the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. A part or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • functional units in the embodiments of the present invention may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units are integrated into one unit.
  • the integrated unit may be implemented in a form of hardware, or may be implemented in a form of a software functional unit.
  • When the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, the integrated unit may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the present invention essentially, or the part contributing to the prior art, or all or a part of the technical solutions may be implemented in the form of a software product.
  • the software product is stored in a storage medium and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or a part of the steps of the methods described in the embodiments of the present invention.
  • the foregoing storage medium includes: any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, or an optical disc.
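
The following is a minimal sketch, not taken from the patent, of how a primary terminal might enumerate candidate arrangement shapes from the number of terminals in the group; the helper name candidate_shapes and the restriction to rectangular grids are assumptions for illustration only. The device configuration parameters (for example, screen sizes) could then filter or orient these candidates, as the first item in the list above describes.

    # Hypothetical helper: candidate rectangular arrangement shapes for n_terminals
    # (the primary terminal plus the secondary terminals), derived from factor pairs.
    def candidate_shapes(n_terminals: int) -> list[tuple[int, int]]:
        return [(rows, n_terminals // rows)
                for rows in range(1, n_terminals + 1)
                if n_terminals % rows == 0]

    print(candidate_shapes(4))   # [(1, 4), (2, 2), (4, 1)]
    print(candidate_shapes(6))   # [(1, 6), (2, 3), (3, 2), (6, 1)]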
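As a rough illustration of the sensor trigger information described above (trigger time, trigger manner, and sensor triggering type), the sketch below models it as a small record and sends it to the primary terminal over a plain socket. The class and function names, the JSON wire format, and the primary_addr value are all assumptions; the patent does not specify a transport or encoding.

    import json
    import socket
    import time
    from dataclasses import dataclass, asdict
    from enum import Enum

    class TriggerType(Enum):
        # "sensor triggering type": one (or a combination) of these
        INFRARED = "infrared"
        LIGHT = "light"
        IMAGE = "image"

    @dataclass
    class SensorTriggerInfo:
        terminal_id: str        # which secondary terminal detected the trigger
        trigger_time: float     # when the sensor fired (a shared clock is assumed)
        trigger_manner: str     # e.g. "swipe-left-to-right"
        trigger_type: TriggerType

    def report_trigger(info: SensorTriggerInfo, primary_addr: tuple[str, int]) -> None:
        """Send the detected trigger information to the primary terminal (hypothetical wire format)."""
        payload = asdict(info)
        payload["trigger_type"] = info.trigger_type.value
        with socket.create_connection(primary_addr, timeout=2) as sock:
            sock.sendall(json.dumps(payload).encode("utf-8"))

    # Example (assuming the primary terminal listens at 192.0.2.10:9000):
    # report_trigger(SensorTriggerInfo("B", time.time(), "swipe-left-to-right", TriggerType.LIGHT),
    #                ("192.0.2.10", 9000))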
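Finally, here is a positioning sketch, under stated assumptions rather than the patented algorithm, of how a primary terminal could turn the reported trigger times into an arrangement sequence and relative positions: if the recorded demarcated gesture crosses the slots of the selected arrangement shape in a known order, sorting the terminals by trigger time assigns each one a slot. The function name position_terminals and the gesture-path representation are illustrative only.

    from typing import Dict, List, Tuple

    def position_terminals(
        gesture_path: List[Tuple[int, int]],   # (row, col) slots in the order the gesture crosses them
        triggers: Dict[str, float],            # terminal_id -> reported trigger time
    ) -> Dict[str, Tuple[int, int]]:
        ordered_ids = sorted(triggers, key=triggers.get)   # arrangement sequence of the terminals
        return dict(zip(ordered_ids, gesture_path))        # relative position of each terminal

    # Example: a 2x2 arrangement whose demarcated gesture sweeps the top row left to right,
    # then the bottom row left to right.
    path = [(0, 0), (0, 1), (1, 0), (1, 1)]
    reported = {"A": 0.12, "C": 0.55, "B": 0.31, "D": 0.78}   # seconds since the gesture started
    print(position_terminals(path, reported))
    # {'A': (0, 0), 'B': (0, 1), 'C': (1, 0), 'D': (1, 1)}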

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)
US14/573,531 2013-12-30 2014-12-17 Multi-terminal positioning method, and related device and system Active US9270526B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201310746331.5A CN104748737B (zh) 2013-12-30 2013-12-30 Multi-terminal positioning method, and related device and system
CN201310746331.5 2013-12-30
CN201310746331 2013-12-30

Publications (2)

Publication Number Publication Date
US20150188766A1 US20150188766A1 (en) 2015-07-02
US9270526B2 true US9270526B2 (en) 2016-02-23

Family

ID=52354671

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/573,531 Active US9270526B2 (en) 2013-12-30 2014-12-17 Multi-terminal positioning method, and related device and system

Country Status (5)

Country Link
US (1) US9270526B2 (ko)
EP (1) EP2908238B1 (ko)
JP (1) JP6065234B2 (ko)
KR (1) KR101623396B1 (ko)
CN (1) CN104748737B (ko)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016162132A (ja) * 2015-02-27 2016-09-05 ソニー株式会社 Information processing apparatus, information processing method, and program
JP6517623B2 (ja) * 2015-08-04 2019-05-22 株式会社東芝 Wireless device arrangement estimation apparatus, wireless device arrangement estimation method, and wireless device arrangement estimation program
WO2020141857A1 * 2018-12-31 2020-07-09 엘지전자 주식회사 Method for a first terminal to measure a distance to a second terminal in a wireless communication system, and terminal
CN111273769B (zh) * 2020-01-15 2022-06-17 Oppo广东移动通信有限公司 Device control method and apparatus, electronic device, and storage medium
CN114666784A (zh) * 2020-12-23 2022-06-24 维沃移动通信有限公司 Method for reporting terminal sensor information, terminal, and readable storage medium
CN115278546A (zh) * 2021-04-30 2022-11-01 维沃移动通信有限公司 Data transmission method, related device, and readable storage medium
CN116222546B (zh) * 2023-05-10 2023-07-25 北京白水科技有限公司 Method, apparatus, and device for generating map information in group navigation and positioning

Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003271118A (ja) 2002-03-15 2003-09-25 Toshiba Corp Multi-screen setting method
US6681110B1 (en) * 1999-07-02 2004-01-20 Musco Corporation Means and apparatus for control of remote electrical devices
US20090197612A1 (en) * 2004-10-29 2009-08-06 Arto Kiiskinen Mobile telephone location application
JP2009296171A (ja) 2008-06-03 2009-12-17 Panasonic Corp Mobile communication terminal
CN101674364A (zh) 2009-09-28 2010-03-17 深圳华为通信技术有限公司 Wireless screen splicing display method, mobile communication terminal, and apparatus
JP2011048610A (ja) 2009-08-27 2011-03-10 Jvc Kenwood Holdings Inc Image display system and image display method
US20110209058A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen hold and tap gesture
US20110209089A1 (en) 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen object-hold and page-change gesture
US20110209103A1 (en) 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen hold and drag gesture
US20110209101A1 (en) 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen pinch-to-pocket gesture
US20110209039A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen bookmark hold gesture
US20110209102A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen dual tap gesture
US20110209104A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen synchronous slide gesture
US20110209057A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen hold and page-flip gesture
US20110209100A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen pinch and expand gestures
US20110252317A1 (en) 2010-04-08 2011-10-13 Nokia Corporation Method, apparatus and computer program product for joining the displays of multiple devices
JP2011237532A (ja) 2010-05-07 2011-11-24 Nec Casio Mobile Communications Ltd Terminal device, terminal communication system, and program
US20120081303A1 (en) 2010-10-01 2012-04-05 Ron Cassar Handling gestures for changing focus
US20120117495A1 (en) 2010-10-01 2012-05-10 Imerj, Llc Dragging an application to a screen using the application manager
US20120124490A1 (en) 2010-10-01 2012-05-17 Imerj, Llc Full-screen annunciator
JP2012208720A (ja) 2011-03-29 2012-10-25 Fujitsu Ltd Server, terminal device, and grouping method
US20120280898A1 (en) 2011-05-03 2012-11-08 Nokia Corporation Method, apparatus and computer program product for controlling information detail in a multi-device environment
US20130005354A1 (en) * 2011-06-30 2013-01-03 Suman Sheilendra Recognition System
US20130080933A1 (en) 2011-08-24 2013-03-28 Paul E. Reeves Opening applications in unified desktop
JP2013195957A (ja) 2012-03-22 2013-09-30 Ntt Docomo Inc Display device, display method, and program
EP2667299A1 (en) 2012-05-25 2013-11-27 Samsung Electronics Co., Ltd Multiple display method with multiple communication terminals, machine-readable storage medium and communication terminal
JP2013246583A (ja) 2012-05-24 2013-12-09 Buffalo Inc Information processing system, server device, information processing method, and program
US8798636B2 (en) * 2011-03-28 2014-08-05 Alcatel Lucent Method and apparatus for carrier selection and scheduling in wireless systems

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5441619B2 (ja) * 2009-10-30 2014-03-12 ソニーモバイルコミュニケーションズ, エービー Short-range wireless communication device, short-range wireless communication system, method for controlling a short-range wireless communication device, control program for a short-range wireless communication device, and mobile phone terminal
JP5858155B2 (ja) 2011-06-23 2016-02-10 Huawei Device Co., Ltd. Method for automatically switching a user interface of a portable terminal device, and portable terminal device
US9736701B2 (en) 2011-10-28 2017-08-15 Qualcomm Incorporated Dead reckoning using proximity sensors
CN102902938B (zh) * 2012-09-03 2016-12-21 北京三星通信技术研究有限公司 Method for determining article position, and method and apparatus for generating electronic map

Patent Citations (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6681110B1 (en) * 1999-07-02 2004-01-20 Musco Corporation Means and apparatus for control of remote electrical devices
JP2003271118A (ja) 2002-03-15 2003-09-25 Toshiba Corp Multi-screen setting method
US20090197612A1 (en) * 2004-10-29 2009-08-06 Arto Kiiskinen Mobile telephone location application
JP2009296171A (ja) 2008-06-03 2009-12-17 Panasonic Corp Mobile communication terminal
JP2011048610A (ja) 2009-08-27 2011-03-10 Jvc Kenwood Holdings Inc Image display system and image display method
CN101674364A (zh) 2009-09-28 2010-03-17 深圳华为通信技术有限公司 Wireless screen splicing display method, mobile communication terminal, and apparatus
US20110209103A1 (en) 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen hold and drag gesture
US20110209089A1 (en) 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen object-hold and page-change gesture
US20110209058A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen hold and tap gesture
US20110209101A1 (en) 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen pinch-to-pocket gesture
US20110209039A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen bookmark hold gesture
US20110209102A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen dual tap gesture
US20110209104A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen synchronous slide gesture
US20110209057A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen hold and page-flip gesture
US20110209100A1 (en) 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen pinch and expand gestures
US20110252317A1 (en) 2010-04-08 2011-10-13 Nokia Corporation Method, apparatus and computer program product for joining the displays of multiple devices
JP2011237532A (ja) 2010-05-07 2011-11-24 Nec Casio Mobile Communications Ltd Terminal device, terminal communication system, and program
US20120081302A1 (en) 2010-10-01 2012-04-05 Imerj LLC Multi-screen display control
US20120081303A1 (en) 2010-10-01 2012-04-05 Ron Cassar Handling gestures for changing focus
US20120081307A1 (en) 2010-10-01 2012-04-05 Imerj LLC Flick move gesture in user interface
US20120117495A1 (en) 2010-10-01 2012-05-10 Imerj, Llc Dragging an application to a screen using the application manager
US20120124490A1 (en) 2010-10-01 2012-05-17 Imerj, Llc Full-screen annunciator
US8798636B2 (en) * 2011-03-28 2014-08-05 Alcatel Lucent Method and apparatus for carrier selection and scheduling in wireless systems
JP2012208720A (ja) 2011-03-29 2012-10-25 Fujitsu Ltd Server, terminal device, and grouping method
US20120280898A1 (en) 2011-05-03 2012-11-08 Nokia Corporation Method, apparatus and computer program product for controlling information detail in a multi-device environment
US20130005354A1 (en) * 2011-06-30 2013-01-03 Suman Sheilendra Recognition System
US20130080939A1 (en) 2011-08-24 2013-03-28 Paul E. Reeves Displaying a unified desktop across devices
US20130080935A1 (en) 2011-08-24 2013-03-28 Paul E. Reeves Application manager in a unified desktop
US20130080934A1 (en) 2011-08-24 2013-03-28 Paul E. Reeves Activating applications in unified desktop
US20130080933A1 (en) 2011-08-24 2013-03-28 Paul E. Reeves Opening applications in unified desktop
US20130080936A1 (en) 2011-09-27 2013-03-28 Paul E. Reeves Displaying a unified desktop across connected devices
US20130076715A1 (en) 2011-09-27 2013-03-28 Mohammed Selim Displaying of charging status on dual screen device
US20130076781A1 (en) 2011-09-27 2013-03-28 Z124 Smartpad - multiapp
JP2013195957A (ja) 2012-03-22 2013-09-30 Ntt Docomo Inc Display device, display method, and program
JP2013246583A (ja) 2012-05-24 2013-12-09 Buffalo Inc Information processing system, server device, information processing method, and program
EP2667299A1 (en) 2012-05-25 2013-11-27 Samsung Electronics Co., Ltd Multiple display method with multiple communication terminals, machine-readable storage medium and communication terminal

Also Published As

Publication number Publication date
KR20150079434A (ko) 2015-07-08
KR101623396B1 (ko) 2016-05-23
CN104748737A (zh) 2015-07-01
EP2908238A1 (en) 2015-08-19
US20150188766A1 (en) 2015-07-02
EP2908238B1 (en) 2018-05-30
JP6065234B2 (ja) 2017-01-25
JP2015130669A (ja) 2015-07-16
CN104748737B (zh) 2017-09-29

Similar Documents

Publication Publication Date Title
US9270526B2 (en) Multi-terminal positioning method, and related device and system
US9615161B2 (en) Wearable electronic device
US9514349B2 (en) Method of guiding a user of a portable electronic device
US11172838B2 (en) Sensing body information apparatus for volume and blood flow via light reflectance
EP2945136B1 (en) Mobile terminal and method for controlling the mobile terminal
JP6134963B2 (ja) スクリーンキャプチャ方法、装置、および端末デバイス
CN106445242B (zh) 压力触控装置和电子设备
US9846529B2 (en) Method for processing information and electronic device
WO2019218843A1 (zh) 按键配置方法、装置、移动终端及存储介质
US20170344234A1 (en) Unlocking control methods and apparatuses, and electronic devices
EP2897027A2 (en) Input method, apparatus and terminal
US9817447B2 (en) Method, device, and system for recognizing gesture based on multi-terminal collaboration
EP3101528A1 (en) Method for controlling a display of an electronic device and the electronic device thereof
WO2017035977A1 (zh) 终端设备配对连接确认的方法及系统
US20170004212A1 (en) Method and apparatus for acquiring search results
CN114501119A (zh) 互动显示方法、装置、电子设备、系统及存储介质
EP3282680B1 (en) Blowing action-based method for operating mobile terminal and mobile terminal
CN110351418B (zh) 屏幕控制方法、装置、移动终端及计算机可读介质
CN108650413A (zh) 投影方法、装置、移动终端以及存储介质
WO2016019629A1 (zh) 一种智能终端单手操作方法、装置及计算机存储介质
JP2013040724A (ja) 空気調和機
CN109085944B (zh) 数据处理方法、装置以及移动终端
CN109196860B (zh) 一种多视角图像的控制方法及相关装置
KR20160150547A (ko) 보조 장치를 가지는 전자 장치 및 그의 문자 수신 방법
US11941908B2 (en) Optical fingerprint module and signal processing method

Legal Events

Date Code Title Description
AS Assignment

Owner name: HUAWEI TECHNOLOGIES CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SONG, XINGGUANG;LIAN, SHIGUO;HU, WEI;REEL/FRAME:035179/0035

Effective date: 20141211

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8