CN113138560A - Terminal control method, device, equipment and readable storage medium - Google Patents

Terminal control method, device, equipment and readable storage medium

Info

Publication number
CN113138560A
Authority
CN
China
Prior art keywords
terminal
target
distance
condition
user
Prior art date
Legal status
Pending
Application number
CN202110388244.1A
Other languages
Chinese (zh)
Inventor
李雪亮
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202110388244.1A
Publication of CN113138560A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00: Systems controlled by a computer
    • G05B15/02: Systems controlled by a computer electric
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/26: Pc applications
    • G05B2219/2642: Domotique, domestic, home control, automation, smart house
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application discloses a terminal control method, device, equipment, and readable storage medium, and belongs to the technical field of terminals. The terminal control method includes the following steps: acquiring a first distance between a first position of a target wearable device and a first terminal, and a second distance between a second position of the target wearable device and the first terminal; determining the first terminal as a target terminal when the difference between the first distance and the second distance meets a preset condition; and controlling the target terminal to execute a target operation.

Description

Terminal control method, device, equipment and readable storage medium
Technical Field
The application belongs to the technical field of terminals, and particularly relates to a terminal control method, device, equipment and a readable storage medium.
Background
With the rapid development of smart home devices, more and more of them are entering people's daily lives. As the number of smart home devices grows, a user can connect to them through a smart terminal so that a plurality of smart home devices can be controlled conveniently.
However, as more and more smart home devices are used, a user who wants to control one of them through a single device must first accurately search for or select the target device from among the smart home devices before that target device can be controlled.
Summary of the application
Embodiments of the application aim to provide a terminal control method, a terminal control device, terminal control equipment, and a readable storage medium, which can solve the problem that the process of determining the target device is cumbersome and degrades the user experience.
In a first aspect, an embodiment of the present application provides a terminal control method, where the method includes:
acquiring a first distance between a first position of a target wearable device and a first terminal and a second distance between a second position of the target wearable device and the first terminal;
determining the first terminal as a target terminal under the condition that the difference value between the first distance and the second distance meets a preset condition;
and controlling the target terminal to execute the target operation.
In a second aspect, an embodiment of the present application provides a terminal control apparatus, including:
the acquisition module is used for acquiring a first distance between a first position of the target wearable device and the first terminal and a second distance between a second position of the target wearable device and the first terminal;
the determining module is used for determining the first terminal as a target terminal under the condition that the difference value of the first distance and the second distance meets a preset condition;
and the control module is used for controlling the target terminal to execute the target operation.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the application, to enable the user to control the target terminal quickly, the first distance between the first position of the target wearable device and the first terminal and the second distance between the second position of the target wearable device and the first terminal are obtained, and the first terminal can then be determined as the target terminal when the difference between the first distance and the second distance meets the preset condition.
Drawings
Fig. 1 is a schematic flowchart of a terminal control method provided in an embodiment of the present application;
fig. 2 is a schematic diagram of one example of the terminal control method provided in an embodiment of the present application;
fig. 3 is a schematic diagram of another example of the terminal control method provided in an embodiment of the present application;
fig. 4 is a schematic diagram of a further example of the terminal control method provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal control device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic hardware structure diagram of another electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application are capable of operation in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
To solve the problems described in the background, an embodiment of the present application provides a terminal control method. A first distance between a first position of a target wearable device and a first terminal and a second distance between a second position of the target wearable device and the first terminal are obtained, and the first terminal is determined as the target terminal when the difference between the first distance and the second distance meets a preset condition. In this way, the target terminal is determined automatically and accurately without the user performing additional operations such as searching and selecting, the target terminal is quickly controlled to perform a target operation, and the user experience can be effectively improved.
The terminal control method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Referring to fig. 1, fig. 1 is a schematic flowchart of a terminal control method provided in an embodiment of the present application, including:
step 110, a first distance between a first position of the target wearable device and the first terminal and a second distance between a second position of the target wearable device and the first terminal are obtained.
The first terminal may include a smart home device, and may also include a mobile electronic device, a non-mobile electronic device, and the like. For convenience of description, the various devices above are collectively referred to as the first terminal. The method in the embodiments of the application is applicable to scenarios in which one or more first terminals may be located at any position in front of the user.
The target wearable device may include a smart headset, smart glasses, and other common wearable devices. For ease of understanding, the terminal control method provided in the embodiments of the present application is explained below by taking a smart headset as an example of the target wearable device.
The first position and the second position on the target wearable device are each provided with a positioning element, and distance information between the target wearable device and the first terminal is detected through the positioning elements. A positioning element may acquire the distance to the first terminal based on an antenna array, or it may be a Bluetooth, wireless, or radio frequency identification element, which is not specifically limited herein.
As a specific example, the first position and the second position of the target wearable device may each be provided with an antenna array. For example, the smart headset may include a left earphone and a right earphone, with the first position of the smart headset at the left earphone and the second position at the right earphone. The propagation time of the radio wave received by the antenna array in the left earphone can be used to obtain the first distance between the first position of the smart headset and the first terminal, and the second distance between the second position of the smart headset and the first terminal can be obtained in the same way.
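As a rough sketch of this idea, the example below converts a measured one-way propagation time into a distance. It assumes such a one-way time is available per earphone (for instance from a UWB or antenna-array ranging exchange); the function names and values are illustrative only, not the application's implementation.

```python
# Minimal sketch, not the application's implementation: convert a measured one-way
# radio propagation time into a distance. Assumes such a time is available per
# earphone (e.g. from a UWB/antenna-array ranging exchange); names are illustrative.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_time_of_flight(one_way_time_s: float) -> float:
    """Distance in metres corresponding to a one-way propagation time in seconds."""
    return one_way_time_s * SPEED_OF_LIGHT_M_PER_S

# Hypothetical measurements at the left (first position) and right (second position) earphones:
first_distance = distance_from_time_of_flight(1.00e-8)   # ~3.0 m
second_distance = distance_from_time_of_flight(1.07e-8)  # ~3.2 m
print(round(first_distance, 2), round(second_distance, 2))
```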
Optionally, the smart headset may be a wired headset, a wireless headset, or a head-mounted headset. The wireless headset may be, for example, a True Wireless Stereo (TWS) Bluetooth headset, a Kleer headset, or a WiFi headset.
The target wearable device may be connected to a plurality of first terminals through a mobile phone. For example, the target wearable device is connected to the mobile phone, the mobile phone is connected to a gateway, and the plurality of first terminals are each connected to the gateway, so that the target wearable device can be connected to different first terminals.
In some embodiments, a calculation element for calculating the difference may be provided in the target wearable device, for calculating the difference between the first distance and the second distance for each first terminal. Optionally, the first terminal connected to the target wearable device may instead be provided with the calculation element for calculating the difference between the first distance and the second distance for each first terminal.
And step 120, determining the first terminal as a target terminal under the condition that the difference value between the first distance and the second distance meets a preset condition.
In some embodiments, when there are a plurality of first terminals, optionally, the preset condition may include determining that the first terminal corresponding to the difference value with the smallest absolute value is the target terminal.
There may be one or more first terminals in a first orientation of the target wearable device; in particular, the first orientation may be the direction the user's face is facing. The positioning element in the target wearable device may be provided with an antenna array, and the first orientation of the target wearable device may be determined from the time differences with which the different array elements in the antenna array receive the wireless signal. Alternatively, the positioning element may determine the first orientation of the target wearable device through Bluetooth, wireless, or radio frequency identification, which is not limited here. Optionally, the direction the user's face is facing may be set as the default orientation, so that the target terminal can be determined directly from the terminal the user is facing.
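As a minimal sketch of that direction estimate, the two-element far-field approximation below converts the inter-element reception-time difference into an angle of arrival; the element spacing and timing values are illustrative assumptions, not figures from the application.

```python
# Minimal two-element, far-field sketch of estimating the arrival direction of a
# wireless signal from the reception-time difference between two array elements.
# Spacing and timing values are illustrative assumptions.
import math

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def angle_of_arrival(delta_t_s: float, element_spacing_m: float) -> float:
    """Signal direction in radians, measured from the array broadside."""
    ratio = SPEED_OF_LIGHT_M_PER_S * delta_t_s / element_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.asin(ratio)

# e.g. a 20 ps time difference across a 1 cm element spacing -> ~36.9 degrees
print(round(math.degrees(angle_of_arrival(2.0e-11, 0.01)), 1))
```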
As a specific example, the first distance may correspond to the left earphone and the second distance to the right earphone, and the preset condition may specifically be that the corresponding first terminal is determined as the target terminal when the difference between the first distance and the second distance is negative. In this case, if the first orientation of the target wearable device is the direction the user's face is facing, the target terminal can be determined from the left side of that direction according to the preset condition that the difference is negative.
In addition, the preset condition may be that, among the negative differences between the first distance and the second distance, the first terminal corresponding to the difference with the smallest absolute value is the target terminal. In this case, if the first orientation of the target wearable device is the direction the user's face is facing, the target terminal located just to the left of straight ahead can be obtained according to the preset condition that the difference is negative and its absolute value is the smallest.
The preset condition may also specifically be that the corresponding first terminal is determined as the target terminal when the difference between the first distance and the second distance is positive. In this case, if the first orientation of the target wearable device is the direction the user's face is facing, the target terminal can be determined from the right side of that direction according to the preset condition that the difference is positive.
Optionally, the specific condition corresponding to the preset condition may be a default condition preset in the electronic device, for example, "the first terminal corresponding to the difference with the smallest absolute value is the target terminal", or it may be set by the user autonomously, which is not specifically limited herein.
Taking the preset condition of selecting the difference with the smallest absolute value as an example, a user may have several smart lamps, wear a smart headset, and turn to face the smart lamps, hoping to control one of them. Referring to fig. 2, the user faces smart lamp A, smart lamp B, and smart lamp C, and can acquire, through the smart headset, the first distance A1 between the first position (left earphone) and smart lamp A, the first distance B1 between the first position and smart lamp B, the first distance C1 between the first position and smart lamp C, the second distance A2 between the second position (right earphone) and smart lamp A, the second distance B2 between the second position and smart lamp B, and the second distance C2 between the second position and smart lamp C. For smart lamp A, the absolute value |A1-A2| of the difference between the first distance A1 and the second distance A2 can be obtained; for smart lamp B, the absolute value |B1-B2| of the difference between the first distance B1 and the second distance B2 can be obtained; and for smart lamp C, the absolute value |C1-C2| of the difference between the first distance C1 and the second distance C2 can be obtained. The smart lamp corresponding to the difference with the smallest absolute value is then determined as the target terminal according to the magnitudes of |A1-A2|, |B1-B2|, and |C1-C2|, which reduces extra user operations, determines the target terminal quickly, and improves the user's interactive experience with the terminal.
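A minimal sketch of this selection step follows, assuming the per-terminal distance pairs have already been measured; the terminal names and distances are illustrative only.

```python
# Minimal sketch of step 120: pick the first terminal whose left/right distance
# difference has the smallest absolute value. Terminal names and distances are
# illustrative only.
from typing import Dict, Tuple

def select_target_terminal(measurements: Dict[str, Tuple[float, float]]) -> str:
    """measurements maps a terminal id to (first_distance, second_distance) in metres."""
    return min(measurements, key=lambda t: abs(measurements[t][0] - measurements[t][1]))

lamps = {"A": (3.0, 3.2), "B": (3.1, 3.0), "C": (3.3, 3.1)}
print(select_target_terminal(lamps))  # -> "B": the lamp most nearly straight ahead
```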
It can be understood that, when there is a smart lamp near the user, the positioning elements in the target wearable device can detect that the difference between the first distance and the second distance to that smart lamp changes as the user moves, and the smart lamp can be taken as the target terminal when the absolute value of the difference becomes the smallest.
It is understood that the preset conditions can be set as needed by those skilled in the art.
And step 130, controlling the target terminal to execute the target operation.
After the target terminal is determined, the target terminal may be controlled to perform the target operation.
Continuing with the smart light example, the target operation may be a preset default operation, such as turning off or turning on the light. If the intelligent lamp is in the on state, after the intelligent lamp is determined as the target terminal, the intelligent lamp can be controlled to be turned off; if the intelligent lamp is in the off state, the intelligent lamp can be controlled to be turned on.
Alternatively, the target operation may be an operation corresponding to a control instruction input by the user. For example, if the control instruction input by the user is "turn off after 30 minutes", the smart lamp automatically turns off after 30 minutes. With the terminal control method provided in the embodiments of the application, if there are multiple first terminals, the user does not need to identify which specific first terminal it is; the user only needs to face one of the first terminals and issue the control instruction, and the target terminal is then automatically determined and controlled to execute the target operation corresponding to the control instruction.
For example, the control instruction may be a voice input, or an input on a control provided on the control interface of the smart terminal connected to the target wearable device, such as, but not limited to, a "light off" control and a time-setting control on the control interface.
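The sketch below illustrates this dispatch between a default toggle and an instruction-based delayed operation. The SmartLamp class, the parsing pattern, and the timer-based scheduling are illustrative assumptions, not the application's actual implementation.

```python
# Minimal sketch of choosing the target operation: a default toggle when no control
# instruction is given, or a delayed switch-off parsed from an instruction such as
# "turn off after 30 minutes". SmartLamp, the parsing pattern, and the timer-based
# scheduling are illustrative assumptions.
import re
import threading
from typing import Optional

class SmartLamp:
    def __init__(self) -> None:
        self.is_on = True

    def turn_on(self) -> None:
        self.is_on = True

    def turn_off(self) -> None:
        self.is_on = False

def perform_target_operation(lamp: SmartLamp, instruction: Optional[str] = None) -> None:
    if instruction is None:
        # Default operation: toggle the current state.
        if lamp.is_on:
            lamp.turn_off()
        else:
            lamp.turn_on()
        return
    match = re.search(r"turn off after (\d+) minutes", instruction)
    if match:
        delay_s = int(match.group(1)) * 60
        threading.Timer(delay_s, lamp.turn_off).start()  # switch off later

perform_target_operation(SmartLamp(), "turn off after 30 minutes")
```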
In another specific embodiment, the at least one first terminal near the user includes a mobile phone. When the user wears the smart headset and turns to face the mobile phone, the mobile phone can be determined as the target terminal, and the mobile phone can then be controlled to light up its screen. For example, when a user wearing the smart headset is reading a book and glances over at the mobile phone to check the current time, the mobile phone is determined as the target terminal as soon as the user looks at it and its screen lights up automatically, which makes it convenient for the user to check the phone and effectively improves the user experience.
In the embodiments of the application, by obtaining the first distance between the first position of the target wearable device and the first terminal and the second distance between the second position of the target wearable device and the first terminal, the first terminal can be determined as the target terminal when the difference between the first distance and the second distance meets the preset condition, so that the target terminal is determined automatically without the user performing extra operations such as searching and selecting. The target terminal can then be controlled to execute the target operation, which removes the additional step of the user manually triggering the target terminal (for example, to light up the screen), makes the control of the terminal more convenient for the user, and effectively improves the user experience.
In addition, in the embodiments of the application, a fourth distance between the target wearable device and the target terminal may also be obtained, and the target terminal is controlled to execute a preset operation when the fourth distance is smaller than a preset threshold. The fourth distance may be taken as the distance between the user and the target terminal, and the preset operation may include generating prompt information. Specifically, taking the target terminal as a mobile phone as an example, when the user looks at the mobile phone and it is determined that the fourth distance between the target wearable device and the mobile phone is smaller than the preset threshold, prompt information may be generated to remind the user to pay attention to eye safety. The prompt may be a pop-up window on the screen or a voice prompt, which is not specifically limited herein. In this way, when it is determined that the target terminal is too close to the user, the user can be reminded to keep a proper viewing distance, improving the user experience.
In some embodiments, when there are a plurality of differences with the smallest absolute value, there are a plurality of target terminals, and the target terminals are controlled to perform the target operation.
In a case that there are a plurality of target terminals, the method may also control the plurality of target terminals to perform the target operation according to priorities of the plurality of target terminals, and specifically may include the following steps:
step 131, obtaining a third distance between the target wearable device and the target terminal.
As a specific example, there may be several smart lamps around the user. When the user turns around to face the smart lamps, several of them may be directly in front of the user, and since there are then several differences with the smallest absolute value, several smart lamps can serve as target terminals. To further determine which target terminal performs the target operation, a third distance between the target wearable device and each target terminal may be used. Optionally, the third distance may be the first distance between the first position of the target wearable device and the target terminal, or the second distance between the second position of the target wearable device and the target terminal, which is not specifically limited herein. The third distance may be taken as the distance between the user and the target terminal.
Optionally, the horizontal angle between the line connecting the first and second positions of the target wearable device and the target terminal may be obtained through Bluetooth direction-finding or other wireless positioning technologies, and the vertical angle relative to the target terminal may be obtained through the gyroscopes in the left and right earphones. These angles can then be combined with the first distance and the second distance between the target wearable device and the target terminal to calculate the straight-line distance between the user's face and the target terminal, which is used as the third distance between the user and the target terminal; this is not specifically limited herein.
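As one simplified alternative to the angle-based calculation just described, if the separation between the two earphones is known, the distance from the midpoint between them to the terminal can be estimated directly from the two measured distances using the median-length (Apollonius) formula. The 0.18 m separation below is an illustrative assumption.

```python
# Simplified alternative sketch (not the angle-based method above): if the separation
# between the two earphones is known, the distance from the midpoint between them to
# the terminal follows from the median-length (Apollonius) formula. The 0.18 m
# separation is an illustrative assumption.
import math

def head_to_terminal_distance(first_distance: float,
                              second_distance: float,
                              ear_separation: float = 0.18) -> float:
    """Distance in metres from the midpoint of the two earphones to the terminal."""
    return 0.5 * math.sqrt(2.0 * first_distance ** 2
                           + 2.0 * second_distance ** 2
                           - ear_separation ** 2)

print(round(head_to_terminal_distance(3.0, 3.2), 2))  # ~3.1 m
```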
Fig. 3 is a schematic diagram of a terminal control method provided in an embodiment of the present application in another embodiment, and referring to fig. 3, when a user faces a smart lamp, there may be a smart lamp a and a smart lamp B in front of and opposite to the user.
The first distance A1 between the first position and smart lamp A, the first distance B1 between the first position and smart lamp B, the second distance A2 between the second position and smart lamp A, and the second distance B2 between the second position and smart lamp B are acquired through the smart headset. In this case, smart lamp A and smart lamp B lie on the same straight line in front of the user, so the differences with the smallest absolute value are equal for the two lamps.
Step 132, determining priorities of the plurality of target terminals according to the sizes of the plurality of third distances.
In some embodiments, the priorities of the plurality of target terminals may be determined in ascending order of the third distances; for example, the target terminals may be assigned priorities from highest to lowest in order of increasing third distance. Optionally, the target terminal with the highest priority may be controlled to perform the target operation first.
And step 133, controlling the plurality of target terminals to execute the target operation according to the priority.
Optionally, after the priorities of the plurality of target terminals are determined, the target terminals may be controlled in turn to execute the target operation in order of priority from high to low. In this way, even when there are many target terminals, a target terminal can be controlled to execute the target operation without the user having to describe it in detail.
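A minimal sketch of steps 131 and 132 follows, assuming the third distances have already been estimated; terminal names and distances are illustrative.

```python
# Minimal sketch of steps 131-132: order the candidate target terminals by their
# third distance (smaller distance -> higher priority). Names and distances are
# illustrative.
from typing import Dict, List

def prioritize_targets(third_distances: Dict[str, float]) -> List[str]:
    """Terminal ids sorted from highest priority (closest) to lowest."""
    return sorted(third_distances, key=third_distances.get)

candidates = {"lamp_A": 3.4, "lamp_B": 2.1}
print(prioritize_targets(candidates))  # ['lamp_B', 'lamp_A']
```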
To make the terminal control process more intelligent, in some embodiments step 133 may further include the following steps:
firstly, terminal state information of a plurality of target terminals is sequentially acquired according to the priority.
Then, the target terminal is controlled to perform the target operation when its terminal state information satisfies a preset control condition.
As a specific example, with continued reference to fig. 3, the target terminals may include smart lamp A and smart lamp B, where, according to the third distance corresponding to each target terminal, smart lamp B has the highest priority and smart lamp A the second priority. Optionally, the target operation may be a preset default operation or an operation corresponding to a control instruction input by the user; for example, the target operation is turning off a smart lamp, and correspondingly the preset control condition may be that the smart lamp is in the on state. The terminal state information of smart lamp B is then acquired first, according to the priority.
For example, if the terminal state information of smart lamp B indicates that it is on, smart lamp B satisfies the preset control condition, and smart lamp B can be controlled to perform the target operation; at this point, acquisition of the terminal state information of smart lamp A can be stopped. If the terminal state information of smart lamp B indicates that it is off, smart lamp B does not satisfy the preset control condition; the terminal state information of smart lamp A, which has the second priority, is acquired next, and if it indicates that smart lamp A is on, smart lamp A satisfies the preset control condition and can be controlled to turn off.
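A minimal sketch of this priority-ordered state check follows; the get_state and turn_off callbacks stand in for whatever query and control interface the terminals expose and are assumptions for illustration.

```python
# Minimal sketch of the priority-ordered state check: control the first terminal in
# the prioritized list whose reported state satisfies the control condition (here,
# the lamp must be on before it can be turned off). The get_state and turn_off
# callbacks are illustrative stand-ins for the terminals' query/control interface.
from typing import Callable, List, Optional

def control_first_matching(prioritized_ids: List[str],
                           get_state: Callable[[str], str],
                           turn_off: Callable[[str], None]) -> Optional[str]:
    for terminal_id in prioritized_ids:
        if get_state(terminal_id) == "on":   # preset control condition
            turn_off(terminal_id)            # target operation
            return terminal_id               # stop: lower-priority states are not queried
    return None

states = {"lamp_B": "off", "lamp_A": "on"}
chosen = control_first_matching(["lamp_B", "lamp_A"],
                                get_state=lambda t: states[t],
                                turn_off=lambda t: print("turning off", t))
print(chosen)  # lamp_A
```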
Continuing with the example of the target operation being turning off a smart lamp, in one possible implementation, if there is a target terminal (not shown in fig. 3) behind the smart lamp a corresponding to the smallest difference in absolute value, the state information of the target terminal is not acquired.
According to the embodiment of the application, under the condition that a plurality of intelligent lamps are possibly arranged opposite to a user, if the user wants to turn off one intelligent lamp, a lamp turning-off instruction can be spoken through voice, and then the corresponding intelligent lamp can be turned off according to the priority of a plurality of target terminals.
In some embodiments, before step 110 is performed, the method may further include the following steps: receiving a first input, where the first input includes a first orientation condition; and determining a target difference condition corresponding to the first orientation condition as the preset condition.
The first input may be a voice input of the user; for example, after the user speaks a light-off command, step 120 and step 130 are executed. In addition, the target wearable device may be connected to a smart terminal of the user, such as a mobile phone, a tablet computer, or a smart watch (not exhaustively listed here), and the smart terminal may include a control interface on which related controls are displayed. The first input can then be obtained from the user's touch operation on such a control.
In some embodiments, step 110 is executed only after the first input is received; compared with acquiring the first distances and second distances between the target wearable device and the plurality of first terminals in real time, this reduces resource consumption and improves the battery endurance of the target wearable device.
In addition, the first input may further include a first orientation condition, for example, when the user faces a plurality of first terminals, the received first input is "turn off the lamp on the left side of me", from which the first orientation condition is "left side", and thus, according to the first orientation condition "left side" of the first input, a target difference condition corresponding to "left side" may be obtained as a preset condition, so as to accurately determine the target terminal from the plurality of first terminals, and make the control process more intelligent.
It can be understood that different first orientation conditions may correspond to different target difference conditions, and the corresponding target difference condition can be used as the preset condition, so that the method can quickly locate the target terminal when it is in different positions.
Wherein the difference may be a positive number, a negative number or zero. For example, the target difference condition corresponding to the "left side" of the first orientation condition may be that a corresponding terminal is a target terminal when the difference between the first distance and the second distance is a negative number.
Specifically, with continued reference to fig. 2, when the user faces the smart lamps, it is determined that the smart lamps in front of the user are smart lamp A, smart lamp B, and smart lamp C. The first distance between the first position (the left earphone) and smart lamp A is 3 meters, and the second distance between the second position (the right earphone) and smart lamp A is 3.2 meters; the first distance between the first position and smart lamp B is 3.1 meters, and the second distance between the second position and smart lamp B is 3 meters; the first distance between the first position and smart lamp C is 3.3 meters, and the second distance between the second position and smart lamp C is 3.1 meters. Continuing with fig. 2, for smart lamp A the difference between the first distance and the second distance is -0.2, for smart lamp B the difference is 0.1, and for smart lamp C the difference is 0.2, so smart lamp A, corresponding to -0.2, is taken as the target terminal based on the target difference condition corresponding to the first orientation condition "left side".
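A minimal sketch of applying the orientation-dependent difference condition to the example above; the mapping from orientation words to sign conditions is an illustrative assumption.

```python
# Minimal sketch of applying an orientation-dependent target difference condition to
# the example above; the mapping from orientation words to sign conditions is an
# illustrative assumption.
from typing import Dict, Optional, Tuple

def select_by_orientation(measurements: Dict[str, Tuple[float, float]],
                          orientation: str) -> Optional[str]:
    diffs = {t: d1 - d2 for t, (d1, d2) in measurements.items()}
    if orientation == "left":      # target difference condition: negative difference
        candidates = {t: d for t, d in diffs.items() if d < 0}
    elif orientation == "right":   # target difference condition: positive difference
        candidates = {t: d for t, d in diffs.items() if d > 0}
    else:
        candidates = diffs
    if not candidates:
        return None
    return min(candidates, key=lambda t: abs(candidates[t]))

lamps = {"A": (3.0, 3.2), "B": (3.1, 3.0), "C": (3.3, 3.1)}
print(select_by_orientation(lamps, "left"))  # -> "A" (difference -0.2)
```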
The orientation information of the target wearable device can be determined by a positioning element in the target wearable device. The orientation information may specifically include a first orientation, which may be the direction the user's face is facing, or a second orientation, which may be the direction the user's back is facing.
In the embodiments of the application, the orientation information of the target wearable device can be acquired, and the target terminal is then determined in the direction indicated by the orientation information.
Taking the orientation information as the first orientation, that is, the direction the user's face is facing, as an example: after the orientation of the user's face is determined, a terminal that the face is oriented toward can be taken as the first terminal, and the first terminal is then determined as the target terminal when the difference between the first distance and the second distance satisfies the preset condition. In this way, the target terminal is determined automatically and accurately without the user performing additional operations such as searching and selecting, and the target terminal is quickly controlled to perform the target operation.
Alternatively, the direction the user's face is facing may be set as the default direction in which first terminals are acquired. In this way, the user can determine the target terminal to be controlled simply by facing it.
As a specific example, the orientation of the target wearable device may be determined based on an antenna array in the positioning element, and optionally, the positioning element may also be an element such as bluetooth, wireless or radio frequency identification, and is not particularly limited herein.
In the embodiments of the application, with the positioning assistance of the target wearable device, the user can determine the target terminal simply by briefly describing the position of the terminal to be controlled, without performing extra operations such as searching and selecting, which effectively improves the interactive experience between the user and the target terminal.
In the embodiments of the application, the first orientation condition included in the first input may further include orientation information, so that the target terminal is determined from different orientation information and the method becomes more convenient to use. For example, taking the first input "turn off the light at my left rear", the first orientation condition can be derived to include "left" and "rear", where "rear" serves as the orientation information, i.e., the second orientation (the direction the user's back is facing). The target difference condition corresponding to "left" may be that the difference between the first distance and the second distance is negative. As shown in fig. 4, there are two smart lamps A and B behind the user; for example, the first distance between the first position and smart lamp A is 2.5 meters and the second distance between the second position and smart lamp A is 2.7 meters, while the first distance between the first position and smart lamp B is 2.6 meters and the second distance between the second position and smart lamp B is 2.5 meters. It follows that the difference between the first distance and the second distance is -0.2 for smart lamp A and 0.1 for smart lamp B, so smart lamp A can be taken as the target terminal and controlled to turn off. With the terminal control method provided in the embodiments of the application, the control process of the target terminal is more intelligent, and the user experience can be effectively improved.
It should be noted that, in the terminal control method provided in the embodiment of the present application, the execution main body may be a terminal control device, or a control module in the terminal control device for executing the terminal control method. In the embodiment of the present application, a terminal control apparatus executing a terminal control method is taken as an example, and the terminal control apparatus provided in the embodiment of the present application is described.
Fig. 5 is a schematic structural diagram of a terminal control device according to an embodiment of the present application.
As shown in fig. 5, the terminal control device 500 may include:
an obtaining module 510, configured to obtain a first distance between a first location of a target wearable device and a first terminal, and a second distance between a second location of the target wearable device and the first terminal;
a determining module 520, configured to determine the first terminal as a target terminal when a difference between the first distance and the second distance satisfies a preset condition;
a control module 530, configured to control the target terminal to execute the target operation.
In the embodiments of the application, by obtaining the first distance between the first position of the target wearable device and the first terminal and the second distance between the second position of the target wearable device and the first terminal, the first terminal can be determined as the target terminal when the difference between the first distance and the second distance meets the preset condition, so that the target terminal is determined automatically without the user performing extra operations such as searching and selecting. The target terminal can then be controlled to execute the target operation, which removes the additional step of the user manually triggering the target terminal (for example, to light up the screen), makes the control of the terminal more convenient for the user, and effectively improves the user experience.
In a possible embodiment, in a case that the number of the first terminals is multiple, the determining module 520 is further configured to determine, as the target terminal, the first terminal corresponding to the difference value with the smallest absolute value among the multiple first terminals.
Therefore, extra operation of the user can be reduced, the target terminal can be determined quickly, and the interactive use experience of the user and the terminal is improved.
In a possible embodiment, in a case where there are a plurality of difference values with the smallest absolute value, there are a plurality of target terminals;
the obtaining module 510 is further configured to obtain a third distance between the target wearable device and the target terminal;
a determining module 520, configured to determine priorities of the multiple target terminals according to the magnitudes of the multiple third distances;
the control module 530 is further configured to control the plurality of target terminals to perform the target operation according to the priority.
In this way, even when there are many target terminals, a target terminal can be controlled to execute the target operation without the user having to describe it in detail.
In a possible embodiment, the obtaining module 510 is further configured to sequentially obtain the terminal status information of the multiple target terminals according to the priority;
the control module 530 is further configured to control the target terminal to perform the target operation when the terminal state information satisfies the preset control condition.
Therefore, the intelligence and the convenience of the household life of the user can be effectively improved.
In a possible embodiment, the apparatus further comprises:
a receiving module, configured to receive a first input, where the first input includes a first orientation condition;
the determining module 520 is further configured to determine that the target difference condition corresponding to the first azimuth condition is a preset condition.
Therefore, the intelligence and the convenience of the household life of the user can be effectively improved.
In a possible embodiment, the determining module 520 is further configured to acquire orientation information of the target wearable device, and to determine the target terminal in the direction indicated by the orientation information.
Therefore, the user can automatically and accurately determine the target terminal without additional operations such as searching, selecting and the like, and the target terminal can be quickly positioned and controlled to execute the target operation.
The terminal control device 500 in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The terminal control device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, and the embodiments of the present application are not specifically limited.
The terminal control device provided in the embodiment of the present application can implement each process implemented by the terminal control device in the method embodiments of fig. 1 to fig. 4, and is not described here again to avoid repetition.
Optionally, as shown in fig. 6, an electronic device 600 is further provided in this embodiment of the present application, and includes a processor 601, a memory 602, and a program or an instruction stored in the memory 602 and capable of being executed on the processor 601, where the program or the instruction is executed by the processor 601 to implement each process of the terminal control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 7 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, and a processor 710.
Those skilled in the art will appreciate that the electronic device 700 may also include a power supply (e.g., a battery) for powering the various components, and the power supply may be logically coupled to the processor 710 via a power management system, such that the functions of managing charging, discharging, and power consumption may be performed via the power management system. The electronic device structure shown in fig. 7 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The processor 710 is configured to obtain a first distance between a first position of the target wearable device and the first terminal, and a second distance between a second position of the target wearable device and the first terminal;
determining the first terminal as a target terminal under the condition that the difference value between the first distance and the second distance meets a preset condition;
and controlling the target terminal to execute the target operation.
With the electronic device 700 provided in the embodiments of the application, by acquiring the first distance between the first position of the target wearable device and the first terminal and the second distance between the second position of the target wearable device and the first terminal, the first terminal can be determined as the target terminal when the difference between the first distance and the second distance meets the preset condition, so that the target terminal is determined automatically without the user performing extra operations such as searching and selecting. The target terminal can then be controlled to execute the target operation, which removes the additional step of the user manually triggering the target terminal (for example, to light up the screen), makes the control of the terminal more convenient for the user, and effectively improves the user experience.
Optionally, the processor 710 is further configured to, when the number of the first terminals is multiple, determine, as the target terminal, the first terminal corresponding to the difference value with the smallest absolute value among the multiple first terminals.
Therefore, extra operation of the user can be reduced, the target terminal can be determined quickly, and the interactive use experience of the user and the terminal is improved.
Optionally, the processor 710 is further configured to obtain a third distance between the target wearable device and each target terminal when there are a plurality of differences with the smallest absolute value and therefore a plurality of target terminals;
determining the priorities of a plurality of target terminals according to the sizes of the plurality of third distances;
and controlling a plurality of target terminals to execute target operation according to the priority.
In this way, even when there are many target terminals, a target terminal can be controlled to execute the target operation without the user having to describe it in detail.
Optionally, the processor 710 is further configured to sequentially obtain terminal state information of a plurality of target terminals according to the priority;
and under the condition that the terminal state information meets the preset control condition, controlling the target terminal to execute the target operation.
Therefore, the intelligence and the convenience of the household life of the user can be effectively improved.
Optionally, the input unit 704 is configured to receive a first input, where the first input includes a first orientation condition, and a target difference condition corresponding to the first orientation condition is determined as the preset condition. In this way, the intelligence and convenience of the user's home life can be effectively improved.
Optionally, the processor 710 is further configured to obtain orientation information of the target wearable device; and determining the target terminal in the orientation information according to the orientation information.
Therefore, the user can automatically and accurately determine the target terminal without additional operations such as searching, selecting and the like, and the target terminal can be quickly positioned and controlled to execute the target operation.
It should be understood that in the embodiment of the present application, the input Unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042, and the Graphics Processing Unit 7041 processes image data of still pictures or videos obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 706 may include a display panel 7061, and the display panel 7061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071 is also referred to as a touch screen. The touch panel 7071 may include two parts of a touch detection device and a touch controller. Other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. Memory 709 may be used to store software programs as well as various data, including but not limited to applications and operating systems. Processor 710 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the terminal control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the terminal control method embodiment, and the same technical effect can be achieved.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A terminal control method, comprising:
acquiring a first distance between a first position of a target wearable device and a first terminal and a second distance between a second position of the target wearable device and the first terminal;
determining the first terminal as a target terminal under the condition that the difference value of the first distance and the second distance meets a preset condition;
and controlling the target terminal to execute target operation.
2. The method according to claim 1, wherein, in a case that there are a plurality of first terminals, the determining the first terminal as a target terminal under the condition that a difference value between the first distance and the second distance satisfies a preset condition comprises:
determining, from among the plurality of first terminals, the first terminal corresponding to the difference value with the smallest absolute value as the target terminal.
3. The method according to claim 2, wherein, in a case where there are a plurality of difference values with the smallest absolute value, there are a plurality of target terminals, and the controlling the target terminal to perform the target operation comprises:
acquiring a third distance between the target wearable device and each of the target terminals;
determining priorities of the plurality of target terminals according to the third distances;
and controlling the plurality of target terminals to perform the target operation according to the priorities.
4. The method according to claim 3, wherein the controlling the plurality of target terminals to perform the target operation according to the priorities comprises:
sequentially acquiring terminal state information of the plurality of target terminals according to the priorities;
and controlling a target terminal to perform the target operation in a case where the terminal state information of the target terminal satisfies a preset control condition.
5. The method according to claim 1, wherein, before the determining the first terminal as a target terminal in a case where the difference value between the first distance and the second distance satisfies a preset condition, the method further comprises:
receiving a first input, wherein the first input comprises a first orientation condition;
and determining a target difference condition corresponding to the first orientation condition as the preset condition.
6. The method of claim 1, further comprising:
acquiring orientation information of the target wearable device;
and determining the target terminal according to the orientation information.
7. A terminal control apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire a first distance between a first position of a target wearable device and a first terminal, and a second distance between a second position of the target wearable device and the first terminal;
a determining module, configured to determine the first terminal as a target terminal in a case where a difference value between the first distance and the second distance satisfies a preset condition;
and a control module, configured to control the target terminal to perform a target operation.
8. The apparatus of claim 7, wherein the determining module is further configured to, in a case where there are a plurality of first terminals, determine, among the plurality of first terminals, the first terminal corresponding to the difference value with the smallest absolute value as the target terminal.
9. An electronic device, comprising a processor, a memory, and a program or instructions stored in the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the terminal control method according to any one of claims 1 to 6.
10. A readable storage medium, characterized in that the readable storage medium stores a program or instructions which, when executed by a processor, implement the steps of the terminal control method according to any one of claims 1 to 6.
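For readers who find the claim language dense, the following Python sketch illustrates, under stated assumptions, the selection and control flow described in method claims 1 to 4. It is not code from the patent: the numeric threshold standing in for the preset condition, the busy flag standing in for the terminal state information and the preset control condition, and all class, function, and variable names are illustrative assumptions.

import math
from dataclasses import dataclass
from typing import List, Optional

# Illustrative sketch only; names and values below are assumptions,
# not identifiers or parameters from the patent.

DIFF_THRESHOLD = 0.10  # assumed preset condition: |d1 - d2| must not exceed this (metres)


@dataclass
class Terminal:
    name: str
    d1: float           # first distance: wearable position 1 -> this terminal
    d2: float           # second distance: wearable position 2 -> this terminal
    d3: float = 0.0     # third distance: wearable -> this terminal, used for priority
    busy: bool = False   # stand-in for the terminal state information

    @property
    def diff(self) -> float:
        # absolute difference between the first and second distances
        return abs(self.d1 - self.d2)


def pick_target_terminals(terminals: List[Terminal]) -> List[Terminal]:
    # Keep terminals whose distance difference meets the assumed preset
    # condition, then keep those sharing the smallest absolute difference
    # (modeled on claims 1-3); math.isclose guards against floating-point jitter.
    candidates = [t for t in terminals if t.diff <= DIFF_THRESHOLD]
    if not candidates:
        return []
    best = min(t.diff for t in candidates)
    return [t for t in candidates if math.isclose(t.diff, best, abs_tol=1e-9)]


def control_by_priority(targets: List[Terminal]) -> Optional[Terminal]:
    # Order the targets by the third distance (closer first) and control the
    # first one whose state satisfies the assumed control condition
    # (modeled on claims 3-4).
    for terminal in sorted(targets, key=lambda t: t.d3):
        if not terminal.busy:
            print(f"controlling {terminal.name}")  # placeholder for the target operation
            return terminal
    return None


if __name__ == "__main__":
    terminals = [
        Terminal("tv", d1=3.20, d2=3.50, d3=3.3),               # filtered out by the threshold
        Terminal("speaker", d1=2.00, d2=2.05, d3=2.0),
        Terminal("lamp", d1=1.50, d2=1.55, d3=1.5, busy=True),  # closest, but state check fails
    ]
    control_by_priority(pick_target_terminals(terminals))

In this toy run, the tv fails the assumed threshold, the speaker and the lamp tie on the smallest absolute difference, and although the lamp has priority by the third distance, it is skipped because its state fails the control condition, so the speaker is controlled.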
CN202110388244.1A 2021-04-12 2021-04-12 Terminal control method, device, equipment and readable storage medium Pending CN113138560A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110388244.1A CN113138560A (en) 2021-04-12 2021-04-12 Terminal control method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110388244.1A CN113138560A (en) 2021-04-12 2021-04-12 Terminal control method, device, equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN113138560A true CN113138560A (en) 2021-07-20

Family

ID=76811163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110388244.1A Pending CN113138560A (en) 2021-04-12 2021-04-12 Terminal control method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN113138560A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103826007A (en) * 2014-02-17 2014-05-28 小米科技有限责任公司 Method and device for remotely controlling terminal, and terminal device
CN107852567A (en) * 2015-01-07 2018-03-27 三星电子株式会社 Wirelessly connect the method and its equipment of equipment
US20180063517A1 (en) * 2016-08-31 2018-03-01 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus for testing virtual reality head display device
KR20180052831A (en) * 2016-11-10 2018-05-21 한국항공우주연구원 Realtime Indoor and Outdoor Positioning Measurement Apparatus and Method of the Same
CN108549410A (en) * 2018-01-05 2018-09-18 灵动科技(北京)有限公司 Active follower method, device, electronic equipment and computer readable storage medium
US20200110413A1 (en) * 2018-10-08 2020-04-09 Samsung Electronics Co., Ltd. Method and apparatus for determining path
CN111128157A (en) * 2019-12-12 2020-05-08 珠海格力电器股份有限公司 Wake-up-free voice recognition control method for intelligent household appliance, computer readable storage medium and air conditioner
CN111610923A (en) * 2020-04-26 2020-09-01 北京小米移动软件有限公司 Directional operation method, directional operation device and storage medium
CN112286429A (en) * 2020-10-29 2021-01-29 维沃移动通信有限公司 Control method and control device of electronic equipment and electronic equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023000519A1 (en) * 2021-07-21 2023-01-26 歌尔股份有限公司 Smart wearable device, and method and system for controlling target device
CN115190202A (en) * 2022-05-23 2022-10-14 荣耀终端有限公司 Unlocking method of terminal equipment and related device
CN116033431A (en) * 2022-08-18 2023-04-28 荣耀终端有限公司 Connection method and device of wearable device
CN116033431B (en) * 2022-08-18 2023-10-31 荣耀终端有限公司 Connection method and device of wearable device

Similar Documents

Publication Publication Date Title
KR102248474B1 (en) Voice command providing method and apparatus
KR101753031B1 (en) Mobile terminal and Method for setting metadata thereof
CN106020670B (en) Screen lighting control method and device and electronic equipment
CN113138560A (en) Terminal control method, device, equipment and readable storage medium
CN106055097B (en) Bright screen control method and device and electronic equipment
KR102599383B1 (en) Electronic device for displaying an executable application on a split screen and method for the same
EP2749037B1 (en) Mobile terminal, image display device mounted on vehicle and data processing method using the same
CN105511667A (en) Back touch control mobile terminal and method based on back pressure transducer
WO2018161675A1 (en) Method and device for determining status of terminal, and terminal
US10764425B2 (en) Method and apparatus for detecting state
KR20140026793A (en) Mobile terminal and controlling method thereof
US11886894B2 (en) Display control method and terminal device for determining a display layout manner of an application
CN113742366B (en) Data processing method, device, computer equipment and storage medium
US11404065B2 (en) Method for displaying visual information associated with voice input and electronic device supporting the same
KR101875744B1 (en) Electonic device and method for controlling of the same
KR20180005521A (en) Mobile terminal and method for controlling the same
CN111966436A (en) Screen display control method and device, terminal equipment and storage medium
CN110570465A (en) real-time positioning and map construction method and device and computer readable storage medium
CN113253826A (en) Control method, control device, terminal and storage medium
CN111190515A (en) Shortcut panel operation method, device and readable storage medium
CN110633336B (en) Method and device for determining laser data search range and storage medium
CN112509510A (en) Brightness adjusting method and device and electronic equipment
CN107729439A (en) Obtain the methods, devices and systems of multi-medium data
WO2023066373A1 (en) Sample image determination method and apparatus, device, and storage medium
CN108920065A (en) Split screen window adjusting method, device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination