CN104536556B - Information processing method and electronic equipment


Info

Publication number
CN104536556B
Authority
CN
China
Prior art keywords
electronic device
electronic equipment
information
interaction
electronic
Prior art date
Legal status
Active
Application number
CN201410468082.2A
Other languages
Chinese (zh)
Other versions
CN104536556A (en)
Inventor
梅思扬
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201410468082.2A
Publication of CN104536556A
Application granted
Publication of CN104536556B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Abstract

The invention discloses an information processing method and electronic equipment. The method is applied to first electronic equipment comprising a display interaction unit, on which at least one interactive object is displayed. The method comprises the following steps: receiving first information sent by second electronic equipment, where the second electronic equipment is electronic equipment worn by a user and the first information is information representing the relative position of the second electronic equipment and the first electronic equipment; and selecting a first interactive object associated with a first operation according to the first information.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to information processing technologies in the field of electronic computers, and in particular, to an information processing method and an electronic device.
Background
With the development of information technology and electronic devices, the display area of handheld electronic devices such as smart phones and e-book readers keeps increasing. These electronic devices typically include a display area, and interactive objects can be arranged at essentially any position in the display area. The interactive objects are used to receive and respond to user input.
In the prior art, controlling the electronic device to perform corresponding operations includes the following two ways:
The first way is as follows:
the user controls the electronic device to execute corresponding operations through gesture operations; specifically, the user touches an interactive object so that the interactive object executes the corresponding operation.
The second way is as follows:
technologies such as pupil positioning or eyeball tracking are used to determine the interactive object that the user is pointing at on the electronic device, assisting gesture operations and the like, so as to control the electronic device to execute the corresponding operation.
In the first way, because the area of the electronic device is large, some interactive objects are displayed in an area of the electronic device that cannot be reached by one hand when the user operates with one hand, which makes operation inconvenient for the user.
In the second way, the interactive object pointed at by the user is determined by pupil positioning or eyeball tracking; however, existing pupil positioning and eyeball tracking technologies are not very accurate, so there is the problem that the recognition accuracy is not high enough.
Obviously, proposing a technical solution that offers a high recognition rate and solves the inconvenience of one-handed operation is a problem to be solved urgently in the prior art.
Disclosure of Invention
In view of this, embodiments of the present invention are expected to provide an information processing method and an electronic device to assist interaction between a user and the electronic device, so as to solve the problem that interaction between the user and a first electronic device is difficult in some interaction scenarios.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
a first aspect of an embodiment of the present invention provides an information processing method, which is applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
and selecting a first interaction object associated with the first operation according to the first information.
Preferably,
the method further comprises the following steps:
receiving a first input acting on a first designated area at a first moment; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; wherein the first designated area is an area accessible to the user in the interaction state at the first moment;
and executing the first operation according to the first input and the first interactive object.
Preferably,
the first designated area is a partial area of the display interaction unit.
Preferably,
the method further comprises the following steps:
receiving a second input at a second time;
determining a first position where the second input acts on the display interaction unit;
determining an area within a first preset distance from the first position as the first designated area;
wherein the second time is earlier than the first time; and the relationship between the interaction state at the second moment and the interaction state at the first moment meets a first preset condition.
Preferably,
the method further comprises the following steps:
detecting the interaction state of the user and the first electronic equipment at a third moment;
determining the first designated area according to the interaction state at the third moment and a first preset strategy;
wherein the third time is earlier than the first time;
and the relationship between the interactive state at the third moment and the interactive state at the first moment meets a first preset condition.
Preferably,
the method further comprises the following steps:
forming a first control that receives the first input within the first designated area prior to the first time.
Preferably,
the display interaction unit is also provided with a first pointing identifier;
the method further comprises the following steps:
and determining the position of the first pointing identifier according to the first information, and controlling the first pointing identifier to point to the first interactive object.
Preferably,
before the receiving the first information sent by the second electronic device, the method further includes:
detecting whether the first electronic equipment is in a first preset state or not to form a detection result;
when the detection result shows that the first electronic equipment meets a first preset state, second information is sent to the second electronic equipment;
the second information is used for switching the second electronic equipment from a second preset state to a third preset state or maintaining the second electronic equipment in the third preset state;
and the power consumption of the second electronic equipment in the third preset state is greater than the power consumption of the second electronic equipment in the second preset state.
Preferably,
after said selecting a first interactive object associated with a first operation in dependence on said first information, the method further comprises:
detecting whether the first electronic equipment is in a first preset state or not to form a detection result;
when the detection result shows that the first electronic equipment is not in a first preset state, third information is sent to the second electronic equipment;
the third information is used for switching the second electronic device from a third preset state to a second preset state or maintaining the second electronic device in the second preset state.
Preferably,
when the detection result indicates that the first electronic device is in a first preset state, sending second information to the second electronic device specifically includes:
and when the detection result shows that the first electronic equipment is in a first preset state and the first information sent by the second electronic equipment is not received within a first specified time, sending second information to the second electronic equipment.
Preferably,
before receiving the first information sent by the second electronic device, the method further comprises:
outputting a first signal;
wherein the first information is formed based on the first signal.
A second aspect of the embodiments of the present invention provides an information processing method, which is applied to a second electronic device including a detection unit, where the second electronic device is an electronic device worn by a user; the detection unit is used for detecting the position relation between the second electronic equipment and the first electronic equipment;
the method comprises the following steps:
detecting a relative position of the second electronic device and the first electronic device;
forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
sending the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select the first interaction object associated with the first operation.
Preferably,
the detecting the relative position of the second electronic device and the first electronic device comprises:
detecting a first signal output by the first electronic device;
determining the relative position from the first signal.
Preferably,
the detecting the relative position of the second electronic device and the first electronic device comprises:
projecting a second signal to the first electronic device;
acquiring a first image of the second signal projected onto the first electronic device;
determining the relative position from the first image.
Preferably,
prior to the detecting the first signal output by the first electronic device, the method further comprises:
receiving second information sent by the first electronic equipment; the second information is sent when the first electronic equipment meets a first preset condition;
controlling the second electronic equipment to be switched from a second preset state to a third preset state or to be maintained in the third preset state according to the second information;
and the power consumption of the second electronic equipment in the third preset state is greater than the power consumption of the second electronic equipment in the second preset state.
Preferably,
the method further comprises the following steps:
receiving third information sent by the first electronic equipment; the third information is sent when the first electronic equipment does not meet the first preset condition;
after the first information is sent to the first electronic device, the second electronic device is controlled to be switched from the third preset state to the second preset state or maintained in the second preset state according to the third information.
A third aspect of the embodiments of the present invention provides an electronic device, where the electronic device is a first electronic device that includes a display interaction unit; at least one interactive object is displayed on the display interactive unit;
the first electronic device includes:
the receiving unit is used for receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
and the selecting unit is used for selecting a first interactive object associated with the first operation according to the first information.
Preferably,
the first electronic device further comprises:
an input unit for receiving a first input acting on a first designated area at a first time; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; wherein the first designated area is an area accessible to the user in the interaction state at the first moment;
and the execution unit is used for executing the first operation according to the first input and the first interactive object.
Preferably,
the display interaction unit is also used for displaying a first pointing identifier; and determining the position of the first pointing identifier according to the first information, and controlling the first pointing identifier to point to the first interactive object.
A fourth aspect of the embodiments of the present invention provides an electronic device, where the electronic device is a second electronic device, and the second electronic device is an electronic device worn by a user;
the second electronic device includes:
the detection unit is used for detecting the relative position of the second electronic equipment and the first electronic equipment;
a forming unit for forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
a sending unit, configured to send the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select the first interaction object associated with the first operation.
Preferably,
the detection unit is specifically configured to detect a first signal output by the first electronic device, and determine the relative position according to the first signal.
Preferably,
the detection unit includes:
a transmitting module for projecting a second signal to the first electronic device;
the acquisition module is used for acquiring a first image of the second signal projected onto the first electronic equipment;
a determination module for determining the relative position from the first image.
According to the information processing method and the electronic device in the embodiments of the present invention, the first electronic device receives the first information sent by the second electronic device and selects, according to the first information, the first interactive object associated with the first operation. This resolves the difficulty that a user cannot select some interactive objects displayed on the first electronic device in the current interaction state. With the assistance of the second electronic device, the problem that users of existing large-screen mobile phones and/or tablet computers cannot interact well with the first electronic device while holding it with one hand is solved simply and conveniently, and the convenience of interacting with the first electronic device is improved.
Drawings
FIG. 1 is a flowchart illustrating an information processing method according to an embodiment of the present invention;
FIG. 2 is a second schematic flowchart of an information processing method according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a first electronic device according to an embodiment of the invention;
fig. 4 is a second schematic structural diagram of the first electronic device according to the embodiment of the invention;
FIG. 5 is a partial flow chart of an information processing method according to an embodiment of the present invention;
fig. 6 is a third schematic structural diagram of a first electronic device according to an embodiment of the invention;
fig. 7 is a fourth schematic structural diagram of the first electronic device according to the embodiment of the invention;
FIG. 8 is a second flowchart illustrating a part of an information processing method according to an embodiment of the present invention;
FIG. 9 is a third flowchart illustrating an information processing method according to an embodiment of the invention;
fig. 10 is a schematic structural diagram of a second electronic device according to an embodiment of the invention;
fig. 11 is a fifth schematic structural diagram of a first electronic device according to an embodiment of the invention;
fig. 12 is a sixth schematic structural view of a first electronic device according to an embodiment of the invention;
fig. 13 is a second schematic structural diagram of a second electronic device according to an embodiment of the invention;
FIG. 14 is a schematic structural diagram of a detecting unit according to an embodiment of the present invention;
fig. 15 is a third schematic structural diagram of a second electronic device according to an embodiment of the invention.
Detailed Description
The technical solution of the present invention is further described in detail with reference to the drawings and the specific embodiments of the specification.
Method embodiment one:
As shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
The second electronic device is an electronic device worn by the user, specifically smart glasses, an earphone, or a communication device attached to a collar; the second electronic device is preferably an electronic device worn on the upper body of the user. The first electronic device can receive the first information sent by the second electronic device through communication technologies such as Bluetooth, infrared, Wi-Fi, or radio frequency.
The first information represents the positional relationship between the first electronic device and the second electronic device; specifically, the positional relationship may be the projection of the second electronic device onto the first electronic device along a first direction, or the position on the first electronic device onto which a signal sent by the second electronic device is projected, and the like.
In this embodiment, after receiving the first information, the first electronic device determines, according to the first information and a preset selection policy, the first interactive object that is currently selected by means of the second electronic device and that is associated with executing the first operation.
With the method of this embodiment, when the user wishes to select a different first interactive object on the display interaction unit, the user may change the relative position of the first electronic device and the second electronic device, specifically by moving the second electronic device or moving the first electronic device.
In this embodiment, the second electronic device worn by the user is introduced to select the interactive object on the first electronic device, which can be used to solve the problem that, when the user operates with one hand, some interactive objects on the first electronic device are not within the user's operable range.
In a specific implementation process, the first electronic device further performs the first operation according to the first interactive object. The first operation may be executed according to the user's input to the first electronic device and the first interactive object, or according to the first interactive object and a preset operation parameter. Specifically, suppose the first interactive object determined according to the first information is a "calendar" icon displayed on the display unit and the preset operation parameter is "view calendar"; the calendar interface is then opened directly for the user to view the calendar.
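As an illustration only, the following Python sketch shows what the selection in step S120 could look like under assumed data shapes: the first information is reduced to a projection point on the display, each interactive object is a named rectangle, and the policy (containment first, nearest centre otherwise) is a stand-in for whatever preset selection policy an implementation actually uses; none of these shapes are prescribed by the embodiment.

```python
# Minimal sketch of the selection step in method embodiment one (assumed data shapes).
from dataclasses import dataclass

@dataclass
class InteractiveObject:
    name: str
    x: float      # top-left corner of the object's bounds, in display coordinates
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

    def center(self):
        return (self.x + self.width / 2, self.y + self.height / 2)

def select_first_interactive_object(first_info, objects):
    """Assumed policy: prefer the object whose bounds contain the projection point
    carried by the first information; otherwise take the object with the nearest centre."""
    px, py = first_info["projection_point"]
    for obj in objects:
        if obj.contains(px, py):
            return obj
    return min(objects, key=lambda o: (o.center()[0] - px) ** 2 + (o.center()[1] - py) ** 2)

if __name__ == "__main__":
    displayed = [InteractiveObject("calendar", 0, 0, 100, 100),
                 InteractiveObject("mail", 120, 0, 100, 100)]
    first_information = {"projection_point": (30.0, 40.0)}   # received from the second device
    selected = select_first_interactive_object(first_information, displayed)
    print("first interactive object:", selected.name)        # -> calendar
```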
Method embodiment two:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
As shown in fig. 2, the method further comprises:
step S130: receiving a first input acting on a first designated area at a first moment; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; the first designated area is an area which can be reached by the user in the interaction state at the first moment;
step S140: and executing the first operation according to the first input and the first interactive object.
The first designated area is an area on the first electronic device that the user can touch in the current interaction state. The control that receives the user input may be a physical control, or a virtual control displayed on the display interaction unit. The physical control may specifically be a physical button or the like.
Fig. 3 is a schematic structural diagram of the first electronic device, in which the first electronic device is provided with the first control and the first control is a physical control. The display interaction unit is arranged on the front face of the first electronic device, and the first control is arranged on the side face of the first electronic device.
When the control receiving the user input is a physical control, it may be a control provided specially on the first electronic device for this purpose, or an existing control reused from other operations.
The first operation is specifically related to the first interactive object and/or the first input. If the first interactive object is an application icon in the first electronic device, the first operation may be opening the first application or closing the first application; which of the two is executed is related to an operation parameter of the first input, for example a single tap indicates opening the first application and a double tap indicates closing the application.
If the first interactive object is an interactive object such as "confirm" or "cancel" in a dialog box, the first operation actually executed is related to the display content of the interactive object and the content displayed in the dialog box.
Specifically, if it is determined in step S120 that the first interactive object associated with executing the first operation is interactive object A, while the current user operation acts on interactive object B to control interactive object B to execute a related operation and is not associated with interactive object A, then the user input at this time is obviously another input rather than the first input.
The first input may be a touch operation acting directly on the display interaction unit, or a floating (hover) touch operation close to the display interaction unit.
In summary, this embodiment further describes how the first electronic device is triggered to execute the first operation: the first designated area that triggers the first electronic device to execute the first operation is set within the area the user can reach in the current interaction state, which is convenient for the user to operate.
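A minimal sketch of the dispatch described in this embodiment, with assumed geometry and gesture names; the tap/double-tap mapping follows the example above and is only one possible parameterisation of the first operation.

```python
# Illustrative dispatch of a user input under method embodiment two.
def handle_input(input_event, designated_area, first_interactive_object):
    """input_event: {"position": (x, y), "gesture": "tap" | "double_tap"}
    designated_area: (x, y, width, height) reachable by the user at the first moment."""
    ax, ay, aw, ah = designated_area
    x, y = input_event["position"]
    inside = ax <= x <= ax + aw and ay <= y <= ay + ah
    if not inside:
        return "other input: handled by the normal touch path"
    # The input is the "first input": trigger the first operation on the already
    # selected first interactive object, parameterised by the gesture.
    if input_event["gesture"] == "tap":
        return f"open {first_interactive_object}"
    if input_event["gesture"] == "double_tap":
        return f"close {first_interactive_object}"
    return "unrecognised gesture"

print(handle_input({"position": (20, 500), "gesture": "tap"}, (0, 400, 200, 200), "calendar"))
```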
Method embodiment three:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
As shown in fig. 2, the method further comprises:
step S130: receiving a first input acting on a first designated area at a first moment; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; the first designated area is an area which can be reached by the user in the interaction state at the first moment;
step S140: and executing the first operation according to the first input and the first interactive object.
Preferably, the first designated area is a partial area of the display interaction unit.
In fig. 4, the black boxes represent the shell around the display interaction unit, and the display interaction unit is arranged on the inner side of the shell. When the user holds the first electronic device, the display interaction unit is so large that the user can only touch a partial area in the current interaction state (the user-reachable area shown by the shaded part in fig. 4), while the blank part of the display interaction unit is an area the user cannot touch; in this case, the user operates within the reachable area to trigger the first electronic device to execute the first operation.
In this embodiment, the first designated area is set as a partial area of the display interaction unit, and the first input is received through that display area, instead of providing a dedicated physical control or reusing an existing physical control to receive the first input; this approach is highly compatible with the prior art.
Method embodiment four:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
As shown in fig. 2, the method further comprises:
step S130: receiving a first input acting on a first designated area at a first moment; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; the first designated area is an area which can be reached by the user in the interaction state at the first moment;
step S140: and executing the first operation according to the first input and the first interactive object.
When the first designated area is a partial area of the display interaction unit, as shown in fig. 5, the method further includes:
step S101: receiving a second input at a second time;
step S102: determining a first position where the second input acts on the display interaction unit;
step S103: determining an area within a first preset distance from the first position as the first designated area;
wherein the second time is earlier than the first time; and the relationship between the interaction state at the second moment and the interaction state at the first moment meets a first preset condition.
As shown in fig. 6, a second input of the user acts on position A on the display interaction unit, and the whole region within the first preset distance of position A is set as the first designated area, specifically the region enclosed by the dashed circle in fig. 6.
The second input may be an input made specifically to determine the first designated area, or it may be an input by which the user instructed the first electronic device to perform some other operation at an earlier time.
The first preset condition is that the interaction state at the first moment and the interaction state at the second moment are consistent, or that the change of the interaction state is smaller than a specified parameter; specifically, for example, at both the first moment and the second moment the user holds the first electronic device with one hand for interaction, and the change in the position at which the user's hand holds the first electronic device is within a specified range.
Specifically, whether the relationship between the interaction state at the second moment and the interaction state at the first moment meets the first preset condition can be determined from the user inputs received by the first electronic device, for example when the user inputs within a specified time are concentrated in the area reachable in a certain interaction state; it may also be determined by a sensor of the first electronic device detecting whether the first designated area reachable by the user has changed. The sensor may be a sensing unit such as an image acquisition unit.
This embodiment provides a specific way of obtaining the area of the display interaction unit that is reachable by the user: the area is determined according to the user's second input at the second moment, which is earlier than the first moment. This has the advantage of being simple and quick to implement.
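For illustration, a sketch of this embodiment under assumed numbers: the first preset distance and the threshold used here to model the first preset condition are hypothetical values, and the grip-position model is a simplification chosen only for the example.

```python
import math

# Sketch of method embodiment four: the first designated area is the disc of radius
# FIRST_PRESET_DISTANCE around the position of the second input, and the first preset
# condition is modelled as "the grip position moved less than MAX_STATE_CHANGE between
# the second moment and the first moment".
FIRST_PRESET_DISTANCE = 150.0   # pixels, illustrative
MAX_STATE_CHANGE = 40.0         # pixels, illustrative

def designated_area_from_second_input(second_input_pos):
    cx, cy = second_input_pos
    return {"center": (cx, cy), "radius": FIRST_PRESET_DISTANCE}

def in_designated_area(pos, area):
    return math.dist(pos, area["center"]) <= area["radius"]

def first_preset_condition_met(grip_pos_at_second_moment, grip_pos_at_first_moment):
    return math.dist(grip_pos_at_second_moment, grip_pos_at_first_moment) <= MAX_STATE_CHANGE

area = designated_area_from_second_input((120.0, 900.0))           # second input at the second moment
print(in_designated_area((150.0, 950.0), area))                    # True: first input is reachable
print(first_preset_condition_met((118.0, 902.0), (125.0, 895.0)))  # True: interaction states comparable
```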
Method embodiment five:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
As shown in fig. 2, the method further comprises:
step S130: receiving a first input acting on a first designated area at a first moment; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; the first designated area is an area which can be reached by the user in the interaction state at the first moment;
step S140: and executing the first operation according to the first input and the first interactive object.
When the first designated area is a partial area of the display interaction unit, as shown in fig. 5, the method further includes: detecting the interaction state of the user and the first electronic equipment at a third moment; determining the first designated area according to the interaction state at the third moment and a first preset strategy; wherein the third time is earlier than the first time;
and the relationship between the interactive state at the third moment and the interactive state at the first moment meets a first preset condition.
The first predetermined policy may specifically be: at the third moment, which is earlier than the first moment, multiple user inputs are concentrated in a certain area of the display interaction unit, and at the first moment the interaction mode of the first electronic device is the interaction mode requiring the assistance of the second electronic device; the first designated area can then be determined according to the user's historical inputs.
The first predetermined policy may also specifically be: in the interaction mode requiring the assistance of the second electronic device, determining the first designated area according to a third input and the corresponding historical interaction records.
The first predetermined policy may also be: when the first electronic device is in the interaction mode requiring the assistance of the second electronic device, using the interaction area frequently used by the user as the first designated area. The first predetermined policy can take various forms and is not limited to those above.
On the basis of method embodiments two and three, this embodiment provides another way of determining the first designated area, different from that of method embodiment four, and it likewise has the advantage of being simple to implement.
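One way to picture such a policy is sketched below; the padding value and the touch-history format are assumptions made only for the example, and the bounding-box rule stands in for whichever concrete policy an implementation chooses.

```python
# Sketch of one possible "first predetermined policy": take the touch positions recorded
# while the assisted interaction mode was active and use their bounding box, padded a
# little, as the first designated area.
def area_from_history(historical_touches, padding=30.0):
    xs = [p[0] for p in historical_touches]
    ys = [p[1] for p in historical_touches]
    return (min(xs) - padding, min(ys) - padding,            # top-left corner
            (max(xs) - min(xs)) + 2 * padding,               # width
            (max(ys) - min(ys)) + 2 * padding)               # height

history = [(110, 880), (130, 905), (95, 915), (140, 890)]    # inputs near the thumb's reach
print(area_from_history(history))   # a rectangle covering the frequently used region
```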
Method embodiment six:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
As shown in fig. 2, the method further comprises:
step S130: receiving a first input acting on a first designated area at a first moment; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; the first designated area is an area which can be reached by the user in the interaction state at the first moment;
step S140: and executing the first operation according to the first input and the first interactive object.
The first designated area is a partial area of the display interaction unit;
the method further comprises the following steps:
forming a first control that receives the first input within the first designated area prior to the first time.
In a specific implementation process, when the first electronic device detects that the second electronic device is currently being used to assist the interaction, a control for receiving the user input may be formed in the first designated area after that area is determined. This makes it easier for the user to aim the input and avoids errors caused by the user accidentally touching other interactive objects that are also arranged within the first designated area.
To avoid such errors, the first designated area may instead be divided into two sub-areas: a used sub-area in which interactive objects are displayed, and a blank sub-area in which no interactive object is displayed. If the user input acts on the blank sub-area, it is regarded as the first input that triggers the first operation; otherwise, it is regarded as another input.
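A sketch of this sub-area rule with illustrative rectangles; the helper name and the shapes are not taken from the patent and only show the classification idea.

```python
# Classify a touch under the used/blank sub-area rule of method embodiment six.
def classify_touch(pos, designated_area, object_rects):
    """designated_area and object_rects are (x, y, w, h) rectangles in display coordinates."""
    def inside(p, r):
        return r[0] <= p[0] <= r[0] + r[2] and r[1] <= p[1] <= r[1] + r[3]
    if not inside(pos, designated_area):
        return "outside designated area"
    if any(inside(pos, r) for r in object_rects):
        return "other input (acts on an interactive object in the used sub-area)"
    return "first input (blank sub-area): trigger the first operation"

area = (0, 800, 300, 300)
icons_in_area = [(10, 820, 80, 80)]                      # an icon that sits inside the area
print(classify_touch((40, 850), area, icons_in_area))    # other input
print(classify_touch((200, 1000), area, icons_in_area))  # first input
```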
Method embodiment seven:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
The display interaction unit also displays a first pointing identifier. In fig. 7 the first pointing identifier is a shaded arrow; in a specific implementation process it is not limited to the pattern shown in fig. 7 and may be another pattern. In fig. 7, four interactive objects are displayed; the first pointing identifier currently points to one of them, and that interactive object can be determined to be the first interactive object.
The method further comprises the following steps:
and determining the position of the first pointing identifier according to the first information, and controlling the first pointing identifier to point to the first interactive object.
By displaying the first pointing identifier, the user can conveniently select the first interactive object by changing the relative position of the first electronic device or the second electronic device with reference to the first pointing identifier, which improves convenience of use.
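As an illustration, the pointer update could be implemented roughly as below; the smoothing factor and the snap rule are assumptions, not something the embodiment prescribes.

```python
# Sketch of the first pointing identifier: its raw position follows the projection point
# carried by the first information, and it is then snapped to the centre of the selected
# first interactive object so the user can see which object is chosen.
def update_pointer(raw_point, previous_point, selected_object_center,
                   smoothing=0.5, snap_radius=60.0):
    # Low-pass filter the raw projection point to keep the pointer stable.
    fx = previous_point[0] + smoothing * (raw_point[0] - previous_point[0])
    fy = previous_point[1] + smoothing * (raw_point[1] - previous_point[1])
    # Snap to the selected object's centre when close enough.
    dx, dy = selected_object_center[0] - fx, selected_object_center[1] - fy
    if (dx * dx + dy * dy) ** 0.5 <= snap_radius:
        return selected_object_center
    return (fx, fy)

print(update_pointer((310, 205), (300, 200), (320, 210)))   # snaps to (320, 210)
```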
Method embodiment eight:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
As shown in fig. 8, before the receiving the first information sent by the second electronic device, the method further includes:
step S151: detecting whether the first electronic equipment is in a first preset state or not to form a detection result;
step S152: when the detection result shows that the first electronic equipment is in a first preset state, sending second information to the second electronic equipment;
the second information is used for switching the second electronic equipment from a second preset state to a third preset state or maintaining the second electronic equipment in the third preset state;
and the power consumption of the second electronic equipment in the third preset state is greater than the power consumption of the second electronic equipment in the second preset state.
In a specific implementation process, determining whether the first electronic device is in the first preset state includes at least the following two methods: determining according to the current working mode of the first electronic device, or determining according to a user input.
The first method is as follows: the determination is made according to the working mode of the first electronic device, which has two working modes. Before receiving the first information sent by the second electronic device, the method further includes: when the detection result shows that the first electronic device is in a first working mode, determining that the first electronic device is in the first preset state; and when the detection result shows that the first electronic device is in a second working mode, determining that the first electronic device is not in the first preset state;
wherein the first working mode is different from the second working mode.
Specifically, for example, the first working mode may be a sleep mode of the first electronic device, in which only some applications run in the background and the display interaction unit is in a screen-off state; the second working mode is an active mode, in which the display interaction unit is in a screen-on state.
The first working mode may also be a working mode in which the first electronic device requires the assistance and interaction of the second electronic device, and the second working mode a working mode in which the first electronic device does not require such assistance.
The second method is as follows: determining whether the first electronic device is in the first preset state by detecting user input, specifically:
the first step: detecting whether the first electronic device has received a third input from the user, where the third input is an input that causes the first electronic device to enter or remain in the first preset state;
the second step: when the detection result shows that the first electronic device has received the third input, determining that the first electronic device meets the first preset state; and when the detection result shows that the first electronic device has not received the third input, determining that the first electronic device does not meet the first preset state.
In order to save power consumption of the second electronic device, when the second electronic device does not need to exchange information with the first electronic device, it may be kept in the second preset state with lower power consumption, for example a state in which its connection with the first electronic device is turned off, or a state in which the detection function module used to determine the relative positional relationship between the first electronic device and the second electronic device is turned off, and the like.
In this embodiment, the first electronic device preferably sends information to the second electronic device to control the state switching of the second electronic device, and in a specific implementation process, the state switching of the second electronic device may also be implemented by a manual operation performed by a user.
Method embodiment nine:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
As shown in fig. 8, before the receiving the first information sent by the second electronic device, the method further includes:
step S151: detecting whether the first electronic equipment is in a first preset state or not to form a detection result;
step S152: when the detection result shows that the first electronic equipment is in a first preset state, sending second information to the second electronic equipment;
the second information is used for switching the second electronic equipment from a second preset state to a third preset state or maintaining the second electronic equipment in the third preset state;
and the power consumption of the second electronic equipment in the third preset state is greater than the power consumption of the second electronic equipment in the second preset state.
After said selecting a first interactive object associated with a first operation in dependence on said first information, the method further comprises: detecting whether the first electronic equipment is in a first preset state or not to form a detection result; when the detection result shows that the first electronic equipment is not in a first preset state, third information is sent to the second electronic equipment;
the third information is used for switching the second electronic device from a third preset state to a second preset state or maintaining the second electronic device in the second preset state.
On the basis of the preceding embodiment, this embodiment further specifies how the first electronic device automatically switches the second electronic device back to the second preset state with lower energy consumption. This saves energy and improves the intelligence of the electronic devices; compared with having the user switch the state manually, it also avoids the second electronic device consuming extra energy because the user forgets to switch it.
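The exchange of the second and third information in the last two embodiments can be pictured as a small state check on the first device. The sketch below uses placeholder names (send_to_second_device, the mode string "assisted_interaction") for the parts the patent leaves open, such as the transport and the exact mode labels.

```python
# Sketch of the state handshake in method embodiments eight and nine, with stand-in names.
def in_first_preset_state(working_mode=None, third_input_received=None):
    """Method one: decide from the working mode; method two: decide from a user input."""
    if working_mode is not None:
        return working_mode == "assisted_interaction"     # first working mode (assumed name)
    return bool(third_input_received)

def send_to_second_device(message):                       # placeholder transport
    print("-> second device:", message)

def update_second_device_state(first_device_state_ok):
    if first_device_state_ok:
        # Second information: switch (or keep) the wearable in the higher-power
        # third preset state so it can measure and report the relative position.
        send_to_second_device({"type": "second_information", "target_state": "third_preset_state"})
    else:
        # Third information: let the wearable fall back to the low-power second preset state.
        send_to_second_device({"type": "third_information", "target_state": "second_preset_state"})

update_second_device_state(in_first_preset_state(working_mode="assisted_interaction"))
update_second_device_state(in_first_preset_state(third_input_received=False))
```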
Method embodiment ten:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
Before receiving the first information sent by the second electronic device, the method further includes:
detecting whether the first electronic device is in a first preset state to form a detection result;
when the detection result shows that the first electronic equipment is in a first preset state, sending second information to the second electronic equipment;
the second information is used for switching the second electronic equipment from a second preset state to a third preset state or maintaining the second electronic equipment in the third preset state;
and the power consumption of the second electronic equipment in the third preset state is greater than the power consumption of the second electronic equipment in the second preset state.
When the detection result indicates that the first electronic device is in a first preset state, sending second information to the second electronic device specifically includes:
and when the detection result shows that the first electronic equipment is in a first preset state and the first information sent by the second electronic equipment is not received within a first specified time, sending second information to the second electronic equipment.
The duration corresponding to the first specified time may be, for example, 0.1 second or 1 second, and may be preset.
When the first electronic device is in the first preset state and the first information from the second electronic device has not been received within the first specified time, the second electronic device is considered to be in the second preset state; the second information is therefore sent to the second electronic device to trigger the second electronic device to assist the first electronic device in human-computer interaction.
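A sketch of this timeout rule with a placeholder duration and helper names; the patent only states that the first specified time is preset, with 0.1 s and 1 s given as examples.

```python
import time

# Illustrative timeout check of method embodiment ten.
FIRST_SPECIFIED_TIME = 1.0   # seconds, assumed value

def maybe_wake_second_device(in_first_preset_state, last_first_info_timestamp, send_second_information):
    waited = time.monotonic() - last_first_info_timestamp
    if in_first_preset_state and waited > FIRST_SPECIFIED_TIME:
        # No first information arrived in time: assume the wearable is still in the
        # low-power second preset state and ask it to switch to the third preset state.
        send_second_information()

maybe_wake_second_device(True, time.monotonic() - 2.0, lambda: print("second information sent"))
```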
Method embodiment eleven:
as shown in fig. 1, the present embodiment provides an information processing method applied to a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
step S110: receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S120: and selecting a first interaction object associated with the first operation according to the first information.
Before receiving the first information sent by the second electronic device, the method further comprises:
outputting a first signal;
wherein the first information is formed based on the first signal.
The first electronic device outputs the first signal, and the second electronic device detects the first signal; by detecting the first signal, the second electronic device determines the relative positional relationship between the second electronic device and the first electronic device, forms the first information accordingly, and sends the first information to the first electronic device.
The first signal may be an infrared signal or a laser signal, etc.
In a specific implementation process, the second electronic device may also send a second signal, collect the position at which the second signal acts on the display interaction unit of the first electronic device, and form the first information according to that position. Specifically, the second electronic device may capture an image of the first electronic device through an image acquisition unit and analyze the image to detect the relative position between the second electronic device and the first electronic device, so as to form the first information.
There are various methods for forming the first information in a specific implementation process, and this embodiment provides a preferred mode: experiments show that, compared with determining the relative position of the first electronic device and the second electronic device through image acquisition and analysis, this mode is simpler to operate and consumes less energy on the second electronic device.
Method embodiment twelve:
as shown in fig. 9, the present embodiment provides an information processing method applied to a second electronic device including a detection unit, where the second electronic device is an electronic device worn by a user; the detection unit is used for detecting the position relation between the second electronic equipment and the first electronic equipment;
the method comprises the following steps:
step S210: detecting a relative position of the second electronic device and the first electronic device;
step S220: forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S230: sending the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select a first interaction object associated with a first operation.
Since the second electronic device is worn by the user, the user can change the relative positional relationship between the first electronic device and the second electronic device through body movement such as limb movement, so that the second electronic device assists the interaction between the first electronic device and the user.
And the second electronic equipment sends the first information to the first electronic equipment, and the first electronic equipment determines a first interaction object associated with the first operation according to the first information so as to facilitate the interaction between the user and the first electronic equipment.
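For illustration, the wearable-side flow of steps S210 to S230 might look like the following sketch; the detection step is left as a placeholder because the concrete options are described in the next two embodiments, and the message format and transport are assumptions.

```python
import json
import time

# Second-device (wearable) side of method embodiment twelve, as a bare loop.
def detect_relative_position():
    """Placeholder: return the relative position of the two devices, e.g. the point
    where the wearable's line of sight meets the first device's display."""
    return {"projection_point": (30.0, 40.0)}

def send_to_first_device(payload: bytes):
    print("-> first device:", payload.decode())            # placeholder transport

def report_once():
    relative_position = detect_relative_position()          # step S210
    first_information = {"type": "first_information",       # step S220
                         "relative_position": relative_position}
    send_to_first_device(json.dumps(first_information).encode())   # step S230

for _ in range(3):      # in practice this runs while the assisted interaction mode is active
    report_once()
    time.sleep(0.1)
```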
Method embodiment thirteen:
as shown in fig. 9, the present embodiment provides an information processing method applied to a second electronic device including a detection unit, where the second electronic device is an electronic device worn by a user; the detection unit is used for detecting the position relation between the second electronic equipment and the first electronic equipment;
the method comprises the following steps:
step S210: detecting a relative position of the second electronic device and the first electronic device;
step S220: forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S230: sending the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select a first interaction object associated with a first operation.
The step S210 includes:
detecting a first signal output by the first electronic device;
determining the relative position from the first signal.
In this embodiment, the first electronic device outputs a first signal; the second electronic device detects the first signal and its incident angle, and determines the positional relationship between the first electronic device and the second electronic device from them. Alternatively, the first electronic device sends a plurality of first-signal beams; when the second electronic device receives any two of the beams, it determines the positional relationship between the first electronic device and the second electronic device according to the incident included angle and the emission included angle of the two received beams.
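As a purely illustrative aid, the following sketch works through the two-beam case under assumptions not stated in the original text: the two first-signal emitters sit a known baseline apart on the first electronic device, and the second electronic device lies roughly on the perpendicular bisector of that baseline, so the distance follows from the measured included angle between the two received beams. The baseline value and function names are illustrative assumptions.

import math


def distance_from_included_angle(baseline_m: float, included_angle_deg: float) -> float:
    # Isosceles-triangle estimate: distance = (baseline / 2) / tan(included_angle / 2).
    half_angle = math.radians(included_angle_deg) / 2.0
    return (baseline_m / 2.0) / math.tan(half_angle)


# Example: two emitters 10 cm apart on the first device, beams arriving 4 degrees apart.
print(round(distance_from_included_angle(0.10, 4.0), 2))  # about 1.43 metres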
In summary, the second electronic device according to this embodiment determines the position relationship with the first electronic device according to the first signal sent by the first electronic device, and has the advantage of being simple and easy to implement.
Method embodiment fourteen:
as shown in fig. 9, the present embodiment provides an information processing method applied to a second electronic device including a detection unit, where the second electronic device is an electronic device worn by a user; the detection unit is used for detecting the position relation between the second electronic equipment and the first electronic equipment;
the method comprises the following steps:
step S210: detecting a relative position of the second electronic device and the first electronic device;
step S220: forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S230: sending the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select a first interaction object associated with a first operation.
Wherein the step S210 includes:
projecting a second signal to the first electronic device;
acquiring a first image of the second signal projected onto the first electronic device;
determining the relative position from the first image.
FIG. 10 is one of the schematic diagrams illustrating the second electronic device sending a second signal to the first electronic device, where the second electronic device is a pair of glasses worn by a user; the glasses are provided with a transmitting device for transmitting the second signal.
On the basis of method embodiment twelve, the method of this embodiment further defines how to determine the positional relationship between the second electronic device and the first electronic device: before sending the first information, the second electronic device projects a second signal onto the first electronic device, acquires a first image of the second signal projected onto the first electronic device, determines the relative position from the first image, and sends the first information according to that relative position.
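The following sketch illustrates, under stated assumptions, how the relative position might be recovered from the first image: the bright spot produced by the projected second signal is located in a grayscale camera frame and expressed as a normalized position on the first electronic device. The thresholding approach, the use of NumPy, and all names are illustrative assumptions, not the method as claimed.

from typing import Optional, Tuple

import numpy as np


def spot_position(gray_frame: np.ndarray, threshold: int = 240) -> Optional[Tuple[float, float]]:
    # Return the centroid of the brightest region, normalized to [0, 1] in x and y.
    ys, xs = np.nonzero(gray_frame >= threshold)
    if xs.size == 0:
        return None  # the second signal is not on the first electronic device in this frame
    height, width = gray_frame.shape
    return float(xs.mean()) / width, float(ys.mean()) / height


# Example with a synthetic 480 x 640 frame containing one bright spot.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:105, 300:305] = 255
print(spot_position(frame))  # roughly (0.47, 0.21)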
In addition, in a specific implementation process, the second electronic device may further find from the first image that the second signal has not been projected onto the first electronic device for a preset time. This indicates that the current user no longer needs the second electronic device to assist in interacting with the first electronic device, so the second electronic device is controlled to switch from the third preset state with high energy consumption to the second preset state with low energy consumption.
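A minimal sketch of the timeout logic just described follows; the five-second preset time and the state names are assumed values used only for illustration.

import time

PRESET_TIMEOUT_S = 5.0  # assumed value for the preset time


class PowerGovernor:
    def __init__(self) -> None:
        self.last_seen = time.monotonic()
        self.state = "third_preset_state"  # high-consumption state, assisting interaction

    def update(self, spot_visible: bool) -> str:
        # Call once per acquired first image.
        now = time.monotonic()
        if spot_visible:
            self.last_seen = now
        elif self.state == "third_preset_state" and now - self.last_seen >= PRESET_TIMEOUT_S:
            self.state = "second_preset_state"  # low-consumption state, assistance no longer needed
        return self.state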
In summary, the second electronic device according to this embodiment determines the positional relationship with the first electronic device from the first image of the projected second signal, and has the advantage of being simple and easy to implement.
Method embodiment fifteen:
as shown in fig. 9, the present embodiment provides an information processing method applied to a second electronic device including a detection unit, where the second electronic device is an electronic device worn by a user; the detection unit is used for detecting the position relation between the second electronic equipment and the first electronic equipment;
the method comprises the following steps:
step S210: detecting a relative position of the second electronic device and the first electronic device;
step S220: forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S230: sending the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select a first interaction object associated with a first operation.
Before detecting the first signal output by the first electronic device, the method further comprises:
receiving second information sent by the first electronic equipment; the second information is sent when the first electronic equipment meets a first preset condition;
controlling the second electronic equipment to be switched from a second preset state to a third preset state or to be maintained in the third preset state according to the second information;
and the power consumption of the second electronic equipment in the third preset state is greater than the power consumption of the second electronic equipment in the second preset state.
While the second electronic device does not yet need to assist the first electronic device in interacting with the user, it stays in the low-consumption second preset state, which prolongs its standby time. Specifically, the second preset state may be a dormant state, or a state in which the second electronic device is disconnected from the first electronic device; the third preset state is a state in which the second electronic device can interact normally with the first electronic device.
With this method, the second electronic device switches state according to information sent by the first electronic device, which saves energy on the second electronic device and prolongs its standby time.
Method embodiment sixteen:
as shown in fig. 9, the present embodiment provides an information processing method applied to a second electronic device including a detection unit, where the second electronic device is an electronic device worn by a user; the detection unit is used for detecting the position relation between the second electronic equipment and the first electronic equipment;
the method comprises the following steps:
step S210: detecting a relative position of the second electronic device and the first electronic device;
step S220: forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
step S230: sending the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select a first interaction object associated with a first operation.
Before detecting the first signal output by the first electronic device, the method further comprises:
receiving second information sent by the first electronic equipment; the second information is sent when the first electronic equipment meets a first preset condition;
controlling the second electronic equipment to be switched from a second preset state to a third preset state or to be maintained in the third preset state according to the second information;
and the power consumption of the second electronic equipment in the third preset state is greater than the power consumption of the second electronic equipment in the second preset state.
The method further comprises the following steps:
receiving third information sent by the first electronic equipment; the third information is sent when the first electronic equipment does not meet the first preset condition;
after the first information is sent to the first electronic device, the second electronic device is controlled to be switched from the third preset state to the second preset state or maintained in the second preset state according to the third information.
The second electronic device receives the third information sent by the first electronic device and, according to the third information, switches from the high-consumption third preset state to the low-consumption second preset state. State switching of the second electronic device is thus automatic, which is highly intelligent and avoids the energy waste that would result if the switching were not automatic.
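The two-way switching of method embodiments fifteen and sixteen can be summarized, purely for illustration, by the following state-machine sketch; the enum values and handler names are assumptions introduced here and do not appear in the original text.

from enum import Enum


class PresetState(Enum):
    SECOND = "low_power"   # e.g. dormant, or disconnected from the first electronic device
    THIRD = "high_power"   # able to interact normally with the first electronic device


class SecondDeviceStateMachine:
    def __init__(self) -> None:
        self.state = PresetState.SECOND
        self.first_information_sent = False

    def on_second_information(self) -> None:
        # The first device met the first preset condition: switch up, or stay up.
        self.state = PresetState.THIRD

    def on_first_information_sent(self) -> None:
        self.first_information_sent = True

    def on_third_information(self) -> None:
        # The first device no longer meets the condition: switch down after reporting.
        if self.first_information_sent:
            self.state = PresetState.SECOND
            self.first_information_sent = False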
Device embodiment one:
as shown in fig. 11, the present embodiment provides an electronic device, which is a first electronic device including a display interaction unit 100; at least one interactive object is displayed on the display interactive unit;
the first electronic device includes:
a receiving unit 110, configured to receive first information sent by a second electronic device; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
a selecting unit 120, configured to select, according to the first information, a first interaction object associated with the first operation.
The specific structure of the display interaction unit can comprise a display screen and an interaction sensor; the display screen comprises a liquid crystal display screen, an Organic Light Emitting Diode (OLED) display screen or a projection screen and the like; the interaction sensor may comprise a capacitive matrix or an inductive matrix for detecting user input.
The specific structure of the receiving unit comprises a receiving interface, which may be a wired receiving interface or a wireless receiving interface, preferably a wireless receiving interface. The wired receiving interface may be a USB interface, a twisted-pair receiving interface, or another receiving interface; the wireless receiving interface is preferably a receiving antenna, such as a Wi-Fi antenna or a Bluetooth antenna.
The specific structure of the selecting unit may comprise a processor and a storage medium; the storage medium stores executable instructions, and the processor reads the executable instructions through a communication interface inside the first electronic device, such as a bus, and selects the first interaction object according to the first information.
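Purely as an illustration of what such executable instructions might do, the following sketch hit-tests the position carried by the first information against the bounds of the displayed interactive objects. The object layout, the normalized coordinate convention, and all names are assumptions introduced here, not the claimed implementation.

from typing import List, Optional, Tuple

Rect = Tuple[float, float, float, float]  # x, y, width, height in normalized screen units


def select_first_interaction_object(point: Tuple[float, float],
                                    objects: List[Tuple[str, Rect]]) -> Optional[str]:
    # Return the name of the first interactive object whose bounds contain the point.
    px, py = point
    for name, (x, y, w, h) in objects:
        if x <= px <= x + w and y <= py <= y + h:
            return name
    return None


layout = [("back_button", (0.0, 0.0, 0.2, 0.1)),
          ("menu_icon", (0.8, 0.0, 0.2, 0.1))]
print(select_first_interaction_object((0.85, 0.05), layout))  # -> "menu_icon"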
The first electronic device described in this embodiment may specifically be an electronic device such as a mobile phone, a tablet computer, or a notebook computer.
The second electronic device is a user-wearable electronic device such as smart glasses or earphones.
In this embodiment, the first interactive object associated with the first operation is selected by receiving the first information sent by the second electronic device, rather than being determined by the user acting directly on the first electronic device as in the prior art. This provides specific implementation hardware for the method described in any of the method embodiments, handles well the scenario in which the user cannot select some of the interactive objects presented on the first electronic device in the current interaction state, and improves user satisfaction.
Device embodiment two:
as shown in fig. 11, the present embodiment provides an electronic device, which is a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit;
the first electronic device includes:
a receiving unit 110, configured to receive first information sent by a second electronic device; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
a selecting unit 120, configured to select, according to the first information, a first interaction object associated with the first operation.
As shown in fig. 12, the first electronic device further includes:
an input unit 130, configured to receive a first input acting on a first designated area at a first moment; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; wherein the first designated area is an area accessible to the user in the interaction state at the first moment.
The execution unit 140 is configured to execute the first operation according to the first input and the first interactive object.
The input unit 130 is a human-computer interaction interface, and the specific structure of the input unit may include an entity control or a virtual control; the entity control can be an entity button or a keyboard; the virtual control may be a control formed by the display interaction unit through a display icon.
The specific structure of the execution unit 140 may also include a processor or a processing chip. The processor in the embodiment of the present invention may include an electronic component having a processing function, such as a central processing unit CPU, a digital signal processor DSP, or a programmable processor PLC.
In a specific implementation process, the execution unit 140 may also execute the first operation according to the first interactive object and a built-in preset operation parameter.
The electronic device described in this embodiment provides specific implementation hardware for the method described in the second embodiment of the method, and the electronic device can assist the user in interacting with the first electronic device, thereby solving the problem that the user is difficult to interact with the first electronic device in some interaction scenarios, and improving the user satisfaction.
In a specific implementation process, the first designated area may be any area of the first electronic device; in particular, it may be a partial area of the display interaction unit.
A first control may be fixedly disposed in the first designated area, and the first control may be an entity button or the like.
When the first designated area is a partial area of the display interaction unit, the display interaction unit may form a virtual control there in the form of an icon or the like. Since the display interaction unit is relatively large, which part of it serves as the first designated area can be determined by the processing unit according to user input received by the input unit earlier than the first moment; how exactly the first designated area is determined can be found in the corresponding method embodiments and is not repeated here. The specific structure of the processing unit may be a processor in the first electronic device.
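The following sketch illustrates, under assumptions, the variant in which the first designated area is taken to be the region within a first preset distance of an earlier touch (the second input); the distance value and the names are illustrative only and not taken from the original text.

import math
from typing import Tuple

FIRST_PRESET_DISTANCE = 0.15  # assumed value, in normalized screen units


def in_first_designated_area(candidate: Tuple[float, float],
                             second_input_pos: Tuple[float, float],
                             radius: float = FIRST_PRESET_DISTANCE) -> bool:
    # The first designated area is the disc of the given radius around the earlier touch.
    return math.dist(candidate, second_input_pos) <= radius


# A virtual first control could be drawn anywhere this returns True.
print(in_first_designated_area((0.82, 0.90), (0.85, 0.95)))  # True: reachable by the holding hand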
Device embodiment three:
as shown in fig. 11, the present embodiment provides an electronic device, which is a first electronic device including a display interaction unit; at least one interactive object is displayed on the display interactive unit;
the first electronic device includes:
a receiving unit 110, configured to receive first information sent by a second electronic device; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
a selecting unit 120, configured to select, according to the first information, a first interaction object associated with the first operation.
The display interaction unit 100 is further configured to display a first pointing identifier; and determining the position of the first pointing identifier according to the first information, and controlling the first pointing identifier to point to the first interactive object.
In this embodiment, the display interaction unit may further include an image processor, in addition to the display screen and the interaction sensor, where the image processor is configured to form the first pointing identifier and control the first pointing identifier to point to the first interaction object according to the first information.
For the specific structure of the image processor, reference may be made to the processing chips that form display signals for existing display screens; it is not elaborated further here.
In this embodiment, the display interaction unit displays the first pointing identifier so that the user can easily check whether the first electronic device has selected, according to the first information, the interaction object the user intends to point at; the user can then point at different first interaction objects by changing the relative positions of the first electronic device and the second electronic device, achieving the purpose of interacting with the first electronic device.
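For illustration, the sketch below maps a relative position (assumed here to be expressed as azimuth and elevation angles) to a cursor position for the first pointing identifier. The field-of-view figures, the angular encoding, and the names are assumptions introduced here, not part of the original disclosure.

def pointer_position(azimuth_deg: float, elevation_deg: float,
                     fov_h_deg: float = 40.0, fov_v_deg: float = 30.0):
    # Map angles to normalized display coordinates, clamped to the screen.
    x = 0.5 + azimuth_deg / fov_h_deg
    y = 0.5 - elevation_deg / fov_v_deg
    return min(max(x, 0.0), 1.0), min(max(y, 0.0), 1.0)


print(pointer_position(10.0, -6.0))  # first pointing identifier drawn at roughly (0.75, 0.70)

The display interaction unit could then snap the identifier to the nearest interactive object, as described above.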
Device embodiment four:
as shown in fig. 13, an embodiment of the present disclosure provides an electronic device, where the electronic device is a second electronic device, and the second electronic device is an electronic device worn by a user;
the second electronic device includes:
a detecting unit 210, configured to detect a relative position of the second electronic device and the first electronic device;
a forming unit 220 for forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
a sending unit 230, configured to send the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select the first interaction object associated with the first operation.
The second electronic device may specifically be an electronic device worn by a user, such as an earphone or smart glasses.
The specific structure of the detecting unit 210 may include a collector of signals such as infrared or laser signals; the exact structure depends on how the second electronic device determines its positional relationship with the first electronic device.
The specific structure of the forming unit 220 may include various types of processors, such as a CPU in a headset or a processing chip in smart glasses.
The second electronic device in this embodiment assists the first electronic device in interacting with the user, which solves the problem that the user finds it difficult to interact with the first electronic device in some special scenarios and improves user satisfaction.
Device embodiment five:
as shown in fig. 13, an embodiment of the present disclosure provides an electronic device, where the electronic device is a second electronic device, and the second electronic device is an electronic device worn by a user;
the second electronic device includes:
a detecting unit 210, configured to detect a relative position of the second electronic device and the first electronic device;
a forming unit 220 for forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
a sending unit 230, configured to send the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select the first interaction object associated with the first operation.
The detecting unit 210 is specifically configured to detect a first signal output by the first electronic device, and determine the relative position according to the first signal.
The first signal may be a signal of any form output by the first electronic device, such as an infrared signal or a laser signal. Detecting the first signal may mean detecting the incident angle of the first signal, the included angle between several first-signal beams, or simply whether the first signal is received.
On the basis of the previous device embodiment, this embodiment specifically defines how the second electronic device determines the positional relationship with the first electronic device, and has the advantages of a simple structure and convenient implementation.
Device embodiment six:
as shown in fig. 13, an embodiment of the present disclosure provides an electronic device, where the electronic device is a second electronic device, and the second electronic device is an electronic device worn by a user;
the second electronic device includes:
a detecting unit 210, configured to detect a relative position of the second electronic device and the first electronic device;
a forming unit 220 for forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
a sending unit 230, configured to send the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select the first interaction object associated with the first operation.
As shown in fig. 14, the detection unit 210 includes:
a transmitting module 211 for projecting a second signal to the first electronic device;
an acquisition module 212, configured to acquire a first image of the second signal projected onto the first electronic device;
a determining module 213 for determining the relative position from the first image.
The specific structure of the transmitting module 211 is a transmitter, such as a laser transmitter, an ultraviolet light transmitter, or an infrared light transmitter.
The acquisition module 212 may include a camera, a video camera, or the like.
The specific structure of the determining module 213 may be a processor or the like.
FIG. 15 is a schematic diagram of the second electronic device transmitting the second signal, where the second electronic device is a headset wearable by the user; the second electronic device is provided with a transmitting structure for transmitting the second signal.
On the basis of the fourth device embodiment, the structure of the detection unit and how the detection unit determines the position relationship between the second electronic device and the first electronic device are specifically defined in this embodiment, and the device has the advantages of being simple and convenient to implement and simple in structure.
In a specific implementation process, the second electronic device further includes a state switching unit, configured to switch between the second preset state and the third preset state; for how the switching is performed, reference may be made to the corresponding description in the method embodiments, which is not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may be separately used as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the above method embodiments may be completed by hardware under the control of program instructions; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (21)

1. An information processing method is applied to a first electronic device comprising a display interaction unit; at least one interactive object is displayed on the display interactive unit; the method comprises the following steps:
receiving first information sent by second electronic equipment; the first electronic equipment is handheld electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
selecting a first interaction object associated with a first operation according to the first information;
receiving a first input acting on a first designated area at a first moment; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; the first designated area is a partial area of the display interaction unit which is accessible to a hand of a user holding the first electronic device in an interaction state of holding the first electronic device by one hand;
and executing the first operation according to the first input and the first interactive object.
2. The method of claim 1,
the control on which the first input acts is an entity control or a virtual control;
and/or,
the second electronic equipment is the electronic equipment worn on the upper body of the user.
3. The method of claim 2,
the method further comprises the following steps:
receiving a second input at a second moment;
determining a first position where the second input acts on the display interaction unit;
determining an area within a first preset distance from the first position as the first designated area;
wherein the second moment is earlier than the first moment; and the relationship between the interaction state at the second moment and the interaction state at the first moment meets a first preset condition.
4. The method of claim 2,
the method further comprises the following steps:
detecting the interaction state of the user and the first electronic equipment at a third moment;
determining the first designated area according to the interaction state at the third moment and a first preset strategy;
wherein the third moment is earlier than the first moment;
and the relationship between the interactive state at the third moment and the interactive state at the first moment meets a first preset condition.
5. The method of claim 2,
the method further comprises the following steps:
forming, before the first moment, a first control within the first designated area for receiving the first input.
6. The method according to claim 1 or 2,
the display interaction unit is also provided with a first pointing identifier;
the method further comprises the following steps:
and determining the position of the first pointing identifier according to the first information, and controlling the first pointing identifier to point to the first interactive object.
7. The method according to claim 1 or 2,
before the receiving the first information sent by the second electronic device, the method further includes:
detecting whether the first electronic equipment is in a first preset state or not to form a detection result;
when the detection result shows that the first electronic equipment meets a first preset state, second information is sent to the second electronic equipment;
the second information is used for switching the second electronic equipment from a second preset state to a third preset state or maintaining the second electronic equipment in the third preset state;
and the power consumption of the second electronic equipment in the third preset state is greater than the power consumption of the second electronic equipment in the second preset state.
8. The method of claim 7,
after said selecting a first interactive object associated with a first operation in dependence on said first information, the method further comprises:
detecting whether the first electronic equipment is in a first preset state or not to form a detection result;
when the detection result shows that the first electronic equipment is not in a first preset state, third information is sent to the second electronic equipment;
the third information is used for switching the second electronic device from a third preset state to a second preset state or maintaining the second electronic device in the second preset state.
9. The method of claim 8,
when the detection result indicates that the first electronic device is in a first preset state, sending second information to the second electronic device specifically includes:
and when the detection result shows that the first electronic equipment is in a first preset state and the first information sent by the second electronic equipment is not received within a first specified time, sending second information to the second electronic equipment.
10. The method according to claim 1 or 2,
before receiving the first information sent by the second electronic device, the method further comprises:
outputting a first signal;
wherein the first information is formed based on the first signal.
11. An information processing method is applied to a second electronic device comprising a detection unit, wherein the second electronic device is an electronic device worn by a user; the detection unit is used for detecting the position relation between the second electronic equipment and the first electronic equipment; the first electronic equipment is handheld electronic equipment;
the method comprises the following steps:
detecting a relative position of the second electronic device and the first electronic device;
forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
sending the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select a first interaction object associated with a first operation; a first input acts on a first designated area of a display interaction unit of the handheld electronic equipment and is used for triggering the first electronic equipment to execute the first operation associated with the first interaction object; the first designated area is a partial area of the display interaction unit which is accessible to a hand of a user holding the first electronic device in an interaction state of holding the first electronic device with a single hand.
12. The method of claim 11,
the detecting the relative position of the second electronic device and the first electronic device comprises:
detecting a first signal output by the first electronic device;
determining the relative position from the first signal.
13. The method of claim 11,
the detecting the relative position of the second electronic device and the first electronic device comprises:
projecting a second signal to the first electronic device;
acquiring a first image of the second signal projected onto the first electronic device;
determining the relative position from the first image.
14. The method of claim 11, 12 or 13,
prior to the detecting the first signal output by the first electronic device, the method further comprises:
receiving second information sent by the first electronic equipment; the second information is sent when the first electronic equipment meets a first preset condition;
controlling the second electronic equipment to be switched from a second preset state to a third preset state or to be maintained in the third preset state according to the second information;
and the power consumption of the second electronic equipment in the third preset state is greater than the power consumption of the second electronic equipment in the second preset state.
15. The method of claim 14,
the method further comprises the following steps:
receiving third information sent by the first electronic equipment; the third information is sent when the first electronic equipment does not meet the first preset condition;
after the first information is sent to the first electronic device, the second electronic device is controlled to be switched from the third preset state to the second preset state or maintained in the second preset state according to the third information.
16. An electronic device is a first electronic device comprising a display interaction unit; at least one interactive object is displayed on the display interactive unit; the first electronic equipment is handheld electronic equipment;
the first electronic device includes:
the receiving unit is used for receiving first information sent by second electronic equipment; the second electronic equipment is electronic equipment worn by a user; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
the selecting unit is used for selecting a first interaction object associated with a first operation according to the first information;
an input unit for receiving a first input acting on a first designated area at a first time; the first input is used for triggering the first electronic device to execute the first operation associated with the first interactive object; the first designated area is a partial area of the display interaction unit which is accessible to a hand of a user holding the first electronic device in an interaction state of holding the first electronic device by one hand;
and the execution unit is used for executing the first operation according to the first input and the first interactive object.
17. The electronic device of claim 16,
the control on which the first input acts is an entity control or a virtual control;
and/or,
the second electronic equipment is the electronic equipment worn on the upper body of the user.
18. The electronic device of claim 16 or 17,
the display interaction unit is also used for displaying a first pointing identifier; and determining the position of the first pointing identifier according to the first information, and controlling the first pointing identifier to point to the first interactive object.
19. An electronic device is a second electronic device, and the second electronic device is an electronic device worn by a user; the first electronic equipment is handheld electronic equipment;
the second electronic device includes:
the detection unit is used for detecting the relative position of the second electronic equipment and the first electronic equipment;
a forming unit for forming first information according to the relative position; the first information is information representing the relative position of the second electronic equipment and the first electronic equipment;
a sending unit, configured to send the first information to the first electronic device;
the first information is used for providing basis for the first electronic equipment to select a first interaction object associated with a first operation; the first input acts on a first designated area of a display interaction unit of the first electronic equipment and is used for triggering the first electronic equipment to execute the first operation associated with a first interaction object; the first designated area is a partial area of the display interaction unit which is accessible to a hand of a user holding the first electronic device in an interaction state of holding the first electronic device with a single hand.
20. The electronic device of claim 19,
the detection unit is specifically configured to detect a first signal output by the first electronic device, and determine the relative position according to the first signal;
and/or the second electronic equipment is electronic equipment worn on the upper body of the user.
21. The electronic device of claim 20,
the detection unit includes:
a transmitting module for projecting a second signal to the first electronic device;
the acquisition module is used for acquiring a first image of the second signal projected onto the first electronic equipment;
a determination module for determining the relative position from the first image.
CN201410468082.2A 2014-09-15 2014-09-15 Information processing method and electronic equipment Active CN104536556B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410468082.2A CN104536556B (en) 2014-09-15 2014-09-15 Information processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410468082.2A CN104536556B (en) 2014-09-15 2014-09-15 Information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN104536556A CN104536556A (en) 2015-04-22
CN104536556B true CN104536556B (en) 2021-01-15

Family

ID=52852096

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410468082.2A Active CN104536556B (en) 2014-09-15 2014-09-15 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN104536556B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106161792B (en) * 2016-06-30 2020-02-21 联想(北京)有限公司 Control method, control device and electronic equipment
CN109271088A (en) * 2018-09-13 2019-01-25 广东小天才科技有限公司 Operation response method, electronic equipment and the storage medium of electronic equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1694043A (en) * 2004-04-29 2005-11-09 国际商业机器公司 System and method for selecting and activating a target object using a combination of eye gaze and key presses
CN101038504A (en) * 2006-03-16 2007-09-19 许丰 Manpower operating method, software and hardware device
CN101702055A (en) * 2009-11-18 2010-05-05 大连海事大学 Calibrating device for tracing aiming point of typoscope telescope
WO2012019322A1 (en) * 2010-08-13 2012-02-16 Xu Hong Input method, input system and input device of vision directing type mouse using monocular camera calibration technique
CN102782459A (en) * 2009-09-11 2012-11-14 诺沃迪吉特公司 Method and system for controlling a user interface of a device using human breath
CN102834789A (en) * 2010-04-16 2012-12-19 高通股份有限公司 Apparatus and methods for dynamically correlating virtual keyboard dimensions to user finger size
CN103412725A (en) * 2013-08-27 2013-11-27 广州市动景计算机科技有限公司 Touch operation method and device
CN103631483A (en) * 2013-11-27 2014-03-12 华为技术有限公司 Positioning method and positioning device
CN103763427A (en) * 2013-10-29 2014-04-30 小米科技有限责任公司 Hand-held device, one-hand control method and controller

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030142065A1 (en) * 2002-01-28 2003-07-31 Kourosh Pahlavan Ring pointer device with inertial sensors
CN101581990B (en) * 2008-05-13 2011-12-07 联想(北京)有限公司 Electronic equipment as well as wearable pointing device and method applied to same
US20100245268A1 (en) * 2009-03-30 2010-09-30 Stg Interactive S.A. User-friendly process for interacting with informational content on touchscreen devices
CN101692192A (en) * 2009-09-30 2010-04-07 北京邮电大学 Ring-type wireless mouse
US9542097B2 (en) * 2010-01-13 2017-01-10 Lenovo (Singapore) Pte. Ltd. Virtual touchpad for a touch device
WO2012120520A1 (en) * 2011-03-04 2012-09-13 Hewlett-Packard Development Company, L.P. Gestural interaction
CN102789312B (en) * 2011-12-23 2016-03-23 苏州触达信息技术有限公司 A kind of user interactive system and method
CN103885642A (en) * 2012-12-21 2014-06-25 鸿富锦精密工业(深圳)有限公司 Display control system and display control method
CN203299755U (en) * 2013-06-25 2013-11-20 邓小波 Cap type air mouse
CN203338321U (en) * 2013-07-10 2013-12-11 京东方科技集团股份有限公司 Mobile terminal
CN103440033B (en) * 2013-08-19 2016-12-28 中国科学院深圳先进技术研究院 A kind of method and apparatus realizing man-machine interaction based on free-hand and monocular cam
CN203482200U (en) * 2013-09-04 2014-03-12 沈阳理工大学 Wearable intelligent terminal
CN103777451B (en) * 2014-01-24 2015-11-11 京东方科技集团股份有限公司 Projection screen, remote terminal, projection arrangement, display device and optical projection system
CN103995621B (en) * 2014-04-28 2017-02-15 京东方科技集团股份有限公司 Wearable type touch control device and wearable type touch control method
CN103995592A (en) * 2014-05-21 2014-08-20 上海华勤通讯技术有限公司 Wearable equipment and terminal information interaction method and terminal
CN104007930B (en) * 2014-06-09 2015-11-25 努比亚技术有限公司 A kind of mobile terminal and realize the method and apparatus of one-handed performance
CN104007844B (en) * 2014-06-18 2017-05-24 原硕朋 Electronic instrument and wearable type input device for same

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1694043A (en) * 2004-04-29 2005-11-09 国际商业机器公司 System and method for selecting and activating a target object using a combination of eye gaze and key presses
CN101038504A (en) * 2006-03-16 2007-09-19 许丰 Manpower operating method, software and hardware device
CN102782459A (en) * 2009-09-11 2012-11-14 诺沃迪吉特公司 Method and system for controlling a user interface of a device using human breath
CN101702055A (en) * 2009-11-18 2010-05-05 大连海事大学 Calibrating device for tracing aiming point of typoscope telescope
CN102834789A (en) * 2010-04-16 2012-12-19 高通股份有限公司 Apparatus and methods for dynamically correlating virtual keyboard dimensions to user finger size
WO2012019322A1 (en) * 2010-08-13 2012-02-16 Xu Hong Input method, input system and input device of vision directing type mouse using monocular camera calibration technique
CN103412725A (en) * 2013-08-27 2013-11-27 广州市动景计算机科技有限公司 Touch operation method and device
CN103763427A (en) * 2013-10-29 2014-04-30 小米科技有限责任公司 Hand-held device, one-hand control method and controller
CN103631483A (en) * 2013-11-27 2014-03-12 华为技术有限公司 Positioning method and positioning device

Also Published As

Publication number Publication date
CN104536556A (en) 2015-04-22

Similar Documents

Publication Publication Date Title
EP3046017B1 (en) Unlocking method, device and terminal
EP2708983B9 (en) Method for auto-switching user interface of handheld terminal device and handheld terminal device thereof
JP6273404B2 (en) Charging method, apparatus, program, and recording medium
US11243657B2 (en) Icon display method, and apparatus
US10739994B2 (en) Method and electronic device for recognizing touch
EP3709147B1 (en) Method and apparatus for determining fingerprint collection region
US10310733B2 (en) Method and electronic device for recognizing touch
US9823779B2 (en) Method and device for controlling a head-mounted display by a terminal device
KR20150025385A (en) Mobile terminal and controlling method thereof
US20180101290A1 (en) Wearable device, touchscreen thereof, touch operation method thereof, and graphical user interface thereof
CN108769299B (en) Screen control method and device and mobile terminal
US20180224997A1 (en) Method and system for one-handed operations of mobile terminal
EP3764254B1 (en) Fingerprint unlocking method, and terminal
CN110413148B (en) False touch prevention detection method, device, equipment and storage medium
US10528248B2 (en) Method for providing user interface and electronic device therefor
US10159046B2 (en) Mobile terminal device
CN104536556B (en) Information processing method and electronic equipment
KR20140024565A (en) Dispaly device and wireless charging method using display device
KR102553573B1 (en) Electronic device and method for detecting touch input of the same
WO2022228097A1 (en) Display method, display apparatus and electronic device
CN106020673B (en) Control method and electronic equipment
US11079903B2 (en) Method and system for quick selection by intelligent terminal, and intelligent terminal
EP4206888A1 (en) Display method, graphical interface, and related apparatus
CN107924261B (en) Method for selecting text
WO2022237958A1 (en) Wearable electronic device, electronic device system and methods thereof

Legal Events

Date Code Title Description
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant