CN103677438B - Data processing method and electronic equipment - Google Patents

Data processing method and electronic equipment

Info

Publication number
CN103677438B
Authority
CN
China
Prior art keywords
touch
user
image
body information
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210333598.7A
Other languages
Chinese (zh)
Other versions
CN103677438A (en)
Inventor
贺志强
柴海新
付荣耀
陈柯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201210333598.7A
Publication of CN103677438A
Application granted
Publication of CN103677438B
Legal status: Active


Abstract

The invention discloses a data processing method and an electronic device. The method is applicable to an electronic device that comprises a camera and a touch screen, where the camera can capture body information of the user operating the device. The method comprises the following steps: when a touch operation on the touch screen is detected, determining the relative position, on the touch screen, of the touch point corresponding to the touch operation; acquiring a first image, captured by the camera, that contains body information of the current user; determining a spatial correspondence between the touch point and the body information of the current user according to the first image and the relative position; and determining, according to the spatial correspondence, the touch instruction that the touch point should trigger for the corresponding body information. With this scheme, the same touch point can trigger different touch instructions when touched by different user operation bodies, thereby improving the user experience.

Description

Data processing method and electronic equipment
Technical Field
The present invention relates to the field of electronic devices, and in particular, to a data processing method and an electronic device.
Background
With the development of science and technology, electronic products have become increasingly varied and make public life more convenient. Because they process information conveniently, quickly, and with little waste of resources, electronic products have become an indispensable part of people's work and daily life.
Electronic devices with touch screens can be operated through an operation body such as a finger or a stylus, which gives the user an excellent experience because no physical keys are needed. Prior-art touch screens based on capacitive sensing can support multi-touch with up to 50 simultaneous touch points.
In some application scenarios (for example, when a user plays a network game or uses musical-instrument software), the user may want a touch point on the touch screen to trigger one touch instruction when the touch operation is issued by the left hand and a different touch instruction when it is issued by the right hand, so that different operations can be completed and the user experience is improved. In other words, in specific application scenarios, the user wants different user operation bodies to trigger different touch instructions when they touch the same touch point.
Therefore, how to make the same touch point trigger different touch instructions for different user operation bodies, and thereby improve the user experience, is a problem worth solving.
Disclosure of Invention
To solve the above technical problem, embodiments of the present invention provide a data processing method and an electronic device, so that different user operation bodies trigger different touch instructions when touching the same touch point, thereby improving the user experience. The technical scheme is as follows:
In one aspect, an embodiment of the present invention provides a data processing method applied to an electronic device, where the electronic device includes a camera and a touch screen, and the camera can capture body information of the user operating the electronic device. The method includes:
when a touch operation by the user on the touch screen is detected, determining the relative position, on the touch screen, of the touch point corresponding to the touch operation;
acquiring a first image, captured by the camera, that contains body information of the current user;
determining, according to the first image and the relative position, a spatial correspondence between the touch point and the body information of the current user;
and determining, according to the spatial correspondence, the touch instruction that the touch point should trigger for the corresponding body information.
Further, the data processing method may further include:
executing the determined touch instruction corresponding to the touch point.
In one implementation, the body information comprises information on both hands of the user;
correspondingly, determining the spatial correspondence between the touch point and the body information of the current user according to the first image and the relative position includes:
acquiring the image position of the left hand and/or the right hand contained in the first image;
and determining, according to the image position and the relative position, whether the touch operation corresponding to the touch point is issued by the left hand or the right hand.
When the touch operation corresponding to a touch point is issued by the left hand of a first user, the touch point corresponds to a first touch instruction;
when the touch operation corresponding to the touch point is issued by the right hand of the first user, the touch point corresponds to a second touch instruction;
wherein the first touch instruction is different from the second touch instruction.
In another implementation, the body information comprises information on the torso and both hands of the user;
correspondingly, determining the spatial correspondence between the touch point and the body information of the current user according to the first image and the relative position includes:
acquiring the image position of the left hand and/or the right hand of the user contained in the first image;
and determining, according to the image position and the relative position, the user to which the touch operation corresponding to the touch point belongs and/or whether it is issued by that user's left hand or right hand.
In one touch operation, when the touch operations corresponding to a first touch point and a second touch point are both issued by a first user, the first touch point corresponds to a first touch instruction and the second touch point corresponds to a second touch instruction;
when the touch operation corresponding to the first touch point is issued by the first user and the touch operation corresponding to the second touch point is issued by a second user, the first touch point corresponds to a third touch instruction and the second touch point corresponds to a fourth touch instruction;
wherein the first touch instruction is different from the third touch instruction, and the second touch instruction is different from the fourth touch instruction.
In another aspect, an embodiment of the present invention further provides an electronic device, including a camera and a touch screen, where the camera is capable of capturing body information of the user operating the electronic device. The electronic device further includes:
a relative position determining module, configured to determine, when a touch operation by the user on the touch screen is detected, the relative position, on the touch screen, of the touch point corresponding to the touch operation;
a first image acquisition module, configured to acquire a first image, captured by the camera, that contains body information of the current user;
a spatial correspondence establishing module, configured to determine, according to the first image and the relative position, a spatial correspondence between the touch point and the body information of the current user;
and a touch instruction determining module, configured to determine, according to the spatial correspondence, the touch instruction that the touch point should trigger for the corresponding body information.
Further, the electronic device may further include:
a touch instruction execution module, configured to execute the determined touch instruction corresponding to the touch point.
In one implementation, the camera in the electronic device can capture information on both hands of the user operating the electronic device;
correspondingly, the spatial correspondence establishing module includes:
a first image position acquiring unit, configured to acquire the image position of the left hand and/or the right hand contained in the first image;
and a first spatial correspondence establishing unit, configured to determine, according to the image position and the relative position, whether the touch operation corresponding to the touch point is issued by the left hand or the right hand.
In another implementation, the camera in the electronic device can capture information on the torso and both hands of the user operating the electronic device;
correspondingly, the spatial correspondence establishing module includes:
a second image position acquiring unit, configured to acquire the image position of the left hand and/or the right hand of each user contained in the first image;
and a second spatial correspondence establishing unit, configured to determine, according to the image position and the relative position, the user to which the touch operation corresponding to the touch point belongs and/or whether it is issued by that user's left hand or right hand.
According to the technical scheme provided by the embodiments of the present invention, when a touch operation by the user on the touch screen is detected, the relative position, on the touch screen, of the touch point corresponding to the touch operation is determined; a first image, captured by the camera, that contains body information of the current user is acquired; a spatial correspondence between the touch point and the body information of the current user is determined according to the first image and the relative position; and the touch instruction that the touch point should trigger for the corresponding body information is then determined according to the spatial correspondence. In this scheme, the user operation body that issued the touch operation corresponding to a touch point is identified from the relative position of the touch point on the touch screen and the first image containing body information, and the touch instruction for that touch point is determined accordingly. As a result, the same touch point triggers different touch instructions when touched by different user operation bodies, which improves the user experience.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in describing the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a first flowchart of a data processing method according to an embodiment of the present invention;
fig. 2 is a second flowchart of a data processing method according to an embodiment of the present invention;
fig. 3 is a third flowchart of a data processing method according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings of the embodiments. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present invention.
To enable different user operation bodies to trigger different touch instructions when touching the same touch point, and thereby improve the user experience, embodiments of the present invention provide a data processing method and an electronic device.
First, a data processing method provided in an embodiment of the present invention is described below.
It should be noted that the data processing method provided by the embodiment of the present invention is applicable to an electronic device, where the electronic device includes a camera and a touch screen, and the camera can capture body information of a user operating the electronic device. In practical applications, the electronic device may be a mobile phone with a touch screen and a camera, a PAD, or the like.
It can be understood that, when the user operation bodies to be distinguished are the left hand and the right hand of the same user, the body information may include information on both hands of the user; when the user operation bodies to be distinguished are different users, or the left and right hands of different users, the body information may include information on the torso and both hands of the users.
As shown in Fig. 1, the data processing method may include:
S101: when a touch operation by the user on the touch screen is detected, determine the relative position, on the touch screen, of the touch point corresponding to the touch operation.
When the user performs a touch operation on the electronic device, the device detects the touch operation and determines the relative position, on the touch screen, of the corresponding touch point.
S102: acquire a first image, captured by the camera, that contains body information of the current user.
When the touch operation on the touch screen is detected, a first image containing body information of the current user can be obtained from the camera and used in the subsequent steps.
It can be understood that, for the camera to capture the body information of the user operating the electronic device regardless of how the device is placed, the camera's field of view needs to cover the whole touch screen and an area around it large enough to include the current user's hands and torso, or at least the user's hands, while the device is being operated.
It should be noted that, when a touch operation on the touch screen is detected, the execution order of the step of determining the relative position of the touch point on the touch screen and the step of acquiring the first image containing the user's body information is not limited by this embodiment: the two steps may also be performed simultaneously, or the first image may be acquired before the relative position is determined.
S103: determine, according to the first image and the relative position, a spatial correspondence between the touch point and the body information of the current user.
After the first image containing the current user's body information and the relative position, on the touch screen, of the touch point corresponding to the current touch operation have been acquired, the spatial correspondence between the touch point and the current user's body information can be determined from them; that is, the user operation body that issued the touch operation corresponding to each touch point is identified.
For example, when a user operates the touch screen and the touch operation corresponds to two touch points, the acquired first image can be analyzed to determine the image positions of the left hand and/or the right hand contained in it, and the hand that issued the touch operation for each touch point can then be determined from the relative positions of the two touch points on the touch screen and the image positions of the left and/or right hand, thereby establishing the spatial correspondence between the touch points and the current user's body information, as sketched below.
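The patent does not prescribe a particular matching algorithm. Purely as an illustration of how such a spatial correspondence could be built, the following Python sketch pairs each touch point with the nearest detected hand; the function name, the normalized coordinate convention, and the nearest-hand rule are assumptions of the sketch, not part of the described method.

```python
from math import hypot

def match_contacts_to_hands(contacts, hands):
    """Pair each touch point with the nearest detected hand.

    contacts: {contact_id: (x, y)} -- relative positions on the touch screen,
              normalized to [0, 1].
    hands:    {"left": (x, y), "right": (x, y)} -- image positions of the
              detected hands, assumed to have been mapped into the same
              normalized screen coordinates beforehand.
    Returns {contact_id: "left" | "right"}, i.e. the spatial correspondence
    between touch points and the current user's body information.
    """
    correspondence = {}
    for cid, (cx, cy) in contacts.items():
        # Nearest-hand heuristic: attribute the touch to whichever hand's
        # image position lies closest to the touch point.
        correspondence[cid] = min(
            hands, key=lambda h: hypot(hands[h][0] - cx, hands[h][1] - cy)
        )
    return correspondence

# Two touch points; the left hand is detected near the left edge of the screen.
print(match_contacts_to_hands(
    contacts={1: (0.2, 0.8), 2: (0.7, 0.8)},
    hands={"left": (0.25, 0.9), "right": (0.75, 0.9)},
))  # {1: 'left', 2: 'right'}
```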
S104: determine, according to the spatial correspondence, the touch instruction that the touch point should trigger for the corresponding body information.
After the spatial correspondence between the touch point and the current user's body information has been determined, the touch instruction that the touch point should trigger for that body information can be determined. It should be noted that a mapping from body information to the touch instruction of a touch point is preset, so once the body information associated with the touch operation on a touch point is known, the touch instruction required by that touch point can be looked up.
Furthermore, after the touch instruction of the touch point has been determined for the corresponding body information, that touch instruction may be executed to complete the corresponding touch operation, as in the end-to-end sketch below.
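Taken together, steps S101 to S104 amount to: read the touch positions, analyze the first image, build the correspondence, and look the instruction up in the preset mapping. The sketch below wires these steps together under the same assumptions as the previous sketch; the keys and values of `instruction_map` are hypothetical placeholders for whatever mapping the device is actually configured with.

```python
def resolve_touch_instructions(contacts, hands, instruction_map):
    """Illustrative glue for S101-S104: map each touch point to an instruction.

    contacts:        {contact_id: (x, y)} -- S101, normalized touch-point
                     positions on the touch screen.
    hands:           {"left": (x, y), "right": (x, y)} -- hand positions
                     derived from the first image of S102 (the detection
                     itself is assumed to happen elsewhere).
    instruction_map: preset {(contact_id, hand): instruction_name}, standing
                     in for the mapping that the description says is
                     configured in advance.
    """
    # S103: spatial correspondence (reusing the matcher sketched earlier).
    correspondence = match_contacts_to_hands(contacts, hands)
    # S104: look up the instruction each touch point should trigger.
    return {cid: instruction_map.get((cid, hand))
            for cid, hand in correspondence.items()}

# Example: the same on-screen key triggers different instructions per hand.
preset = {(1, "left"): "play_bass_note", (1, "right"): "play_treble_note"}
print(resolve_touch_instructions(
    contacts={1: (0.5, 0.5)},
    hands={"left": (0.3, 0.6), "right": (0.8, 0.6)},
    instruction_map=preset,
))  # {1: 'play_bass_note'} -- the left hand is nearer to this touch point
```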
According to the technical scheme provided by the embodiment of the present invention, when a touch operation by the user on the touch screen is detected, the relative position, on the touch screen, of the touch point corresponding to the touch operation is determined; a first image, captured by the camera, that contains body information of the current user is acquired; a spatial correspondence between the touch point and the body information of the current user is determined according to the first image and the relative position; and the touch instruction that the touch point should trigger for the corresponding body information is then determined according to the spatial correspondence. In this scheme, the user operation body that issued the touch operation corresponding to a touch point is identified from the relative position of the touch point on the touch screen and the first image containing body information, and the touch instruction for that touch point is determined accordingly. As a result, the same touch point triggers different touch instructions when touched by different user operation bodies, which improves the user experience.
The data processing method provided by the embodiment of the present invention is described below for the case where the body information is information on both hands of the user.
It should be noted that the data processing method provided by the embodiment of the present invention is applicable to an electronic device, where the electronic device includes a camera and a touch screen, and the camera can capture body information of a user operating the electronic device. In practical applications, the electronic device may be a mobile phone with a touch screen and a camera, a PAD, or the like.
As shown in Fig. 2, the data processing method may include:
S201: when a touch operation by the user on the touch screen is detected, determine the relative position, on the touch screen, of the touch point corresponding to the touch operation.
When the user performs a touch operation on the electronic device, the device detects the touch operation and determines the relative position, on the touch screen, of the corresponding touch point.
S202: acquire a first image, captured by the camera, that contains information on both hands of the current user.
When the touch operation on the touch screen is detected, a first image containing information on both hands of the current user can be obtained from the camera and used in the subsequent steps.
It can be understood that, for the camera to capture information on both hands of the user operating the electronic device regardless of how the device is placed, the camera's field of view needs to cover the whole touch screen and an area around it large enough to include the current user's hands while the device is being operated.
It should be noted that, when a touch operation on the touch screen is detected, the execution order of the step of determining the relative position of the touch point on the touch screen and the step of acquiring the first image containing information on the user's hands is not limited by this embodiment: the two steps may also be performed simultaneously, or the first image may be acquired before the relative position is determined.
S203, acquiring the image position of the left hand and/or the right hand in the first image, wherein the image position is contained in the first image;
s204, determining that the contact point is sent out by a left hand/a right hand corresponding to the touch operation according to the image position and the relative position;
after the first image containing the information of both hands of the user is acquired, it can be determined through an image analysis algorithm that the body information of the user contained in the first image is: and determining the image position of the left hand and/or the right hand contained in the first image by the left hand, the right hand or the left hand and the right hand.
After the image position of the left hand and/or the right hand in the first image and the relative position of the touch point on the touch screen are obtained, it can be determined that the touch point corresponds to the touch operation and is sent by the left hand/the right hand according to the spatial corresponding relationship between the image position and the relative position.
For example: under the situation that a user respectively sends touch operations to a touch screen through a left hand and a right hand and each touch operation corresponds to a contact on the touch screen, when the electronic equipment detects that the user sends the touch operations, the relative positions of the two contacts on the touch screen are respectively determined, and a first image which is acquired by a camera and contains information of both hands of the user is acquired; and then image positions of the left hand and the right hand in the first image, which are contained in the first image, are obtained, finally, according to the image positions and the relative positions, the fact that the touch operation corresponding to the two touch points is sent out by the left hand/the right hand is determined, and the construction of the spatial corresponding relation between the touch points and the body information of the current user is completed.
S205: determine the touch instruction that the touch point should trigger for the left hand or the right hand.
Once it has been determined whether the touch operation corresponding to the touch point was issued by the user's left hand or right hand, the touch instruction corresponding to the touch point can be determined.
It should be noted that a mapping between the user's two hands and the touch instruction of a touch point is preset; once it is known whether the touch operation on the touch point was issued by the left hand or the right hand, the corresponding touch instruction can be looked up.
That is, when the touch operation corresponding to a touch point is issued by the left hand of a first user, the touch point corresponds to a first touch instruction;
when the touch operation corresponding to the touch point is issued by the right hand of the first user, the touch point corresponds to a second touch instruction;
wherein the first touch instruction is different from the second touch instruction. A minimal configuration sketch of such an entry follows.
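Rendered as configuration, this rule is simply a pair of distinct instructions per touch point. The Python below makes the "must differ" constraint explicit; the class and field names are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HandInstructionPair:
    """Preset instructions for one touch point: one per operating hand."""
    left_hand: str    # first touch instruction
    right_hand: str   # second touch instruction

    def __post_init__(self):
        # The rule requires the two instructions to differ.
        if self.left_hand == self.right_hand:
            raise ValueError("left- and right-hand instructions must differ")

    def for_hand(self, hand: str) -> str:
        return self.left_hand if hand == "left" else self.right_hand

# Example: in instrument software the same key region plays different voices.
key_c4 = HandInstructionPair(left_hand="play_chord_c", right_hand="play_note_c")
print(key_c4.for_hand("right"))  # play_note_c
```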
Furthermore, after the touch instruction of the touch point has been determined for the left hand or the right hand, that touch instruction may be executed to complete the corresponding touch operation.
In this scheme, the relative position of the touch point on the touch screen and the first image containing information on both hands are used to determine whether the touch operation on the touch point was issued by the user's left hand or right hand, and the touch instruction for the touch point is determined accordingly. As a result, the same touch point triggers different touch instructions for different user operation bodies, which improves the user experience.
The data processing method provided by the embodiment of the present invention is described below for the case where the body information is information on the user's torso and both hands.
It should be noted that the data processing method provided by the embodiment of the present invention is applicable to an electronic device, where the electronic device includes a camera and a touch screen, and the camera can capture body information of a user operating the electronic device. In practical applications, the electronic device may be a mobile phone with a touch screen and a camera, a PAD, or the like.
As shown in Fig. 3, the data processing method may include:
S301: when a touch operation by the user on the touch screen is detected, determine the relative position, on the touch screen, of the touch point corresponding to the touch operation.
When the user performs a touch operation on the electronic device, the device detects the touch operation and determines the relative position, on the touch screen, of the corresponding touch point.
S302: acquire a first image, captured by the camera, that contains information on the torso and both hands of the current user.
When the touch operation on the touch screen is detected, a first image containing information on the torso and both hands of the current user can be obtained from the camera and used in the subsequent steps.
It can be understood that, for the camera to capture information on the torso and hands of the user operating the electronic device regardless of how the device is placed, the camera's field of view needs to cover the whole touch screen and an area around it large enough to include at least part of the torso and the hands of the user currently operating the device.
It should be noted that, when a touch operation on the touch screen is detected, the execution order of the step of determining the relative position of the touch point on the touch screen and the step of acquiring the first image containing information on the user's torso and hands is not limited by this embodiment: the two steps may also be performed simultaneously, or the first image may be acquired before the relative position is determined.
S303, acquiring the image position of the left hand and/or the right hand of the user in the first image, wherein the left hand and/or the right hand of the user is contained in the first image;
s304, determining the left hand/right hand of the user and/or the user to which the touch operation belongs to the touch point according to the image position and the relative position;
after a first image containing the torso and the two-hand information of the user is obtained, the number and the body information of the user contained in the first image can be determined through an image analysis algorithm, and then the image position of the left hand and/or the right hand of the user in the first image is determined.
After the image positions of the left hand and/or the right hand of the user in the first image and the relative positions of the touch points on the touch screen are obtained, the left hand/the right hand of the user and/or the touch points corresponding to the touch operation can be determined according to the corresponding relationship between the image positions and the relative positions. That is, according to the image position and the relative position, it is determined whether the users who send out the touch operation corresponding to the contact point are the same, and/or it is determined that the touch operation corresponding to the contact point is sent out by the left hand/right hand of the user.
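The description leaves the image analysis itself open. As one illustrative possibility (an assumption of this sketch, not the claimed technique), each detected hand could be attributed to the user whose torso centroid in the first image lies closest to it, after which it is easy to check whether two touch operations come from the same user.

```python
from math import hypot

def assign_hands_to_users(hand_positions, torso_centroids):
    """Attribute each detected hand to the nearest torso in the first image.

    hand_positions:  {hand_id: (x, y)} -- image positions of detected hands.
    torso_centroids: {user_id: (x, y)} -- image positions of detected torsos.
    Returns {hand_id: user_id}.
    """
    return {
        hid: min(
            torso_centroids,
            key=lambda uid: hypot(torso_centroids[uid][0] - hx,
                                  torso_centroids[uid][1] - hy),
        )
        for hid, (hx, hy) in hand_positions.items()
    }

# Two hands detected in the image, two users sitting side by side.
owners = assign_hands_to_users(
    hand_positions={"hand_a": (0.2, 0.7), "hand_b": (0.8, 0.7)},
    torso_centroids={"user_1": (0.25, 0.4), "user_2": (0.75, 0.4)},
)
print(owners)                                # {'hand_a': 'user_1', 'hand_b': 'user_2'}
print(owners["hand_a"] == owners["hand_b"])  # False: the touches belong to different users
```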
For example, when the electronic device detects that touch operations have been issued, it determines the relative positions of the two touch points on the touch screen and acquires a first image, captured by the camera, containing information on the users' torsos and hands; it then obtains the image position of each user's left hand and/or right hand contained in the first image, and finally determines from the image positions and the relative positions that the touch operations belong to different users, thereby completing the construction of the spatial correspondence between the touch points and the current users' body information.
S305: determine the touch instruction that the touch point should trigger for the corresponding user's left hand or right hand.
Once the user to which the touch operation corresponding to the touch point belongs, and/or the hand of that user, has been determined, the touch instruction that the touch point should trigger for that user's left hand or right hand can be determined.
It should be noted that mappings between the hands of different users and the touch instructions of the touch points are preset; once it is known which user's left or right hand issued the touch operation on a touch point, the touch instruction required by that touch point can be looked up.
That is, in one touch operation, when the touch operations corresponding to a first touch point and a second touch point are both issued by a first user, the first touch point corresponds to a first touch instruction and the second touch point corresponds to a second touch instruction;
when the touch operation corresponding to the first touch point is issued by the first user and the touch operation corresponding to the second touch point is issued by a second user, the first touch point corresponds to a third touch instruction and the second touch point corresponds to a fourth touch instruction;
wherein the first touch instruction is different from the third touch instruction, and the second touch instruction is different from the fourth touch instruction. A minimal sketch of such a rule follows.
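Assuming the four instructions have been configured in advance, the rule could be looked up as below; all names are illustrative placeholders.

```python
def instructions_for_pair(first_user, second_user, preset):
    """Pick the instructions for two touch points based on whether their users match.

    first_user, second_user: the users that issued the two touch operations.
    preset: {"same_user": (first, second), "different_users": (third, fourth)}
    Returns the (first touch point, second touch point) instruction pair.
    """
    key = "same_user" if first_user == second_user else "different_users"
    return preset[key]

preset = {
    "same_user": ("first_instruction", "second_instruction"),
    "different_users": ("third_instruction", "fourth_instruction"),
}
print(instructions_for_pair("user_1", "user_1", preset))  # same user
print(instructions_for_pair("user_1", "user_2", preset))  # different users
```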
Furthermore, after the touch instruction of the touch point has been determined for the corresponding user's left or right hand, that touch instruction may be executed to complete the corresponding touch operation.
In this scheme, the relative position of the touch point on the touch screen and the first image containing information on the users' torsos and hands are used to determine the user, and/or the hand of the user, that issued the touch operation on the touch point, and the touch instruction for the touch point is determined accordingly. As a result, the same touch point triggers different touch instructions for different user operation bodies, which improves the user experience.
Through the above description of the method embodiments, those skilled in the art can clearly understand that the present invention can be implemented by software together with a necessary general-purpose hardware platform, or by hardware alone, although the former is in many cases the better implementation. Based on this understanding, the technical solutions of the present invention may be embodied as a software product stored in a storage medium, which includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present invention. The storage medium includes any medium that can store program code, such as read-only memory (ROM), random access memory (RAM), magnetic disks, or optical discs.
Corresponding to the above method embodiments, an embodiment of the present invention further provides an electronic device that includes a camera and a touch screen, where the camera is capable of capturing body information of the user operating the electronic device. As shown in Fig. 4, the electronic device may include:
a relative position determining module 110, configured to determine, when a touch operation by the user on the touch screen is detected, the relative position, on the touch screen, of the touch point corresponding to the touch operation;
a first image acquisition module 120, configured to acquire a first image, captured by the camera, that contains body information of the current user;
a spatial correspondence establishing module 130, configured to determine, according to the first image and the relative position, a spatial correspondence between the touch point and the body information of the current user;
and a touch instruction determining module 140, configured to determine, according to the spatial correspondence, the touch instruction that the touch point should trigger for the corresponding body information.
According to the electronic device provided by the embodiment of the present invention, when a touch operation by the user on the touch screen is detected, the relative position, on the touch screen, of the touch point corresponding to the touch operation is determined; a first image, captured by the camera, that contains body information of the current user is acquired; a spatial correspondence between the touch point and the body information of the current user is determined according to the first image and the relative position; and the touch instruction that the touch point should trigger for the corresponding body information is then determined according to the spatial correspondence. In this scheme, the user operation body that issued the touch operation corresponding to a touch point is identified from the relative position of the touch point on the touch screen and the first image containing body information, and the touch instruction for that touch point is determined accordingly. As a result, the same touch point triggers different touch instructions when touched by different user operation bodies, which improves the user experience.
Still further, the electronic device may further include:
a touch instruction execution module, configured to execute the determined touch instruction corresponding to the touch point.
In one implementation, the camera can capture information on both hands of the user operating the electronic device;
correspondingly, the spatial correspondence establishing module 130 may include:
a first image position acquiring unit, configured to acquire the image position of the left hand and/or the right hand contained in the first image;
and a first spatial correspondence establishing unit, configured to determine, according to the image position and the relative position, whether the touch operation corresponding to the touch point is issued by the left hand or the right hand.
In another embodiment of the invention, the camera can capture information on the torso and both hands of the user operating the electronic device;
correspondingly, the spatial correspondence establishing module 130 may include:
a second image position acquiring unit, configured to acquire the image position of the left hand and/or the right hand of each user contained in the first image;
and a second spatial correspondence establishing unit, configured to determine, according to the image position and the relative position, the user to which the touch operation corresponding to the touch point belongs and/or whether it is issued by that user's left hand or right hand. A structural sketch of how these modules might fit together is given below.
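For orientation only, modules 110 to 140 and the second image position / correspondence units could be arranged roughly as in the following structural sketch. The touch-screen and camera drivers, the detector calls, and the (user, hand) keying of the preset mapping are hypothetical stand-ins rather than the device's actual interfaces; the illustrative helpers from the earlier sketches are reused.

```python
class ElectronicDevice:
    """Structural skeleton mirroring modules 110-140 of Fig. 4 (illustrative)."""

    def __init__(self, touch_screen, camera, instruction_map):
        self.touch_screen = touch_screen        # hypothetical touch-screen driver
        self.camera = camera                    # hypothetical camera driver
        self.instruction_map = instruction_map  # preset {(user, hand): instruction}

    # Module 110: relative position determining module.
    def relative_positions(self, touch_event):
        return self.touch_screen.positions(touch_event)

    # Module 120: first image acquisition module.
    def acquire_first_image(self):
        return self.camera.capture()

    # Module 130: spatial correspondence establishing module, built from the
    # second image position acquiring unit and the second correspondence unit.
    def establish_correspondence(self, first_image, contacts):
        hands = self.camera.detect_hands(first_image)       # hypothetical detector
        torsos = self.camera.detect_torsos(first_image)     # hypothetical detector
        owners = assign_hands_to_users(hands, torsos)       # see earlier sketch
        matched = match_contacts_to_hands(contacts, hands)  # see earlier sketch
        return {cid: (owners[hand], hand) for cid, hand in matched.items()}

    # Module 140: touch instruction determining module.
    def determine_instructions(self, correspondence):
        return {cid: self.instruction_map.get(key)
                for cid, key in correspondence.items()}
```

A touch instruction execution module would then simply invoke the instructions returned by determine_instructions.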
Because the device and system embodiments substantially correspond to the method embodiments, reference may be made to the method embodiments for parts of their description. The device and system embodiments described above are merely illustrative: units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over multiple network nodes. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of each embodiment. Those of ordinary skill in the art can understand and implement this without creative effort.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways without departing from the spirit and scope of the application. The described embodiments are merely examples and should not be taken as limiting. For example, the division into units or sub-units is only a division by logical function; in actual implementation there may be other divisions, for example several units or sub-units may be combined. In addition, various elements or components may be combined, integrated into another system, or omitted.
Additionally, the systems, apparatus, and methods described, as well as the illustrations of various embodiments, may be combined or integrated with other systems, modules, techniques, or methods without departing from the scope of the present application. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The foregoing is directed to embodiments of the present invention, and it is understood that various modifications and improvements can be made by those skilled in the art without departing from the spirit of the invention.

Claims (6)

1. A data processing method applied to an electronic device, the electronic device comprising a camera and a touch screen, wherein the camera can capture body information of a user operating the electronic device, the method comprising:
when a touch operation by the user on the touch screen is detected, determining the relative position, on the touch screen, of the touch point corresponding to the touch operation;
acquiring a first image, captured by the camera, that contains body information of the user, the body information comprising information on the torso and both hands of the user;
determining, according to the first image and the relative position, a spatial correspondence between the touch point and the body information of the user, which comprises: acquiring the image position of the left hand and/or the right hand of the user contained in the first image; and determining, according to the image position and the relative position, the user to which the touch operation corresponding to the touch point belongs and/or whether it is issued by that user's left hand or right hand;
and determining, according to the spatial correspondence, the touch instruction that the touch point should trigger for the corresponding body information.
2. The method of claim 1, further comprising:
executing the determined touch instruction corresponding to the touch point.
3. The method according to claim 1, wherein when a touch operation corresponding to a touch point is issued by a left hand of a first user, the touch point corresponds to a first touch instruction;
when the touch operation corresponding to the touch point is issued by the right hand of the first user, the touch point corresponds to a second touch instruction;
wherein the first touch instruction is different from the second touch instruction.
4. The method according to claim 1, wherein in one touch operation, when the touch operations corresponding to a first touch point and a second touch point are both issued by a first user, the first touch point corresponds to a first touch instruction, and the second touch point corresponds to a second touch instruction;
when the touch operation corresponding to the first touch point is issued by the first user and the touch operation corresponding to the second touch point is issued by a second user, the first touch point corresponds to a third touch instruction, and the second touch point corresponds to a fourth touch instruction;
wherein the first touch instruction is different from the third touch instruction, and the second touch instruction is different from the fourth touch instruction.
5. An electronic device, comprising a camera and a touch screen, wherein the camera can capture body information of a user operating the electronic device, the electronic device further comprising:
a relative position determining module, configured to determine, when a touch operation by the user on the touch screen is detected, the relative position, on the touch screen, of the touch point corresponding to the touch operation;
a first image acquisition module, configured to acquire a first image, captured by the camera, that contains body information of the user, the body information comprising information on the torso and both hands of the user;
a spatial correspondence establishing module, configured to determine, according to the first image and the relative position, a spatial correspondence between the touch point and the body information of the user;
a touch instruction determining module, configured to determine, according to the spatial correspondence, the touch instruction that the touch point should trigger for the corresponding body information;
wherein the spatial correspondence establishing module comprises:
a second image position acquiring unit, configured to acquire the image position of the left hand and/or the right hand of the user contained in the first image;
and a second spatial correspondence establishing unit, configured to determine, according to the image position and the relative position, the user to which the touch operation corresponding to the touch point belongs and/or whether it is issued by that user's left hand or right hand.
6. The electronic device of claim 5, further comprising:
a touch instruction execution module, configured to execute the determined touch instruction corresponding to the touch point.
CN201210333598.7A — filed 2012-09-10, priority 2012-09-10 — Data processing method and electronic equipment — Active — granted as CN103677438B

Priority Applications (1)

CN201210333598.7A (granted as CN103677438B) — priority date 2012-09-10 — filing date 2012-09-10 — Data processing method and electronic equipment


Publications (2)

CN103677438A — published 2014-03-26
CN103677438B — granted 2020-02-21

Family

ID=50315212

Family Applications (1)

CN201210333598.7A (Active; granted as CN103677438B) — priority date 2012-09-10 — filing date 2012-09-10 — Data processing method and electronic equipment

Country Status (1)

CN — CN103677438B

Families Citing this family (1)

* Cited by examiner, † Cited by third party
CN108594995A * — priority 2018-04-13, published 2018-09-28 — 广东小天才科技有限公司 — A kind of electronic device method and electronic equipment based on gesture identification

Citations (3)

* Cited by examiner, † Cited by third party
CN101689096A * — priority 2007-07-26, published 2010-03-31 — 诺基亚公司 — An apparatus, method, computer program and user interface for enabling access to functions
CN102027439A * — priority 2008-05-12, published 2011-04-20 — 夏普株式会社 — Display device and control method
CN102622136A * — priority 2012-02-29, published 2012-08-01 — 广东威创视讯科技股份有限公司 — Multipoint touch system data processing method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
US20130201155A1 * — priority 2010-08-12, published 2013-08-08 — Genqing Wu — Finger identification on a touchscreen


Also Published As

CN103677438A — published 2014-03-26

Similar Documents

Publication Publication Date Title
CN104335561B (en) The communication that bio-identification is initiated
KR102041984B1 (en) Mobile apparatus having function of face recognition with additional component
US9535576B2 (en) Touchscreen apparatus user interface processing method and touchscreen apparatus
CN109840061A (en) The method and electronic equipment that control screen is shown
CN107562338B (en) Picture inspection method, device, electronic equipment and storage medium
US20140002396A1 (en) Information processing method, information processing apparatus and electronic device
CN113792277A (en) Method and device for displaying application and picture and electronic equipment
CN109062464B (en) Touch operation method and device, storage medium and electronic equipment
CN104714731A (en) Display method and device for terminal interface
CN102117165A (en) Touch input processing method and mobile terminal
CN103389850B (en) A kind of method and device realizing prompt operation on a web browser
CN112068698A (en) Interaction method and device, electronic equipment and computer storage medium
CN102710846A (en) System and method for realizing electronic book page turning based on forced induction
CN108073380A (en) Electronic device, display control method and related product
CN108595044A (en) A kind of control method and terminal of touch screen
CN104077065A (en) Method for displaying virtual keyboard by touch screen terminal and touch screen terminal
CN103870170A (en) Cursor control method, terminals and system
CN104182161A (en) Method and device for opening screen functional area
CN103488424A (en) Method and device for displaying information
CN108108078A (en) Electronic equipment, display control method and Related product
CN108196781A (en) The display methods and mobile terminal at interface
US9122390B2 (en) Method, application and/or service to collect more fine-grained or extra event data from a user sensor device
CN105829998B (en) Device is tied to calculating equipment
CN106201307A (en) terminal control method, terminal
CN105701383B (en) A kind of function triggering method, device and terminal

Legal Events

PB01 — Publication
C10 — Entry into substantive examination
SE01 — Entry into force of request for substantive examination
GR01 — Patent grant