CN107589834B - Terminal device operation method and device, and terminal device


Info

Publication number: CN107589834B
Authority: CN (China)
Prior art keywords: hand, terminal device, user, model
Legal status: Active (granted)
Application number: CN201710677517.8A
Other languages: Chinese (zh)
Other versions: CN107589834A
Inventor: 周意保
Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd; priority to CN201710677517.8A
Publication of application CN107589834A; application granted and published as CN107589834B

Classifications

  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a terminal device operation method and apparatus, and a terminal device. The method comprises the following steps: projecting structured light onto the users around the terminal device using a structured light device, and acquiring hand 3D models of the surrounding users; identifying the hand 3D models of the surrounding users to obtain, among them, the hand 3D model of the terminal device user and the position range in which the user's hand is located; projecting structured light onto the position range of the terminal device user's hand to obtain a hand 3D model of the terminal device user at each time point; identifying the hand 3D model of the terminal device user at each time point to obtain the gesture of the terminal device user; and operating the terminal device according to the gesture of the terminal device user. The terminal device user can thus control the terminal device by making certain gestures, the terminal device is operated effectively and intuitively in a contactless manner without constraining the user's behavior, and the operating efficiency and user experience of the terminal device are improved.

Description

Terminal device operation method and device and terminal device
Technical Field
The present invention relates to the field of terminal devices, and in particular to a terminal device operation method and apparatus, and a terminal device.
Background
Human-computer interaction technology refers to technology that enables effective interaction between humans and machines through input and output devices. Existing human-computer interaction usually takes place through external devices such as a mouse, keyboard, touch screen, or handle, to which the machine system then responds. For example, when the screen needs to be scrolled, it is scrolled by moving the mouse, or by sliding a body part such as a finger across the touch screen.
In existing terminal device operation modes, every operation requires the user to touch an input device such as a touch screen, so the user must rely on external equipment when operating the terminal device. This constrains the user's behavior and reduces the operating efficiency and user experience of the terminal device.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, a first object of the present invention is to provide a terminal device operation method that enables effective and intuitive operation of a terminal device, solving the prior-art problems that a user must rely on an external device when operating a terminal device, which constrains the user's behavior and reduces the operating efficiency and user experience of the terminal device.
A second object of the present invention is to provide a terminal device operating apparatus.
A third object of the present invention is to provide a terminal device.
A fourth object of the invention is to propose a non-transitory computer-readable storage medium.
To achieve the above object, an embodiment of a first aspect of the present invention provides a terminal device operating method, including:
projecting to surrounding users of the terminal equipment by adopting structured light equipment to obtain a hand 3D model of the surrounding users;
identifying the hand 3D models of the surrounding users to obtain the hand 3D models of the terminal equipment users in the surrounding users and the position range of the hands of the terminal equipment users;
projecting the position range of the hand of the terminal equipment user by adopting structured light equipment to obtain a hand 3D model of the terminal equipment user at each time point;
identifying a hand 3D model of the terminal equipment user at each time point to acquire a gesture of the terminal equipment user;
and operating the terminal equipment according to the gesture of the terminal equipment user.
As a possible implementation manner of the embodiment of the first aspect of the present invention, projecting to users around the terminal device using a structured light device and acquiring the hand 3D models of the surrounding users includes:
projecting to surrounding users of the terminal equipment by adopting structured light equipment;
shooting the surrounding users by adopting a camera to obtain depth images of the surrounding users;
and calculating and acquiring a hand 3D model of the surrounding user by combining the depth image of the surrounding user and the position relation between the structured light equipment and the camera.
As a possible implementation manner of the embodiment of the first aspect of the present invention, the structured light generated by the structured light device is non-uniform structured light.
As a possible implementation manner of the embodiment of the first aspect of the present invention, the identifying the 3D hand models of the surrounding users to obtain the 3D hand models of the terminal device users in the surrounding users and the position ranges where the hands of the terminal device users are located includes:
analyzing the hand 3D model of the surrounding user, and extracting feature point information of a fingerprint area in the hand 3D model of the surrounding user;
comparing the characteristic point information of the fingerprint area in the hand 3D model of the surrounding user with the pre-stored fingerprint characteristic point information of the terminal equipment user to acquire the similarity between the characteristic point information of the fingerprint area and the fingerprint characteristic point information;
and determining the hand 3D model with the corresponding similarity larger than a preset threshold value as the hand 3D model of the terminal equipment user, and determining the position range of the hand of the terminal equipment user according to the position range of the hand 3D model of the terminal equipment user.
As a possible implementation manner of the embodiment of the first aspect of the present invention, the identifying a 3D hand model of the terminal device user at each time point to obtain a gesture of the terminal device user includes:
analyzing the hand 3D model of the terminal equipment user at each time point, and extracting feature point information of the hand 3D model of the terminal equipment user at each time point;
for each time point, comparing the feature point information of the hand 3D model of the terminal device user with the pre-stored feature point information corresponding to each hand gesture to determine the hand gesture of the terminal device user;
and determining the gesture of the terminal equipment user according to the hand gesture of the terminal equipment user at each time point.
As a possible implementation manner of the embodiment of the first aspect of the present invention, before projecting to users around the terminal device using the structured light device and acquiring the hand 3D models of the surrounding users, the method further includes:
acquiring a corresponding relation between each gesture of a terminal device user and a control instruction;
generating a control instruction table according to the corresponding relation between each gesture of the terminal equipment user and the control instruction, and storing the control instruction table;
correspondingly, the operating the terminal device according to the gesture of the terminal device user includes:
inquiring the control instruction table according to the gesture of the terminal equipment user, and acquiring a control instruction matched with the gesture of the terminal equipment user;
and operating the terminal equipment according to the control instruction.
According to the terminal device operation method of the embodiment of the present invention, a structured light device projects onto the users around the terminal device to acquire hand 3D models of the surrounding users; the hand 3D models of the surrounding users are identified to obtain, among them, the hand 3D model of the terminal device user and the position range in which the user's hand is located; the structured light device then projects onto that position range to obtain a hand 3D model of the terminal device user at each time point; the hand 3D model at each time point is identified to obtain the gesture of the terminal device user; and the terminal device is operated according to the gesture. The terminal device user can thus control the terminal device by making certain gestures, the terminal device is operated effectively and intuitively in a contactless manner without constraining the user's behavior, and the operating efficiency and user experience of the terminal device are improved.
To achieve the above object, an embodiment of the second aspect of the present invention provides a terminal device operating apparatus, including:
the first projection module is used for projecting to users around the terminal equipment by adopting structured light equipment to obtain a hand 3D model of the users around;
the first identification module is used for identifying the hand 3D models of the surrounding users to obtain the hand 3D models of the terminal equipment users in the surrounding users and the position range of the hands of the terminal equipment users;
the second projection module is used for projecting the position range of the hand of the terminal equipment user by adopting structured light equipment to obtain a hand 3D model of the terminal equipment user at each time point;
the second identification module is used for identifying the hand 3D model of the terminal equipment user at each time point to acquire the gesture of the terminal equipment user;
and the operation module is used for operating the terminal equipment according to the gesture of the terminal equipment user.
As a possible implementation manner of the embodiment of the second aspect of the present invention, the first projection module includes:
the projection unit is used for projecting to users around the terminal equipment by adopting structured light equipment;
the camera shooting unit is used for shooting the surrounding users by adopting a camera to acquire depth images of the surrounding users;
and the computing unit is used for combining the depth image of the surrounding user and the position relation between the structured light equipment and the camera to compute and acquire a hand 3D model of the surrounding user.
As a possible implementation manner of the embodiment of the second aspect of the present invention, the structured light generated by the structured light device is non-uniform structured light.
As a possible implementation manner of the embodiment of the second aspect of the present invention, the first identifying module includes:
the first extraction unit is used for analyzing the hand 3D model of the surrounding user and extracting feature point information of a fingerprint area in the hand 3D model of the surrounding user;
the first comparison unit is used for comparing the characteristic point information of the fingerprint area in the hand 3D model of the surrounding user with the pre-stored fingerprint characteristic point information of the terminal equipment user to acquire the similarity between the characteristic point information of the fingerprint area and the fingerprint characteristic point information;
the first determining unit is used for determining the hand 3D model with the corresponding similarity larger than a preset threshold as the hand 3D model of the terminal equipment user, and determining the position range of the hand of the terminal equipment user according to the position range of the hand 3D model of the terminal equipment user.
As a possible implementation manner of the embodiment of the second aspect of the present invention, the second identifying module includes:
the second extraction unit is used for analyzing the hand 3D model of the terminal equipment user at each time point and extracting characteristic point information of the hand 3D model of the terminal equipment user at each time point;
the second comparison unit is used for comparing, for each time point, the feature point information of the hand 3D model of the terminal device user with the pre-stored feature point information corresponding to each hand gesture, and determining the hand gesture of the terminal device user;
and the second determining unit is used for determining the gesture of the terminal equipment user according to the hand gesture of the terminal equipment user at each time point.
As a possible implementation manner of the embodiment of the second aspect of the present invention, the apparatus further includes:
the acquisition module is used for acquiring the corresponding relation between each gesture of a terminal equipment user and the control instruction;
the generating module is used for generating a control instruction list according to the corresponding relation between each gesture of the terminal equipment user and the control instruction and storing the control instruction list;
correspondingly, the operation module is specifically configured to query the control instruction table according to the gesture of the terminal device user, and obtain a control instruction matched with the gesture of the terminal device user;
and operating the terminal equipment according to the control instruction.
According to the terminal device operating apparatus of the embodiment of the present invention, a structured light device projects onto the users around the terminal device to acquire hand 3D models of the surrounding users; the hand 3D models of the surrounding users are identified to obtain, among them, the hand 3D model of the terminal device user and the position range in which the user's hand is located; the structured light device then projects onto that position range to obtain a hand 3D model of the terminal device user at each time point; the hand 3D model at each time point is identified to obtain the gesture of the terminal device user; and the terminal device is operated according to the gesture. The terminal device user can thus control the terminal device by making certain gestures, the terminal device is operated effectively and intuitively in a contactless manner without constraining the user's behavior, and the operating efficiency and user experience of the terminal device are improved.
To achieve the above object, a third aspect of the present invention provides a terminal device, including:
the terminal device comprises a shell, and a processor and a memory which are positioned in the shell, wherein the processor runs a program corresponding to an executable program code by reading the executable program code stored in the memory so as to realize the terminal device operation method according to the embodiment of the first aspect of the invention.
To achieve the above object, an embodiment of the fourth aspect of the present invention provides a non-transitory computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the terminal device operation method according to the embodiment of the first aspect.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flowchart of a method for operating a terminal device according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a different form of structured light projection provided by an embodiment of the present invention;
fig. 3 is a schematic flowchart of another terminal device operation method according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of another terminal device operation method according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a terminal device operating apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of another terminal device operating apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of another terminal device operating apparatus according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of another terminal device operating apparatus according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of another terminal device operating apparatus according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
The following describes a terminal device operation method and apparatus, and a terminal device according to an embodiment of the present invention with reference to the drawings.
Embodiments of the invention enable effective and intuitive operation of the terminal device and solve the prior-art problems that a user must rely on external equipment when operating a terminal device, which constrains the user's behavior and reduces the operating efficiency and user experience of the terminal device.
Fig. 1 is a schematic flowchart of a terminal device operation method according to an embodiment of the present invention.
As shown in fig. 1, the terminal device operating method includes the steps of:
s101, projecting the peripheral users of the terminal equipment by adopting the structured light equipment, and acquiring the hand 3D models of the peripheral users.
The execution subject of the terminal device operation method provided in this embodiment is a terminal device operating apparatus, which may specifically be hardware, or software installed on the terminal device. The terminal device may be a smartphone, a tablet computer, an iPad, or the like.
In this embodiment, structured light refers to a set of projected rays with known spatial directions, and a structured light device is a device that generates structured light and projects it onto a measured object. Structured light modes include a point mode, a line mode, a multi-line mode, a plane mode, and a phase-method mode. In the point mode, the beam emitted by the structured light device is projected onto the measured object to produce a light spot, which is imaged through the camera lens onto the camera's imaging plane as a two-dimensional point. In the line mode, the beam produces a single light line on the measured object; imaged through the camera lens, this line may appear distorted or broken. In the multi-line mode, the beam produces several light lines on the measured object, and in the plane mode it produces a light plane. The degree of distortion of the light is proportional to the depth of each part of the measured object, while breaks in the light relate to cracks and similar features on its surface.
By combining the light spot, light line, or light plane on the camera's imaging plane with the positional relationship between the camera and the structured light device, a triangular geometric constraint is obtained, from which the spatial position of the spot, line, or plane in a known world coordinate system can be uniquely determined; that is, the spatial position of each part and each feature point of the measured object in that coordinate system. The three-dimensional reconstruction of the measured object can then be completed by combining this with the color information acquired by the camera. As shown in fig. 2, different structured light devices form different forms of structured light projections on the measured object.
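To make the triangulation constraint concrete, the following minimal sketch (not part of the patent; the focal length, baseline, and disparity values are assumed calibration data) shows how a depth value follows from the displacement of a projected pattern point:

```python
import numpy as np

def depth_from_disparity(disparity: np.ndarray, fx: float, baseline: float) -> np.ndarray:
    """Triangulation for a projector/camera pair separated by `baseline` meters
    with focal length `fx` pixels: a pattern point displaced by `disparity`
    pixels lies at depth z = fx * baseline / disparity."""
    d = np.where(disparity > 0, disparity, np.nan)  # guard against invalid matches
    return fx * baseline / d

# Example: a 12.5 px shift with fx = 600 px and a 5 cm baseline gives ~2.4 m.
print(depth_from_disparity(np.array([12.5]), fx=600.0, baseline=0.05))
```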
In addition, in this embodiment the structured light modes may further include a speckle mode, in which the beam emitted by the structured light device is projected onto the measured object to produce a non-uniform array of light spots.
In this embodiment, the terminal device operating apparatus may specifically call the structured light device to project onto the users around the terminal device, capture the surrounding users with a camera to obtain their depth images, and compute the hand 3D models of the surrounding users by combining the depth images with the positional relationship between the structured light device and the camera.
In this embodiment, the terminal device operating apparatus may obtain the hand 3D model of each user around the terminal device by projecting onto the users within a specific range around the display screen of the terminal device. That is, the apparatus may acquire hand 3D models only for the surrounding users located within the specific range, and perform no acquisition for users outside it.
In this embodiment, the terminal device operating apparatus may acquire the depth images of the surrounding users as follows: it calls the structured light device to project onto the surrounding scene, acquires the depth images of all objects in the scene, analyzes them to extract the feature point information of all objects, determines from that information which feature points belong to persons and where each user is located, and thereby determines the depth image of each surrounding user.
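As a sketch of how a hand 3D model can be computed from such a depth image (the pinhole intrinsics fx, fy, cx, cy and the hand mask produced by the person analysis above are assumed inputs; the patent does not prescribe this exact procedure):

```python
import numpy as np

def hand_point_cloud(depth: np.ndarray, hand_mask: np.ndarray,
                     fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Back-project the hand pixels of a depth image (in meters) into 3D
    camera-space points with the pinhole model; the resulting point cloud is
    the raw material for a hand 3D model."""
    v, u = np.nonzero(hand_mask)          # pixel coordinates of hand pixels
    z = depth[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.column_stack((x, y, z))     # (N, 3) array of 3D points
```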
S102, identifying the hand 3D models of the surrounding users, and obtaining, among the surrounding users, the hand 3D model of the terminal device user and the position range in which the user's hand is located.
In this embodiment, the terminal device operating apparatus pre-stores the feature point information of the fingerprint area of the terminal device user. It can therefore match the feature point information of the fingerprint areas in the hand 3D models of the surrounding users against this stored information, determine which of those hand 3D models belongs to the terminal device user, and then determine the position range of the terminal device user's hand from the position of that hand 3D model.
In this embodiment, the fingerprint area of the terminal device user may be the fingerprint area of a single finger, of several fingers, or of all fingers; this may be set as required and is not specifically limited here.
S103, projecting the position range of the hand of the terminal equipment user by adopting the structured light equipment, and acquiring a 3D hand model of the terminal equipment user at each time point.
In this embodiment, the time points and the interval between successive time points may be set as needed, for example according to the speed of the terminal device user's hand movements.
S104, recognizing the hand 3D model of the terminal device user at each time point, and acquiring the gesture of the terminal device user.
In this embodiment, the terminal device operating apparatus may identify the hand 3D model of the terminal device user at each time point, and obtain the hand gesture of the terminal device user at each time point; determining the gestures of the terminal device user according to a series of hand gestures of the terminal device user within a period of time.
S105, operating the terminal device according to the gesture of the terminal device user.
Further, on the basis of the foregoing embodiment, before step S101 the method may further include: acquiring the correspondence between each gesture of the terminal device user and a control instruction; and generating a control instruction table from these correspondences and storing it. Correspondingly, step S105 may specifically be: querying the control instruction table according to the gesture of the terminal device user to obtain the control instruction matching that gesture, and operating the terminal device according to the control instruction.
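A minimal sketch of such a control instruction table and its lookup (the gesture names and commands are illustrative placeholders, not taken from the patent):

```python
from typing import Optional

# Illustrative gesture-to-instruction correspondences.
CONTROL_TABLE = {
    "swipe_left":  "previous_page",
    "swipe_right": "next_page",
    "pinch":       "zoom_out",
    "spread":      "zoom_in",
}

def execute_gesture(gesture: str) -> Optional[str]:
    """Query the table for the instruction matching the recognized gesture;
    unregistered gestures produce no operation."""
    command = CONTROL_TABLE.get(gesture)
    if command is not None:
        print(f"executing: {command}")
    return command

execute_gesture("swipe_left")  # prints: executing: previous_page
```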
In this embodiment, in crowded scenes such as a subway or a park, when the terminal device user needs to control the terminal device remotely, the approach provided by this embodiment allows the terminal device to accurately acquire the gesture of its own user while excluding the gestures of other users, and to operate according to that gesture, improving the accuracy of terminal device operation.
According to the terminal device operation method of this embodiment, a structured light device projects onto the users around the terminal device to acquire hand 3D models of the surrounding users; the hand 3D models are identified to obtain, among them, the hand 3D model of the terminal device user and the position range in which the user's hand is located; the structured light device then projects onto that position range to obtain a hand 3D model of the terminal device user at each time point; the hand 3D model at each time point is identified to obtain the gesture of the terminal device user; and the terminal device is operated according to the gesture. The terminal device user can thus control the terminal device by making certain gestures, the terminal device is operated effectively and intuitively in a contactless manner without constraining the user's behavior, and the operating efficiency and user experience of the terminal device are improved.
Fig. 3 is a flowchart illustrating another terminal device operation method according to an embodiment of the present invention. As shown in fig. 3, based on the embodiment shown in fig. 1, step 102 may specifically include:
and S1021, analyzing the hand 3D model of the surrounding user, and extracting feature point information of the fingerprint area in the hand 3D model of the surrounding user.
S1022, comparing the feature point information of the fingerprint area in the hand 3D model of the surrounding user with the pre-stored fingerprint feature point information of the terminal equipment user, and acquiring the similarity between the feature point information of the fingerprint area and the fingerprint feature point information.
In this embodiment, when the fingerprint area covers several fingers, the terminal device operating apparatus may compare the feature point information of each finger's fingerprint area in the hand 3D model of a surrounding user with the feature point information of the corresponding finger of the terminal device user, determine the similarity for each finger, and then combine the per-finger similarities into the overall similarity between the feature point information of the fingerprint area and the stored fingerprint feature point information, that is, the similarity between the fingerprint area of the surrounding user's hand 3D model and that of the terminal device user.
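A minimal sketch of this per-finger comparison, assuming the fingerprint feature point information of each finger has already been reduced to a feature vector (the patent does not fix the feature representation or the similarity measure; cosine similarity and averaging are assumptions here):

```python
import numpy as np

def fingerprint_similarity(candidate: dict, enrolled: dict) -> float:
    """Average the per-finger cosine similarities between the candidate hand
    3D model's fingerprint features and the enrolled owner's fingerprints.
    Feature extraction itself is outside this sketch."""
    scores = []
    for finger, feat in candidate.items():
        ref = enrolled.get(finger)
        if ref is None:
            continue
        cos = np.dot(feat, ref) / (np.linalg.norm(feat) * np.linalg.norm(ref) + 1e-9)
        scores.append(cos)
    return float(np.mean(scores)) if scores else 0.0

def is_owner_hand(candidate: dict, enrolled: dict, threshold: float = 0.9) -> bool:
    """Preset-threshold decision: does this hand belong to the device user?"""
    return fingerprint_similarity(candidate, enrolled) > threshold
```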
S1023, determining the hand 3D model whose similarity exceeds a preset threshold to be the hand 3D model of the terminal device user, and determining the position range of the user's hand from the position range of that hand 3D model.
In this embodiment, the terminal device operating apparatus may determine the position range of the hand 3D model of the terminal device user according to the three-dimensional coordinates of each point in the hand 3D model of the terminal device user, and further determine the position range of the hand of the terminal device user.
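For illustration, the position range can be derived as a padded bounding box over the model's 3D coordinates (a sketch; the margin value is an assumption):

```python
import numpy as np

def hand_position_range(points: np.ndarray, margin: float = 0.05) -> tuple:
    """Axis-aligned bounding box over the 3D coordinates of the hand model's
    points, padded by `margin` meters so the next structured-light projection
    still covers the hand after small movements."""
    lo = points.min(axis=0) - margin
    hi = points.max(axis=0) + margin
    return lo, hi  # each is an (x, y, z) corner of the box
```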
According to the terminal device operation method of this embodiment, a structured light device projects onto the users around the terminal device to acquire hand 3D models of the surrounding users; the hand 3D models are analyzed and the feature point information of their fingerprint areas is extracted; this feature point information is compared with the pre-stored fingerprint feature point information of the terminal device user to obtain their similarity; the hand 3D model whose similarity exceeds a preset threshold is determined to be the hand 3D model of the terminal device user, and the position range of the user's hand is determined from the position range of that model; the structured light device then projects onto that position range to obtain a hand 3D model of the terminal device user at each time point; the hand 3D model at each time point is identified to obtain the gesture of the terminal device user; and the terminal device is operated according to the gesture. The terminal device user can thus control the terminal device by making certain gestures, the terminal device is operated effectively and intuitively in a contactless manner without constraining the user's behavior, and the operating efficiency and user experience of the terminal device are improved.
Fig. 4 is a flowchart illustrating another terminal device operation method according to an embodiment of the present invention. As shown in fig. 4, on the basis of the embodiment shown in fig. 1, step 104 may specifically include:
S1041, analyzing the hand 3D model of the terminal device user at each time point, and extracting the feature point information of the hand 3D model at each time point.
S1042, for each time point, comparing the feature point information of the hand 3D model of the terminal device user with the pre-stored feature point information corresponding to each hand gesture, and determining the hand gesture of the terminal device user.
In this embodiment, for each time point, the terminal device operating apparatus may compare the feature point information of the hand 3D model of the terminal device user with the feature point information corresponding to each pre-stored hand gesture, and obtain a similarity between the feature point information of the hand 3D model of the terminal device user and the feature point information corresponding to each pre-stored hand gesture; and determining the hand gesture of which the corresponding similarity is greater than a preset similarity threshold value as the hand gesture of the terminal equipment user.
S1043, determining the gesture of the terminal device user from the user's hand gestures at the successive time points.
In this embodiment, the terminal device operating apparatus may assemble the hand gestures of the terminal device user at the successive time points into a hand gesture sequence over a period of time, query a preset gesture table with that sequence, obtain the gesture matching the sequence, and determine it to be the gesture of the terminal device user.
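The two steps, per-time-point hand gesture recognition and sequence lookup, might be sketched as follows (the templates, gesture table, and similarity threshold are illustrative assumptions, not the patent's data):

```python
import numpy as np

# Illustrative enrolled hand gestures and a preset gesture table.
POSE_TEMPLATES = {"open_palm": np.array([1.0, 0.0]), "fist": np.array([0.0, 1.0])}
GESTURE_TABLE = {("open_palm", "fist", "open_palm"): "confirm"}

def classify_pose(feat: np.ndarray, threshold: float = 0.9):
    """Pick the enrolled hand gesture whose feature vector is most similar to
    the observed one, provided the similarity exceeds the preset threshold."""
    best, best_sim = None, threshold
    for name, ref in POSE_TEMPLATES.items():
        sim = float(np.dot(feat, ref) / (np.linalg.norm(feat) * np.linalg.norm(ref) + 1e-9))
        if sim > best_sim:
            best, best_sim = name, sim
    return best

def recognize_gesture(features_over_time):
    """Map the sequence of per-time-point hand gestures to a gesture."""
    poses = tuple(p for p in (classify_pose(f) for f in features_over_time) if p)
    return GESTURE_TABLE.get(poses)
```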
According to the terminal device operation method of this embodiment, a structured light device projects onto the users around the terminal device to acquire hand 3D models of the surrounding users; the hand 3D models are identified to obtain, among them, the hand 3D model of the terminal device user and the position range in which the user's hand is located; the structured light device then projects onto that position range to obtain a hand 3D model of the terminal device user at each time point; the hand 3D model at each time point is analyzed and its feature point information is extracted; for each time point, this feature point information is compared with the pre-stored feature point information corresponding to each hand gesture to determine the hand gesture of the terminal device user; the gesture of the terminal device user is determined from the hand gestures at the successive time points; and the terminal device is operated according to the gesture. The terminal device user can thus control the terminal device by making certain gestures, the terminal device is operated effectively and intuitively in a contactless manner without constraining the user's behavior, and the operating efficiency and user experience of the terminal device are improved.
Fig. 5 is a schematic structural diagram of a terminal device operating apparatus according to an embodiment of the present invention. As shown in fig. 5, the terminal device operating apparatus includes: a first projection module 51, a first recognition module 52, a second projection module 53, a second recognition module 54, and an operation module 55.
The first projection module 51 is configured to project a projection to a user around a terminal device by using a structured light device, and obtain a 3D hand model of the user around the terminal device;
the first identification module 52 is configured to identify the hand 3D models of the surrounding users, and obtain a hand 3D model of a terminal device user among the surrounding users and a position range where the hand of the terminal device user is located;
the second projection module 53 is configured to project the position range where the hand of the terminal device user is located by using structured light equipment, and obtain a 3D hand model of the terminal device user at each time point;
a second recognition module 54, configured to recognize the 3D hand model of the terminal device user at each time point, and obtain a gesture of the terminal device user;
and an operation module 55, configured to operate the terminal device according to a gesture of the terminal device user.
The terminal device operating apparatus provided in this embodiment may specifically be hardware, or software installed on the terminal device. The terminal device may be a smartphone, a tablet computer, an iPad, or the like.
In this embodiment, the terminal device operating apparatus pre-stores the feature point information of the fingerprint area of the terminal device user. It can therefore match the feature point information of the fingerprint areas in the hand 3D models of the surrounding users against this stored information, determine which of those hand 3D models belongs to the terminal device user, and then determine the position range of the terminal device user's hand from the position of that hand 3D model.
In this embodiment, the fingerprint area of the terminal device user may be the fingerprint area of a single finger, of several fingers, or of all fingers; this may be set as required and is not specifically limited here.
In this embodiment, the terminal device operating apparatus may identify the hand 3D model of the terminal device user at each time point, and obtain the hand gesture of the terminal device user at each time point; determining the gestures of the terminal device user according to a series of hand gestures of the terminal device user within a period of time.
Based on fig. 5, fig. 6 is a schematic structural diagram of another terminal device operating apparatus according to an embodiment of the present invention. As shown in fig. 6, the first projection module 51 includes: a projection unit 511, an image capturing unit 512, and a calculation unit 513.
The projection unit 511 is configured to project, by using a structured light device, to users around the terminal device;
a camera unit 512, configured to capture an image of the surrounding user with a camera, and obtain a depth image of the surrounding user;
a calculating unit 513, configured to calculate and acquire a 3D hand model of the surrounding user in combination with the depth image of the surrounding user and the position relationship between the structured light device and the camera.
In this embodiment, the terminal device operating apparatus may obtain the hand 3D model of each user around the terminal device by projecting onto the users within a specific range around the display screen of the terminal device. That is, the apparatus may acquire hand 3D models only for the surrounding users located within the specific range, and perform no acquisition for users outside it.
In this embodiment, the terminal device operating apparatus may acquire the depth images of the surrounding users as follows: it calls the structured light device to project onto the surrounding scene, acquires the depth images of all objects in the scene, analyzes them to extract the feature point information of all objects, determines from that information which feature points belong to persons and where each user is located, and thereby determines the depth image of each surrounding user.
According to the terminal device operating apparatus of this embodiment, a structured light device projects onto the users around the terminal device to acquire hand 3D models of the surrounding users; the hand 3D models are identified to obtain, among them, the hand 3D model of the terminal device user and the position range in which the user's hand is located; the structured light device then projects onto that position range to obtain a hand 3D model of the terminal device user at each time point; the hand 3D model at each time point is identified to obtain the gesture of the terminal device user; and the terminal device is operated according to the gesture. The terminal device user can thus control the terminal device by making certain gestures, the terminal device is operated effectively and intuitively in a contactless manner without constraining the user's behavior, and the operating efficiency and user experience of the terminal device are improved.
Based on fig. 5, fig. 7 is a schematic structural diagram of another terminal device operating apparatus according to an embodiment of the present invention. As shown in fig. 7, the first identification module 52 includes: a first extraction unit 521, a first comparison unit 522, and a first determination unit 523.
The first extraction unit 521 is configured to analyze the hand 3D model of the surrounding user, and extract feature point information of a fingerprint area in the hand 3D model of the surrounding user;
a first comparing unit 522, configured to compare feature point information of a fingerprint area in the 3D model of the hand of the surrounding user with pre-stored fingerprint feature point information of a terminal device user, to obtain a similarity between the feature point information of the fingerprint area and the fingerprint feature point information;
the first determining unit 523 is configured to determine the hand 3D model with the corresponding similarity greater than the preset threshold as the hand 3D model of the terminal device user, and determine a position range where the hand of the terminal device user is located according to the position range of the hand 3D model of the terminal device user.
In this embodiment, when the fingerprint area covers several fingers, the terminal device operating apparatus may compare the feature point information of each finger's fingerprint area in the hand 3D model of a surrounding user with the feature point information of the corresponding finger of the terminal device user, determine the similarity for each finger, and then combine the per-finger similarities into the overall similarity between the feature point information of the fingerprint area and the stored fingerprint feature point information, that is, the similarity between the fingerprint area of the surrounding user's hand 3D model and that of the terminal device user.
In this embodiment, the terminal device operating apparatus may determine the position range of the hand 3D model of the terminal device user according to the three-dimensional coordinates of each point in the hand 3D model of the terminal device user, and further determine the position range of the hand of the terminal device user.
According to the terminal device operating apparatus of this embodiment, a structured light device projects onto the users around the terminal device to acquire hand 3D models of the surrounding users; the hand 3D models are analyzed and the feature point information of their fingerprint areas is extracted; this feature point information is compared with the pre-stored fingerprint feature point information of the terminal device user to obtain their similarity; the hand 3D model whose similarity exceeds a preset threshold is determined to be the hand 3D model of the terminal device user, and the position range of the user's hand is determined from the position range of that model; the structured light device then projects onto that position range to obtain a hand 3D model of the terminal device user at each time point; the hand 3D model at each time point is identified to obtain the gesture of the terminal device user; and the terminal device is operated according to the gesture. The terminal device user can thus control the terminal device by making certain gestures, the terminal device is operated effectively and intuitively in a contactless manner without constraining the user's behavior, and the operating efficiency and user experience of the terminal device are improved.
Based on fig. 5, fig. 8 is a schematic structural diagram of another terminal device operating apparatus according to an embodiment of the present invention. As shown in fig. 8, the second identification module 54 includes: a second extracting unit 541, a second comparing unit 542 and a second determining unit 543.
The second extraction unit 541 is configured to analyze the hand 3D model of the terminal device user at each time point, and extract feature point information of the hand 3D model of the terminal device user at each time point;
the second comparison unit 542 is configured to compare, for each time point, feature point information of the 3D hand model of the terminal device user with feature point information corresponding to each pre-stored hand gesture, and determine a hand gesture of the terminal device user;
the second determining unit 543 is configured to determine the gesture of the terminal device user according to the hand gesture of the terminal device user at each time point.
In this embodiment, for each time point, the terminal device operating apparatus may compare the feature point information of the hand 3D model of the terminal device user with the feature point information corresponding to each pre-stored hand gesture, and obtain a similarity between the feature point information of the hand 3D model of the terminal device user and the feature point information corresponding to each pre-stored hand gesture; and determining the hand gesture of which the corresponding similarity is greater than a preset similarity threshold value as the hand gesture of the terminal equipment user.
In this embodiment, the terminal device operating apparatus may assemble the hand gestures of the terminal device user at the successive time points into a hand gesture sequence over a period of time, query a preset gesture table with that sequence, obtain the gesture matching the sequence, and determine it to be the gesture of the terminal device user.
According to the terminal device operating apparatus of this embodiment, a structured light device projects onto the users around the terminal device to acquire hand 3D models of the surrounding users; the hand 3D models are identified to obtain, among them, the hand 3D model of the terminal device user and the position range in which the user's hand is located; the structured light device then projects onto that position range to obtain a hand 3D model of the terminal device user at each time point; the hand 3D model at each time point is analyzed and its feature point information is extracted; for each time point, this feature point information is compared with the pre-stored feature point information corresponding to each hand gesture to determine the hand gesture of the terminal device user; the gesture of the terminal device user is determined from the hand gestures at the successive time points; and the terminal device is operated according to the gesture. The terminal device user can thus control the terminal device by making certain gestures, the terminal device is operated effectively and intuitively in a contactless manner without constraining the user's behavior, and the operating efficiency and user experience of the terminal device are improved.
Based on fig. 5, fig. 9 is a schematic structural diagram of another terminal device operating apparatus according to an embodiment of the present invention. As shown in fig. 9, the apparatus further includes: an acquisition module 56 and a generation module 57.
The obtaining module 56 is configured to obtain a corresponding relationship between each gesture of the terminal device user and the control instruction;
a generating module 57, configured to generate a control instruction table according to a correspondence between each gesture of the terminal device user and a control instruction, and store the control instruction table;
correspondingly, the operation module 55 is specifically configured to query the control instruction table according to the gesture of the terminal device user, and obtain a control instruction matched with the gesture of the terminal device user;
and operating the terminal equipment according to the control instruction.
In this embodiment, in crowded scenes such as a subway or a park, when the terminal device user needs to control the terminal device remotely, the terminal device operating apparatus provided by this embodiment allows the terminal device to accurately acquire the gesture of its own user while excluding the gestures of other users, and to operate according to that gesture, improving the accuracy of terminal device operation.
An embodiment of the invention further provides a terminal device. The terminal device includes an image processing circuit, which may be implemented by hardware and/or software components and may include various processing units defining an ISP (Image Signal Processing) pipeline. FIG. 10 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 10, for convenience of explanation only the aspects of the image processing technique related to the embodiment of the present invention are shown.
As shown in fig. 10, the image processing circuit 900 includes an imaging device 910, an ISP processor 930, and control logic 940. The imaging device 910 may include a camera with one or more lenses 912, an image sensor 914, and a structured light projector 916. The structured light projector 916 projects the structured light to the object to be measured. The structured light pattern may be a laser stripe, a gray code, a sinusoidal stripe, or a randomly arranged speckle pattern. The image sensor 914 captures a structured light image projected onto the object to be measured and transmits the structured light image to the ISP processor 930, and the ISP processor 930 demodulates the structured light image to obtain depth information of the object to be measured. At the same time, the image sensor 914 may also capture color information of the object under test. Of course, the structured light image and the color information of the measured object may be captured by the two image sensors 914, respectively.
Taking speckle structured light as an example, the ISP processor 930 demodulates the structured light image as follows: it acquires a speckle image of the measured object from the structured light image, performs image data calculation on the speckle image of the measured object and a reference speckle image according to a predetermined algorithm, and obtains the displacement of each speckle point of the speckle image on the measured object relative to the corresponding reference speckle point in the reference speckle image. The depth value of each speckle point is then obtained by triangulation, and the depth information of the measured object follows from these depth values.
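One common form of this reference-plane triangulation is sketched below (signs and calibration details vary between systems; this is an assumed simplification, not the patent's predetermined algorithm):

```python
import numpy as np

def speckle_depth(shift_px: np.ndarray, z_ref: float, fx: float, baseline: float) -> np.ndarray:
    """Depth from reference-plane speckle matching: a speckle displaced by
    `shift_px` pixels relative to its position in the reference image
    (captured at known distance `z_ref`) lies at
    Z = z_ref * fx * b / (fx * b + z_ref * shift_px)."""
    fb = fx * baseline
    return z_ref * fb / (fb + z_ref * shift_px)
```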
Of course, the depth image information may also be acquired by binocular vision or by a time-of-flight (TOF) method; the embodiment is not limited in this respect, and any method by which the depth information of the measured object can be acquired or calculated falls within its scope.
After the ISP processor 930 receives the color information of the measured object captured by the image sensor 914, it may process the corresponding image data. The ISP processor 930 analyzes the image data to obtain image statistics that may be used to determine one or more control parameters of the imaging device 910. The image sensor 914 may include an array of color filters (e.g., Bayer filters); it may acquire the light intensity and wavelength information captured by each of its imaging pixels and provide a set of raw image data that can be processed by the ISP processor 930.
ISP processor 930 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 930 may perform one or more image processing operations on the raw image data, collecting image statistics about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 930 may also receive pixel data from image memory 920. The image memory 920 may be part of a memory device, a storage device, or a separate dedicated memory within an electronic device, and may include DMA (Direct Memory Access) features.
Upon receiving the raw image data, ISP processor 930 may perform one or more image processing operations.
After the ISP processor 930 acquires the color information and the depth information of the object to be measured, they may be fused to obtain a three-dimensional image. The features of the object to be measured can be extracted by at least one of an appearance contour extraction method or a contour feature extraction method, for example by the active shape model (ASM), active appearance model (AAM), principal component analysis (PCA), or discrete cosine transform (DCT) methods, which are not limited herein. The features of the measured object extracted from the depth information and those extracted from the color information are then subjected to registration and feature fusion processing. The fusion processing may directly combine the features extracted from the depth information and the color information, may combine the same features in different images after setting weights, or may generate the three-dimensional image from the fused features in other fusion modes.
The image data of the three-dimensional image may be sent to an image memory 920 for additional processing before being displayed. ISP processor 930 receives the processed data from image memory 920 and performs image data processing on it in the raw domain and in the RGB and YCbCr color spaces. The image data of the three-dimensional image may be output to a display 960 for viewing by a user and/or for further processing by a graphics processing unit (GPU). Further, the output of ISP processor 930 may also be sent to image memory 920, and display 960 may read the image data from image memory 920. In one embodiment, image memory 920 may be configured to implement one or more frame buffers. Further, the output of the ISP processor 930 may be transmitted to the encoder/decoder 950 to encode/decode the image data; the encoded image data may be saved and decompressed before being displayed on the display 960. The encoder/decoder 950 may be implemented by a CPU, a GPU, or a coprocessor.
The image statistics determined by ISP processor 930 may be sent to the control logic 940. Control logic 940 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware), which may determine control parameters of imaging device 910 based on the received image statistics.
The operation method of the terminal device is implemented with the image processing technique of fig. 10 through the following steps (an illustrative code sketch of the whole pipeline follows the list):
projecting to users around a terminal device by adopting structured light equipment to obtain a hand 3D model of the users around;
identifying the hand 3D models of the surrounding users to obtain the hand 3D models of the terminal equipment users in the surrounding users and the position range of the hands of the terminal equipment users;
projecting the position range of the hand of the terminal equipment user by adopting structured light equipment to obtain a hand 3D model of the terminal equipment user at each time point;
identifying a hand 3D model of the terminal equipment user at each time point to acquire a gesture of the terminal equipment user;
and operating the terminal equipment according to the gesture of the terminal equipment user.
In order to implement the above embodiments, the present invention further proposes a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the terminal device operation method described in the foregoing embodiments.
In the description herein, references to the terms "one embodiment," "some embodiments," "an example," "a specific example," "some examples," and the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples and the features of the different embodiments or examples described in this specification without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flowcharts or otherwise described herein may be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing the steps of a custom logic function or process. The scope of the preferred embodiments of the present invention also includes alternate implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as would be understood by those skilled in the art to which the present invention pertains.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be implemented by a program instructing related hardware; the program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.

Although embodiments of the present invention have been shown and described above, it should be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; variations, modifications, substitutions, and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (12)

1. A method of operating a terminal device, comprising:
projecting to users around a terminal device by adopting structured light equipment to obtain a hand 3D model of the users around;
identifying the hand 3D models of the surrounding users to obtain the hand 3D models of the terminal equipment users in the surrounding users and the position range of the hands of the terminal equipment users;
projecting the position range of the hand of the terminal equipment user by adopting structured light equipment to obtain a hand 3D model of the terminal equipment user at each time point;
identifying a hand 3D model of the terminal equipment user at each time point to acquire a gesture of the terminal equipment user;
operating the terminal equipment according to the gesture of the terminal equipment user;
the pair of hand 3D models of the surrounding users are identified, the hand 3D models of the terminal device users in the surrounding users and the position range where the hands of the terminal device users are located are obtained, and the method comprises the following steps:
analyzing the hand 3D model of the surrounding user, and extracting feature point information of a fingerprint area in the hand 3D model of the surrounding user;
comparing the characteristic point information of the fingerprint area in the hand 3D model of the surrounding user with the pre-stored fingerprint characteristic point information of the terminal equipment user to acquire the similarity between the characteristic point information of the fingerprint area and the fingerprint characteristic point information;
and determining the hand 3D model with the corresponding similarity larger than a preset threshold value as the hand 3D model of the terminal equipment user, and determining the position range of the hand of the terminal equipment user according to the position range of the hand 3D model of the terminal equipment user.
2. The method of claim 1, wherein the projecting with a structured light device to a surrounding user of the terminal device to obtain a 3D model of a hand of the surrounding user comprises:
projecting to surrounding users of the terminal equipment by adopting structured light equipment;
shooting the surrounding users by adopting a camera to obtain depth images of the surrounding users;
and calculating and acquiring a hand 3D model of the surrounding user by combining the depth image of the surrounding user and the position relation between the structured light equipment and the camera.
3. A method as claimed in claim 1 or 2, wherein the structured light generated by the structured light device is non-uniform structured light.
4. The method according to claim 1, wherein the recognizing the 3D hand model of the terminal device user at each time point to obtain the gesture of the terminal device user comprises:
analyzing the hand 3D model of the terminal equipment user at each time point, and extracting feature point information of the hand 3D model of the terminal equipment user at each time point;
aiming at each time point, comparing the characteristic point information of the hand 3D model of the terminal equipment user with the prestored characteristic point information corresponding to each hand gesture to determine the hand gesture of the terminal equipment user;
and determining the gesture of the terminal equipment user according to the hand gesture of the terminal equipment user at each time point.
5. The method of claim 1, wherein before projecting with a structured light device to a surrounding user of the terminal device to obtain a 3D model of a hand of the surrounding user, further comprising:
acquiring a corresponding relation between each gesture of a terminal device user and a control instruction;
generating a control instruction table according to the corresponding relation between each gesture of the terminal equipment user and the control instruction, and storing the control instruction table;
correspondingly, the operating the terminal device according to the gesture of the terminal device user includes:
inquiring the control instruction table according to the gesture of the terminal equipment user, and acquiring a control instruction matched with the gesture of the terminal equipment user;
and operating the terminal equipment according to the control instruction.
6. A terminal device operating apparatus, characterized by comprising:
the first projection module is used for projecting to users around the terminal equipment by adopting structured light equipment to obtain a hand 3D model of the users around;
the first identification module is used for identifying the hand 3D models of the surrounding users to obtain the hand 3D models of the terminal equipment users in the surrounding users and the position range of the hands of the terminal equipment users;
the second projection module is used for projecting the position range of the hand of the terminal equipment user by adopting structured light equipment to obtain a hand 3D model of the terminal equipment user at each time point;
the second identification module is used for identifying the hand 3D model of the terminal equipment user at each time point to acquire the gesture of the terminal equipment user;
the operation module is used for operating the terminal equipment according to the gesture of the terminal equipment user;
the first identification module comprises:
the first extraction unit is used for analyzing the hand 3D model of the surrounding user and extracting feature point information of a fingerprint area in the hand 3D model of the surrounding user;
the first comparison unit is used for comparing the characteristic point information of the fingerprint area in the hand 3D model of the surrounding user with the pre-stored fingerprint characteristic point information of the terminal equipment user to acquire the similarity between the characteristic point information of the fingerprint area and the fingerprint characteristic point information;
the first determining unit is used for determining the hand 3D model with the corresponding similarity larger than a preset threshold as the hand 3D model of the terminal equipment user, and determining the position range of the hand of the terminal equipment user according to the position range of the hand 3D model of the terminal equipment user.
7. The apparatus of claim 6, wherein the first projection module comprises:
the projection unit is used for projecting to users around the terminal equipment by adopting structured light equipment;
the camera shooting unit is used for shooting the surrounding users by adopting a camera to acquire depth images of the surrounding users;
and the computing unit is used for combining the depth image of the surrounding user and the position relation between the structured light equipment and the camera to compute and acquire a hand 3D model of the surrounding user.
8. An apparatus as claimed in claim 6 or 7, wherein the structured light generated by the structured light device is non-uniform structured light.
9. The apparatus of claim 6, wherein the second identification module comprises:
the second extraction unit is used for analyzing the hand 3D model of the terminal equipment user at each time point and extracting characteristic point information of the hand 3D model of the terminal equipment user at each time point;
the second comparison unit is used for comparing the characteristic point information of the hand 3D model of the terminal equipment user with the pre-stored characteristic point information corresponding to each hand gesture aiming at each time point, and determining the hand gesture of the terminal equipment user;
and the second determining unit is used for determining the gesture of the terminal equipment user according to the hand gesture of the terminal equipment user at each time point.
10. The apparatus of claim 6, further comprising:
the acquisition module is used for acquiring the corresponding relation between each gesture of a terminal equipment user and the control instruction;
the generating module is used for generating a control instruction list according to the corresponding relation between each gesture of the terminal equipment user and the control instruction and storing the control instruction list;
correspondingly, the operation module is specifically configured to query the control instruction table according to the gesture of the terminal device user, and obtain a control instruction matched with the gesture of the terminal device user;
and operating the terminal equipment according to the control instruction.
11. A terminal device, comprising one or more of the following components: a housing, and a processor and a memory located in the housing, wherein the processor, by reading executable program code stored in the memory, runs a program corresponding to the executable program code so as to implement the terminal device operating method according to any one of claims 1 to 5.
12. A non-transitory computer-readable storage medium, on which a computer program is stored, the computer program, when being executed by a processor, implementing a method for operating a terminal device according to any one of claims 1 to 5.
CN201710677517.8A 2017-08-09 2017-08-09 Terminal device operation method and device and terminal device Active CN107589834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710677517.8A CN107589834B (en) 2017-08-09 2017-08-09 Terminal device operation method and device and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710677517.8A CN107589834B (en) 2017-08-09 2017-08-09 Terminal device operation method and device and terminal device

Publications (2)

Publication Number Publication Date
CN107589834A CN107589834A (en) 2018-01-16
CN107589834B true CN107589834B (en) 2020-08-07

Family

ID=61042102

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710677517.8A Active CN107589834B (en) 2017-08-09 2017-08-09 Terminal device operation method and device and terminal device

Country Status (1)

Country Link
CN (1) CN107589834B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110531377B (en) * 2019-10-08 2022-02-25 北京邮电大学 Data processing method and device of radar system, electronic equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102999152A (en) * 2011-09-09 2013-03-27 康佳集团股份有限公司 Method and system for gesture recognition
US9081418B1 (en) * 2013-03-11 2015-07-14 Rawles Llc Obtaining input from a virtual user interface
CN106155299A (en) * 2015-04-23 2016-11-23 青岛海信电器股份有限公司 A kind of method and device that smart machine is carried out gesture control
CN106325509A (en) * 2016-08-19 2017-01-11 北京暴风魔镜科技有限公司 Three-dimensional gesture recognition method and system
CN106503620A (en) * 2016-09-26 2017-03-15 深圳奥比中光科技有限公司 Numerical ciphers input method and its system based on gesture
CN106774850A (en) * 2016-11-24 2017-05-31 深圳奥比中光科技有限公司 A kind of mobile terminal and its interaction control method

Also Published As

Publication number Publication date
CN107589834A (en) 2018-01-16

Similar Documents

Publication Publication Date Title
CN107563304B (en) Terminal equipment unlocking method and device and terminal equipment
CN107480613B (en) Face recognition method and device, mobile terminal and computer readable storage medium
CN107025635B (en) Depth-of-field-based image saturation processing method and device and electronic device
CN107481304B (en) Method and device for constructing virtual image in game scene
US9995578B2 (en) Image depth perception device
CN107564050B (en) Control method and device based on structured light and terminal equipment
CA2786439C (en) Depth camera compatibility
CN107479801B (en) Terminal display method and device based on user expression and terminal
CA2786436C (en) Depth camera compatibility
CN107209007A (en) Method, circuit, equipment, accessory, system and the functionally associated computer-executable code of IMAQ are carried out with estimation of Depth
CN107592449B (en) Three-dimensional model establishing method and device and mobile terminal
US20140037135A1 (en) Context-driven adjustment of camera parameters
CN107491744B (en) Human body identity recognition method and device, mobile terminal and storage medium
CN107016348B (en) Face detection method and device combined with depth information and electronic device
CN107480615B (en) Beauty treatment method and device and mobile equipment
CN107517346B (en) Photographing method and device based on structured light and mobile device
CN107463659B (en) Object searching method and device
CN107392874B (en) Beauty treatment method and device and mobile equipment
JP6723814B2 (en) Information processing apparatus, control method thereof, program, and storage medium
CN107452034B (en) Image processing method and device
KR20170092533A (en) A face pose rectification method and apparatus
CN107590828B (en) Blurring processing method and device for shot image
CN107705278B (en) Dynamic effect adding method and terminal equipment
JP2019530059A (en) Method for independently processing multiple target areas
CN107330974B (en) Commodity display method and device and mobile equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant