CN106569716B - Single-hand control method and control system - Google Patents


Info

Publication number
CN106569716B
CN106569716B (application CN201610942444.6A)
Authority
CN
China
Prior art keywords
touch
display screen
control
finger
depth image
Prior art date
Legal status
Active
Application number
CN201610942444.6A
Other languages
Chinese (zh)
Other versions
CN106569716A (en
Inventor
黄源浩
刘龙
肖振中
许星
Current Assignee
Shenzhen Orbbec Co Ltd
Original Assignee
Shenzhen Orbbec Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Orbbec Co Ltd filed Critical Shenzhen Orbbec Co Ltd
Priority to CN201610942444.6A priority Critical patent/CN106569716B/en
Publication of CN106569716A publication Critical patent/CN106569716A/en
Priority to PCT/CN2017/089027 priority patent/WO2018076720A1/en
Application granted granted Critical
Publication of CN106569716B publication Critical patent/CN106569716B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The invention discloses a single-hand control method and a single-hand control system. The method comprises the following steps: S1: acquiring a depth image of a control surface and of a control object on the control surface; S2: obtaining, from the depth image, a first position of the control object on the control surface; S3: locating a second position on the display screen according to the first position; S4: recognizing a predetermined control action from the shape and motion of the control object and converting it into a touch instruction to be executed; S5: executing the touch operation at the second position according to the touch instruction. By means of the control surface, the invention enables effective single-hand control of large-screen electronic devices while avoiding touch conflicts.

Description

Single-hand control method and control system
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to a one-handed control method and a control system.
Background
With the popularization of the mobile internet and users' ever-higher expectations of smart devices such as mobile phones, functions such as communication, web browsing and video playback have become the basic configuration of current electronic devices. Devices such as mobile phones have now all but entered the large-screen era, and their screens far exceed what a single hand can comfortably reach; even so, large screens remain what users demand of their phones.
Taking the mobile phone as an example, adding a function key on the back of the phone is one solution for one-handed operation, but it inevitably spoils the appearance of the back, so this solution has not been accepted by users. Another scheme adds an extra touch screen on the back of the phone, using finger input on the back to control the region of the front screen that one hand cannot reach. However, this solution is costly and has not become the mainstream one-handed solution.
Meanwhile, in current schemes that use a depth image for touch operation, the position of the control object on the display screen is mapped directly from the pixel coordinates of the control object in the acquired depth image. Because this direct pixel-coordinate mapping cannot produce touch operations in regions the control object cannot physically reach, such schemes can implement only simple functions such as page turning and cannot properly solve the problem of one-handed operation of a large screen.
Furthermore, in existing depth-image touch schemes the depth camera is usually arranged to one side of the display screen, and the position of the control object on the screen is obtained from a depth image of the screen, ideally at the moment the control object contacts it. This approach suits display screens without a touch function. For a screen that does have a touch function, a finger touching the display already generates a touch event; if the position information of the control object is then also used to touch other areas of the screen, two touch instructions are generated and a touch conflict results, so those other areas cannot be controlled. Touching them would require keeping the control object away from the screen, which makes the mapping between the control object and its on-screen position considerably more complex and greatly diminishes the user experience.
The above background disclosure is only for the purpose of assisting understanding of the inventive concept and technical solutions of the present invention, and does not necessarily belong to the prior art of the present patent application, and should not be used for evaluating the novelty and inventive step of the present application in the case that there is no clear evidence that the above content is disclosed at the filing date of the present patent application.
Disclosure of Invention
The invention aims to provide a single-hand control method and control system, so as to solve the technical problems of the prior art: single-hand control cannot be realized well, and touch conflicts easily arise when controlling a display screen that has its own touch function.
To this end, the invention provides a single-hand control method in which the display screen and a control surface used for touch operation lie on different planes. The method comprises the following steps: S1: acquiring a depth image of the control surface and of a control object on the control surface; S2: obtaining, from the depth image, a first position of the control object on the control surface; S3: locating a second position on the display screen according to the first position; S4: recognizing a predetermined control action from the shape and motion of the control object and converting it into a touch instruction to be executed; S5: executing the touch operation at the second position according to the touch instruction.
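The five steps above form a fixed pipeline; the patent specifies the flow rather than any concrete implementation, so a minimal sketch (in Python, with every stage passed in as a hypothetical callable) might look like:

```python
def one_hand_control(acquire_depth, locate_first, map_to_screen,
                     recognize_action, execute_touch):
    """Skeleton of steps S1-S5. Each stage is a caller-supplied callable,
    since the patent defines the flow, not the implementations."""
    depth = acquire_depth()                  # S1: depth image of surface + object
    first_pos = locate_first(depth)          # S2: first position on control surface
    second_pos = map_to_screen(first_pos)    # S3: second position on display screen
    instruction = recognize_action(depth)    # S4: predetermined action -> instruction
    execute_touch(instruction, second_pos)   # S5: perform touch at second position
```

Each callable here is an assumption made for illustration; the later embodiments describe concrete choices for the individual stages.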
Preferably, the control method of the present invention may further have the following technical features:
the control surface comprises at least one control area, and at least one control area is automatically defined through the acquired depth image and the acquired position of the control object on the control surface.
The automatically defined control region is sized so that it is easily reached on the control surface when the control object operates with one hand.
The display screen comprises a near touch area, which the control object can easily reach, and a far touch area, which it cannot. The control object touches the near touch area through the display screen's own touch function, and touches the far touch area through the second position obtained by positioning.
And the near touch area and the far touch area are automatically demarcated according to the position of the control object on the display screen.
The step of obtaining the depth image comprises: s11: acquiring a first depth image which comprises a control surface and does not comprise a control object; s12: acquiring a second depth image containing a control surface and a control object; s13: and obtaining a third depth image of the control object through the second depth image and the first depth image.
Acquiring the first position comprises the following steps: s21: judging whether the control object is in contact with the control surface or not according to the second depth image, and if so, executing the next step; s22: and obtaining the spatial position information of the control object in the coordinate system of the display screen according to the third depth image, and taking the vertex coordinate of the control object as the first position.
The obtaining of the second position comprises the steps of: s31: establishing a mapping relation between a control surface and the display screen; s32: and obtaining a second position of the display screen according to the mapping relation and the first position.
And establishing a linear mapping relation according to the horizontal and vertical sizes of the control surface and the display screen.
In addition, the invention provides a single-hand control system for executing the above control method, comprising an image acquisition unit, a processor, a control surface and a display screen, the control surface and the display screen lying on different planes;
the image acquisition unit is used for acquiring a control surface, a depth image of a control object and depth information of the control object;
the processor comprises an identification module, a positioning module, a conversion module and an execution module, wherein the identification module is used for acquiring a first position of a control object on a control surface according to the depth image and identifying a preset control action of the control object; the positioning module is used for positioning a second position on the display screen needing to be controlled according to the first position; the conversion module converts and generates a corresponding touch instruction according to a predefined control action; the execution module is used for executing the touch instruction at the second position to complete touch operation on the display screen.
Compared with the prior art, the invention has the advantages that:
the invention relates to a method for realizing touch operation by utilizing a depth image, in particular to a one-hand control method. The control surface used for controlling and the display screen are arranged on different planes, the control object completes touch operation on the control surface, the first position of the control object on the control surface is obtained by utilizing the depth image, the second position on the display screen is further obtained through the first position, and the touch operation is executed at the second position by combining the preset control action.
Meanwhile, for areas of the display screen that are difficult to reach, a touch on the screen in the other plane requires only a simple predetermined action on the control surface; and since the control object can stay in contact with the control surface at all times, its position on the control surface can be acquired conveniently and quickly.
Compared with the prior art, the invention can not only realize simple gesture operations such as page turning and going back on a display screen without a touch function, but also realize one-handed touch well: areas that one hand cannot reach can be touched accurately through the control surface.
In a preferred embodiment, the control surface includes at least one control area, which solves the problem of acquiring and using the depth image of the control object under different touch habits. Rather than being a fixed, preset region of the control surface, the control area is delimited automatically from the depth image and from the position of the control object on the control surface, so the control object need not be confined to a particular area to perform touch operations, which improves the control experience.
On the basis of the above control method, a display screen that itself has a touch function can be operated in a hybrid mode that uses both its own touch function and the pointing-type control method: the control object touches the near touch area with one hand using the display screen's own touch function, and touches the far touch area using the second position obtained by positioning. This hybrid control can solve the low precision caused by the high randomness of the control object's movement and gives the user a better experience. The near and far touch areas are likewise automatically delimited according to the user's touch habits, so that their shapes adapt to how the user actually operates.
A third depth image of the control object is obtained from the first and second depth images, and the first position is then obtained from the third depth image. Since the third depth image contains only the control object, the amount of computation needed to obtain the first position is reduced, the calculation is faster, and the response speed of the system improves.
By establishing a mapping relation between the control surface and the display screen, which lie on different planes, the second position on the display screen can be obtained quickly from the first position. When the control surface or control area has a fairly regular shape such as a rectangle, a linear mapping can be adopted and established quickly from the horizontal and vertical sizes of the two.
Drawings
Fig. 1 is a schematic structural diagram of a control system according to a first embodiment of the present invention;
fig. 2 is a schematic structural diagram of a processor according to a first embodiment of the present invention.
Fig. 3 is a schematic structural diagram of an electronic device according to a third embodiment of the present invention.
Fig. 4 is a schematic rear view of a single-handed mobile phone according to a third and a fourth embodiment of the present invention.
Fig. 5 is a schematic side view of a single-handed mobile phone according to a third and a fourth embodiment of the present invention.
Fig. 6 is a schematic diagram of the hybrid operation of embodiments two and five of the present invention.
Fig. 7 is a first operation flow chart of the fourth embodiment of the present invention.
Fig. 8 is a second operation flow chart of the fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings. It should be emphasized that the following description is merely exemplary in nature and is not intended to limit the scope of the invention or its application.
Non-limiting and non-exclusive embodiments will be described with reference to the following figures, wherein like reference numerals refer to like parts, unless otherwise specified.
The first embodiment is as follows:
the embodiment provides a single-hand control system, as shown in fig. 1, including an image acquisition unit 1, a processor 2, a control surface 3 and a display screen 4;
the image acquisition unit 1 is used for acquiring a depth image of a control surface 3, a control object and depth information of the control object, wherein the control surface 3 and the display screen 4 are positioned on different planes;
as shown in fig. 2, the processor 2 includes a recognition module 21, a positioning module 22, a conversion module 23, and an execution module 24, where the recognition module 21 is configured to obtain a first position of a manipulation object on the manipulation surface 3 according to the depth image, and recognize a predetermined manipulation action of the manipulation object; the positioning module 22 is configured to position a second position on the display screen 4 to be controlled according to the first position; the conversion module 23 is configured to convert a predefined control action to generate a corresponding touch instruction; the execution module 24 is configured to execute the touch instruction at the second position to complete the touch operation on the display screen 4;
this system can acquire the depth image of controlling face 3 through the image acquisition unit, also can acquire the depth image of controlling the object with display screen 4 contact simultaneously, can discern the position and the action of controlling the object at controlling face 3 according to this depth image, thereby change into corresponding position and instruction on display screen 4, so alright in order to realize controlling to equipment, in this embodiment, for example, the display screen of electronic book reader, it can influence the display effect to consider to touch control on the display screen, for example hand must shelter from partial screen when constantly turning over the page with the finger on the electronic book, thereby influence and read experience. And the problem can be reduced by using the control object to control on the control surface, and the use experience of the user is improved.
The image acquisition unit 1 is a depth camera based on the structured light or TOF principle, and generally includes a receiving unit and a transmitting unit.
Using the control surface effectively avoids the problem of touch conflict. Meanwhile, for areas of the display screen that are difficult to reach, a touch on the screen in the other plane requires only a simple predetermined action on the control surface; and since the control object can stay in contact with the control surface at all times, its position on the control surface can be acquired conveniently and quickly.
Example two:
As shown in fig. 6, the touch screen includes a near touch region ① and a far touch region ②. The control object 16 performs touch in the near touch region ① through the touch function of the touch screen itself, while touch operations in the far touch region ② are performed by pointing of the control object 16 on the control surface.
Specifically, the near touch region ① can be reached by one hand while the far touch region ② cannot. Touch control is still used in the near touch region ① to ensure better accuracy, while in the far touch region ② control is performed through the first position on the control surface 3 together with the shape and motion of the control object.
Example three:
This embodiment provides an electronic device, which may be a mobile phone or a tablet, including, as shown in fig. 3, a bus 5 for data transmission. The bus 5 is divided into a data bus, an address bus and a control bus, which transmit data, data addresses and control signals respectively. Connected to the bus 5 are a CPU 6, a display 7, an IMU 8 (inertial measurement unit), a memory 9, a camera 11, a sound unit 12, a network interface 13, and a mouse/keyboard 14. For touch-enabled electronic devices the mouse/keyboard is replaced by the display 7, and the IMU is used for positioning, tracking and similar functions. The memory 9 stores the operating system, application programs and the like, and may also store temporary data during operation.
In the control system arranged on the electronic device, the image acquisition unit 1 corresponds to the camera 11; the processor 2 corresponds to the CPU 6, or may be a separate processor; the display screen 4 corresponds to the display 7; and the control surface is arranged on the back of the mobile phone, on a different plane from the display screen.
As shown in fig. 4, the camera 11 is generally fixed to the electronic apparatus. In this embodiment its imaging direction differs greatly from that of cameras on existing devices, which are typically front- or rear-facing and therefore cannot capture images of the device itself. The camera 11 can be configured in several ways: one is to rotate the camera by 90 degrees about a rotation shaft so that it can image the back of the device; another is an external camera connected to the device as a whole through some fixing measure and an interface such as USB. Those skilled in the art may choose the arrangement of the camera on the electronic device according to the actual situation, without limitation.
At this time, the camera 11 of the present embodiment is a depth camera for acquiring a depth image of the target area, unlike a conventional camera.
Fig. 4 is a schematic rear view of the mobile phone operated by a single hand. The camera 11 is disposed on the top of the mobile phone, and the direction of the camera 11 is along the top-down direction of the mobile phone, so that images of the operation surface and the fingers (operation object) on the back of the mobile phone can be obtained, as shown in fig. 5, a schematic side view of the mobile phone operated by one hand is shown, 17 is a first position on the operation surface, and 16 is the operation object.
The display 7 of the electronic device may be touch-enabled or touch-disabled, and when the display is not touch-enabled, the display can be controlled only through the control surface.
The camera 11 in this embodiment may also be a common RGB camera for shooting RGB images; the camera 11 may also be an infrared camera for shooting infrared images; it may also be a depth camera, such as a depth camera based on the structured light principle or on the TOF principle, etc.
The depth image acquired by the depth camera is not affected by dark light, measurement can be performed even in the dark, and in addition, positioning and motion recognition using the depth image are more accurate than RGB images. Thus, in the following description, a depth camera and a depth image will be described. The present invention should not be limited to depth cameras.
Example four:
a single-handed manipulation method, as shown in fig. 4-5 and 7, comprising the steps of:
s1: acquiring a depth image of a control surface 3 and a control object 16 on the control surface 3;
s2: acquiring a first position 17 of the control object 16 on the control surface 3 from the depth image;
s3: positioning a second position on the display screen 4 as a function of a first position 17 of the control object 16 on the control surface 3;
s4: according to the preset control action determined by the shape and the action of the control object 16, recognizing and converting the preset control action into a touch instruction to be executed;
s5: and executing touch operation at a second position according to the touch instruction.
The control surface 3 comprises at least one control region 15, and at least one control region 15 is automatically delimited from the acquired depth image and the acquired position of the control object 16 on the control surface 3. The automatically delimited control region 15 is sized so that it is easily reached on the control surface 3 when the control object 16 operates with one hand. During automatic delimitation, the region may be determined from the contact points made when the control object touches the control surface 3: all points contacted during the current operation are taken together as the control region 15.
The control surface 3 comprises at least one control area 15 so as to handle the different touch habits of the control object 16, such as left- and right-handed use. Rather than being a fixed, preset region of the control surface 3, the control area 15 is delimited automatically from the depth image and the position of the control object 16 on the control surface 3, so the control object 16 need not be confined to a particular area to perform touch operations, which improves the control experience. In particular, the control area 15 is sized so that it is easily reached on the control surface 3 during one-handed operation. To optimize its shape, the area is not limited to a rectangle but can be determined from the shape of the region of the control surface 3 most easily touched by the control object 16, which is generally an irregular sector.
The acquired depth image contains the control surface 3, the fingers and other irrelevant parts. The measurement range of the depth camera is therefore limited: a threshold is set and depth information beyond the threshold is discarded. To further increase the running speed of the system and reduce the amount of computation, an image segmentation method is applied to the depth image. One such method comprises the following steps:
s11: acquiring a first depth image comprising the manipulation surface 3 and not comprising the manipulated object 16;
s12: acquiring a second depth image containing the control surface 3 and the control object 16;
s13: a third depth image of the manipulated object 16 is obtained from the second depth image and the first depth image.
The control object 16 in this embodiment is a finger. The depth image of the front end of the finger, obtained by the background segmentation of the above steps, reduces the amount of calculation during modeling and increases the calculation speed.
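Steps S11–S13 amount to background subtraction between two depth frames. A minimal sketch with NumPy, in which the array shapes, the depth units and the noise threshold are all assumptions, could be:

```python
import numpy as np

def segment_object(first, second, noise_thresh=10.0):
    """S11-S13: subtract the surface-only frame (first) from the frame with
    the control object (second) to get a third image of the object alone.
    Depth values are assumed to be in millimetres; noise_thresh is a guess."""
    diff = first.astype(np.float32) - second.astype(np.float32)
    # The object sits in front of the surface, so its depth readings are
    # smaller than the background's; small differences are sensor noise.
    mask = diff > noise_thresh
    return np.where(mask, second, 0.0)  # keep object pixels, zero the rest
```

Because the third image is zero everywhere except at the control object, the later positioning steps only have to scan a small set of non-zero pixels.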
As shown in fig. 8, acquiring the first position 17 comprises the following steps:
s21: judging whether the control object 16 is in contact with the control surface 3 or not according to the second depth image, and if so, executing the next step;
s22: and obtaining the spatial position information of the manipulated object 16 in the coordinate system of the display screen 4 according to the third depth image, and taking the vertex coordinate of the manipulated object 16 as the first position 17.
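S21 and S22 can be sketched as follows. Treating "contact" as the object's depth lying within a small tolerance of the surface depth, and the "vertex" as the topmost contact pixel, are both assumptions made here for illustration:

```python
import numpy as np

def first_position(first, third, contact_tol=5.0):
    """S21: decide whether the object touches the surface; S22: if so,
    return the object's vertex pixel as the first position.
    first: surface-only depth image; third: object-only depth image."""
    obj = third > 0
    if not obj.any():
        return None                      # no control object in view
    # Contact: object depth within contact_tol of the surface behind it.
    close = obj & (np.abs(first.astype(np.float32)
                          - third.astype(np.float32)) < contact_tol)
    if not close.any():
        return None                      # object present but not touching
    rows, cols = np.nonzero(close)
    i = int(np.argmin(rows))             # vertex ~ topmost contact pixel
    return (int(rows[i]), int(cols[i]))
```

A fuller implementation would convert this pixel position into the display screen's coordinate system, as the patent describes; the tolerance value here is a tuning assumption.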
The obtaining of the second position comprises the steps of:
s31: establishing a mapping relation between the control surface 3 and the display screen 4;
s32: and obtaining the second position of the display screen 4 according to the mapping relation and the first position 17.
And establishing a linear mapping relation according to the transverse and longitudinal sizes of the control surface 3 and the display screen 4.
By establishing a mapping relation between the control surface 3 and the display screen 4, which lie on different planes, the second position on the display screen 4 can be obtained quickly from the first position 17. When the control surface 3 or the control area 15 has a fairly regular shape such as a rectangle, a linear mapping can be adopted and established quickly from the horizontal and vertical dimensions.
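For a rectangular control surface, the linear mapping of S31/S32 reduces to scaling each coordinate by the ratio of the horizontal and vertical sizes. A sketch under that assumption (units and sizes are illustrative):

```python
def map_to_screen(first_pos, surface_size, screen_size):
    """S31/S32: linear mapping from a first position on the control surface
    to a second position on the display screen, assuming both are
    rectangles measured in the same units."""
    x, y = first_pos
    sw, sh = surface_size    # control surface width, height
    dw, dh = screen_size     # display screen width, height
    return (x * dw / sw, y * dh / sh)
```

For an irregular (e.g. sector-shaped) control area, a non-linear or piecewise mapping would be needed instead; the patent only details the linear case.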
In prior touch technology, positioning and touching happen simultaneously; on the back of the device, which the user cannot see, this is not feasible. The operation therefore has to be performed after positioning. The control action may be a change in the shape of a finger, such as a change in the angle between the finger and the display screen 4, or a finger motion such as a click. Recognition of finger shape and motion is known in the art and is not described in detail here.
Operations that do not require positioning, such as page turning and going back, are of course not excluded; for these it may be sufficient to recognize only the shape or motion of the finger.
In this embodiment, a click is completed when the finger touches the back surface and the inclination angle of the finger decreases. The operation command corresponding to each finger shape and motion must be preset; when a given shape and motion are recognized, the processor converts them into the corresponding control instruction.
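The click described here, finger in contact with the back surface while its inclination angle decreases, can be expressed as a small state check. The 10-degree minimum drop below is an assumed tuning value, not taken from the patent:

```python
def to_instruction(prev_angle, angle, touching, min_drop=10.0):
    """Convert a finger-shape change into a touch instruction: a click is
    completed when the finger touches the back surface and its inclination
    angle decreases by at least min_drop degrees (threshold is assumed)."""
    if touching and (prev_angle - angle) >= min_drop:
        return "click"
    return None  # no predefined control action recognized
```

Other predefined actions (swipes for page turning, going back, and so on) would extend this with their own shape/motion tests, each mapped to a preset command.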
Example five:
For one-handed operation of phones whose screens already have a touch function, a better experience is to keep touch operation in the area one hand can reach and use non-contact operation in the area it cannot. The display screen 4 of this embodiment is therefore a touch screen with a touch function. As shown in fig. 6, the display screen 4 includes a near touch area ①, easily reached by the control object 16, and a far touch area comprising everything outside it. The control object 16 performs one-handed touch in the near touch area using the display screen 4's own touch function, and touches the far touch area through the second position obtained by positioning.
Hand sizes and left- or right-hand habits differ between users, so the near-touch area ① and the far-touch area ② can be automatically recognized and demarcated by the system: when the finger on the display screen 4 side is in contact with the touch display screen 4, the processor processes the signal fed back by the touch display screen 4 and executes the corresponding touch instruction; when the finger on the display screen 4 side is not in contact with the touch display screen 4, the depth camera recognizes the non-contact action and feeds it back to the processor, which processes the depth image acquired by the depth camera and performs the touch operation through the second position.
On the basis of the above manipulation method, a display screen 4 that itself has a touch function can thus be operated in a hybrid mode combining its own touch function with the pointing-type manipulation method: the manipulation object 16 performs one-hand touch in the near-touch area using the touch function of the display screen 4 itself, and performs touch in the far-touch area through the second position obtained by positioning. This hybrid method mitigates the low precision caused by the highly random movement of the manipulation object 16 and provides a better user experience. The near-touch area and the far-touch area are also demarcated automatically according to the user's touch habits, so that their shapes adapt to the user's manner of operation.
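The dispatch between the two input paths could be sketched as follows (class and function names are illustrative assumptions; `map_fn` stands in for the first-to-second-position mapping between the control surface and the display screen):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def dispatch(front_touch, near_area, rear_first_pos, map_fn):
    """Route a one-hand input: a direct touch inside the near-touch area
    uses the touch screen itself; otherwise the first position on the
    rear control surface is mapped to the second position on screen."""
    if front_touch is not None and near_area.contains(*front_touch):
        return ("native-touch", front_touch)
    return ("rear-mapped", map_fn(rear_first_pos))
```

In a real system the near-touch area would be learned from the user's touch history rather than fixed, matching the automatic demarcation the embodiment describes.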
Those skilled in the art will recognize that numerous variations are possible in light of the above description; the examples are therefore intended to illustrate one or more specific embodiments.
While there has been described and illustrated what are considered to be example embodiments of the present invention, it will be understood by those skilled in the art that various changes and substitutions may be made therein without departing from the spirit of the invention. In addition, many modifications may be made to adapt a particular situation to the teachings of the present invention without departing from the central concept described herein. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed, but that the invention will include all embodiments and equivalents falling within the scope of the invention.

Claims (8)

1. A single-hand hybrid control method, characterized in that a display screen of an electronic device and a control surface for touch operation are arranged on the front side and the back side of the electronic device, respectively; the electronic device is provided with a depth camera for capturing depth images; the display screen comprises a near-touch area that is easy for a finger to touch and, outside the near-touch area, a far-touch area that is not easy to touch; the finger performs single-hand touch in the near-touch area using the touch function of the display screen, and touch in the far-touch area is performed through a second position obtained by positioning; the near-touch area and the far-touch area are automatically identified and demarcated by the system; when the finger on the display screen side is in contact with the touch display screen, a processor processes the signal fed back by the touch display screen and executes a corresponding touch instruction; and when the finger on the display screen side is not in contact with the touch display screen, the depth camera recognizes the action of the finger on the control surface side and feeds it back to the processor, which processes the depth image acquired by the depth camera and performs the touch operation through the second position; the method comprising the following steps:
s1: acquiring a control surface and a depth image of a finger on the control surface, wherein the control surface comprises at least one control area, and at least one control area is automatically defined by the acquired depth image and the acquired position of the finger on the control surface;
s2: acquiring a first position of a finger on the control surface through the depth image;
s3: positioning a second position on the display screen according to the first position of the finger on the control surface;
s4: according to the shape and the predetermined control action determined by the action of the finger, identifying and converting the finger into a touch instruction to be executed;
s5: and executing touch operation at a second position according to the touch instruction.
2. The hybrid manipulation method of claim 1, wherein the automatically demarcated manipulation region is sized to be easily accessible on the manipulation surface when operated with a single finger.
3. The hybrid manipulation method of claim 1, wherein the near-touch region and the far-touch region are automatically demarcated according to the position of the finger on the display screen.
4. The hybrid steering method according to claim 1, wherein the depth image obtaining step includes:
s11: acquiring a first depth image which comprises a control surface and does not contain fingers;
s12: acquiring a second depth image containing the control surface and the finger;
s13: and obtaining a third depth image of the finger through the second depth image and the first depth image.
5. The hybrid steering method of claim 4, wherein obtaining the first location comprises:
s21: judging whether the finger is in contact with the control surface or not according to the second depth image, and if so, executing the next step;
s22: and obtaining the space position information of the finger in the coordinate system where the display screen is located according to the third depth image, and taking the vertex coordinate of the finger as a first position.
6. The hybrid steering method according to claim 1, wherein the acquisition of the second position comprises the steps of:
s31: establishing a mapping relation between a control surface and the display screen;
s32: and obtaining a second position of the display screen according to the mapping relation and the first position.
7. The hybrid manipulation method of claim 6, wherein a linear mapping relationship is established according to the lateral and longitudinal dimensions of the manipulation surface and the display screen.
8. A single-hand hybrid control system, characterized by performing the hybrid manipulation method of any one of claims 1 to 7 on an electronic device comprising an image acquisition unit, a processor, a control surface, and a display screen, the display screen and the control surface for touch operation being arranged on the front side and the back side of the electronic device, respectively;
the image acquisition unit is used for acquiring a control surface, a depth image of the finger and depth information of the finger; wherein the manipulation surface comprises at least one manipulation area, and at least one manipulation area is automatically defined by the acquired depth image and the acquired position of the finger on the manipulation surface;
the processor comprises a recognition module, a positioning module, a conversion module and an execution module, wherein the recognition module is used for acquiring a first position of a finger on the control surface according to the depth image and recognizing a preset control action of the finger; the positioning module is used for positioning a second position on the display screen needing to be controlled according to the first position; the conversion module converts and generates a corresponding touch instruction according to a predefined control action; the execution module is used for executing a touch instruction at the second position to complete touch operation on the display screen;
the display screen comprises a near touch area and a far touch area, the near touch area is used for being easily touched by fingers, the far touch area is not easily touched by the fingers except the near touch area, the fingers perform single-hand touch in the near touch area by using the touch function of the display screen, the far touch area performs touch in a second position obtained by positioning, the near touch area and the far touch area are automatically identified and defined by a system, when the fingers on the side of the display screen are in contact with the touch display screen, a processor processes signals fed back by the touch display screen and executes corresponding touch instructions, and when the fingers on the side of the display screen are not in contact with the touch display screen, the signals are fed back to the processor after the depth camera identifies the actions of the fingers on the side of an operation surface which is not in contact with the touch display screen, the processor processes depth images acquired by the depth camera, and performs touch operation through the second position.
CN201610942444.6A 2016-10-25 2016-10-25 Single-hand control method and control system Active CN106569716B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610942444.6A CN106569716B (en) 2016-10-25 2016-10-25 Single-hand control method and control system
PCT/CN2017/089027 WO2018076720A1 (en) 2016-10-25 2017-06-19 One-hand operation method and control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610942444.6A CN106569716B (en) 2016-10-25 2016-10-25 Single-hand control method and control system

Publications (2)

Publication Number Publication Date
CN106569716A CN106569716A (en) 2017-04-19
CN106569716B true CN106569716B (en) 2020-07-24

Family

ID=58536395

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610942444.6A Active CN106569716B (en) 2016-10-25 2016-10-25 Single-hand control method and control system

Country Status (2)

Country Link
CN (1) CN106569716B (en)
WO (1) WO2018076720A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106569716B (en) * 2016-10-25 2020-07-24 深圳奥比中光科技有限公司 Single-hand control method and control system
CN107613094A (en) * 2017-08-17 2018-01-19 珠海格力电器股份有限公司 A kind of method and mobile terminal of one-handed performance mobile terminal
WO2020042157A1 (en) * 2018-08-31 2020-03-05 深圳市柔宇科技有限公司 Input control method and electronic device
CN117501228A (en) * 2022-05-18 2024-02-02 北京小米移动软件有限公司 Control method, device, equipment and storage medium for switching single-hand mode

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9423876B2 (en) * 2011-09-30 2016-08-23 Microsoft Technology Licensing, Llc Omni-spatial gesture input
US10216286B2 (en) * 2012-03-06 2019-02-26 Todd E. Chornenky On-screen diagonal keyboard
CN102789568B (en) * 2012-07-13 2015-03-25 浙江捷尚视觉科技股份有限公司 Gesture identification method based on depth information
CN102937822A (en) * 2012-12-06 2013-02-20 广州视声电子科技有限公司 Reverse side controlling structure and method of mobile equipment
CN103176605A (en) * 2013-03-27 2013-06-26 刘仁俊 Control device of gesture recognition and control method of gesture recognition
CN103440033B (en) * 2013-08-19 2016-12-28 中国科学院深圳先进技术研究院 A kind of method and apparatus realizing man-machine interaction based on free-hand and monocular cam
CN103777701A (en) * 2014-01-23 2014-05-07 深圳市国华光电研究所 Large-screen touch screen electronic equipment
CN104331182B (en) * 2014-03-06 2017-08-25 广州三星通信技术研究有限公司 Portable terminal with auxiliary touch-screen
CN104750188A (en) * 2015-03-26 2015-07-01 小米科技有限责任公司 Mobile terminal
CN105824553A (en) * 2015-08-31 2016-08-03 维沃移动通信有限公司 Touch method and mobile terminal
CN106569716B (en) * 2016-10-25 2020-07-24 深圳奥比中光科技有限公司 Single-hand control method and control system

Also Published As

Publication number Publication date
WO2018076720A1 (en) 2018-05-03
CN106569716A (en) 2017-04-19

Similar Documents

Publication Publication Date Title
US10043308B2 (en) Image processing method and apparatus for three-dimensional reconstruction
CN106502570B (en) Gesture recognition method and device and vehicle-mounted system
US9069386B2 (en) Gesture recognition device, method, program, and computer-readable medium upon which program is stored
US9389779B2 (en) Depth-based user interface gesture control
EP2790089A1 (en) Portable device and method for providing non-contact interface
CN106569716B (en) Single-hand control method and control system
US20150261373A1 (en) Determining User Handedness and Orientation Using a Touchscreen Device
EP2846308A2 (en) Pointing direction detecting device and its method, program and computer readable-medium
US20130120250A1 (en) Gesture recognition system and method
CN106598422B (en) hybrid control method, control system and electronic equipment
CN110489027B (en) Handheld input device and display position control method and device of indication icon of handheld input device
US20150169134A1 (en) Methods circuits apparatuses systems and associated computer executable code for providing projection based human machine interfaces
CN114138121B (en) User gesture recognition method, device and system, storage medium and computing equipment
US9525906B2 (en) Display device and method of controlling the display device
WO2021004413A1 (en) Handheld input device and blanking control method and apparatus for indication icon of handheld input device
JP2023527906A (en) Control method, device, terminal and storage medium
US20150205360A1 (en) Table top gestures for mimicking mouse control
US9870061B2 (en) Input apparatus, input method and computer-executable program
CN104169858A (en) Method and device of using terminal device to identify user gestures
CN109358755B (en) Gesture detection method and device for mobile terminal and mobile terminal
JP2022525326A (en) Methods to assist object control using 2D cameras, systems and non-transient computer-readable recording media
WO2019100547A1 (en) Projection control method, apparatus, projection interaction system, and storage medium
TW201419087A (en) Micro-somatic detection module and micro-somatic detection method
JP6555958B2 (en) Information processing apparatus, control method therefor, program, and storage medium
CN105528059B (en) A kind of gesture operation in three-dimensional space method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant