CN106598422B - hybrid control method, control system and electronic equipment - Google Patents


Info

Publication number
CN106598422B
CN106598422B (application CN201610956745.4A)
Authority
CN
China
Prior art keywords
touch
display screen
control
pointing
depth image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610956745.4A
Other languages
Chinese (zh)
Other versions
CN106598422A (en
Inventor
黄源浩
刘龙
肖振中
许星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Orbbec Co Ltd
Original Assignee
Shenzhen Orbbec Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Orbbec Co Ltd filed Critical Shenzhen Orbbec Co Ltd
Priority to CN201610956745.4A priority Critical patent/CN106598422B/en
Publication of CN106598422A publication Critical patent/CN106598422A/en
Application granted granted Critical
Publication of CN106598422B publication Critical patent/CN106598422B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a pointing-based control method, a hybrid control method, a control system and an electronic device. The pointing-based control method comprises the following steps: S1: acquiring a depth image of a display screen and a control object; S2: obtaining, from the depth image, the position on the display screen pointed at by the control object, yielding a pointing point on the display screen; S3: according to the indication of the pointing point, recognizing and generating a touch instruction in combination with a preset control action; S4: executing a touch operation at the position of the pointing point according to the touch instruction. Through this pointing-based control method, the invention enables effective one-handed control of large-screen electronic devices and accurate touch control of on-screen objects.

Description

Hybrid control method, control system and electronic equipment
Technical Field
The present invention relates to the field of electronic technologies, and in particular to a pointing-based control method, a hybrid control method, a control system thereof, and an electronic device.
Background
With the popularization of the mobile internet and users' ever-higher expectations of smart devices such as mobile phones, functions such as communication, internet access and video playback have become the most basic configuration of current electronic devices. Devices such as mobile phones have almost entirely entered the large-screen era, and a single hand can no longer reach the whole screen; nevertheless, users still expect to operate such devices with one hand.
Taking a mobile phone as an example, adding a function key on the back of the phone is one solution to one-handed operation, but it inevitably spoils the appearance of the back of the phone, so this solution has not been accepted by users. Another scheme adds an extra touch screen on the back of the phone, using finger input on the back to control the region of the front screen that one hand cannot reach. However, this solution is costly and cannot become the mainstream one-handed solution.
Meanwhile, in current schemes that use a depth image for touch operation, the position of the control object on the display screen is mapped directly from the coordinates of the corresponding pixels in the acquired depth image; that is, the position is obtained purely from the object's pixel coordinates in the depth image. Such direct pixel-coordinate mapping cannot perform touch operations on regions the control object cannot reach, so it can only realize a few simple functions such as page turning; it neither solves the problem of one-handed operation of a large screen nor achieves accurate touch of an object on the display screen.
The above background disclosure is intended only to assist understanding of the inventive concept and technical solutions of the present invention. It does not necessarily belong to the prior art of the present patent application, and in the absence of clear evidence that the above content was disclosed before the filing date of the present application, it should not be used to evaluate the novelty and inventive step of the present application.
Disclosure of Invention
The invention aims to provide a pointing-based control method, a hybrid control method, a control system and an electronic device, so as to solve the technical problem that one-handed control and accurate touch control cannot be realized well in the prior art.
To this end, the invention provides a pointing-based control method comprising the following steps: S1: acquiring a depth image of a display screen and a control object; S2: obtaining, from the depth image, the position on the display screen pointed at by the control object, yielding a pointing point on the display screen; S3: according to the indication of the pointing point, recognizing and generating a touch instruction in combination with a preset control action; S4: executing a touch operation at the position of the pointing point according to the touch instruction.
Preferably, the control method of the present invention may further have the following technical features:
The acquisition of the pointing point in step S2 comprises: extracting, from the depth image, feature points that can characterize the control object, and constructing from those feature points a pointing straight line characterizing the control object, so as to obtain the pointing point on the display screen.
The pointing point is the intersection of the pointing straight line with the display screen, obtained as follows: S21: acquiring the depth information of the feature points; S22: converting the depth information of the feature points into spatial position information of the feature points in the coordinate system of the display screen, and obtaining from this spatial position information the pointing straight line characterizing the control object; S23: computing the intersection of the pointing straight line with the display screen to obtain the pointing point on the display screen, and displaying the pointing point on the display screen.
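The conversion in step S22 amounts to standard pinhole-camera back-projection: a pixel plus its depth value is lifted to a 3D point before being transformed into the display screen's coordinate system. The following is a minimal sketch, not part of the patent; the function name and intrinsic parameter values are illustrative assumptions:

```python
import numpy as np

def deproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with a depth value into a 3D point
    in the depth camera's frame (pinhole model).

    fx, fy, cx, cy are the depth camera's intrinsics; the values used
    below are illustrative, not from the patent.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Illustrative intrinsics for a 640x480 depth camera, depth in mm
p = deproject(400, 300, depth=250.0, fx=570.0, fy=570.0, cx=320.0, cy=240.0)
```

The resulting camera-frame point would still need a rigid transform (from the camera's extrinsic calibration against the screen) to land in the display screen's coordinate system.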
When the control object is in contact with the display screen, the pointing point is a feature point of the control object itself.
The depth image acquired in step S1 is obtained as follows: S11: acquiring a first depth image that contains the display screen but not the control object; S12: acquiring a second depth image containing both the display screen and the control object; S13: obtaining a third depth image, of the control object alone, from the second and first depth images, and obtaining the pointing point from the third depth image.
Meanwhile, the invention also provides a hybrid control method in which the display screen is a touch screen with a touch function. The touch screen comprises a touch area and a pointing area: in the touch area, the control object completes touch using the touch function of the touch screen itself; in the pointing area, touch operations are completed through the pointing point according to the pointing-based control method above.
In addition, the invention provides a control system for executing the above touch method, comprising an image acquisition unit, a processor and a display screen;
The image acquisition unit is used to acquire a depth image of the display screen and the control object, together with the depth information of the control object;
The processor comprises a recognition module, a conversion module and an execution module. The recognition module obtains the position on the display screen pointed at by the control object according to the depth image and recognizes the preset control action of the control object; the conversion module converts the recognized control action into the corresponding touch instruction according to the predefined mapping; the execution module executes the touch instruction at the position of the pointing point to complete the touch operation;
The display screen is used to display the pointing point and other information.
Preferably, the hybrid control method of the present invention may also have the following technical features:
The display screen is a touch screen with a touch function; the control object completes touch in the touch area using the touch function of the touch screen itself, and completes touch operations through the pointing point in the pointing area.
The image acquisition unit of the control system is a depth camera based on the structured-light or TOF (time-of-flight) principle.
The invention further provides an electronic device comprising the above control system connected to its bus; control operations are completed through the control system.
Compared with the prior art, the invention has the following advantages. The invention realizes touch operation using a depth image, through a pointing-based control method: the position on the display screen pointed at by the control object is obtained from the depth image, yielding a pointing point on the display screen, and the user executes a touch operation at the position of the pointing point by combining the indication of the pointing point with a preset control action. Thus, when operating a large-screen electronic device, if the user needs to touch an object that is hard to reach, the user can simply adjust the position and attitude of the finger over the display screen, such as its inclination and its height above the screen, and point the finger at the object to obtain a pointing point on the display screen. Guided by the indication of the pointing point, the user knows exactly where the finger is pointing and can accurately execute the corresponding touch operation. Compared with the prior art, the invention can realize simple gesture operations such as page turning and backing up on a display screen without a touch function, realizes one-handed touch well, performs corresponding touch operations in a pointing manner on objects that one hand cannot reach, and achieves accurate touch.
In a preferred scheme, the third depth image of the control object is obtained from the first and second depth images, and the pointing point is then obtained from the third depth image. Since the third depth image contains only the depth image of the control object, the amount of computation required to obtain the pointing point is reduced, increasing the operation speed and the response speed of the system.
In the hybrid control method provided by the invention, for a display screen with a touch function, touch control and pointing-based control are combined on the basis of the depth-image pointing method. This hybrid control method alleviates the low precision caused by the high randomness of the control object's movement and offers the user a better experience.
Drawings
Fig. 1 is a schematic structural diagram of a control system according to a first embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a processor according to a first embodiment of the present invention.
Fig. 3 is a schematic structural diagram of an electronic device according to a third embodiment of the present invention.
Fig. 4 is a schematic front view of a mobile phone operated with a single hand according to the third and fourth embodiments of the present invention.
Fig. 5 is a schematic side view of a mobile phone operated with a single hand according to the third and fourth embodiments of the present invention.
Fig. 6 is a schematic diagram of hybrid control according to the second and fifth embodiments of the present invention.
Fig. 7 is a first operation flow chart of the fourth embodiment of the present invention.
Fig. 8 is a second operation flow chart of the fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings. It should be emphasized that the following description is merely exemplary in nature and is not intended to limit the scope of the invention or its application.
Non-limiting and non-exclusive embodiments will be described with reference to the following figures, wherein like reference numerals refer to like parts, unless otherwise specified.
Embodiment 1:
This embodiment proposes a control system, as shown in Fig. 1, comprising an image acquisition unit 1, a processor 2 and a display screen 3;
The image acquisition unit 1 is used for acquiring a depth image of the display screen 3 and a control object and depth information of the control object;
As shown in fig. 2, the processor 2 includes a recognition module 21, a conversion module 22 and an execution module 23, where the recognition module 21 is configured to obtain a position on the display screen 3 pointed by a manipulation object according to the depth image, and recognize a predetermined manipulation action of the manipulation object; the conversion module 22 is configured to convert a predefined control action to generate a corresponding touch instruction; the execution module 23 is configured to execute a touch instruction at the position of the pointing point to complete a touch operation;
The display screen 3 is used to display the pointing point and other information.
Through the image acquisition unit, the system can acquire a depth image of the display screen as well as of a control object that is not in contact with the display screen. From the depth image, the position on the display screen pointed at by the control object and the object's action can be recognized and converted into the corresponding position and instruction on the display screen, thereby realizing control of the device. Touch control on the display screen itself can impair the display: for example, when continuously turning pages of an e-book with a finger, the hand inevitably blocks part of the screen and degrades the reading experience. Pointing-based control with a control object reduces this problem and improves the user experience.
The image acquisition unit is a depth camera based on structured light or TOF principle, and generally comprises a receiving unit and a transmitting unit.
Embodiment 2:
Compared with the first embodiment, the hybrid control system provided in this embodiment differs in that the display screen of the first embodiment is replaced by a touch screen with a touch function. As shown in Fig. 6, the touch screen comprises a touch area I and a pointing area II; the control object 14 performs touch control in the touch area I using the touch function of the touch screen itself, and performs touch operations through the pointing point 17 in the pointing area II.
Specifically: the screen is divided into a touch area that a single hand can reach and a pointing area that it cannot. Touch control is still used in the touch area to ensure better accuracy, while the pointing area is controlled according to the pointing direction and action of the finger.
Embodiment 3:
This embodiment provides an electronic device, which may be a mobile phone or a tablet, comprising a bus 4 for data transmission. As shown in Fig. 3, the bus 4 is divided into a data bus, an address bus and a control bus, used to transmit data, data addresses and control signals respectively. Connected to the bus 4 are a CPU 5, a display 6, an IMU 7 (inertial measurement unit), a memory 8, a camera 10, a sound unit 11, a network interface 12, and a mouse/keyboard 13. For touch-enabled electronic devices, the mouse/keyboard is replaced by the display 6; the IMU is used for positioning, tracking and similar functions. The memory 8 stores the operating system, application programs 9 and the like, and may also store temporary data generated during operation.
In the control system deployed on the electronic device, the image acquisition unit corresponds to the camera, the processor corresponds to the CPU (or may be an independent processor), and the display screen corresponds to the display.
As shown in Fig. 4, the camera 10 is generally fixed to the electronic apparatus. In this embodiment, however, the imaging direction differs greatly from that of existing devices: existing cameras are typically front- or rear-facing, a configuration that cannot acquire images of the device itself. In this embodiment the camera 10 can be configured in several ways. One is to rotate the camera 90 degrees about a rotation shaft so that it can image the device's own screen; another is an external camera connected to the device as a whole through some fixing arrangement and an interface such as USB. The form in which the camera is arranged on the electronic device can be chosen by a person skilled in the art according to the actual situation, without limitation.
Unlike a conventional camera, the camera of this embodiment is a depth camera, used to acquire a depth image of the target area.
The depth camera may be a depth camera based on the structured light principle or the TOF principle and generally comprises a receiving unit and a transmitting unit.
Fig. 4 is a front view of the mobile phone operated with a single hand. A camera 10 is arranged at the top of the phone and captures images downward along the phone, so that images of the phone's display screen and of the finger can be obtained; Fig. 5 is the corresponding schematic side view.
The display 6 of the electronic device may or may not have a touch function. When it does not, the device can be controlled only by the pointing-based control method.
Embodiment 4:
A pointing-based control method, as shown in Figs. 4-5, applied to the control of a display screen without a touch function, comprises, as shown in Fig. 7, the following steps:
S1: acquiring a depth image of the display screen 3 and the control object 14;
S2: obtaining, from the depth image, the position on the display screen 3 pointed at by the control object 14, yielding a pointing point 17 on the display screen;
S3: according to the indication of the pointing point 17, recognizing and generating a touch instruction in combination with a preset control action;
S4: executing a touch operation, such as clicking an icon, at the position of the pointing point 17 according to the touch instruction.
The acquisition of the pointing point 17 in step S2 comprises: extracting, from the depth image, a feature point 15 that can characterize the control object, and constructing from the feature point 15 a pointing straight line 16 characterizing the control object, so as to obtain the pointing point 17 on the display screen.
Specifically, as shown in Fig. 5, the pointing point 17 is the intersection of the pointing straight line 16 with the display screen 3; as shown in Fig. 8, it is obtained by the following steps:
S21: extracting the feature points 15 that characterize the control object 14 from the depth image, and acquiring the depth information of the feature points 15;
S22: converting the depth information of the feature points 15 into spatial position information of the feature points 15 in the coordinate system of the display screen 3, and obtaining from this information the pointing straight line 16 characterizing the control object;
S23: computing the intersection of the pointing straight line 16 with the display screen 3 to obtain the pointing point 17, and displaying the pointing point 17 on the display screen 3.
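Steps S21-S23 reduce to a line-plane intersection. The following is an illustrative sketch, not the patent's implementation: it assumes two finger feature points already expressed in the display screen's coordinate system, and intersects the line through them with the screen plane. All names and values are hypothetical:

```python
import numpy as np

def pointing_point(p_joint, p_tip, screen_origin, screen_normal):
    """Intersect the pointing line through two finger feature points
    with the display-screen plane (all coordinates in the screen frame).

    Returns the intersection point, or None when the finger points
    parallel to or away from the screen plane.
    """
    d = p_tip - p_joint                      # pointing direction
    denom = np.dot(screen_normal, d)
    if abs(denom) < 1e-9:                    # line parallel to the screen
        return None
    t = np.dot(screen_normal, screen_origin - p_joint) / denom
    if t < 0:                                # pointing away from the screen
        return None
    return p_joint + t * d

# Screen plane z = 0; finger joint at z = 40 mm, fingertip at z = 30 mm
joint = np.array([10.0, 20.0, 40.0])
tip = np.array([12.0, 22.0, 30.0])
hit = pointing_point(joint, tip, np.zeros(3), np.array([0.0, 0.0, 1.0]))
# hit lies on the screen plane (z = 0)
```

The `None` branches matter in practice: when the finger points off-screen, no pointing point should be displayed.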
In the above steps, when the control object 14 is in contact with the display screen 3, the pointing point 17 coincides with the feature point of the control object 14, and the contact may be judged to be a touch operation. Alternatively, contact with the screen may be treated as an invalid operation, so that the position of the control object 14 on the display screen 3 is always computed accurately from the depth image.
When obtaining the feature points 15, skeleton modeling or other models may be used to model the control object 14 and complete the construction of the pointing straight line; for a finger, the finger's joint points and fingertip are generally used as the feature points 15.
In the prior art, positioning and touch are completed simultaneously. In pointing-based non-contact control, positioning and operation can likewise be completed simultaneously (to "touch", the finger is lifted and pointed at the object to be touched): as soon as the position pointed at by the finger is determined, a default touch instruction similar to select-and-click is executed immediately, and the shape and action of the finger need not be further recognized. Alternatively, the operation can be performed step by step: the position is calculated first, the finger then performs a corresponding action, such as a change of finger shape or a click action along the pointing direction, and the operation instruction corresponding to the action is executed once the action is recognized. Recognition of finger shape and motion is known in the art and is not described in detail here.
Operations that do not require positioning, such as page turning and backing up, are of course not excluded; for such operations it may be sufficient to recognize only the shape or motion of the finger.
In this embodiment, pushing the finger forward and back along its pointing direction is taken to complete a single click; other shapes and gestures are also possible. The touch instruction corresponding to each finger shape and action must be preset; when a given shape and action are recognized, the processor converts them into the corresponding touch instruction.
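The "push along the pointing direction" click gesture could, for example, be detected by projecting the fingertip's recent displacement onto the pointing direction. This is a hedged sketch, not the patent's implementation; the 15 mm threshold and the function name are assumptions:

```python
import numpy as np

def detect_click(tip_positions, direction, push_mm=15.0):
    """Detect the 'push forward along the pointing direction' gesture.

    tip_positions: recent fingertip positions in mm, oldest first.
    direction: unit pointing-direction vector.
    Fires when the tip's displacement, projected onto the pointing
    direction, exceeds push_mm (an illustrative threshold).
    """
    disp = tip_positions[-1] - tip_positions[0]
    return float(np.dot(disp, direction)) >= push_mm

d = np.array([0.0, 0.0, -1.0])   # pointing toward the screen (-z)
track = [np.array([0.0, 0.0, 50.0]),
         np.array([0.0, 0.5, 42.0]),
         np.array([0.2, 0.5, 32.0])]
# displacement along d is 18 mm, above the 15 mm threshold
```

A real recognizer would also debounce (ignore positions until the finger retracts) so one push does not fire repeated clicks.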
The acquired depth image contains the phone's display screen 3 and the finger as well as other irrelevant content. The measurement range of the depth camera can therefore be limited by setting a threshold and discarding depth information beyond it; a depth image acquired in this way contains only the display screen and the finger, reducing the amount of computation needed for recognition. To further increase the running speed and reduce computation, an image segmentation method can be used to obtain the depth image; one such method comprises the following steps:
S11: acquiring a first depth image that contains the display screen 3 but not the control object 14;
S12: acquiring a second depth image containing both the display screen 3 and the control object 14;
S13: obtaining a third depth image, of the control object 14 alone, from the second and first depth images, and obtaining the pointing point 17 from the third depth image.
If the control object 14 is a finger, the depth image of the front part of the finger obtained by the background-segmentation method above reduces the amount of computation during modeling and improves computing speed.
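Steps S11-S13, combined with the range threshold mentioned above, amount to background subtraction on depth images. A minimal sketch under stated assumptions (the thresholds, units and function name are illustrative, not from the patent):

```python
import numpy as np

def segment_object(first, second, max_depth=500.0, diff_thresh=10.0):
    """Isolate the control object's depth image (steps S11-S13).

    first:  depth image (mm) containing the screen but no control object
    second: depth image (mm) containing both screen and control object
    Pixels beyond max_depth are ignored; the object is the set of
    remaining pixels whose depth changed by more than diff_thresh.
    """
    in_range = second <= max_depth
    mask = in_range & (np.abs(second - first) > diff_thresh)
    third = np.where(mask, second, 0.0)   # third depth image: object only
    return third, mask

# A screen 100 mm away; one pixel now shows a finger at 60 mm,
# another shows distant background at 600 mm (out of range)
first = np.full((2, 2), 100.0)
second = np.array([[100.0, 60.0], [100.0, 600.0]])
third, mask = segment_object(first, second)
```

Only the finger pixel survives, so later modeling operates on far fewer pixels, which is the speed-up the preferred scheme claims.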
Embodiment 5:
Considering the randomness of finger movement, pointing precision is unlikely to reach the touch level. For one-handed operation of touch-enabled mobile devices, therefore, touch is still used in the area a single hand can reach, and non-contact control in the area it cannot, giving a better overall experience. This embodiment accordingly proposes a hybrid control method in which the display screen 3 is a touch screen with a touch function. As shown in Fig. 6, the touch screen comprises a touch area I and a pointing area II; the control object 14 performs touch control in the touch area I using the touch function of the touch screen itself, and, on the basis of the fourth embodiment, performs touch operations in the pointing area II through the pointing point 17 according to the pointing-based control method.
Because users differ in hand size, left- or right-handedness and so on, the distinction between the touch area I and the pointing area II can be recognized automatically by the system. Namely: when the finger contacts the touch display screen 3, the processor processes the signal fed back by the touch screen and executes the corresponding touch instruction; when the finger does not contact the touch display screen 3, the depth camera captures the non-contact action, and the processor processes the depth image acquired by the depth camera, recognizes the position pointed at by the finger and the finger's action, and executes the corresponding operation instruction, thereby realizing non-contact control.
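The automatic distinction just described is event-driven rather than a fixed region map: whichever input channel reports decides the handler. A minimal dispatch sketch (the function and callback names are placeholders for the processor's modules, not from the patent):

```python
def dispatch(touch_event, depth_frame, process_touch, process_pointing):
    """Hybrid dispatch between the touch stack and pointing control.

    touch_event: the touch screen's feedback signal, or None when the
    finger is not in contact; depth_frame: the depth camera's frame.
    process_touch / process_pointing stand in for the processor's
    recognition-conversion-execution pipeline.
    """
    if touch_event is not None:          # finger on the touch screen
        return process_touch(touch_event)
    return process_pointing(depth_frame) # non-contact pointing control
```

In this arrangement no per-user calibration of the area boundary is needed: contact events always win, and everything else falls through to the depth pipeline.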
Those skilled in the art will recognize that numerous variations are possible in light of the above description, and thus the examples are intended to describe one or more specific embodiments.
While there has been described and illustrated what are considered to be example embodiments of the present invention, it will be understood by those skilled in the art that various changes and substitutions may be made therein without departing from the spirit of the invention. In addition, many modifications may be made to adapt a particular situation to the teachings of the present invention without departing from the central concept described herein. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed, but that the invention include all embodiments and equivalents falling within its scope.

Claims (8)

1. A hybrid control method, characterized in that the display screen is a touch screen with a touch function, the touch screen comprising a touch area and a pointing area, wherein the control object completes touch in the touch area according to the touch function of the touch screen itself, and touch operations are completed in the pointing area through a pointing point according to a pointing-based control method; the distinction between the touch area and the pointing area is recognized automatically by the system: when the control object is in contact with the touch display screen, the processor processes the signal fed back by the touch display screen and executes the corresponding touch instruction; when the control object is not in contact with the touch display screen, after the depth camera recognizes the non-contact action, the processor processes the depth image acquired by the depth camera, recognizes the position pointed at by the control object, and executes the corresponding operation instruction after recognizing the action of the control object, thereby realizing non-contact control; the pointing-based control method comprises the following steps:
s1: acquiring a depth image of a display screen and a control object;
s2: acquiring the position on the display screen pointed by the control object through the depth image to obtain a pointing point on the display screen;
s3: according to the indication of the pointing point, combining a preset operation and control action, recognizing and generating a touch instruction;
S4: and executing touch operation at the position of the pointing point according to the touch instruction.
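The contact/non-contact dispatch described in claim 1 can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: `handle_event`, `locate`, `recognize`, and `execute` are hypothetical names standing in for the processor's modules, and the touch signal is modeled as a simple dictionary.

```python
def handle_event(touch_signal, depth_image, locate, recognize, execute):
    """Dispatch one input event in the hybrid scheme of claim 1.

    touch_signal: contact event reported by the touch screen itself,
                  or None when the control object is not touching it.
    depth_image:  frame from the depth camera (used only when there is
                  no contact).
    locate/recognize/execute: stand-ins for the processor's modules.
    """
    if touch_signal is not None:
        # Contact: the touch screen's own signal carries the instruction.
        return execute(touch_signal["command"], touch_signal["pos"])
    # No contact: run the pointing pipeline (steps S1-S4).
    pos = locate(depth_image)        # S2: pointing point on the screen
    command = recognize(depth_image) # S3: preset action -> touch instruction
    return execute(command, pos)     # S4: touch operation at the pointing point
```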
2. The hybrid control method according to claim 1, wherein acquiring the pointing point in step S2 comprises: extracting, from the depth image, feature points that characterize the control object, and constructing the pointing straight line of the control object from the feature points to obtain the pointing point on the display screen.
3. The hybrid control method according to claim 2, wherein the pointing point is the intersection point at which the pointing straight line intersects the display screen, obtained by the following steps:
S21: acquiring depth information of the feature points;
S22: converting the depth information of the feature points to obtain spatial position information of the feature points in the coordinate system of the display screen, and obtaining the pointing straight line of the control object from the spatial position information of the feature points;
S23: acquiring the intersection point of the pointing straight line and the display screen to obtain the pointing point on the display screen, and displaying the pointing point on the display screen.
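Steps S21-S23 amount to intersecting a line with the screen plane once both feature points are in the screen's coordinate system. A minimal sketch, assuming the screen is a plane given by a point and a normal (the function name and parameters are illustrative, not from the patent):

```python
import numpy as np

def pointing_point(p_near, p_far, plane_point, plane_normal):
    """Intersect the pointing straight line, defined by two feature points
    of the control object already expressed in the display screen's
    coordinate system, with the screen plane. Returns the intersection
    (the pointing point), or None if the line is parallel to the screen."""
    p_near = np.asarray(p_near, dtype=float)
    p_far = np.asarray(p_far, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    d = p_far - p_near                 # direction of the pointing line
    denom = n.dot(d)
    if abs(denom) < 1e-9:              # line parallel to the screen plane
        return None
    t = n.dot(np.asarray(plane_point, dtype=float) - p_near) / denom
    return p_near + t * d
```

For example, with the screen in the plane z = 0 and a line through (0, 0, 2) and (1, 0, 1), the pointing point lands at (2, 0, 0).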
4. The hybrid control method according to any one of claims 1 to 3, wherein, when the control object is in contact with the display screen, the pointing point is a feature point of the control object.
5. The hybrid control method according to any one of claims 1 to 3, wherein acquiring the depth image in step S1 comprises: S11: acquiring a first depth image that contains the display screen but not the control object; S12: acquiring a second depth image containing the display screen and the control object; S13: obtaining a third depth image of the control object from the second depth image and the first depth image, and obtaining the pointing point from the third depth image.
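Steps S11-S13 describe background subtraction on depth images: the object-free first image is compared against the second to isolate the control object. A minimal sketch of this idea; the threshold `tol` is a hypothetical noise tolerance (in raw depth units), not a value from the patent:

```python
import numpy as np

def extract_object_depth(first, second, tol=10):
    """Obtain a third depth image containing only the control object
    (claim 5, steps S11-S13) by differencing the second depth image
    (screen + object) against the first (screen only). Pixels whose
    depth changed by more than `tol` are kept; background becomes 0."""
    first = np.asarray(first, dtype=np.int32)
    second = np.asarray(second, dtype=np.int32)
    changed = np.abs(second - first) > tol
    return np.where(changed, second, 0)
```

A pixel that moved from a background depth of 100 to 40 (the object entering the scene) survives into the third image; unchanged pixels are zeroed out.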
6. A control system for performing the hybrid control method of any one of claims 1 to 5, comprising an image acquisition unit, a processor, and a display screen;
the image acquisition unit is configured to acquire a depth image of the display screen and the control object, together with depth information of the control object;
the processor comprises a recognition module, a conversion module, and an execution module, wherein the recognition module is configured to acquire, from the depth image, the position on the display screen pointed to by the control object and to recognize a preset control action of the control object; the conversion module is configured to generate the corresponding touch instruction from the predefined control action; and the execution module is configured to execute the touch instruction at the position of the pointing point to complete the touch operation;
the display screen is configured to display the pointing point and information;
the display screen is a touch screen having its own touch function, the touch screen comprising a touch area and a pointing area, wherein in the touch area the control object completes touch operations through the touch function of the touch screen itself, and in the pointing area touch operations are completed by the pointing point; the distinction between the touch area and the pointing area is recognized automatically by the system: when the control object is in contact with the touch display screen, the processor processes the signal fed back by the touch screen and executes the corresponding touch instruction; when the control object is not in contact with the touch display screen, the depth camera recognizes the non-contact action and feeds it back to the processor, and the processor processes the depth image acquired by the depth camera, identifies the position pointed to by the control object and the action of the control object, and executes the corresponding operation instruction, thereby realizing non-contact control.
7. The control system according to claim 6, wherein the image acquisition unit of the control system is a depth camera based on structured-light or time-of-flight (TOF) principles.
8. An electronic device, wherein the control system of any one of claims 6 to 7 is disposed on a bus of the electronic device, and the control system is used to complete touch operations.
CN201610956745.4A 2016-10-25 2016-10-25 hybrid control method, control system and electronic equipment Active CN106598422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610956745.4A CN106598422B (en) 2016-10-25 2016-10-25 hybrid control method, control system and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610956745.4A CN106598422B (en) 2016-10-25 2016-10-25 hybrid control method, control system and electronic equipment

Publications (2)

Publication Number Publication Date
CN106598422A CN106598422A (en) 2017-04-26
CN106598422B true CN106598422B (en) 2019-12-13

Family

ID=58590263

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610956745.4A Active CN106598422B (en) 2016-10-25 2016-10-25 hybrid control method, control system and electronic equipment

Country Status (1)

Country Link
CN (1) CN106598422B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107256100A (en) * 2017-05-23 2017-10-17 珠海市魅族科技有限公司 Method of toch control, device, terminal device and storage medium
CN112363666A (en) * 2020-11-05 2021-02-12 厦门厦华科技有限公司 Window adjusting method and device for electronic whiteboard
CN112987930A (en) * 2021-03-17 2021-06-18 读书郎教育科技有限公司 Method for realizing convenient interaction with large-size electronic product
CN113095243B (en) * 2021-04-16 2022-02-15 推想医疗科技股份有限公司 Mouse control method and device, computer equipment and medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011136783A1 (en) * 2010-04-29 2011-11-03 Hewlett-Packard Development Company L. P. System and method for providing object information
CN103135745B (en) * 2011-11-25 2018-01-02 夏普株式会社 Non-contact control method, information equipment and system based on depth image
CN102855066B (en) * 2012-09-26 2017-05-17 东莞宇龙通信科技有限公司 Terminal and terminal control method
TW201510771A (en) * 2013-09-05 2015-03-16 Utechzone Co Ltd Pointing direction detecting device and its method, program and computer readable-medium
CN104978012B (en) * 2014-04-03 2018-03-16 华为技术有限公司 One kind points to exchange method, apparatus and system
CN105025121A (en) * 2014-04-26 2015-11-04 刘璐 Solution for settling operation of large-screen mobile phone by one hand
CN104991684A (en) * 2015-07-23 2015-10-21 京东方科技集团股份有限公司 Touch control device and working method therefor

Also Published As

Publication number Publication date
CN106598422A (en) 2017-04-26

Similar Documents

Publication Publication Date Title
US9069386B2 (en) Gesture recognition device, method, program, and computer-readable medium upon which program is stored
US9389779B2 (en) Depth-based user interface gesture control
WO2018076523A1 (en) Gesture recognition method and apparatus, and in-vehicle system
US10185433B2 (en) Method and apparatus for touch responding of wearable device as well as wearable device
EP2790089A1 (en) Portable device and method for providing non-contact interface
CN107748641B (en) Numerical value adjustment control method and device, electronic equipment and storage medium
US20150261373A1 (en) Determining User Handedness and Orientation Using a Touchscreen Device
CN106598422B (en) hybrid control method, control system and electronic equipment
US9378427B2 (en) Displaying handwritten strokes on a device according to a determined stroke direction matching the present direction of inclination of the device
CN106569716B (en) Single-hand control method and control system
US9262012B2 (en) Hover angle
US20150169134A1 (en) Methods circuits apparatuses systems and associated computer executable code for providing projection based human machine interfaces
CN104331154A (en) Man-machine interaction method and system for realizing non-contact mouse control
WO2015091638A1 (en) Method for providing user commands to an electronic processor and related processor program and electronic circuit.
CN205050078U (en) A wearable apparatus
Choi et al. Bare-hand-based augmented reality interface on mobile phone
US20140304736A1 (en) Display device and method of controlling the display device
WO2021004413A1 (en) Handheld input device and blanking control method and apparatus for indication icon of handheld input device
CN113515228A (en) Virtual scale display method and related equipment
CN105488832A (en) Optical digital ruler
US9870061B2 (en) Input apparatus, input method and computer-executable program
TWI536794B (en) Cell phone with contact free controllable function
US10620760B2 (en) Touch motion tracking and reporting technique for slow touch movements
JP2022525326A (en) Methods to assist object control using 2D cameras, systems and non-transient computer-readable recording media
JP2018181169A (en) Information processor, and information processor control method, computer program, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant