CN116736937A - Notebook computer and input operation acquisition method - Google Patents


Info

Publication number
CN116736937A
CN116736937A (application number CN202211214135.9A)
Authority
CN
China
Prior art keywords
ToF sensor
magnetic pole
solenoid
cambered surface
notebook computer
Prior art date
Legal status
Pending
Application number
CN202211214135.9A
Other languages
Chinese (zh)
Inventor
辛晶晶
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202211214135.9A
Publication of CN116736937A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G06F1/1613 - Constructional details or arrangements for portable computers
    • G06F1/1615 - Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

A notebook computer and an input operation acquisition method relate to the technical field of terminals. The notebook computer includes a processor and at least one time-of-flight (ToF) sensor module. The ToF sensor module comprises a ToF sensor used for acquiring distance data between the ToF sensor and the hand of a user. The inclination angle of the ToF sensor is adjustable, the inclination angle being the included angle between the ToF sensor and the plane of the notebook computer screen. The processor is used for adjusting the inclination angle of the ToF sensor, identifying the current gesture action of the user according to the distance data, and determining the input operation corresponding to the current gesture action according to a pre-established correspondence between gesture actions and input operations. The scheme solves both the reduced working efficiency of a notebook computer used without a mouse and the poor portability caused by carrying a mouse with the notebook computer. In addition, personalized gestures can be customized, which improves enjoyment and user experience and gives the scheme high practicability.

Description

Notebook computer and input operation acquisition method
Technical Field
The present application relates to the field of computers, and in particular, to a notebook computer and a method for obtaining input operations.
Background
At present, when using a computer, a user generally performs input through a mouse and a keyboard, and using a mouse makes input operations simpler, more convenient and faster.
To avoid reduced working efficiency, the user therefore needs to carry both the notebook computer and a mouse, and must also establish a wired or wireless connection between the mouse and the notebook computer before using the mouse, which reduces the portability of the notebook computer.
Disclosure of Invention
To solve the above problems, the present application provides a notebook computer and an input operation acquisition method, which address both the low working efficiency of a notebook computer used without a mouse and the poor portability caused by carrying a mouse with the notebook computer.
In a first aspect, the present application provides a notebook computer, the notebook computer comprising: a processor and at least one time-of-flight ToF sensor module; the ToF sensor module comprises a ToF sensor, wherein the ToF sensor is used for acquiring distance data between the ToF sensor and the hand of a user; the inclination angle of the ToF sensor is adjustable, and is an included angle between the ToF sensor and a plane where a screen of the notebook computer is located; the processor is used for adjusting the inclination angle of the ToF sensor, identifying the current gesture action of the user according to the distance data, and determining the input operation corresponding to the current gesture action according to the corresponding relation between the pre-established gesture action and the input operation.
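The following Python sketch illustrates, in rough outline, the data flow this aspect describes: distance data is read from the ToF sensor, a gesture action is recognized, and the pre-established correspondence is used to look up the input operation. All names (read_distance_frame, classify_gesture, dispatch_input_operation, GESTURE_TO_INPUT) and the 50 Hz polling rate are illustrative assumptions, not details from the patent.

```python
# Rough sketch of the claimed pipeline: distance data -> gesture action -> input operation.
# The sensor/processor interfaces used here are hypothetical placeholders.
import time

GESTURE_TO_INPUT = {              # pre-established correspondence between gestures and operations
    "single_tap": "mouse_left_click",
    "double_tap": "mouse_double_click",
    "two_finger_tap": "mouse_right_click",
    "vertical_swipe": "mouse_wheel_scroll",
}

def acquisition_loop(tof_sensor, processor):
    """Poll the ToF sensor, recognize the current gesture action, map it to an input operation."""
    processor.adjust_tilt_angle(tof_sensor)           # orient the sensor toward the gesture input area
    while True:
        frame = tof_sensor.read_distance_frame()      # per-pixel distances between sensor and hand
        gesture = processor.classify_gesture(frame)   # e.g. "single_tap", or None if nothing recognized
        if gesture is not None:
            operation = GESTURE_TO_INPUT.get(gesture)
            if operation is not None:
                processor.dispatch_input_operation(operation)
        time.sleep(0.02)                              # ~50 Hz polling; arbitrary example rate
```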
According to this scheme, one or more ToF sensors are arranged on the notebook computer, and the processor recognizes the user's gestures and converts them into the corresponding mouse input operations or other user-defined input operations. This solves both the low working efficiency of a notebook computer used without a mouse and the poor portability caused by carrying a mouse with the notebook computer. In addition, personalized gestures can be customized, which improves enjoyment and user experience and provides more convenience for users with partial hand disabilities. Because gesture input is realized without capturing images of the user or collecting other personal privacy data, there is no risk of privacy disclosure, which makes the scheme highly practical. Moreover, the ToF sensor has low power consumption, which helps extend the battery life of the notebook computer.
In one possible implementation, the processor is specifically configured to adjust the tilt angle of the ToF sensor according to a pre-established correspondence between the ToF sensor and the gesture input area, so that the ToF sensor faces the gesture input area; the gesture input area is an area where a user performs gesture input.
In one possible implementation manner, the processor is specifically configured to determine, according to the distance data, an angle between the hand of the user and the direction in which the ToF sensor is facing, and adjust an inclination angle of the ToF sensor according to the angle, so that the ToF sensor is facing the hand of the user.
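A minimal sketch of one way such an included angle could be estimated, assuming the ToF sensor returns a small two-dimensional grid of distances and has a known field of view; the nearest-pixel heuristic, the 60 degree field of view and the function names are assumptions for illustration, not the patent's method.

```python
def hand_offset_angle(depth_frame, fov_deg=60.0):
    """Estimate the angle (in degrees) between the sensor's facing direction and the user's hand.

    depth_frame: 2-D list of distances in metres; the hand is approximated here by the
    single nearest measurement, a crude stand-in for real hand detection.
    fov_deg: assumed horizontal field of view of the ToF sensor.
    """
    rows, cols = len(depth_frame), len(depth_frame[0])
    _, col = min(((depth_frame[r][c], c) for r in range(rows) for c in range(cols)),
                 key=lambda t: t[0])
    # Map the pixel column to an angular offset from the optical axis (-fov/2 .. +fov/2).
    offset_ratio = (col - (cols - 1) / 2) / ((cols - 1) / 2)
    return offset_ratio * fov_deg / 2

def corrected_tilt(current_tilt_deg, depth_frame):
    """Return a tilt angle that turns the sensor toward the hand."""
    return current_tilt_deg + hand_offset_angle(depth_frame)
```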
In one possible implementation, the processor is further configured to stop adjusting the tilt angle of the ToF sensor when no gesture is recognized within a first preset time, which further reduces power consumption and prolongs the battery life of the device.
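A short sketch of this power-saving behaviour; the 30-second value for the first preset time is an assumed example, not a figure from the patent.

```python
import time

class TiltAdjustmentGate:
    """Stops tilt adjustment after a quiet period with no recognized gesture action."""

    def __init__(self, first_preset_time_s=30.0):
        self.first_preset_time_s = first_preset_time_s
        self.last_gesture_time = time.monotonic()

    def gesture_recognized(self):
        """Call whenever a gesture action is recognized."""
        self.last_gesture_time = time.monotonic()

    def should_adjust(self):
        """True while gestures are recent; False once the first preset time has elapsed."""
        return time.monotonic() - self.last_gesture_time <= self.first_preset_time_s
```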
In one possible implementation, the ToF sensor module further includes: a first support, a second support, a first magnetic pole structure, a second magnetic pole structure, a first spring, a second spring, a first magnetic material and a second magnetic material; the first support and the second support are planar structures; the first magnetic pole structure is fixed at the first end of the first support, the second magnetic pole structure is fixed at the second end of the first support, the first magnetic material is fixed at the first end of the second support, and the second magnetic material is fixed at the second end of the second support; the first spring connects the first end of the first support and the first end of the second support; the second spring connects the second end of the first support and the second end of the second support; the ToF sensor is located between the first end and the second end of the first support, or between the first end and the second end of the second support; the processor is specifically configured to energize the first magnetic pole structure and the second magnetic pole structure to generate electromagnetic fields, so that a first acting force is generated between the first magnetic pole structure and the first magnetic material, a second acting force is generated between the second magnetic pole structure and the second magnetic material, and the directions of the first acting force and the second acting force are opposite.
When the first magnetic pole is an N pole, the second magnetic pole is an S pole; when the first magnetic pole is an S pole, the second magnetic pole is an N pole.
In one possible implementation, the first magnetic pole structure includes a first variable power source and a first solenoid, and the second magnetic pole structure includes a second variable power source and a second solenoid; the first magnetic material and the second magnetic material are provided with a first magnetic pole at one end close to the first support piece and a second magnetic pole at one end far away from the first support piece; the processor is used for controlling the first variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second support, is the second magnetic pole, and the other end of the first solenoid, which is far away from the second support, is the first magnetic pole; and controlling the second variable power supply to output current to the second solenoid so that one end of the second solenoid, which is close to the second support, is the first magnetic pole, and one end of the second solenoid, which is far away from the second support, is the second magnetic pole.
In one possible implementation, the first magnetic pole structure includes a first variable power source and a first solenoid, and the second magnetic pole structure includes a second variable power source and a second solenoid; the first magnetic material is provided with a first magnetic pole at one end close to the first support piece and a second magnetic pole at one end far away from the first support piece; the second magnetic material is provided with a second magnetic pole at one end close to the first support piece, and is provided with a first magnetic pole at one end far away from the first support piece; the processor is used for controlling the first variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second support, is the second magnetic pole, and the other end of the first solenoid, which is far away from the second support, is the first magnetic pole; and controlling the second variable power supply to output current to the second solenoid so that one end of the second solenoid, which is close to the second support, is the second magnetic pole, and one end of the second solenoid, which is far away from the second support, is the first magnetic pole.
In one possible implementation, the ToF sensor module further includes: a variable power supply; the first magnetic pole structure comprises a first solenoid and the second magnetic pole structure comprises a second solenoid; winding directions of windings of the first solenoid and the second solenoid are different; the first magnetic material and the second magnetic material are provided with a first magnetic pole at one end close to the first support piece and a second magnetic pole at one end far away from the first support piece; the processor is used for controlling the variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second support, is the second magnetic pole, and one end of the first solenoid, which is far away from the second support, is the first magnetic pole; and controlling the variable power supply to output current to the second solenoid so that one end of the second solenoid, which is close to the second support, is the first magnetic pole, and one end of the second solenoid, which is far away from the second support, is the second magnetic pole.
In one possible implementation, the ToF sensor module further includes: a variable power supply; the first magnetic pole structure comprises a first solenoid and the second magnetic pole structure comprises a second solenoid; the first magnetic material is provided with a first magnetic pole at one end close to the first support piece and a second magnetic pole at one end far away from the first support piece; the second magnetic material is provided with a second magnetic pole at one end close to the first support piece, and is provided with a first magnetic pole at one end far away from the first support piece; the processor is used for controlling the variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second support, is the second magnetic pole, and one end of the first solenoid, which is far away from the second support, is the first magnetic pole; and controlling the variable power supply to output current to the second solenoid so that one end of the second solenoid, which is close to the second support, is the second magnetic pole, and one end of the second solenoid, which is far away from the second support, is the first magnetic pole.
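The hedged sketch below illustrates the drive principle shared by these solenoid variants: reversing the direction of the current through a coil flips which magnetic pole faces the second support, so one coil can attract its magnetic material while the other repels its magnetic material, and the opposing forces tilt the support carrying the ToF sensor until the springs balance them. The set_coil_current callback stands in for a real current driver (for example an H-bridge) and, like the rest of the code, is an assumption rather than the patent's implementation.

```python
def tilt_sensor(direction, magnitude, set_coil_current):
    """Tilt the ToF sensor by driving the two solenoids with opposite current signs.

    direction: +1 or -1, which way to tilt; magnitude: 0.0..1.0 fraction of full current.
    set_coil_current(coil, current): hypothetical driver that sets a signed coil current.
    """
    # Opposite signs give the opposite forces described above: e.g. the first solenoid
    # attracts the first magnetic material while the second solenoid repels the second
    # magnetic material (or vice versa when the direction is reversed).
    set_coil_current(coil=1, current=+direction * magnitude)
    set_coil_current(coil=2, current=-direction * magnitude)

def release_sensor(set_coil_current):
    """Cut both coil currents; the first and second springs return the support to rest."""
    set_coil_current(coil=1, current=0.0)
    set_coil_current(coil=2, current=0.0)
```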
In one possible implementation, the ToF sensor module further includes: a first cambered surface support piece, a second cambered surface support piece, a first magnetic pole structure, a second magnetic pole structure, a first spring, a second spring, a first magnetic material and a second magnetic material; the area of the first cambered surface support piece is larger than that of the second cambered surface support piece, the radian of the first cambered surface support piece is smaller than that of the second cambered surface support piece, and the second cambered surface support piece is tangent to the first cambered surface support piece; the first magnetic pole structure is fixed at the first end of the first cambered surface support piece, the second magnetic pole structure is fixed at the second end of the first cambered surface support piece, the first magnetic material is fixed at the first end of the second cambered surface support piece, and the second magnetic material is fixed at the second end of the second cambered surface support piece; the first spring connects the first end of the first cambered surface support piece and the first end of the ToF sensor; the second spring connects the second end of the first cambered surface support piece and the second end of the ToF sensor; the second cambered surface support piece is located between the ToF sensor and the first cambered surface support piece, and between the first spring and the second spring; the processor is specifically configured to energize the first magnetic pole structure and the second magnetic pole structure to generate electromagnetic fields, so that a first acting force is generated between the first magnetic pole structure and the first magnetic material, a second acting force is generated between the second magnetic pole structure and the second magnetic material, and the directions of the first acting force and the second acting force are opposite.
In one possible implementation, the first magnetic pole structure includes a first variable power source and a first solenoid, and the second magnetic pole structure includes a second variable power source and a second solenoid; the first magnetic material and the second magnetic material are provided with a first magnetic pole at one end close to the first cambered surface supporting piece and a second magnetic pole at one end far away from the first cambered surface supporting piece; the processor is used for controlling the first variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second cambered surface support piece, is the second magnetic pole, and one end of the first solenoid, which is far away from the second cambered surface support piece, is the first magnetic pole; and controlling the second variable power supply to output current to the second solenoid, so that one end of the second solenoid, which is close to the second cambered surface support piece, is the first magnetic pole, and one end of the second solenoid, which is far away from the second cambered surface support piece, is the second magnetic pole.
In one possible implementation, the first magnetic pole structure includes a first variable power source and a first solenoid, and the second magnetic pole structure includes a second variable power source and a second solenoid; the first magnetic material is provided with a first magnetic pole at one end close to the first cambered surface support piece and a second magnetic pole at one end far away from the first cambered surface support piece; the second magnetic material is provided with a second magnetic pole at one end close to the first cambered surface support piece and a first magnetic pole at one end far away from the first cambered surface support piece; the processor is used for controlling the first variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second cambered surface support piece, is the second magnetic pole, and one end of the first solenoid, which is far away from the second cambered surface support piece, is the first magnetic pole; and controlling the second variable power supply to output current to the second solenoid, so that one end of the second solenoid, which is close to the second cambered surface support piece, is the second magnetic pole, and one end of the second solenoid, which is far away from the second cambered surface support piece, is the first magnetic pole.
In one possible implementation, the ToF sensor module further includes: a variable power supply; the first magnetic pole structure comprises a first solenoid and the second magnetic pole structure comprises a second solenoid; winding directions of windings of the first solenoid and the second solenoid are different; the first magnetic material and the second magnetic material are provided with a first magnetic pole at one end close to the first cambered surface supporting piece and a second magnetic pole at one end far away from the first cambered surface supporting piece; the processor is used for controlling the variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second cambered surface supporting piece, is the second magnetic pole, and one end of the first solenoid, which is far away from the second cambered surface supporting piece, is the first magnetic pole; and controlling the variable power supply to output current to the second solenoid, so that one end of the second solenoid, which is close to the second cambered surface support piece, is the first magnetic pole, and one end of the second solenoid, which is far away from the second cambered surface support piece, is the second magnetic pole.
In one possible implementation, the ToF sensor module further includes: a variable power supply; the first magnetic pole structure comprises a first solenoid and the second magnetic pole structure comprises a second solenoid; the first magnetic material is provided with a first magnetic pole at one end close to the first cambered surface support piece and a second magnetic pole at one end far away from the first cambered surface support piece; the second magnetic material is provided with a second magnetic pole at one end close to the first cambered surface support piece and a first magnetic pole at one end far away from the first cambered surface support piece; the processor is used for controlling the variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second cambered surface support piece, is the second magnetic pole, and one end of the first solenoid, which is far away from the second cambered surface support piece, is the first magnetic pole; and controlling the variable power supply to output current to the second solenoid, so that one end of the second solenoid, which is close to the second cambered surface support piece, is the second magnetic pole, and one end of the second solenoid, which is far away from the second cambered surface support piece, is the first magnetic pole.
In one possible implementation manner, the notebook computer includes a first ToF sensor, where the first ToF sensor establishes a correspondence with a first gesture input area in advance; the processor is specifically configured to adjust an inclination angle of the first ToF sensor, so that the first ToF sensor faces the first gesture input area; and identifying the gesture action of the user in the first gesture input area by utilizing the distance data of the first gesture input area in the distance data acquired by the first ToF sensor.
In one possible implementation, the notebook computer includes the following two ToF sensors: a first ToF sensor and a second ToF sensor; the first ToF sensor pre-establishes a corresponding relation with the first gesture input area, and the second ToF sensor pre-establishes a corresponding relation with the second gesture input area; the processor is specifically configured to adjust an inclination angle of the first ToF sensor, so that the first ToF sensor faces the first gesture input area, and identify a gesture of the user in the first gesture input area by using distance data of the first gesture input area in the distance data acquired by the first ToF sensor; and adjusting the inclination angle of the second ToF sensor so that the second ToF sensor is opposite to the second gesture input area, and identifying gesture actions of the user in the second gesture input area by using the distance data of the second gesture input area in the distance data acquired by the second ToF sensor.
In one possible implementation, the notebook computer includes the following two ToF sensors: a first ToF sensor and a second ToF sensor; the first ToF sensor and the second ToF sensor each establish in advance a correspondence relationship with a first gesture input region. The processor is specifically configured to adjust an inclination angle of the first ToF sensor and the second ToF sensor, so that the first ToF sensor and the second ToF sensor are opposite to the first gesture input area; and identifying a gesture of the user in the first gesture input area by using the distance data of the first gesture input area in the distance data acquired by the first ToF sensor and the distance data of the first gesture input area in the distance data acquired by the second ToF sensor.
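As a simple illustration of how distance data from two ToF sensors covering the same gesture input area might be combined, the sketch below merges two frames that are assumed to be already registered to a common grid; the per-cell handling of missing returns and the averaging rule are illustrative choices, not requirements of the patent.

```python
def fuse_distance_frames(frame_a, frame_b):
    """Merge two aligned distance frames of the first gesture input area.

    Each frame is a 2-D list of distances in metres, with None where a sensor got no
    return; a cell occluded from one sensor can then still be filled by the other.
    """
    fused = []
    for row_a, row_b in zip(frame_a, frame_b):
        fused_row = []
        for a, b in zip(row_a, row_b):
            if a is None:
                fused_row.append(b)
            elif b is None:
                fused_row.append(a)
            else:
                fused_row.append((a + b) / 2.0)   # simple average where both sensors see the cell
        fused.append(fused_row)
    return fused
```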
In one possible implementation, the input operation specifically includes at least one of the following: a mouse operation or a shortcut operation; the mouse operation comprises one or more of a mouse click operation, a mouse double click operation, a mouse right click operation and a mouse wheel rolling operation; the shortcut operation includes: a rewind operation of video/audio, a pause operation of video/audio, and a fast forward operation of video/audio.
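Purely to make the taxonomy above concrete, the following sketch enumerates these input operations as Python constants; the identifiers are assumptions used for illustration.

```python
from enum import Enum

class MouseOperation(Enum):
    LEFT_CLICK = "mouse_left_click"
    DOUBLE_CLICK = "mouse_double_click"
    RIGHT_CLICK = "mouse_right_click"
    WHEEL_SCROLL = "mouse_wheel_scroll"

class ShortcutOperation(Enum):
    MEDIA_REWIND = "media_rewind"              # rewind of video/audio
    MEDIA_PAUSE = "media_pause"                # pause of video/audio
    MEDIA_FAST_FORWARD = "media_fast_forward"  # fast forward of video/audio
```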
In a second aspect, the present application further provides a method for acquiring an input operation, which is applied to a notebook computer, where the notebook computer includes at least one time-of-flight ToF sensor module, where the ToF sensor module includes a ToF sensor, and the method includes:
adjusting the inclination angle of the ToF sensor, wherein the inclination angle is an included angle between the ToF sensor and a plane where a screen of the notebook computer is positioned;
and identifying the current gesture action of the user according to the distance data between the ToF sensor and the hand of the user, and determining the input operation corresponding to the current gesture action according to the pre-established corresponding relation between the gesture action and the input operation.
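To make the identification step concrete, here is a deliberately small, hypothetical classifier that detects a single "tap" gesture from a sequence of per-frame minimum hand distances; the 3 cm threshold, the 200 ms duration and the tap definition itself are illustrative assumptions rather than the patent's algorithm.

```python
def classify_tap(min_distances, tap_threshold_m=0.03, frame_period_s=0.02):
    """Detect a 'tap' from per-frame minimum hand distances (metres) reported by the ToF sensor.

    A tap is taken here to be a dip below tap_threshold_m lasting no longer than ~200 ms.
    """
    below = [d < tap_threshold_m for d in min_distances]
    longest = run = 0
    for flag in below:                 # length of the longest run of below-threshold frames
        run = run + 1 if flag else 0
        longest = max(longest, run)
    return "single_tap" if 0 < longest * frame_period_s <= 0.2 else None
```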
According to the scheme provided by the application, one or more ToF sensors are arranged in the notebook computer, the user's gesture is identified using the distance data acquired by the ToF sensors, and the gesture is converted into the corresponding mouse input operation or other user-defined input operation. This solves both the low working efficiency of a notebook computer used without a mouse and the poor portability caused by carrying a mouse with the notebook computer. By adjusting the inclination angle of the ToF sensor so that it faces the user's hand or the gesture input area, the accuracy of the acquired distance data is improved, the distance data of the object can be acquired more comprehensively, and the accuracy of gesture recognition is further improved.
In addition, personalized gestures can be customized, which improves enjoyment and user experience and provides more convenience for users with partial hand disabilities. Because gesture input is realized without capturing images of the user or collecting other personal privacy data, there is no risk of privacy disclosure, which makes the scheme highly practical. Moreover, the ToF sensor has low power consumption, which helps extend the battery life of the notebook computer.
In one possible implementation manner, the adjusting the inclination angle of the ToF sensor specifically includes:
according to a pre-established corresponding relation between the ToF sensor and the gesture input area, adjusting the inclination angle of the ToF sensor so that the ToF sensor is opposite to the gesture input area; the gesture input area is an area where a user performs gesture input.
In one possible implementation manner, the adjusting the inclination angle of the ToF sensor specifically includes:
and determining, according to the distance data, an included angle between the hand of the user and the direction the ToF sensor is facing, and adjusting the inclination angle of the ToF sensor according to the included angle so that the ToF sensor faces the hand of the user.
In one possible implementation, the method further includes:
and stopping adjusting the inclination angle of the ToF sensor when no gesture action is recognized within the first preset time.
In one possible implementation, the notebook computer includes a first ToF sensor; before the adjusting the tilt angle of the ToF sensor, the method further includes:
pre-establishing a corresponding relation between the first ToF sensor and a first gesture input area;
the adjusting the inclination angle of the ToF sensor specifically includes:
adjusting the inclination angle of the first ToF sensor so that the first ToF sensor faces the first gesture input area;
the step of identifying the current gesture of the user according to the distance data between the ToF sensor and the hand of the user, specifically includes:
and identifying the gesture action of the user in the first gesture input area by utilizing the distance data of the first gesture input area in the distance data acquired by the first ToF sensor.
In one possible implementation, the notebook computer includes the following two ToF sensors: a first ToF sensor and a second ToF sensor; before the adjusting the tilt angle of the ToF sensor, the method further includes:
pre-establishing a corresponding relation between the first ToF sensor and a first gesture input area, and pre-establishing a corresponding relation between the second ToF sensor and a second gesture input area;
the adjusting the inclination angle of the ToF sensor specifically includes:
adjusting the tilt angle of the first ToF sensor so that the first ToF sensor is facing the first gesture input area, and adjusting the tilt angle of the second ToF sensor so that the second ToF sensor is facing the second gesture input area;
the step of identifying the current gesture of the user according to the distance data between the ToF sensor and the hand of the user, specifically includes:
identifying gesture actions of the user in the first gesture input area by utilizing the distance data of the first gesture input area in the distance data acquired by the first ToF sensor;
and identifying the gesture action of the user in the second gesture input area by using the distance data of the second gesture input area in the distance data acquired by the second ToF sensor.
In one possible implementation, the notebook computer includes the following two ToF sensors: a first ToF sensor and a second ToF sensor; before the adjusting the tilt angle of the ToF sensor, the method further includes:
pre-establishing a correspondence between the first ToF sensor and a first gesture input area, and pre-establishing a correspondence between the second ToF sensor and the first gesture input area;
the adjusting the inclination angle of the ToF sensor specifically includes:
adjusting the tilt angle of the first ToF sensor so that the first ToF sensor is facing the first gesture input area, and adjusting the tilt angle of the second ToF sensor so that the second ToF sensor is facing the first gesture input area;
the step of identifying the current gesture of the user according to the distance data between the ToF sensor and the hand of the user, specifically includes:
and identifying gesture actions of the user in the first gesture input area by using the distance data of the first gesture input area in the distance data acquired by the first ToF sensor and the distance data of the first gesture input area in the distance data acquired by the second ToF sensor.
In one possible implementation, before adjusting the tilt angle of the ToF sensor, the method further includes: pre-establishing a corresponding relation between gesture actions and input operations.
In one possible implementation, the input operation specifically includes at least one of the following: a mouse operation or a shortcut operation; the mouse operation comprises one or more of a mouse click operation, a mouse double click operation, a mouse right click operation and a mouse wheel rolling operation; the shortcut operation includes: a rewind operation of video/audio, a pause operation of video/audio, and a fast forward operation of video/audio.
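One conceivable way to pre-establish and persist a user-defined correspondence between a gesture action and an input operation is sketched below; the JSON file, its name and the function signature are assumptions for illustration only.

```python
import json

def register_custom_gesture(name, template_frames, input_operation, path="gesture_map.json"):
    """Record a user-defined gesture template together with its input operation.

    template_frames: distance frames captured while the user performs the gesture
    (assumed to be plain nested lists so they can be serialized as JSON).
    """
    try:
        with open(path, "r", encoding="utf-8") as f:
            mapping = json.load(f)
    except FileNotFoundError:
        mapping = {}
    mapping[name] = {"template": template_frames, "operation": input_operation}
    with open(path, "w", encoding="utf-8") as f:
        json.dump(mapping, f)
```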
Drawings
Fig. 1 is a schematic view of an application scenario of a notebook computer;
fig. 2 is a schematic structural diagram of a notebook computer according to the present application;
fig. 3 is a schematic diagram of a software system of a notebook computer according to an embodiment of the present application;
fig. 4 is a schematic diagram of an application scenario provided in an embodiment of the present application;
fig. 5 is a schematic diagram of another application scenario provided in an embodiment of the present application;
fig. 6 is a schematic diagram of still another application scenario provided in an embodiment of the present application;
fig. 7 is a schematic diagram of another application scenario provided in an embodiment of the present application;
fig. 8A is a schematic diagram of a ToF sensor module according to an embodiment of the present application;
fig. 8B is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 8C is a schematic diagram illustrating deflection of the ToF sensor corresponding to fig. 8B according to an embodiment of the present application;
fig. 9A is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 9B is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 9C is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 10 is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 11A is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 11B is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 11C is a schematic diagram illustrating the deflection of the ToF sensor corresponding to fig. 11B according to an embodiment of the present application;
fig. 12A is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 12B is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 12C is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 13 is a schematic diagram of another ToF sensor module according to an embodiment of the present application;
fig. 14 is a flowchart of a method for acquiring an input operation according to an embodiment of the present application;
fig. 15A is a flowchart of a method for inputting gesture actions and performing monitoring settings according to an embodiment of the present application;
fig. 15B is a flowchart of another method for inputting gesture actions and performing monitoring settings according to an embodiment of the present application;
fig. 16 is a schematic diagram of a notebook computer according to an embodiment of the present application.
Detailed Description
To help those skilled in the art understand the scheme of the present application more clearly, the application scenario of the technical scheme is first described below.
The technical scheme provided by the application can be applied to a notebook computer and is explained below with reference to specific implementations.
Referring to fig. 1, the diagram is a schematic view of an application scenario of a notebook computer.
When using the notebook computer 100, the user generally uses it together with the mouse 200; otherwise some input operations cannot be performed. However, this requires the user to carry both the notebook computer and the mouse, and to establish a wired or wireless connection between the mouse and the notebook computer before using the mouse, which reduces the portability of the notebook computer.
In order to solve the above technical problems, the present application provides a notebook computer and an input operation acquisition method: a ToF sensor is arranged on the notebook computer, distance data of the user's hand is acquired by the ToF sensor, and the user's gesture action is determined according to the acquired distance data. Because a correspondence exists between gesture actions and specific input operations, the notebook computer can determine the currently corresponding input operation from the gesture action. The input operation can be a click, a double click, a right click, wheel scrolling and the like, that is, the mouse functions are realized. With this scheme, the working efficiency of the notebook computer when no mouse is connected is improved, and since the mouse no longer needs to be carried and connected, the portability of the notebook computer is also improved.
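On the software side, a recognized gesture still has to be injected into the operating system as an actual input event. The patent does not prescribe any particular mechanism; one possible way on a desktop OS, assuming the third-party pynput package is available, is sketched below.

```python
# Hypothetical dispatch of recognized gestures as mouse events, assuming pynput is installed.
from pynput.mouse import Button, Controller

mouse = Controller()

def dispatch(operation):
    """Map an abstract input operation name to a synthetic mouse event."""
    if operation == "mouse_left_click":
        mouse.click(Button.left, 1)
    elif operation == "mouse_double_click":
        mouse.click(Button.left, 2)
    elif operation == "mouse_right_click":
        mouse.click(Button.right, 1)
    elif operation == "mouse_wheel_scroll":
        mouse.scroll(0, 2)            # scroll up two steps; negative values scroll down
```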
The hardware structure of the notebook computer provided by the application is first described below.
Referring to fig. 2, the structure of a notebook computer according to the present application is shown.
The notebook computer 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna group 1, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and the like.
Wherein the sensor module 180 may include at least one ToF sensor 181. And may further include one or more of a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the notebook computer 100. In other embodiments of the application, notebook computer 100 may include more or less components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor through an I2C interface, so that the processor 110 and the touch sensor communicate through the I2C bus interface to implement the touch function of the notebook computer 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing function of notebook computer 100. The processor 110 and the display screen 194 communicate through a DSI interface to implement the display function of the notebook computer 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the notebook computer 100, or to transfer data between the notebook computer 100 and peripheral devices. It can also be used to connect a headset and play audio through the headset. The interface may also be used to connect other devices, such as AR devices.
It should be understood that the connection between the modules according to the embodiment of the present application is only schematically illustrated, and is not limited to the structure of the notebook computer 100. In other embodiments of the present application, the notebook computer 100 may also employ different interfacing manners, or a combination of interfacing manners, as in the above embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the notebook computer 100. The charging management module 140 can also supply power to the notebook computer through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the notebook computer 100 can be realized by the antenna group 1, the wireless communication module 160, and the like.
The antenna group 1 is used for transmitting and receiving electromagnetic wave signals. Each antenna in the notebook computer 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. The antenna may be used in combination with a tuning switch.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wi-Fi network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. applied to the notebook computer 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna group 1, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate the signal, amplify the signal, and convert the signal into electromagnetic waves to radiate the electromagnetic waves through the antenna group 1.
The notebook computer 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the notebook computer 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The notebook computer 100 can realize a photographing function through an image signal processor (Image Signal Processor, ISP), a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, notebook computer 100 may include 1 or N cameras 193, N being a positive integer greater than 1. The application does not limit the specific number of cameras.
The digital signal processor is used for processing digital signals, and can process other digital signals in addition to digital image signals. For example, when the notebook computer 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and the like.
Video codecs are used to compress or decompress digital video. The notebook computer 100 may support one or more video codecs, so that the notebook computer 100 can play or record video in a variety of encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of the notebook computer 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capabilities of the notebook computer 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (e.g., audio data) created during use of the notebook computer 100, etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications and data processing of the notebook computer 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The notebook computer 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals. The notebook computer 100 can play music through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals. The microphone 170C is used to convert sound signals into electrical signals. In other embodiments, the notebook computer 100 may be provided with two microphones 170C, which, in addition to collecting sound signals, can implement a noise reduction function. In other embodiments, the notebook computer 100 may be further provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The functions of the respective sensors that can be provided in the notebook computer 100 are described below.
A time-of-flight (ToF) sensor uses a tiny emitter to emit infrared or laser light; the emitted light is reflected by an object and returns to the ToF sensor. The ToF sensor can measure the distance between the object and the sensor from the time difference between the emission of the light and its return to the sensor after reflection by the object.
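Expressed as a formula (the standard ToF relation, stated here as general background rather than as wording from the patent), with c the speed of light and Δt the measured round-trip time:

```latex
d = \frac{c \cdot \Delta t}{2}
```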
The ToF sensor 181 may be a dToF sensor or an iToF sensor, which is not particularly limited in the embodiments of the present application.
Among them, the dToF sensor, whose full name is direct time-of-flight sensor, directly measures the time of flight. The dToF principle is to emit light pulses toward the object to be measured and to calculate the distance between the object and the sensor directly from the time interval between the emitted light pulses and the reflected light pulses.
The iToF sensor, in full an indirect time-of-flight sensor, measures the time of flight indirectly. The principle of the iToF sensor is that the emitted light is modulated into a periodic signal with a certain frequency, the phase difference between the emitted signal and the signal reflected back to the receiving end by the object to be measured is measured, and the time of flight is calculated from this phase difference. That is, the time of flight of the light is measured indirectly via the phase shift rather than directly.
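For ease of understanding only, the following sketch illustrates how distance follows from the two measurement principles described above; the function names and example values are illustrative assumptions and do not limit the embodiment.

```python
# Illustrative sketch (not part of the embodiment): distance from dToF / iToF measurements.
import math

C = 299_792_458.0  # speed of light, m/s

def dtof_distance(round_trip_time_s: float) -> float:
    """dToF: light travels to the object and back, so distance = c * t / 2."""
    return C * round_trip_time_s / 2.0

def itof_distance(phase_shift_rad: float, modulation_freq_hz: float) -> float:
    """iToF: time of flight is inferred from the phase shift of the modulated signal,
    t = phase / (2 * pi * f), hence distance = c * phase / (4 * pi * f)."""
    return C * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)

print(dtof_distance(4e-9))               # ~0.60 m for a 4 ns round trip
print(itof_distance(math.pi / 2, 20e6))  # ~1.87 m for a pi/2 shift at 20 MHz modulation
```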
The pressure sensor is used for sensing a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor may be provided on the display screen 194. There are many kinds of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor, the capacitance between the electrodes changes, and the notebook computer 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the notebook computer 100 detects the intensity of the touch operation through the pressure sensor. The notebook computer 100 may also calculate the touch position based on the detection signal of the pressure sensor. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions.
The fingerprint sensor is used for collecting fingerprints. The notebook computer 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor is used for detecting temperature. In some embodiments, the notebook computer 100 executes a temperature processing strategy using the temperature detected by the temperature sensor. For example, when the temperature reported by the temperature sensor exceeds a threshold, the notebook computer 100 reduces the performance of a processor located near the temperature sensor, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the notebook computer 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the notebook computer 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor is also known as a "touch device". The touch sensor may be disposed on the display screen 194, and the touch sensor and the display screen 194 form a touch screen, also called a "touch-controlled screen". The touch sensor is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor may be disposed on the surface of the notebook computer 100 at a position different from that of the display screen 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The notebook computer 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the notebook computer 100.
The motor 191 may generate a vibration cue. For example, touch operations acting on different applications (e.g., games, audio playback, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The software system of the notebook computer provided by the application is described below.
Referring to fig. 3, a schematic diagram of a software system of a notebook computer according to an embodiment of the application is shown.
The software system of the notebook computer 100 may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes a layered architecture as an example, and illustrates a software structure of the notebook computer 100.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the software is divided, from top to bottom, into an application layer, an application framework layer, a system library, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include applications such as calendar, WLAN, bluetooth, and music. To realize linkage with the ToF sensor, the solution of the embodiment of the present application further includes a ToF application component with a corresponding user experience (UX) design, so that the user can use the ToF application component on the notebook computer for functions such as switching the feature on and off, usage guidance, and gesture entry.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a view system, a resource manager, a notification manager, an algorithm framework, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, image, audio, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar and can be used to convey notification-type messages, which disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system status bar, such as notifications of applications running in the background, or present notifications on the screen in the form of a dialog window. For example, text information is presented in the status bar, a prompt tone is emitted, and an indicator light blinks.
The algorithm framework includes recognition and decision algorithms: the user's intention is identified based on the data captured by the ToF sensor, the user's gesture is then determined, and the gesture is converted into a mouse-like operation that is input to the system to complete the mouse operation.
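As a non-limiting illustration of how such a recognition-and-decision flow might be organized, the following sketch is provided; the class name, method names, and callback signatures are assumptions and do not represent the actual framework interface.

```python
# Hypothetical sketch of the recognize-and-decide flow (all names are assumptions).
from typing import Callable, Dict, List, Optional

class GestureAlgorithmFramework:
    def __init__(self,
                 recognizer: Callable[[List[float]], Optional[str]],
                 gesture_to_operation: Dict[str, str],
                 inject_operation: Callable[[str], None]) -> None:
        self.recognizer = recognizer                       # distance samples -> gesture name
        self.gesture_to_operation = gesture_to_operation   # pre-entered correspondence table
        self.inject_operation = inject_operation           # hands the mouse-like operation to the system

    def on_distance_data(self, samples: List[float]) -> None:
        gesture = self.recognizer(samples)                 # identify the user's intention
        if gesture is None:
            return
        operation = self.gesture_to_operation.get(gesture)
        if operation is not None:
            self.inject_operation(operation)               # e.g. "left_click", "scroll_up"
```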
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver. The technical solution provided by the embodiment of the present application needs the ToF sensor to acquire data, so the sensor driver also comprises a ToF sensor driver.
The following describes aspects of the application in connection with specific implementations.
It should be understood by those skilled in the art that when the notebook computer is in the closed state, its top surface is referred to as the "A-side" of the notebook computer and its bottom surface is referred to as the "D-side"; when the notebook computer is in the open state, the surface where the screen is located is called the "B-side" and the surface where the keyboard is located is called the "C-side". These terms are not described in detail again below.
The words "first," "second," and the like in the following description are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first", "a second", etc. may explicitly or implicitly include one or more such feature. In the description of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the present application, unless explicitly specified and limited otherwise, the term "connected" is to be construed broadly, and for example, "connected" may be either fixedly connected, detachably connected, or integrally formed; can be directly connected or indirectly connected through an intermediate medium.
Referring to fig. 4, the diagram is a schematic diagram of an application scenario provided in an embodiment of the present application.
Fig. 4 first illustrates a notebook computer including a ToF sensor 181.
The ToF sensor 181 is disposed at the top end position of the B-surface of the notebook computer, and may be specifically located on the left side or the right side of the camera 193 of the notebook computer 100, which is not particularly limited in the embodiment of the present application. Fig. 4 illustrates an example in which the ToF sensor 181 is located on the right side of the camera 193.
The ToF sensor 181 is configured to emit infrared light or laser light, wherein the generated light is reflected from the object and returned to the ToF sensor. The ToF sensor 181 can measure a distance between the object and the ToF sensor 181 based on a time difference between emission of light and return of the light after reflection by the object.
When the user opens the notebook computer 100, the user can control the hand to perform a motion. The ToF sensor 181 measures the distance between the user's hand and the ToF sensor 181 in real time, and distance data is obtained. The ToF sensor 181 transmits the distance data to the processor, so that the processor determines a gesture input by the user according to the distance data, and determines an input operation corresponding to the gesture.
In one possible implementation, after the function of the ToF sensor 181 is turned on, the ToF sensor 181 may acquire distance data within a detection range toward the user side, and determine whether there is gesture input at this time according to the distance data. The size of the detection range is related to the device parameters of the ToF sensor 181. For example, the ToF sensor 181 may acquire distance data between 0 ° and 180 ° of the screen facing the user side, i.e. the ToF sensor 181 has a data distance acquisition range of 180 °; for another example, the ToF sensor 181 may acquire distance data between 45 ° and 135 ° of the screen facing the user side, i.e. the ToF sensor 181 has a distance data acquisition range of 90 °. The above data collection range is only illustrative, and does not limit the technical scheme of the application.
In another possible implementation, the actual scenario is taken into account: when inputting instructions with gestures, the user generally follows the same habit as when using a mouse. That is, a user who is used to controlling the mouse with the left hand generally inputs gesture instructions with the left hand, and a user who is used to controlling the mouse with the right hand generally inputs gesture instructions with the right hand. Therefore, in the solution of the present application, the ToF sensor 181 need not monitor the data acquired over the full angular range in real time, but can instead monitor, in a targeted manner, the area on the left side of the keyboard and/or the area on the right side of the keyboard as gesture input areas, which will be described below.
With continued reference to fig. 4, the first gesture input region 301 is illustrated as being located on the right side of the notebook computer 100.
When the notebook computer 100 is placed on the desktop, the user may set the right side of the notebook computer 100 as the first gesture input area 301 through the ToF application component on the notebook computer 100. At this time, the ToF sensor 181 determines whether there is currently a gesture input mainly based on the acquired data in the first gesture input area 301, and may ignore the acquired data in other areas.
The technical effect of this is to prevent the ToF sensor 181 from misjudging the movement of other interfering objects in the surrounding environment as gesture input. That is, the setting of the first gesture input area 301 caters to the user's input habit, which improves the user experience, reduces interference from the environment, and improves accuracy.
Referring to fig. 5, the diagram is a schematic diagram of another application scenario provided in an embodiment of the present application.
Fig. 5 illustrates an implementation in which the second gesture input area 302 is located on the left side of the notebook computer 100. At this time, the ToF sensor 181 determines whether there is currently a gesture input mainly based on the acquired data in the second gesture input area 302, and may ignore the acquired data in other areas.
In one possible implementation, the user may set the gesture input area currently started through the ToF application component on the notebook computer 100, which is the first gesture input area 301 in fig. 4 or the second gesture input area 302 in fig. 5, so as to adapt to the input habit of the user.
Referring to fig. 6, a schematic diagram of still another application scenario provided in an embodiment of the present application is shown.
In yet another possible implementation, the user may set, through the ToF application component on the notebook computer 100, the first gesture input area 301 and the second gesture input area 302 to be activated simultaneously. Further, it may be configured that, when it is determined that the first gesture input area 301 and the second gesture input area 302 have gesture inputs at the same time, the input operation corresponding to the gesture input in one of the gesture input areas is performed preferentially, that is, a priority order of the plurality of gesture input areas is set.
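A minimal sketch of such a priority resolution, assuming illustrative area names and an assumed priority list, is as follows.

```python
# Illustrative: choose which gesture to act on when both enabled areas report one.
from typing import Dict, Optional, Tuple

PRIORITY = ["first_gesture_input_area", "second_gesture_input_area"]  # assumed user setting

def select_gesture(gestures_by_area: Dict[str, Optional[str]]) -> Tuple[Optional[str], Optional[str]]:
    """Return (area, gesture) for the highest-priority area that currently has a gesture."""
    for area in PRIORITY:
        gesture = gestures_by_area.get(area)
        if gesture is not None:
            return area, gesture
    return None, None

# Both areas see a gesture at the same time; the first area's input is performed preferentially.
print(select_gesture({"first_gesture_input_area": "tap", "second_gesture_input_area": "swipe"}))
```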
The following describes an implementation of a notebook computer with multiple ToF sensors. The following description will take a notebook computer with two ToF sensors as an example.
Referring to fig. 7, the diagram is a schematic diagram of another application scenario provided in an embodiment of the present application.
At this time, the notebook computer 100 includes a ToF sensor 181 and a ToF sensor 182. In practice, the ToF sensor 181 and the ToF sensor 182 are typically of the same model and specification.
The ToF sensor 181 and the ToF sensor 182 are symmetrically distributed on both sides of the camera 193.
When the notebook computer 100 is provided with two ToF sensors, the gesture input area may still be configured as shown in fig. 4, with a single gesture input area on the right side of the notebook computer 100; or as shown in fig. 5, with a single gesture input area on the left side of the notebook computer; or as shown in fig. 7, with two gesture input areas located on the left and right sides of the notebook computer 100, respectively. The present application is not particularly limited in this respect.
In one possible implementation, the two ToF sensors may be configured through the ToF application component on the notebook computer 100 to monitor the currently enabled gesture input area at the same time. With this redundancy design, two streams of distance data are obtained and the gesture input is determined from both streams, which improves the accuracy of determining the gesture input.
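One possible, simplified way to use the two redundant streams is to accept a sample only when the two sensors agree within a tolerance, as sketched below; the tolerance value and function name are assumptions.

```python
# Illustrative redundancy check over the two ToF sensors' distance streams.
from typing import Optional

def fuse_redundant_samples(d1_mm: float, d2_mm: float,
                           tolerance_mm: float = 15.0) -> Optional[float]:
    """Return a fused distance when the two sensors agree, otherwise None (treat as unreliable)."""
    if abs(d1_mm - d2_mm) <= tolerance_mm:
        return (d1_mm + d2_mm) / 2.0   # average the agreeing readings
    return None                        # disagreement: discard to avoid misjudging a gesture

print(fuse_redundant_samples(182.0, 190.0))  # 186.0
print(fuse_redundant_samples(182.0, 260.0))  # None
```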
In addition, in one possible implementation, the ToF sensor 181 may also be configured by a ToF application component on the notebook computer 100 to determine whether a gesture input is currently present according to the collected data in the first gesture input area 301, and may ignore the collected data in other areas, and the ToF sensor 182 may be configured to determine whether a gesture input is currently present according to the collected data in the second gesture input area 302, and may ignore the collected data in other areas.
The implementation of the ToF sensor and associated modules on a notebook computer is described below. In the following description, the direction facing the ToF sensor means a direction perpendicular to the ToF sensor and facing the measurement area of the ToF sensor.
Referring to fig. 8A, a schematic diagram of a ToF sensor module according to an embodiment of the present application is shown.
Taking the ToF sensor 181 as an example, the ToF sensor module includes: the first support 81, the second support 82, the first magnetic pole structure 83, the second magnetic pole structure 84, the first magnetic material 85, the second magnetic material 86, the first spring 87, the second spring 88, and the ToF sensor 181.
Wherein the first support 81 and the second support 82 are planar structures. In the implementation shown in fig. 8A, the first magnetic pole structure 83, the second magnetic pole structure 84, and the ToF sensor 181 are fixed on the first support 81; the first magnetic material 85 and the second magnetic material 86 are fixed on the second support 82.
The first spring 87 and the second spring 88 are used for connecting the first support 81 and the second support 82, and the connection positions of the first spring 87 and the second spring 88 with the first support 81 are respectively located on two opposite sides of the ToF sensor 181.
The processor 110 is configured to control the first magnetic pole structure 83 and the second magnetic pole structure 84 to energize and generate an electromagnetic field, thereby generating a first force between the first magnetic pole structure 83 and the first magnetic material 85 and a second force between the second magnetic pole structure 84 and the second magnetic material 86. The directions of the first acting force and the second acting force are opposite, at this time, one spring is compressed, and the other spring is stretched, so that the first supporting member 81 tilts, and the ToF sensor 181 is driven to tilt.
When the first support 81 is not tilted, the ToF sensor 181 is facing the front of the screen of the notebook computer, and when the first gesture input area 301 in fig. 7 is activated, the ToF sensor 181 is not facing the first gesture input area 301. In practical application, when the ToF sensor 181 is opposite to the object, the acquired distance data is more accurate, and the contact surface between the emitted infrared light or laser and the object is relatively larger, so that the distance data of the object can be acquired more comprehensively.
Therefore, the processor 110 can control the first magnetic pole structure 83 and the second magnetic pole structure 84 at this time, so that the angle of the ToF sensor 181 is deflected, so that the ToF sensor 181 can be opposite to the first gesture input area 301, accuracy of the acquired distance data is improved, distance data of the object can be acquired more comprehensively, and accuracy of gesture motion recognition is further improved.
Various implementations of the ToF sensor module are described in detail below.
Referring to fig. 8B, a schematic diagram of another ToF sensor module according to an embodiment of the present application is shown.
The first magnetic pole structure 83 in fig. 8B includes a variable power supply 831 and a solenoid T1, and the second magnetic pole structure 84 includes a variable power supply 841 and a solenoid T2. One end of the first magnetic material 85 facing the first support member 81 is an N pole, and one end of the first magnetic material 85 far away from the first support member 81 is an S pole; the end of the second magnetic material 86 facing the first support 81 is N-pole, and the end of the second magnetic material 86 facing away from the first support 81 is S-pole.
The variable power supplies 831 and 841 can be implemented by a variable power supply chip in the prior art, and the specific implementation thereof will not be described herein. Since the winding patterns of the solenoid T1 and the solenoid T2 are the same, the current direction output from the variable power supply 831 to the solenoid T1 is opposite to the current direction output from the variable power supply 841 to the solenoid T2.
When the processor 110 needs to control the ToF sensor 181 to deflect, the processor 110 sends a control signal to the variable power supply 831 to control the variable power supply 831 to output a current to the solenoid T1, so that the solenoid T1 generates an electromagnetic field, at this time, one end of the solenoid T1 close to the first support 81 corresponds to the S pole, and one end far from the first support 81 corresponds to the N pole. The solenoid T1 attracts the first magnetic material 85, thereby compressing the first spring 87.
Meanwhile, the processor 110 sends a control signal to the variable power source 841, and controls the variable power source 841 to output a current to the solenoid T2, so that the solenoid T2 generates an electromagnetic field; at this time, one end of the solenoid T2 close to the first support 81 corresponds to the N pole, and one end far from the first support 81 corresponds to the S pole. The solenoid T2 and the second magnetic material 86 repel each other, thereby stretching the second spring 88.
The processor 110 deflects the ToF sensor 181 by the above control.
Referring to fig. 8C, a schematic diagram of deflection of the ToF sensor corresponding to fig. 8B according to an embodiment of the present application is shown.
It is seen that at this point, first spring 87 is compressed and second spring 88 is stretched, which cooperate to deflect ToF sensor 181 to the left as shown.
In some embodiments, the ToF sensor 181 may have a certain tracking capability. The correspondence between the deflection angle of the ToF sensor 181 and the output current of the variable power supply is calibrated by testing in advance and stored, for example, in the form of a data table. The processor acquires the distance data sent by the ToF sensor 181 in real time, determines the included angle between the facing direction of the ToF sensor and the user's hand, and determines the output current of the variable power supply according to the correspondence and the included angle. The processor deflects the ToF sensor 181 by controlling the variable power supply so that the included angle becomes zero, that is, so that the ToF sensor directly faces the user's hand. When the inclination angle of the ToF sensor is not adjusted, the ToF sensor is parallel to the screen of the notebook computer, and the facing direction of the ToF sensor is perpendicular to the screen.
It will be appreciated that the deflection adjustment capability of the ToF sensor 181 may be limited. If the included angle between the facing direction of the ToF sensor and the user's hand is too large and exceeds the deflection adjustment capability of the ToF sensor 181, the processor controls the variable power supply to deflect the ToF sensor 181 so that the included angle is reduced as much as possible, that is, so that the ToF sensor faces the user's hand as closely as possible.
In other embodiments, the ToF sensor 181 faces directly forward from the screen of the notebook computer when not activated, i.e. the first support 81 and the second support 82 remain horizontal, and the ToF sensor 181 does not track the user's hand but remains facing the gesture input area. With continued reference to fig. 4, the correspondence between the deflection angle of the ToF sensor 181 and the output current of the variable power supply is calibrated in advance by testing and stored, for example, in the form of a data table. In this case there are only two sets of data for each ToF sensor. Taking the ToF sensor 181 as an example, when the first gesture input area 301 is enabled, the processor controls the variable power supply according to the correspondence between the first set of deflection angles and the output current, so that the ToF sensor 181 faces the first gesture input area 301; referring to fig. 5, when the second gesture input area 302 is enabled, the processor controls the variable power supply according to the correspondence between the second set of deflection angles and the output current, so that the ToF sensor 181 faces the second gesture input area 302.
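The calibration table described above lends itself to a simple lookup. The following sketch covers both the hand-tracking mode and the fixed per-area mode; the table entries, currents, deflection angles, and function names are illustrative assumptions only.

```python
# Illustrative use of a pre-calibrated deflection-angle -> supply-current table.
from typing import Callable, Dict, List, Tuple

CALIBRATION: List[Tuple[float, float]] = [      # (deflection angle in degrees, current in mA)
    (-30.0, -120.0), (-15.0, -60.0), (0.0, 0.0), (15.0, 60.0), (30.0, 120.0),
]

def current_for_angle(angle_deg: float) -> float:
    """Clamp to the adjustable range, then interpolate linearly between calibration points."""
    angle_deg = max(CALIBRATION[0][0], min(CALIBRATION[-1][0], angle_deg))
    for (a0, c0), (a1, c1) in zip(CALIBRATION, CALIBRATION[1:]):
        if a0 <= angle_deg <= a1:
            return c0 + (angle_deg - a0) / (a1 - a0) * (c1 - c0)
    return 0.0

def track_hand(angle_to_hand_deg: float, set_supply_current: Callable[[float], None]) -> None:
    """Tracking mode: drive the angle between the sensor's facing direction and the hand toward zero."""
    set_supply_current(current_for_angle(angle_to_hand_deg))

# Fixed mode: one pre-calibrated deflection per gesture input area (angles assumed).
AREA_DEFLECTION_DEG: Dict[str, float] = {
    "first_gesture_input_area": 25.0,
    "second_gesture_input_area": -25.0,
}

def point_at_area(area: str, set_supply_current: Callable[[float], None]) -> None:
    set_supply_current(current_for_angle(AREA_DEFLECTION_DEG[area]))
```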
Referring to fig. 9A, a schematic diagram of another ToF sensor module according to an embodiment of the present application is shown.
The ToF sensor module shown in fig. 9A differs from that of fig. 8B in that: solenoid T1 and solenoid T2 are powered by the same variable power supply 89. Solenoid T1 and solenoid T2 are wound in different ways to produce electromagnetic fields in different directions. The variable power supply 89 outputs the same current to the solenoid T1 and the solenoid T2.
At this time, the end of the solenoid T1 close to the first support 81 corresponds to the S pole, the end far from the first support 81 corresponds to the N pole, and the solenoid T1 and the first magnetic material 85 attract each other, thereby compressing the first spring 87; the end of the solenoid T2 close to the first support 81 corresponds to the N pole, the end far from the first support 81 corresponds to the S pole, and the solenoid T2 and the second magnetic material 86 repel each other, thereby stretching the second spring 88. The two cooperate to deflect the ToF sensor 181 to the left as shown.
The shared variable power supply can reduce hardware cost and reduce the volume of the ToF sensor module, thereby facilitating the realization of miniaturized design.
Referring to fig. 9B, a schematic diagram of another ToF sensor module according to an embodiment of the application is shown.
The ToF sensor module shown in fig. 9B differs from that of fig. 8B in that: the end of the second magnetic material 86 facing the first support 81 is an S-pole, and the end of the second magnetic material 86 facing away from the first support 81 is an N-pole.
At this time, the processor 110 sends a control signal to the variable power supply 831, and controls the variable power supply 831 to output a current to the solenoid T1 so that the solenoid T1 generates an electromagnetic field, one end of the solenoid T1 close to the first support 81 corresponds to an S pole, and one end far from the first support 81 corresponds to an N pole. The solenoid T1 attracts the first magnetic material 85, thereby compressing the first spring 87.
Meanwhile, the processor 110 sends a control signal to the variable power source 841, and controls the variable power source 841 to output a current to the solenoid T2, so that the solenoid T2 generates an electromagnetic field; at this time, one end of the solenoid T2 close to the first support 81 corresponds to the S pole, and one end far from the first support 81 corresponds to the N pole. The solenoid T2 and the second magnetic material 86 repel each other, thereby stretching the second spring 88. The two cooperate to deflect the ToF sensor 181 to the left as shown.
Referring to fig. 9C, a schematic diagram of another ToF sensor module according to an embodiment of the application is shown.
The ToF sensor module shown in fig. 9C differs from that of fig. 9B in that: solenoid T1 and solenoid T2 are powered by the same variable power supply 89. Solenoid T1 and solenoid T2 are wound in the same winding manner to generate electromagnetic fields in the same direction. The variable power supply 89 outputs the same current to the solenoid T1 and the solenoid T2.
The shared variable power supply can reduce hardware cost and reduce the volume of the ToF sensor module, thereby facilitating the realization of miniaturized design.
Referring to fig. 10, a schematic diagram of another ToF sensor module according to an embodiment of the present application is shown.
The ToF sensor module shown in fig. 10 differs from that of fig. 8A-9C in that: the first magnetic material 85, the second magnetic material 86, and the ToF sensor 181 are fixed on the first support 81; the first and second magnetic pole structures 83 and 84 are fixed to the second support 82.
The processor 110 is configured to control the first magnetic pole structure 83 and the second magnetic pole structure 84 to energize and generate an electromagnetic field, thereby generating a first force between the first magnetic pole structure 83 and the first magnetic material 85 and a second force between the second magnetic pole structure 84 and the second magnetic material 86. The directions of the first acting force and the second acting force are opposite, at this time, one spring is compressed, and the other spring is stretched, so that the first supporting member 81 tilts, and the ToF sensor 181 is driven to tilt.
When the structure of fig. 10 is adopted, the specific implementation of the ToF sensor module can be seen from the description in fig. 8A to 9C, and the difference is only that the relative positions of the magnetic material and the magnetic pole structure are exchanged, which is not described herein again.
Other implementations of the ToF sensor and associated modules are described below.
Referring to fig. 11A, a schematic diagram of another ToF sensor module according to an embodiment of the present application is shown.
Taking the ToF sensor 181 as an example again, the ToF sensor module includes: a first cambered surface support 91, a second cambered surface support 92, the first magnetic pole structure 83, the second magnetic pole structure 84, the first magnetic material 85, the second magnetic material 86, the first spring 87, the second spring 88, and the ToF sensor 181.
The area of the first cambered surface support 91 is larger than that of the second cambered surface support 92, and the curvature of the first cambered surface support 91 is smaller than that of the second cambered surface support 92. The first cambered surface support 91 is in contact with and tangent to the second cambered surface support 92, but the two are not fixed to each other.
The first magnetic pole structure 83 and the second magnetic pole structure 84 are fixed on the second cambered surface support 92; the first magnetic material 85 and the second magnetic material 86 are fixed on the first arc surface support 91. The first spring 87 and the second spring 88 are used to connect the first arc support 91 and the ToF sensor 181, and the first spring 87 and the second spring 88 are fixed to opposite ends of the ToF sensor 181, respectively.
The processor 110 is configured to control the first magnetic pole structure 83 and the second magnetic pole structure 84 to energize and generate an electromagnetic field, thereby generating a first force between the first magnetic pole structure 83 and the first magnetic material 85 and a second force between the second magnetic pole structure 84 and the second magnetic material 86. The direction of the first acting force and the direction of the second acting force are opposite, so that one end of the second cambered surface supporting member 92 is tilted, and the other end falls back, thereby causing one spring to be compressed, and the other spring to be stretched, so as to drive the ToF sensor 181 to incline.
When the second cambered surface supporting member 92 is not inclined, the ToF sensor 181 is opposite to the front of the screen of the notebook computer, and when the first gesture input area 301 in fig. 7 is activated, the ToF sensor 181 is not opposite to the first gesture input area 301.
The processor 110 may control the first magnetic pole structure 83 and the second magnetic pole structure 84 to deflect the angle of the ToF sensor 181, so that the ToF sensor 181 can be opposite to the first gesture input area 301, accuracy of the acquired distance data is improved, distance data of an object can be acquired more comprehensively, and accuracy of gesture motion recognition is further improved.
Various implementations of the ToF sensor module are described in detail below.
Referring to fig. 11B, a schematic diagram of another ToF sensor module according to an embodiment of the present application is shown.
The first magnetic pole structure 83 in fig. 11B includes a variable power supply 831 and a solenoid T1, and the second magnetic pole structure 84 includes a variable power supply 841 and a solenoid T2. One end of the first magnetic material 85 facing the second cambered surface supporting piece 92 is an N pole, and one end of the first magnetic material 85 far away from the second cambered surface supporting piece 92 is an S pole; the end of the second magnetic material 86 facing the second cambered surface supporting member 92 is an N pole, and the end of the second magnetic material 86 far away from the second cambered surface supporting member 92 is an S pole.
The variable power supplies 831 and 841 can be implemented by a variable power supply chip in the prior art, and the specific implementation thereof will not be described herein. Since the winding patterns of the solenoid T1 and the solenoid T2 are the same, the current direction output from the variable power supply 831 to the solenoid T1 is opposite to the current direction output from the variable power supply 841 to the solenoid T2.
When the processor 110 needs to control the ToF sensor 181 to deflect, the processor 110 sends a control signal to the variable power supply 831 to control the variable power supply 831 to output current to the solenoid T1, so that the solenoid T1 generates an electromagnetic field, at this time, one end of the solenoid T1 close to the first cambered surface supporting member 91 corresponds to an S pole, and one end far from the first cambered surface supporting member 91 corresponds to an N pole. The solenoid T1 and the first magnetic material 85 attract each other, so that the left side of the second arc support 92 is forced to fall back, thereby compressing the first spring 87.
Meanwhile, the processor 110 sends a control signal to the variable power source 841, and controls the variable power source 841 to output a current to the solenoid T2, so that the solenoid T2 generates an electromagnetic field; at this time, one end of the solenoid T2 close to the first cambered surface support 91 corresponds to the N pole, and one end far from the first cambered surface support 91 corresponds to the S pole. The solenoid T2 and the second magnetic material 86 repel each other, so that the right side of the second cambered surface support 92 is lifted by the force, thereby stretching the second spring 88.
The processor 110 deflects the ToF sensor 181 by the above control.
The two sides of the ToF sensor in fig. 8A to 10 are symmetrical structures, and can co-act to adjust the inclination angle of the ToF sensor. In other embodiments, the ToF sensor may be provided with a fixed structure on one side and the structure of the above embodiment on the other side, e.g. retaining the first magnetic pole structure, the first spring and the first magnetic material, and without providing the second magnetic pole structure and the second magnetic material on the other side, and with a non-deformable support instead of the second spring. At this time, the inclination angle of the ToF sensor is adjusted only by the deformation of the first spring.
Referring to fig. 11C, a schematic diagram of deflection of the ToF sensor corresponding to fig. 11B according to an embodiment of the present application is shown.
It is seen that at this point, first spring 87 is compressed and second spring 88 is stretched, which cooperate to deflect ToF sensor 181 to the left as shown.
In some embodiments, the ToF sensor 181 may have a certain tracking capability. The correspondence between the deflection angle of the ToF sensor 181 and the output current of the variable power supply is calibrated by testing in advance and stored, for example, in the form of a data table. The processor acquires the distance data sent by the ToF sensor 181 in real time, determines the included angle between the facing direction of the ToF sensor and the user's hand, and determines the output current of the variable power supply according to the correspondence and the included angle. The processor deflects the ToF sensor 181 by controlling the variable power supply so that the included angle becomes zero, that is, so that the ToF sensor directly faces the user's hand.
It will be appreciated that the deflection adjustment capability of the ToF sensor 181 may be limited. If the included angle between the facing direction of the ToF sensor and the user's hand is too large and exceeds the deflection adjustment capability of the ToF sensor 181, the processor controls the variable power supply to deflect the ToF sensor 181 so that the included angle is reduced as much as possible, that is, so that the ToF sensor faces the user's hand as closely as possible.
In other embodiments, the ToF sensor 181 faces directly forward from the screen of the notebook computer when not activated, i.e. the first cambered surface support 91 and the second cambered surface support 92 remain in their initial positions, and the ToF sensor 181 does not track the user's hand but remains facing the gesture input area. With continued reference to fig. 4, the correspondence between the deflection angle of the ToF sensor 181 and the output current of the variable power supply is calibrated in advance by testing and stored, for example, in the form of a data table. In this case there are only two sets of data for each ToF sensor. Taking the ToF sensor 181 as an example, when the first gesture input area 301 is enabled, the processor controls the variable power supply according to the correspondence between the first set of deflection angles and the output current, so that the ToF sensor 181 faces the first gesture input area 301; referring to fig. 5, when the second gesture input area 302 is enabled, the processor controls the variable power supply according to the correspondence between the second set of deflection angles and the output current, so that the ToF sensor 181 faces the second gesture input area 302.
Referring to fig. 12A, a schematic diagram of another ToF sensor module according to an embodiment of the present application is shown.
The ToF sensor module shown in fig. 12A differs from that of fig. 11B in that: solenoid T1 and solenoid T2 are powered by the same variable power supply 89. Solenoid T1 and solenoid T2 are wound in different ways to produce electromagnetic fields in different directions. The variable power supply 89 outputs the same current to the solenoid T1 and the solenoid T2.
At this time, one end of the solenoid T1 close to the first cambered surface support 91 corresponds to the S pole and one end far from the first cambered surface support 91 corresponds to the N pole, so that the solenoid T1 and the first magnetic material 85 attract each other and the left side of the second cambered surface support 92 is forced to fall back, thereby compressing the first spring 87; one end of the solenoid T2 close to the first cambered surface support 91 corresponds to the N pole and one end far from the first cambered surface support 91 corresponds to the S pole, so that the solenoid T2 and the second magnetic material 86 repel each other and the right side of the second cambered surface support 92 is forced to tilt up, thereby stretching the second spring 88. The two cooperate to deflect the ToF sensor 181 to the left as shown.
The shared variable power supply can reduce hardware cost and reduce the volume of the ToF sensor module, thereby facilitating the realization of miniaturized design.
Referring to fig. 12B, a schematic diagram of another ToF sensor module according to an embodiment of the application is shown.
The ToF sensor module shown in fig. 12B differs from that of fig. 11B in that: the end of the second magnetic material 86 facing the second cambered surface supporting member 92 is an S pole, and the end of the second magnetic material 86 far away from the second cambered surface supporting member 92 is an N pole.
At this time, the processor 110 sends a control signal to the variable power supply 831, and controls the variable power supply 831 to output a current to the solenoid T1, so that the solenoid T1 generates an electromagnetic field, one end of the solenoid T1 close to the first arc support 91 corresponds to an S pole, and one end far from the first arc support 91 corresponds to an N pole. The solenoid T1 and the first magnetic material 85 attract each other, so that the left side of the second arc support 92 is forced to fall back, thereby compressing the first spring 87.
Meanwhile, the processor 110 sends a control signal to the variable power source 841, and controls the variable power source 841 to output a current to the solenoid T2, so that the solenoid T2 generates an electromagnetic field; at this time, one end of the solenoid T2 close to the first cambered surface support 91 corresponds to the S pole, and one end far from the first cambered surface support 91 corresponds to the N pole. The solenoid T2 and the second magnetic material 86 repel each other, so that the right side of the second cambered surface support 92 is lifted by the force, thereby stretching the second spring 88. The two cooperate to deflect the ToF sensor 181 to the left as shown.
Referring to fig. 12C, a schematic diagram of another ToF sensor module according to an embodiment of the application is shown.
The ToF sensor module shown in fig. 12C differs from that of fig. 12B in that: solenoid T1 and solenoid T2 are powered by the same variable power supply 89. Solenoid T1 and solenoid T2 are wound in the same winding manner to generate electromagnetic fields in the same direction. The variable power supply 89 outputs the same current to the solenoid T1 and the solenoid T2.
The shared variable power supply can reduce hardware cost and reduce the volume of the ToF sensor module, thereby facilitating the realization of miniaturized design.
Referring to fig. 13, a schematic diagram of another ToF sensor module according to an embodiment of the present application is shown.
The ToF sensor module shown in fig. 13 differs from that of fig. 11A-12C in that: the first magnetic material 85 and the second magnetic material 86 are fixed on the first arc surface support 91; the first magnetic pole structure 83 and the second magnetic pole structure 84 are fixed to the second cambered surface support 92.
The processor 110 is configured to control the first magnetic pole structure 83 and the second magnetic pole structure 84 to be energized and generate an electromagnetic field, thereby generating a first force between the first magnetic pole structure 83 and the first magnetic material 85 and a second force between the second magnetic pole structure 84 and the second magnetic material 86. The directions of the first force and the second force are opposite, so that one end of the second cambered surface support 92 falls back and the other end tilts up; one spring is compressed and the other spring is stretched, thereby driving the ToF sensor 181 to tilt.
When the structure of fig. 13 is adopted, the specific implementation of the ToF sensor module can be seen from the description in fig. 11A-12C, and the difference is only that the relative positions of the magnetic material and the magnetic pole structure are exchanged, which is not described herein.
The above embodiments illustrate the working principle of the ToF sensor and the related modules on a notebook computer. The following specifically describes a specific flow for realizing the mouse operation function using gestures.
Referring to fig. 14, a flowchart of a method for acquiring an input operation according to an embodiment of the present application is shown.
The method comprises the following steps:
s10: and performing monitoring setting and inputting gesture actions.
The solution of the present application aims to realize mouse input operations through gesture actions, so a correspondence exists between gesture actions and input operations, and this correspondence needs to be entered in advance in the ToF application component on the notebook computer 100.
The input operations of the mouse mainly include: a single-click operation, a double-click operation, a scroll-wheel sliding operation, a right-click operation, and the like. The user needs to enter corresponding gesture actions for these operations in advance; the embodiment of the present application does not limit the specific gesture actions, which can be determined according to the input habit of the user.
In addition, there is a need for monitoring settings, which may include, but are not limited to, one or more of the following: setting an enabled ToF sensor, setting an enabled gesture input area, setting a detection sensitivity of a gesture motion, and setting a correspondence between the enabled ToF sensor and the enabled gesture input area.
The detection sensitivity characterizes how sensitively the ToF sensor detects gesture input. When the detection sensitivity of gesture motion is set higher, the processor can recognize and judge hand motions with smaller amplitude; when the detection sensitivity of gesture motion is set lower, the processor recognizes and judges only hand motions with larger amplitude, and hand motions with smaller amplitude are treated as interference motions and rejected.
Regarding the correspondence between the set-up activated ToF sensor and the enabled gesture input area, reference may be made to the above description related to fig. 4 to 7, and the embodiments of the present application are not described herein again.
S20: and detecting gestures of a user and acquiring distance data.
S30: the distance data is sent to the processor.
The ToF sensor sends the acquired distance data to the processor, and the processor recognizes the gesture.
S40: and identifying and obtaining the current gesture of the user according to the acquired distance data.
The processor of the notebook computer determines the user's gesture action at this moment according to the distance data acquired by the ToF sensor and its own recognition algorithm. The recognition algorithm recognizes the user's intention based on the captured distance data; because the user of a notebook computer is relatively fixed, learning can be performed on the basis of that user's gesture habits, so that the recognition accuracy becomes higher and higher.
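Purely as a simplified assumption of what such a recognition step could look like, the sketch below matches a captured distance trace against stored gesture templates by mean squared error; the actual recognition algorithm and any learning from user habits are not specified by this sketch.

```python
# Illustrative nearest-template matcher over distance traces (templates and threshold assumed).
from typing import Dict, List, Optional

def mean_squared_error(a: List[float], b: List[float]) -> float:
    n = min(len(a), len(b))
    if n == 0:
        return float("inf")
    return sum((a[i] - b[i]) ** 2 for i in range(n)) / n

def recognize(trace: List[float], templates: Dict[str, List[float]],
              max_error: float = 50.0) -> Optional[str]:
    """Return the name of the closest stored gesture template, or None if nothing is close enough."""
    best_name, best_error = None, float("inf")
    for name, template in templates.items():
        error = mean_squared_error(trace, template)
        if error < best_error:
            best_name, best_error = name, error
    return best_name if best_error <= max_error else None
```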
S50: and determining an input operation corresponding to the current gesture and inputting the input operation to the system.
According to the correspondence between gesture actions and input operations entered in S10, the processor converts the current user's gesture action into the corresponding input operation and inputs it to the system, so that the gesture action replaces the mouse to complete the input.
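A hedged sketch of this conversion step is given below; the gesture names, the mapping, and the send_to_system stub are illustrative assumptions rather than the actual input-injection interface.

```python
# Illustrative: convert a recognized gesture into the pre-entered input operation (S10)
# and hand it to the system. send_to_system() stands in for the real input-injection path.
GESTURE_TO_OPERATION = {
    "index_tap": "left_click",
    "index_double_tap": "double_click",
    "middle_tap": "right_click",
    "two_finger_slide_up": "scroll_up",
}

def send_to_system(operation: str) -> None:
    print(f"injecting input operation: {operation}")   # placeholder for the OS input path

def handle_recognized_gesture(gesture: str) -> None:
    operation = GESTURE_TO_OPERATION.get(gesture)
    if operation is not None:
        send_to_system(operation)      # the gesture replaces the mouse to complete the input

handle_recognized_gesture("index_tap")  # injecting input operation: left_click
```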
The flow of performing the monitoring setting and entering the gesture operation in S10 is specifically described below.
Referring to fig. 15A, a flowchart of a method for inputting gesture actions and performing monitoring settings according to an embodiment of the present application is shown.
The process comprises the following steps:
s11: setting the detection sensitivity of gesture actions.
The detection sensitivity characterizes how sensitively the ToF sensor detects gesture input. When the detection sensitivity of gesture motion is set higher, the processor can recognize and judge hand motions with smaller amplitude; when the detection sensitivity of gesture motion is set lower, the processor recognizes and judges only hand motions with larger amplitude, and hand motions with smaller amplitude are treated as interference motions and rejected.
The detection sensitivity can include a plurality of sensitivity levels, and the user can set the detection sensitivity according to the habit of gesture input.
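One possible way to realize such sensitivity levels is a per-level minimum motion amplitude, as sketched below; the level names and threshold values are assumptions.

```python
# Illustrative: per-sensitivity-level minimum motion amplitude (values in mm are assumptions).
MIN_AMPLITUDE_MM = {"high": 5.0, "medium": 15.0, "low": 30.0}

def is_gesture_candidate(motion_amplitude_mm: float, sensitivity: str = "medium") -> bool:
    """Motions smaller than the level's threshold are treated as interference and rejected."""
    return motion_amplitude_mm >= MIN_AMPLITUDE_MM[sensitivity]

print(is_gesture_candidate(8.0, "high"))  # True  - high sensitivity accepts small motions
print(is_gesture_candidate(8.0, "low"))   # False - low sensitivity rejects them as interference
```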
S12: an enabled ToF sensor and an enabled gesture input area are set.
With continued reference to fig. 7, when two ToF sensors are included on the notebook computer, the user may set at least one of the ToF sensor 181 and the ToF sensor 182 to be on.
The user also selects at least one of the first gesture input area 301 and the second gesture input area 302 to be enabled.
S13: setting a correspondence between enabled ToF sensors and enabled gesture input areas.
In some embodiments, the user enables both the ToF sensor 181 and the ToF sensor 182, and both the first gesture input area 301 and the second gesture input area 302. At this time, the user may set the ToF sensor 181 to correspond to the first gesture input area 301, so that the processor processes the distance data in the first gesture input area 301 and discards the other distance data acquired by the ToF sensor 181; and set the ToF sensor 182 to correspond to the second gesture input area 302, so that the processor processes the distance data in the second gesture input area 302 and discards the other distance data acquired by the ToF sensor 182.
In practical applications, the distance data acquired by the ToF sensor carries direction information, that is, each distance data includes distance information and direction information. The processor can judge whether the distance data is the distance data collected in the gesture input area according to the direction information.
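Since each distance sample carries direction information, filtering by gesture input area can be as simple as an azimuth-range check, as sketched below; the data structure and the angle ranges are illustrative assumptions.

```python
# Illustrative: each ToF sample carries distance and direction; keep only samples
# whose azimuth falls inside the enabled gesture input area (angle ranges assumed).
from dataclasses import dataclass

@dataclass
class ToFSample:
    distance_mm: float
    azimuth_deg: float    # direction information reported with the sample

GESTURE_AREAS = {
    "first_gesture_input_area": (0.0, 60.0),      # right side of the keyboard (assumed)
    "second_gesture_input_area": (120.0, 180.0),  # left side of the keyboard (assumed)
}

def in_enabled_area(sample: ToFSample, enabled_area: str) -> bool:
    low, high = GESTURE_AREAS[enabled_area]
    return low <= sample.azimuth_deg <= high

samples = [ToFSample(210.0, 35.0), ToFSample(500.0, 150.0)]
kept = [s for s in samples if in_enabled_area(s, "first_gesture_input_area")]
print(kept)   # only the sample at 35 degrees is processed; the other is discarded
```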
In other embodiments, the user enables both ToF sensor 181 and ToF sensor 182 and only one gesture input area, such as first gesture input area 301, is enabled. At this time, the user may set the ToF sensor 181 and the ToF sensor 182 to correspond to the first gesture input area 301 at the same time, so as to implement redundancy detection and improve accuracy.
In still other embodiments, the user has one ToF sensor enabled, such as ToF sensor 181 enabled, and only one gesture input area, such as first gesture input area 301 enabled. At this point the user may set the ToF sensor 181 to correspond to the first gesture input area 301.
S14: save the settings and start making gesture settings.
After the ToF application component completes the setting, the user saves the setting result and performs gesture input.
S15: the monitoring range of the enabled ToF sensor is adjusted.
The processor adjusts the monitoring range of the activated ToF sensor, i.e. the deflection angle of the activated ToF sensor, according to the user settings in S12-S13.
Regarding the implementation and the working principle of the module of the ToF sensor, reference may be made to the above descriptions of fig. 8A to 13, and the embodiments of the present application are not repeated here.
In some embodiments, with continued reference to fig. 7, the user enables both the ToF sensor 181 and the ToF sensor 182, and both the first gesture input area 301 and the second gesture input area 302. After the user sets the ToF sensor 181 to correspond to the first gesture input area 301 and sets the ToF sensor 182 to correspond to the second gesture input area 302, the processor controls the ToF sensor 181 to deflect so that the ToF sensor 181 faces the first gesture input area 301 as directly as possible, and controls the ToF sensor 182 to deflect so that the ToF sensor 182 faces the second gesture input area 302 as directly as possible. This further improves the accuracy of the acquired distance data, allows the distance information of the object to be acquired more comprehensively, and further improves the accuracy of gesture motion recognition.
In other embodiments, the user enables both ToF sensor 181 and ToF sensor 182 and only one gesture input area, such as first gesture input area 301, is enabled. After the user sets the ToF sensor 181 and the ToF sensor 182 to simultaneously correspond to the first gesture input area 301, the processor controls the ToF sensor 181 to deflect, so that the ToF sensor 181 faces the first gesture input area 301 as much as possible; and controls the ToF sensor 182 to deflect such that the ToF sensor 182 is facing as far as possible into the first gesture input region 301.
In still other embodiments, the user has one ToF sensor enabled, such as ToF sensor 181 enabled, and only one gesture input area, such as first gesture input area 301 enabled. After the user sets the ToF sensor 181 to correspond to the first gesture input area 301, the processor controls the ToF sensor 181 to deflect, so that the ToF sensor 181 faces the first gesture input area 301 as much as possible.
In practical applications, the processor consumes power when controlling the ToF sensor to deflect, so when the user does not perform gesture input in the gesture input area for a long time, keeping the ToF sensor deflected causes unnecessary power consumption. Therefore, when no gesture motion is recognized within a first preset time, the processor stops controlling the ToF sensor to deflect, so that the ToF sensor returns to the position facing the front of the screen, thereby reducing power consumption.
The first preset time may be set according to the practical situation and is not specifically limited in the embodiment of the present application; for example, it may be set to 5 minutes. The user can also set it in the ToF application component. When the processor recognizes, from the distance data of the ToF sensor, that the user has started gesture input in the gesture input area again, it resumes controlling the ToF sensor to deflect.
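The timeout behaviour described above could be organized as in the following sketch; the callback names are assumptions, and the 5-minute default follows the example given above.

```python
# Illustrative: release the deflection when no gesture is seen for the first preset time.
import time

class DeflectionPowerManager:
    def __init__(self, stop_deflection, resume_deflection, preset_seconds: float = 300.0):
        self.stop_deflection = stop_deflection        # restores the sensor to face the screen front
        self.resume_deflection = resume_deflection    # re-applies the deflection toward the input area
        self.preset_seconds = preset_seconds          # first preset time, e.g. 5 minutes
        self.last_gesture_time = time.monotonic()
        self.deflected = True

    def on_gesture_detected(self) -> None:
        self.last_gesture_time = time.monotonic()
        if not self.deflected:
            self.resume_deflection()
            self.deflected = True

    def tick(self) -> None:
        idle = time.monotonic() - self.last_gesture_time
        if self.deflected and idle >= self.preset_seconds:
            self.stop_deflection()
            self.deflected = False
```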
In some embodiments, after the processor completes adjusting the tilt angle of the ToF sensor, the tilt angle is maintained, so that after the user finishes all the steps of entering gesture actions and monitoring settings, the ToF sensor still faces the gesture input area during subsequent gesture input.
S16: the user performs gesture input in the enabled gesture input area.
S17: distance data is acquired and transmitted to the processor.
The ToF sensor obtains the distance data and transmits the distance data to the processor.
S18: the processor recognizes and obtains the characteristics of the gesture action, and determines that the input of the current gesture action is completed.
S19: setting an input operation corresponding to the gesture.
The input operations include mouse input operations. Once the correspondence between a gesture action and an input operation is saved, the conversion relationship between that gesture action and the corresponding mouse operation is recorded. For example, a single tap with the index finger corresponds to a single mouse click, two taps with the index finger correspond to a double click, a single tap with the middle finger corresponds to a right click, and sliding the index finger and middle finger together corresponds to scrolling the mouse wheel, and so on.
The input operations may also include operations other than mouse operations; that is, gesture actions can implement not only mouse functions but also other functions, for example sliding the palm to the left to rewind video/audio, sliding the palm upward to pause video/audio, sliding the palm to the right to fast-forward video/audio, and so on.
In addition, the input operations may further include other shortcut operations, such as copy and paste, which the embodiments of the present application do not enumerate one by one.
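For illustration, such a correspondence table could be held as a simple dictionary from gesture names to callables. The gesture names below and the use of the pyautogui library to emit the input events are assumptions made for the sake of the example; the embodiments only require that a gesture-to-operation mapping be stored.

```python
# Illustrative only: gesture names and the pyautogui calls are assumptions.
import pyautogui  # hypothetical choice of library for issuing input events

GESTURE_TO_OPERATION = {
    "index_single_tap":  lambda: pyautogui.click(button="left"),
    "index_double_tap":  lambda: pyautogui.doubleClick(button="left"),
    "middle_single_tap": lambda: pyautogui.click(button="right"),
    "two_finger_slide":  lambda: pyautogui.scroll(-3),            # wheel scroll
    "palm_slide_left":   lambda: pyautogui.press("left"),   # e.g. rewind in a player bound to arrow keys
    "palm_slide_up":     lambda: pyautogui.press("space"),  # e.g. pause
    "palm_slide_right":  lambda: pyautogui.press("right"),  # e.g. fast forward
    "pinch":             lambda: pyautogui.hotkey("ctrl", "c"),   # e.g. copy
}


def dispatch(gesture_name: str) -> None:
    """Look up the recognized gesture and perform the mapped input operation."""
    action = GESTURE_TO_OPERATION.get(gesture_name)
    if action is not None:
        action()
```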
The above examples are merely illustrative and do not limit the technical solution of the present application. The correspondence between gesture actions and input operations can be customized according to the user's habits and preferences, achieving personalization and adding interest.
Steps S16-S19 may be executed in a loop multiple times, entering the correspondence between one gesture action and one input operation in each pass, until the user has entered all the gesture-to-operation correspondences they want to use. When entering the correspondence for a single gesture action, S16-S17 may also be performed several times; that is, the user may repeat the gesture in the gesture input area until the processor recognizes its features.
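The entry loop just described can be sketched as follows. The `recognizer` and `ui` objects and their methods are hypothetical placeholders; the embodiments describe the flow, not a concrete API.

```python
def enroll_gesture_bindings(recognizer, ui):
    """Sketch of the S16-S19 loop: capture one gesture, then bind an operation.

    `recognizer` and `ui` are invented stand-ins for the ToF application
    component's internals.
    """
    bindings = []
    while ui.user_wants_another_binding():
        # S16-S17: the user repeats the gesture until its features are recognized
        features = None
        while features is None:
            frames = recognizer.capture_distance_frames()
            features = recognizer.extract_gesture_features(frames)
        # S18-S19: entry is complete; ask which input operation to bind to it
        operation = ui.ask_for_input_operation()
        bindings.append((features, operation))
    return bindings
```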
The above step order does not limit the technical solution of the present application. In practical applications the order of the steps may be adjusted: for example, the gesture entry steps may be performed before the monitoring-setting steps, the order of the monitoring-setting steps themselves may be swapped, or the detection-sensitivity step in S11 may be omitted.
In one possible implementation, after the processor determines that a mouse is connected to the notebook computer, it may automatically turn off the gesture recognition function to reduce power consumption.
The following describes another process of performing monitoring settings and entering gesture actions; steps that are the same as those in fig. 15A are not repeated.
Referring to fig. 15B, a flowchart of a method for entering gesture actions and performing monitoring settings according to an embodiment of the present application is shown.
The process comprises the following steps:
s11: setting the detection sensitivity of gesture actions.
S12: an enabled ToF sensor and an enabled gesture input area are set.
S13: setting a correspondence between enabled ToF sensors and enabled gesture input areas.
S14: save the settings and start making gesture settings.
After the ToF application component completes the setting, the user saves the setting result and performs gesture input.
S15': The user performs gesture input in the enabled gesture input area.
S16': Distance data is acquired and transmitted to the processor.
The ToF sensor obtains the distance data and transmits the distance data to the processor.
S17': The monitoring range of the enabled ToF sensor is adjusted according to the distance data.
The distance data acquired by the ToF sensor includes direction information and distance information. From the direction information, the processor may determine the angle between the direction the ToF sensor faces and the user's hand. The processor may then determine the output current of the variable power supply based on a pre-stored correspondence between the deflection angle of the ToF sensor and the output current of the variable power supply, and deflect the ToF sensor by controlling the variable power supply so that the angle becomes zero, or as close to zero as possible; that is, so that the ToF sensor faces the user's hand as directly as possible.
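A minimal sketch of this control step is shown below. The calibration table relating deflection angle to drive current is invented for illustration; a real correspondence would be pre-stored for the specific solenoid and spring assembly of the module.

```python
import bisect

# Hypothetical pre-stored correspondence between the required deflection angle
# (degrees) and the variable power supply output current (mA).
ANGLE_DEG = [0, 5, 10, 15, 20, 25, 30]
CURRENT_MA = [0, 12, 25, 39, 54, 70, 88]


def current_for_angle(angle_deg: float) -> float:
    """Linearly interpolate the drive current for a desired deflection angle."""
    angle_deg = max(ANGLE_DEG[0], min(angle_deg, ANGLE_DEG[-1]))
    i = bisect.bisect_left(ANGLE_DEG, angle_deg)
    if ANGLE_DEG[i] == angle_deg:
        return float(CURRENT_MA[i])
    a0, a1 = ANGLE_DEG[i - 1], ANGLE_DEG[i]
    c0, c1 = CURRENT_MA[i - 1], CURRENT_MA[i]
    return c0 + (c1 - c0) * (angle_deg - a0) / (a1 - a0)


def steer_towards_hand(angle_to_hand_deg: float, power_supply) -> None:
    """Drive the variable power supply so the sensor turns to face the hand.

    `power_supply.set_output_current` is a hypothetical control call.
    """
    power_supply.set_output_current(current_for_angle(abs(angle_to_hand_deg)))
```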
For the implementation and working principle of the ToF sensor module, reference may be made to the descriptions of fig. 8A to fig. 13 above, which are not repeated here.
In practical applications, the processor consumes power while it keeps the ToF sensor deflected. When the user performs no gesture input in the gesture input area for a long time, keeping the sensor deflected therefore causes unnecessary power consumption. Accordingly, when no gesture is recognized within a first preset time, the processor stops driving the ToF sensor to deflect, so that the ToF sensor returns to its default position relative to the screen, thereby reducing power consumption.
The first preset time may be set according to actual requirements and is not limited by the embodiments of the present application; for example, it may be set to 5 minutes. The user may also set this value in the ToF application component. When the processor determines from the distance data of the ToF sensor that the user has started gesture input in the gesture input area again, it resumes controlling the ToF sensor to deflect.
S18': The processor recognizes the features of the gesture action and determines that entry of the current gesture action is complete.
S19': setting an input operation corresponding to the gesture.
Steps S16'-S19' may be executed in a loop multiple times, entering the correspondence between one gesture action and one input operation in each pass, until the user has entered all the gesture-to-operation correspondences they want to use. When entering the correspondence for a single gesture action, S16'-S17' may also be performed several times; that is, the user may repeat the gesture in the gesture input area until the processor recognizes its features.
In summary, with the solution provided by the embodiments of the present application, a ToF sensor is arranged on the notebook computer and the user's gestures are accurately recognized in combination with algorithms and/or big data, so that gestures are converted into the corresponding mouse input operations or other user-defined input operations. This solves both the problem of low working efficiency when the notebook computer has no mouse and the problem of poor convenience caused by carrying a mouse with the notebook computer. In addition, personalized gestures can be customized, which improves interest and user experience and provides more convenience for users with partial hand disabilities. Furthermore, owing to the working principle of the ToF sensor, no entity information such as images of the user is captured when gesture input is implemented, and no personal privacy data of the user is used, so there is no risk of privacy leakage and the solution has high practicability.
Based on the input operation acquisition method provided in the above embodiments, an embodiment of the present application further provides a notebook computer, which is described in detail below with reference to the accompanying drawings.
Referring to fig. 16, a schematic diagram of a notebook computer according to an embodiment of the application is shown.
The notebook computer 100 provided in the embodiment of the present application includes at least one ToF sensor, and the specific implementation and working principle of the ToF sensor can be referred to the description in the above embodiment, which is not repeated herein.
In fig. 16, the notebook computer 100 includes two ToF sensors as an example: ToF sensor 181 is located on the right side of the camera 193, and ToF sensor 182 is located on the left side of the camera 193.
The ToF sensors 181 and 182 are configured to emit infrared light or laser light, which is reflected by the object and returned to the ToF sensor. Based on the time difference between the emission of the light and its return after reflection by the object, the ToF sensors 181 and 182 can measure the distance between the object and themselves.
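The distance follows directly from this round-trip time: the emitted light covers the sensor-to-object path twice, so the one-way distance is c·Δt/2. A trivial sketch:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0


def distance_from_round_trip(delta_t_seconds: float) -> float:
    """One-way distance to the reflecting object from the emit-to-return time.

    The light travels out and back, so distance = c * delta_t / 2. For a hand
    roughly 0.4 m away, delta_t is on the order of 2.7 nanoseconds.
    """
    return SPEED_OF_LIGHT_M_PER_S * delta_t_seconds / 2.0
```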
The ToF sensors 181 and 182 transmit the acquired distance data to the processor of the notebook computer 100. The hardware architecture of the notebook computer 100 can be referred to in the relevant description corresponding to fig. 2, and will not be described herein again; the software architecture of the notebook computer 100 can be referred to in the relevant description of fig. 3, and will not be described herein.
The notebook computer 100 not only adds a ToF sensor to the hardware, but also includes a ToF application component at the application level, which is used for gesture entry, turning the gesture recognition function on and off, configuration, and usage instructions.
In summary, the notebook computer provided by the embodiments of the present application is provided with one or more ToF sensors and accurately recognizes the user's gestures in combination with algorithms and/or big data, so that gestures are converted into the corresponding mouse input operations or other user-defined input operations. This solves both the problem of low working efficiency when the notebook computer has no mouse and the problem of poor convenience caused by carrying a mouse with the notebook computer. In addition, personalized gestures can be customized, which improves interest and user experience and provides more convenience for users with partial hand disabilities.
Furthermore, owing to the working principle of the ToF sensor, no entity information such as images of the user is captured when gesture input is implemented, and no personal privacy data of the user is used, so there is no risk of privacy leakage and the solution has high practicability. Moreover, the ToF sensor has low power consumption, which helps the notebook computer achieve better battery life.
It can be understood that the solution of the embodiments of the present application enables mouse input operations to be performed through gestures when no mouse is attached to the notebook computer, but does not affect or limit input through a mouse once the notebook computer is connected to one by wire or wirelessly. In one possible implementation, after the processor determines that a mouse is connected to the notebook computer, it may automatically turn off the gesture recognition function to reduce power consumption.
In some embodiments, the processor of the notebook computer controls the ToF sensor to face the gesture input area as directly as possible and recognizes the user's current gesture only from the distance data of the gesture input area, which improves recognition accuracy and reduces interference from the surrounding environment.
In other embodiments, the processor of the notebook computer performs gesture recognition based on the distance data over the entire detection range acquired by the ToF sensor, and when it determines that a gesture has been recognized, it controls the ToF sensor to rotate so as to face the user's hand. This implementation is more flexible and does not require the user to perform input within a gesture input area.
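For the first of the two approaches above, one simple way to restrict recognition to the gesture input area is to filter the acquired points against the area's bounds before they reach the recognizer. The point format and bounds below are assumptions for illustration only.

```python
# A minimal filtering sketch: keep only the points whose projected position
# falls inside the configured gesture input area.
def points_in_area(points, area):
    """Return the subset of ToF points lying inside the gesture input area.

    `points` : iterable of (x_mm, y_mm, distance_mm) tuples (assumed format)
    `area`   : dict with keys x_min, x_max, y_min, y_max in millimetres
    """
    return [
        p for p in points
        if area["x_min"] <= p[0] <= area["x_max"]
        and area["y_min"] <= p[1] <= area["y_max"]
    ]
```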
In addition, the embodiment of the present application also provides a readable storage medium having a program stored thereon, which when executed by a processor of a notebook computer, implements the input operation acquisition method provided in the above embodiment.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus and program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by program instructions. These program instructions may be provided to a processor of a programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Readable media, including both permanent and non-permanent, removable and non-removable media, may be implemented in any method or technology for storage of information. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media include, but are not limited to, phase change memory (phase change random access memory, PRAM), static random-access memory (SRAM), dynamic random-access memory (dynamic random access memory, DRAM), other types of random-access memory (random access memory, RAM), read-only memory (ROM), electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), and the like.
It should be understood that in the present application, "at least one (item)" means one or more, and "a plurality" means two or more. "And/or" is used to describe the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: only A exists, only B exists, and both A and B exist, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of single items or plural items. For example, at least one (item) of a, b or c may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b and c may be singular or plural.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (26)

1. A notebook computer, the notebook computer comprising: a processor and at least one time-of-flight ToF sensor module;
the ToF sensor module comprises a ToF sensor, wherein the ToF sensor is used for acquiring distance data between the ToF sensor and the hand of a user;
the inclination angle of the ToF sensor is adjustable, and is an included angle between the ToF sensor and a plane where a screen of the notebook computer is located;
the processor is used for adjusting the inclination angle of the ToF sensor, identifying the current gesture action of the user according to the distance data, and determining the input operation corresponding to the current gesture action according to the corresponding relation between the pre-established gesture action and the input operation.
2. The notebook computer according to claim 1, wherein the processor is specifically configured to adjust an inclination angle of the ToF sensor according to a pre-established correspondence between the ToF sensor and the gesture input area, so that the ToF sensor faces the gesture input area; the gesture input area is an area where a user performs gesture input.
3. The notebook computer of claim 1, wherein the processor is specifically configured to determine an included angle between the hand of the user and a direction in which the ToF sensor is facing according to the distance data, and adjust an inclination angle of the ToF sensor according to the included angle, so that the ToF sensor is facing the hand of the user.
4. The notebook computer of any one of claims 1-3, wherein the processor is further configured to stop adjusting the tilt angle of the ToF sensor when no gesture is recognized for a first preset time.
5. The notebook computer of claim 1, wherein the ToF sensor module further comprises: a first support, a second support, a first magnetic pole structure, a second magnetic pole structure, a first spring, a second spring, a first magnetic material and a second magnetic material;
The first supporting piece and the second supporting piece are of a plane structure;
the first magnetic pole structure is fixed at a first end of the first support member, the second magnetic pole structure is fixed at a second end of the first support member,
the first magnetic material is fixed at the first end of the second support, and the second magnetic material is fixed at the second end of the second support;
the first spring is connected with the first end of the first supporting piece and the first end of the second supporting piece;
the second spring is connected with the second end of the first supporting piece and the second end of the second supporting piece;
the ToF sensor is located between the first end of the first support and the second end of the first support; alternatively, the ToF sensor is located between the first end of the second support and the second end of the second support;
the processor is specifically configured to control the first magnetic pole structure and the second magnetic pole structure to be electrified and generate an electromagnetic field, so that a first acting force is generated between the first magnetic pole structure and the first magnetic material, a second acting force is generated between the second magnetic pole structure and the second magnetic material, and the directions of the first acting force and the second acting force are opposite.
6. The notebook computer of claim 5, wherein the first magnetic pole structure comprises a first variable power supply and a first solenoid, and the second magnetic pole structure comprises a second variable power supply and a second solenoid;
the first magnetic material and the second magnetic material are provided with a first magnetic pole at one end close to the first support piece and a second magnetic pole at one end far away from the first support piece;
the processor is used for controlling the first variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second support, is the second magnetic pole, and the other end of the first solenoid, which is far away from the second support, is the first magnetic pole; and controlling the second variable power supply to output current to the second solenoid so that one end of the second solenoid, which is close to the second support, is the first magnetic pole, and one end of the second solenoid, which is far away from the second support, is the second magnetic pole.
7. The notebook computer of claim 5, wherein the first magnetic pole structure comprises a first variable power supply and a first solenoid, and the second magnetic pole structure comprises a second variable power supply and a second solenoid;
The first magnetic material is provided with a first magnetic pole at one end close to the first support piece and a second magnetic pole at one end far away from the first support piece;
the second magnetic material is provided with a second magnetic pole at one end close to the first support piece, and is provided with a first magnetic pole at one end far away from the first support piece;
the processor is used for controlling the first variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second support, is the second magnetic pole, and the other end of the first solenoid, which is far away from the second support, is the first magnetic pole; and controlling the second variable power supply to output current to the second solenoid so that one end of the second solenoid, which is close to the second support, is the second magnetic pole, and one end of the second solenoid, which is far away from the second support, is the first magnetic pole.
8. The notebook computer of claim 5, wherein the ToF sensor module further comprises: a variable power supply; the first magnetic pole structure comprises a first solenoid and the second magnetic pole structure comprises a second solenoid; the winding directions of the first solenoid and the second solenoid are different;
The first magnetic material and the second magnetic material are provided with a first magnetic pole at one end close to the first support piece and a second magnetic pole at one end far away from the first support piece;
the processor is used for controlling the variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second support, is the second magnetic pole, and one end of the first solenoid, which is far away from the second support, is the first magnetic pole; and controlling the variable power supply to output current to the second solenoid so that one end of the second solenoid, which is close to the second support, is the first magnetic pole, and one end of the second solenoid, which is far away from the second support, is the second magnetic pole.
9. The notebook computer of claim 5, wherein the ToF sensor module further comprises: a variable power supply; the first magnetic pole structure comprises a first solenoid and the second magnetic pole structure comprises a second solenoid;
the first magnetic material is provided with a first magnetic pole at one end close to the first support piece and a second magnetic pole at one end far away from the first support piece;
the second magnetic material is provided with a second magnetic pole at one end close to the first support piece, and is provided with a first magnetic pole at one end far away from the first support piece;
The processor is used for controlling the variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second support, is the second magnetic pole, and one end of the first solenoid, which is far away from the second support, is the first magnetic pole; and controlling the variable power supply to output current to the second solenoid so that one end of the second solenoid, which is close to the second support, is the second magnetic pole, and one end of the second solenoid, which is far away from the second support, is the first magnetic pole.
10. The notebook computer of claim 1, wherein the ToF sensor module further comprises: a first cambered surface supporting piece, a second cambered surface supporting piece, a first magnetic pole structure, a second magnetic pole structure, a first spring, a second spring, a first magnetic material and a second magnetic material;
the area of the first cambered surface supporting piece is larger than that of the second cambered surface supporting piece, the radian of the first cambered surface supporting piece is smaller than that of the second cambered surface supporting piece, and the second cambered surface supporting piece is tangent to the first cambered surface supporting piece;
the first magnetic pole structure is fixed at the first end of the first cambered surface supporting piece, the second magnetic pole structure is fixed at the second end of the first cambered surface supporting piece,
The first magnetic material is fixed at the first end of the second cambered surface supporting piece, and the second magnetic material is fixed at the second end of the second cambered surface supporting piece;
the first spring is connected with the first end of the first cambered surface supporting piece and the first end of the ToF sensor;
the second spring is connected with the second end of the first cambered surface supporting piece and the second end of the ToF sensor;
the second cambered surface support is positioned between the ToF sensor and the first cambered surface support and between the first spring and the second spring;
the processor is specifically configured to control the first magnetic pole structure and the second magnetic pole structure to be electrified and generate an electromagnetic field, so that a first acting force is generated between the first magnetic pole structure and the first magnetic material, a second acting force is generated between the second magnetic pole structure and the second magnetic material, and the directions of the first acting force and the second acting force are opposite.
11. The notebook computer of claim 10, wherein the first magnetic pole structure comprises a first variable power supply and a first solenoid, and the second magnetic pole structure comprises a second variable power supply and a second solenoid;
The first magnetic material and the second magnetic material are provided with a first magnetic pole at one end close to the first cambered surface supporting piece and a second magnetic pole at one end far away from the first cambered surface supporting piece;
the processor is used for controlling the first variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second cambered surface support piece, is the second magnetic pole, and one end of the first solenoid, which is far away from the second cambered surface support piece, is the first magnetic pole; and controlling the second variable power supply to output current to the second solenoid, so that one end of the second solenoid, which is close to the second cambered surface support piece, is the first magnetic pole, and one end of the second solenoid, which is far away from the second cambered surface support piece, is the second magnetic pole.
12. The notebook computer of claim 10, wherein the first magnetic pole structure comprises a first variable power supply and a first solenoid, and the second magnetic pole structure comprises a second variable power supply and a second solenoid;
the first magnetic material is provided with a first magnetic pole at one end close to the first cambered surface supporting piece and a second magnetic pole at one end far away from the first cambered surface supporting piece;
the second magnetic material is provided with a first cambered surface support piece, a second cambered surface support piece and a first cambered surface support piece, wherein one end of the second magnetic material, which is close to the first cambered surface support piece, is provided with a second magnetic pole, and one end of the second magnetic material, which is far away from the first cambered surface support piece, is provided with the first magnetic pole;
The processor is used for controlling the first variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second cambered surface support piece, is the second magnetic pole, and one end of the first solenoid, which is far away from the second cambered surface support piece, is the first magnetic pole; and controlling the second variable power supply to output current to the second solenoid, so that one end of the second solenoid, which is close to the second cambered surface support piece, is the second magnetic pole, and one end of the second solenoid, which is far away from the second cambered surface support piece, is the first magnetic pole.
13. The notebook computer of claim 11, wherein the ToF sensor module further comprises: a variable power supply; the first magnetic pole structure comprises a first solenoid and the second magnetic pole structure comprises a second solenoid; the winding directions of the first solenoid and the second solenoid are different;
the first magnetic material and the second magnetic material are provided with a first magnetic pole at one end close to the first cambered surface supporting piece and a second magnetic pole at one end far away from the first cambered surface supporting piece;
the processor is used for controlling the variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second cambered surface supporting piece, is the second magnetic pole, and one end of the first solenoid, which is far away from the second cambered surface supporting piece, is the first magnetic pole; and controlling the variable power supply to output current to the second solenoid, so that one end of the second solenoid, which is close to the second cambered surface support piece, is the first magnetic pole, and one end of the second solenoid, which is far away from the second cambered surface support piece, is the second magnetic pole.
14. The notebook computer of claim 10, wherein the ToF sensor module further comprises: a variable power supply; the first magnetic pole structure comprises a first solenoid and the second magnetic pole structure comprises a second solenoid;
the first magnetic material is provided with a first magnetic pole at one end close to the first cambered surface supporting piece and a second magnetic pole at one end far away from the first cambered surface supporting piece;
the second magnetic material is provided with a first cambered surface support piece, a second cambered surface support piece and a first cambered surface support piece, wherein one end of the second magnetic material, which is close to the first cambered surface support piece, is provided with a second magnetic pole, and one end of the second magnetic material, which is far away from the first cambered surface support piece, is provided with the first magnetic pole;
the processor is used for controlling the variable power supply to output current to the first solenoid so that one end of the first solenoid, which is close to the second cambered surface supporting piece, is the second magnetic pole, and one end of the first solenoid, which is far away from the second cambered surface supporting piece, is the first magnetic pole; and controlling the variable power supply to output current to the second solenoid, so that one end of the second solenoid, which is close to the second cambered surface support piece, is the second magnetic pole, and one end of the second solenoid, which is far away from the second cambered surface support piece, is the first magnetic pole.
15. The notebook computer of claim 2, wherein the notebook computer includes a first ToF sensor that has a pre-established correspondence with a first gesture input area;
The processor is specifically configured to adjust an inclination angle of the first ToF sensor, so that the first ToF sensor faces the first gesture input area; and identifying the gesture action of the user in the first gesture input area by utilizing the distance data of the first gesture input area in the distance data acquired by the first ToF sensor.
16. The notebook computer of claim 2, wherein the notebook computer comprises the following two ToF sensors: a first ToF sensor and a second ToF sensor; the first ToF sensor pre-establishes a corresponding relation with the first gesture input area, and the second ToF sensor pre-establishes a corresponding relation with the second gesture input area;
the processor is specifically configured to adjust an inclination angle of the first ToF sensor, so that the first ToF sensor faces the first gesture input area, and identify a gesture of the user in the first gesture input area by using distance data of the first gesture input area in the distance data acquired by the first ToF sensor; and adjusting the inclination angle of the second ToF sensor so that the second ToF sensor is opposite to the second gesture input area, and identifying gesture actions of the user in the second gesture input area by using the distance data of the second gesture input area in the distance data acquired by the second ToF sensor.
17. The notebook computer of claim 2, wherein the notebook computer comprises the following two ToF sensors: a first ToF sensor and a second ToF sensor; the first ToF sensor and the second ToF sensor both pre-establish a corresponding relation with a first gesture input area;
the processor is specifically configured to adjust an inclination angle of the first ToF sensor and the second ToF sensor, so that the first ToF sensor and the second ToF sensor are opposite to the first gesture input area; and identifying a gesture of the user in the first gesture input area by using the distance data of the first gesture input area in the distance data acquired by the first ToF sensor and the distance data of the first gesture input area in the distance data acquired by the second ToF sensor.
18. The notebook computer of claim 1, wherein the input operation specifically comprises at least one of:
mouse operation or shortcut operation;
the mouse operation comprises one or more of a mouse click operation, a mouse double click operation, a mouse right click operation and a mouse wheel rolling operation;
the shortcut operation includes: a rollback operation of video/audio, a pause operation of video/audio, and a fast-forward operation of video/audio.
19. An input operation acquisition method, applied to a notebook computer, wherein the notebook computer comprises at least one time-of-flight ToF sensor module, and the ToF sensor module comprises a ToF sensor, the method comprises:
adjusting the inclination angle of the ToF sensor, wherein the inclination angle is an included angle between the ToF sensor and a plane where a screen of the notebook computer is positioned;
and identifying the current gesture action of the user according to the distance data between the ToF sensor and the hand of the user, and determining the input operation corresponding to the current gesture action according to the pre-established corresponding relation between the gesture action and the input operation.
20. The method for acquiring an input operation according to claim 19, wherein said adjusting an inclination angle of said ToF sensor specifically comprises:
according to a pre-established corresponding relation between the ToF sensor and the gesture input area, adjusting the inclination angle of the ToF sensor so that the ToF sensor is opposite to the gesture input area; the gesture input area is an area where a user performs gesture input.
21. The method for acquiring an input operation according to claim 19, wherein said adjusting an inclination angle of said ToF sensor specifically comprises:
and determining an included angle between the hand of the user and the opposite direction of the ToF sensor according to the distance data, and adjusting the inclination angle of the ToF sensor according to the included angle so that the ToF sensor is opposite to the hand of the user.
22. The method of acquiring an input operation according to any one of claims 19 to 21, characterized in that the method further comprises:
and stopping adjusting the inclination angle of the ToF sensor when the gesture motion is not recognized for the first preset time.
23. The method of claim 20, wherein the notebook computer comprises a first ToF sensor; before the adjusting the tilt angle of the ToF sensor, the method further includes:
pre-establishing a corresponding relation between the first ToF sensor and a first gesture input area;
the adjusting the inclination angle of the ToF sensor specifically includes:
adjusting the inclination angle of the first ToF sensor so that the first ToF sensor faces the first gesture input area;
The step of identifying the current gesture of the user according to the distance data between the ToF sensor and the hand of the user, specifically includes:
and identifying the gesture action of the user in the first gesture input area by utilizing the distance data of the first gesture input area in the distance data acquired by the first ToF sensor.
24. The method of claim 20, wherein the notebook computer comprises the following two ToF sensors: a first ToF sensor and a second ToF sensor; before the adjusting the tilt angle of the ToF sensor, the method further includes:
pre-establishing a corresponding relation between the first ToF sensor and a first gesture input area, and pre-establishing a corresponding relation between the second ToF sensor and a second gesture input area;
the adjusting the inclination angle of the ToF sensor specifically includes:
adjusting the tilt angle of the first ToF sensor so that the first ToF sensor is facing the first gesture input area, and adjusting the tilt angle of the second ToF sensor so that the second ToF sensor is facing the second gesture input area;
The step of identifying the current gesture of the user according to the distance data between the ToF sensor and the hand of the user, specifically includes:
identifying gesture actions of the user in the first gesture input area by utilizing the distance data of the first gesture input area in the distance data acquired by the first ToF sensor;
and identifying the gesture action of the user in the second gesture input area by using the distance data of the second gesture input area in the distance data acquired by the second ToF sensor.
25. The method of claim 20, wherein the notebook computer comprises the following two ToF sensors: a first ToF sensor and a second ToF sensor; before the adjusting the tilt angle of the ToF sensor, the method further includes:
pre-establishing a correspondence between the first ToF sensor and a first gesture input area, and pre-establishing a correspondence between the second ToF sensor and the first gesture input area;
the adjusting the inclination angle of the ToF sensor specifically includes:
adjusting the tilt angle of the first ToF sensor so that the first ToF sensor is facing the first gesture input area, and adjusting the tilt angle of the second ToF sensor so that the second ToF sensor is facing the first gesture input area;
The step of identifying the current gesture of the user according to the distance data between the ToF sensor and the hand of the user, specifically includes:
and identifying gesture actions of the user in the first gesture input area by using the distance data of the first gesture input area in the distance data acquired by the first ToF sensor and the distance data of the first gesture input area in the distance data acquired by the second ToF sensor.
26. The method of claim 21, further comprising, prior to adjusting the tilt angle of the ToF sensor:
and pre-establishing a corresponding relation between the gesture action and the input operation.