WO2019062205A1 - Electronic tablet, control method therefor, and storage medium - Google Patents

Electronic tablet, control method therefor, and storage medium

Info

Publication number
WO2019062205A1
WO2019062205A1 (Application No. PCT/CN2018/090767)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
display unit
controlling
electronic tablet
gesture image
Prior art date
Application number
PCT/CN2018/090767
Other languages
English (en)
Chinese (zh)
Inventor
薛瑞彬
Original Assignee
京东方科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司
Publication of WO2019062205A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • Embodiments of the present disclosure relate to an electronic tablet, a control method thereof, and a storage medium.
  • Writing boards have become very popular. With the ubiquity of applications, tablets have penetrated into every aspect of everyday life.
  • Traditional writing boards cause a certain amount of pollution because of dedicated pens, shavings, and other consumables, so electronic writing boards have attracted more and more attention as a more environmentally friendly and convenient writing medium.
  • A touch-type electronic tablet requires a dedicated pen to be in contact with the panel to complete writing. On the one hand, the dedicated electronic pen causes a certain amount of pollution; on the other hand, the contact force gradually damages the panel and shortens its service life. Moreover, contact-based writing requires the user to stay close to the panel and to control the writing force appropriately, all of which make writing inconvenient.
  • A method of controlling an electronic tablet comprising a display unit, the method comprising: acquiring a gesture image of an operating body; acquiring a distance between the operating body and the display unit, wherein the distance is greater than zero; determining, based on the gesture image and the distance, a control instruction for controlling the electronic tablet; and controlling display output of the display unit based on the control instruction.
  • For example, the step of determining, according to the gesture image and the distance, the control instruction for controlling the electronic tablet comprises: when the gesture image is a first gesture image and the distance is a first distance, determining that the control instruction is a prompt icon display instruction for a first function.
  • For example, determining the control instruction of the electronic tablet according to the gesture image and the distance further comprises: when the gesture image is a second gesture image and the distance changes from the first distance to a second distance, determining that the control instruction is an instruction to display an operation icon of the first function, wherein the second distance is smaller than the first distance and the second gesture image is different from the first gesture image.
  • For example, determining, according to the gesture image and the distance, the control instruction of the electronic tablet further comprises: when the distance is maintained within the second distance and the operating body, while holding the second gesture, moves in front of the display unit, determining that the control instruction is an operation instruction for the first function; and the step of controlling the display output of the display unit comprises: acquiring, based on the operation instruction, a movement trajectory of the operating body; and controlling, based on the movement trajectory of the operating body, the display unit to display an operation track of the first function corresponding to that movement trajectory.
  • For example, determining, according to the gesture image and the distance, the control instruction of the electronic tablet further comprises: when the gesture is the second gesture image and the distance changes from the second distance to a third distance, determining that the control instruction is the prompt icon display instruction for the first function, wherein the third distance is greater than the second distance.
  • For example, the first function includes writing;
  • the step of controlling the display unit to display an operation track corresponding to the movement trajectory of the operating body then includes: controlling the display unit to display a writing track corresponding to the movement trajectory of the operating body.
  • For example, the first function includes erasing;
  • the step of controlling the display unit to display an operation track corresponding to the movement trajectory of the operating body then includes: controlling the display unit to display an erase track corresponding to the movement trajectory of the operating body.
  • For example, controlling the display output of the display unit based on the control instruction comprises: displaying the prompt icon on the display unit based on the prompt icon display instruction. The method of controlling the electronic tablet further comprises: obtaining an initial position of the prompt icon on the display unit; acquiring a motion trajectory of the operating body; and controlling, based on the motion trajectory, the prompt icon to move on the display unit.
  • For example, determining, according to the gesture image and the distance, the control instruction of the electronic tablet includes: when the gesture image is a third gesture image and the distance is a fourth distance, determining that the control instruction is a prompt icon display instruction for a second function, the prompt icon of the second function including at least two selection boxes. The method of controlling the electronic tablet then further includes: when the gesture image is the second gesture image and the distance changes from the fourth distance to a fifth distance, the fifth distance being smaller than the fourth distance, obtaining a projection of the operating body on the display unit; determining a positional relationship between the projection and the selection boxes; determining, based on the positional relationship, a selection instruction for selecting one of the at least two selection boxes; and performing the function of the selected box based on the selection instruction.
  • For example, the method further includes: determining whether the gesture image is the second gesture image; controlling the display unit to display a function button when the gesture image is the second gesture image; and when the projection of the operating body on the display unit falls on the function button, editing and controlling, based on the function button, the information displayed on the display unit.
  • the function button includes: an edit button
  • the step of editing and controlling the information displayed on the display unit based on the function button then includes: acquiring the information displayed on the display unit; and
  • performing at least one of the following operations on the information: changing the font, changing the font size, and changing the color of the text.
  • the function button includes: a voice play button
  • the step of editing and controlling the information displayed on the display unit based on the function button then includes: acquiring the information displayed on the display unit; and playing, by voice, the information displayed on the display unit.
  • the electronic tablet further includes a network module; the method for controlling the electronic tablet further includes: transmitting, by the network module, voice play data of the information to other devices.
  • the step of acquiring a gesture image of the operating body includes acquiring an image of the gesture of the operating body using an image acquisition unit.
  • the method further includes: playing the text displayed on the display unit by voice.
  • the method further includes: establishing a connection between the electronic tablet and a network or other terminal.
  • A computer-readable non-volatile storage medium having computer instructions stored thereon which, when executed by a processor, perform the operations of: acquiring a gesture image of an operating body; obtaining a distance between the operating body and the display unit, wherein the distance is greater than zero; determining, according to the gesture image and the distance, a control instruction for controlling the electronic tablet; and controlling, based on the control instruction, the display output of the display unit.
  • An electronic tablet including: a display unit, a processor, a sensor, a controller, and an image acquisition unit. The image acquisition unit is configured to acquire a gesture image of the operating body; the sensor is configured to acquire a distance between the operating body and the display unit, wherein the distance is greater than zero; the processor is configured to determine, based on the gesture image and the distance, a control instruction for controlling the electronic tablet; and the controller is configured to control a display output of the display unit based on the control instruction.
  • a voice player is further included; the voice player is configured to play text displayed on the display unit.
  • For example, a network access module is further included; the network access module is configured to establish a connection between the electronic tablet and a network or other terminal.
  • Embodiments of the present disclosure perform operations such as writing and erasing through gesture and distance control, which is more convenient than a typical touch-based electronic tablet. In addition, embodiments of the present disclosure can voice-broadcast the content input by the user, which is of great value for voice teaching and entertainment.
  • FIG. 1 is a flowchart of a method for controlling an electronic tablet according to an embodiment of the present disclosure
  • FIG. 2A is a flowchart of determining a control instruction for controlling an electronic tablet according to a gesture and a distance according to an embodiment of the present disclosure;
  • FIGS. 2B-2F are schematic diagrams of a first gesture, a second gesture, a prompt icon, and an operation icon according to an embodiment of the present disclosure;
  • FIG. 3 is a flowchart of moving a prompt icon provided by an embodiment of the present disclosure;
  • FIG. 4A is a flowchart of implementing a delete-all function provided by an embodiment of the present disclosure;
  • FIG. 4B is a schematic diagram of a third gesture provided by an embodiment of the present disclosure;
  • FIG. 5 is a flowchart of controlling an electronic tablet by using a function button according to an embodiment of the present disclosure
  • FIG. 6 is a block diagram of a composition of an electronic tablet according to an embodiment of the present disclosure.
  • an embodiment of the present disclosure provides a method 100 of controlling an electronic tablet, wherein the electronic tablet includes a display unit (for example, reference may be made to FIG. 6).
  • The method 100 of controlling an electronic tablet may include: step 101, acquiring a gesture image of the operating body; step 102, acquiring a distance between the operating body and the display unit, wherein the distance is greater than zero; step 103, determining, according to the gesture image and the distance, a control instruction for controlling the electronic tablet; and step 104, controlling display output of the display unit based on the control instruction.
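  • As an illustration of steps 101-104, the following is a minimal Python sketch of such a control loop; the camera, distance sensor, resolver, and display objects are hypothetical stand-ins and are not part of the disclosed hardware interfaces.

```python
# Illustrative sketch only: a control loop corresponding to steps 101-104.
from dataclasses import dataclass

@dataclass
class ControlInstruction:
    kind: str       # e.g. "show_prompt_icon", "show_operation_icon", "operate"
    function: str   # e.g. "writing", "erasing"

def control_loop(camera, distance_sensor, resolver, display):
    """Acquire gesture image and distance, resolve an instruction, update the display."""
    while True:
        gesture_image = camera.capture_gesture()           # step 101
        distance_cm = distance_sensor.read_distance_cm()   # step 102
        if distance_cm is None or distance_cm <= 0:        # the distance must be > 0
            continue
        instruction = resolver.resolve(gesture_image, distance_cm)  # step 103
        if instruction is not None:
            display.apply(instruction)                     # step 104
```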
  • the operating body may be any object capable of making a gesture, such as a human hand, a robot hand, a rubber hand, or the like.
  • the step of acquiring the gesture image of the operating body in step 101 may include acquiring the gesture image of the operating body using the image acquisition unit. For example, an input gesture of an operating body in front of the display unit is acquired by a depth camera to acquire a gesture image.
  • For example, if the display unit is placed horizontally, "in front of" the display unit refers to the area located above the display unit.
  • the operating body can be a hand or a foot or the like.
  • For example, input gestures can be distinguished by different states of the fingers, for example by the number of extended fingers; input gestures can also be distinguished by different states of the same finger.
  • The input gesture can also be distinguished by different states of the entire hand. For example, making a fist and spreading the entire palm represent two different input gestures.
  • For example, the gesture image may be classified by gesture recognition technology (e.g., by identifying whether the image obtained from the input gesture belongs to the first gesture image or the second gesture image); that is, the captured gesture is matched against pre-stored control gestures to determine the control instruction corresponding to the input gesture.
  • the specific form of the input gesture can be judged by computer vision techniques such as two-dimensional hand recognition, two-dimensional gesture recognition, or three-dimensional gesture recognition.
  • the most fundamental difference between the three-dimensional gesture recognition and the two-dimensional gesture recognition is that the input required for the three-dimensional gesture recognition is information containing depth (for example, the image acquisition unit must be a depth camera at this time).
  • For example, hardware implementations of three-dimensional recognition generally fall into three categories: time of flight, structured light, or multi-angle (stereo) imaging.
  • For example, the gesture recognition process for a specific gesture image may include: first capturing the gesture input by the operating body, and then determining whether the gesture in the captured gesture image is one of the pre-stored control gestures.
  • For example, the control gestures may include the first gesture or the third gesture.
  • When the input gesture is recognized as matching one of the control gestures, the corresponding control instruction is determined, and the display unit of the electronic tablet is then controlled to update its display content according to that control instruction.
  • the type of control gesture may correspond to multiple functions of the electronic tablet.
  • For example, three different control gestures can be preset for three functions (for example, the three control gestures shown in FIG. 2B, FIG. 2F, and FIG. 4B); when an input gesture is recognized as one of these three control gestures, that input can be interpreted as the control instruction corresponding to the matched control gesture (for example, a display instruction for the prompt icon of a certain function), as sketched below.
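  • The following Python sketch illustrates this kind of template matching in a highly simplified form; reducing a gesture to a count of extended fingers is an assumption made for illustration, not the recognition method of the disclosure.

```python
# Illustrative sketch only: matching an input gesture against pre-stored control gestures.
from enum import Enum, auto

class ControlGesture(Enum):
    FIRST = auto()   # e.g. FIG. 2B / FIG. 2F: opens a function's prompt icon
    SECOND = auto()  # e.g. FIG. 2D: the operation gesture
    THIRD = auto()   # e.g. FIG. 4B: opens the delete-all prompt

# Hypothetical templates: number of extended fingers associated with each control gesture.
GESTURE_TEMPLATES = {
    ControlGesture.FIRST: 2,
    ControlGesture.SECOND: 1,
    ControlGesture.THIRD: 0,
}

def classify_gesture(extended_fingers: int) -> ControlGesture | None:
    """Return the pre-stored control gesture the input matches, or None if no match."""
    for gesture, fingers in GESTURE_TEMPLATES.items():
        if fingers == extended_fingers:
            return gesture
    return None
```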
  • For example, the control instructions include display-related control instructions, such as an instruction to display a prompt icon, an instruction to display an operation icon, or an instruction to display a text input track.
  • the control command may be further parsed into a positioning-related control command. For example, when a prompt icon related to a certain function is displayed on the display unit, the displayed prompt icon can be moved to complete the positioning operation.
  • the control instruction may be further parsed into an operation-related control instruction. For example, when an operation icon related to a certain function is displayed on the display unit, a text input operation or a delete operation or the like can be completed.
  • the distance between the operating body and the display unit can be obtained by a distance sensor disposed on the electronic tablet.
  • a distance between the operating body and the display unit may also be measured using a third party device such as a cell phone distance sensor or a remote distance measuring sensor.
  • In that case, the electronic tablet also needs to be provided with a communication unit to receive the distance between the display unit and the operating body measured by the third-party device; a sketch of this fallback is given below.
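  • The following Python sketch shows one way such distance acquisition could be organised, preferring an on-board sensor and falling back to measurements pushed by a third-party device over a communication unit; both interfaces are hypothetical placeholders.

```python
# Illustrative sketch only: obtaining the operating-body-to-display-unit distance.
import queue

# Measurements received from a third-party device via the communication unit.
remote_measurements: "queue.Queue[float]" = queue.Queue()

def read_distance_cm(onboard_sensor=None) -> float | None:
    """Prefer the on-board distance sensor; otherwise use the latest remote measurement."""
    if onboard_sensor is not None:
        return onboard_sensor.read_distance_cm()
    try:
        return remote_measurements.get_nowait()
    except queue.Empty:
        return None
```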
  • FIG. 2A is a flowchart of determining a control instruction for controlling an electronic tablet according to a gesture and a distance according to an embodiment of the present disclosure.
  • For example, step 103 of FIG. 1, determining the control instruction for controlling the electronic tablet according to the gesture and the distance, may include step 201: when the gesture image is a first gesture image and the distance is a first distance, determining that the control instruction is a prompt icon display instruction for the first function.
  • the first gesture image belongs to one of pre-set control gestures. For example, when it is detected that the input gesture belongs to the first gesture image, it can be known that the electronic tablet is about to open a control command related to the first function.
  • the first function may include a text input or a partial delete function or the like.
  • the specific shape of the first gesture image may be as shown in FIG. 2B or as shown in FIG. 2F.
  • The first gesture images shown in FIG. 2B and FIG. 2F are for illustrative purposes only, and the present disclosure does not limit the specific form of the first gesture image.
  • a gesture in which only one finger is stretched out may be used as the first gesture image.
  • the prompt icon of the writing function can be as shown in FIG. 2C. That is, when it is detected that the first gesture image is obtained by inputting the first gesture at the first distance, the electronic tablet will perform display of the prompt icon as shown in FIG. 2C on the display unit.
  • the prompt icon shown in FIG. 2C is only used for illustrative purposes.
  • the present disclosure does not limit the display content of the prompt icon, and the prompt icon can be used to represent the basic meaning of the corresponding function.
  • the display content of the partially deleted function prompt icon may be an eraser or multiple erasers, and the display content of the prompt icon for the text input function may be one or more pens.
  • the meaning of displaying the prompt icon on the electronic tablet is to prompt the operator or the user that the electronic tablet is about to open the function corresponding to the prompt icon.
  • The first distance is not a fixed distance value; it may be any value within the range that the detector can detect, and its value may differ between examples. For example, when an operator makes the first gesture at a distance A to obtain the first gesture image, and the detector detects the first gesture at distance A, the distance A at that moment can be regarded as the first distance.
  • the electronic tablet then parses the control command corresponding to the first gesture at distance A into a prompt icon display command for the first function.
  • the first distance may be any value within a range of less than 56 cm. That is to say, as long as the operating body makes a first gesture at a certain distance within the detection range of the detector, this distance is the so-called first distance.
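  • As an illustration, the following Python sketch treats any positive distance inside the example 56 cm detection range as "a first distance" and resolves the first gesture into a prompt-icon display instruction; the threshold and the returned dictionary are illustrative assumptions.

```python
# Illustrative sketch only: resolving "first gesture at a first distance".
DETECTION_RANGE_CM = 56.0  # example detection range mentioned above

def resolve_first_gesture(distance_cm: float) -> dict | None:
    """Any distance inside the detector's range counts as 'a first distance'."""
    if 0 < distance_cm < DETECTION_RANGE_CM:
        return {"instruction": "show_prompt_icon", "function": "first_function"}
    return None
```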
  • For example, step 103 of FIG. 1 may further include step 202: when the gesture image is a second gesture image and the distance changes from the first distance to a second distance, determining that the control instruction is an instruction to display the operation icon of the first function, wherein the second distance is smaller than the first distance and the second gesture image is different from the first gesture image.
  • the specific form of the second gesture image may be as shown in FIG. 2D.
  • the second gesture image (ie, the operation gesture) shown in FIG. 2D is only used to explain the technical solution, and the present disclosure does not limit the specific form of the second gesture image.
  • the control instruction is an instruction to display the operation icon.
  • The so-called operation gesture can be understood as the gesture with which specific input operations, deletion operations, and the like are performed.
  • different operational gestures can be set separately for different operations of the electronic tablet.
  • Alternatively, only one common operation gesture (e.g., the second gesture of an embodiment of the present disclosure) is set for the plurality of functions that the electronic tablet has.
  • Although only the second gesture is used here as the operation gesture shared by functions such as the writing function, the erasing function, and the key function of the electronic tablet, this does not limit the scope of protection; a fourth gesture, a fifth gesture, or the like (where "fourth" and "fifth" merely indicate gestures different from the first, second, or third gestures) may also be employed as the operation gesture corresponding to each function.
  • For example, an operation icon is schematically provided in FIG. 2E; the operation icon may belong to the first function, so that FIG. 2C and FIG. 2E respectively correspond to the prompt icon and the operation icon of the same function (for example, the writing function).
  • For example, the process of switching between the prompt icon of FIG. 2C and the operation icon of FIG. 2E on the display unit may be as follows: when it is detected that the first gesture is input at the first distance to obtain the first gesture image, the electronic tablet displays the prompt icon shown in FIG. 2C on the display unit; the input gesture and the distance then continue to be detected, and when it is detected that the second gesture is input at the second distance to obtain the second gesture image, the electronic tablet displays the operation icon shown in FIG. 2E on the display unit.
  • the operation icons shown in FIG. 2E are for illustrative purposes only, and the disclosure does not limit the display content of the operation icons.
  • The meaning of displaying the operation icon on the electronic tablet is to prompt the operator or user that the electronic tablet is about to start the operation corresponding to the operation icon. For example, when the operation icon of FIG. 2E is displayed on the display unit, the user is prompted that the text input operation can start; when an eraser operation icon is displayed on the display unit, the user is prompted that a partial delete operation can start.
  • the second distance can be a fixed value.
  • the fixed value can be 20 cm.
  • the second distance can also be a range.
  • the possible value of the second distance must be smaller than the possible value of the first distance, and the second distance is within the detection range of the detector.
  • For example, step 103 of FIG. 1 may further include step 203: when the distance is maintained within the second distance and the operating body, while holding the second gesture, moves in front of the display unit, determining that the control instruction is an operation instruction for the first function.
  • Correspondingly, the step of controlling the display output of the display unit in step 104 may further include: acquiring, based on the operation instruction, the movement trajectory of the operating body; and controlling, based on that movement trajectory, the display unit to display an operation track of the first function corresponding to the movement trajectory of the operating body.
  • For example, when the first function is the writing function, controlling the display unit to display the operation track corresponding to the movement trajectory of the operating body may include: controlling the display unit to display a writing track corresponding to the movement trajectory of the operating body.
  • For example, when the first function is the erasing function, controlling the display unit to display the operation track corresponding to the movement trajectory of the operating body includes: controlling the display unit to display an erase track corresponding to the movement trajectory of the operating body. That is to say, when the first function is the writing function, the operation track is a writing track; when the first function is the erasing function, the operation track is an erase track.
  • the movement trajectory of the corresponding operating body is a movement process of the hand when the user or the operator inputs each character or letter.
  • the movement track of the corresponding operation body is a movement process in which the user or the operator erases part of the display content on the display unit.
  • controlling the display unit to display the operation track means that the input track of each character is displayed on the display unit or the deletion track is displayed.
  • the movement trajectory of the operating body can be acquired by the depth camera, and the operation trajectory corresponding to the movement trajectory on the display unit can be identified by a computer three-dimensional gesture recognition technology.
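  • The following Python sketch illustrates mapping a three-dimensional movement trajectory (such as one captured by a depth camera) onto a two-dimensional operation track on the display unit; the linear camera-to-screen projection and the camera resolution are assumptions made for illustration.

```python
# Illustrative sketch only: projecting (x, y, depth) samples onto screen pixels.
from typing import Iterable

def to_operation_track(trajectory: Iterable[tuple[float, float, float]],
                       screen_w: int, screen_h: int,
                       cam_w: float = 640.0, cam_h: float = 480.0) -> list[tuple[int, int]]:
    """Map each trajectory sample to a pixel position; depth is ignored in this sketch."""
    track = []
    for x, y, _depth in trajectory:
        px = int(x / cam_w * screen_w)
        py = int(y / cam_h * screen_h)
        track.append((px, py))
    return track

# For the writing function the returned points would be drawn as a stroke;
# for the erasing function the same points would define the region to clear.
```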
  • the input text can be a Chinese character.
  • the process of recognizing the Chinese character can use the writing stroke as the recognition and display unit, that is, the individual input strokes are recognized and the recognized individual strokes are displayed one by one on the display unit in the writing order.
  • a complete Chinese character can also be used as the identification display unit, and the recognized Chinese character is displayed on the display unit only after the system recognizes a complete Chinese character.
  • the input text can also be in English or Japanese.
  • For example, step 103 of FIG. 1 may further include step 204: when the gesture image is the second gesture image and the distance changes from the second distance to a third distance, determining that the control instruction is the prompt icon display instruction for the first function, wherein the third distance is greater than the second distance.
  • the prompt icon for the corresponding function is displayed on the display unit.
  • For example, when the distance changes from the second distance to the third distance (that is, as soon as the operating body is detected moving away from the display unit), the control instruction may be determined to be the prompt icon display instruction for the first function.
  • Alternatively, when the distance changes from the second distance to the third distance, the third distance must fall within a certain preset distance range (that is, the operating body must be detected moving away from the display unit until it reaches that preset range) before the control instruction is determined to be the prompt icon display instruction for the first function.
  • the third distance can be a fixed value.
  • the fixed value can be 35 cm.
  • the possible value of the third distance must be greater than the possible value of the second distance.
  • the magnitude relationship between the possible values of the third distance and the possible values of the first distance is uncertain as long as both the first distance and the third distance are within the detection range of the detector.
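  • The distance-driven transitions described above can be summarised as a small state machine. The following Python sketch uses the example thresholds given in this disclosure (a 20 cm second distance, a 35 cm third distance, and a 56 cm detection range); the exact values and the string-based gesture labels are illustrative assumptions.

```python
# Illustrative sketch only: prompt-icon / operation-icon state transitions.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    PROMPT_ICON = auto()     # prompt icon of the first function is shown
    OPERATION_ICON = auto()  # operation icon is shown; writing/erasing is active

SECOND_DISTANCE_CM = 20.0
THIRD_DISTANCE_CM = 35.0
DETECTION_RANGE_CM = 56.0

def next_state(state: State, gesture: str, distance_cm: float) -> State:
    """Advance the state from one (gesture, distance) observation."""
    if distance_cm <= 0 or distance_cm > DETECTION_RANGE_CM:
        return state
    if gesture == "first" and state is State.IDLE:
        return State.PROMPT_ICON                 # first gesture at a first distance
    if gesture == "second" and state is State.PROMPT_ICON and distance_cm <= SECOND_DISTANCE_CM:
        return State.OPERATION_ICON              # moved closer: show the operation icon
    if gesture == "second" and state is State.OPERATION_ICON and distance_cm >= THIRD_DISTANCE_CM:
        return State.PROMPT_ICON                 # moved away: back to the prompt icon
    return state
```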
  • FIG. 3 is a flowchart of controlling a prompt icon to move on the display unit according to an embodiment of the present disclosure.
  • For example, the method 300 of controlling the electronic writing board may further include: step 302, acquiring an initial position of the prompt icon on the display unit; step 303, acquiring a motion trajectory of the operating body; and step 304, controlling, based on the motion trajectory, the prompt icon to move on the display unit.
  • the motion trajectory of the operating body is acquired by the image acquisition unit.
  • For example, the area through which the prompt icon moves on the display unit may be used as the deletion area, or the position reached after the prompt icon is moved may be used as the starting area for writing.
  • the operation area of the electronic tablet can be conveniently selected by the operator or the user by controlling the movement of the prompt icon (for example, selecting a position to start inputting a text or selecting a selected area for executing a delete instruction, etc.).
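  • The following Python sketch illustrates steps 302-304, moving the prompt icon by accumulating per-frame displacements of the operating body; the relative-displacement interface and the scale factor are assumptions made for illustration.

```python
# Illustrative sketch only: moving the prompt icon along the operating body's trajectory.
def move_prompt_icon(initial_pos: tuple[float, float],
                     displacements: list[tuple[float, float]],
                     scale: float = 1.0) -> list[tuple[float, float]]:
    """Return successive icon positions given per-frame hand displacements."""
    x, y = initial_pos                      # step 302: initial position of the prompt icon
    positions = [(x, y)]
    for dx, dy in displacements:            # step 303: motion trajectory of the operating body
        x += dx * scale
        y += dy * scale
        positions.append((x, y))            # step 304: the icon follows the trajectory
    return positions                        # the final entry is e.g. the chosen writing area
```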
  • For example, FIG. 2B is the control gesture (i.e., the first gesture) preset for the writing function of the electronic tablet. When it is detected that the user makes the first gesture shown in FIG. 2B, the writing-function prompt icon shown in FIG. 2C is displayed on the display unit. The user's input gesture is then continuously monitored; when it is detected that the user's gesture becomes the second gesture shown in FIG. 2D, it is determined that a writing operation is about to be performed, and the micro control unit (MCU, for example a single-chip microcomputer) monitors the change of the user's index finger position in real time through the depth camera (i.e., detects the distance between the operating body and the display unit).
  • When the index finger moves closer to the display unit (i.e., the distance changes from the first distance to the second distance), the icon on the display unit changes from FIG. 2C to FIG. 2E.
  • The user's input action is then displayed on the display unit, just as if the user were writing on the electronic tablet with a pen.
  • When the finger moves away from the display unit again, the icon on the display unit changes from the state of FIG. 2E back to the state of FIG. 2C, and the display unit displays FIG. 2C.
  • the user can move the finger to make the prompt icon on the display unit move correspondingly, so that the user can select the writing area.
  • the user can perform a writing operation on the electronic writing board like a real pen writing.
  • For example, FIG. 2F is the control gesture (i.e., the first gesture) preset for the partial-deletion function of the electronic tablet.
  • To use the partial-deletion function, the user needs to make the gesture shown in FIG. 2F in front of the display unit.
  • a prompt icon with an "Eraser” pattern will be displayed on the display unit.
  • When the user's input gesture changes to the second gesture and the user's index finger is closer to the display unit than at the previous moment (i.e., the first distance changes to the second distance), the "eraser" prompt icon on the display unit becomes the "hand holding an eraser" operation icon.
  • At this point the user can move the finger to delete content on the electronic display panel.
  • When the finger moves away from the display unit again, the "hand holding an eraser" operation icon on the display unit becomes the "eraser" prompt icon; at this time the user can move the finger (still making the second gesture) so that the "eraser" prompt icon on the display unit moves to the position the user designates, as needed.
  • FIG. 4A is a flowchart of determining a control instruction for controlling an electronic tablet according to a gesture and a distance according to an embodiment of the present disclosure.
  • For example, step 103, determining the control instruction for controlling the electronic tablet according to the gesture and the distance, may further include a method 400.
  • The method 400 may include: step 401, when the gesture is a third gesture and the distance is a fourth distance, determining that the control instruction is a prompt icon display instruction for the second function, the prompt icon of the second function including at least two selection boxes; step 402, when the gesture image is the second gesture image and the distance changes from the fourth distance to a fifth distance, the fifth distance being smaller than the fourth distance, obtaining a projection of the operating body on the display unit; step 403, determining a positional relationship between the projection and the selection boxes; step 404, determining, based on the positional relationship, a selection instruction for selecting one of the at least two selection boxes; and step 405, performing the function of the selected box based on the selection instruction.
  • the third gesture can be as shown in FIG. 4B (ie, the fist mode).
  • the third gesture shown in FIG. 4B is for illustrative purposes only, and the present disclosure does not limit the specific form of the third gesture.
  • a gesture in which only five fingers are extended may be used as the third gesture.
  • the first gesture, the second gesture, and the third gesture can be distinguished from each other.
  • the contents of the two selection boxes can be a determination box and a cancel box, respectively.
  • For example, when the projection of the operating body on the display unit falls within the confirmation box, the electronic tablet performs the clear-all operation; when the projection of the operating body on the display unit falls within the cancel box, the electronic tablet does not perform the delete-all operation. A hit-test of this kind is sketched below.
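  • The following Python sketch illustrates steps 402-405 with a simple rectangle hit-test; the box coordinates and the callback are hypothetical example values, not part of the disclosure.

```python
# Illustrative sketch only: choosing between the confirm and cancel selection boxes.
BOXES = {
    "confirm": (100, 300, 260, 360),   # (left, top, right, bottom) in display pixels
    "cancel":  (380, 300, 540, 360),
}

def hit_test(projection_xy: tuple[int, int]) -> str | None:
    """Return the name of the selection box containing the projection, if any."""
    px, py = projection_xy
    for name, (left, top, right, bottom) in BOXES.items():
        if left <= px <= right and top <= py <= bottom:
            return name
    return None

def handle_selection(projection_xy: tuple[int, int], clear_all) -> None:
    """Perform the delete-all operation only if the projection falls in the confirm box."""
    if hit_test(projection_xy) == "confirm":
        clear_all()
    # a hit on "cancel" (or no hit at all) leaves the displayed content unchanged
```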
  • the fourth distance is not a fixed distance value, and the fourth distance may be within a range that the detector can detect, and the specific values of the fourth distance may be different in different examples.
  • For example, when an operator makes the third gesture at a distance B and the detector detects it, the distance B at that point can be considered the fourth distance.
  • the electronic tablet then parses the control command corresponding to the third gesture at distance B into a prompt icon display command for the second function.
  • the fourth distance can be a fixed value, which can be 35 cm.
  • the possible value of the fourth distance must be greater than the possible value of the fifth distance. Both the fourth distance and the fifth distance must be within the detection range of the detector.
  • For example, the display unit will display a prompt box asking whether to delete all.
  • The user selects the "Yes" or "No" selection box using the gesture shown in FIG. 4B; after one of the "Yes"/"No" boxes is in a "highlighted" state, moving the finger closer to the display unit than at the previous moment (i.e., changing the distance from the fourth distance to the fifth distance) confirms the selection, and the corresponding operation is performed.
  • For example, the method 100 of controlling the electronic tablet may further include a control step 500, which may include: step 501, determining whether the gesture image is the second gesture image; step 502, when the gesture image is the second gesture image, controlling the display unit to display a function button; and step 503, when the projection of the operating body on the display unit falls on the function button, editing and controlling, based on the function button, the information displayed on the display unit.
  • the form of the second gesture image can be as shown in FIG. 2D.
  • For example, the function button can include: an edit button.
  • The step of editing and controlling the information displayed on the display unit based on the function button in step 503 may include: acquiring the information displayed on the display unit; and performing at least one of the following operations on the information: changing the font, changing the font size, and changing the color of the text. A sketch of these operations is given below.
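  • The following Python sketch shows the edit-button operations applied to a displayed text item; the TextItem structure is a hypothetical stand-in for the display buffer, not a data structure defined by the disclosure.

```python
# Illustrative sketch only: applying the edit-button operations to displayed text.
from dataclasses import dataclass

@dataclass
class TextItem:
    text: str
    font: str = "sans-serif"
    size: int = 24
    color: str = "#000000"

def apply_edit(item: TextItem, *, font: str | None = None,
               size: int | None = None, color: str | None = None) -> TextItem:
    """Change the font, the font size, and/or the text color of a displayed item."""
    if font is not None:
        item.font = font
    if size is not None:
        item.size = size
    if color is not None:
        item.color = color
    return item
```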
  • the "Edit Button" button is displayed in the upper right corner of the display unit.
  • the function button may further include: a voice play button.
  • The step 503 of editing and controlling the information displayed on the display unit based on the function button then includes: acquiring the information displayed on the display unit; and playing, by voice, the information displayed on the display unit.
  • the voice play button can be an illustrative icon such as a speaker.
  • voice playback is initiated by clicking on the speaker icon in the upper left corner of the display unit, thereby performing voice announcement on the content that the user has written.
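  • As an illustration of such voice broadcasting, the following Python sketch reads the displayed text aloud; pyttsx3 is used here only as one possible offline text-to-speech engine, since the disclosure does not specify how the voice player is implemented.

```python
# Illustrative sketch only: voice-broadcasting the text shown on the display unit.
import pyttsx3

def broadcast_display_text(display_text: str) -> None:
    """Read the given display text aloud through a text-to-speech engine."""
    engine = pyttsx3.init()
    engine.say(display_text)
    engine.runAndWait()
```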
  • the electronic tablet further includes a network module.
  • the method for controlling the electronic tablet may further include: transmitting the voice play data of the information to the other device through the network module.
  • An embodiment of the present disclosure further provides a computer-readable non-volatile storage medium having computer instructions stored thereon which, when executed by a processor, perform the operations of: acquiring an input gesture of an operating body in front of the display unit of the electronic tablet; obtaining a distance between the operating body and the display unit, wherein the distance is greater than zero; determining, according to the gesture and the distance, a control instruction for controlling the electronic tablet; and controlling, based on the control instruction, the display output of the display unit.
  • FIG. 6 is a block diagram showing the composition of an electronic tablet 600 according to an embodiment of the present disclosure.
  • the figure provides an electronic tablet 600.
  • The electronic tablet 600 as shown includes an image acquisition unit 601, a processor 602, a sensor 603, a controller 604, and a display unit 605.
  • the image acquisition unit 601 is configured to acquire an input gesture of an operating body in front of the display unit of the electronic tablet; the sensor 603 is configured to acquire a distance between the operating body and the display unit; wherein the distance is greater than zero;
  • the processor 602 is configured to determine a control instruction to control the electronic tablet based on the gesture and the distance; and the controller 604 is configured to control a display output of the display unit 605 based on the control instruction.
  • the display unit 605 can be a liquid crystal display unit, or an IPS display unit or the like.
  • the electronic tablet 600 can also include a voice player 606.
  • the voice player 606 is configured to play the text displayed on the display unit.
  • the electronic tablet 600 may further include a network access module 607.
  • The network access module 607 is configured to establish a connection between the electronic tablet and a network or other terminal.
  • For specific implementations of the processor 602, the controller 604, and the like, details are not repeated herein.
  • the various embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof.
  • For example, the embodiments described herein may be implemented by using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, or electronic units designed to perform the functions described herein; in some cases, such an embodiment may be implemented in a processor unit.
  • implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
  • The software code can be implemented by a software application (or program) written in any suitable programming language, which can be stored in a memory and executed by a controller or processor.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of the unit is only a logical function division.
  • there may be another division manner for example, multiple units or components may be combined or Can be integrated into another device, or some features can be ignored or not executed.
  • the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • The functions may be stored in a computer-readable storage medium if implemented in the form of a software functional unit and sold or used as a standalone product. Based on such an understanding, the part of the technical solution of the present disclosure that in essence contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the methods described in the various embodiments of the present disclosure.
  • the foregoing storage medium includes various media that can store program codes, such as a USB flash drive, a mobile hard disk, a read only memory, a random access memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are an electronic tablet (600), a control method (100) for an electronic tablet (600), and a storage medium. The electronic tablet (600) comprises a display unit (605). The method (100) comprises the steps of: acquiring a gesture image of an operating body (101); acquiring the distance between the operating body and the display unit (102); determining, according to the gesture image and the distance, a control instruction for controlling the electronic tablet (103); and controlling, based on the control instruction, a display output of the display unit (104).
PCT/CN2018/090767 2017-09-29 2018-06-12 Electronic tablet, control method therefor, and storage medium WO2019062205A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710908894.8 2017-09-29
CN201710908894.8A CN109582201A (zh) 2017-09-29 2017-09-29 电子写字板及其控制方法、存储介质

Publications (1)

Publication Number Publication Date
WO2019062205A1 true WO2019062205A1 (fr) 2019-04-04

Family

ID=65900605

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/090767 WO2019062205A1 (fr) 2017-09-29 2018-06-12 Electronic tablet, control method therefor, and storage medium

Country Status (2)

Country Link
CN (1) CN109582201A (fr)
WO (1) WO2019062205A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114415830A (zh) * 2021-12-31 2022-04-29 科大讯飞股份有限公司 隔空输入方法及设备、计算机可读存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007097548A1 (fr) * 2006-02-20 2007-08-30 Cheol Woo Kim Procédé et appareil destinés à la mise en oeuvre d'une interface-utilisateur à commande par reconnaissance des gestes de la main
CN103093196A (zh) * 2013-01-14 2013-05-08 大连理工大学 一种基于手势的汉字交互输入与识别方法
CN103425238A (zh) * 2012-05-21 2013-12-04 刘鸿达 以手势为输入的控制系统云端系统
CN104951083A (zh) * 2015-07-21 2015-09-30 石狮市智诚通讯器材贸易有限公司 一种远距离手势输入法及输入系统
CN105573582A (zh) * 2015-12-14 2016-05-11 魅族科技(中国)有限公司 一种显示方法以及终端

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9104239B2 (en) * 2011-03-09 2015-08-11 Lg Electronics Inc. Display device and method for controlling gesture functions using different depth ranges
CN106339135A (zh) * 2016-08-30 2017-01-18 科盟(福州)电子科技有限公司 一种支持多人独立操作的红外电子白板a/b分屏方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007097548A1 (fr) * 2006-02-20 2007-08-30 Cheol Woo Kim Procédé et appareil destinés à la mise en oeuvre d'une interface-utilisateur à commande par reconnaissance des gestes de la main
CN103425238A (zh) * 2012-05-21 2013-12-04 刘鸿达 以手势为输入的控制系统云端系统
CN103093196A (zh) * 2013-01-14 2013-05-08 大连理工大学 一种基于手势的汉字交互输入与识别方法
CN104951083A (zh) * 2015-07-21 2015-09-30 石狮市智诚通讯器材贸易有限公司 一种远距离手势输入法及输入系统
CN105573582A (zh) * 2015-12-14 2016-05-11 魅族科技(中国)有限公司 一种显示方法以及终端

Also Published As

Publication number Publication date
CN109582201A (zh) 2019-04-05

Similar Documents

Publication Publication Date Title
US20200257373A1 (en) Terminal and method for controlling the same based on spatial interaction
US9665276B2 (en) Character deletion during keyboard gesture
US20190278376A1 (en) System and method for close-range movement tracking
CN105144037B (zh) 用于输入字符的设备、方法和图形用户界面
US9910498B2 (en) System and method for close-range movement tracking
US9030498B2 (en) Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
JP6233314B2 (ja) 情報処理装置、情報処理方法およびコンピュータ読み取り可能な記録媒体
JP2019516189A (ja) タッチスクリーントラック認識方法及び装置
US20140210797A1 (en) Dynamic stylus palette
CN107850978A (zh) 用于在文档编辑中提供手写支持的设备和方法
KR102186393B1 (ko) 입력 처리 방법 및 그 전자 장치
US20140354553A1 (en) Automatically switching touch input modes
JP2013037675A5 (fr)
JP2010108273A (ja) 情報処理装置、情報処理方法およびプログラム
JP2007317159A (ja) 電子装置の入力装置およびその入力方法
JP2012168619A (ja) タッチ描画表示装置及びその操作方法
JP2013540330A (ja) ディスプレイでジェスチャを認識する方法及びその装置
JP6359862B2 (ja) タッチ操作入力装置、タッチ操作入力方法及びプログラム
JP2013089037A (ja) 描画装置、描画制御方法、及び描画制御プログラム
US8378980B2 (en) Input method using a touchscreen of an electronic device
TWI485616B (zh) 記錄軌跡的方法及電子裝置
WO2019062205A1 (fr) Tablette électronique et son procédé de commande, et support de stockage
WO2022218352A1 (fr) Procédé et appareil de fonctionnement tactile
CN106371644B (zh) 一种在屏幕上多人同时书写的方法和装置
JP5852876B2 (ja) 表示システムおよび表示プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18861764

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 7.09.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18861764

Country of ref document: EP

Kind code of ref document: A1