WO2022257870A1 - Method for displaying a virtual ruler and related device - Google Patents


Info

Publication number
WO2022257870A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
touch screen
contact
virtual
hand
Prior art date
Application number
PCT/CN2022/097085
Other languages
English (en)
Chinese (zh)
Inventor
叶枫 (Ye Feng)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022257870A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • the present application relates to the field of terminal software, in particular to a method for displaying a virtual scale and related equipment.
  • An interactive touch display device, such as an interactive electronic whiteboard, is usually equipped with a stylus to facilitate users writing on the screen.
  • Electronic whiteboards are generally suitable for multi-person discussion scenarios and play a role similar to blackboards.
  • In conference rooms or teaching scenarios, there is sometimes a need for drawing, table drawing, and pattern measurement, and a virtual ruler needs to be called up through specific operations.
  • The existing gestures for triggering the virtual ruler are relatively complicated; the operation is difficult and costly and does not match users' operating habits.
  • To address this, the present application detects a contact gesture with the touch screen and, based on the contact gesture indicating that the side of the user's hand is in contact with the touch screen, displays a virtual ruler on the touch screen according to the contact area of the contact gesture.
  • When simulating holding a physical ruler, the contact area between the side of the hand and the touch screen is a flat strip-shaped area, close to a straight line. Triggering the display of the virtual ruler based on this operation mode is therefore more in line with the user's operating habits.
  • An embodiment of the present application provides a method for displaying a virtual ruler, which is applied to an electronic device, the electronic device includes a touch screen, and the method includes:
  • The touch screen may include a touch-sensitive surface, i.e. a sensor or sensor group that receives input from the user based on tactile contact; "detecting a contact gesture with the touch screen" may be understood as detecting a contact gesture with the touch-sensitive surface of the touch screen;
  • based on the contact gesture indicating that the side of the user's hand is in contact with the touch screen, displaying a virtual ruler on the touch screen according to the contact area of the contact gesture, wherein the contact area of the contact gesture is a strip-shaped area and the virtual ruler is attached to the long side of the strip-shaped area;
  • the contact area can be understood as a detectable area in contact with the touch-sensitive surface, which may be a continuous area or a discrete area composed of dense touch points (for example, a long strip-shaped area formed by a group of densely distributed contact points);
  • "attached to the long side of the strip-shaped area" means that, in display position, the virtual ruler must be very close to the strip-shaped area, and, in direction, the virtual ruler must be parallel or nearly parallel to the long side of the strip-shaped area;
  • In this way, the display of the virtual ruler is triggered. Because the contact area between the side of the hand and the touch screen is a flat strip-shaped area close to a straight line, triggering the virtual ruler through this operation is more in line with the user's operating habits.
  • the side of the hand is the side of the user's hand that is located on the side of the little finger in an unfolded state.
  • the side of the hand in the embodiment of the present application can be understood as the side of the user's hand located on the side of the little finger in the unfolded state (for example, including at least one of the surface of the hypothenar part of the user's hand or the surface of the little finger).
  • the unfolded state of the hand may be a state where the user's fingers (or only the little finger) are on the same plane (or close to the same plane) as the palm.
  • the contact area includes: a contact area between a hypothenar part on the side of the user's hand and the touch screen; or, a contact area between a little finger on the side of the user's hand and the touch screen.
  • the side of the hand may include the area of the hypothenar area and the area of the little finger.
  • When the side of the user's hand is in contact with the touch screen, at least one of the following contact situations may occur depending on the posture of the hand or the characteristics of the user's hand shape:
  • the area of the hypothenar is in contact with the touch screen
  • the area of the little finger is in contact with the touch screen
  • The method further includes: based on the contact area of the contact gesture meeting a preset condition, determining that the contact gesture indicates that the side of the user's hand is in contact with the touch screen; the preset condition includes: the contact area is a strip-shaped area, and the shape and size of the strip-shaped area conform to the characteristics of the contact area when the hypothenar part and/or the little finger on the side of the user's hand is in contact with the touch screen.
  • the shape can be understood as the outline feature of the boundary line of the contact area.
  • If the outline of the boundary of the contact area is flat and long, the shape of the contact area can be considered to conform to the shape characteristics of the contact area when the hypothenar part and/or the little finger on the side of the user's hand is in contact with the touch screen.
  • The shape can also be understood as the distribution characteristics of the contact points included in the contact area, and whether the above shape characteristics are satisfied can be judged by detecting the distribution shape of the touch points (for example, if a group of densely distributed, elongated touch points is detected, the shape of the contact area can be considered to conform to these characteristics);
  • if the area of the contact region is within a preset range (the maximum of this range cannot be too large and the minimum cannot be too small; both can be set based on the characteristics of the side of a human hand), the size of the contact area can be considered to conform to the size characteristics of the contact area when the hypothenar part and/or the little finger on the side of the user's hand is in contact with the touch screen.
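To make the preset condition concrete, the strip-shape test can be sketched as a principal-axis fit over the reported touch points, thresholding the extent along the long axis, the width along the short axis, and their ratio. This is only an illustrative sketch; the function name and all threshold values are our own assumptions, not taken from the application.

```python
import math

def is_strip_contact(points, min_len=40.0, max_width=20.0, min_aspect=3.0):
    """Heuristic: do these touch points form the flat, elongated strip
    typical of the side of a hand? Thresholds are illustrative only."""
    n = len(points)
    if n < 3:
        return False
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # 2x2 covariance of the point cloud
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # eigenvalues give the variance along the long and short axes
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc
    length = 4 * math.sqrt(lam1)            # approximate full extent
    width = 4 * math.sqrt(max(lam2, 0.0))
    if width < 1e-6:
        return length >= min_len            # degenerate: perfectly straight
    return length >= min_len and width <= max_width and length / width >= min_aspect
```

A learned classifier could replace this hand-written rule while keeping the same interface.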
  • In one possible implementation, the method further includes: acquiring gesture data of the contact gesture; and determining, according to the gesture data and through a neural network, that the contact gesture indicates that the side of the user's hand is in contact with the touch screen.
  • the virtual ruler is attached to the long side of the strip-shaped region, including:
  • the acute angle between the direction of the virtual ruler and the direction indicated by the long side of the strip-shaped area is smaller than a preset value, and the preset value can be 1 degree, 2 degrees, 3 degrees, 4 degrees, 5 degrees, 10 degrees, etc.; and the virtual ruler and the strip-shaped area meet one of the following conditions:
  • the preset distance value can be 1 cm, 2 cm, 3 cm, 4 cm, 5 cm, 10 cm, 15 cm, etc.
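The two attachment conditions (near-parallel direction and small gap) can be checked with a few lines of geometry; the names and default values below are assumptions for illustration, mirroring the example thresholds above.

```python
import math

def is_attached(ruler_angle_deg, edge_angle_deg, gap,
                max_angle_deg=5.0, max_gap=1.0):
    """True if the acute angle between the ruler direction and the strip's
    long side is under max_angle_deg AND the ruler-to-strip gap (e.g. in
    cm) is under max_gap."""
    diff = abs(ruler_angle_deg - edge_angle_deg) % 180.0
    acute = min(diff, 180.0 - diff)   # directions are unoriented
    return acute < max_angle_deg and gap < max_gap
```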
  • the method before displaying the virtual ruler on the touch screen according to the contact area of the contact gesture, the method further includes: detecting that the contact gesture is maintained for a time longer than a preset time.
  • The maintenance time of the contact gesture can start counting when the contact gesture is detected, or after the contact intensity between the contact gesture and the touch screen is detected to be greater than a certain contact intensity threshold (a threshold greater than 0); the timing ends when the contact intensity of the contact gesture is detected to be 0, or when the contact intensity of the contact gesture is detected to be less than a certain contact intensity threshold (a threshold greater than 0).
  • The maintenance time of the contact gesture may be the time during which the contact gesture remains static (or moves less than a certain range); for example, it may be the time during which the pressing gesture is at rest (or moves less than a certain range).
  • The above static state may be understood as a static state of the contact area of the touch gesture;
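One way to realize the hold-time check described above is a small dwell detector fed with timestamped contact-intensity samples; the class name, the thresholds, and the choice of when timing starts are illustrative assumptions.

```python
class DwellDetector:
    """Trigger only after a contact gesture has been maintained for
    hold_s seconds. Timing starts once the contact intensity exceeds
    min_intensity (one of the options described above)."""
    def __init__(self, hold_s=0.5, min_intensity=0.1):
        self.hold_s = hold_s
        self.min_intensity = min_intensity
        self.start = None

    def update(self, t, intensity):
        """Feed one timestamped sample; return True once held long enough."""
        if intensity > self.min_intensity:
            if self.start is None:
                self.start = t          # contact began
            return (t - self.start) >= self.hold_s
        self.start = None               # lifted or too light: reset timer
        return False
```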
  • the method further includes: detecting movement of a contact area of the contact gesture; and adjusting a display position of the virtual ruler so that the display position of the virtual ruler follows the contact area.
  • the so-called movement can be understood as a change in position and/or a change in direction
  • "Following" can be understood as keeping the display position of the virtual ruler always fitted to the contact area of the touch gesture: as the contact area of the touch gesture moves, the display position and direction of the virtual ruler change accordingly, and the display area of the virtual ruler always stays attached to the long side of the contact area.
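The following behaviour reduces to recomputing the ruler pose from the strip's centroid and long-axis direction on every touch frame. This geometric sketch uses assumed names and an assumed fixed offset along the strip's normal.

```python
import math

def follow_contact(centroid, angle_deg, ruler_len=200.0, offset=2.0):
    """Pose that keeps the ruler fitted to the moving contact area:
    same direction as the strip's long side, placed `offset` units from
    the strip centroid along the strip's normal."""
    rad = math.radians(angle_deg)
    nx, ny = -math.sin(rad), math.cos(rad)   # unit normal to the strip
    return {
        "center": (centroid[0] + offset * nx, centroid[1] + offset * ny),
        "angle_deg": angle_deg,              # ruler stays parallel to strip
        "length": ruler_len,
    }
```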
  • the method further includes: detecting a hand-raising gesture of the user; in response to the hand-raising gesture, hiding the display of the virtual ruler on the touch screen.
  • the gesture of raising the hand can be understood as the side of the hand leaving the touch screen, or the contact intensity between the side of the hand and the touch screen is less than a certain threshold, or the contact area between the side of the hand and the touch screen is smaller than a preset value.
  • Instructing that the side of the user's hand contacts the touch screen based on the contact gesture includes: instructing, based on the contact gesture, that the side of the user's hand performs a preset number of taps on the touch screen.
  • the method further includes: detecting a hand-raising gesture of the user; in response to the hand-raising gesture, maintaining the display of the virtual ruler on the touch screen.
  • the virtual ruler still needs to be used after the user raises his hand, so it is necessary to trigger the fixed display of the virtual ruler on the touch screen based on a certain gesture
  • The preset number of taps needs to occur within a preset time, where the preset time is short (for example, 0.1 s, 0.2 s, 0.3 s, 0.4 s, 0.5 s, etc.);
  • tapping positions of the preset number of taps on the touch screen need to be kept consistent, or slight deviations are allowed.
  • the preset number of times can be 2 times, 3 times, etc.
  • the virtual ruler can be fixedly displayed on the touch screen.
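The tap-to-pin gesture (preset count, short window, nearly consistent positions) can be sketched as a small detector; the numbers are the example values from the text and the class name is our own.

```python
class TapPinDetector:
    """Detect `taps` side-of-hand taps within `window_s` seconds whose
    positions stay within `max_drift` of the first tap."""
    def __init__(self, taps=2, window_s=0.5, max_drift=15.0):
        self.taps, self.window_s, self.max_drift = taps, window_s, max_drift
        self.history = []                    # (time, (x, y)) of recent taps

    def on_tap(self, t, pos):
        self.history.append((t, pos))
        # drop taps that fell out of the time window
        self.history = [(tt, pp) for tt, pp in self.history
                        if t - tt <= self.window_s]
        if len(self.history) < self.taps:
            return False
        x0, y0 = self.history[0][1]          # slight deviations are allowed
        ok = all(abs(x - x0) <= self.max_drift and abs(y - y0) <= self.max_drift
                 for _, (x, y) in self.history)
        if ok:
            self.history.clear()             # gesture consumed: pin the ruler
        return ok
```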
  • the method further includes: detecting a user's selection instruction for the virtual scale; in response to the selection instruction, displaying a trigger control, where the trigger control is used to indicate the selection of the virtual scale At least one of the following operations is performed: a deletion operation, a position adjustment operation, and a rotation operation.
  • the user can trigger the selection of the virtual ruler by clicking on the fixedly displayed virtual ruler.
  • the touch screen will pop up a trigger control.
  • The trigger control can be a prompt for indicating the deletion operation of the virtual ruler; the user can delete the virtual ruler by clicking the "Delete" prompt (or by dragging the virtual ruler to the trash icon).
  • the user can trigger the selection of the virtual ruler by clicking on the fixedly displayed virtual ruler.
  • the touch screen will pop up a trigger control.
  • the trigger control can be a prompt for indicating the position adjustment operation for the virtual ruler.
  • By clicking the "posture adjustment" control, the user can enter the posture adjustment mode for the virtual ruler; in this mode, for example, the user can adjust the display posture of the virtual ruler through contact between the side of the hand and the touch screen.
  • For the adjustment method, reference may be made to the relevant descriptions of controlling the display position of the virtual ruler in the above embodiments, which will not be repeated here.
  • As another example, to rotate the virtual ruler, the user can place two fingers at two different positions (on the virtual ruler or within a preset distance of the virtual ruler) and rotate the two touch points clockwise or counterclockwise at the same time to turn the virtual ruler.
  • the included angle between the virtual scale before and after the rotation can be displayed in real time.
  • Gestures such as tapping to fix the display of the virtual ruler are in line with the user's operating habits and can greatly improve operating efficiency.
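The two-finger rotation described above amounts to tracking the angle of the line through the two touch points; the rotation applied to the ruler (and the angle shown in real time) is the change in that angle. A minimal sketch with assumed names:

```python
import math

def rotation_angle(p1_old, p2_old, p1_new, p2_new):
    """Degrees (CCW positive) by which the two-finger pair rotated,
    normalized to (-180, 180]."""
    a_old = math.atan2(p2_old[1] - p1_old[1], p2_old[0] - p1_old[0])
    a_new = math.atan2(p2_new[1] - p1_new[1], p2_new[0] - p1_new[0])
    deg = math.degrees(a_new - a_old)
    return (deg + 180.0) % 360.0 - 180.0
```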
  • The virtual ruler is used to measure the length of a drawn straight line segment displayed on the touch screen; or, the virtual ruler is used to measure the distance between drawing points displayed on the touch screen.
  • The method further includes: detecting that there is a drawn straight line segment on the touch screen that is pose-associated with the virtual ruler, and displaying the length value of the drawn straight line segment; the pose association includes: a direction difference from the virtual ruler smaller than a preset value; and/or, a distance from the virtual ruler smaller than a preset value.
  • Pose association can be understood as a high degree of fit between the virtual ruler and the drawn straight line segment.
  • The pose association can be expressed by the direction difference and the distance value.
  • If the direction difference between the drawn straight line segment and the virtual ruler is less than a preset value (for example, less than 20, 25, 30, 35, 40, or 45 degrees), and/or the distance between the drawn straight line segment and the virtual ruler is less than a preset value (for example, 1 cm, 2 cm, 3 cm, etc.), it can be considered that there is a drawn straight line segment on the touch screen that is pose-associated with the virtual ruler.
  • The distance can be understood as the distance between the drawn straight line segment and the nearest point on the virtual ruler, or the average distance between points on the virtual ruler and the drawn straight line segment, or any other quantity that characterizes the distance between the virtual ruler and the drawn straight line segment.
  • If the detection module detects multiple drawn straight line segments on the touch screen that are pose-associated with the virtual ruler, the length value of the drawn straight line segment with the highest degree of pose association may be acquired and displayed.
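The pose-association test (direction difference and distance both under thresholds) can be sketched as follows. The distance here is measured from the segment midpoint to the ruler's infinite line, which is one of the distance definitions the text allows, and the thresholds are example values only.

```python
import math

def pose_associated(seg, ruler_point, ruler_dir_deg,
                    max_angle=30.0, max_dist=2.0):
    """True if the drawn segment ((x1,y1),(x2,y2)) is pose-associated
    with the ruler given by a point on it and its direction."""
    (x1, y1), (x2, y2) = seg
    seg_deg = math.degrees(math.atan2(y2 - y1, x2 - x1))
    diff = abs(seg_deg - ruler_dir_deg) % 180.0
    if min(diff, 180.0 - diff) >= max_angle:     # direction condition
        return False
    mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0    # segment midpoint
    rad = math.radians(ruler_dir_deg)
    nx, ny = -math.sin(rad), math.cos(rad)       # normal of ruler line
    dist = abs((mx - ruler_point[0]) * nx + (my - ruler_point[1]) * ny)
    return dist < max_dist                       # distance condition
```

When several segments qualify, the one with the smallest combined angle and distance would count as having the highest degree of pose association.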
  • The method further includes: adjusting the display position of the virtual ruler according to the position of the line segment to be measured, so that the display position of the virtual ruler fits the drawn straight line segment; that is to say, a display effect similar to magnetic attraction is realized.
  • adjusting the display position of the virtual ruler may include adjusting the position and display direction of the virtual ruler.
  • The method further includes: detecting that there are a first intersection point and a second intersection point between the drawn line segment on the touch screen and the virtual ruler, and displaying the distance value between the first intersection point and the second intersection point.
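Finding the two intersection points and the value to display is plain segment-line geometry; this sketch walks the drawn polyline, collects crossings with the ruler line, and returns the distance between the first and last crossing. All names are assumptions.

```python
import math

def intersections_distance(stroke, ruler_point, ruler_dir_deg, eps=1e-9):
    """Distance between the first and second intersection of a drawn
    polyline with the ruler line, or None if it crosses fewer than twice."""
    rad = math.radians(ruler_dir_deg)
    dx, dy = math.cos(rad), math.sin(rad)
    nx, ny = -dy, dx                              # normal of ruler line

    def side(p):                                  # signed distance to the line
        return (p[0] - ruler_point[0]) * nx + (p[1] - ruler_point[1]) * ny

    hits = []
    for a, b in zip(stroke, stroke[1:]):
        sa, sb = side(a), side(b)
        if sa * sb <= 0 and abs(sa - sb) > eps:   # this piece crosses the line
            t = sa / (sa - sb)
            hits.append((a[0] + t * (b[0] - a[0]),
                         a[1] + t * (b[1] - a[1])))
    if len(hits) < 2:
        return None
    (ax, ay), (bx, by) = hits[0], hits[-1]
    return math.hypot(bx - ax, by - ay)
```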
  • the virtual ruler is used as a reference tool when drawing a straight line on the touch screen.
  • The method further includes: detecting a line-drawing gesture on the touch screen; and, based on the virtual ruler being displayed on the touch screen and the distance between the line-drawing position of the line-drawing gesture and the virtual ruler being within a preset distance (for example, 1 cm, 2 cm, 3 cm, 4 cm, 5 cm, etc.), displaying a drawn straight line segment according to the line-drawing position of the line-drawing gesture, wherein the drawn straight line segment is parallel to the virtual ruler.
  • the drawing trajectory of the drawing and painting can be corrected to a straight line.
  • a straight line will be drawn automatically.
  • the line drawing position includes a starting point position and an ending point position
  • the drawing straight line segment is a line segment between the starting point position and the ending point position
  • the line drawing position includes a starting point position and a real-time line drawing position
  • the method further includes: displaying a line drawing length based on the real-time line drawing position, where the line drawing length is the The distance value between the starting point position and the real-time drawing line position.
  • That is, the straight line segment between the first and last drawing points can be calculated and displayed, together with the length of the drawn line.
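Both behaviours — correcting the stroke to a straight segment parallel to the ruler, and showing the live length — reduce to projecting the real-time drawing position onto the line through the start point parallel to the ruler; drawing backwards naturally shortens the segment, which gives the deletion effect described further on. Names are illustrative.

```python
import math

def snap_stroke(start, current, ruler_dir_deg):
    """Return (corrected endpoint, length to display): the real-time
    position projected onto the line through `start` parallel to the ruler."""
    rad = math.radians(ruler_dir_deg)
    dx, dy = math.cos(rad), math.sin(rad)
    # scalar projection of the stroke onto the ruler direction
    t = (current[0] - start[0]) * dx + (current[1] - start[1]) * dy
    end = (start[0] + t * dx, start[1] + t * dy)
    return end, abs(t)
```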
  • the line drawing position includes a starting point position and an ending point position
  • the drawing straight line segment is a line segment between the starting point position and the ending point position
  • The detection module may display the line-drawing length based on the real-time line-drawing position, where the line-drawing length is the distance value between the starting point position and the real-time line-drawing position.
  • If the length of the drawn line segment exceeds the expected value, the user can draw backwards, which has the effect of deleting the excess.
  • When the generated line segment reaches the expected value, the drawn curves (whose distance from the virtual ruler is within the preset range) are automatically corrected to a straight line.
  • In this way, the two functions of the ruler (guiding straight lines and measuring) are converted into intelligent operations suitable for a virtual interface.
  • Measurement automatically displays the measured value, and the length value is displayed while drawing a straight line, which removes the limitations of the physical world, saves the time users spend taking readings, and greatly improves user efficiency.
  • The present application provides an object copy method, the method comprising: detecting a drag gesture for a target object displayed on a touch screen; in response to the drag gesture, displaying a mirror image of the target object on the touch screen, and updating the display position of the mirror image in real time according to the drag gesture so that the mirror image moves with the drag gesture; detecting a hand-raising gesture; and, in response to the hand-raising gesture, fixedly displaying the mirror image at the display position where the mirror image is located.
  • Copying and pasting by drag gesture is simple and intuitive, reduces the number of steps required by existing copy-and-paste, and shortens the copy-and-paste process.
  • Before the detection of the drag gesture for the target object displayed on the touch screen, the method further includes:
  • enabling the copy function for the target object when at least one of the following gestures is detected:
  • a long press gesture for the target object; a click gesture for the target object.
  • the detecting the long press gesture for the target object includes:
  • a long press gesture is detected in which the contact area with the touch screen covers the target object or is within a preset distance around the target object.
  • the dragging gesture is a gesture of keeping the long press gesture in contact with the touch screen and moving on the touch screen.
  • the long press gesture is a two-finger long press gesture.
  • the detecting the click gesture for the target object includes:
  • a click gesture in which a contact area with the touch screen covers the target object is detected.
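The enable → drag → raise flow for object copying can be summarized as a tiny state machine; the state names and API below are our own sketch of the described behaviour, not the application's terms.

```python
class DragCopy:
    """idle -> armed (long press on target) -> dragging (mirror follows)
    -> idle (hand raise pins the mirror at its last position)."""
    def __init__(self):
        self.state = "idle"
        self.mirror_pos = None

    def long_press(self, on_target):
        if self.state == "idle" and on_target:
            self.state = "armed"            # copy function enabled

    def drag(self, pos):
        if self.state in ("armed", "dragging"):
            self.state = "dragging"
            self.mirror_pos = pos           # mirror follows in real time

    def hand_raise(self):
        """Return the pinned position of the mirror image, if any."""
        pinned = self.mirror_pos if self.state == "dragging" else None
        self.state = "idle"
        return pinned
```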
  • the embodiment of the present application provides a virtual ruler display device, which is applied to electronic equipment, and the electronic equipment includes a touch screen, and the device includes:
  • a detection module configured to detect a contact gesture with the touch screen
  • a display module configured to, based on the contact gesture indicating contact of the side of the user's hand with the touch screen, display a virtual ruler on the touch screen according to the contact area of the contact gesture, wherein the contact area of the contact gesture is a strip-shaped area and the virtual ruler is attached to the long side of the strip-shaped area.
  • the side of the hand is the side of the user's hand that is located on the side of the little finger in an unfolded state.
  • the contact area includes:
  • a contact area between the hypothenar part on the side of the user's hand and the touch screen; or, a contact area between the little finger on the side of the user's hand and the touch screen.
  • the device also includes:
  • a determining module configured to determine that the contact gesture indicates contact between the side of the user's hand and the touch screen based on that the contact area of the contact gesture satisfies a preset condition; the preset condition includes:
  • the contact area is a strip-shaped area, and the shape and size of the strip-shaped area conform to the characteristics of the contact area when the hypothenar part and/or the little finger part of the side of the user's hand is in contact with the touch screen.
  • the device also includes:
  • An acquisition module configured to acquire gesture data of the contact gesture
  • the determining module is further configured to determine, according to the gesture data, through a neural network, that the contact gesture indicates that the side of the user's hand is in contact with the touch screen.
  • the virtual ruler is attached to the long side of the strip-shaped region, including:
  • the acute angle between the direction of the virtual scale and the direction indicated by the long side of the strip-shaped area is smaller than a preset value; and the virtual scale and the strip-shaped area meet one of the following conditions:
  • the detection module is configured to detect that the duration of the contact gesture is greater than a preset time before the virtual ruler is displayed on the touch screen according to the contact area of the contact gesture.
  • the detection module is configured to detect movement of a contact area of the contact gesture
  • the display module is configured to adjust the display position of the virtual ruler so that the display position of the virtual ruler follows the contact area.
  • the detection module is configured to detect a user's hand-raising gesture
  • the display module is configured to hide the display of the virtual ruler on the touch screen in response to the hand-raising gesture.
  • the instructing the contact of the side of the user's hand with the touch screen based on the contact gesture includes: instructing, based on the contact gesture, that the side of the user's hand performs a preset number of taps on the touch screen;
  • the detection module is further configured to detect the user's hand-raising gesture;
  • the display module is further configured to maintain the display of the virtual ruler on the touch screen in response to the hand-raising gesture.
  • the detection module is configured to detect a user's selection instruction for the virtual scale
  • the display module is configured to display a trigger control in response to the selection instruction, and the trigger control is used to indicate at least one of the following operations on the virtual scale:
  • the virtual ruler is used to measure the length of a drawn straight line segment displayed on the touch screen; or,
  • the virtual ruler is used for distance measurement between drawing points displayed on the touch screen.
  • the detection module is configured to detect that there is a drawing line segment associated with the gesture of the virtual ruler on the touch screen;
  • the display module is used to display the length value of the drawn straight line segment; the pose association includes:
  • the direction difference from the virtual scale is smaller than a preset value; and/or,
  • the distance from the virtual ruler is smaller than a preset value.
  • the display module is configured to adjust the display position of the virtual ruler according to the position of the line segment to be measured, so that the display position of the virtual ruler fits with the drawn straight line.
  • the detection module is configured to detect that there is a first intersection point and a second intersection point between the drawn line segment on the touch screen and the virtual scale after the virtual scale is displayed on the touch screen;
  • the display module is configured to display a distance value between the first intersection point and the second intersection point.
  • the virtual ruler is used as a reference tool when drawing a straight line on the touch screen.
  • the detection module is configured to detect a line-drawing gesture on the touch screen
  • the display module is configured to, based on the virtual ruler being displayed on the touch screen and the distance between the line-drawing position of the line-drawing gesture and the virtual ruler being within a preset distance, display the drawn straight line segment according to the line-drawing position of the line-drawing gesture, wherein the drawn straight line segment is parallel to the virtual ruler.
  • the line drawing position includes a starting point position and an ending point position
  • the drawing straight line segment is a line segment between the starting point position and the ending point position
  • the line drawing position includes a starting point position and a real-time line drawing position
  • the method further includes:
  • the line-drawing length is a distance value between the starting point position and the real-time line-drawing position.
  • the present application provides an object replication device, the device comprising:
  • a detection module configured to detect a drag gesture for a target object displayed on the touch screen; the detection module is also configured to detect a hand-raising gesture after the display module displays a mirror image of the target object;
  • a display module configured to display a mirror image of the target object on the touch screen in response to the drag gesture, and update the display position of the mirror image in real time according to the drag gesture, so that the mirror image follows the The drag gesture moves; the display module is further configured to, in response to the hand-raising gesture, fix and display the mirror image at the display position where the mirror image is located.
  • the device also includes:
  • An enabling module configured to enable the copy function for the target object when at least one of the following gestures is detected before the detection of the drag gesture for the target object displayed on the touch screen:
  • a click gesture is detected for the target object.
  • the detecting the long press gesture for the target object includes:
  • a long press gesture is detected in which the contact area with the touch screen covers the target object or is within a preset distance around the target object.
  • the dragging gesture is a gesture of keeping the long press gesture in contact with the touch screen and moving on the touch screen.
  • the long press gesture is a two-finger long press gesture.
  • the detecting the click gesture for the target object includes:
  • a click gesture in which a contact area with the touch screen covers the target object is detected.
  • the present application provides an electronic device, including: a processor, a memory, a touch screen, and a bus, wherein: the processor, the memory, and the touch screen are connected through the bus;
  • the memory is used to store computer programs or instructions
  • the processor is used to call or execute the programs or instructions stored in the memory, and is also used to control the touch screen, so as to implement the steps described in the first aspect and any possible implementation manner of the first aspect, and the steps described in the second aspect and any possible implementation manner of the second aspect.
  • the present application provides a computer storage medium, including computer instructions.
  • When the computer instructions are run on an electronic device or a server, the steps described in the first aspect and any possible implementation manner of the first aspect, and the steps described in the second aspect and any possible implementation manner of the second aspect, are executed.
  • the present application provides a computer program product.
  • When the computer program product is run on an electronic device or a server, it executes the steps described in the first aspect and any possible implementation manner of the first aspect, and the steps described in the second aspect and any possible implementation manner of the second aspect.
  • the present application provides a chip system, which includes a processor configured to support an execution device or a training device in implementing the functions involved in the above aspects, for example, sending or processing the data and/or information involved in the above methods.
  • the chip system further includes a memory, and the memory is used for storing necessary program instructions and data of the execution device or the training device.
  • the system-on-a-chip may consist of chips, or may include chips and other discrete devices.
  • An embodiment of the present application provides a method for displaying a virtual scale, including: detecting a contact gesture on the touch screen; and, based on the contact gesture indicating that the side of the user's hand is in contact with the touch screen, displaying a virtual scale on the touch screen according to the contact area of the contact gesture, wherein the contact area of the contact gesture is a strip-shaped area and the virtual scale is attached to a long side of the strip-shaped area.
  • In this way, the display of the virtual scale is triggered. On the one hand, only one of the user's hands is required to operate, so the difficulty and cost of the operation are very small.
  • On the other hand, the contact area between the side of the hand and the touch screen is a flat strip-shaped area, which is close to a straight line; based on this operation mode, triggering the display of the virtual scale is more in line with the user's operating habits.
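As an illustration of the behavior described above, the following sketch shows how a recognized hand-side gesture could map to a virtual-scale segment attached to a long side of the strip-shaped contact area. The `ContactGesture` type and `display_virtual_scale` function are hypothetical names for illustration only, not part of the patent; a real device would draw the scale through the display pipeline described later.

```python
# Hypothetical sketch: a hand-side contact gesture triggers a virtual scale
# anchored to a long side of the strip-shaped contact area. All names here
# are illustrative assumptions, not taken from the patent.
from dataclasses import dataclass

@dataclass
class ContactGesture:
    kind: str     # e.g. "hand_side", "finger_tap" (assumed categories)
    area: tuple   # axis-aligned bounding box of the contact area: (x, y, w, h)

def display_virtual_scale(gesture: ContactGesture):
    """Return the line segment along which the virtual scale is drawn,
    attached to a long side of the strip-shaped contact area."""
    if gesture.kind != "hand_side":
        return None                    # only hand-side contact triggers the scale
    x, y, w, h = gesture.area
    if w >= h:                         # horizontal strip: scale hugs the top edge
        return ((x, y), (x + w, y))
    return ((x, y), (x, y + h))        # vertical strip: scale hugs the left edge

g = ContactGesture(kind="hand_side", area=(100, 200, 300, 40))
print(display_virtual_scale(g))        # → ((100, 200), (400, 200))
```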
  • Fig. 1 is a schematic structural diagram of a product provided by an embodiment of the present application.
  • FIG. 2 is a structural block diagram of an electronic device according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an embodiment of a method for displaying a virtual ruler provided in an embodiment of the present application
  • FIG. 4 is a schematic diagram of a gesture in the embodiment of the present application.
  • Fig. 5 is a schematic diagram of a gesture in the embodiment of the present application.
  • Fig. 6 is a schematic diagram of a gesture in the embodiment of the present application.
  • FIG. 7 is a schematic diagram of a gesture in the embodiment of the present application.
  • Fig. 8 is a schematic diagram of a gesture in the embodiment of the present application.
  • FIG. 9 is a schematic diagram of a gesture contact in the embodiment of the present application.
  • FIG. 10 is a schematic diagram of a gesture contact in the embodiment of the present application.
  • FIG. 11 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 13 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 14 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 15 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 17 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 18 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 19 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 20 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 21 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • Fig. 22a is a schematic diagram of a terminal interface in the embodiment of the present application.
  • Fig. 22b is a schematic diagram of a terminal interface in the embodiment of the present application.
  • Fig. 22c is a schematic diagram of a terminal interface in the embodiment of the present application.
  • Fig. 22d is a schematic diagram of a terminal interface in the embodiment of the present application.
  • Fig. 23 is a schematic diagram of establishing an index table in the embodiment of the present application.
  • FIG. 24 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 25 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 26 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 27 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 28 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 29 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 30 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 31 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 32 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 33 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 34 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 35 is a schematic diagram of a terminal interface in the embodiment of the present application.
  • FIG. 36 is a schematic diagram of a terminal interface in the embodiment of the present application.
  • Fig. 37a is a schematic diagram of an embodiment of an object copy method provided by the embodiment of the present application.
  • Fig. 37b is a schematic diagram of a terminal interface in the embodiment of the present application.
  • FIG. 38 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 39 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 40 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • FIG. 41 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • Fig. 42 is a schematic diagram of a terminal interface in the embodiment of the present application.
  • Fig. 43 is a schematic diagram of a terminal interface in the embodiment of the present application.
  • FIG. 44 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • Fig. 45 is a schematic diagram of a terminal interface in the embodiment of the present application.
  • Fig. 46 is a schematic diagram of a terminal interface in the embodiment of the present application.
  • Fig. 47 is a schematic diagram of a terminal interface in the embodiment of the present application.
  • FIG. 48 is a schematic diagram of a terminal interface in an embodiment of the present application.
  • Fig. 49 is a schematic structural diagram of a virtual ruler display device provided by an embodiment of the present application.
  • FIG. 50 is a schematic structural diagram of an object copying device provided by an embodiment of the present application.
  • FIG. 51 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
  • the embodiment of the present application may be applied in the system 100 including the touch screen 103 .
  • FIG. 1 shows a system 100 to which the embodiment of the present application is applied, wherein the system 100 may include an electronic device 101 and a pen 102 associated with the electronic device 101 .
  • the electronic device 101 may be an electronic whiteboard (or called an electronic interactive smart board) shown in FIG. 1 , and the electronic device 101 includes a touch screen 103 . It should be understood that the electronic device 101 may also be a portable mobile device including a touch screen, such as but not limited to a mobile or portable computing device (such as a smart phone), a personal computer, a server computer, a handheld device (such as a tablet), or a laptop.
  • the touch screen 103 can recognize a user's contact gesture.
  • the touch screen 103 can be an infrared touch screen.
  • the infrared touch screen is composed of infrared emitting and receiving sensing elements mounted on the outer frame of the touch screen, which form an infrared detection network on the surface of the screen; any touching object can realize a touch screen operation by changing the infrared rays at the contact point.
  • the realization principle of the infrared touch screen is similar to that of the surface acoustic wave touch screen, which uses infrared emitting and receiving sensing elements.
  • These elements form an infrared detection network on the surface of the screen, and a touch-operated object (such as the side of the hand in contact with the touch screen 103 in the embodiment of the present application) changes the infrared rays at the touch point, which is then converted into the coordinate position of the touch, thereby realizing recognition of touch gestures.
  • the touch screen 103 may be a capacitive touch screen, and the capacitive touch screen works by utilizing the current induction of the human body.
  • the capacitive touch screen can be a four-layer composite glass screen: the inner surface and the interlayer of the glass screen are each coated with a layer of ITO (nano indium tin oxide), and the outermost layer is a thin protective layer of silica glass. The interlayer ITO coating serves as the working surface, with four electrodes drawn from its four corners, while the inner ITO layer is a shielding layer that ensures a good working environment.
  • When the user touches the screen, a coupling capacitance forms between the finger and the screen surface; for high-frequency current, this capacitance acts as a direct conductor, so the finger draws a small current from the point of contact. This current flows out through the electrodes at the four corners of the touch screen, and the current flowing through each of these four electrodes is proportional to the distance from the finger to the corresponding corner.
  • the controller obtains the position of the touch point through accurate calculation of the ratios of these four currents, and recognition of touch gestures can then be realized.
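The position calculation described above can be illustrated with a deliberately simplified model, in which the touch coordinate is recovered from the share of current flowing to each pair of corner electrodes. This is a sketch only; a real controller's current model and calibration are more involved than this.

```python
# Simplified illustration (not an actual controller algorithm) of recovering
# a touch point from four corner currents on a surface-capacitive screen:
# the share of current drawn through the right-hand corners grows as the
# finger moves right, and likewise the bottom corners for vertical position.
def touch_position(i_tl, i_tr, i_bl, i_br, width, height):
    total = i_tl + i_tr + i_bl + i_br
    x = (i_tr + i_br) / total * width    # fraction of current in right corners
    y = (i_bl + i_br) / total * height   # fraction of current in bottom corners
    return x, y

# A touch dead-centre draws equal current at all four corners:
print(touch_position(1.0, 1.0, 1.0, 1.0, 800, 600))  # → (400.0, 300.0)
```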
  • the touch screen 103 can also be another type of touch screen capable of recognizing contact gestures, or can be replaced with a touch screen that only has a display function but can cooperate with other external devices (such as sensors) to recognize contact gestures; this is not limited here.
  • the pen 102 may also provide input to the electronic device 101 by contacting or otherwise interacting with the touch screen 103 .
  • the touch screen 103 can display patterns and characters, and can also provide a drawing interface for users to freely draw and draw, such as a whiteboard interface provided by an electronic whiteboard, a drawing board provided by an application (APP), and the like.
  • the user can touch the touch screen 103 through the side of the hand, and the electronic device 101 can trigger the display of the virtual scale after detecting the user's contact gesture.
  • the exemplary operating environment of the present application is introduced above, and the internal structure of the electronic device 101 is described below with reference to an example.
  • FIG. 2 is a schematic structural diagram of an electronic device 101 provided by an embodiment of the present application.
  • the structure shown in FIG. 2 does not constitute a specific limitation on the electronic device 101 .
  • the electronic device 101 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the electronic device 101 can include an input/output controller 218 that can output information to one or more output devices 222 (eg, a touch screen or a speaker) that are separate or integrated with the electronic device 101 .
  • the input/output controller 218 may also be used to receive input from one or more input devices 220 (eg, a keyboard, a microphone, or a touch screen).
  • output device 222 may also serve as input device 220 .
  • An example of such a device would be a touch screen.
  • a user may provide input to input device 220 and/or receive output from output device 222 .
  • the input device 220 may be a touch screen, and the user provides gesture input to the input/output controller 218 by touching the side of the hand with the touch screen, and the input/output controller 218 may transmit the gesture input to the processor, and the processor 204 for processing.
  • the electronic device 101 may include one or more processors 204, and these processors may include one or more processing units. For example, the processor 204 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller of the processor 204 can generate an operation control signal according to the instruction operation code and the timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 204 for storing instructions and data.
  • the memory in processor 204 is a cache memory.
  • the memory may hold instructions or data that the processor 204 has just used or cycled. If the processor 204 needs to use the instructions or data again, they can be called directly from this memory, which avoids repeated access and reduces the waiting time of the processor 204, thereby improving the efficiency of the system.
  • processor 204 may include one or more interfaces.
  • the interface may include, but not limited to, a mobile industry processor interface (mobile industry processor interface, MIPI), an external memory interface, and/or a universal serial bus (universal serial bus, USB) interface and the like.
  • the MIPI interface can be used to connect the processor 204 with peripheral devices such as a touch screen.
  • the MIPI interface may include a display serial interface (display serial interface, DSI) and the like.
  • the processor 204 communicates with the touch screen through the DSI interface to realize the display function of the touch screen.
  • the interface connection relationship between the modules shown in this embodiment is only a schematic illustration, and does not constitute a structural limitation of the electronic device 101 .
  • the electronic device 101 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the electronic device 101 may implement a display function through a GPU, a touch screen, an application processor, and the like.
  • the GPU is a microprocessor for image processing, which connects the touch screen and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 204 may include one or more GPUs that execute program instructions to generate or alter display information. Specifically, one or more GPUs in the processor 204 can implement image rendering tasks (such as the rendering tasks related to drawing virtual rulers, distance values, length values, etc. in this application) and deliver the rendering results to the application processor or another display driver, which then triggers the display to show the virtual rulers, distance values, length values, etc.
  • a touch screen may include a display screen and associated sensors (eg, pressure sensors and touch sensors).
  • Displays are used to display images, videos, etc.
  • the display screen includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the pressure sensor is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensors such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may be comprised of at least two parallel plates with conductive material.
  • the electronic device 101 can determine the strength of the pressure according to the change of the capacitance.
  • the electronic device 101 may detect the intensity of the touch operation according to the pressure sensor.
  • the electronic device 101 may also calculate the touched position according to the detection signal of the pressure sensor.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions.
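As a minimal sketch of the idea above, the snippet below maps one touch position's contact intensity to different operation instructions. The threshold values and instruction names are assumed for illustration only.

```python
# Illustrative mapping (assumed thresholds) from contact intensity at a single
# touch position to different operation instructions, as described above.
LIGHT_PRESS = 0.3   # normalized intensity thresholds (assumed values)
DEEP_PRESS = 0.7

def instruction_for(intensity: float) -> str:
    if intensity < LIGHT_PRESS:
        return "swipe"        # grazing contact
    if intensity < DEEP_PRESS:
        return "light_press"  # e.g. a normal selection
    return "deep_press"       # e.g. a secondary action

print(instruction_for(0.5))   # → light_press
```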
  • The touch sensor is also known as a “touch device”.
  • the touch sensor can be arranged on the touch screen, and the touch screen is composed of the touch sensor and the display screen.
  • the touch sensor is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation may be provided through the display screen.
  • the touch sensor may also be disposed on the surface of the electronic device 101, which is different from the position of the display screen.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 101 can be realized through the NPU, for example, a touch gesture recognition task can be realized based on the NPU.
  • the external memory interface can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 101.
  • the external memory card communicates with the processor 204 through the external memory interface to realize the data storage function; for example, music, video and other files can be saved in the external memory card.
  • Memory 214 may be used to store computer-executable program code, which includes instructions.
  • the memory 214 may include an area for storing programs and an area for storing data.
  • the storage program area can store the operating system 206, the application software 208 required by at least one function (such as image playing function, etc.) and the like.
  • the storage data area can store data (such as image data, etc.) created during the use of the electronic device 101 .
  • the memory 214 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the processor 204 executes various functional applications and data processing of the electronic device 101 by executing instructions stored in the memory 214 and/or instructions stored in the memory provided in the processor.
  • the method described in the embodiment of the present application may be code stored in the memory 214 (or an external memory), and the processor 204 may acquire the code from the memory to implement the method provided in the embodiment of the present application.
  • the electronic device 101 can also interact with other electronic devices through the communication device 216 .
  • the methods described herein may be performed at least in part by one or more hardware logic components.
  • illustrative types of hardware logic components include field programmable gate arrays (FPGA), application specific integrated circuits (ASIC), application specific standard parts (ASSP), systems on chip (SOC), programmable logic devices (PLD), and the like.
  • functions such as detection of contact gestures and determination of gesture categories can be implemented based on hardware logic components.
  • Fig. 3 is a schematic flowchart of a virtual scale display method provided by the embodiment of the present application.
  • the virtual scale display method provided by the embodiment of the present application includes:
  • the electronic device 101 may detect a contact gesture with the touch screen.
  • the touch screen may include a touch-sensitive surface, a sensor or a sensor group that receives input from the user based on tactile contact, and “detecting a contact gesture with the touch screen” may be understood as detecting a contact gesture with the touch-sensitive surface on the touch screen.
  • the devices and modules for detecting contact gestures related to the touch screen are described.
  • the touch screen can cooperate with the detection module to detect a contact gesture on the touch screen.
  • the detection module can be a program, a data structure or a subset thereof stored in the memory related to the detection of the contact gesture, or a part exists in the memory in the form of a program, a data structure or a subset thereof, and a part is in the form of a hardware logic module .
  • the touch screen can capture contact data, and the detection module can perform various actions related to contact gesture detection according to the contact data.
  • the touch screen can capture the contact data (such as electrical signals) between the user and the touch-sensitive surface in real time and transmit the contact data to the detection module, and the detection module performs various actions related to contact gesture detection according to the contact data.
  • The following describes how the detection module realizes detection of the contact gesture on the touch screen.
  • the detection module can determine the intensity of the user's contact with the touch screen and/or its change, as well as the size of the contact area and/or its change, based on the contact data, and then determine the gesture type of the contact gesture based on the above information.
  • “strength” can be understood as the force or pressure (force per unit area) of a contact (eg, hand side contact) on the touch-sensitive surface of the touch screen.
  • the intensity of contact can be determined using various methods and various sensors or combinations of sensors. For example, the force at different points on the touch-sensitive surface of the touch screen is measured using one or more sensors located below or adjacent to the touch-sensitive surface. In some implementations, force measurements from multiple sensors may be combined (e.g., weighted averaged) to determine the strength of the contact.
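A minimal sketch of the weighted-average strategy mentioned above, assuming per-sensor force readings and weights (e.g., reflecting each sensor's proximity to the contact) are available:

```python
# Weighted average of per-sensor force readings, one of the combination
# strategies mentioned above. Input values and weights are assumptions.
def contact_strength(forces, weights):
    """Weighted average of force readings from sensors near the contact."""
    assert len(forces) == len(weights) and sum(weights) > 0
    return sum(f * w for f, w in zip(forces, weights)) / sum(weights)

# The sensor closest to the contact (weight 3) dominates the estimate:
print(contact_strength([2.0, 1.0, 0.5], [3, 1, 1]))  # → 1.5
```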
  • the contact area can be understood as a detectable area in contact with the touch-sensitive surface, which can be a continuous area or a discrete area composed of dense touch points (for example, a contact area can be a long strip-shaped area formed by a group of densely distributed contact points).
  • the detection module can perform various actions related to the contact gesture detection based on the contact data, and then determine the gesture type of the contact gesture, the movement of the contact gesture, the stop of the contact gesture, and the like.
  • the detection module can determine whether contact has occurred and the type of contact gesture (e.g., detect a finger press event or a contact event from the side of the hand), determine whether there is movement of the contact and track movement on the touch-sensitive surface (e.g., detect one or more finger drag events, or a drag event from the side of the hand), and determine whether contact has ceased (e.g., detect a finger lift event, or a lift event from the side of the hand).
  • the above-mentioned “movement of the contact gesture” can also be referred to as the movement of the contact area of the contact gesture, and the data related to the movement of the contact area can include, but not limited to, the rate (magnitude), speed (magnitude and direction) and/or acceleration (change in magnitude and/or direction).
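The movement quantities listed above (rate, velocity, acceleration) can be derived from successive `(t, x, y)` samples of the contact area's position. The function names below are illustrative, not from the patent:

```python
# Deriving rate (magnitude), velocity (magnitude and direction), and
# acceleration from successive sampled contact-area positions (t, x, y).
import math

def velocity(p0, p1):
    t0, x0, y0 = p0
    t1, x1, y1 = p1
    dt = t1 - t0
    return (x1 - x0) / dt, (y1 - y0) / dt   # carries direction

def rate(p0, p1):
    vx, vy = velocity(p0, p1)
    return math.hypot(vx, vy)               # magnitude only

def acceleration(p0, p1, p2):
    v01, v12 = velocity(p0, p1), velocity(p1, p2)
    dt = (p2[0] - p0[0]) / 2
    return ((v12[0] - v01[0]) / dt, (v12[1] - v01[1]) / dt)

samples = [(0.0, 0.0, 0.0), (0.1, 3.0, 4.0), (0.2, 6.0, 8.0)]
print(rate(samples[0], samples[1]))         # → 50.0
```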
  • the detection module can perform various actions related to contact gesture detection based on the contact data, so as to determine the gesture type indicated by the contact gesture, where the gesture type is, for example but not limited to, a click, a double click, a long press, a drag, stationary hand-side contact with the touch screen, a hand-side drag, a hand-side turn, a hand-side double tap, and so on.
  • the detection module may be a pre-trained neural network model, and the neural network model has the ability to recognize the gesture category indicated by the contact gesture based on the contact data of the contact gesture.
  • Based on the contact gesture indicating that the side of the user's hand is in contact with the touch screen, a virtual ruler is displayed on the touch screen according to the contact area of the contact gesture, wherein the contact area of the contact gesture is a strip-shaped area and the virtual ruler is attached to a long side of the strip-shaped area.
  • the detection module may determine the gesture type indicated by the contact gesture. In one implementation, the detection module may determine that the contact gesture indicates that the side of the user's hand is in contact with the touch screen.
  • the display of the virtual ruler is triggered.
  • the contact area between the side of the hand and the touch screen is a flat strip-shaped area, which is close to a straight line; based on this operation mode, triggering the display of the virtual ruler is more in line with the user's operating habits.
  • the side of the hand in the embodiment of the present application can be understood as the side of the user's hand located on the side of the little finger in the unfolded state (for example, including at least one of the surface of the hypothenar part of the user's hand or the surface of the little finger).
  • the unfolded state of the hand may be a state where the user's fingers (or only the little finger) are on the same plane (or close to the same plane) as the palm.
  • A user's hand posture is shown in Fig. 4, where the lower side of the hand is the side of the user's hand, which includes the little finger and the surface of the hypothenar region of the user's hand.
  • Another user's hand posture is shown in Fig. 5, where the lower-right side of the hand is the side of the user's hand, which includes the little finger and the surface of the hypothenar region of the user's hand.
  • the contact between the side of the user's hand and the touch screen is equivalent to the contact between the hypothenar part of the side of the user's hand and the touch screen and/or the contact of the little finger part of the side of the user's hand with the touch screen.
  • When the contact intensity and the contact area on the touch screen meet certain conditions, the gesture can be considered to be contact of the side of the user's hand with the touch screen.
  • Fig. 6 is a schematic diagram of the side of the user's hand in contact with the touch screen from a perspective facing the touch screen, and Fig. 7 is a schematic diagram of the side of the user's hand in contact with the touch screen from a side view of the touch screen. From the point of view of the user's behavior, based on differences in contact intensity and contact area, the side of the user's hand can swipe the touch screen (small contact area, low contact intensity), lightly press the touch screen (large contact area, medium contact intensity), or heavily press the touch screen (large contact area, high contact intensity).
  • When the side of the user's hand lightly presses or heavily presses the touch screen for more than a certain period of time, it may be considered that the user needs to trigger the display of the virtual scale.
  • For example, when the side of the user's hand lightly presses or heavily presses the touch screen for more than X seconds, it can be considered that the user needs to trigger the display of the virtual scale.
  • from the perspective of the detection module, the contact intensity of the touch gesture and the contact area between the touch gesture and the touch screen need to be processed and analyzed; when certain conditions are met, the gesture can be considered contact between the side of the user's hand and the touch screen.
  • the detection module may acquire the contact area of the contact gesture and, based on the contact area meeting a preset condition, determine that the contact gesture indicates that the side of the user's hand is in contact with the touch screen; the preset condition includes: the contact area is a strip-shaped area, and the shape and size of the strip-shaped area conform to the characteristics of the contact area produced when the hypothenar position and/or little finger on the side of the user's hand is in contact with the touch screen.
  • the side of the hand may include the hypothenar region and the little finger region.
  • the side of the user's hand is in contact with the touch screen, at least one of the following contact situations may occur due to the posture of the hand or the characteristics of the user's hand shape:
  • the area of the hypothenar is in contact with the touch screen
  • the area of the little finger is in contact with the touch screen
  • the contact area is a flat strip-shaped area
  • the contact area is a flat strip-shaped area, and the size of the area can be smaller than the size of the contact area when the area of the hypothenar is in contact with the touch screen;
  • the contact area consists of two strip-shaped areas arranged in the same direction, with the long sides of the two strip-shaped areas pointing in the same direction.
  • the detection module can acquire the data of the contact area and then analyze the shape and size of the contact area; when the shape and size of the contact area conform to the characteristics of the contact area produced when the hypothenar part and/or the little finger part of the side of the user's hand is in contact with the touch screen, it may be determined that the touch gesture indicates contact of the side of the user's hand with the touch screen.
  • the shape can be understood as the outline feature of the boundary line of the contact area.
  • when the outline shape of the boundary line of the contact area is flat and long, it can be considered that the shape of the contact area conforms to the characteristics of contact by the hypothenar position and/or little finger position on the side of the user's hand.
  • the shape can also be understood as the distribution characteristics of the contact points included in the contact area, and whether the above shape characteristics are satisfied can be judged by detecting the distribution shape of the touch points (for example, if a densely distributed, elongated group of touch points is detected, it can be considered that the shape of the contact area conforms to the shape characteristics of the contact area produced when the hypothenar position and/or the little finger position on the side of the user's hand is in contact with the touch screen).
  • when the size of the contact area is within a preset range (the maximum of the range cannot be too large and the minimum cannot be too small; both can be set based on the typical dimensions of the side of a human hand), it can be considered that the size of the contact area conforms to the characteristics of the size of the contact area produced when the hypothenar part and/or little finger part of the side of the user's hand is in contact with the touch screen.
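The shape-and-size screening described above can be sketched as a simple heuristic. The function below is illustrative only: the thresholds (nominally in millimetres) and the axis-aligned bounding-box simplification are assumptions, not values taken from this application.

```python
def is_hand_side_contact(points, min_len=40.0, max_len=120.0,
                         max_width=25.0, min_aspect=3.0):
    """Heuristic check that a set of touch points forms the flat,
    elongated strip produced by the hypothenar/little-finger side of
    a hand.  Thresholds are illustrative; a real detector would also
    use an oriented (rotated) bounding box and contact intensity."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    length = max(xs) - min(xs)   # simplification: strip is roughly axis-aligned
    width = max(ys) - min(ys)
    if width > length:           # normalise so 'length' is the long dimension
        length, width = width, length
    if width == 0:
        return False
    aspect = length / width
    return (min_len <= length <= max_len
            and width <= max_width
            and aspect >= min_aspect)
```

A strip about 60 units long and 10 wide passes; a square patch (palm-like touch) or an overly wide patch does not.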
  • the user may accidentally touch the touch screen, or perform a touch gesture that is not intended to trigger the virtual ruler; screening based on the contact intensity of the touch gesture can avoid triggering the display of the virtual ruler in such scenarios.
  • when the touch gesture is only a swipe touch, the display of the virtual ruler may not be triggered; and when the touch gesture is a press gesture, the display of the virtual ruler may be triggered (provided that the contact area of the touch gesture satisfies the above-mentioned shape and size conditions).
  • the press gesture may include a light press and a deep press. An increase in contact intensity from an intensity below the light press intensity threshold to an intensity between the light press intensity threshold and the deep press intensity threshold may be referred to as detecting a "light press" input on the touch surface. An increase in contact intensity from an intensity below the deep press intensity threshold to an intensity above the deep press intensity threshold may be referred to as detecting a "deep press" input on the touch surface. An increase in contact intensity from an intensity below the contact detection intensity threshold to an intensity between the contact detection intensity threshold and the light press intensity threshold may be referred to as detecting a "swipe touch" input on the touch surface.
  • the contact detection intensity threshold is zero. In some embodiments, the contact detection intensity threshold may be greater than zero.
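The three intensity bands above can be sketched as a simple threshold classifier. The normalised intensity scale and the threshold values below are illustrative assumptions, not values from this application.

```python
def classify_press(intensity,
                   contact_threshold=0.05,
                   light_threshold=0.30,
                   deep_threshold=0.70):
    """Map a normalised contact intensity (0..1) to the gesture
    classes described above.  Thresholds are illustrative."""
    if intensity < contact_threshold:
        return "none"          # below contact detection threshold
    if intensity < light_threshold:
        return "swipe touch"   # between contact detection and light press
    if intensity < deep_threshold:
        return "light press"   # between light press and deep press
    return "deep press"        # above deep press threshold
```

With these example thresholds, only "light press" and "deep press" inputs would qualify as press gestures that can trigger the ruler.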
  • the detection module can be a pre-trained neural network model, and the neural network model has the ability to identify the gesture category indicated by the contact gesture based on the contact data of the contact gesture.
  • the detection module may acquire gesture data of the contact gesture and, according to the gesture data, determine through a neural network that the contact gesture indicates that the side of the user's hand is in contact with the touch screen.
  • otherwise, the display of the virtual ruler may not be triggered.
  • the maintenance time of the contact gesture can also be used as a condition for triggering the display of the virtual ruler; that is, it needs to be detected that the maintenance time of the contact gesture is greater than a preset time, for example, the preset time can be set to 0.1 s, 0.15 s, 0.2 s, 0.25 s, 0.3 s, 0.35 s, 0.4 s, 0.45 s, 1 s, or 2 s.
  • the maintenance time of the contact gesture can start counting when the contact gesture is detected, or after it is detected that the contact intensity between the contact gesture and the touch screen is greater than a certain contact intensity threshold (a threshold greater than 0); the timing ends when the contact intensity of the contact gesture is detected to be 0, or when the contact intensity of the contact gesture is detected to be less than a certain contact intensity threshold (a threshold greater than 0).
  • the maintenance time of the contact gesture may be the maintenance time when the contact gesture remains in a static state (or the movement is less than a certain range), for example, the maintenance time of the contact gesture may be the maintenance time when the pressing gesture is at rest (or the movement is less than a certain range) .
  • the above static state may be understood as a static state of the contact area of the touch gesture.
  • a virtual ruler may be displayed on the touch screen according to the contact area of the contact gesture.
  • the virtual ruler is described next:
  • the virtual ruler is an affordance displayed on the touch screen.
  • the display shape of the virtual ruler is similar to a straightedge (for example, it can be a flat rectangle or a line segment).
  • the user can also use the virtual ruler as a reference tool when drawing a straight line on the touch screen; specifically, the process is similar to the process in which a user draws a line with a ruler in physical space.
  • the shape of the contact area is a flat strip, and the direction indicated by the long side of the strip is parallel (or nearly parallel) to the direction pointed by the user's four fingers (or only the little finger).
  • therefore, the display direction of the virtual ruler can be made parallel or nearly parallel to the direction indicated by the long side of the strip; this display mode of the virtual ruler conforms to the user's operating habits.
  • the display direction of the virtual ruler (which may also be simply referred to as the direction of the virtual ruler in the embodiment of the present application) may be understood as the direction of the side on the virtual ruler used for measurement or drawing a reference.
  • the virtual ruler is attached to the long side of the strip-shaped area; that is, in display position, the virtual ruler and the strip-shaped area need to be very close, and in direction, the virtual ruler and the long sides of the strip-shaped area need to be parallel or nearly parallel.
  • the virtual ruler and the strip-shaped area satisfy one of the following conditions:
  • there is overlap between the virtual ruler and the strip-shaped region (as shown in Figure 13); or the virtual ruler is tangent to the strip-shaped region (as shown in Figure 12); or the distance between the virtual ruler and the strip-shaped area is less than a preset value (as shown in Figure 14 and Figure 15, where Figure 15 shows the case in which the virtual ruler is a line segment); the preset value can be 1 cm, 2 cm, 3 cm, 4 cm, 5 cm, 10 cm, 15 cm, etc.
  • the acute angle between the direction of the virtual ruler and the direction indicated by the long side of the strip-shaped area is smaller than a preset value; the preset value can be 1 degree, 2 degrees, 3 degrees, 4 degrees, 5 degrees, 10 degrees, etc.
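The attachment rule (parallel direction, small standoff from the strip's long side) can be sketched by deriving the ruler's pose from the two endpoints of the long side of the contact area. The coordinate convention, units, and the offset value below are illustrative assumptions.

```python
import math

def ruler_pose_from_strip(p_start, p_end, offset=5.0):
    """Place the virtual ruler along the long side of the strip-shaped
    contact area: same direction as the strip, displaced by a small
    perpendicular offset (units and offset are illustrative)."""
    dx, dy = p_end[0] - p_start[0], p_end[1] - p_start[1]
    length = math.hypot(dx, dy)
    if length == 0:
        raise ValueError("strip endpoints must be distinct")
    angle = math.degrees(math.atan2(dy, dx))
    # unit normal perpendicular to the strip's long side
    nx, ny = -dy / length, dx / length
    origin = (p_start[0] + nx * offset, p_start[1] + ny * offset)
    return origin, angle
```

For a horizontal strip from (0, 0) to (10, 0), the ruler is placed parallel to it (angle 0) at a small perpendicular offset.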
  • after the display of the virtual ruler is triggered, the posture of the virtual ruler can be adjusted through a specific gesture, for example, changing the posture (position and angle) of the virtual ruler so that the virtual ruler can measure different elements (the length of a drawn line or the distance between drawn points).
  • the movement of the virtual scale can be driven by the movement of the contact area on the side of the user's hand.
  • the detection module may detect the movement of the contact area of the contact gesture, and adjust the display position of the virtual ruler so that the display position of the virtual ruler follows the contact area.
  • the so-called movement can be understood as a change in position and/or a change in direction
  • the so-called following can be understood as the display position of the virtual ruler always fitting the contact area of the touch gesture: as the contact area of the touch gesture moves, the display position and direction of the virtual ruler also change, and the display area of the virtual ruler stays in close contact with the long side of the contact area.
  • the contact area of the touch gesture moves, and then the display position and direction of the virtual ruler will also change following the contact area.
  • a certain gesture can be used to trigger hiding the virtual ruler displayed on the touch screen.
  • the gesture can be a gesture of raising the hand.
  • the display of the virtual ruler is hidden on the touch screen in response to the user's hand-raising gesture, wherein the hand-raising gesture can be understood as the side of the hand leaving the touch screen, or the contact intensity between the side of the hand and the touch screen becoming less than a certain threshold, or the contact area between the side of the hand and the touch screen becoming smaller than a preset value.
  • the virtual ruler still needs to be used after the user raises his hand, so it is necessary to trigger the fixed display of the virtual ruler on the touch screen based on a certain gesture.
  • the user can tap the touch screen for a preset number of times by the side of the hand to trigger the fixed display of the virtual scale on the touch screen.
  • the detection module can detect that the contact gesture indicates the side of the user's hand tapping the touch screen a preset number of times, and after the user's hand-raising gesture is detected, keep the virtual ruler displayed on the touch screen (for example, as shown in FIG. 31).
  • the preset number of taps needs to occur within a preset time, and the preset time is a short time (for example, 0.1S, 0.2S, 0.3S, 0.4S, 0.5S, etc.);
  • tapping positions of the preset number of taps on the touch screen need to be kept consistent, or slight deviations are allowed.
  • the preset number of times can be 2 times, 3 times, etc.
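The tap-count, time-window, and position-consistency conditions above can be sketched together in one check. The window and drift tolerances below are illustrative assumptions.

```python
def is_fixing_tap_sequence(tap_times, tap_positions,
                           required_taps=2, window=0.5, max_drift=10.0):
    """Check whether the most recent taps form the 'fix the ruler'
    gesture: a preset number of taps within a short time window, at
    (almost) the same position.  Tolerances are illustrative."""
    if len(tap_times) < required_taps:
        return False
    times = tap_times[-required_taps:]
    positions = tap_positions[-required_taps:]
    if times[-1] - times[0] > window:       # taps too far apart in time
        return False
    x0, y0 = positions[0]                   # allow only slight deviation
    return all(abs(x - x0) <= max_drift and abs(y - y0) <= max_drift
               for x, y in positions)
```

Two taps 0.2 s apart at nearly the same spot qualify; taps spread over 0.9 s, or taps far apart on screen, do not.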
  • the virtual ruler can be fixedly displayed on the touch screen.
  • a certain gesture operation can be used to adjust the display posture of the virtual ruler (the display posture may include a display position and/or a display direction).
  • the detection module can detect a user's selection instruction for the virtual scale, and in response to the selection instruction, a trigger control can be displayed, and the trigger control is used to indicate at least one of the following operations on the virtual scale: delete operation , position adjustment operations, and rotation operations.
  • the user can trigger selection of the virtual ruler by clicking on the fixedly displayed virtual ruler; upon selection, a trigger control pops up on the touch screen. For example, the trigger control can be a prompt indicating a deletion operation for the virtual ruler (such as the "delete" prompt shown in FIG. 33, or the trash can logo shown in FIG. 35), and the user can delete the virtual ruler by clicking the "delete" prompt or dragging the virtual ruler onto the trash can logo.
  • the user can trigger the selection of the virtual ruler by clicking on the fixedly displayed virtual ruler.
  • a trigger control will pop up on the touch screen, as shown in FIG. 33.
  • the user can click on the "posture adjustment" control to enter the posture adjustment mode for the virtual ruler.
  • in the posture adjustment mode, the user can adjust the virtual ruler through contact between the side of the hand and the touch screen; the adjustment method can refer to the relevant description of the display position control of the virtual ruler in the above embodiment, which will not be repeated here.
  • the display posture of the virtual ruler can be adjusted directly through gesture operations.
  • when the virtual ruler is not selected, if a touch gesture between the side of the user's hand and the touch screen is detected (meeting the condition for triggering the display of the virtual ruler), another virtual ruler can be displayed additionally; the new virtual ruler can also follow the translation of the contact area of the side of the hand, and can likewise be fixedly displayed with the consecutive tapping gesture.
  • when the touch screen exits the drawing interface and enters a display interface (such as displaying the screen content of another electronic device), the virtual ruler displayed on the drawing interface disappears along with the drawing interface and does not affect the display of the projected content.
  • gestures such as tapping to fix the display of the virtual ruler conform to the user's operating habits and can greatly improve operating efficiency.
  • the virtual ruler may be used to measure the length of the drawn straight line segment displayed on the touch screen.
  • At least one drawing straight line segment can be displayed on the touch screen.
  • when the user wants to measure the length of one of the drawn straight line segments, the user can adjust the display position of the virtual ruler by moving the touch gesture on the touch screen, so that the display position of the virtual ruler is close to or accurately fits the drawn straight line segment to be measured.
  • the detection module can acquire the length value of the drawn straight line segment where the virtual scale fits, and display the length value.
  • when the user cannot accurately fit the virtual ruler to the drawn straight line segment to be measured, or the operation required to fit it is relatively difficult, the length measurement and length-value display of the drawn straight line segment can be performed when the virtual ruler is located near the drawn straight line segment to be measured.
  • the detection module detects that there is a drawing line segment associated with the virtual scale on the touch screen, and displays the length value of the drawing line segment.
  • that is, posture association does not require that the degree of fit be very high.
  • posture association can be expressed by direction difference and distance value.
  • when the direction difference between the drawn line segment and the virtual ruler is less than a preset value (for example, less than 20 degrees, 25 degrees, 30 degrees, 35 degrees, 40 degrees, 45 degrees, etc.), and/or the distance between the drawn line segment and the virtual ruler is less than a preset value (such as 1 cm, 2 cm, 3 cm, etc.), it can be considered that there is, on the touch screen, a drawn straight line segment whose posture is associated with the virtual ruler.
  • the distance can be understood as the distance between the nearest point on the virtual ruler and the drawn straight line segment, or the average of the distances between points on the virtual ruler and the drawn straight line segment, or any other measure that can characterize the distance between the virtual ruler and the drawn straight line segment.
  • when the detection module detects that there are multiple drawn straight line segments whose postures are associated with the virtual ruler on the touch screen, the length value of the drawn straight line segment with the highest degree of posture association can be acquired and displayed.
  • for example, suppose two drawn straight line segments are displayed: drawn straight segment A and drawn straight segment B; the distance value between the virtual ruler and segment A is A1, the distance value between the virtual ruler and segment B is B1, the direction difference between the virtual ruler and segment A is A2, and the direction difference between the virtual ruler and segment B is B2; if A1 is equal to B1 and A2 is greater than B2, the degree of posture association between drawn straight segment B and the virtual ruler is the largest.
  • alternatively, a feature quantity that can represent both the distance value and the direction difference can be obtained; the feature quantity is positively correlated with the distance value and the direction difference, and the smaller the feature quantity, the higher the degree of posture association.
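One way to realise such a feature quantity is a weighted sum of the distance value and the direction difference; the weights below are illustrative assumptions, and a smaller value means a stronger posture association.

```python
def association_score(distance_cm, direction_diff_deg,
                      w_dist=1.0, w_angle=0.1):
    """Feature quantity positively correlated with both the distance
    value and the direction difference (weights are illustrative)."""
    return w_dist * distance_cm + w_angle * direction_diff_deg

def most_associated(segments):
    """Given {name: (distance_cm, direction_diff_deg)}, return the
    drawn segment with the smallest feature quantity, i.e. the one
    whose length value should be displayed."""
    return min(segments, key=lambda name: association_score(*segments[name]))
```

With equal distances, the segment with the smaller direction difference wins, matching the example above.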
  • when the detection module detects that there is a drawn line segment whose posture is associated with the virtual ruler on the touch screen, and the display position of the virtual ruler does not accurately fit the drawn line segment, the display position of the virtual ruler can be adjusted based on the position of the drawn line segment so that the display position of the virtual ruler fits the drawn line segment; that is, a display effect similar to magnetic attraction is realized.
  • adjusting the display position of the virtual ruler may include adjusting the position and display direction of the virtual ruler.
  • the drawn straight line segment within the predefined range of the virtual ruler has a magnetic attraction effect on the virtual ruler, causing the ruler to automatically fit it.
  • the touch screen in FIG. 21 displays two straight line segments for drawing.
  • referring to Figure 22b, the virtual ruler moves along with the user's gesture; when the virtual ruler moves left near the drawn straight line segment, it automatically adjusts its display position to fit drawn straight line segment 1; referring to Figure 22c, when the contact area between the side of the user's hand and the touch screen continues to move right, the virtual ruler cancels the magnetic effect, that is, it no longer fits drawn straight line segment 1 and continues to follow the user's gesture.
  • if a drawn line is near the virtual ruler, the drawn line has a magnetic attraction effect on the virtual ruler and the length of the drawn line is measured automatically; this avoids the process of manually fitting the virtual ruler to the drawn line, reduces the difficulty of measurement, avoids inaccurate fitting when fitting manually, and improves the accuracy of measurement.
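The magnetic-attraction behaviour can be sketched as a snap test against nearby segments. The snap thresholds and the pose representation (a reference point plus an angle in degrees) are illustrative assumptions.

```python
import math

def snap_ruler(ruler_pos, ruler_angle, segments,
               snap_dist=15.0, snap_angle=20.0):
    """If a drawn segment is within the magnetic range (close in both
    position and direction), return that segment's pose so the ruler
    fits it; otherwise keep the ruler where the gesture put it.
    `segments` is a list of (position, angle) pairs; thresholds are
    illustrative."""
    for seg_pos, seg_angle in segments:
        d = math.hypot(seg_pos[0] - ruler_pos[0],
                       seg_pos[1] - ruler_pos[1])
        if d <= snap_dist and abs(seg_angle - ruler_angle) <= snap_angle:
            return seg_pos, seg_angle      # snapped: magnetic attraction
    return ruler_pos, ruler_angle          # no nearby segment: follow gesture
```

Moving the gesture away again simply yields a ruler pose with no nearby segment, which cancels the magnetic effect as in Figure 22c.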
  • the virtual ruler may be used to measure distances between drawing points displayed on the touch screen.
  • At least one drawing line segment can be displayed on the touch screen.
  • the display position of the virtual ruler can be adjusted by moving the touch gesture on the touch screen so that the virtual ruler intersects two points (or multiple points) on the measured drawn line segments; then, referring to Figure 24, the detection module can obtain the distance between the two intersection points (or between at least two of the multiple intersection points) and display the distance value.
  • the detection module may detect that there is a first intersection point and a second intersection point between the drawing line segment on the touch screen and the virtual scale, and display the distance value between the first intersection point and the second intersection point.
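Measuring between the first and second intersection points can be sketched with standard line-intersection geometry. The function names and the simplification of treating each stroke as an infinite line are assumptions for illustration.

```python
import math

def intersect(p1, p2, q1, q2):
    """Intersection point of the lines through p1-p2 and q1-q2
    (None if the lines are parallel)."""
    (x1, y1), (x2, y2) = p1, p2
    (x3, y3), (x4, y4) = q1, q2
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if den == 0:
        return None
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / den
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def measured_distance(ruler, seg_a, seg_b):
    """Distance between the ruler's intersection points with two drawn
    segments, i.e. the value shown next to the ruler."""
    a = intersect(*ruler, *seg_a)
    b = intersect(*ruler, *seg_b)
    if a is None or b is None:
        return None
    return math.hypot(b[0] - a[0], b[1] - a[1])
```

A ruler along the x-axis crossing two vertical strokes at x = 2 and x = 7 reports a distance of 5.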
  • the virtual ruler is used as a reference tool when drawing a straight line on the touch screen.
  • the virtual ruler can be used as a reference tool when the user draws a straight line on the touch screen (see Figure 25).
  • the so-called reference tool can be understood as follows: when the drawing track fits the ruler, the drawing track is limited to a straight line due to the obstruction of the ruler.
  • a straightedge in physical space can serve as a reference tool when drawing a straight line based on its own structural characteristics (when it is placed on the drawing plane and the user draws with a brush, the drawing trajectory is restricted to a straight line by the ruler, and a straight line can thus be drawn).
  • however, the virtual ruler is only an affordance displayed on the touch screen, so it is necessary to recognize the positional relationship between the user's line-drawing trajectory (or line-drawing position) and the virtual ruler, and to limit the line-drawing track based on the positional relationship between the line-drawing position and the virtual ruler.
  • through the above method, the drawing trajectory of writing or painting can be corrected to a straight line (refer to Figure 26: the left figure in Figure 26 is the user's actual line-drawing trajectory, and the right figure is the drawn straight line displayed after correction); when the user draws a line along the virtual ruler, a straight line is drawn automatically.
  • the detection module may detect a line-drawing gesture on the touch screen, based on the fact that the virtual scale is displayed on the touch screen, and the distance between the line-drawing position of the line-drawing gesture and the virtual scale is within Within a preset distance (such as 1 cm, 2 cm, 3 cm, 4 cm, 5 cm, etc.), according to the line drawing position of the line drawing gesture, a drawing line segment is displayed, wherein the drawing line segment is parallel to the virtual ruler.
  • the drawn straight line between the first and last drawing points can be calculated and displayed, and the length of the drawn line can be displayed.
  • the line drawing position includes a starting point position and an ending point position
  • the drawn straight line segment is a line segment between the starting point position and the ending point position (refer to FIG. 28: the left picture in Fig. 28 is the user's actual line-drawing trajectory, and the right picture is the drawn straight line displayed after correction).
  • the detection module can display the drawn-line length based on the real-time line-drawing position, where the drawn-line length is the distance value between the starting point position and the real-time line-drawing position.
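The correction of a freehand stroke into a straight segment parallel to the ruler can be sketched as projecting the stroke onto the ruler direction: the start point is kept and the end point is projected. The coordinate and angle conventions are illustrative assumptions.

```python
import math

def corrected_segment(points, ruler_angle_deg):
    """Replace a freehand trajectory drawn along the ruler with a
    straight segment parallel to the ruler: keep the start point and
    project the end point onto the ruler direction."""
    ux = math.cos(math.radians(ruler_angle_deg))
    uy = math.sin(math.radians(ruler_angle_deg))
    x0, y0 = points[0]
    xe, ye = points[-1]
    t = (xe - x0) * ux + (ye - y0) * uy       # signed length along the ruler
    end = (x0 + t * ux, y0 + t * uy)
    return (x0, y0), end, abs(t)              # start, end, displayed length
```

A wobbly stroke from (0, 0) to (10, 2) along a horizontal ruler becomes the straight segment (0, 0)-(10, 0) with displayed length 10.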
  • if the length of the drawn line segment exceeds the expected value, the user can draw back over it, which has a deletion effect, until the generated line segment reaches the expected value; drawn curves (whose distance from the virtual ruler is within the preset range) are automatically corrected to straight lines.
  • the general solution is to erase a part or redraw, but it is also difficult to obtain a line segment of an accurate length by erasing or redrawing.
  • when drawing back, the position of the end point moves back (the start point position is where the stylus started drawing, and the end point position is the last position of the stylus while drawing; during the drawing process, the end point position can be regarded as moving dynamically, and when drawing is completed, the position where the stylus lifts off is the final end point position).
  • the line segment A2 is equivalent to being drawn twice: the first pass (forward drawing) draws the trajectory line, and when the second pass (reverse drawing) repeats the first pass, the repeated line segment A2 is deleted, that is, part of the first straight line is erased, and finally only the line segment A3 between the start point position and the end point position is retained.
  • when marking the length, only the length of the line segment A3 between the start point position and the end point position is marked.
  • the stylus can draw forward again after drawing back; therefore, a line segment with the expected length of 15 cm can be obtained by (repeatedly) adjusting the position of the end point.
  • care must be taken that the intersection of two different line segments is not deleted (for example, at the intersection of the diagonals of a square, the intersection point is also drawn twice when the two different diagonals are drawn).
  • to this end, the deletion effect of repeated line drawing can add a time-judgment mechanism: the deletion effect is achieved only when the same line segment is drawn twice within a predetermined time. It can also be realized through other judgment mechanisms, for example a direction mechanism: the deletion judgment is satisfied only when the two successive movement directions of the stylus are opposite.
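The draw-back deletion can be sketched by tracking the stroke's end position along the ruler axis: forward motion extends the segment, reversed motion erases it, and forward motion again re-extends it. The sampling model (stylus positions pre-projected onto the ruler axis) and the non-negative clamp are illustrative assumptions; a fuller implementation would add the time or direction judgment mechanism described above.

```python
def live_endpoint(projected_ts):
    """Track the displayed end of a ruler-assisted stroke.  Input is
    the stylus position projected onto the ruler axis, sampled over
    time (start point at 0).  Moving forward extends the segment;
    moving backward over it erases down to the current position."""
    end = 0.0
    prev = 0.0
    for t in projected_ts:
        if t >= prev:                        # moving forward: extend
            end = max(end, t)
        else:                                # moving backward: erase
            end = min(end, t) if t >= 0 else 0.0
        prev = t
    return end
```

Drawing out to 18, then back to 15, leaves a 15-unit segment; drawing forward again afterwards lengthens it once more.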
  • in this way, the two functions of the ruler are converted into intelligent operations suitable for the virtual interface: measurement automatically displays the measured value, and the length value is displayed while drawing a straight line, which removes the limitations of the physical world, saves the time users spend taking readings, and greatly improves user efficiency.
  • An embodiment of the present application provides a method for displaying a virtual ruler, including: detecting a contact gesture with the touch screen; and, based on the contact gesture indicating contact of the side of the user's hand with the touch screen, displaying a virtual ruler on the touch screen according to the contact area of the contact gesture, wherein the contact area of the contact gesture is a strip-shaped area, and the virtual ruler is attached to a long side of the strip-shaped area.
  • through the above manner, the display of the virtual ruler is triggered; on the one hand, only one hand of the user is required to operate, so the operation difficulty and cost are very small.
  • on the other hand, the contact area between the side of the hand and the touch screen is a flat strip-shaped area, which is close to a straight line; triggering the display of the virtual ruler based on this operation mode better conforms to the user's operating habits.
  • Fig. 37a is a schematic flowchart of an object copying method provided by the embodiment of the present application.
  • an object copying method provided by the embodiment of the present application includes:
  • the existing operation logic includes three-finger pinch to complete copy, and three-finger release to paste.
  • for copy and paste, the industry has a set of basic logic: select the content, right-click or press a special button to open the context menu, click the copy button, move to the target position and call up the context menu again, and click the paste button. It can be seen that mainstream methods must select elements as a prerequisite before copying and pasting; there are many operation steps, and the operation cost of copy and paste is very high.
  • when performing a copy-and-paste operation, the copied object can be dragged to the position to be pasted, and the position where the hand is raised is the paste position.
  • when a long-press gesture on the target object is detected, the copy function for the target object may be enabled.
  • the long press gesture may be a two-finger long press gesture, for example, refer to FIG. 37b.
  • the target object is a drawing object displayed on the touch screen, for example, it may be a character, pattern, stroke or the like.
  • the long-press gesture may include a light press and a deep press; an increase in contact intensity from an intensity below the light press intensity threshold to an intensity between the light press intensity threshold and the deep press intensity threshold may be referred to as detecting a "light press" input on the touch surface.
  • An increase in contact intensity from an intensity below the deep press intensity threshold to an intensity above the deep press intensity threshold may be referred to as detecting a "deep press” input on the touch surface.
  • An increase in contact intensity from an intensity below the contact detection intensity threshold to an intensity between the contact detection intensity threshold and the light press intensity threshold may be referred to as detecting a "swipe touch" input on the touch surface.
  • the contact detection intensity threshold is zero. In some embodiments, the contact detection intensity threshold may be greater than zero.
  • when the gesture is only a swipe touch, the copy function for the target object may not be enabled.
  • the duration of the long-press gesture can also be used as a condition for enabling the copy function for the target object; that is, it needs to be detected that the duration of the long-press gesture is greater than a preset time, for example, the preset time can be set to 0.1 s, 0.15 s, 0.2 s, 0.25 s, 0.3 s, 0.35 s, 0.4 s, 0.45 s, 1 s, or 2 s.
  • the maintenance time of the long-press gesture can start counting when the long-press gesture is detected, or after it is detected that the contact intensity between the long-press gesture and the touch screen is greater than a certain contact intensity threshold (a threshold greater than 0); the timing ends when the contact intensity of the long-press gesture is detected to be 0, or when the contact intensity of the long-press gesture is detected to be less than a certain contact intensity threshold (a threshold greater than 0).
  • the maintenance time of the long-press gesture can be the time during which the long-press gesture remains static (or moves less than a certain range). It should be understood that "static" here means that the contact area of the long-press gesture does not move or moves only within a small range.
  • Enabling the copy function for the target object can be understood as activating a copy state.
  • If the copy state is not activated, the display elements on the drawing and writing interface cannot be copied.
  • A long-press gesture for the target object can be understood as a gesture whose contact area covers the target object.
  • the detection module detects the long press gesture
  • the copy command will not be triggered (that is, the copy function for the display element will not be enabled).
  • the display element can be selected, and the display elements within this range are placed in a copy-activated state (that is, the copy function for the target object is enabled).
  • the touch gesture is a two-finger long-press gesture
  • If there is a display element in the contact area, the display element is selected and placed in a copy-activated state (that is, the copy function for the target object is enabled).
  • A long-press gesture for the target object may also be understood as a gesture whose contact area is within a preset distance around the target object.
  • The contact area of the long-press gesture can be associated with a fixed predefined range, or the range can be extended beyond it: for example, if a pattern is found within a preset distance of the predefined range's boundary, the range can be extended to cover the newly discovered pattern, and the same extension can then be applied from the newly discovered pattern.
  • A circle is determined according to the predefined range, and the pattern within the circular area is selected, that is, the word "cow" is selected.
  • The extended shape can be adapted from the predefined shape, for example, by translating the circle to finally obtain an ellipse. It can also be adapted to the preset distance used for the boundary judgment, for example, an irregular shape that fits the actual shape and boundary of the word.
  • This extension can be interrupted when a pattern that does not need to be copied would otherwise be selected. For example, referring to FIG. 45, an additional finger slide can be used to divide the selected area into two parts, and the area corresponding to the two-finger part is then deselected. As shown in the figure, after splitting, the word "Ji" is not selected, so when the two fingers are moved, only "Nian Da of the Ox" is copied.
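The extension behaviour described above resembles a region-growing loop: starting from the elements in the predefined range, any pattern within the preset distance of the current selection is pulled in, and a newly added pattern extends the search in turn. A minimal sketch, assuming elements are reduced to 2-D centre points and a hypothetical `PRESET_DISTANCE`:

```python
# Hedged sketch of the selection-extension idea above. Element geometry is
# simplified to (x, y) centre points; the distance value is an assumption.
import math

PRESET_DISTANCE = 30.0  # extension radius around the current selection

def extend_selection(seed_elements, all_elements):
    """Grow the selection from seed_elements over all_elements until no
    element within PRESET_DISTANCE of the selection remains unselected."""
    selected = set(seed_elements)
    changed = True
    while changed:
        changed = False
        for elem in all_elements:
            if elem in selected:
                continue
            # add the element if it is near any already-selected element
            if any(math.dist(elem, s) <= PRESET_DISTANCE for s in selected):
                selected.add(elem)
                changed = True
    return selected
```

Interrupting the extension, as in the finger-slide split above, would correspond to removing elements from the returned set before the copy is performed.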
  • the copy function for the target object may be enabled after detecting the click gesture for the target object.
  • the detection of the click gesture for the target object may be understood as the detection of a click gesture in which the contact area with the touch screen covers the target object.
  • the copy selection of the target object can also be performed based on the frame selection control (which can be physical or virtual) displayed on the touch screen.
  • Frame selection can be carried out via the icon: after selecting the icon, the target object is selected by framing, and the copy function for the target object is then enabled.
  • the user performs a drag gesture on the touch screen for the target object
  • the end point of the drag gesture (that is, the position of the hand-raising gesture) may be the position where the paste operation for the target object needs to be performed.
  • the drag gesture may be a two-finger drag gesture.
  • the user triggers the copy function for the target object through a long press gesture
  • The drag gesture may be a gesture that keeps the long-press gesture in contact with the touch screen while moving across the touch screen.
  • the user triggers the copy function for the target object through the click gesture, and then uses the drag gesture to perform the copy operation, that is, the selection operation and the copy operation are separated.
  • The starting point of the drag gesture's touch area may (or may not) be near the display position of the target object; the drag can be performed at any display position on the touch screen, the target object can still be copied, and the dragged object slides in the direction of the two-finger swipe.
  • After the detection module detects a drag gesture for the target object, the target object has a tendency to be dragged away. On detecting the drag gesture's instruction, a copy operation is performed on the target object: the target object is copied to generate a copy (also called a mirror image), the mirror image moves with the movement of the drag gesture, and the body of the target object remains in its original position without sliding.
  • the mirror image of the target object may appear in a semi-blurred state.
  • a hand raising gesture is detected.
  • When the drag gesture slides to the desired position, the hand-raising gesture is performed; that is, when the hand breaks contact with the display screen or the contact intensity falls below the threshold, the paste operation is triggered: the target object is pasted at the position of the hand-raising gesture, completing the copy-and-paste process.
  • The gesture rules for the copy operation are applicable not only to drawing elements on the drawing interface but also to copying text on the display interface.
  • Press and hold the selected text, for example, with two fingers as shown in Figure 48.
  • Drag to create a copy; after release, the copy is displayed at the position where the fingers left the screen.
  • An embodiment of the present application provides an object copying method, the method comprising: detecting a drag gesture for a target object displayed on a touch screen; in response to the drag gesture, displaying a mirror image of the target object on the touch screen and, according to the drag gesture, updating the display position of the mirror image in real time so that the mirror image moves with the drag gesture; detecting a hand-raising gesture; and, in response to the hand-raising gesture, fixedly displaying the mirror image at the display position where the mirror image is located.
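The object copying method summarized above can be sketched as a small state machine: a drag creates and moves a mirror image while the original stays put, and a hand-raising gesture fixes the mirror in place. Class and method names here are illustrative assumptions, not the application's implementation.

```python
# Hedged sketch of the drag-to-copy flow: the mirror follows the drag
# gesture and is fixed where it is when the hand-raising gesture arrives.
from dataclasses import dataclass

@dataclass
class DisplayObject:
    content: str
    x: float
    y: float

class CopyByDrag:
    def __init__(self, target: DisplayObject):
        self.target = target  # the original stays in its position
        self.mirror = None    # the copy that follows the drag
        self.done = False

    def on_drag(self, x: float, y: float):
        """Drag gesture detected/moved: create the mirror on first contact,
        then update its display position in real time."""
        if self.done:
            return
        if self.mirror is None:
            self.mirror = DisplayObject(self.target.content,
                                        self.target.x, self.target.y)
        self.mirror.x, self.mirror.y = x, y

    def on_hand_raise(self):
        """Hand-raising gesture: fix the mirror where it currently is."""
        self.done = True
```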
  • The method of copying and pasting by drag gesture is simple and intuitive; it reduces the number of steps required by existing copy-and-paste flows and shortens the copy-and-paste process.
  • the way of copying can be performed.
  • the gesture is simple and commonly used, and the operation is simple and conforms to the user's cognition.
  • Fig. 49 is a schematic structural diagram of a virtual ruler display device provided by an embodiment of the present application; the device is applied to an electronic device that includes a touch screen, and the device 4900 includes:
  • a detection module 4901 configured to detect a contact gesture with the touch screen
  • For a specific description of the detection module 4901, reference may be made to the description of step 301, which will not be repeated here.
  • a display module 4902 configured to, based on the contact gesture indicating contact of the side of the user's hand with the touch screen, display a virtual ruler on the touch screen according to the contact area of the contact gesture, wherein the contact area of the contact gesture is a strip-shaped area and the virtual ruler fits against the long side of the strip-shaped area.
  • For a specific description of the display module 4902, reference may be made to the description of step 302, which will not be repeated here.
  • The side of the hand is the little-finger side of the user's hand in an unfolded state.
  • the contact area includes:
  • the contact area between the little finger on the side of the user's hand and the touch screen.
  • the device also includes:
  • a determining module 4904 configured to determine that the contact gesture indicates contact between the side of the user's hand and the touch screen based on the contact area of the contact gesture satisfying a preset condition; the preset condition includes:
  • the contact area is a strip-shaped area, and the shape and size of the strip-shaped area conform to the characteristics of the contact area when the hypothenar part and/or the little finger part of the side of the user's hand is in contact with the touch screen.
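The preset condition could be approximated by checking the contact area's size and elongation, as in this sketch; the numeric bounds are assumptions, since the application does not give concrete values.

```python
# Hedged sketch: decide whether a contact area looks like the strip left by
# the side of an unfolded hand (hypothenar and/or little-finger part) by
# checking its elongation and size. All numeric bounds are assumptions.
AREA_MIN, AREA_MAX = 500.0, 8000.0  # plausible contact area, in px^2
MIN_ASPECT_RATIO = 3.0              # long side much longer than short side

def is_hand_side_contact(long_side: float, short_side: float) -> bool:
    """long_side/short_side: bounding-box lengths of the contact area."""
    area = long_side * short_side
    if not (AREA_MIN <= area <= AREA_MAX):
        return False
    return long_side / short_side >= MIN_ASPECT_RATIO
```

A real implementation could instead feed the gesture data to a neural network, as the following bullets note; this rule-based check only illustrates the shape-and-size condition.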
  • the device also includes:
  • An acquisition module 4903 configured to acquire gesture data of the contact gesture
  • the determination module 4904 is further configured to determine, according to the gesture data, through a neural network, that the contact gesture indicates that the side of the user's hand is in contact with the touch screen.
  • the virtual ruler is attached to the long side of the strip-shaped region, including:
  • the acute angle between the direction of the virtual scale and the direction indicated by the long side of the strip-shaped area is smaller than a preset value; and the virtual scale and the strip-shaped area meet one of the following conditions:
  • The detection module 4901 is configured to detect that the duration of the contact gesture is longer than a preset time before a virtual ruler is displayed on the touch screen in the contact area according to the contact gesture.
  • the detection module 4901 is configured to detect the movement of the contact area of the contact gesture
  • the display module 4902 is configured to adjust the display position of the virtual ruler so that the display position of the virtual ruler follows the contact area.
  • the detection module 4901 is configured to detect the user's hand-raising gesture
  • the display module is configured to hide the display of the virtual ruler on the touch screen in response to the hand-raising gesture.
  • the indicating, based on the contact gesture, of the contact of the side of the user's hand with the touch screen includes:
  • the method further includes:
  • the user's hand gesture is detected
  • the detection module 4901 is configured to detect a user's selection instruction for the virtual scale
  • the display module is configured to display a trigger control in response to the selection instruction, and the trigger control is used to indicate at least one of the following operations on the virtual scale:
  • the virtual ruler is used to measure the length of a drawn line segment displayed on the touch screen; or,
  • the virtual ruler is used for distance measurement between drawing points displayed on the touch screen.
  • The detection module 4901 is configured to detect that there is a drawn line segment on the touch screen associated with the virtual ruler gesture;
  • the display module is used to display the length value of the drawn straight line segment; the gesture association includes:
  • the direction difference from the virtual scale is smaller than a preset value; and/or,
  • the distance from the virtual ruler is smaller than a preset value.
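The two association conditions (direction difference and distance below preset values) can be sketched as a geometric test; the ruler is modelled as an infinite line through a point, and all preset values are assumptions.

```python
# Hedged sketch: a drawn segment is associated with the virtual ruler when
# its direction differs from the ruler's by less than a preset angle and
# both endpoints lie within a preset distance of the ruler line.
import math

ANGLE_PRESET_DEG = 10.0
DISTANCE_PRESET = 25.0

def direction_deg(p, q):
    """Direction of segment p->q in degrees, folded into [0, 180)."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0])) % 180.0

def point_line_distance(pt, line_pt, line_deg):
    """Perpendicular distance from pt to the line through line_pt at line_deg."""
    theta = math.radians(line_deg)
    dx, dy = pt[0] - line_pt[0], pt[1] - line_pt[1]
    # component of (dx, dy) perpendicular to the line direction
    return abs(-math.sin(theta) * dx + math.cos(theta) * dy)

def is_associated(seg_start, seg_end, ruler_pt, ruler_deg) -> bool:
    ang = direction_deg(seg_start, seg_end)
    raw = abs(ang - ruler_deg % 180.0)
    diff = min(raw, 180.0 - raw)  # smallest angle between directions
    near = max(point_line_distance(seg_start, ruler_pt, ruler_deg),
               point_line_distance(seg_end, ruler_pt, ruler_deg))
    return diff < ANGLE_PRESET_DEG and near < DISTANCE_PRESET
```

The same perpendicular-distance helper could also locate the two intersection points mentioned below when the distance is zero.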
  • the display module 4902 is configured to adjust the display position of the virtual ruler according to the position of the line segment to be measured, so that the display position of the virtual ruler fits with the drawn straight line.
  • the detection module 4901 is configured to detect that there is a first intersection point and a second intersection point between the drawn line segment on the touch screen and the virtual scale after the virtual scale is displayed on the touch screen;
  • the display module 4902 is configured to display the distance value between the first intersection point and the second intersection point.
  • the virtual ruler is used as a reference tool when drawing a straight line on the touch screen.
  • the detection module 4901 is configured to detect a gesture of drawing a line on the touch screen
  • The display module 4902 is configured to, when the virtual ruler is displayed on the touch screen and the line-drawing position of the line-drawing gesture is within a preset distance of the virtual ruler, display a drawn line segment according to the line-drawing position of the gesture, wherein the drawn line segment is parallel to the virtual ruler.
  • the line drawing position includes a starting point position and an ending point position
  • the drawing straight line segment is a line segment between the starting point position and the ending point position
  • the line drawing position includes a starting point position and a real-time line drawing position
  • the method further includes:
  • the line-drawing length is a distance value between the starting point position and the real-time line-drawing position.
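Ruler-assisted drawing as described above amounts to projecting the drag vector onto the ruler direction, which both keeps the drawn segment parallel to the ruler and yields the real-time length between the starting point and the real-time line-drawing position. A minimal sketch under that assumption:

```python
# Hedged sketch of ruler-assisted line drawing: the drag from the starting
# point toward the current position is projected onto the ruler direction,
# so the drawn segment stays parallel to the ruler. Names are assumptions.
import math

SNAP_DISTANCE = 25.0  # preset distance from the ruler required for snapping

def snap_segment(start, current, ruler_deg):
    """Return (snapped_end, length): the end point of the segment kept
    parallel to the ruler, and the real-time line-drawing length."""
    theta = math.radians(ruler_deg)
    ux, uy = math.cos(theta), math.sin(theta)  # unit vector along the ruler
    dx, dy = current[0] - start[0], current[1] - start[1]
    along = dx * ux + dy * uy                  # signed projected length
    snapped_end = (start[0] + along * ux, start[1] + along * uy)
    return snapped_end, abs(along)
```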
  • An embodiment of the present application provides a virtual ruler display device, including: a detection module 4901, configured to detect a contact gesture with the touch screen; and a display module 4902, configured to, based on the contact gesture indicating that the side of the user's hand is in contact with the touch screen, display a virtual ruler on the touch screen according to the contact area of the contact gesture, wherein the contact area of the contact gesture is a strip-shaped area and the virtual ruler fits against the long side of the strip-shaped area.
  • When such a contact gesture is detected, the display of the virtual ruler is triggered. On the one hand, only a single hand of the user is required to operate, so the operation difficulty and cost are very small.
  • On the other hand, when the user's hand is unfolded, the contact area between the side of the hand and the touch screen is a flat strip-shaped area close to a straight line; triggering the display of the virtual ruler based on this operation mode better conforms to the user's operating habits.
  • FIG. 50 is a schematic structural diagram of an object replication device provided in an embodiment of the present application. As shown in FIG. 50, the device 5000 includes:
  • the detection module 5001 is configured to detect a drag gesture for a target object displayed on the touch screen; the detection module is also configured to detect a hand-raising gesture after the display module displays a mirror image of the target object;
  • For a specific description of the detection module 5001, reference may be made to the descriptions of step 3701 and step 3703, which will not be repeated here.
  • the display module 5002 is configured to display a mirror image of the target object on the touch screen in response to the drag gesture, and update the display position of the mirror image in real time according to the drag gesture, so that the mirror image follows the The drag gesture moves; the display module is further configured to, in response to the hand-raising gesture, fix and display the mirror image at the display position where the mirror image is located.
  • For a specific description of the display module 5002, reference may be made to the descriptions of step 3702 and step 3704, which will not be repeated here.
  • the device also includes:
  • the enabling module 5003 is configured to enable the copy function for the target object when at least one of the following gestures is detected before the detection of the drag gesture for the target object displayed on the touch screen:
  • a click gesture is detected for the target object.
  • the detection module 5001 is configured to detect a long press gesture in which the contact area with the touch screen covers the target object or is within a preset distance around the target object.
  • the dragging gesture is a gesture of keeping the long press gesture in contact with the touch screen and moving on the touch screen.
  • the long press gesture is a two-finger long press gesture.
  • the detection module 5001 is configured to detect a click gesture that a contact area with the touch screen covers the target object.
  • An embodiment of the present application provides an object copying device, including: a detection module 5001, configured to detect a drag gesture for a target object displayed on a touch screen, and further configured to detect a hand-raising gesture after the display module displays a mirror image of the target object; and a display module 5002, configured to display the mirror image of the target object on the touch screen in response to the drag gesture and to update the display position of the mirror image in real time according to the drag gesture so that the mirror image moves along with the drag gesture, the display module being further configured to, in response to the hand-raising gesture, fixedly display the mirror image at the display position where the mirror image is located.
  • The method of copying and pasting by drag gesture is simple and intuitive; it reduces the number of steps required by existing copy-and-paste flows and shortens the copy-and-paste process.
  • FIG. 51 is a schematic structural diagram of the terminal device provided by the embodiment of the present application.
  • the terminal device 5100 may specifically be an electronic whiteboard, a virtual reality VR device, a mobile phone, a tablet, a notebook computer, a smart wearable device, etc., which are not limited here.
  • The terminal device 5100 includes: a receiver 5101, a transmitter 5102, a processor 5103, and a memory 5104 (the number of processors 5103 in the terminal device 5100 can be one or more; one processor is taken as an example in FIG. 51).
  • the processor 5103 may include an application processor 51031 and a communication processor 51032 .
  • the receiver 5101, the transmitter 5102, the processor 5103, and the memory 5104 may be connected through a bus or in other ways.
  • the memory 5104 may include read-only memory and random-access memory, and provides instructions and data to the processor 5103 .
  • a part of the memory 5104 may also include a non-volatile random access memory (non-volatile random access memory, NVRAM).
  • The memory 5104 stores operating instructions, executable modules, or data structures, or a subset or an extended set thereof, wherein the operating instructions may include various operating instructions for implementing various operations.
  • the processor 5103 controls the operation of the terminal device.
  • various components of the terminal device are coupled together through a bus system, where the bus system may include a power bus, a control bus, and a status signal bus in addition to a data bus.
  • the various buses are referred to as bus systems in the figures.
  • the methods disclosed in the foregoing embodiments of the present application may be applied to the processor 5103 or implemented by the processor 5103 .
  • the processor 5103 may be an integrated circuit chip and has a signal processing capability. In the implementation process, each step of the above method can be completed by an integrated logic circuit of hardware in the processor 5103 or instructions in the form of software.
  • The above-mentioned processor 5103 can be a general-purpose processor, a digital signal processor (DSP), a microprocessor, or a microcontroller, and can further include an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • the processor 5103 may implement or execute various methods, steps, and logic block diagrams disclosed in the embodiments of the present application.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor.
  • The software module can be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register.
  • the storage medium is located in the memory 5104, and the processor 5103 reads the information in the memory 5104, and completes the steps of the above method in combination with its hardware.
  • The processor 5103 can read the information in the memory 5104 and, in combination with its hardware, complete the steps related to the display of the virtual ruler in steps 301 to 302 of the above embodiment, and the steps related to the object copying method in steps 3701 to 3704 of the above embodiment.
  • the receiver 5101 can be used to receive input digital or character information, and generate signal input related to related settings and function control of the terminal device.
  • the transmitter 5102 can be used to output digital or character information through the first interface; the transmitter 5102 can also be used to send instructions to the disk group through the first interface to modify the data in the disk group; the transmitter 5102 can also include a display device such as a touch screen.
  • An embodiment of the present application further provides a computer program product, which, when running on a computer, causes the computer to execute the steps of the method described in the embodiment corresponding to FIG. 3 and FIG. 37a in the above embodiments.
  • An embodiment of the present application also provides a computer-readable storage medium storing a program for signal processing, which, when run on a computer, causes the computer to execute the steps of the method described in the foregoing embodiments.
  • The image display device provided by the embodiment of the present application may specifically be a chip, and the chip includes a processing unit and a communication unit; the processing unit may be, for example, a processor, and the communication unit may be, for example, an input/output interface, a pin, or a circuit.
  • the processing unit can execute the computer-executed instructions stored in the storage unit, so that the chips in the execution device execute the data processing methods described in the above embodiments, or make the chips in the training device execute the data processing methods described in the above embodiments.
  • the storage unit is a storage unit in the chip, such as a register, a cache, etc.
  • The storage unit may also be a storage unit located outside the chip in the wireless access device, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM), etc.
  • The device embodiments described above are only illustrative; the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • the connection relationship between the modules indicates that they have communication connections, which can be specifically implemented as one or more communication buses or signal lines.
  • The essence of the technical solution of this application, or the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a computer floppy disk, a USB flash drive, a mobile hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, and includes several instructions to make a computer device (which can be a personal computer, a server, or a network device, etc.) execute the method described in each embodiment of the present application.
  • all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • software When implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means.
  • The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (Solid State Disk, SSD)), etc.

Abstract

The present application relates to a virtual ruler display method, comprising: detecting a contact gesture with a touch screen, the contact gesture indicating contact between the side surface of a user's hand and the touch screen, and displaying a virtual ruler on the touch screen. According to the present application, when it is detected that the contact gesture indicates contact between the side surface of the user's hand and the touch screen, the display of the virtual ruler can be triggered. On the one hand, only a single hand of the user is required to operate, so the operation difficulty and cost are very low; on the other hand, when the user's hand is unfolded, the contact area between the side surface of the hand and the touch screen is a flat elongated region close to a straight line, and triggering the display of the virtual ruler based on this operation mode better conforms to the user's operating habits.
PCT/CN2022/097085 2021-06-09 2022-06-06 Procédé d'affichage d'échelle virtuelle et dispositif associé WO2022257870A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110644928.3A CN113515228A (zh) 2021-06-09 2021-06-09 一种虚拟标尺显示方法以及相关设备
CN202110644928.3 2021-06-09

Publications (1)

Publication Number Publication Date
WO2022257870A1 true WO2022257870A1 (fr) 2022-12-15

Family

ID=78065759

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/097085 WO2022257870A1 (fr) 2021-06-09 2022-06-06 Procédé d'affichage d'échelle virtuelle et dispositif associé

Country Status (2)

Country Link
CN (1) CN113515228A (fr)
WO (1) WO2022257870A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113515228A (zh) * 2021-06-09 2021-10-19 华为技术有限公司 一种虚拟标尺显示方法以及相关设备
CN115774513B (zh) * 2022-11-22 2023-07-07 北京元跃科技有限公司 基于尺子确定绘画方向的系统、方法、电子设备及介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102096548A (zh) * 2009-12-11 2011-06-15 达索系统公司 采用触控式显示器复制对象的方法和系统
CN102141887A (zh) * 2010-01-28 2011-08-03 微软公司 画笔、复写和填充手势
CN102169408A (zh) * 2010-02-04 2011-08-31 微软公司 链接手势
CN104732007A (zh) * 2013-12-20 2015-06-24 达索系统公司 具有包括用于复制和操作建模对象的机制的触控式显示器的设备
CN105278818A (zh) * 2014-06-27 2016-01-27 腾讯科技(深圳)有限公司 一种即时通讯软件中的内容复制方法和装置
CN107636593A (zh) * 2015-06-07 2018-01-26 苹果公司 用于提供虚拟绘图辅助工具和与其进行交互的设备、方法和图形用户界面
CN113515228A (zh) * 2021-06-09 2021-10-19 华为技术有限公司 一种虚拟标尺显示方法以及相关设备

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101580570B1 (ko) * 2014-06-10 2015-12-28 주식회사 하이딥 터치 센서 패널의 제어방법 및 제어장치
US10579216B2 (en) * 2016-03-28 2020-03-03 Microsoft Technology Licensing, Llc Applications for multi-touch input detection
JP7103782B2 (ja) * 2017-12-05 2022-07-20 アルプスアルパイン株式会社 入力装置および入力制御装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102096548A (zh) * 2009-12-11 2011-06-15 达索系统公司 采用触控式显示器复制对象的方法和系统
CN102141887A (zh) * 2010-01-28 2011-08-03 微软公司 画笔、复写和填充手势
CN102169408A (zh) * 2010-02-04 2011-08-31 微软公司 链接手势
CN104732007A (zh) * 2013-12-20 2015-06-24 达索系统公司 具有包括用于复制和操作建模对象的机制的触控式显示器的设备
CN105278818A (zh) * 2014-06-27 2016-01-27 腾讯科技(深圳)有限公司 一种即时通讯软件中的内容复制方法和装置
CN107636593A (zh) * 2015-06-07 2018-01-26 苹果公司 用于提供虚拟绘图辅助工具和与其进行交互的设备、方法和图形用户界面
CN113515228A (zh) * 2021-06-09 2021-10-19 华为技术有限公司 一种虚拟标尺显示方法以及相关设备

Also Published As

Publication number Publication date
CN113515228A (zh) 2021-10-19

Similar Documents

Publication Publication Date Title
US9996176B2 (en) Multi-touch uses, gestures, and implementation
US8941600B2 (en) Apparatus for providing touch feedback for user input to a touch sensitive surface
TWI514229B (zh) 圖形編輯方法以及電子裝置
JP5507494B2 (ja) タッチ・スクリーンを備える携帯式電子機器および制御方法
WO2022257870A1 (fr) Procédé d'affichage d'échelle virtuelle et dispositif associé
CN105094654B (zh) 一种屏幕控制方法及装置
US20110216015A1 (en) Apparatus and method for directing operation of a software application via a touch-sensitive surface divided into regions associated with respective functions
US20100229090A1 (en) Systems and Methods for Interacting With Touch Displays Using Single-Touch and Multi-Touch Gestures
US10345912B2 (en) Control method, control device, display device and electronic device
WO2010032268A2 (fr) Système et procédé permettant la commande d’objets graphiques
US20160026375A1 (en) Shadeless touch hand-held electronic device, method and graphical user interface
US20160026309A1 (en) Controller
TWI488082B (zh) 可攜式電子裝置及觸控感測方法
US9727151B2 (en) Avoiding accidental cursor movement when contacting a surface of a trackpad
US9256360B2 (en) Single touch process to achieve dual touch user interface
WO2023030377A1 (fr) Procédé d'affichage d'un contenu d'écriture/de dessin et dispositif associé
EP3008556A1 (fr) Desambiguisation d'une entree indirecte
US11137903B2 (en) Gesture-based transitions between modes for mixed mode digital boards
US20150153925A1 (en) Method for operating gestures and method for calling cursor
US11216121B2 (en) Smart touch pad device
US20160026280A1 (en) Shadeless touch hand-held electronic device and touch cover
US20140085340A1 (en) Method and electronic device for manipulating scale or rotation of graphic on display
US11604578B2 (en) Touch control method and touch control system applying ihe same
US20230409145A1 (en) Touch response method, device, interactive white board, and storage medium
US20160026325A1 (en) Hand-held electronic device, touch-sensing cover and computer-executed method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22819468

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE