WO2023216613A1 - Control method, electronic device and computer storage medium - Google Patents

Control method, electronic device and computer storage medium Download PDF

Info

Publication number
WO2023216613A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
electronic device
target object
boundary
target
Prior art date
Application number
PCT/CN2022/141461
Other languages
French (fr)
Chinese (zh)
Inventor
李雅欣
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2023216613A1 publication Critical patent/WO2023216613A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language

Definitions

  • The present application relates to non-contact touch control technology in electronic equipment, and in particular to a control method, an electronic device and a computer storage medium.
  • Gesture interaction is a new type of interaction method that is constantly developing in applications such as driving and dining.
  • Gesture interaction can realize a variety of controls on a page, such as sliding the page up and down, turning pages, taking pictures, taking screenshots, ending recording, etc.
  • However, when controlling electronic devices through changes in gesture types or changes in other parts of the body, the electronic device can only be controlled to realize its own functions, resulting in insufficient control precision. It can be seen that existing non-contact touch control methods are not refined enough.
  • Embodiments of the present application provide a control method, which is applied to an electronic device and includes: when the non-contact touch control function of the electronic device is turned on, acquiring a video sequence corresponding to a non-contact touch operation; performing touch recognition on the video sequence to obtain a target object and a touch type; determining a first distance between the target object and the electronic device; determining touch operation parameters at the first distance; and, according to the touch operation parameters, controlling the electronic device to perform a target function corresponding to the touch type.
  • An embodiment of the present application provides an electronic device, including:
  • An acquisition module configured to acquire a video sequence corresponding to a non-contact touch operation when the non-contact touch control function of the electronic device is turned on;
  • a processing module configured to perform touch recognition on the video sequence to obtain the target object and touch type;
  • a first determination module configured to determine a first distance between the target object and the electronic device;
  • a second determination module configured to determine touch operation parameters at the first distance; and
  • a control module configured to control the electronic device to perform a target function corresponding to the touch type according to the touch operation parameters.
  • An embodiment of the present application provides an electronic device, including:
  • a processor and a storage medium storing instructions executable by the processor, where the storage medium relies on the processor to perform operations through a communication bus;
  • when the instructions are executed by the processor, the control method of one or more of the above embodiments is performed.
  • Embodiments of the present application provide a computer storage medium that stores executable instructions.
  • When the executable instructions are executed by one or more processors, the processors perform the control method described in one or more embodiments.
  • Figure 1 is a schematic flowchart of an optional control method provided by an embodiment of the present application.
  • Figure 2 is a schematic flow chart of a page control method in related technologies.
  • Figure 3a is a schematic diagram of Example 1 of an optional gesture box provided by the embodiment of the present application.
  • Figure 3b is a schematic diagram of Example 2 of an optional gesture box provided by the embodiment of the present application.
  • Figure 3c is a schematic diagram of Example 3 of an optional gesture box provided by the embodiment of the present application.
  • Figure 3d is a schematic diagram of Example 4 of an optional gesture box provided by the embodiment of the present application.
  • Figure 4 is a schematic diagram of Example 1 of an optional control method provided by the embodiment of the present application.
  • Figure 5 is a schematic diagram of Example 2 of an optional control method provided by the embodiment of the present application.
  • Figure 6a is a schematic diagram of Example 5 of an optional gesture box provided by the embodiment of the present application.
  • Figure 6b is a schematic diagram of Example 6 of an optional gesture box provided by the embodiment of the present application.
  • Figure 6c is a schematic diagram of Example 7 of an optional gesture box provided by the embodiment of the present application.
  • Figure 7a is a schematic layout diagram of Example 1 of an optional screen provided by the embodiment of the present application.
  • Figure 7b is a schematic layout diagram of Example 2 of an optional screen provided by the embodiment of the present application.
  • Figure 7c is a schematic layout diagram of Example 3 of an optional screen provided by the embodiment of the present application.
  • Figure 8 is a schematic structural diagram of an optional electronic device provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of another optional electronic device provided by an embodiment of the present application.
  • Embodiments of the present application provide a control method, which is applied to an electronic device and includes: when the non-contact touch control function of the electronic device is turned on, acquiring a video sequence corresponding to a non-contact touch operation; performing touch recognition on the video sequence to obtain a target object and a touch type; determining a first distance between the target object and the electronic device; determining touch operation parameters at the first distance; and, according to the touch operation parameters, controlling the electronic device to perform a target function corresponding to the touch type.
  • determining the first distance between the target object and the electronic device includes:
  • a first distance between the target object and the electronic device is determined according to the selected boundary information of the target object.
  • determining the first distance between the target object and the electronic device according to the boundary information of the selected target object includes:
  • the first distance between the target object and the electronic device is determined using the relationship between the preset boundary length value and the distance from the target object to the electronic device.
  • determining the target boundary based on the second distance between each boundary of the bounding box and the corresponding edge of the screen includes:
  • the selected boundary is determined as the target boundary.
  • Calculating, based on the length value of the target boundary, the first distance between the target object and the electronic device using the relationship between the preset boundary length value and the distance from the target object to the electronic device includes:
  • when the target boundary is a width boundary, calculating the first distance between the target object and the electronic device according to the width value of the target boundary, using the relationship between the preset boundary width value and the distance from the target object to the electronic device.
  • Calculating, based on the length value of the target boundary, the first distance between the target object and the electronic device using the relationship between the preset boundary length value and the distance from the target object to the electronic device includes:
  • when the target boundary is a height boundary, calculating the first distance between the target object and the electronic device according to the height value of the target boundary, using the relationship between the preset boundary height value and the distance from the target object to the electronic device.
  • determining the first distance between the target object and the electronic device includes:
  • the average value of the distances between the target object and the electronic device in each image frame is determined as the first distance between the target object and the electronic device.
  • Determining the average value of the distances between the target object and the electronic device in each image frame as the first distance between the target object and the electronic device includes:
  • determining the average value of the distances from the target object in each image frame to the electronic device as the first distance between the target object and the electronic device.
  • the method further includes:
  • determining the touch operation parameters at the first distance includes:
  • Touch operation parameters at the first distance are determined according to the sensitivity coefficient at the first distance.
  • determining the touch operation parameters at the first distance based on the sensitivity coefficient at the first distance includes:
  • the touch operation parameters at the first distance are calculated using the sensitivity coefficient at the first distance and the preset touch operation parameters at the standard distance.
  • performing touch recognition on the video sequence to obtain the target object and touch type includes:
  • the touch type is determined based on the confidence value of the touch operation of the target object.
  • determining the touch type based on the confidence value of the touch operation of the target object includes:
  • the type of the touch operation corresponding to the maximum value of the confidence value of the touch operation of the target object is determined as the touch type.
  • touch operation parameters include any of the following:
  • the sliding distance of the sliding operation, the sliding speed of the sliding operation, the page turning speed of the page turning operation, the long pressing time of the long pressing operation, and the click frequency of the clicking operation.
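As a concrete reading of how a sensitivity coefficient at the first distance might combine with the preset parameters at the standard distance, here is a minimal sketch. The coefficient model, parameter names, and numbers are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical sketch: scale preset touch-operation parameters, measured at a
# standard distance, by a distance-dependent sensitivity coefficient.
# All names and numbers here are illustrative assumptions.

STANDARD_PARAMS = {
    "slide_distance_px": 300.0,  # sliding distance of a sliding operation
    "slide_speed_px_s": 600.0,   # sliding speed of a sliding operation
    "page_turn_pps": 2.0,        # page-turning speed (pages per second)
    "long_press_s": 0.8,         # long-press duration
    "click_freq_hz": 2.0,        # click frequency of a clicking operation
}

def sensitivity(first_distance_m, standard_distance_m=0.5):
    # Assumed model: the farther the hand, the larger the effect per gesture.
    return first_distance_m / standard_distance_m

def params_at_distance(first_distance_m):
    k = sensitivity(first_distance_m)
    return {name: value * k for name, value in STANDARD_PARAMS.items()}
```

Under these assumptions, at 1.0 m the coefficient is 2.0, so every preset parameter doubles relative to the 0.5 m standard distance.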
  • controlling the electronic device to perform a target function corresponding to the touch type according to the touch operation parameters includes:
  • controlled objects include any of the following: pages, interfaces and controls.
  • controlling a controlled object on the screen of the electronic device according to the touch operation parameters to perform a target function corresponding to the touch type includes:
  • the page of the screen is controlled to perform the sliding function.
  • embodiments of the present application also provide an electronic device, including:
  • An acquisition module configured to acquire a video sequence corresponding to a non-contact touch operation when the non-contact touch control function of the electronic device is turned on;
  • a processing module configured to perform touch recognition on the video sequence to obtain the target object and touch type;
  • a first determination module configured to determine a first distance between the target object and the electronic device;
  • a second determination module configured to determine touch operation parameters at the first distance; and
  • a control module configured to control the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type.
  • the first determination module is specifically configured to:
  • a first distance between the target object and the electronic device is determined according to the selected boundary information of the target object.
  • The first determination module determining the first distance between the target object and the electronic device according to the boundary information of the selected target object includes:
  • the first distance between the target object and the electronic device is determined using the relationship between the preset boundary length value and the distance from the target object to the electronic device.
  • the first determination module determines the target boundary according to the second distance between each boundary of the bounding box and the corresponding edge of the screen, including:
  • the selected boundary is determined as the target boundary.
  • The first determination module calculating, based on the length value of the target boundary, the first distance between the target object and the electronic device using the relationship between the preset boundary length value and the distance from the target object to the electronic device includes:
  • when the target boundary is a width boundary, calculating the first distance between the target object and the electronic device according to the width value of the target boundary, using the relationship between the preset boundary width value and the distance from the target object to the electronic device.
  • The first determination module calculating, based on the length value of the target boundary, the first distance between the target object and the electronic device using the relationship between the preset boundary length value and the distance from the target object to the electronic device includes:
  • when the target boundary is a height boundary, calculating the first distance between the target object and the electronic device according to the height value of the target boundary, using the relationship between the preset boundary height value and the distance from the target object to the electronic device.
  • the first determining module determines the first distance between the target object and the electronic device, including:
  • the average value of the distances between the target object and the electronic device in each image frame is determined as the first distance between the target object and the electronic device.
  • The first determination module determining the average value of the distances between the target object and the electronic device in each image frame as the first distance between the target object and the electronic device includes:
  • determining the average value of the distances from the target object in each image frame to the electronic device as the first distance between the target object and the electronic device.
  • the electronic device is further configured to:
  • the second determination module is specifically configured to:
  • Touch operation parameters at the first distance are determined according to the sensitivity coefficient at the first distance.
  • the second determination module determines the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance, including:
  • the touch operation parameters at the first distance are calculated using the sensitivity coefficient at the first distance and the preset touch operation parameters at the standard distance.
  • processing module is specifically configured to:
  • the touch type is determined based on the confidence value of the touch operation of the target object.
  • the processing module determines the touch type based on the confidence value of the touch operation of the target object, including:
  • the type of the touch operation corresponding to the maximum value of the confidence value of the touch operation of the target object is determined as the touch type.
  • touch operation parameters include any of the following:
  • the sliding distance of the sliding operation, the sliding speed of the sliding operation, the page turning speed of the page turning operation, the long pressing time of the long pressing operation, and the click frequency of the clicking operation.
  • control module is specifically configured to:
  • controlled objects include any of the following: pages, interfaces and controls.
  • control module controls the controlled object on the screen of the electronic device according to the touch operation parameters to perform a target function corresponding to the touch type, including:
  • the page of the screen is controlled to perform the sliding function.
  • embodiments of the present application also provide an electronic device, including:
  • a processor and a storage medium storing instructions executable by the processor, where the storage medium relies on the processor to perform operations through a communication bus;
  • when the instructions are executed by the processor, the control method of one or more of the above embodiments is performed.
  • embodiments of the present application further provide a computer storage medium that stores executable instructions.
  • When the executable instructions are executed by one or more processors, the processors execute the control method described in one or more of the above embodiments.
  • FIG. 1 is a schematic flowchart of an optional control method provided by the embodiment of the present application. As shown in Figure 1, the control method can include:
  • Figure 2 is a schematic flow chart of a page control method in related technologies. As shown in Figure 2, taking gesture control as an example, the page control method may include:
  • The electronic device turns on the front camera, uses the front camera to capture a picture containing the gesture, continuously detects the gesture in the captured picture, obtains and judges the gesture, obtains the operation corresponding to the gesture (for example, a move-up operation), and finally responds to the gesture operation.
  • the electronic device can only be controlled to realize its own functions, resulting in insufficient control precision.
  • embodiments of the present application provide an optional control method.
  • The non-contact touch control function of the electronic device is turned on; that is to say, the non-contact touch control function of the electronic device is in an enabled state.
  • the front camera of the electronic device is turned on to capture the front of the screen of the electronic device.
  • A video sequence corresponding to the non-contact touch operation is obtained, wherein the video sequence includes multiple consecutive image frames.
  • Touch operations may include slide-up operations, slide-down operations, long-press operations, forward page-turning operations, backward page-turning operations, and click operations.
  • the embodiments of the present application do not specifically limit this.
  • S102: Perform touch recognition on the video sequence to obtain the target object and the touch type.
  • the above-mentioned target objects may be body parts of the human body, such as hands, heads, eyes, etc., which are not specifically limited in the embodiments of the present application.
  • The user can use gestures to control the electronic device to perform the target function corresponding to the touch type of the gesture; can use changes of the head, for example shaking the head left and right, to control the electronic device to perform the target function corresponding to the touch type of that head movement; and can also use changes of the eyes, for example a blinking action, to control the electronic device to perform the target function corresponding to the touch type of the blink. The embodiment of the present application does not specifically limit this.
  • the touch type can include sliding type, page turning type, click type, and long press type.
  • the upward sliding operation and the downward sliding operation belong to the sliding type
  • the long pressing operation belongs to the long pressing type
  • the forward page-turning operation and the backward page-turning operation belong to the page-turning type
  • the clicking operation belongs to the clicking type.
  • the embodiment of the present application does not specifically limit this.
  • S102 may include:
  • the touch type is determined based on the confidence value of the target object's touch operation.
  • Through touch recognition, the target object, the bounding box of the target object, and the confidence value of the touch operation of the target object can be obtained, where the bounding box represents the boundary information of the target object; the boundary information of the target object refers to the outline information of the target object, and the outline information may include the shape of the outline, the area of the outline, and the size information of the outline lines;
  • the above-mentioned bounding box is a rectangular box that wraps the above-mentioned target object, and the bounding box includes the height and width of the border.
  • Through touch recognition of the video sequence, not only the target object but also the boundary information of the target object can be obtained.
  • the gesture category of the target object can be matched to obtain the touch operation of the target object.
  • During touch recognition, the touch operations of one or more target objects can be matched, and the confidence value of the touch operation of the target object, that is, the credibility value of the touch operation of the target object, can also be obtained; in order to determine the touch type, the touch type can be determined based on the confidence value of the touch operation of the target object.
  • determining the touch type based on the confidence value of the target object's touch operation includes:
  • the type of the touch operation corresponding to the maximum value of the confidence value of the touch operation of the target object is determined as the touch type.
  • the confidence value of each touch operation of the target object can reflect the credibility of each touch operation.
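The "maximum confidence" rule described above can be sketched in a few lines; the operation names and scores below are made up for illustration.

```python
# Pick the touch operation whose confidence value is largest as the touch type.
def classify_touch(confidences):
    return max(confidences, key=confidences.get)

# Hypothetical confidence values produced by touch recognition for one frame:
scores = {"slide_up": 0.12, "slide_down": 0.71, "long_press": 0.09, "click": 0.08}
```

Here `classify_touch(scores)` returns `"slide_down"`, since 0.71 is the maximum confidence value among the matched touch operations.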
  • The first distance may be the distance between the target object and the screen of the electronic device.
  • the first distance may also be the first distance between the target object and the backplane of the electronic device, or it may be the first distance between the target object and the image sensor of the electronic device.
  • The embodiment of the present application does not specifically limit this.
  • each image frame in the video sequence acquired by the front camera also corresponds to a depth image
  • the first distance between the target object and the electronic device can be determined based on the depth image
  • Alternatively, the first distance between the target object and the electronic device can be determined based on the boundary information of the target object.
  • the embodiment of the present application does not specifically limit this.
  • In order to determine the first distance between the target object and the electronic device through the boundary information of the target object: since each image frame in the video sequence corresponds to one piece of boundary information, the first distance can be determined according to the boundary information of the target object in each image frame;
  • alternatively, the first distance between the target object and the electronic device can be determined based on the boundary information of the target object in one of the image frames. This is not specifically limited in the embodiment of the present application.
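Where the per-frame route is taken, the claims describe averaging the per-frame distances into a single first distance; a minimal sketch:

```python
# Average the per-frame target-to-device distances into one first distance.
def first_distance(per_frame_distances):
    return sum(per_frame_distances) / len(per_frame_distances)
```

For example, per-frame distances of 0.4 m, 0.5 m and 0.6 m across three image frames average to a first distance of 0.5 m.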
  • S103 may include:
  • a first distance between the target object and the electronic device is determined.
  • The boundary information of one target object is selected from the boundary information of the target object in each image frame, and the first distance between the target object and the electronic device is determined based on the selected boundary information.
  • determining the first distance between the target object and the electronic device according to the boundary information of the selected target object includes:
  • the first distance between the target object and the electronic device is determined using the relationship between the preset boundary length value and the distance between the target object and the electronic device.
  • the boundary information is represented by a bounding box
  • A second distance between each boundary of the bounding box and the corresponding edge of the screen is determined.
  • By judging the second distance from each boundary to the corresponding edge of the screen, it can be determined whether the bounding box encloses the complete target object, that is, whether the target object is cut off by the screen.
  • The target boundary is a boundary whose width value or height value remains usable when the target object is not truncated or is only partially truncated; in this way, the length value of the selected target boundary can be used to determine the first distance between the target object and the electronic device.
  • the target boundary when the boundary is screened out, the target boundary can be used to determine the first distance between the target object and the electronic device, and then the touch operation parameters at the first distance can be determined.
  • Determining the target boundary according to the second distance between each boundary of the bounding box and the corresponding edge of the screen of the electronic device includes:
  • The second distance between each boundary and the corresponding edge of the screen is compared with a preset threshold. If all of the second distances are greater than the preset threshold, the target object in the bounding box is not cut off by the screen, so any one boundary can be selected from the boundaries as the target boundary. If exactly one boundary has a second distance to the corresponding edge of the screen that is less than or equal to the preset threshold, the target object in the bounding box is cut off by the screen; although the target object is cut off, the length value of that boundary can still reflect the size of the target object. Therefore, the boundary whose second distance to the corresponding edge of the screen is less than or equal to the preset threshold is selected as the target boundary, and the length value of the target boundary is used to determine the first distance between the target object and the electronic device.
  • If the second distances of two or more boundaries are less than or equal to the preset threshold, the target object in the bounding box is cut off by the screen in both directions, and neither the height value nor the width value of the bounding box reflects the size of the target object, so the height value and/or width value of the bounding box cannot be used to calculate the first distance between the target object and the electronic device. At this time, prompt information can be generated to prompt that the target object does not fall within the control range of the electronic device.
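The screening rule above can be sketched as follows. The box coordinates, screen size, and threshold value are illustrative assumptions; here a "width boundary" is a horizontal top/bottom edge (whose length is the box width) and a "height boundary" is a vertical left/right edge (whose length is the box height).

```python
def select_target_boundary(box, screen, threshold=10):
    """Return 'any', 'width', 'height', or None (object out of control range)."""
    left, top, right, bottom = box
    screen_w, screen_h = screen
    # Second distance from each boundary of the bounding box to the
    # corresponding edge of the screen.
    gaps = {
        "left": left,                 # height boundary (vertical side)
        "right": screen_w - right,    # height boundary (vertical side)
        "top": top,                   # width boundary (horizontal side)
        "bottom": screen_h - bottom,  # width boundary (horizontal side)
    }
    near = [name for name, gap in gaps.items() if gap <= threshold]
    if not near:
        return "any"      # not cut off: any boundary can be the target boundary
    if len(near) == 1:
        # Cut off in one direction only: the near boundary's own length
        # still reflects the size of the target object.
        return "width" if near[0] in ("top", "bottom") else "height"
    return None           # cut off in two directions: generate prompt information
```

For a 720x1280 screen, a box well inside the screen yields `"any"` (Figure 3a's case); a box touching only the bottom edge yields `"width"`; a box touching both a side and the bottom yields `None`, the situation where the prompt information is generated.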
  • Figure 3a is a schematic diagram of an optional gesture frame example 1 provided by the embodiment of the present application, as shown in Figure 3a.
  • The second distance between each boundary of the gesture box and the corresponding edge of the screen is greater than the preset threshold; therefore, the gesture is not cut off by the screen, and any boundary of the gesture box can be selected as the target boundary to determine the first distance between the hand and the electronic device.
  • Figure 3b is a schematic diagram of Example 2 of an optional gesture box provided by the embodiment of the present application.
  • As shown in Figure 3b, among the second distances between each boundary of the gesture box and the corresponding edge of the screen, only one width boundary has a second distance from the corresponding edge of the screen that is less than or equal to the preset threshold; therefore, the hand is cut off by the screen, but that width boundary of the gesture box can be selected as the target boundary to determine the first distance between the hand and the electronic device.
  • Figure 3c is a schematic diagram of Example 3 of an optional gesture frame provided by the embodiment of the present application. As shown in Figure 3c, among the second distances between each boundary of the gesture frame and the corresponding edge of the screen, only one height boundary has a second distance from the corresponding edge of the screen that is less than or equal to the preset threshold; therefore, that height boundary can be selected as the target boundary.
  • FIG. 3d is a schematic diagram of Example 4 of an optional gesture box provided by the embodiment of the present application.
  • Among the second distances between each boundary of the gesture box and the corresponding edge of the screen, two boundaries, namely a height boundary and a width boundary, have second distances from the corresponding edges of the screen that are less than or equal to the preset threshold; therefore, the hand is cut off by the screen. In this case, no boundary of the gesture box can be selected as the target boundary for determining the first distance between the hand and the electronic device, and prompt information can be generated for prompting that the target object does not fall within the control range of the electronic device.
  • the first distance between the target object and the electronic device is calculated using the relationship between the preset boundary length value and the distance between the target object and the electronic device.
  • the first distance between the target object and the electronic device is calculated using the relationship between the preset boundary width value and the distance between the target object and the electronic device.
  • The relationship between the boundary width value and the distance between the target object and the electronic device is pre-stored in the electronic device; then, after the width value of the target boundary is known, the first distance between the target object and the electronic device can be calculated according to the width value of the target boundary, using the preset relationship between the boundary width value and the distance between the target object and the electronic device.
  • There is a certain relationship between the boundary width value and the distance between the target object and the electronic device; using this relationship, the first distance between the target object and the electronic device can be calculated. Taking the target object as the hand and the bounding box as the gesture box as an example, the relationship between the boundary width value and the distance from the target object to the electronic device can be determined in the following way:
  • the current gesture frame W can be calculated as:
  • the relationship between the boundary width value and the distance from the target object to the electronic device can be:
  • the above-mentioned W1 and W2 are both width values of the gesture frame recorded when the hand is not cut off by the screen.
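The width-to-distance relation described above can be sketched as follows. The patent's formula (1) is not reproduced in this excerpt, so a linear calibration between the two recorded points (D1, W1) and (D2, W2) is assumed here purely for illustration; the function names and numeric values are hypothetical:

```python
def calibrate_width_to_distance(w1: float, d1: float, w2: float, d2: float):
    """Return a function mapping gesture-frame width W to distance D.

    Assumes (as a stand-in for the patent's elided formula) that width
    varies linearly with distance between the two calibration points
    (D1, W1) and (D2, W2), both recorded with the hand fully on screen.
    """
    slope = (w2 - w1) / (d2 - d1)  # change in frame width per unit distance

    def distance_from_width(w: float) -> float:
        # Invert W = W1 + slope * (D - D1) to recover D from an observed width.
        return d1 + (w - w1) / slope

    return distance_from_width

# Hypothetical calibration: 120 px wide at 20 cm, 60 px wide at 40 cm.
to_distance = calibrate_width_to_distance(w1=120, d1=20, w2=60, d2=40)
print(to_distance(90))  # halfway between the calibration widths -> 30.0
```

The same construction applies to the height values H1 and H2 in the next passage.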
  • the first distance between the target object and the electronic device is calculated using the relationship between the preset boundary length value and the distance between the target object and the electronic device.
  • the first distance between the target object and the electronic device is calculated according to the length value of the target boundary and the relationship between the preset boundary height value and the distance between the target object and the electronic device.
  • the relationship between the boundary height value and the distance between the target object and the electronic device is pre-stored in the electronic device. Then, once the height value of the target boundary is known, the first distance between the target object and the electronic device can be calculated from that height value using the preset relationship between the boundary height value and the distance between the target object and the electronic device.
  • there is a certain relationship between the boundary height value and the distance between the target object and the electronic device. Using this relationship, the first distance between the target object and the electronic device can be calculated. Taking the target object as the hand and the bounding box as the gesture frame as an example, the relationship between the boundary height value and the distance from the target object to the electronic device can be determined in the following way:
  • when the distance between the user's hand and the electronic device is recorded as D1, the height of the gesture frame is H1; when the distance between the user's hand and the electronic device is recorded as D2, the height of the gesture frame is H2; therefore, when the distance is D, the current gesture frame height H can be calculated as:
  • the relationship between the boundary height value and the distance from the target object to the electronic device can be:
  • H1 and H2 are both height values of the gesture frame recorded when the hand is not cut off by the screen.
  • the distance between the target object and the electronic device can also be determined based on the boundary information of the target object in each image frame.
  • the above method further includes:
  • the average value of the distances between the target object and the electronic device in each image frame is determined as the first distance between the target object and the electronic device.
  • the distance between the target object and the electronic device in each image frame is determined using the boundary information of the target object in that frame. Specifically, this is implemented in the same manner as determining the first distance between the target object and the electronic device based on the selected boundary information of the target object, which will not be described again here.
  • an average algorithm can be used to calculate the average value of the distances between the target object and the electronic device in each image frame, and the average value is determined as the first distance between the target object and the electronic device.
  • determining the average value of the distances between the target object and the electronic device in each image frame as the first distance between the target object and the electronic device includes:
  • when the absolute value of the difference between any two of the distances between the target object and the electronic device in each image frame is less than or equal to a preset error threshold, determining the average value of those distances as the first distance between the target object and the electronic device.
  • before the average value is used, it is first determined whether the absolute value of the difference between any two of the distances between the target object and the electronic device in each image frame is less than or equal to the preset error threshold. If so, the distance between the target object and the electronic device only jitters within the preset error threshold and does not change much, so the average value can be used to determine the first distance between the target object and the electronic device.
  • the above method further includes:
  • when the absolute value of the difference between any two of the distances between the target object and the electronic device in each image frame is greater than the preset error threshold, the electronic device is controlled according to the preset touch operation parameters at the standard distance to perform the target function corresponding to the touch type.
  • the electronic device can be controlled directly according to the preset touch operation parameters at the standard distance to perform the target function corresponding to the touch type.
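The frame-averaging logic with the standard-distance fallback described above can be sketched as follows; all function and parameter names are illustrative, not taken from the patent:

```python
def fuse_frame_distances(distances, error_threshold, standard_distance):
    """Fuse per-frame hand-to-screen distances into one first distance.

    If every pair of per-frame distances differs by at most `error_threshold`
    (i.e. the distance only jitters), return their average; otherwise fall
    back to the preset standard distance, as the passage describes.
    """
    jitter_only = all(
        abs(a - b) <= error_threshold
        for i, a in enumerate(distances)
        for b in distances[i + 1:]
    )
    if jitter_only:
        return sum(distances) / len(distances)
    return standard_distance

# Jitter within the threshold: the average of the frames is used.
print(fuse_frame_distances([24.8, 25.1, 25.0], 1.0, 25.0))
# Large inter-frame change: falls back to the standard distance -> 25.0
print(fuse_frame_distances([20.0, 32.0, 25.0], 1.0, 25.0))
```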
  • the touch operation parameters at the first distance can be determined. For example, for each touch operation or touch type, the corresponding relationship between distance and touch operation parameters is stored in the electronic device; this corresponding relationship can be used to determine the touch operation parameters at the first distance, or a preset parameter calculation formula can be used to calculate the touch operation parameters at the first distance. The embodiments of this application do not specifically limit this.
  • the touch operation parameters include any of the following: sliding distance of the sliding operation, sliding speed of the sliding operation, page turning speed of the page turning operation, long press time of the long press operation, and click frequency of the click operation.
  • the touch operation of the target object can be a sliding operation, a long press operation or a click operation, etc.
  • the operating parameters for the sliding operation can include the sliding distance and sliding speed
  • the operating parameters for the long pressing operation can include the long pressing time.
  • the operating parameters of the click operation may include click frequency, which is not specifically limited in this embodiment of the present application.
  • S104 may include:
  • the touch operation parameters at the first distance are determined.
  • the corresponding relationship between the first distance and the sensitivity coefficient is pre-stored in the electronic device. Then, based on this corresponding relationship, the sensitivity coefficient at the first distance can be determined, and the touch operation parameters at the first distance can then be determined according to that sensitivity coefficient.
  • determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance includes:
  • the touch operation parameters at the first distance are calculated.
  • the touch operation parameters corresponding to each touch operation at the standard distance are stored in the electronic device. Therefore, when determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance, exemplarily, the product of the sensitivity coefficient at the first distance and the preset touch operation parameter at the standard distance is determined as the touch operation parameter at the first distance.
  • the sensitivity coefficient at the first distance is used to determine the touch operation parameters at the first distance, so that for target objects at different distances from the electronic device, the touch operation parameters corresponding to the same touch operation differ. This expands the capability of non-contact touch control, allows the user's non-contact touch operations to control the electronic device more precisely, and improves the user experience.
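The product rule above can be sketched as follows; the distance-to-sensitivity correspondence table and all numeric values are hypothetical stand-ins for the pre-stored relationship the text mentions:

```python
# Hypothetical correspondence: distance band (cm) -> sensitivity coefficient.
SENSITIVITY_TABLE = [
    (10, 20, 1.5),   # near band: more sensitive
    (20, 30, 1.0),   # standard band
    (30, 60, 0.6),   # far band: less sensitive
]

def sensitivity_for(distance_cm: float) -> float:
    """Look up the sensitivity coefficient at the first distance."""
    for lo, hi, k in SENSITIVITY_TABLE:
        if lo <= distance_cm < hi:
            return k
    return 1.0  # outside the calibrated range: behave as at the standard distance

def touch_parameter(distance_cm: float, standard_param: float) -> float:
    """Product of the sensitivity coefficient at the first distance and the
    preset touch operation parameter at the standard distance."""
    return sensitivity_for(distance_cm) * standard_param

print(touch_parameter(15, 200.0))  # 1.5 * 200 -> 300.0
print(touch_parameter(45, 200.0))  # 0.6 * 200 -> 120.0
```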
  • S105: According to the touch operation parameters, control the electronic device to perform the target function corresponding to the touch type.
  • after determining the touch operation parameters, the electronic device performs the target function corresponding to the touch type according to the determined touch operation parameters. The electronic device can control the page on the screen according to the touch operation parameters to perform the target function corresponding to the touch type; the electronic device can also perform data processing according to the touch operation parameters, such as an image processing function. The embodiments of this application do not specifically limit this.
  • S105 may include:
  • the controlled objects include any of the following: pages, interfaces, and controls; that is to say, the above control method can control not only a page on the screen, but also an interface or a specific control on the screen, which is not specifically limited in the embodiments of this application.
  • the controlled object on the screen of the electronic device is controlled according to the touch operation parameters to perform the target function corresponding to the touch type, including:
  • when the user slides the page with his palm, the electronic device responds to the sliding action of the palm, determines that the operation corresponding to the sliding action is a sliding operation, determines the distance between the palm and the screen, and then determines the sliding distance of the sliding operation at this distance. The electronic device then controls the page to perform the sliding function corresponding to the sliding operation according to that sliding distance. In this way, when palms at different distances from the screen make sliding actions, the same sliding action corresponds to different sliding distances, which makes the user's control of the page more refined and improves the user experience.
  • FIG. 4 is a schematic diagram of Example 1 of an optional control method provided by the embodiment of the present application. As shown in Figure 4, this control method can include:
  • the gesture interaction application will turn on the front camera and capture gesture images through the front camera.
  • Figure 5 is a schematic diagram of Example 2 of an optional control method provided by the embodiment of the present application, as shown in Figure 5.
  • the gesture detection method may include:
  • the detection of the picture mainly includes operations such as model inference, model post-processing, and non-maximum suppression, so that the gestures in a picture can be detected. Each detection result includes a gesture box, a gesture category (the category corresponding to the hand posture change, which corresponds to the touch type) and a gesture confidence (the confidence of the touch type corresponding to the gesture category, for example, 0.9). If there are detection results for multiple gesture categories in the picture, the gesture category with the highest confidence is used as the final detection result.
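The selection of the final detection result by highest confidence can be sketched as follows; the detection-record fields (box, category, confidence) are illustrative:

```python
def pick_gesture(detections):
    """Pick the final detection result: the gesture category with the
    highest confidence, as described above. Each detection carries a
    gesture box, a category (which maps to a touch type) and a confidence.
    """
    if not detections:
        return None  # no gesture found in this picture
    return max(detections, key=lambda d: d["confidence"])

detections = [
    {"box": (40, 60, 160, 220), "category": "swipe_up", "confidence": 0.90},
    {"box": (42, 58, 158, 218), "category": "palm_press", "confidence": 0.35},
]
print(pick_gesture(detections)["category"])  # -> swipe_up
```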
  • the distance between the user's hand and the mobile phone screen is calculated, which is the distance between the hand and the screen.
  • the distance between the hand and the screen can be calculated in the following way:
  • Figure 6a is a schematic diagram of Example 5 of an optional gesture frame provided by the embodiment of the present application. As shown in Figure 6a, when the distance between the user's hand and the screen is D1, the width W1 and height H1 of the gesture frame are recorded;
  • Figure 6b is a schematic diagram of Example 6 of an optional gesture frame provided by the embodiment of the present application. As shown in Figure 6b, when the distance between the user's hand and the screen is D2, the width W2 and height H2 of the gesture frame are recorded;
  • Figure 6c is a schematic diagram of Example 7 of an optional gesture frame provided by the embodiment of the present application. As shown in Figure 6c, when the distance is D, the calculation method of the current gesture frames W and H is formula (1) and formula (3).
  • the width or height of the gesture box can be used.
  • when the hand is at the edge of the screen, the hand may be cut off by the screen. In this case, the distance from each of the four sides of the gesture box to the corresponding edge of the screen can be calculated to determine whether the width or height of the gesture frame is available. When the hand is cut off, an available border needs to be selected to calculate the distance between the hand and the screen; if neither the width nor the height is available, the distance between the hand and the screen cannot be calculated.
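The edge-cutoff check described above can be sketched as follows; the coordinate convention (top-left origin, box as left/top/right/bottom in pixels) and the threshold value are assumptions for illustration:

```python
def usable_boundaries(box, screen_w, screen_h, edge_threshold):
    """Decide which gesture-box dimensions are usable for distance estimation.

    `box` is (left, top, right, bottom) in screen pixels. A dimension is
    treated as unusable when either of its boundaries lies within
    `edge_threshold` of the corresponding screen edge, meaning the hand
    may be cut off on that side.
    """
    left, top, right, bottom = box
    width_ok = left > edge_threshold and (screen_w - right) > edge_threshold
    height_ok = top > edge_threshold and (screen_h - bottom) > edge_threshold
    return {"width": width_ok, "height": height_ok}

# Hand near the right edge: the width is cut off, the height is still usable.
print(usable_boundaries((500, 300, 719, 700), 720, 1600, 10))
```

If both entries come back `False`, the distance cannot be calculated and, as the earlier passage notes, the device can instead prompt that the hand is outside the control range.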
  • the sensitivity coefficient of the touch type corresponding to the gesture is adjusted according to the distance between the user's hand and the mobile phone screen.
  • the sliding distance of the page is calculated based on the sensitivity coefficient of the touch type corresponding to the current gesture.
  • the sensitivity coefficient of the touch type corresponding to the gesture can be adjusted. Assume that when the distance between the hand and the screen is 25 cm, the sensitivity coefficient is 1; then, for a distance D (10 cm ≤ D ≤ 60 cm), the updated sensitivity coefficient K of the touch type corresponding to the gesture is:
  • when the distance between the hand and the screen is less than 25 cm, the sensitivity coefficient of the touch type corresponding to the gesture is increased; when the distance between the hand and the screen is greater than 25 cm, the sensitivity coefficient of the touch type corresponding to the gesture is reduced.
  • assume the sliding distance of the mobile phone page at the standard distance is M0.
  • the sliding distance M of the mobile phone page is:
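The formulas for K and M are elided in this excerpt. The sketch below assumes an inverse-proportional form K = 25 / D, which is consistent with the stated behaviour (K = 1 at 25 cm, higher when the hand is closer, lower when it is farther), together with M = K · M0; both formulas are assumptions, not the patent's actual equations:

```python
def sensitivity(distance_cm: float) -> float:
    """Sensitivity coefficient K for hand-to-screen distance D in cm.

    K = 25 / D is an assumed form: it gives K = 1 at the 25 cm reference
    and rises or falls as the hand nears or recedes, matching the
    behaviour described in the text.
    """
    if not (10 <= distance_cm <= 60):
        raise ValueError("distance outside the supported 10-60 cm range")
    return 25.0 / distance_cm

def sliding_distance(distance_cm: float, m0: float) -> float:
    """Sliding distance M = K * M0, where M0 is the standard sliding distance."""
    return sensitivity(distance_cm) * m0

print(sliding_distance(25, 100.0))  # K = 1 at the reference distance -> 100.0
print(sliding_distance(50, 100.0))  # K = 0.5 farther away -> 50.0
```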
  • Figure 7a is a schematic layout diagram of Example 1 of an optional screen provided by the embodiment of the present application.
  • a communication record list is displayed on the screen.
  • the user can swing the four fingers other than the thumb up and down to control the electronic device to slide the screen page;
  • Figure 7b is a schematic layout diagram of Example 2 of an optional screen provided by the embodiment of the present application.
  • as shown in Figure 7b, when the distance between the user's hand and the screen is 25cm, each time the four fingers swing up and down, the page slides by the distance of one communication record. Therefore, the first communication record displayed on the page is "4-17 Sister Access";
  • Figure 7c is a schematic layout diagram of Example 3 of an optional screen provided by the embodiment of the present application. As shown in Figure 7c,
  • when the distance between the user's hand and the screen is 26cm, each time the four fingers swing up and down, the page slides by the distance of two lines of communication records, so the first communication record displayed on the page is "4-16 Johnnie called out". In this way, the distance between the hand and the screen is used to determine the sliding distance of the page corresponding to each up-and-down swing of the four fingers, and different distances correspond to different sliding distances.
  • in this example, the sliding distance is intelligently adjusted using the gesture sensitivity coefficient obtained from gesture detection.
  • the current distance between the hand and the screen can be calculated through the gesture detection results, and the current gesture sensitivity coefficient is adjusted according to the distance between the hand and the screen, and acts on the user's operation of the mobile phone.
  • for the sliding distance control of the page, when the user's hand is farther from the screen, a lower sensitivity coefficient is obtained, and when the user's hand is closer to the screen, a higher sensitivity coefficient is obtained.
  • this example can be used not only to adjust the sensitivity coefficient of the sliding distance in gesture interaction, but also to adjust the sliding speed of the sliding operation, the click frequency of the click operation, the page turning speed of the page turning operation, the long press time of the long press operation, the response time of the return-to-previous-level operation, the movement distance that controls character movement in a game, and so on. It can also be applied to control the sensitivity coefficient of new interaction methods such as face recognition, body recognition, and human eye gaze, providing a more comfortable user experience for the above scenarios.
  • Embodiments of the present application provide a control method, which includes: when the non-contact touch control function of the electronic device is turned on, obtaining a video sequence corresponding to the non-contact touch operation; performing touch recognition on the video sequence to obtain the target object and touch type; determining a first distance between the target object and the electronic device; determining touch operation parameters at the first distance; and controlling the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type. That is to say, in the embodiments of the present application, in implementing the non-contact touch control function of the electronic device, the first distance between the target object and the electronic device is determined, and then the touch operation parameters at the first distance are determined, so that the electronic device can respond to the touch type according to the distance between the target object and the electronic device. For the same touch type, the touch operation parameters at different distances differ, making non-contact touch control more refined.
  • FIG. 8 is a schematic structural diagram of an optional electronic device provided by an embodiment of the present application. As shown in Figure 8, the electronic device includes:
  • the acquisition module 81 is configured to acquire the video sequence corresponding to the non-contact touch operation when the non-contact touch control function of the electronic device is turned on;
  • the processing module 82 is configured to perform touch recognition on the video sequence to obtain the target object and touch type;
  • the first determination module 83 is configured to determine the first distance between the target object and the electronic device
  • the second determination module 84 is configured to determine the touch operation parameters at the first distance
  • the control module 85 is configured to control the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type.
  • the first determination module 83 is specifically configured to:
  • a first distance between the target object and the screen of the electronic device is determined according to the selected boundary information of the target object.
  • the first determination module 83 determines the first distance between the target object and the electronic device according to the boundary information of the selected target object, including:
  • the first distance between the target object and the electronic device is determined using the relationship between the preset boundary length value and the distance between the target object and the electronic device.
  • the first determination module 83 determines the target boundary according to the second distance between each boundary of the bounding box and the corresponding edge of the screen, including:
  • the first determination module 83 calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the relationship between the preset boundary length value and the distance between the target object and the electronic device, includes:
  • the first distance between the target object and the electronic device is calculated based on the length value of the target boundary and the relationship between the preset boundary width value and the distance between the target object and the electronic device.
  • the first determination module 83 calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the relationship between the preset boundary length value and the distance between the target object and the electronic device, includes:
  • the first distance between the target object and the electronic device is calculated according to the length value of the target boundary and the relationship between the preset boundary height value and the distance between the target object and the electronic device.
  • the first determination module 83 is specifically configured to:
  • the average value of the distances between the target object and the electronic device in each image frame is determined as the first distance between the target object and the electronic device.
  • the first determination module 83 determines the average value of the distance between the target object and the electronic device in each image frame as the first distance between the target object and the electronic device, include:
  • when the absolute value of the difference between any two of the distances between the target object and the electronic device in each image frame is less than or equal to the preset error threshold, determining the average value of those distances as the first distance between the target object and the electronic device.
  • the electronic device is further configured to:
  • when the absolute value of the difference between any two of the distances between the target object and the electronic device in each image frame is greater than the preset error threshold, the electronic device is controlled according to the preset touch operation parameters at the standard distance to perform the target function corresponding to the touch type.
  • the second determination module 84 is specifically configured to:
  • the touch operation parameters at the first distance are determined.
  • the second determination module 84 determines the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance, including:
  • the touch operation parameters at the first distance are calculated.
  • the processing module 82 performs touch recognition on the video sequence to obtain the target object and touch type, including:
  • the touch type is determined based on the confidence value of the target object's touch operation.
  • the processing module 82 determines the touch type based on the confidence value of the target object's touch operation, including:
  • the type of the touch operation corresponding to the maximum value of the confidence value of the touch operation of the target object is determined as the touch type.
  • the touch operation parameters include any of the following: sliding distance of the sliding operation, sliding speed of the sliding operation, page turning speed of the page turning operation, long pressing time of the long pressing operation and click operation. frequency of clicks.
  • control module 85 is specifically configured to:
  • controlled objects on the screen of the electronic device are controlled to perform target functions corresponding to the touch type; wherein the controlled objects include any of the following: pages, interfaces, and controls.
  • control module 85 controls the controlled object on the screen of the electronic device according to the touch operation parameters to perform target functions corresponding to the touch type, including:
  • the above-mentioned acquisition module 81, processing module 82, first determination module 83, second determination module 84 and control module 85 can be implemented by a processor located on the electronic device, specifically by a central processing unit (CPU), a microprocessor (MPU), a digital signal processor (DSP), a field programmable gate array (FPGA), or the like.
  • FIG. 9 is a schematic structural diagram of another optional electronic device provided by an embodiment of the present application. As shown in Figure 9, an embodiment of the present application provides an electronic device 900, including:
  • a processor 91 and a storage medium 92 storing instructions executable by the processor 91, where the storage medium 92 relies on the processor 91 to perform operations through a communication bus 93; when the instructions are executed by the processor 91, the control method in one or more of the above embodiments is performed.
  • the communication bus 93 is used to implement connection communication between these components.
  • the communication bus 93 also includes a power bus, a control bus and a status signal bus; however, for the sake of clarity, the various buses are all labeled as communication bus 93 in FIG. 9.
  • Embodiments of the present application provide a computer storage medium that stores executable instructions; when the executable instructions are executed by a processor, the electronic device is caused to execute the control method in one or more of the above embodiments.
  • the computer-readable storage medium can be a ferroelectric random access memory (FRAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a flash memory, a magnetic surface memory, an optical disk, a compact disc read-only memory (CD-ROM), or other memories.
  • embodiments of the present application may be provided as methods, systems, or computer program products. Accordingly, the present application may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, magnetic disk storage and optical storage, etc.) embodying computer-usable program code therein.
  • These computer program instructions may also be stored in a computer-readable memory that causes a computer or other programmable data processing apparatus to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means, and the instruction means implement the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions may also be loaded onto a computer or other programmable data processing device, causing a series of operating steps to be performed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • the control method, electronic device and computer storage medium provided in the embodiments of the present application are applied to electronic devices and include: when the non-contact touch control function of the electronic device is on, obtaining the non-contact touch operation Corresponding video sequence, perform touch recognition on the video sequence, obtain the target object and touch type, determine the first distance between the target object and the electronic device, determine the touch operation parameters at the first distance, and follow the touch operation parameters to control the electronic device to perform the target function corresponding to the touch type.
  • for the same touch type, the touch operation parameters used for control at different distances differ, making non-contact touch control more refined.


Abstract

Embodiments of the present application disclose a control method which is applied to an electronic device. The method comprises: when the non-contact touch control function of the electronic device is enabled, obtaining a video sequence corresponding to the non-contact touch operation; performing touch recognition on the video sequence to obtain a target object and a touch type; determining a first distance between the target object and the electronic device; determining a touch operation parameter under the first distance; and controlling the electronic device to execute a target function corresponding to the touch type according to the touch operation parameter. The embodiments of the present application further provide an electronic device and a computer storage medium.

Description

A control method, electronic device and computer storage medium
Cross-reference to related applications
This application is filed based on a Chinese patent application with application number 202210507343.1 and a filing date of May 10, 2022, and claims the priority of that Chinese patent application, the entire content of which is hereby incorporated by reference into this application.
Technical field
The present application relates to non-contact touch control technology in electronic devices, and in particular to a control method, an electronic device and a computer storage medium.
Background
At present, with the rapid development of smart phones in recent years, people use mobile phones at more times and in more scenarios. The mainstream human-computer interaction method is mainly touch operation. On this basis, gesture interaction, as a new type of interaction method, is continually developing in application scenarios such as driving and dining. Gesture interaction can realize a variety of controls on a page, for example, sliding the page up and down, turning pages, taking pictures, taking screenshots, ending recording, and so on.
However, when an electronic device is controlled through changes in gesture type or changes in other parts of the body, the electronic device can only be controlled to realize functions it already has, so the control is not refined enough. It can be seen that existing non-contact touch control methods are not sufficiently refined.
Summary

The technical solution of the present application is implemented as follows:

An embodiment of the present application provides a control method, applied to an electronic device, including:

when the non-contact touch control function of the electronic device is in an on state, acquiring a video sequence corresponding to a non-contact touch operation;

performing touch recognition on the video sequence to obtain a target object and a touch type;

determining a first distance between the target object and the electronic device;

determining a touch operation parameter at the first distance; and

controlling the electronic device according to the touch operation parameter to perform a target function corresponding to the touch type.
An embodiment of the present application provides an electronic device, including:

an acquisition module, configured to acquire a video sequence corresponding to a non-contact touch operation when the non-contact touch control function of the electronic device is in an on state;

a processing module, configured to perform touch recognition on the video sequence to obtain a target object and a touch type;

a first determination module, configured to determine a first distance between the target object and the electronic device;

a second determination module, configured to determine a touch operation parameter at the first distance; and

a control module, configured to control the electronic device according to the touch operation parameter to perform a target function corresponding to the touch type.

An embodiment of the present application provides an electronic device, including:

a processor and a storage medium storing instructions executable by the processor, the storage medium relying on the processor to perform operations through a communication bus, wherein when the instructions are executed by the processor, the control method described in one or more of the above embodiments is performed.

An embodiment of the present application provides a computer storage medium storing executable instructions, wherein when the executable instructions are executed by one or more processors, the processors perform the control method described in one or more of the above embodiments.
Brief description of the drawings

Figure 1 is a schematic flowchart of an optional control method provided by an embodiment of the present application;

Figure 2 is a schematic flowchart of a page control method in the related art;

Figure 3a is a schematic diagram of example 1 of an optional gesture box provided by an embodiment of the present application;

Figure 3b is a schematic diagram of example 2 of an optional gesture box provided by an embodiment of the present application;

Figure 3c is a schematic diagram of example 3 of an optional gesture box provided by an embodiment of the present application;

Figure 3d is a schematic diagram of example 4 of an optional gesture box provided by an embodiment of the present application;

Figure 4 is a schematic diagram of example 1 of an optional control method provided by an embodiment of the present application;

Figure 5 is a schematic diagram of example 2 of an optional control method provided by an embodiment of the present application;

Figure 6a is a schematic diagram of example 5 of an optional gesture box provided by an embodiment of the present application;

Figure 6b is a schematic diagram of example 6 of an optional gesture box provided by an embodiment of the present application;

Figure 6c is a schematic diagram of example 7 of an optional gesture box provided by an embodiment of the present application;

Figure 7a is a schematic layout diagram of example 1 of an optional screen provided by an embodiment of the present application;

Figure 7b is a schematic layout diagram of example 2 of an optional screen provided by an embodiment of the present application;

Figure 7c is a schematic layout diagram of example 3 of an optional screen provided by an embodiment of the present application;

Figure 8 is a schematic structural diagram of an optional electronic device provided by an embodiment of the present application;

Figure 9 is a schematic structural diagram of another optional electronic device provided by an embodiment of the present application.
Detailed description

In a first aspect, an embodiment of the present application provides a control method, applied to an electronic device, including:

when the non-contact touch control function of the electronic device is in an on state, acquiring a video sequence corresponding to a non-contact touch operation;

performing touch recognition on the video sequence to obtain a target object and a touch type;

determining a first distance between the target object and the electronic device;

determining a touch operation parameter at the first distance; and

controlling the electronic device according to the touch operation parameter to perform a target function corresponding to the touch type.
In an optional embodiment, determining the first distance between the target object and the electronic device includes:

selecting boundary information of one target object from the boundary information of the target object in each image frame of the video sequence; and

determining the first distance between the target object and the electronic device according to the selected boundary information of the target object.
In an optional embodiment, when the boundary information is represented by a bounding box, determining the first distance between the target object and the electronic device according to the selected boundary information of the target object includes:

determining a second distance between each boundary of the bounding box and a corresponding edge of the screen of the electronic device;

determining a target boundary according to the second distances between the boundaries and the corresponding edges of the screen; and

determining the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary length value and the distance from the target object to the electronic device.
In an optional embodiment, determining the target boundary according to the second distances between the boundaries of the bounding box and the corresponding edges of the screen includes:

when the second distances between the boundaries and the corresponding edges of the screen are all greater than a preset threshold, selecting one boundary from the boundaries;

when, among the second distances between the boundaries and the corresponding edges of the screen, only one boundary has a second distance to the corresponding edge of the screen that is less than or equal to the preset threshold, selecting from the boundaries the boundary whose second distance to the corresponding edge of the screen is less than or equal to the preset threshold; and

determining the selected boundary as the target boundary.
In an optional embodiment, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the preset relationship between boundary length value and the distance from the target object to the electronic device, includes:

when the target boundary is a width boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary width value and the distance from the target object to the electronic device.

In an optional embodiment, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the preset relationship between boundary length value and the distance from the target object to the electronic device, includes:

when the target boundary is a height boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary height value and the distance from the target object to the electronic device.
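As a concrete illustration of the boundary-based distance estimation above, the following sketch selects the target boundary from the second distances and maps its length to the first distance. The inverse-proportional mapping and all constants (`threshold`, `k_width`, `k_height`) are assumptions for illustration; the embodiments only require some preset relationship between boundary length value and distance.

```python
# Hypothetical sketch: estimate the first distance from a gesture bounding
# box. The constants and the inverse-proportional mapping are assumptions.

def estimate_first_distance(box, screen_w, screen_h,
                            threshold=10.0, k_width=4000.0, k_height=5000.0):
    """box = (left, top, right, bottom) in pixels in the captured image."""
    left, top, right, bottom = box
    # Second distances: each bounding-box boundary to its corresponding
    # screen edge.
    edge_gaps = {"left": left, "top": top,
                 "right": screen_w - right, "bottom": screen_h - bottom}
    clipped = [name for name, gap in edge_gaps.items() if gap <= threshold]
    if not clipped:
        # All second distances exceed the threshold: any boundary may be
        # selected; here the top boundary is chosen arbitrarily.
        target = "top"
    elif len(clipped) == 1:
        # Exactly one boundary is within the threshold of its screen edge:
        # select that boundary as the target boundary.
        target = clipped[0]
    else:
        return None  # the embodiments above only describe the two cases
    if target in ("top", "bottom"):
        # Width boundary: its length is the box width.
        return k_width / (right - left)
    # Height boundary: its length is the box height.
    return k_height / (bottom - top)
```

With the assumed constants, a box of width 200 px far from all screen edges yields a first distance of 20.0, while a box clipped at the left edge falls back to the height-boundary relationship.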
In an optional embodiment, determining the first distance between the target object and the electronic device includes:

determining a distance between the target object in each image frame and the electronic device according to the boundary information of the target object in each image frame of the video sequence; and

determining the average value of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device.

In an optional embodiment, determining the average value of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device includes:

when the absolute value of the difference between any two of the distances between the target object in each image frame and the electronic device is less than or equal to a preset error threshold, determining the average value of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device.
In an optional embodiment, the method further includes:

when any of the absolute values of the differences between any two of the distances between the target object in each image frame and the electronic device is greater than the preset error threshold, controlling the electronic device according to a preset touch operation parameter at a standard distance to perform the target function corresponding to the touch type.
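The per-frame averaging rule and its fallback could be sketched as follows; the error threshold value and the standard-distance value are illustrative assumptions.

```python
# Hypothetical sketch: fuse per-frame distance estimates into the first
# distance, falling back to a preset standard distance when the estimates
# disagree. The threshold and standard distance are assumed values.

STANDARD_DISTANCE = 40.0  # assumed preset standard distance (cm)

def fuse_frame_distances(distances, error_threshold=5.0):
    """distances: per-image-frame target-object-to-device distances."""
    for i, a in enumerate(distances):
        for b in distances[i + 1:]:
            if abs(a - b) > error_threshold:
                # Some pair of per-frame estimates differs by more than the
                # preset error threshold: use the standard distance, so the
                # preset standard-distance touch operation parameters apply.
                return STANDARD_DISTANCE
    # All pairwise differences are within the threshold: use the average.
    return sum(distances) / len(distances)
```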
In an optional embodiment, determining the touch operation parameter at the first distance includes:

calculating a sensitivity coefficient at the first distance based on a preset correspondence between distance and sensitivity coefficient; and

determining the touch operation parameter at the first distance according to the sensitivity coefficient at the first distance.

In an optional embodiment, determining the touch operation parameter at the first distance according to the sensitivity coefficient at the first distance includes:

calculating the touch operation parameter at the first distance using the sensitivity coefficient at the first distance and a preset touch operation parameter at a standard distance.
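A minimal sketch of this scaling step follows. The linear correspondence between distance and sensitivity coefficient, and both preset values, are assumptions; the embodiments only require some preset correspondence.

```python
# Hypothetical sketch: derive the touch operation parameter at the first
# distance from a sensitivity coefficient. The linear correspondence and
# the preset values below are assumptions.

STANDARD_DISTANCE = 40.0         # assumed preset standard distance (cm)
STANDARD_SLIDE_DISTANCE = 200.0  # assumed slide distance (px) at that distance

def sensitivity_coefficient(first_distance):
    # Assumed correspondence: coefficient 1.0 at the standard distance,
    # growing in proportion as the target object moves further away.
    return first_distance / STANDARD_DISTANCE

def touch_parameter_at(first_distance, standard_parameter=STANDARD_SLIDE_DISTANCE):
    # Parameter at the first distance = sensitivity coefficient at the
    # first distance x preset parameter at the standard distance.
    return sensitivity_coefficient(first_distance) * standard_parameter
```

Under these assumptions, a hand twice as far away as the standard distance doubles the slide distance applied to the page, making the same gesture feel equally effective at any distance.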
In an optional embodiment, performing touch recognition on the video sequence to obtain the target object and the touch type includes:

performing touch recognition on the video sequence to obtain the target object, a bounding box of the target object and confidence values of touch operations of the target object, wherein the bounding box represents the boundary information; and

determining the touch type based on the confidence values of the touch operations of the target object.

In an optional embodiment, determining the touch type based on the confidence values of the touch operations of the target object includes:

determining, as the touch type, the type of the touch operation corresponding to the maximum confidence value among the touch operations of the target object.
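The maximum-confidence selection above amounts to an argmax over the recognizer's scores; a sketch, with illustrative operation names and scores:

```python
# Hypothetical sketch: choose the touch type as the touch operation with
# the maximum confidence value. Operation names and scores are examples.

def classify_touch_type(confidences):
    """confidences: dict mapping recognized touch operation -> confidence."""
    # The touch type is the type of the operation with maximum confidence.
    return max(confidences, key=confidences.get)

scores = {"slide_up": 0.83, "page_down": 0.11, "long_press": 0.06}
```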
In an optional embodiment, the touch operation parameter includes any of the following:

a sliding distance of a sliding operation, a sliding speed of a sliding operation, a page turning speed of a page turning operation, a long press duration of a long press operation, and a click frequency of a click operation.

In an optional embodiment, controlling the electronic device according to the touch operation parameter to perform the target function corresponding to the touch type includes:

controlling a controlled object on the screen of the electronic device according to the touch operation parameter to perform the target function corresponding to the touch type;

wherein the controlled object includes any of the following: a page, an interface and a control.

In an optional embodiment, controlling the controlled object on the screen of the electronic device according to the touch operation parameter to perform the target function corresponding to the touch type includes:

controlling the page of the screen according to the sliding distance of the sliding operation to perform a sliding function.
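A hypothetical dispatch showing how a recognized touch type plus the distance-adjusted parameter could drive a controlled page object; the `Page` class and operation names are illustrative only.

```python
# Hypothetical sketch: apply the touch operation parameter to a controlled
# page object. The Page class and operation names are assumptions.

class Page:
    def __init__(self):
        self.offset = 0.0  # current vertical scroll offset in pixels

    def slide(self, distance):
        # Slide the page content by the distance-adjusted slide distance.
        self.offset += distance

def execute_target_function(page, touch_type, parameter):
    if touch_type == "slide_up":
        page.slide(-parameter)
    elif touch_type == "slide_down":
        page.slide(parameter)
    # Page turning, click and long press would be handled similarly,
    # using the page turning speed, click frequency, etc.
```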
In a second aspect, an embodiment of the present application further provides an electronic device, including:

an acquisition module, configured to acquire a video sequence corresponding to a non-contact touch operation when the non-contact touch control function of the electronic device is in an on state;

a processing module, configured to perform touch recognition on the video sequence to obtain a target object and a touch type;

a first determination module, configured to determine a first distance between the target object and the electronic device;

a second determination module, configured to determine a touch operation parameter at the first distance; and

a control module, configured to control the electronic device according to the touch operation parameter to perform a target function corresponding to the touch type.
In an optional embodiment, the first determination module is specifically configured to:

select boundary information of one target object from the boundary information of the target object in each image frame of the video sequence; and

determine the first distance between the target object and the electronic device according to the selected boundary information of the target object.

In an optional embodiment, when the boundary information is represented by a bounding box, in determining the first distance between the target object and the electronic device according to the selected boundary information of the target object, the first determination module is configured to:

determine a second distance between each boundary of the bounding box and a corresponding edge of the screen of the electronic device;

determine a target boundary according to the second distances between the boundaries and the corresponding edges of the screen; and

determine the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary length value and the distance from the target object to the electronic device.

In an optional embodiment, in determining the target boundary according to the second distances between the boundaries of the bounding box and the corresponding edges of the screen, the first determination module is configured to:

when the second distances between the boundaries and the corresponding edges of the screen are all greater than a preset threshold, select one boundary from the boundaries;

when, among the second distances between the boundaries and the corresponding edges of the screen, only one boundary has a second distance to the corresponding edge of the screen that is less than or equal to the preset threshold, select from the boundaries the boundary whose second distance to the corresponding edge of the screen is less than or equal to the preset threshold; and

determine the selected boundary as the target boundary.
In an optional embodiment, in calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the preset relationship between boundary length value and the distance from the target object to the electronic device, the first determination module is configured to:

when the target boundary is a width boundary, calculate the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary width value and the distance from the target object to the electronic device.

In an optional embodiment, in calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the preset relationship between boundary length value and the distance from the target object to the electronic device, the first determination module is configured to:

when the target boundary is a height boundary, calculate the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary height value and the distance from the target object to the electronic device.

In an optional embodiment, in determining the first distance between the target object and the electronic device, the first determination module is configured to:

determine a distance between the target object in each image frame and the electronic device according to the boundary information of the target object in each image frame of the video sequence; and

determine the average value of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device.

In an optional embodiment, in determining the average value of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device, the first determination module is configured to:

when the absolute value of the difference between any two of the distances between the target object in each image frame and the electronic device is less than or equal to a preset error threshold, determine the average value of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device.

In an optional embodiment, the electronic device is further configured to:

when any of the absolute values of the differences between any two of the distances between the target object in each image frame and the electronic device is greater than the preset error threshold, control the electronic device according to a preset touch operation parameter at a standard distance to perform the target function corresponding to the touch type.
In an optional embodiment, the second determination module is specifically configured to:

calculate a sensitivity coefficient at the first distance based on a preset correspondence between distance and sensitivity coefficient; and

determine the touch operation parameter at the first distance according to the sensitivity coefficient at the first distance.

In an optional embodiment, in determining the touch operation parameter at the first distance according to the sensitivity coefficient at the first distance, the second determination module is configured to:

calculate the touch operation parameter at the first distance using the sensitivity coefficient at the first distance and a preset touch operation parameter at a standard distance.

In an optional embodiment, the processing module is specifically configured to:

perform touch recognition on the video sequence to obtain the target object, a bounding box of the target object and confidence values of touch operations of the target object, wherein the bounding box represents the boundary information; and

determine the touch type based on the confidence values of the touch operations of the target object.

In an optional embodiment, in determining the touch type based on the confidence values of the touch operations of the target object, the processing module is configured to:

determine, as the touch type, the type of the touch operation corresponding to the maximum confidence value among the touch operations of the target object.
In an optional embodiment, the touch operation parameter includes any of the following:

a sliding distance of a sliding operation, a sliding speed of a sliding operation, a page turning speed of a page turning operation, a long press duration of a long press operation, and a click frequency of a click operation.

In an optional embodiment, the control module is specifically configured to:

control a controlled object on the screen of the electronic device according to the touch operation parameter to perform the target function corresponding to the touch type;

wherein the controlled object includes any of the following: a page, an interface and a control.

In an optional embodiment, in controlling the controlled object on the screen of the electronic device according to the touch operation parameter to perform the target function corresponding to the touch type, the control module is configured to:

control the page of the screen according to the sliding distance of the sliding operation to perform a sliding function.
In a third aspect, an embodiment of the present application further provides an electronic device, including:

a processor and a storage medium storing instructions executable by the processor, the storage medium relying on the processor to perform operations through a communication bus, wherein when the instructions are executed by the processor, the control method described in one or more of the above embodiments is performed.

In a fourth aspect, an embodiment of the present application further provides a computer storage medium storing executable instructions, wherein when the executable instructions are executed by one or more processors, the processors perform the control method described in one or more of the above embodiments.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present application.

An embodiment of the present application provides a control method, applied to an electronic device. Figure 1 is a schematic flowchart of an optional control method provided by an embodiment of the present application. As shown in Figure 1, the control method may include:

S101: when the non-contact touch control function of the electronic device is in an on state, acquiring a video sequence corresponding to a non-contact touch operation.
Figure 2 is a schematic flowchart of a page control method in the related art. As shown in Figure 2, taking controlling a page with gestures as an example, the page control method may include:

S201: turning on the front camera;

S202: capturing pictures;

S203: continuous gesture detection;

S204: gesture judgment;

S205: responding to the operation of the gesture.

Here, the electronic device turns on the front camera and uses it to capture pictures containing a gesture. After a picture is captured, the gesture in the picture is continuously detected, the gesture is obtained and judged, and the operation corresponding to the gesture, for example an upward-movement operation, is obtained through the judgment. Finally, the operation of the gesture is responded to. However, in this manner, the electronic device can only be controlled to perform its built-in functions, so the control is not fine-grained.

In order to make non-contact touch control more fine-grained, an embodiment of the present application provides an optional control method. First, when the non-contact touch control function of the electronic device is in an on state, that is, the electronic device has enabled the non-contact touch control function, the front camera of the electronic device is turned on to capture the area in front of the screen of the electronic device. When a target object is detected in a captured picture, a video sequence is acquired, so as to obtain the video sequence corresponding to the non-contact touch operation, wherein the video sequence includes multiple consecutive image frames.

The above touch operation may include a slide-up operation, a slide-down operation, a long press operation, a page-up operation, a page-down operation and a click operation, which is not specifically limited in the embodiments of the present application.
S102: performing touch recognition on the video sequence to obtain a target object and a touch type.

After the video sequence of the non-contact touch operation is acquired, touch recognition needs to be performed on the video sequence, so that the target object contained in the video sequence and the touch type can be obtained.

The above target object may be a body part of the human body, for example, a hand, the head or the eyes, which is not specifically limited in the embodiments of the present application.

As for the target object, the user may use a gesture to control the electronic device to perform the target function corresponding to the touch type corresponding to the gesture; the user may also use changes of the head, for example, shaking the head left and right, to control the electronic device to perform the target function corresponding to the touch type corresponding to the left-right head shaking; the user may further use changes of the eyes, for example a blinking action, to control the electronic device to perform the target function corresponding to the touch type corresponding to the blinking action, which is not specifically limited in the embodiments of the present application.

Based on the description of the above touch operations, the touch type may include a sliding type, a page turning type, a click type and a long press type, wherein the slide-up operation and the slide-down operation belong to the sliding type, the long press operation belongs to the long press type, the page-up operation and the page-down operation belong to the page turning type, and the click operation belongs to the click type, which is not specifically limited in the embodiments of the present application.
In an optional embodiment, S102 may include:
performing touch recognition on the video sequence to obtain the target object, a bounding box of the target object, and confidence values of touch operations of the target object; and
determining the touch type based on the confidence values of the touch operations of the target object.
In the touch recognition of the video sequence, the target object, the bounding box of the target object, and the confidence values of the touch operations of the target object can be obtained. The bounding box represents boundary information of the target object, and the boundary information of the target object refers to contour information of the target object, where the contour information may include the shape of the contour, the area of the contour, and size information of the contour lines. The bounding box is a rectangular box enclosing the target object, and the bounding box includes a boundary height and a boundary width.
That is, through touch recognition of the video sequence, not only the target object but also the boundary information of the target object can be obtained. At the same time, by matching the posture category of the target object, the touch operation of the target object can be obtained. It should be noted that touch operations of one or more target objects may be matched, and the confidence value of each matched touch operation of the target object is also obtained during touch recognition. In order to determine the touch type, the touch type may be determined based on the confidence values of the touch operations of the target object.
Understandably, when only one touch operation of the target object is matched, the type to which the matched touch operation belongs is directly determined as the touch type. For the case where more than one touch operation of the target object is matched, in an optional embodiment, determining the touch type based on the confidence values of the touch operations of the target object includes:
determining, as the touch type, the type to which the touch operation corresponding to the maximum confidence value among the confidence values of the touch operations of the target object belongs.
The confidence value of each touch operation of the target object reflects the credibility of that touch operation: the greater the confidence value, the higher the credibility of the touch operation. Therefore, the maximum confidence value among the touch operations of the target object is selected, and the type to which the touch operation corresponding to the maximum value belongs is determined as the touch type. In this way, determining the touch type by confidence value allows the touch type to be determined more accurately, thereby achieving refined control of the electronic device.
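The selection rule above (one match: use its type directly; several matches: take the type of the highest-confidence operation) can be sketched as follows. This is an illustrative sketch, not code from the application; the tuple layout and operation names are assumptions.

```python
# Sketch: choose the touch type from matched touch operations by maximum
# confidence. Each entry is (operation_name, touch_type, confidence).
def determine_touch_type(matched_ops):
    if not matched_ops:
        return None  # nothing matched in the video sequence
    if len(matched_ops) == 1:
        # Exactly one matched operation: its type is the touch type.
        return matched_ops[0][1]
    # More than one match: pick the type of the operation whose
    # confidence value is the maximum.
    best = max(matched_ops, key=lambda op: op[2])
    return best[1]
```

For example, among a swipe-up match at 0.62 and a page-down match at 0.88, the page-turn type would be selected.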
S103: Determine a first distance between the target object and the electronic device.
After the target object is obtained through touch recognition, the first distance between the target object and the electronic device needs to be further determined. It should be noted that the first distance here may be the distance between the target object and the screen of the electronic device, the distance between the target object and the back panel of the electronic device, or the distance between the target object and an image sensor of the electronic device; this is not specifically limited in the embodiments of the present application.
Understandably, if each image frame in the video sequence acquired by the front camera also corresponds to a depth image, the first distance between the target object and the electronic device may be determined from the depth image; the first distance may also be determined from the boundary information of the target object. This is not specifically limited in the embodiments of the present application.
When the first distance is determined from the boundary information of the target object, since each image frame in the video sequence corresponds to a piece of boundary information, the first distance may be determined from the boundary information of the target object in every image frame, or from one piece of boundary information of the target object selected from the image frames. This is not specifically limited in the embodiments of the present application.
In order to determine the first distance between the target object and the electronic device from the boundary information of the target object in the image frames, in an optional embodiment, S103 may include:
selecting one piece of boundary information of the target object from the boundary information of the target object in the image frames of the video sequence; and
determining the first distance between the target object and the electronic device according to the selected boundary information of the target object.
Here, after the boundary information of the target object in each image frame of the video sequence is obtained through touch recognition, one piece of boundary information of the target object is selected from the boundary information in the image frames, and the first distance between the target object and the electronic device is determined from the selected boundary information.
In an optional embodiment, when the boundary information is represented by a bounding box, determining the first distance between the target object and the electronic device according to the selected boundary information of the target object includes:
determining a second distance between each boundary of the bounding box and the corresponding edge of the screen of the electronic device;
determining a target boundary according to the second distances between the boundaries and the corresponding edges of the screen; and
determining, according to a length value of the target boundary, the first distance between the target object and the electronic device using a preset relationship between boundary length values and the distance from the target object to the electronic device.
Understandably, when the boundary information is represented by a bounding box, in determining the first distance between the target object and the electronic device according to the selected bounding box, the second distance between each boundary of the bounding box and the corresponding edge of the screen is determined first. By judging the second distance from each boundary to the corresponding screen edge, it can be determined whether the bounding box encloses the complete target object, that is, whether the target object is cut off by the screen.
After the second distances between the boundaries of the bounding box and the corresponding edges of the screen are determined, whether the target object is cut off by the screen can be judged from these second distances. On this basis, one boundary is selected from the boundaries as the target boundary, and the length value of the target boundary is the width value or height value that remains usable when the target object is not truncated or is only partially truncated. The length value of the selected target boundary is then used to determine the first distance between the target object and the electronic device.
It should be noted that two target boundaries may also be selected, for example, both a width boundary and a height boundary, so that the width value and the height value of the bounding box are used together to determine the first distance between the target object and the electronic device.
In this way, with the boundaries screened, the target boundary can be used to determine the first distance between the target object and the electronic device, and then the touch operation parameters at the first distance can be determined.
In order to select a target object that is not truncated or only partially truncated, so as to obtain a usable target boundary and improve the accuracy of the first distance between the target object and the electronic device, in an optional embodiment, determining the target boundary according to the second distances between the boundaries of the bounding box and the corresponding edges of the screen of the electronic device includes:
when the second distances between all boundaries and the corresponding edges of the screen are greater than a preset threshold, selecting one boundary from the boundaries;
when, among the second distances, exactly one boundary has a second distance to the corresponding edge of the screen that is less than or equal to the preset threshold, selecting that boundary; and
determining the selected boundary as the target boundary.
That is, the second distance between each boundary and the corresponding edge of the screen is compared with the preset threshold. If all second distances are greater than the preset threshold, the target object in the bounding box is not cut off by the screen, so any one of the boundaries may be selected as the target boundary. If exactly one boundary has a second distance to the corresponding screen edge that is less than or equal to the preset threshold, the target object in the bounding box is cut off by the screen; however, even though the target object is truncated, the boundary whose second distance is less than or equal to the preset threshold can still reflect the size of the target object. Therefore, that boundary is selected as the target boundary, and its length value is used to determine the first distance between the target object and the electronic device.
In addition, if two or more of the second distances are less than or equal to the preset threshold, the target object in the bounding box is cut off by the screen, and neither the height value nor the width value of the bounding box reflects the size of the target object. Therefore, the length value and/or the width value of the bounding box cannot be used to calculate the first distance between the target object and the electronic device. In this case, prompt information may be generated to indicate that the target object is not within the control range of the electronic device.
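The three cases just described (no boundary truncated, exactly one truncated, two or more truncated) can be sketched as below. This is an illustrative sketch only; the edge naming and data layout are assumptions, not part of the application.

```python
# Sketch: pick the target boundary from the second distances between each
# bounding-box boundary and the corresponding screen edge.
def select_target_boundary(second_distances, threshold):
    """second_distances: dict such as {"top": d, "bottom": d, "left": d, "right": d}.
    Returns the name of the target boundary, or None when the target object
    is truncated on two or more sides (outside the control range)."""
    truncated = [edge for edge, d in second_distances.items() if d <= threshold]
    if not truncated:
        # Object fully inside the screen: any boundary may serve as the target.
        return next(iter(second_distances))
    if len(truncated) == 1:
        # Truncated on exactly one side: that boundary still reflects the
        # object's size, so select it as the target boundary.
        return truncated[0]
    # Truncated on two or more sides: no usable boundary; the caller may
    # generate prompt information instead.
    return None
```

A `None` result corresponds to the case where prompt information is generated because no boundary can be used to compute the first distance.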
The bounding box is described below by taking the target object as a hand and the bounding box as a gesture box. FIG. 3a is a schematic diagram of a first example of an optional gesture box provided by an embodiment of the present application. As shown in FIG. 3a, the second distances between the boundaries of the gesture box and the corresponding edges of the screen are all greater than the preset threshold, so the hand is not cut off by the screen, and any boundary of the gesture box may be selected as the target boundary to determine the first distance between the hand and the electronic device. FIG. 3b is a schematic diagram of a second example of an optional gesture box provided by an embodiment of the present application. As shown in FIG. 3b, among the second distances, only one width boundary has a second distance to the corresponding screen edge that is less than or equal to the preset threshold, so the hand is cut off by the screen, but the width boundary of the gesture box may be selected as the target boundary to determine the first distance between the hand and the electronic device. FIG. 3c is a schematic diagram of a third example of an optional gesture box provided by an embodiment of the present application. As shown in FIG. 3c, among the second distances, only one height boundary has a second distance to the corresponding screen edge that is less than or equal to the preset threshold, so the hand is cut off by the screen, but the height boundary of the gesture box may be selected as the target boundary to determine the first distance between the hand and the electronic device. FIG. 3d is a schematic diagram of a fourth example of an optional gesture box provided by an embodiment of the present application. As shown in FIG. 3d, two boundaries, namely a height boundary and a width boundary, have second distances to the corresponding screen edges that are less than or equal to the preset threshold, so the hand is cut off by the screen. In this case, no boundary of the gesture box can be selected as the target boundary to determine the first distance between the hand and the electronic device, and prompt information may be generated to indicate that the target object is not within the control range of the electronic device.
In an optional embodiment, calculating the first distance between the target object and the electronic device according to the length value of the target boundary using the preset relationship between boundary length values and the distance from the target object to the electronic device includes:
when the target boundary is a width boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary using a preset relationship between boundary width values and the distance from the target object to the electronic device.
The relationship between boundary width values and the distance from the target object to the electronic device is pre-stored in the electronic device. Then, once the width value of the target boundary is known, the first distance between the target object and the electronic device can be calculated from the target boundary width value using this preset relationship.
Understandably, there is a certain relationship between the boundary width value and the distance from the target object to the electronic device, and this relationship can be used to calculate the first distance between the target object and the electronic device. Taking the target object as a hand and the bounding box as a gesture box, the relationship between the boundary width value and the distance from the target object to the electronic device may be determined as follows:
record the width W1 of the gesture box when the distance between the user's hand and the electronic device is D1, and record the width W2 of the gesture box when the distance is D2. Then, when the distance is D, the current gesture box width W may be calculated as:
W = W1 + (D - D1)(W2 - W1)/(D2 - D1)       (1)
From this, the relationship between the boundary width value and the distance from the target object to the electronic device may be determined as:
D = (W - W1)(D2 - D1)/(W2 - W1) + D1       (2)
where W1 and W2 are both width values of the gesture box measured when the hand is not cut off by the screen.
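Equations (1) and (2) are a linear interpolation between two calibration points (D1, W1) and (D2, W2), and the same form applies to the height boundary with H1 and H2. A minimal sketch, with made-up calibration values for illustration:

```python
# Sketch of equation (2): invert W = W1 + (D - D1)(W2 - W1)/(D2 - D1)
# to recover D = (W - W1)(D2 - D1)/(W2 - W1) + D1.
def distance_from_boundary(length, l1, d1, l2, d2):
    """length: measured boundary length (width or height) of the gesture box.
    (d1, l1) and (d2, l2) are the two pre-recorded calibration pairs."""
    return (length - l1) * (d2 - d1) / (l2 - l1) + d1
```

For instance, with made-up calibration values W1 = 200 px at D1 = 30 cm and W2 = 100 px at D2 = 60 cm, a measured width of 150 px yields a first distance of 45 cm.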
In an optional embodiment, calculating the first distance between the target object and the electronic device according to the length value of the target boundary using the preset relationship between boundary length values and the distance from the target object to the electronic device includes:
when the target boundary is a height boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary using a preset relationship between boundary height values and the distance from the target object to the electronic device.
The relationship between boundary height values and the distance from the target object to the electronic device is pre-stored in the electronic device. Then, once the height value of the target boundary is known, the first distance between the target object and the electronic device can be calculated from the boundary height value using this preset relationship.
Understandably, there is a certain relationship between the boundary height value and the distance from the target object to the electronic device, and this relationship can be used to calculate the first distance between the target object and the electronic device. Taking the target object as a hand and the bounding box as a gesture box, the relationship between the boundary height value and the distance from the target object to the electronic device may be determined as follows:
record the height H1 of the gesture box when the distance between the user's hand and the electronic device is D1, and record the height H2 of the gesture box when the distance is D2. Then, when the distance is D, the current gesture box height H may be calculated as:
H = H1 + (D - D1)(H2 - H1)/(D2 - D1)       (3)
From this, the relationship between the boundary height value and the distance from the target object to the electronic device may be determined as:
D = (H - H1)(D2 - D1)/(H2 - H1) + D1       (4)
where H1 and H2 are both height values of the gesture box measured when the hand is not cut off by the screen.
In addition to determining the first distance between the target object and the electronic device from one selected piece of boundary information as described above, the first distance may also be determined from the boundary information of the target object in every image frame. In an optional embodiment, after the boundary information of the target object in each image frame of the video sequence is obtained, the above method further includes:
determining, according to the boundary information of the target object in each image frame, the distance between the target object in that image frame and the electronic device; and
determining the average of the distances between the target object and the electronic device over the image frames as the first distance between the target object and the electronic device.
Understandably, using the boundary information of the target object in each image frame to determine the distance between the target object in that frame and the electronic device is implemented in the same way as determining the first distance from the selected boundary information of the target object, and will not be repeated here.
After the distance between the target object and the electronic device is determined for each image frame, an averaging algorithm may be used to calculate the average of these per-frame distances, and the average is determined as the first distance between the target object and the electronic device.
Further, in order to determine the first distance accurately, in an optional embodiment, determining the average of the per-frame distances between the target object and the electronic device as the first distance includes:
when the absolute value of the difference between any two of the per-frame distances between the target object and the electronic device is less than or equal to a preset error threshold, determining the average of the per-frame distances as the first distance between the target object and the electronic device.
Here, before the average is calculated, it is first judged whether the absolute value of the difference between any two per-frame distances is less than or equal to the preset error threshold. If so, the distance between the target object and the electronic device merely jitters within the preset error threshold and has not changed much, so the average may be used to determine the first distance between the target object and the electronic device.
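The jitter check and averaging step can be sketched as follows. This is an illustrative sketch; note that the maximum pairwise absolute difference of a set of distances equals its max minus its min, so a single comparison implements the "any two" condition.

```python
# Sketch: average the per-frame distances only when every pair of distances
# differs by at most the preset error threshold.
def first_distance(frame_distances, error_threshold):
    """Returns the mean of the per-frame distances, or None when the jitter
    exceeds the threshold (the caller then falls back to the preset
    standard-distance touch operation parameters)."""
    spread = max(frame_distances) - min(frame_distances)
    if spread > error_threshold:
        # Some pair of frames differs by more than the error threshold:
        # the first distance cannot be determined reliably.
        return None
    return sum(frame_distances) / len(frame_distances)
```

A `None` result corresponds to the fallback case described next, where the standard-distance parameters are used directly.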
In addition, in an optional embodiment, the above method further includes:
when the absolute value of the difference between some pair of the per-frame distances between the target object and the electronic device is greater than the preset error threshold, controlling the electronic device to perform the target function corresponding to the touch type according to preset touch operation parameters at a standard distance.
That is, when it is determined that the absolute value of the difference between some pair of per-frame distances is greater than the preset error threshold, the distance between the target object and the electronic device jitters greatly and undergoes a large change, so the first distance between the target object and the electronic device cannot be determined. In this case, the electronic device may be controlled to perform the target function corresponding to the touch type directly according to the preset touch operation parameters at the standard distance.
S104: Determine touch operation parameters at the first distance.
After the first distance is determined, the touch operation parameters at the first distance can be determined. Exemplarily, the electronic device stores, for each touch operation or touch type, a correspondence between distance and touch operation parameters, and this correspondence may be used to determine the touch operation parameters at the first distance; alternatively, a preset parameter calculation formula may be used to calculate the touch operation parameters at the first distance. This is not specifically limited in the embodiments of the present application.
In an optional embodiment, the touch operation parameters include any of the following: a sliding distance of a sliding operation, a sliding speed of a sliding operation, a page-turning speed of a page-turning operation, a long-press duration of a long-press operation, and a click frequency of a click operation.
That is, the touch operation of the target object may be a sliding operation, a long-press operation, a click operation, and so on. The operation parameters of a sliding operation may include a sliding distance and a sliding speed, the operation parameters of a long-press operation may include a long-press duration, and the operation parameters of a click operation may include a click frequency. This is not specifically limited in the embodiments of the present application.
In order to determine the touch operation parameters at the first distance, in an optional embodiment, S104 may include:
calculating a sensitivity coefficient at the first distance based on a preset correspondence between distance and sensitivity coefficient; and
determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance.
Here, a correspondence between distance and sensitivity coefficient is pre-stored in the electronic device. After the first distance is determined, the sensitivity coefficient at the first distance can be calculated based on this correspondence, and the touch operation parameters at the first distance are then determined based on the sensitivity coefficient at the first distance.
In an optional embodiment, determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance includes:
calculating the touch operation parameters at the first distance using the sensitivity coefficient at the first distance and preset touch operation parameters at a standard distance.
It can be understood that the touch operation parameters corresponding to each touch operation are stored in the electronic device. Therefore, when determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance, for example, the product of the sensitivity coefficient at the first distance and the preset touch operation parameters at the standard distance is determined as the touch operation parameters at the first distance.
That is to say, the sensitivity coefficient at the first distance is used to determine the touch operation parameters at the first distance, so that even the same touch operation corresponds to different touch operation parameters for target objects at different distances from the electronic device. This extends the non-contact touch control function, makes the user's non-contact control of the electronic device more fine-grained, and improves the user experience.
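The product computation described above can be sketched as follows. A linear distance-to-coefficient mapping anchored at a 25 cm standard distance is assumed here for illustration (matching the worked example given later in this description); the preset correspondence could equally be a lookup table:

```python
STANDARD_DISTANCE_CM = 25.0  # assumed standard distance with coefficient 1.0

def sensitivity_coefficient(distance_cm):
    """Assumed linear distance-to-sensitivity mapping: 1.0 at the standard
    distance, lower when the target object is farther, higher when closer."""
    return 1.0 - 0.01 * (distance_cm - STANDARD_DISTANCE_CM)

def touch_parameter(distance_cm, standard_param):
    """Touch operation parameter at the first distance = sensitivity
    coefficient at that distance x preset parameter at the standard distance."""
    return sensitivity_coefficient(distance_cm) * standard_param
```

With this mapping, a target object 35 cm away yields 90% of the standard-distance parameter, while one 15 cm away yields 110% of it.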
S105: Control the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type.
After the touch operation parameters are determined, the electronic device performs the target function corresponding to the touch type according to the determined touch operation parameters. The electronic device may control a page on the screen to perform the target function corresponding to the touch type according to the touch operation parameters, or may process data according to the touch operation parameters, for example an image processing function; this is not specifically limited in the embodiments of the present application.
For controlling the screen of the electronic device, in an optional embodiment, S105 may include:
controlling a controlled object on the screen of the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type;
where the controlled object includes any of the following: a page, an interface, or a control. That is to say, in addition to controlling a page on the screen, the above control method can also control an interface, or a certain control on the screen; this is not specifically limited in the embodiments of the present application.
Taking page sliding as an example, in an optional embodiment, controlling the controlled object on the screen of the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type includes:
controlling the page on the screen according to the sliding distance of the sliding operation to perform the sliding function.
For example, when the user slides the page with a palm, the electronic device responds to the sliding motion of the palm, determines that the operation corresponding to the motion is a sliding operation, determines the distance between the palm and the screen, and then determines the sliding distance of the sliding operation at that distance. The electronic device then controls the page to perform the sliding function corresponding to the sliding operation according to that sliding distance. In this way, even when palms at different distances from the screen make the same sliding motion, the motion corresponds to different sliding distances, so the user's control of the page is more fine-grained and the user experience is improved.
The control method in one or more of the above embodiments is described below with examples.
The above control method is described below taking the target object being a hand and the bounding box being a gesture box as an example. Figure 4 is a schematic diagram of Example 1 of an optional control method provided by an embodiment of the present application. As shown in Figure 4, the control method may include:
S401: Capture a gesture image;
In S401, the gesture interaction application turns on the front camera and captures a gesture image through the front camera.
S402: Gesture detection;
S403: Obtain a gesture box;
In S402-S403, after the image is captured in S401, the image is detected to obtain the gesture category, the gesture confidence, and the gesture box. For gesture detection, Figure 5 is a schematic diagram of Example 2 of an optional control method provided by an embodiment of the present application. As shown in Figure 5, the gesture detection method may include:
S501: Obtain an image containing a gesture;
S502: Perform model inference on the image;
S503: Model post-processing;
S504: Non-maximum suppression;
S505: Obtain the gesture box.
Here, after the image containing the gesture is obtained, detecting the image mainly involves operations such as model inference, model post-processing, and non-maximum suppression, so that the gesture in an image can be detected. Each detection result includes a gesture box, a gesture category (the category corresponding to the hand posture, which corresponds to a touch type), and a gesture confidence (the confidence of the touch type corresponding to the gesture category, for example 0.9). If detection results of multiple gesture categories exist in the image, the gesture category with the highest confidence is taken as the final detection result.
S404: Calculate the distance between the hand and the screen;
Here, the distance between the user's hand and the screen of the mobile phone, that is, the distance from the hand to the screen, is calculated according to the gesture box. The distance between the hand and the screen can be calculated as follows:
First, record the width and height of the gesture box for one distance D1 between the hand and the screen. Figure 6a is a schematic diagram of Example 5 of an optional gesture box provided by an embodiment of the present application; as shown in Figure 6a, when the distance between the user's hand and the screen is D1, the width W1 and the height H1 of the gesture box are recorded.
Then, record the width and height of the gesture box for another distance D2 between the hand and the screen. Figure 6b is a schematic diagram of Example 6 of an optional gesture box provided by an embodiment of the present application; as shown in Figure 6b, when the distance between the user's hand and the screen is D2, the width W2 and the height H2 of the gesture box are recorded.
Finally, Figure 6c is a schematic diagram of Example 7 of an optional gesture box provided by an embodiment of the present application; as shown in Figure 6c, when the distance is D, the width W and the height H of the current gesture box are calculated by formula (1) and formula (3).
Conversely, when the width and height of the gesture box are W and H, the current distance D between the hand and the screen can be obtained through formula (2) and formula (4).
It should be noted that the distance between the hand and the screen can be calculated from either the width or the height of the gesture box. When the hand is at the edge of the screen, the hand may be truncated by the screen. In this case, the distances from the four sides of the gesture box to the screen edges can be calculated to determine whether the width or the height of the gesture box is still usable when the hand is truncated, and a usable side must be selected to calculate the distance between the hand and the screen. If neither the width nor the height is usable, the distance between the hand and the screen cannot be calculated.
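The edge-truncation logic in the note above can be sketched as follows. Since formulas (1)-(4) are not reproduced in this excerpt, an inverse-proportional (pinhole-style) relation W = W1·D1/D is assumed here as their stand-in, and the calibration values (W1, H1, D1) and the edge threshold are illustrative:

```python
EDGE_THRESHOLD_PX = 5  # a side this close to the screen edge is treated as truncated

def distance_from_box(box, screen_w, screen_h, w1=200.0, h1=260.0, d1=25.0):
    """box = (left, top, right, bottom) of the gesture box in pixels.
    Returns the estimated hand-to-screen distance, or None when both the
    width and the height of the box are unusable (hand truncated on all sides)."""
    left, top, right, bottom = box
    # A dimension is usable only if neither of its sides touches the screen edge.
    width_ok = left > EDGE_THRESHOLD_PX and screen_w - right > EDGE_THRESHOLD_PX
    height_ok = top > EDGE_THRESHOLD_PX and screen_h - bottom > EDGE_THRESHOLD_PX
    if width_ok:
        return w1 * d1 / (right - left)   # assumed counterpart of formula (2)
    if height_ok:
        return h1 * d1 / (bottom - top)   # assumed counterpart of formula (4)
    return None  # distance cannot be calculated
```

With these illustrative calibration values, a 200-pixel-wide box away from the edges yields the calibration distance of 25 cm, and a box pressed against the left edge falls back to the height-based estimate.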
S405: Update the sensitivity coefficient of the touch type corresponding to the gesture;
In S405, the sensitivity coefficient of the touch type corresponding to the gesture is adjusted according to the distance between the user's hand and the screen of the mobile phone.
S406: Control the page sliding distance.
In S406, the sliding distance of the page is calculated according to the sensitivity coefficient of the touch type corresponding to the current gesture.
After the distance between the hand and the screen is calculated in S404, the sensitivity coefficient of the touch type corresponding to the gesture can be adjusted. Assuming that the sensitivity coefficient is 1 when the distance between the hand and the screen is 25 cm, then for a distance D (10 cm < D < 60 cm), the updated sensitivity coefficient K of the touch type corresponding to the gesture is:
K=1-0.01*(D-25)          (5)
That is to say, when the distance between the hand and the screen is less than 25 cm, the sensitivity coefficient of the touch type corresponding to the gesture is increased; when the distance between the hand and the screen is greater than 25 cm, the sensitivity coefficient is decreased.
When calculating the sliding distance of the mobile phone page according to the sensitivity coefficient of the touch type corresponding to the current gesture, take the touch type corresponding to the gesture being the sliding type as an example. Suppose that when the distance between the hand and the screen is 25 cm, the page slides a distance M0 each time the hand performs a sliding operation. Then, when the distance between the hand and the screen is D, the sliding distance M of the page for each sliding operation of the gesture is:
M=M0*K=M0*(1-0.01*(D-25))          (6)
Finally, the distance D between the hand and the screen can be calculated by substituting into formula (2) or formula (4), and the page is then controlled according to D to perform the sliding function corresponding to the sliding operation.
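Formulas (5) and (6) above can be transcribed directly; note that the page sliding distance scales the preset distance M0 by the coefficient K:

```python
def sensitivity_k(d_cm):
    """Formula (5): K = 1 - 0.01*(D - 25), stated for 10 cm < D < 60 cm."""
    if not 10.0 < d_cm < 60.0:
        raise ValueError("distance outside the supported 10-60 cm range")
    return 1.0 - 0.01 * (d_cm - 25.0)

def page_slide_distance(d_cm, m0):
    """Formula (6): M = M0 * K."""
    return m0 * sensitivity_k(d_cm)
```

For example, at the 25 cm reference distance the page slides exactly M0 per gesture, at 26 cm it slides 0.99*M0, and at 15 cm it slides 1.1*M0.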
Figure 7a is a schematic layout diagram of Example 1 of an optional screen provided by an embodiment of the present application. As shown in Figure 7a, a list of communication records is displayed on the screen, and the user can swing the four fingers other than the thumb up and down to control the electronic device to slide the screen page. Figure 7b is a schematic layout diagram of Example 2 of an optional screen provided by an embodiment of the present application. As shown in Figure 7b, when the distance between the user's hand and the screen is 25 cm, the page slides by the distance of one communication record each time the four fingers swing up and down, so the first communication record displayed on the page is "4-17 incoming, Sister". Figure 7c is a schematic layout diagram of Example 3 of an optional screen provided by an embodiment of the present application. As shown in Figure 7c, when the distance between the user's hand and the screen is 26 cm, the page slides by the distance of two communication records each time the four fingers swing up and down, so the first communication record displayed on the page is "4-16 outgoing, Li Si". In this way, the sliding distance of the page corresponding to each up-and-down swing of the four fingers is determined by the distance between the hand and the screen, so that different distances correspond to different sliding distances.
In this example, the sliding distance is intelligently adjusted through a gesture sensitivity coefficient based on gesture detection: the current distance between the hand and the screen is calculated from the gesture detection result, the current gesture sensitivity coefficient is adjusted according to that distance, and the coefficient acts on the control of the page sliding distance when the user operates the mobile phone. When the user's hand is farther from the screen, a lower sensitivity coefficient is obtained; when the user's hand is closer to the screen, a higher sensitivity coefficient is obtained.
In this way, more fine-grained page control is achieved: the user can choose a comfortable distance for gesture-based page control according to their own usage habits, which improves the user experience of controlling pages with gestures.
In addition, it should be noted that this example can be used not only to adjust the sensitivity coefficient of the sliding distance in gesture interaction, but also to adjust the sliding speed of a sliding operation, the click frequency of a click operation, the page-turning speed of a page-turning operation, the long-press duration of a long-press operation, the response time of a back operation, the movement distance of an operation controlling a character in a game, and so on. It can also be applied to the control of sensitivity coefficients of new interaction modes such as face recognition, body recognition, and eye gaze, providing a more comfortable user experience for the above scenarios.
Embodiments of the present application provide a control method, including: when the non-contact touch control function of an electronic device is in an on state, obtaining a video sequence corresponding to a non-contact touch operation; performing touch recognition on the video sequence to obtain a target object and a touch type; determining a first distance between the target object and the electronic device; determining touch operation parameters at the first distance; and controlling the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type. That is to say, in the embodiments of the present application, in implementing the non-contact touch control function of the electronic device, the first distance between the target object and the electronic device is determined, the touch operation parameters at the first distance are then determined, and the controlled object of the electronic device is controlled to perform the target function corresponding to the touch type according to those touch operation parameters. In this way, for the same touch type, different first distances can determine different touch operation parameters, so the electronic device can respond to the touch type according to the distance between the target object and the electronic device. Compared with the existing approach of merely controlling the target function corresponding to the touch type, control with the same touch type differs through the touch operation parameters at different distances, making non-contact touch control more fine-grained.
Based on the same inventive concept as the foregoing embodiments, an embodiment of the present application provides an electronic device. Figure 8 is a schematic structural diagram of an optional electronic device provided by an embodiment of the present application. As shown in Figure 8, the electronic device includes:
an acquisition module 81, configured to obtain a video sequence corresponding to a non-contact touch operation when the non-contact touch control function of the electronic device is in an on state;
a processing module 82, configured to perform touch recognition on the video sequence to obtain a target object and a touch type;
a first determination module 83, configured to determine a first distance between the target object and the electronic device;
a second determination module 84, configured to determine touch operation parameters at the first distance; and
a control module 85, configured to control the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type.
In an optional embodiment, the first determination module 83 is specifically configured to:
select the boundary information of one target object from the boundary information of the target object in each image frame of the video sequence; and
determine the first distance between the target object and the screen of the electronic device according to the selected boundary information of the target object.
In an optional embodiment, when the boundary information is represented by a bounding box, the first determination module 83 determining the first distance between the target object and the electronic device according to the selected boundary information of the target object includes:
determining a second distance between each boundary of the bounding box and the corresponding edge of the screen of the electronic device;
determining a target boundary according to the second distance between each boundary and the corresponding edge of the screen; and
determining the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary length value and the distance from the target object to the electronic device.
In an optional embodiment, the first determination module 83 determining the target boundary according to the second distance between each boundary of the bounding box and the corresponding edge of the screen includes:
when the second distances between all boundaries and the corresponding edges of the screen are greater than a preset threshold, selecting one boundary from the boundaries;
when, among the second distances between the boundaries and the corresponding edges of the screen, only one boundary has a second distance to the corresponding edge of the screen that is less than or equal to the preset threshold, selecting from the boundaries that boundary whose second distance to the corresponding edge of the screen is less than or equal to the preset threshold; and
determining the selected boundary as the target boundary.
In an optional embodiment, the first determination module 83 calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the preset relationship between boundary length value and the distance from the target object to the electronic device, includes:
when the target boundary is a width boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary width value and the distance from the target object to the electronic device.
In an optional embodiment, the first determination module 83 calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the preset relationship between boundary length value and the distance from the target object to the electronic device, includes:
when the target boundary is a height boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary height value and the distance from the target object to the electronic device.
In an optional embodiment, the first determination module 83 is specifically configured to:
determine the distance between the target object in each image frame and the electronic device according to the boundary information of the target object in each image frame of the video sequence; and
determine the average value of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device.
In an optional embodiment, the first determination module 83 determining the average value of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device includes:
when the absolute value of the difference between any two of the distances between the target object and the electronic device in the image frames is less than or equal to a preset error threshold, determining the average value of the distances between the target object and the electronic device in the image frames as the first distance between the target object and the electronic device.
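The averaging rule above can be sketched as follows. The pairwise-difference check is implemented here as an equivalent spread check (max minus min), the threshold value is an illustrative assumption, and a None return stands for the fallback case in which the frames disagree too much:

```python
def first_distance(frame_distances, error_threshold=2.0):
    """Return the mean of the per-frame target-object distances, or None when
    any two frames differ by more than the preset error threshold (equivalently,
    when max - min exceeds the threshold)."""
    if not frame_distances:
        return None
    if max(frame_distances) - min(frame_distances) > error_threshold:
        return None  # caller should fall back to the standard-distance parameters
    return sum(frame_distances) / len(frame_distances)
```

For example, frames at 24, 25, and 26 cm (within a 2 cm threshold) average to a first distance of 25 cm, while frames at 20 and 30 cm trigger the fallback.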
In an optional embodiment, the electronic device is further configured to:
when, among the absolute values of the differences between any two of the distances between the target object and the electronic device in the image frames, there is an absolute value greater than the preset error threshold, control the electronic device according to the preset touch operation parameters at the standard distance to perform the target function corresponding to the touch type.
In an optional embodiment, the second determination module 84 is specifically configured to:
calculate the sensitivity coefficient at the first distance based on the preset correspondence between distance and sensitivity coefficient; and
determine the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance.
In an optional embodiment, the second determination module 84 determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance includes:
calculating the touch operation parameters at the first distance using the sensitivity coefficient at the first distance and the preset touch operation parameters at the standard distance.
In an optional embodiment, the processing module 82 performing touch recognition on the video sequence to obtain the target object and the touch type includes:
performing touch recognition on the video sequence to obtain the target object, the bounding box of the target object, and confidence values of the touch operations of the target object, where the bounding box represents the boundary information; and
determining the touch type based on the confidence values of the touch operations of the target object.
In an optional embodiment, the processing module 82 determining the touch type based on the confidence values of the touch operations of the target object includes:
determining, as the touch type, the type to which the touch operation corresponding to the maximum confidence value among the touch operations of the target object belongs.
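The maximum-confidence selection above can be sketched as follows; the shape of the detection results and the category names are hypothetical:

```python
def select_touch_type(detections):
    """detections: list of (touch_type, confidence) pairs from the recognizer.
    Returns the touch type with the highest confidence, or None if empty."""
    if not detections:
        return None
    return max(detections, key=lambda d: d[1])[0]
```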
In an optional embodiment, the touch operation parameters include any of the following: the sliding distance of a sliding operation, the sliding speed of a sliding operation, the page-turning speed of a page-turning operation, the long-press duration of a long-press operation, and the click frequency of a click operation.
In an optional embodiment, the control module 85 is specifically configured to:
control a controlled object on the screen of the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type, where the controlled object includes any of the following: a page, an interface, or a control.
In an optional embodiment, the control module 85 controlling the controlled object on the screen of the electronic device according to the touch operation parameters to perform the target function corresponding to the touch type includes:
controlling the page on the screen according to the sliding distance of the sliding operation to perform the sliding function.
In practical applications, the acquisition module 81, the processing module 82, the first determination module 83, the second determination module 84, and the control module 85 can be implemented by a processor located in the electronic device, specifically a Central Processing Unit (CPU), a Microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Figure 9 is a schematic structural diagram of another optional electronic device provided by an embodiment of the present application. As shown in Figure 9, an embodiment of the present application provides an electronic device 900, including:
a processor 91 and a storage medium 92 storing instructions executable by the processor 91, where the storage medium 92 performs operations depending on the processor 91 through a communication bus 93, and when the instructions are executed by the processor 91, the control method performed in one or more of the above embodiments is executed.
It should be noted that, in practical applications, the components in the terminal are coupled together through the communication bus 93. It can be understood that the communication bus 93 is used to implement connection and communication between these components. In addition to a data bus, the communication bus 93 also includes a power bus, a control bus, and a status signal bus. However, for clarity of description, the various buses are all labeled as the communication bus 93 in Figure 9.
An embodiment of the present application provides a computer storage medium storing executable instructions. When the executable instructions are executed by one or more processors, the processors perform the control method performed by the first electronic device in one or more of the above embodiments.
The computer-readable storage medium may be a memory such as a ferromagnetic random access memory (FRAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM).
Those skilled in the art will understand that embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage and optical storage) containing computer-usable program code.
The present application is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, causing a series of operational steps to be performed on the computer or other programmable device to produce computer-implemented processing, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The above descriptions are merely preferred embodiments of the present application and are not intended to limit the protection scope of the present application.
Industrial Applicability
The control method, electronic device, and computer storage medium provided in the embodiments of the present application are applied to an electronic device. The method includes: when the non-contact touch control function of the electronic device is enabled, acquiring a video sequence corresponding to a non-contact touch operation; performing touch recognition on the video sequence to obtain a target object and a touch type; determining a first distance between the target object and the electronic device; determining touch operation parameters at the first distance; and controlling, according to the touch operation parameters, the electronic device to perform a target function corresponding to the touch type. Compared with existing approaches that merely trigger the target function corresponding to the touch type, the same touch type is controlled differently using touch operation parameters at different distances, making non-contact touch control more refined.
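The steps summarized above can be sketched as a minimal control loop. This is an illustrative sketch only: the function names (`recognize_touch`, `estimate_distance`, `params_at_distance`, `execute`) are hypothetical placeholders, not APIs defined in the embodiments.

```python
# Illustrative sketch of the control flow summarized above.
# All function and parameter names are hypothetical placeholders.

def control(video_sequence, touch_enabled, recognize_touch, estimate_distance,
            params_at_distance, execute):
    """Run one non-contact touch control cycle and return the result of the
    executed target function, or None when the feature is disabled."""
    if not touch_enabled:
        return None
    target_object, touch_type = recognize_touch(video_sequence)  # touch recognition
    first_distance = estimate_distance(target_object)            # object-to-device distance
    params = params_at_distance(first_distance)                  # distance-dependent parameters
    return execute(touch_type, params)                           # perform the target function
```

The dependencies are passed in as callables so that each stage (recognition, distance estimation, parameter lookup, execution) can be exercised or swapped independently.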

Claims (20)

  1. A control method, applied to an electronic device, comprising:
    when a non-contact touch control function of the electronic device is enabled, acquiring a video sequence corresponding to a non-contact touch operation;
    performing touch recognition on the video sequence to obtain a target object and a touch type;
    determining a first distance between the target object and the electronic device;
    determining touch operation parameters at the first distance; and
    controlling, according to the touch operation parameters, the electronic device to perform a target function corresponding to the touch type.
  2. The method according to claim 1, wherein determining the first distance between the target object and the electronic device comprises:
    selecting boundary information of one target object from the boundary information of the target object in each image frame of the video sequence; and
    determining the first distance between the target object and the electronic device according to the selected boundary information of the target object.
  3. The method according to claim 2, wherein, when the boundary information is represented by a bounding box, determining the first distance between the target object and the electronic device according to the selected boundary information of the target object comprises:
    determining a second distance between each boundary of the bounding box and a corresponding edge of a screen of the electronic device;
    determining a target boundary according to the second distances between the boundaries and the corresponding edges of the screen; and
    determining the first distance between the target object and the electronic device according to a length value of the target boundary, using a preset relationship between boundary length values and distances from the target object to the electronic device.
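One common way to realize a "preset relationship between boundary length values and distances" such as the one recited above is the pinhole-camera proportionality distance ≈ focal length (in pixels) × real size / pixel size. This is only an assumed example of such a relationship, and the real hand width and focal length below are illustrative constants, not values from the embodiments.

```python
def distance_from_boundary_length(pixel_length, real_length_cm=8.0, focal_px=600.0):
    """Estimate the object-to-camera distance (in cm) from the pixel length of
    a bounding-box boundary, assuming a pinhole camera model:

        distance = focal_length_px * real_length / pixel_length

    real_length_cm (a typical hand width) and focal_px are assumed constants."""
    if pixel_length <= 0:
        raise ValueError("pixel_length must be positive")
    return focal_px * real_length_cm / pixel_length
```

With these constants, a boundary that spans 600 px puts the object at 8 cm, and halving the pixel length doubles the estimated distance.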
  4. The method according to claim 3, wherein determining the target boundary according to the second distances between the boundaries of the bounding box and the corresponding edges of the screen comprises:
    when the second distances between all of the boundaries and the corresponding edges of the screen are greater than a preset threshold, selecting one boundary from the boundaries;
    when, among the second distances between the boundaries and the corresponding edges of the screen, only one boundary has a second distance to the corresponding edge of the screen that is less than or equal to the preset threshold, selecting, from the boundaries, the boundary whose second distance to the corresponding edge of the screen is less than or equal to the preset threshold; and
    determining the selected boundary as the target boundary.
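The selection rule above can be sketched directly. The boundary names and the tie-breaking choice ("pick one" when all boundaries are far from the screen edges) are assumptions; the claim leaves that choice unspecified, and the rule covers no other case, so the sketch returns `None` otherwise.

```python
def select_target_boundary(second_distances, threshold):
    """Pick the target boundary per the rule above.

    second_distances: dict mapping a boundary name (e.g. 'left', 'right',
    'top', 'bottom') to its second distance from the corresponding screen edge.

    - all second distances > threshold: select one boundary
      (here: the first by insertion order, an assumed tie-break);
    - exactly one second distance <= threshold: select that boundary;
    - otherwise: None (case not covered by the rule)."""
    close = [b for b, d in second_distances.items() if d <= threshold]
    if not close:
        return next(iter(second_distances))
    if len(close) == 1:
        return close[0]
    return None
```

For example, with threshold 5, `{"left": 3, "right": 12, "top": 9, "bottom": 11}` selects `"left"`, since only the left boundary lies within the threshold of its screen edge.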
  5. The method according to claim 3, wherein calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the preset relationship between boundary length values and distances from the target object to the electronic device, comprises:
    when the target boundary is a width boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary width values and distances from the target object to the electronic device.
  6. The method according to claim 3, wherein calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using the preset relationship between boundary length values and distances from the target object to the electronic device, comprises:
    when the target boundary is a height boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, using a preset relationship between boundary height values and distances from the target object to the electronic device.
  7. The method according to claim 1, wherein determining the first distance between the target object and the electronic device comprises:
    determining a distance between the target object in each image frame of the video sequence and the electronic device according to the boundary information of the target object in that image frame; and
    determining an average of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device.
  8. The method according to claim 7, wherein determining the average of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device comprises:
    when the absolute value of the difference between any two of the distances between the target object in each image frame and the electronic device is less than or equal to a preset error threshold, determining the average of the distances between the target object in each image frame and the electronic device as the first distance between the target object and the electronic device.
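Claims 7 and 8 together describe averaging the per-frame distances only when all of them are mutually consistent. A minimal sketch follows; note that checking every pairwise absolute difference against the threshold is equivalent to checking the spread `max - min`, which is what the code does. Returning `None` to signal the fallback to a standard distance (claim 9) is an assumed convention.

```python
def first_distance_from_frames(distances, error_threshold):
    """Average the per-frame object-to-device distances when every pairwise
    absolute difference is within error_threshold (equivalently: the spread
    max - min is within it). Otherwise return None, signalling a fallback to
    the preset touch operation parameters at a standard distance."""
    if max(distances) - min(distances) > error_threshold:
        return None
    return sum(distances) / len(distances)
```

So `[10.0, 10.5, 9.5]` with threshold 1.0 averages to 10.0, while `[10.0, 13.0]` is rejected as inconsistent.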
  9. The method according to claim 8, wherein the method further comprises:
    when, among the absolute values of the differences between any two of the distances between the target object in each image frame and the electronic device, there is an absolute value greater than the preset error threshold, controlling the electronic device to perform the target function corresponding to the touch type according to preset touch operation parameters at a standard distance.
  10. The method according to claim 1, wherein determining the touch operation parameters at the first distance comprises:
    calculating a sensitivity coefficient at the first distance based on a preset correspondence between distances and sensitivity coefficients; and
    determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance.
  11. The method according to claim 10, wherein determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance comprises:
    calculating the touch operation parameters at the first distance using the sensitivity coefficient at the first distance and preset touch operation parameters at a standard distance.
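Claims 10 and 11 can be sketched together as a distance-dependent scaling of the standard-distance parameters. The linear mapping `coefficient = distance / standard_distance` is only an assumed example of the "preset correspondence between distances and sensitivity coefficients"; the parameter names and the 30 cm standard distance are likewise illustrative.

```python
def touch_params_at_distance(distance_cm, standard_params, standard_distance_cm=30.0):
    """Scale preset touch operation parameters at a standard distance by a
    sensitivity coefficient derived from the current first distance.

    The linear rule coefficient = distance / standard_distance is an assumed
    instance of the preset distance-to-coefficient correspondence."""
    coefficient = distance_cm / standard_distance_cm
    return {name: value * coefficient for name, value in standard_params.items()}
```

For instance, at twice the standard distance the sliding distance of a sliding operation would double, so the same gesture still produces a comparable on-screen effect.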
  12. The method according to claim 1, wherein performing touch recognition on the video sequence to obtain the target object and the touch type comprises:
    performing touch recognition on the video sequence to obtain the target object, a bounding box of the target object, and confidence values of touch operations of the target object, wherein the bounding box represents boundary information; and
    determining the touch type based on the confidence values of the touch operations of the target object.
  13. The method according to claim 12, wherein determining the touch type based on the confidence values of the touch operations of the target object comprises:
    determining, as the touch type, the type to which the touch operation corresponding to the maximum confidence value among the confidence values of the touch operations of the target object belongs.
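The maximum-confidence rule above is a plain argmax over the recognized touch operations. A minimal sketch, assuming the recognizer's output is a mapping from touch type to confidence value:

```python
def touch_type_from_confidences(confidences):
    """Return the touch type whose recognized touch operation has the highest
    confidence value (the argmax rule of claim 13).

    confidences: dict mapping touch type -> confidence value."""
    return max(confidences, key=confidences.get)
```

For example, `{"slide": 0.7, "click": 0.2, "long_press": 0.1}` yields `"slide"`.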
  14. The method according to any one of claims 1 to 13, wherein the touch operation parameters comprise any one of the following:
    a sliding distance of a sliding operation, a sliding speed of a sliding operation, a page-turning speed of a page-turning operation, a long-press duration of a long-press operation, and a click frequency of a click operation.
  15. The method according to any one of claims 1 to 13, wherein controlling, according to the touch operation parameters, the electronic device to perform the target function corresponding to the touch type comprises:
    controlling, according to the touch operation parameters, a controlled object of a screen of the electronic device to perform the target function corresponding to the touch type;
    wherein the controlled object comprises any one of the following: a page, an interface, and a control.
  16. The method according to claim 15, wherein controlling, according to the touch operation parameters, the controlled object of the screen of the electronic device to perform the target function corresponding to the touch type comprises:
    controlling a page of the screen to perform a sliding function according to the sliding distance of a sliding operation.
  17. An electronic device, comprising:
    an acquisition module configured to acquire a video sequence corresponding to a non-contact touch operation when a non-contact touch control function of the electronic device is enabled;
    a processing module configured to perform touch recognition on the video sequence to obtain a target object and a touch type;
    a first determination module configured to determine a first distance between the target object and the electronic device;
    a second determination module configured to determine touch operation parameters at the first distance; and
    a control module configured to control, according to the touch operation parameters, the electronic device to perform a target function corresponding to the touch type.
  18. The electronic device according to claim 17, wherein the first determination module is specifically configured to:
    select boundary information of one target object from the boundary information of the target object in each image frame of the video sequence; and
    determine the first distance between the target object and the electronic device according to the selected boundary information of the target object.
  19. An electronic device, comprising:
    a processor and a storage medium storing instructions executable by the processor, the storage medium depending on the processor to perform operations via a communication bus, wherein, when the instructions are executed by the processor, the control method according to any one of claims 1 to 16 is carried out.
  20. A computer storage medium storing executable instructions, wherein, when the executable instructions are executed by one or more processors, the processors perform the control method according to any one of claims 1 to 16.
PCT/CN2022/141461 2022-05-10 2022-12-23 Control method, electronic device and computer storage medium WO2023216613A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210507343.1A CN114840086B (en) 2022-05-10 2022-05-10 Control method, electronic equipment and computer storage medium
CN202210507343.1 2022-05-10

Publications (1)

Publication Number Publication Date
WO2023216613A1 true WO2023216613A1 (en) 2023-11-16

Family

ID=82568888

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/141461 WO2023216613A1 (en) 2022-05-10 2022-12-23 Control method, electronic device and computer storage medium

Country Status (2)

Country Link
CN (1) CN114840086B (en)
WO (1) WO2023216613A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114840086B (en) * 2022-05-10 2024-07-30 Oppo广东移动通信有限公司 Control method, electronic equipment and computer storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103472916A (en) * 2013-09-06 2013-12-25 东华大学 Man-machine interaction method based on human body gesture recognition
US20140118255A1 (en) * 2012-10-25 2014-05-01 Bryed Billerbeck Graphical user interface adjusting to a change of user's disposition
CN107291221A (en) * 2017-05-04 2017-10-24 浙江大学 Across screen self-adaption accuracy method of adjustment and device based on natural gesture
CN109782906A (en) * 2018-12-28 2019-05-21 深圳云天励飞技术有限公司 A kind of gesture identification method of advertisement machine, exchange method, device and electronic equipment
CN110414495A (en) * 2019-09-24 2019-11-05 图谱未来(南京)人工智能研究院有限公司 A kind of gesture identification method, device, electronic equipment and readable storage medium storing program for executing
CN112947755A (en) * 2021-02-24 2021-06-11 Oppo广东移动通信有限公司 Gesture control method and device, electronic equipment and storage medium
CN114840086A (en) * 2022-05-10 2022-08-02 Oppo广东移动通信有限公司 Control method, electronic device and computer storage medium

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3903968B2 (en) * 2003-07-30 2007-04-11 日産自動車株式会社 Non-contact information input device
US9772689B2 (en) * 2008-03-04 2017-09-26 Qualcomm Incorporated Enhanced gesture-based image manipulation
JP2010147784A (en) * 2008-12-18 2010-07-01 Fujifilm Corp Three-dimensional imaging device and three-dimensional imaging method
US8878779B2 (en) * 2009-09-21 2014-11-04 Extreme Reality Ltd. Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
US9104239B2 (en) * 2011-03-09 2015-08-11 Lg Electronics Inc. Display device and method for controlling gesture functions using different depth ranges
US8902198B1 (en) * 2012-01-27 2014-12-02 Amazon Technologies, Inc. Feature tracking for device input
CN103017730B (en) * 2012-11-30 2015-04-01 中兴通讯股份有限公司 Single-camera ranging method and single-camera ranging system
DE112014000441T5 (en) * 2013-01-15 2015-10-15 David Holz Dynamic User Interactions for Display Control and Custom Gesture Interpretation
US9785248B2 (en) * 2013-03-14 2017-10-10 Lg Electronics Inc. Display device and method for driving the same
KR20150064597A (en) * 2013-12-03 2015-06-11 엘지전자 주식회사 Video display device and operating method thereof
KR102167289B1 (en) * 2014-06-03 2020-10-19 엘지전자 주식회사 Video display device and operating method thereof
CN117784927A (en) * 2019-08-19 2024-03-29 华为技术有限公司 Interaction method of air-separation gestures and electronic equipment
CN111084606A (en) * 2019-10-12 2020-05-01 深圳壹账通智能科技有限公司 Vision detection method and device based on image recognition and computer equipment
CN114449153A (en) * 2020-10-31 2022-05-06 广东小天才科技有限公司 Shooting control method of wearable device, wearable device and storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140118255A1 (en) * 2012-10-25 2014-05-01 Bryed Billerbeck Graphical user interface adjusting to a change of user's disposition
CN103472916A (en) * 2013-09-06 2013-12-25 东华大学 Man-machine interaction method based on human body gesture recognition
CN107291221A (en) * 2017-05-04 2017-10-24 浙江大学 Across screen self-adaption accuracy method of adjustment and device based on natural gesture
CN109782906A (en) * 2018-12-28 2019-05-21 深圳云天励飞技术有限公司 A kind of gesture identification method of advertisement machine, exchange method, device and electronic equipment
CN110414495A (en) * 2019-09-24 2019-11-05 图谱未来(南京)人工智能研究院有限公司 A kind of gesture identification method, device, electronic equipment and readable storage medium storing program for executing
CN112947755A (en) * 2021-02-24 2021-06-11 Oppo广东移动通信有限公司 Gesture control method and device, electronic equipment and storage medium
CN114840086A (en) * 2022-05-10 2022-08-02 Oppo广东移动通信有限公司 Control method, electronic device and computer storage medium

Also Published As

Publication number Publication date
CN114840086B (en) 2024-07-30
CN114840086A (en) 2022-08-02


Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22941546

Country of ref document: EP

Kind code of ref document: A1