WO2023245373A1 - Electronic display device and display control method and apparatus therefor, and storage medium - Google Patents

Electronic display device and display control method and apparatus therefor, and storage medium

Info

Publication number
WO2023245373A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
display device
electronic display
target
hand
Prior art date
Application number
PCT/CN2022/099937
Other languages
French (fr)
Chinese (zh)
Inventor
张逸帆
Original Assignee
北京小米移动软件有限公司
Priority date
Filing date
Publication date
Application filed by 北京小米移动软件有限公司 filed Critical 北京小米移动软件有限公司
Priority to PCT/CN2022/099937 priority Critical patent/WO2023245373A1/en
Priority to CN202280004180.3A priority patent/CN117999537A/en
Publication of WO2023245373A1 publication Critical patent/WO2023245373A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present disclosure relates to the technical field of electronic display devices, and in particular, to an electronic display device, a display control method and apparatus therefor, and a storage medium.
  • the present disclosure provides an electronic display device, a display control method and device thereof, and a storage medium.
  • a display control method for an electronic display device including:
  • the floating window interface is displayed on the upper layer of the display interface in a target display area on the screen of the electronic display device, wherein the target display area is determined based on the user's gesture of holding the electronic display device with one hand;
  • the first target interface is controlled to respond to the first one-hand touch operation, where the first one-hand touch operation is an operation performed by the hand with which the user holds the electronic display device with one hand; and,
  • the second target interface is controlled to change synchronously according to the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other one of the display interface and the floating window interface.
  • the first target interface is configured with an application program interface, and controlling the first target interface to respond to the first one-hand touch operation includes:
  • the first target interface is controlled to respond to the first one-hand touch operation.
  • the method also includes:
  • the first target interface is controlled to respond to the second one-hand touch operation.
  • controlling the first target interface to respond to the first one-hand touch operation includes:
  • the display interface is controlled to respond to the mapping result of the first one-hand touch operation.
  • displaying the floating window interface on an upper layer of the display interface in the target display area on the screen of the electronic display device includes:
  • according to the touch data, the gesture of the user holding the electronic display device with one hand is determined
  • the floating window interface is displayed on an upper layer of the display interface in the target display area.
  • the touch data includes a touch image
  • determining the user's gesture of holding the electronic display device with one hand according to the touch data includes:
  • the touch image is input into the trained gesture recognition model, and a recognition result output by the gesture recognition model representing the gesture of the user holding the electronic display device with one hand is obtained.
  • the electronic display device includes an electromagnetic wave energy absorption ratio sensor
  • the touch data includes sensor data collected by the electromagnetic wave energy absorption ratio sensor, and determining, based on the touch data, the user's gesture of holding the electronic display device with one hand includes:
  • a gesture of the user holding the electronic display device is determined.
  • a display control device for an electronic display device comprising:
  • a generating module configured to generate a floating window interface according to the display interface in the current display state of the electronic display device in response to detecting an instruction to start the one-handed operation mode, where the floating window interface is the display interface after being reduced in size;
  • a display module configured to display the floating window interface on an upper layer of the display interface in a target display area on the screen of the electronic display device, wherein the target display area is determined based on the user's gesture of holding the electronic display device with one hand;
  • a response module configured to control the first target interface to respond to the first one-hand touch operation when it is detected that the user performs a first one-hand touch operation in the floating window interface, where the first one-hand touch operation is an operation performed by the hand with which the user holds the electronic display device with one hand;
  • a synchronization module configured to control the second target interface to change synchronously according to changes in the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other one of the display interface and the floating window interface.
  • the first target interface is configured with an application program interface
  • the response module includes:
  • a first determination sub-module configured to determine a target application based on the first one-hand touch operation
  • the calling module is configured to control the first target interface to respond to the first one-hand touch operation by calling the interface of the target application program.
  • the device also includes:
  • the control module is configured to control the first target interface to respond to the second one-hand touch operation when it is detected that the user performs a second one-hand touch operation in the display interface.
  • the response module is configured to:
  • if the first target interface is the display interface and the second target interface is the floating window interface, map the first one-hand touch operation to the display interface and control the display interface to respond to the mapping result of the first one-hand touch operation.
  • the display module includes:
  • An acquisition submodule configured to acquire the user's touch data on the electronic display device
  • the second determination sub-module is configured to determine the gesture of the user holding the electronic display device with one hand according to the touch data
  • the third determination sub-module is configured to determine, based on the user's gesture of holding the electronic display device with one hand, the operation area in which the hand holding the electronic display device with one hand performs a one-hand touch operation on the screen of the electronic display device
  • a fourth determination sub-module configured to determine the target display area according to the operation area
  • the display submodule is configured to display the floating window interface on an upper layer of the display interface in the target display area.
  • the touch data includes a touch image
  • the second determination sub-module is configured to:
  • the touch image is input into the trained gesture recognition model, and a recognition result output by the gesture recognition model representing the gesture of the user holding the electronic display device with one hand is obtained.
  • the electronic display device includes an electromagnetic wave energy absorption ratio sensor
  • the touch data includes sensor data collected by the electromagnetic wave energy absorption ratio sensor
  • the second determination sub-module is configured to:
  • a gesture of the user holding the electronic display device is determined.
  • a computer-readable storage medium on which computer program instructions are stored.
  • the program instructions are executed by a processor, the display control of the electronic display device provided by the first aspect of the present disclosure is implemented. Method steps.
  • an electronic display device including:
  • a processor configured to execute the computer program in the memory to implement the steps of the display control method of the electronic display device provided by the first aspect of the present disclosure.
  • in response to detecting the instruction to start the one-handed operation mode, the electronic display device generates a floating window interface based on the display interface in its current display state.
  • the floating window interface is a reduced size interface of the display interface.
  • a floating window interface is displayed on an upper layer of the display interface in a target display area on the screen of the electronic display device, where the target display area is determined based on a gesture of the user holding the electronic display device with one hand.
  • when it is detected that the user performs a first one-hand touch operation in the floating window interface, the first target interface is controlled to respond to the first one-hand touch operation, wherein the first one-hand touch operation is an operation performed by the hand with which the user holds the electronic display device with one hand.
  • the second target interface is controlled to change synchronously according to the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
  • FIG. 1 is a flow chart of a display control method of an electronic display device according to an exemplary embodiment.
  • FIG. 2 is a block diagram of a display control device of an electronic display device according to an exemplary embodiment.
  • FIG. 3 is a block diagram of an electronic display device according to an exemplary embodiment.
  • FIG. 1 is a flow chart of a display control method of an electronic display device according to an exemplary embodiment. As shown in FIG. 1 , the display control method of the electronic display device may include the following steps.
  • In step S11, in response to detecting an instruction to start the one-handed operation mode, a floating window interface is generated according to the display interface in the current display state of the electronic display device, where the floating window interface is the display interface after being reduced in size.
  • the one-handed operation mode may refer to a mode of using one hand for touch operation, a small screen operation mode, a virtual operation interface mode, etc.
  • the floating window interface is the interface after the display interface is reduced in size.
  • the floating window interface and the display interface have the same display content. It should be noted that in this disclosure, the floating window interface does not belong to the display content of the display interface.
  • a reduced-size floating window interface can be obtained by obtaining the display interface and reducing the length and width of the display interface in equal proportions.
  • the size of the floating window interface may be a default value, such as 3.5 inches, 4 inches, 4.5 inches, etc. This disclosure does not place a specific limit on the size of the floating window interface.
  • a floating window interface is generated according to the display interface in the current display state of the electronic display device.
  • the instruction to start the one-handed operation mode can be the user clicking the screen in a preset manner, the user sliding on the screen along a preset trajectory, the user clicking a preset button, the user inputting a preset voice command, or the user holding the electronic display device with a preset gesture, among other actions.
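  • As a rough illustration only (not part of the disclosure), the equal-proportion reduction described above can be sketched as follows; the 40% scale factor and the example screen resolution are assumed values.

```python
# Illustrative sketch only: compute the floating window size by reducing the
# length and width of the display interface in equal proportions. The scale
# factor and the example resolution are assumed values.

def floating_window_size(display_w: int, display_h: int,
                         scale: float = 0.4) -> tuple[int, int]:
    """Return (width, height) of the floating window, preserving the aspect
    ratio of the display interface."""
    return int(display_w * scale), int(display_h * scale)

# Example: a 1440x3200 display interface scaled to 40% gives a 576x1280
# floating window interface.
print(floating_window_size(1440, 3200))
```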
  • In step S12, the floating window interface is displayed on the upper layer of the display interface in a target display area on the screen of the electronic display device, wherein the target display area is determined based on the gesture of the user holding the electronic display device with one hand.
  • a floating window interface can be displayed on the upper layer of the display interface in the target display area on the screen of the electronic display device.
  • the transparency of the floating window interface can be freely set by the user. By adjusting the transparency of the floating window interface, the degree of occlusion of the display interface by the floating window interface can be reduced.
  • the target display area is determined based on the user's gesture of holding the electronic display device with one hand, and the target display area is an area other than the blind area where the user performs a one-hand touch operation.
  • the display interface in the current display state of the electronic display device may refer to the home screen interface of the electronic display device, or may refer to the superposition result of multiple interfaces other than the floating window interface in this disclosure.
  • In step S13, when it is detected that the user performs a first one-hand touch operation in the floating window interface, the first target interface is controlled to respond to the first one-hand touch operation.
  • the first one-hand touch operation is an operation performed by the hand with which the user holds the electronic display device with one hand.
  • the first one-handed touch operation may be an operation performed by the user holding the electronic display device with one hand.
  • the specific operation may be using a finger (or a finger wearing an auxiliary tool such as a touch glove) to click on the floating window interface.
  • the first one-hand touch operation may also be an operation in which the user uses a finger to resize the floating window interface.
  • the first one-hand touch operation may also be an operation in which the user uses a finger to slide the floating window interface (such as a page turning operation).
  • the first one-hand touch operation may also be an operation in which the user inputs text in the floating window interface.
  • the first one-hand touch operation may also be an operation for the user to draw in the floating window interface.
  • the first one-hand touch operation may also be an operation for the user to play video/audio in the floating window interface.
  • the first one-hand touch operation performed by the user in the floating window interface does not include the action of dragging the floating window interface to adjust the display position of the floating window interface on the screen of the electronic display device.
  • In step S14, the second target interface is controlled to change synchronously according to the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other one of the display interface and the floating window interface.
  • the second target interface is controlled to change synchronously according to the change of the first target interface. That is, in the present disclosure, the first target interface and the second target interface have a mutual mapping relationship. The display content of the first target interface and the second target interface remain consistent, but the size of the displayed content may be inconsistent.
  • For example, if the first target interface is the display interface and the second target interface is the floating window interface, then when the display interface changes in response to the first one-hand touch operation, the floating window interface is controlled to change synchronously.
  • the display interface is configured with an application program interface, and the corresponding interface can be called according to the first one-hand touch operation.
  • the background application can be triggered to respond, causing the display interface to change accordingly.
  • Conversely, if the first target interface is the floating window interface and the second target interface is the display interface, then when the floating window interface changes in response to the first one-hand touch operation, the display interface is controlled to change synchronously.
  • the floating window interface is configured with an application program interface, and the corresponding interface can be called according to the first one-hand touch operation.
  • the background application can be triggered to respond, causing the floating window interface to change accordingly.
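  • The mutual mapping between the two interfaces could be approximated with a simple observer-style sketch such as the one below; the class and method names are hypothetical and are not taken from the disclosure.

```python
# Hypothetical sketch of the mutual mapping: whichever interface responds to a
# touch operation (the first target interface) pushes its new content to the
# other interface (the second target interface), so both always display the
# same content, possibly at different sizes.

class Interface:
    def __init__(self, name: str):
        self.name = name
        self.content = "home screen"
        self.peer = None                 # the other interface in the mapping

    def respond(self, operation: str) -> None:
        """Apply a touch operation locally, then propagate the change."""
        self.content = f"content after {operation}"
        if self.peer is not None:
            self.peer.content = self.content   # synchronous change

display = Interface("display interface")
floating = Interface("floating window interface")
display.peer, floating.peer = floating, display

floating.respond("tap on an app icon")          # first target interface responds
assert display.content == floating.content      # second target interface follows
```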
  • the method also includes:
  • the first target interface is controlled to respond to the second one-hand touch operation.
  • the second one-hand touch operation may be performed by the hand other than the one with which the user holds the electronic display device. For example, if the user holds the electronic display device with the right hand, the second one-hand touch operation refers to an operation performed by the left hand.
  • Alternatively, the second one-hand touch operation may also be performed by the hand with which the user holds the electronic display device with one hand.
  • the second one-handed touch operation may be an operation in which the user (left hand) uses a finger, a stylus, or a mouse to click on a function icon (such as a button control) of the application program in the display interface.
  • the second one-hand touch operation may also be an operation in which the user uses a finger to adjust the size of the display interface.
  • the second one-hand touch operation may also be an operation in which the user uses a finger, a stylus, or a mouse to slide the display interface (such as a page-turning operation).
  • the second one-hand touch operation may also be an operation in which the user inputs text in the display interface.
  • the second one-hand touch operation may also be an operation for the user to draw within the display interface.
  • the second one-hand touch operation may also be an operation by the user to play video/audio in the display interface.
  • the floating window interface is determined as the first target interface, and the display interface is determined as the second target interface.
  • the display interface is a mapping interface (projection interface) of the floating window interface.
  • the display interface is determined as the first target interface, and the floating window interface is determined as the second target interface.
  • the floating window interface is a mapping interface (projection interface) of the display interface.
  • in order to generate the floating window interface more simply and quickly, the display interface can be used by default as the first target interface, and the floating window interface as the second target interface.
  • the floating window interface is a mapping interface (projection interface) of the display interface, and the floating window interface has the function of displaying the projection result.
  • the electronic display device responds to detecting an instruction to start the one-handed operation mode and generates a floating window interface based on the display interface in the current display state of the electronic display device.
  • the floating window interface is the display interface after being reduced in size.
  • a floating window interface is displayed on an upper layer of the display interface in a target display area on the screen of the electronic display device, where the target display area is determined based on a gesture of the user holding the electronic display device with one hand.
  • when it is detected that the user performs a first one-hand touch operation in the floating window interface, the first target interface is controlled to respond to the first one-hand touch operation, wherein the first one-hand touch operation is an operation performed by the hand with which the user holds the electronic display device with one hand.
  • the second target interface is controlled to change synchronously according to the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
  • this method of the present disclosure not only facilitates the user to perform one-handed touch operations in the floating window interface, but also facilitates the user to use the other hand to perform touch operations within the display interface.
  • the teacher uses his right hand to write text in the floating window interface
  • the teacher uses his left hand to slide left in the display interface to move the image in the display interface to the left, thereby making it easier for the right hand to continue writing and avoiding overlapping text.
  • the first target interface is configured with an application program interface, and controlling the first target interface to respond to the first one-hand touch operation includes:
  • the first one-hand touch operation is an operation in which the user clicks on the icon of application A in the floating window interface
  • it is determined, based on the operation position and operation characteristics (such as a click or slide) of the first one-hand touch operation, that the target application program is application A, and the interface a that starts application A is called.
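  • A minimal sketch of resolving the target application from the operation position is shown below; the icon layout, application names, and function name are illustrative assumptions, not taken from the disclosure.

```python
# Hypothetical sketch: resolve the target application by hit-testing the
# position of the first one-hand touch operation against the icon layout of
# the floating window interface. Icon rectangles and names are assumptions.

from typing import Optional

ICON_RECTS = {
    "application_A": (0, 0, 100, 100),      # (left, top, right, bottom)
    "application_B": (0, 110, 100, 210),
}

def target_application(x: int, y: int) -> Optional[str]:
    """Return the application whose icon contains the touch point, if any."""
    for app, (left, top, right, bottom) in ICON_RECTS.items():
        if left <= x < right and top <= y < bottom:
            return app
    return None

app = target_application(30, 150)    # -> "application_B"
# calling the interface of the resolved application would follow here
```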
  • controlling the first target interface to respond to the first one-hand touch operation includes:
  • mapping the first one-hand touch operation to the display interface, and controlling the display interface to respond to the mapping result of the first one-hand touch operation.
  • the first target interface is a display interface and the second target interface is a floating window interface
  • the first one-hand touch operation is mapped to the display interface.
  • the display interface responds to the mapping result of the first one-hand touch operation. For example, assuming that the first one-hand touch operation is the user clicking the icon of application B in the floating window interface, the operation is mapped to the display interface, the resulting mapping is the user clicking the icon of application B in the display interface, and the display interface changes in response to that click.
  • the first target interface is a floating window interface and the second target interface is a display interface
  • the second one-hand touch operation is mapped to the floating window interface.
  • the floating window interface responds to the mapping result of the second one-handed touch operation.
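  • One plausible reading of mapping a touch operation from one interface to the other is a coordinate transform between the two rectangles; the sketch below assumes that reading (floating window to full-size display interface) and uses made-up window and screen dimensions.

```python
# Assumed interpretation: translate a touch point inside the floating window
# rectangle into the corresponding point on the full-size display interface.

def map_touch_to_display(x: float, y: float,
                         win_left: float, win_top: float,
                         win_w: float, win_h: float,
                         disp_w: float, disp_h: float) -> tuple:
    """Normalize the point within the floating window, then rescale it to
    display-interface coordinates."""
    u = (x - win_left) / win_w
    v = (y - win_top) / win_h
    return u * disp_w, v * disp_h

# A tap at (700, 2900) inside a 576x1280 window whose top-left corner is at
# (600, 1800) maps to (250, 2750) on a 1440x3200 display interface.
print(map_touch_to_display(700, 2900, 600, 1800, 576, 1280, 1440, 3200))
```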
  • one way of determining the target display area of the floating window interface on the screen of the electronic display device may be:
  • acquire the user's touch data on the electronic display device; according to the touch data, determine the user's gesture of holding the electronic display device with one hand; and, according to that gesture, determine the operable area in which the hand holding the electronic display device can perform a one-hand touch operation on the screen of the electronic display device.
  • the target display area is then determined based on the operable area in which the user performs touch operations with one hand. For example, that operable area is used as the target display area. For another example, a part of that operable area is used as the target display area.
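  • A minimal sketch of deriving a target display area from the recognized holding hand is given below; the corner placement rule and the margin value are assumptions for illustration only.

```python
# Illustrative sketch: choose the target display area from the recognized
# holding hand so the floating window sits inside the reachable region.
# The corner placement rule and the margin are assumed values.

def target_display_area(hand: str, screen_w: int, screen_h: int,
                        win_w: int, win_h: int) -> tuple[int, int]:
    """Return the top-left corner of the target display area."""
    margin = 40                      # assumed gap from the screen edge, in pixels
    top = screen_h - win_h - margin
    if hand == "right":              # right-hand grip -> lower-right corner
        return screen_w - win_w - margin, top
    if hand == "left":               # left-hand grip -> lower-left corner
        return margin, top
    return (screen_w - win_w) // 2, top   # fallback: bottom centre

print(target_display_area("right", 1440, 3200, 576, 1280))
```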
  • the touch data may include a touch image.
  • an implementation method of determining the user's gesture of holding the electronic display device with one hand may be:
  • for example, an m*n*1 touch image can be obtained when the user holds the electronic display device.
  • by inputting the touch image into the trained gesture recognition model, the recognition result output by the gesture recognition model that represents the gesture of the user holding the electronic display device can be obtained.
  • the training method of the gesture recognition model may be to determine the touch image sample and the gesture label of the touch image sample, train the gesture recognition model based on the touch image sample and the gesture label of the touch image sample, and obtain the trained Gesture recognition model.
  • An example is to input the m*n*1 (assumed to be 16*40*1) touch image into a neural network (such as ResNet, ResNeXt, MobileNet, or ShuffleNet) and calculate x1 feature values.
  • Taking MobileNet as an example: using MobileNet's convolution layer (conv) to process the 16*40*1 touch image, a convolution result of 2*5*x1 can be obtained.
  • Using MobileNet's average pooling layer (avg pool) to calculate the mean of 2*5*x1 yields 1*1*x1 feature values (that is, x1 feature values), which are input into MobileNet's fully connected + softmax layer to obtain a probability vector.
  • the probability vector represents the gesture of the user holding the electronic display device, such as holding the electronic display device with the right hand, holding the electronic display device with the left hand, gestures other than holding the electronic display device with one hand, that is, holding the electronic display device with both hands.
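  • A minimal PyTorch sketch of this single-branch recognizer is shown below. The 16*40*1 input and the conv, average pooling, and fully connected + softmax pipeline follow the description above, but the channel count (x1 = 32), the single convolution layer, and the kernel size are assumptions; the disclosure only names MobileNet-style networks as possible backbones.

```python
# Minimal PyTorch sketch of the single-branch recognizer: a strided convolution
# reduces the 1x16x40 touch image to a 2x5 feature map, global average pooling
# gives one feature vector, and a fully connected layer + softmax outputs the
# probabilities for right-hand, left-hand, and two-hand grips. The channel
# count (x1 = 32) and kernel size are assumptions.

import torch
import torch.nn as nn

class GripGestureNet(nn.Module):
    def __init__(self, x1: int = 32, num_classes: int = 3):
        super().__init__()
        self.conv = nn.Conv2d(1, x1, kernel_size=8, stride=8)  # 16x40 -> 2x5
        self.pool = nn.AdaptiveAvgPool2d(1)                    # 2x5 -> 1x1
        self.fc = nn.Linear(x1, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.pool(self.conv(x)).flatten(1)   # (batch, x1) feature values
        return torch.softmax(self.fc(feats), dim=1)  # probability vector

probs = GripGestureNet()(torch.rand(1, 1, 16, 40))   # one 16x40x1 touch image
```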
  • In another example, an m*n*1 (assumed to be 16*40*1) touch image is segmented to obtain a first segmented image and a second segmented image, where the first segmented image includes the pixels around the edges of the touch image and the second segmented image includes the remaining interior of the touch image.
  • (That is, the first segmented image includes a first sub-image, a second sub-image, a third sub-image, and a fourth sub-image.)
  • Input the first segmented image into a neural network (such as ResNet, ResNeXt, MobileNet, or ShuffleNet) and calculate Y1 feature values.
  • Using MobileNet's convolution layer (conv) to process the 112*4*1 first segmented image, a convolution result of 28*1*Y1 can be obtained.
  • Using MobileNet's average pooling layer (avg pool) to calculate the mean of 28*1*Y1 yields 1*1*Y1 feature values (that is, Y1 feature values); the 1*1*Y1 feature values can represent the 112*4*1 first segmented image.
  • The second segmented image is input into a neural network (such as ResNet, ResNeXt, MobileNet, or ShuffleNet), and Y2 feature values are calculated.
  • Using MobileNet's convolution layer (conv) to process the 8*32*1 second segmented image, a convolution result of 2*5*Y2 can be obtained.
  • Using MobileNet's average pooling layer (avg pool) to calculate the mean of 2*5*Y2 yields 1*1*Y2 feature values (that is, Y2 feature values); the 1*1*Y2 feature values can represent the 8*32*1 second segmented image.
  • Concatenate the 1*1*Y1 feature values with the 1*1*Y2 feature values to obtain a 1*1*(Y1+Y2) concatenated vector.
  • The 1*1*(Y1+Y2) concatenated vector is input into MobileNet's fully connected + softmax layer to obtain the probability vector.
  • the probability vector represents the gesture of the user holding the electronic display device, such as holding the electronic display device with the right hand, holding the electronic display device with the left hand, holding the electronic display device with both hands, etc.
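  • A hypothetical PyTorch sketch of this two-branch variant is shown below: the four edge strips of width 4 are stacked end-to-end into the 112*4*1 first segmented image, the 8*32*1 interior is the second segmented image, and the pooled Y1 and Y2 feature values are concatenated before the fully connected + softmax layer. The branch depths and kernel sizes are assumptions.

```python
# Hypothetical sketch of the two-branch variant. Each branch is convolved and
# average-pooled into Y1 or Y2 feature values, which are concatenated and fed
# to a fully connected + softmax layer. Kernel sizes and Y1/Y2 are assumptions.

import torch
import torch.nn as nn

def split_touch_image(img: torch.Tensor, w: int = 4):
    """img: (batch, 1, 16, 40) touch image -> (edges, interior)."""
    top = img[:, :, :w, :].transpose(2, 3)        # (B, 1, 40, 4)
    bottom = img[:, :, -w:, :].transpose(2, 3)    # (B, 1, 40, 4)
    left = img[:, :, :, :w]                       # (B, 1, 16, 4)
    right = img[:, :, :, -w:]                     # (B, 1, 16, 4)
    edges = torch.cat([top, bottom, left, right], dim=2)   # (B, 1, 112, 4)
    interior = img[:, :, w:-w, w:-w]                        # (B, 1, 8, 32)
    return edges, interior

class TwoBranchGripNet(nn.Module):
    def __init__(self, y1: int = 16, y2: int = 16, num_classes: int = 3):
        super().__init__()
        # 112x4 edge image -> 28x1 feature map, pooled to Y1 feature values
        self.edge_branch = nn.Sequential(nn.Conv2d(1, y1, 4, stride=4),
                                         nn.AdaptiveAvgPool2d(1))
        # 8x32 interior, pooled to Y2 feature values
        self.core_branch = nn.Sequential(nn.Conv2d(1, y2, 4, stride=4),
                                         nn.AdaptiveAvgPool2d(1))
        self.fc = nn.Linear(y1 + y2, num_classes)

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        edges, interior = split_touch_image(img)
        f1 = self.edge_branch(edges).flatten(1)       # (batch, Y1)
        f2 = self.core_branch(interior).flatten(1)    # (batch, Y2)
        concat = torch.cat([f1, f2], dim=1)           # (Y1 + Y2) vector
        return torch.softmax(self.fc(concat), dim=1)  # probability vector

probs = TwoBranchGripNet()(torch.rand(1, 1, 16, 40))
```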
  • Another example is to segment an m*n*1 (assumed to be 16*40*1) touch image to obtain a first segmented image, where the first segmented image includes the pixels around the edges of the touch image.
  • The touch image has two long sides and two short sides, and 4 represents the segmentation width: move the first side of the touch image inward by 4 pixels and segment to obtain a first sub-image with a width of 4; move the second side of the touch image inward by 4 pixels and segment to obtain a second sub-image with a width of 4; move the third side of the touch image inward by 4 pixels and segment to obtain a third sub-image with a width of 4; move the fourth side of the touch image inward by 4 pixels and segment to obtain a fourth sub-image with a width of 4.
  • That is, the first segmented image includes the first sub-image, the second sub-image, the third sub-image, and the fourth sub-image.
  • Input the first segmented image into a neural network such as ResNet, ResNeXt, MobileNet, ShuffleNet, etc.
  • Using MobileNet's convolution layer (conv) to process the first segmented image, a convolution result of 28*1*Y1 can be obtained; averaging it with MobileNet's average pooling layer (avg pool) then yields 1*1*Y1 feature values (that is, Y1 feature values).
  • Input the m*n*1 touch image itself into a neural network (such as ResNet, ResNeXt, MobileNet, or ShuffleNet) and calculate Y2 feature values.
  • Using MobileNet's convolution layer (conv) to process the 16*40*1 touch image, a convolution result of 2*5*Y2 can be obtained.
  • Using MobileNet's average pooling layer (avg pool) to calculate the mean of 2*5*Y2 yields 1*1*Y2 feature values (that is, Y2 feature values); the 1*1*Y2 feature values can represent the 16*40*1 touch image.
  • Concatenate the 1*1*Y1 feature values with the 1*1*Y2 feature values to obtain a 1*1*(Y1+Y2) concatenated vector.
  • The 1*1*(Y1+Y2) concatenated vector is input into MobileNet's fully connected + softmax layer to obtain the probability vector.
  • the probability vector represents the gesture of the user holding the electronic display device, such as holding the electronic display device with the right hand, holding the electronic display device with the left hand, holding the electronic display device with both hands, etc.
  • Another example is to input the m*n*1 touch image into a target detection model (such as ResNet, ResNeXt, MobileNet, or ShuffleNet) and calculate x2 feature values.
  • Use the second fully connected layer + softmax to calculate the total probability vector based on the position (x, y, w, h) of each anchor box and the category probability value corresponding to each anchor box.
  • the total probability vector represents the gesture of the user holding the electronic display device, such as holding the electronic display device with the right hand, holding the electronic display device with the left hand, holding the electronic display device with both hands, etc.
  • the anchor box refers to multiple a priori boxes with different aspect ratios predefined by the algorithm, centered on the anchor point in the target detection algorithm.
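  • A highly simplified sketch of this detection-based variant is given below: the position (x, y, w, h) of each anchor box and its per-class probabilities are flattened and passed through a second fully connected layer with softmax to obtain the total probability vector. The anchor count, class count, and the upstream detector are assumptions and are omitted here.

```python
# Hypothetical sketch: an upstream detector (not shown) yields NUM_ANCHORS
# anchor-box positions (x, y, w, h) and per-box class probabilities; a second
# fully connected layer + softmax turns them into one total probability vector
# over the grip gestures. All sizes below are assumed values.

import torch
import torch.nn as nn

NUM_ANCHORS, NUM_BOX_CLASSES, NUM_GESTURES = 10, 4, 3

class GripFromDetections(nn.Module):
    def __init__(self):
        super().__init__()
        per_anchor = 4 + NUM_BOX_CLASSES               # (x, y, w, h) + class probs
        self.fc2 = nn.Linear(NUM_ANCHORS * per_anchor, NUM_GESTURES)

    def forward(self, boxes: torch.Tensor, class_probs: torch.Tensor):
        """boxes: (batch, NUM_ANCHORS, 4); class_probs: (batch, NUM_ANCHORS, NUM_BOX_CLASSES)."""
        x = torch.cat([boxes, class_probs], dim=2).flatten(1)
        return torch.softmax(self.fc2(x), dim=1)       # total probability vector

probs = GripFromDetections()(torch.rand(1, NUM_ANCHORS, 4),
                             torch.rand(1, NUM_ANCHORS, NUM_BOX_CLASSES))
```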
  • one implementation of determining the display area of the floating window interface on the screen of the electronic display device may be:
  • the operable area determines the display area of the floating window interface. For example, assuming it is detected that the user is located at the lower right of the electronic display device, the lower right corner of the screen of the electronic display device can be determined as an operable area that is convenient for the user to operate.
  • the electronic display device in the present disclosure may include an electromagnetic wave energy absorption ratio (SAR, specific absorption rate) sensor
  • the touch data may include sensor data collected by the electromagnetic wave energy absorption ratio sensor.
  • an implementation method of determining the gesture of the user holding the electronic display device may be:
  • the user's contact area with the electronic display device is determined based on the sensor data. According to the position distribution of the contact area on the electronic display device, the gesture of the user holding the electronic display device is determined. For example, assuming that the multiple touch areas are located on the rear right side and the front right side of the electronic display device, it can be determined that the user's gesture of holding the electronic display device is holding the electronic device with the left hand.
  • the SAR sensor can determine the contact status between the user and the electronic display device (such as a mobile phone) by detecting capacitance.
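  • A minimal sketch of this SAR-based decision is shown below; the region names and the decision rule (contact on the rear right and front right implies a left-hand grip, as in the example above) are simplified assumptions.

```python
# Illustrative sketch: infer the holding hand from where the SAR-sensor
# contact regions lie on the device. Region names and the rule are assumptions
# based on the example in the text.

def holding_hand(contact_regions: set) -> str:
    if {"rear_right", "front_right"} <= contact_regions:
        return "left"      # the right side of the device rests in the left hand
    if {"rear_left", "front_left"} <= contact_regions:
        return "right"
    return "unknown_or_both"

print(holding_hand({"rear_right", "front_right"}))   # -> "left"
```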
  • the display control method of the electronic display device may further include:
  • the floating window interface is controlled to enter a hidden state. For example, if the floating window interface is hidden in the user-invisible area at the edge of the screen, the user can redisplay the floating window interface in the user's visible area by dragging the floating window interface hidden at the edge of the screen. For another example, the floating window interface is changed into a floating ball (suspended point) with a smaller area. By clicking on the floating ball (suspended point), the user can change the floating ball (suspended point) into a floating window interface again.
  • the display control method of the electronic display device may further include:
  • the floating window interface can be closed or controlled to enter a hidden state.
  • the display position of the floating window interface is adjusted according to the changed gesture.
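  • The hide/restore behaviour described above could be modelled as a small state machine, sketched below; the state names, events, and transitions are hypothetical.

```python
# Hypothetical sketch: the floating window can collapse into a floating ball
# (suspended point) or be tucked into the invisible area at the screen edge,
# and a tap or a drag brings it back.

from enum import Enum, auto

class FloatState(Enum):
    WINDOW = auto()    # normal floating window interface
    BALL = auto()      # collapsed floating ball / suspended point
    HIDDEN = auto()    # hidden at the screen edge

def on_event(state: FloatState, event: str) -> FloatState:
    if state is FloatState.WINDOW and event == "hide":
        return FloatState.BALL        # or HIDDEN, depending on configuration
    if state is FloatState.BALL and event == "tap_ball":
        return FloatState.WINDOW
    if state is FloatState.HIDDEN and event == "drag_from_edge":
        return FloatState.WINDOW
    return state
```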
  • another implementation of determining the display area of the floating window interface on the screen of the electronic display device is to determine a fixed target display area based on the user's selection or preset settings.
  • FIG. 2 is a block diagram of a display control device of an electronic display device according to an exemplary embodiment.
  • the device 200 includes:
  • the generation module 210 is configured to generate a floating window interface according to the display interface in the current display state of the electronic display device in response to detecting an instruction to start the one-handed operation mode, where the floating window interface is the display interface after being reduced in size;
  • the display module 220 is configured to display the floating window interface on the upper layer of the display interface in a target display area on the screen of the electronic display device, wherein the target display area is determined based on the gesture of the user holding the electronic display device with one hand;
  • the response module 230 is configured to control the first target interface to respond to the first one-hand touch operation when it is detected that the user performs a first one-hand touch operation in the floating window interface, where the first one-hand touch operation is an operation performed by the hand with which the user holds the electronic display device with one hand;
  • the synchronization module 240 is configured to control the second target interface to perform synchronous changes according to changes in the first target interface, wherein the first target interface is one of the display interface and the floating window interface, The second target interface is the other one of the display interface and the floating window interface.
  • the electronic display device responds to detecting an instruction to activate the one-handed operation mode and generates a floating window interface based on the display interface in the current display state of the electronic display device.
  • the floating window interface is the display interface after being reduced in size.
  • a floating window interface is displayed on an upper layer of the display interface in a target display area on the screen of the electronic display device, where the target display area is determined based on a gesture of the user holding the electronic display device with one hand.
  • when it is detected that the user performs a first one-hand touch operation in the floating window interface, the first target interface is controlled to respond to the first one-hand touch operation, wherein the first one-hand touch operation is an operation performed by the hand with which the user holds the electronic display device with one hand.
  • the second target interface is controlled to change synchronously according to the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
  • the first target interface is configured with an application program interface
  • the response module 230 includes:
  • a first determination sub-module configured to determine a target application based on the first one-hand touch operation
  • the calling module is configured to control the first target interface to respond to the first one-hand touch operation by calling the interface of the target application program.
  • the device 200 also includes:
  • the control module is configured to control the first target interface to respond to the second one-hand touch operation when it is detected that the user performs a second one-hand touch operation in the display interface.
  • the response module 230 is configured to:
  • if the first target interface is the display interface and the second target interface is the floating window interface, map the first one-hand touch operation to the display interface and control the display interface to respond to the mapping result of the first one-hand touch operation.
  • the display module 220 includes:
  • An acquisition submodule configured to acquire the user's touch data on the electronic display device
  • the second determination sub-module is configured to determine the gesture of the user holding the electronic display device with one hand according to the touch data
  • the third determination sub-module is configured to determine, based on the user's gesture of holding the electronic display device with one hand, the operation area in which the hand holding the electronic display device with one hand performs a one-hand touch operation on the screen of the electronic display device
  • a fourth determination sub-module configured to determine the target display area according to the operation area
  • the display submodule is configured to display the floating window interface on an upper layer of the display interface in the target display area.
  • the touch data includes a touch image
  • the second determination sub-module is configured to:
  • the touch image is input into the trained gesture recognition model, and a recognition result output by the gesture recognition model representing the gesture of the user holding the electronic display device with one hand is obtained.
  • the electronic display device includes an electromagnetic wave energy absorption ratio sensor
  • the touch data includes sensor data collected by the electromagnetic wave energy absorption ratio sensor
  • the second determination sub-module is configured to:
  • a gesture of the user holding the electronic display device is determined.
  • the present disclosure also provides a computer-readable storage medium on which computer program instructions are stored. When the program instructions are executed by a processor, the steps of the display control method of the electronic display device provided by the present disclosure are implemented.
  • FIG. 3 is a block diagram of an electronic display device 800 according to an exemplary embodiment.
  • the electronic display device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
  • electronic display device 800 may include one or more of the following components: processing component 802, memory 804, power supply component 806, multimedia component 808, audio component 810, input/output interface 812, sensor component 814, and communication component 816 .
  • Processing component 802 generally controls the overall operations of electronic display device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • the processing component 802 may include one or more processors 820 to execute instructions to complete all or part of the steps of the above-mentioned display control method of the electronic display device.
  • processing component 802 may include one or more modules that facilitate interaction between processing component 802 and other components.
  • processing component 802 may include a multimedia module to facilitate interaction between multimedia component 808 and processing component 802.
  • Memory 804 is configured to store various types of data to support operations at electronic display device 800 . Examples of such data include instructions for any application or method operating on electronic display device 800, contact data, phonebook data, messages, pictures, videos, etc.
  • Memory 804 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • Power supply component 806 provides power to various components of electronic display device 800 .
  • Power supply components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power to electronic display device 800 .
  • Multimedia component 808 includes a screen that provides an output interface between the electronic display device 800 and the user.
  • the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide action.
  • multimedia component 808 includes a front-facing camera and/or a rear-facing camera.
  • the front camera and/or the rear camera may receive external multimedia data.
  • Each front-facing camera and rear-facing camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
  • Audio component 810 is configured to output and/or input audio signals.
  • audio component 810 includes a microphone (MIC) configured to receive external audio signals when electronic display device 800 is in operating modes, such as call mode, recording mode, and voice recognition mode. The received audio signal may be further stored in memory 804 or sent via communication component 816 .
  • audio component 810 also includes a speaker for outputting audio signals.
  • the input/output interface 812 provides an interface between the processing component 802 and a peripheral interface module, which may be a keyboard, a click wheel, a button, etc. These buttons may include, but are not limited to: Home button, Volume buttons, Start button, and Lock button.
  • Sensor component 814 includes one or more sensors for providing various aspects of status assessment for electronic display device 800 .
  • the sensor component 814 can detect the open/closed state of the electronic display device 800 and the relative positioning of components, such as the display and keypad of the electronic display device 800; the sensor component 814 can also detect a change in position of the electronic display device 800 or of a component of the electronic display device 800, the presence or absence of user contact with the electronic display device 800, the orientation or acceleration/deceleration of the electronic display device 800, and changes in the temperature of the electronic display device 800.
  • Sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 816 is configured to facilitate wired or wireless communication between electronic display device 800 and other devices.
  • the electronic display device 800 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
  • the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
  • the communications component 816 also includes a near field communications (NFC) module to facilitate short-range communications.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
  • the electronic display device 800 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for executing the display control method of the above electronic display device.
  • In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions, such as the memory 804 including instructions, is also provided; the instructions can be executed by the processor 820 of the electronic display device 800 to complete the above-mentioned display control method of the electronic display device. The non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • In another exemplary embodiment, a computer program product is also provided, comprising a computer program executable by a programmable device, the computer program having a code portion for performing the above display control method of the electronic display device when executed by the programmable device.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An electronic display device and a display control method and apparatus therefor, and a storage medium. The method comprises: in response to an instruction to enable a one-hand operation mode having been detected, generating a floating window interface according to a display interface of an electronic display device in the current display state (S11); displaying the floating window interface on an upper layer of the display interface in a target display region on a screen of the electronic display device, wherein the target display region is determined on the basis of a gesture of a user holding the electronic display device with one hand (S12); when it is detected that the user performs a first one-hand touch-control operation in the floating window interface, controlling a first target interface to respond to the first one-hand touch-control operation (S13); and according to a change of the first target interface, controlling a second target interface to synchronously change, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface (S14). By means of the method, it is more convenient for a user to perform a one-hand touch-control operation.

Description

Electronic display device and display control method and apparatus therefor, and storage medium
Technical field
The present disclosure relates to the technical field of electronic display devices, and in particular, to an electronic display device, a display control method and apparatus therefor, and a storage medium.
Background
With the rapid development of screen technology, large-screen electronic display devices and folding-screen electronic display devices are gradually becoming popular among the public.
In order to better watch videos, browse the web, or view documents, people are more willing to buy large-screen electronic display devices. However, large-screen electronic display devices have the problem of inconvenient touch operation.
Summary
In order to overcome the problems existing in the related art, the present disclosure provides an electronic display device, a display control method and apparatus therefor, and a storage medium.
According to a first aspect of the embodiments of the present disclosure, a display control method for an electronic display device is provided, the method including:
in response to detecting an instruction to start the one-handed operation mode, generating a floating window interface according to the display interface in the current display state of the electronic display device, where the floating window interface is the display interface after being reduced in size;
displaying the floating window interface on an upper layer of the display interface in a target display area on the screen of the electronic display device, wherein the target display area is determined based on the user's gesture of holding the electronic display device with one hand;
when it is detected that the user performs a first one-hand touch operation in the floating window interface, controlling a first target interface to respond to the first one-hand touch operation, the first one-hand touch operation being an operation performed by the hand with which the user holds the electronic display device with one hand; and,
controlling a second target interface to change synchronously according to the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other one of the display interface and the floating window interface.
Optionally, the first target interface is configured with an interface of an application program, and controlling the first target interface to respond to the first one-hand touch operation includes:
determining a target application program according to the first one-hand touch operation;
controlling the first target interface to respond to the first one-hand touch operation by calling the interface of the target application program.
Optionally, the method further includes:
when it is detected that the user performs a second one-hand touch operation in the display interface, controlling the first target interface to respond to the second one-hand touch operation.
Optionally, if the first target interface is the display interface and the second target interface is the floating window interface, controlling the first target interface to respond to the first one-hand touch operation when it is detected that the user performs the first one-hand touch operation in the floating window interface includes:
mapping the first one-hand touch operation to the display interface;
controlling the display interface to respond to the mapping result of the first one-hand touch operation.
Optionally, displaying the floating window interface on an upper layer of the display interface in the target display area on the screen of the electronic display device includes:
acquiring the user's touch data on the electronic display device;
determining, according to the touch data, the gesture of the user holding the electronic display device with one hand;
determining, according to the gesture of the user holding the electronic display device with one hand, the operation area in which the hand holding the electronic display device performs a one-hand touch operation on the screen of the electronic display device;
determining the target display area according to the operation area;
displaying the floating window interface on an upper layer of the display interface in the target display area.
Optionally, the touch data includes a touch image, and determining the gesture of the user holding the electronic display device with one hand according to the touch data includes:
inputting the touch image into a trained gesture recognition model to obtain a recognition result, output by the gesture recognition model, that represents the gesture of the user holding the electronic display device with one hand.
Optionally, the electronic display device includes an electromagnetic wave energy absorption ratio sensor, the touch data includes sensor data collected by the electromagnetic wave energy absorption ratio sensor, and determining the gesture of the user holding the electronic display device with one hand according to the touch data includes:
determining the contact area between the user and the electronic display device according to the sensor data;
determining the gesture of the user holding the electronic display device according to the position distribution of the contact area on the electronic display device.
According to a second aspect of the embodiments of the present disclosure, a display control apparatus for an electronic display device is provided, the apparatus including:
a generation module configured to, in response to detecting an instruction to start a one-handed operation mode, generate a floating window interface according to a display interface in a current display state of the electronic display device, the floating window interface being a size-reduced version of the display interface;
a display module configured to display the floating window interface on an upper layer of the display interface within a target display area on the screen of the electronic display device, wherein the target display area is determined based on a gesture with which the user holds the electronic display device with one hand;
a response module configured to, in a case that it is detected that the user performs a first one-hand touch operation in the floating window interface, control a first target interface to respond to the first one-hand touch operation, the first one-hand touch operation being an operation performed by the hand with which the user holds the electronic display device; and
a synchronization module configured to control a second target interface to change synchronously according to a change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
Optionally, the first target interface is configured with an interface of an application program, and the response module includes:
a first determination submodule configured to determine a target application program according to the first one-hand touch operation; and
a calling module configured to control the first target interface to respond to the first one-hand touch operation by calling an interface of the target application program.
Optionally, the apparatus further includes:
a control module configured to, in a case that it is detected that the user performs a second one-hand touch operation in the display interface, control the first target interface to respond to the second one-hand touch operation.
Optionally, the response module is configured to:
if the first target interface is the display interface and the second target interface is the floating window interface, map the first one-hand touch operation onto the display interface, and control the display interface to respond to a mapping result of the first one-hand touch operation.
Optionally, the display module includes:
an acquisition submodule configured to acquire touch data of the user on the electronic display device;
a second determination submodule configured to determine, according to the touch data, the gesture with which the user holds the electronic display device with one hand;
a third determination submodule configured to determine, according to the gesture with which the user holds the electronic display device with one hand, an operation area in which the hand holding the electronic display device performs one-hand touch operations on the screen of the electronic display device;
a fourth determination submodule configured to determine the target display area according to the operation area; and
a display submodule configured to display the floating window interface on an upper layer of the display interface within the target display area.
Optionally, the touch data includes a touch image, and the second determination submodule is configured to:
input the touch image into a trained gesture recognition model to obtain a recognition result output by the gesture recognition model and representing the gesture with which the user holds the electronic display device with one hand.
Optionally, the electronic display device includes an electromagnetic wave energy absorption ratio sensor, the touch data includes sensor data collected by the electromagnetic wave energy absorption ratio sensor, and the second determination submodule is configured to:
determine a contact area between the user and the electronic display device according to the sensor data; and
determine the gesture with which the user holds the electronic display device according to a position distribution of the contact area on the electronic display device.
According to a third aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, on which computer program instructions are stored, and when the program instructions are executed by a processor, the steps of the display control method for an electronic display device provided in the first aspect of the present disclosure are implemented.
According to a fourth aspect of the embodiments of the present disclosure, an electronic display device is provided, including:
a memory on which a computer program is stored; and
a processor configured to execute the computer program in the memory to implement the steps of the display control method for an electronic display device provided in the first aspect of the present disclosure.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
In response to detecting an instruction to start the one-handed operation mode, the electronic display device generates a floating window interface according to the display interface in the current display state of the electronic display device, the floating window interface being a size-reduced version of the display interface. The floating window interface is displayed on an upper layer of the display interface within a target display area on the screen of the electronic display device, where the target display area is determined based on a gesture with which the user holds the electronic display device with one hand. In a case that it is detected that the user performs a first one-hand touch operation in the floating window interface, a first target interface is controlled to respond to the first one-hand touch operation, the first one-hand touch operation being an operation performed by the hand with which the user holds the electronic display device. In addition, a second target interface is controlled to change synchronously according to a change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the two. With this method, because the floating window interface is a size-reduced version of the display interface, a one-hand touch operation performed by the user in the floating window interface achieves a one-hand touch operation on the display interface; and because the floating window interface is smaller than the display interface, it is more convenient for the user to perform one-hand touch operations in the floating window interface than in the display interface. The method of the present disclosure therefore solves the problem in the related art that touch operations are inconvenient to perform with one hand because the screen of a large-screen electronic display device is too large, for example, the problem that one-hand touch operations are inconvenient on the screen of a large-screen electronic display device.
It should be understood that the foregoing general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
FIG. 1 is a flowchart of a display control method for an electronic display device according to an exemplary embodiment.
FIG. 2 is a block diagram of a display control apparatus for an electronic display device according to an exemplary embodiment.
FIG. 3 is a block diagram of an electronic display device according to an exemplary embodiment.
Detailed Description
Exemplary embodiments will be described in detail herein, examples of which are illustrated in the accompanying drawings. When the following description refers to the drawings, unless otherwise indicated, the same numerals in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
It should be noted that all actions of acquiring signals, information, instructions or data in this application are performed on the premise of complying with the corresponding data protection laws, regulations and policies of the country where they take place, and with authorization given by the owner of the corresponding device.
FIG. 1 is a flowchart of a display control method for an electronic display device according to an exemplary embodiment. As shown in FIG. 1, the display control method for the electronic display device may include the following steps.
In step S11, in response to detecting an instruction to start a one-handed operation mode, a floating window interface is generated according to a display interface in a current display state of the electronic display device, the floating window interface being a size-reduced version of the display interface.
The one-handed operation mode may refer to a mode in which touch operations are performed with one hand, a small-screen operation mode, a virtual operation interface mode, and the like. The floating window interface is a size-reduced version of the display interface, and the floating window interface and the display interface have the same display content. It should be noted that, in the present disclosure, the floating window interface does not belong to the display content of the display interface.
In some implementations, the display interface may be acquired and its length and width scaled down by the same factor to obtain the size-reduced floating window interface. In other implementations, the size of the floating window interface may be a default value, such as 3.5 inches, 4 inches, or 4.5 inches; the present disclosure does not specifically limit the size of the floating window interface.
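For illustration only, the following Python sketch computes a proportionally scaled floating window size from the size of the full display interface; the scale factor and the minimum dimensions are assumptions introduced for the example and are not values specified by the present disclosure.

def floating_window_size(display_w, display_h, scale=0.4, min_w=320, min_h=180):
    """Scale the display interface down by the same factor on both axes.

    scale, min_w and min_h are illustrative defaults, not values mandated
    by the disclosure; they only keep the example concrete.
    """
    w = max(int(display_w * scale), min_w)
    h = max(int(display_h * scale), min_h)
    # Preserve the aspect ratio of the display interface.
    aspect = display_w / display_h
    if w / h > aspect:
        w = int(h * aspect)
    else:
        h = int(w / aspect)
    return w, h

# Example: a 2560 x 1600 tablet screen yields a 1024 x 640 floating window.
print(floating_window_size(2560, 1600))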
In response to detecting the instruction to start the one-handed operation mode, the floating window interface is generated according to the display interface in the current display state of the electronic display device. The instruction to start the one-handed operation mode may be an action such as the user tapping the screen in a preset manner, the user sliding on the screen along a preset trajectory, the user pressing a preset button, the user inputting a preset voice command, or the user holding the electronic display device with a preset gesture.
In step S12, the floating window interface is displayed on an upper layer of the display interface within a target display area on the screen of the electronic display device, where the target display area is determined based on a gesture with which the user holds the electronic display device with one hand.
It should be noted that, because the electronic display device can display a superposition of multiple interfaces, in some implementations the floating window interface may be displayed on the upper layer of the display interface within the target display area on the screen of the electronic display device. The transparency of the floating window interface can be freely set by the user. By adjusting the transparency of the floating window interface, the degree to which the floating window interface occludes the display interface can be reduced.
The target display area is determined based on the gesture with which the user holds the electronic display device with one hand, and the target display area is an area other than the blind area that the user cannot reach when performing one-hand touch operations.
In some implementations, the display interface in the current display state of the electronic display device may refer to the home screen interface of the electronic display device, or may refer to a superposition of multiple interfaces other than the floating window interface of the present disclosure.
In step S13, in a case that it is detected that the user performs a first one-hand touch operation in the floating window interface, a first target interface is controlled to respond to the first one-hand touch operation, the first one-hand touch operation being an operation performed by the hand with which the user holds the electronic display device.
For example, the first one-hand touch operation may be an operation performed by the hand with which the user holds the electronic display device; specifically, it may be an operation of tapping a function icon (such as a button control) of an application program in the floating window interface with a finger (or a finger wearing an auxiliary tool such as a touch finger sleeve). The first one-hand touch operation may also be an operation in which the user zooms the floating window interface with a finger, an operation in which the user slides the floating window interface with a finger (such as a page-turning operation), an operation in which the user inputs text in the floating window interface, an operation in which the user draws in the floating window interface, or an operation in which the user plays video or audio in the floating window interface.
It should be noted that the first one-hand touch operation performed by the user in the floating window interface does not include the action of dragging the floating window interface to adjust its display position on the screen of the electronic display device.
In step S14, a second target interface is controlled to change synchronously according to a change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
When the first target interface changes in response to the first one-hand touch operation, the second target interface is controlled to change synchronously according to the change of the first target interface. That is, in the present disclosure, the first target interface and the second target interface map onto each other. The display content of the first target interface remains consistent with that of the second target interface, while the sizes of the displayed content may differ.
For example, the first target interface is the display interface and the second target interface is the floating window interface. When the display interface changes in response to the first one-hand touch operation, the floating window interface is controlled to change synchronously. In this case, interfaces of application programs are configured on the display interface, and the corresponding interface can be called according to the first one-hand touch operation. Calling the interface triggers the background application program to respond, so that the display interface changes accordingly.
As another example, the first target interface is the floating window interface and the second target interface is the display interface. When the floating window interface changes in response to the first one-hand touch operation, the display interface is controlled to change synchronously. In this case, interfaces of application programs are configured on the floating window interface, and the corresponding interface can be called according to the first one-hand touch operation. Calling the interface triggers the background application program to respond, so that the floating window interface changes accordingly.
Optionally, the method further includes:
in a case that it is detected that the user performs a second one-hand touch operation in the display interface, controlling the first target interface to respond to the second one-hand touch operation.
The second one-hand touch operation may be performed by a hand other than the hand with which the user holds the electronic display device with one hand. For example, if the user holds the electronic display device with the right hand, the second one-hand touch operation refers to an operation performed by the left hand.
In some implementations, the second one-hand touch operation may also be performed by the hand with which the user holds the electronic display device with one hand.
For example, the second one-hand touch operation may be an operation in which the user (with the left hand) taps a function icon (such as a button control) of an application program in the display interface with a finger, a stylus, or a mouse. The second one-hand touch operation may also be an operation in which the user zooms the display interface with a finger, an operation in which the user slides (for example, turns pages of) the display interface with a finger, a stylus, or a mouse, an operation in which the user inputs text in the display interface, an operation in which the user draws in the display interface, or an operation in which the user plays video or audio in the display interface.
In some implementations, if the user performs a touch operation in the floating window interface, the floating window interface is determined as the first target interface and the display interface is determined as the second target interface. In this way, the display interface is a mapping interface (projection interface) of the floating window interface.
In other implementations, if the user performs a touch operation in the display interface, the display interface is determined as the first target interface and the floating window interface is determined as the second target interface. In this way, the floating window interface is a mapping interface (projection interface) of the display interface.
In still other implementations, to generate the floating window interface more simply and quickly, the display interface may by default be the first target interface and the floating window interface may by default be the second target interface. In this way, the floating window interface is a mapping interface (projection interface) of the display interface and has the function of displaying the projection result.
With this method, in response to detecting the instruction to start the one-handed operation mode, the electronic display device generates the floating window interface according to the display interface in the current display state of the electronic display device, the floating window interface being a size-reduced version of the display interface. The floating window interface is displayed on the upper layer of the display interface within the target display area on the screen of the electronic display device, where the target display area is determined based on the gesture with which the user holds the electronic display device with one hand. In a case that it is detected that the user performs the first one-hand touch operation in the floating window interface, the first target interface is controlled to respond to the first one-hand touch operation, the first one-hand touch operation being an operation performed by the hand with which the user holds the electronic display device. In addition, the second target interface is controlled to change synchronously according to the change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the two. Because the floating window interface is a size-reduced version of the display interface, a one-hand touch operation performed by the user in the floating window interface achieves a one-hand touch operation on the display interface; and because the floating window interface is smaller than the display interface, it is more convenient for the user to perform one-hand touch operations in the floating window interface than in the display interface. The method of the present disclosure therefore solves the problem in the related art that one-hand touch operations are inconvenient because the screen of a large-screen electronic display device is too large, for example, the problem that one-hand touch operations are inconvenient on the screen of a large-screen electronic display device.
Moreover, because the user can perform touch operations not only in the floating window interface but also in the display interface, this approach of the present disclosure not only makes it convenient for the user to perform one-hand touch operations in the floating window interface, but also makes it convenient for the user to perform touch operations in the display interface with the other hand. For example, in a teaching scenario, a teacher writes text in the floating window interface with the right hand and, with the left hand, slides leftward in the display interface so that the image in the display interface moves to the left, which makes it convenient to write text continuously with the right hand and prevents the written text from overlapping.
Optionally, the first target interface is configured with an interface of an application program, and controlling the first target interface to respond to the first one-hand touch operation includes:
determining a target application program according to the first one-hand touch operation; and controlling the first target interface to respond to the first one-hand touch operation by calling an interface of the target application program.
For example, assuming that the first one-hand touch operation is an operation in which the user taps the icon of application program A in the floating window interface, the target application program is determined to be the program a that starts application program A according to the operation position and the operation characteristics (such as tapping or sliding) of the first one-hand touch operation. The first target interface is controlled to respond to the first one-hand touch operation by calling the interface of the program a that starts application program A, so that application program A is started; and by starting application program A, the first target interface changes accordingly, for example, the first target interface changes to the main interface displayed after entering application program A.
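A minimal Python sketch of this dispatch step is given below; the hit-test data structure and the launch() callback are assumptions made for the example, not an interface defined by the disclosure.

from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class Icon:
    app_name: str                        # e.g. "A"
    bounds: Tuple[int, int, int, int]    # (x, y, w, h) in floating-window coordinates
    launch: Callable[[], None]           # interface that starts the application program

def dispatch_tap(icons: List[Icon], tap_x: int, tap_y: int) -> Optional[str]:
    """Determine the target application from the tap position and call its interface."""
    for icon in icons:
        x, y, w, h = icon.bounds
        if x <= tap_x < x + w and y <= tap_y < y + h:
            icon.launch()                # triggers the background application to respond
            return icon.app_name
    return None                          # the tap did not hit any icon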
Optionally, if the first target interface is the display interface and the second target interface is the floating window interface, controlling the first target interface to respond to the first one-hand touch operation in a case that it is detected that the user performs the first one-hand touch operation in the floating window interface includes:
mapping the first one-hand touch operation onto the display interface; and controlling the display interface to respond to a mapping result of the first one-hand touch operation.
For example, if the first target interface is the display interface and the second target interface is the floating window interface, then in a case that it is detected that the user performs the first one-hand touch operation in the floating window interface, the first one-hand touch operation is mapped onto the display interface, and the display interface responds to the mapping result of the first one-hand touch operation. For instance, assuming that the first one-hand touch operation is the user tapping the icon of application program B in the floating window interface, the first one-hand touch operation is mapped onto the display interface, and the resulting mapping result is the user tapping the icon of application program B in the display interface; the display interface responds to the operation of tapping the icon of application program B in the display interface and thus changes.
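One way to realise such a mapping is a simple linear transformation from floating-window coordinates to display-interface coordinates, sketched below in Python under the assumption that the floating window is an axis-aligned, proportionally scaled copy of the display interface; the function and parameter names are illustrative only.

def map_touch_to_display(tx, ty, win_rect, display_size):
    """Map a touch point from floating-window coordinates to display-interface coordinates.

    win_rect:     (left, top, width, height) of the floating window on the screen
    display_size: (width, height) of the full display interface
    (tx, ty):     touch point in screen coordinates, assumed to lie inside win_rect
    """
    left, top, win_w, win_h = win_rect
    disp_w, disp_h = display_size
    # Normalise the touch point inside the floating window, then rescale.
    nx = (tx - left) / win_w
    ny = (ty - top) / win_h
    return nx * disp_w, ny * disp_h

# Example: a tap at (300, 1500) inside a 480x300 window anchored at (200, 1400)
# on a 2560x1600 display maps to roughly (533.3, 533.3).
print(map_touch_to_display(300, 1500, (200, 1400, 480, 300), (2560, 1600)))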
Optionally, if the first target interface is the floating window interface and the second target interface is the display interface, then in a case that it is detected that the user performs the second one-hand touch operation in the display interface, the second one-hand touch operation is mapped onto the floating window interface, and the floating window interface responds to the mapping result of the second one-hand touch operation.
In some implementations, when the electronic display device is a tablet computer, a touch-enabled notebook computer, or a foldable-screen mobile phone, one way of determining the target display area of the floating window interface on the screen of the electronic display device may be as follows:
touch data of the user on the electronic display device is acquired; the gesture with which the user holds the electronic display device with one hand is determined according to the touch data; according to this gesture, an operable area in which the hand holding the electronic display device can perform one-hand touch operations on the screen of the electronic display device is determined; and the target display area is determined according to the operable area for one-hand touch operations. For example, the operable area for the user's one-hand touch operations is used as the display area; for another example, a part of the operable area for the user's one-hand touch operations is used as the display area.
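A much-simplified Python sketch of this placement step follows; the reach radius and the rule of anchoring the window to the bottom corner on the holding side are assumptions introduced for the example and are not limitations of the disclosure.

def target_display_area(grip, screen_w, screen_h, win_w, win_h, reach=900, margin=40):
    """Place the floating window inside the area reachable by the holding hand.

    grip is 'left' or 'right'; reach approximates the thumb's reach in pixels.
    Returns (left, top, width, height) of the target display area.
    """
    win_w = min(win_w, reach - margin)
    win_h = min(win_h, reach - margin)
    top = screen_h - win_h - margin
    if grip == "right":
        left = screen_w - win_w - margin   # bottom-right corner for a right-hand grip
    else:
        left = margin                      # bottom-left corner for a left-hand grip
    return left, top, win_w, win_h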
The touch data may include a touch image. One implementation of determining, according to the touch data, the gesture with which the user holds the electronic display device with one hand may be as follows:
the touch image is input into a trained gesture recognition model, and a recognition result output by the gesture recognition model and representing the gesture with which the user holds the electronic display device with one hand is obtained.
Assume that an m*n capacitance lattice is distributed on the screen of the electronic display device. Because metal objects and the human body close to the screen change the capacitance values at the corresponding positions, when the user is in contact with the screen of the electronic display device, the screen corresponds to an m*n*1 touch image (capacitance image).
Because the user may come into contact with the screen of the electronic display device (for example, at the screen edges) while holding the electronic display device, an m*n*1 touch image can be obtained while the user holds the electronic display device. By inputting the m*n*1 touch image into the trained gesture recognition model, the recognition result output by the gesture recognition model and representing the gesture with which the user holds the electronic display device can be obtained.
The gesture recognition model may be trained by determining touch image samples and gesture labels of the touch image samples, and training the gesture recognition model according to the touch image samples and their gesture labels to obtain the trained gesture recognition model.
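The following PyTorch sketch illustrates such a supervised training loop on (touch image, gesture label) pairs; the synthetic data, batch size, optimizer, number of classes and the small placeholder network are assumptions for illustration only (a more detailed architecture is described in the next example).

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Illustrative stand-in data: 16x40 capacitance images with labels
# 0 = right-hand grip, 1 = left-hand grip, 2 = two-hand grip.
images = torch.randn(256, 1, 16, 40)
labels = torch.randint(0, 3, (256,))
loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

model = nn.Sequential(                 # placeholder for the gesture recognition model
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 3),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)  # compare predicted grip gesture with the label
        loss.backward()
        optimizer.step()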
In one example, an m*n*1 (assume 16*40*1) touch image is input into a neural network (such as ResNet, ResNeXt, MobileNet, or ShuffleNet), and x1 feature values are computed. Taking MobileNet as an example: the convolution layers (conv) of MobileNet process the 16*40*1 touch image to obtain a 2*5*x1 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 2*5*x1 result to obtain 1*1*x1 feature values (that is, x1 feature values), and the 1*1*x1 feature values can represent the 16*40*1 touch image. The 1*1*x1 feature values are input into the fully connected + softmax layer of MobileNet to obtain a probability vector. The probability vector represents the gesture with which the user holds the electronic display device, such as holding the electronic display device with the right hand, holding it with the left hand, or a non-one-hand gesture, that is, holding the electronic display device with both hands.
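A hedged PyTorch sketch of this single-branch classifier is given below; the backbone is a small stand-in rather than a full MobileNet, and the channel count x1 = 64 and the three grip classes are assumptions made only to keep the shapes concrete (a 16x40x1 input, a 2x5 feature map with x1 channels, a 1x1xx1 vector, then softmax probabilities).

import torch
from torch import nn

class GripClassifier(nn.Module):
    """Single-branch grip-gesture classifier over a 16x40x1 capacitance image."""
    def __init__(self, x1: int = 64, num_classes: int = 3):
        super().__init__()
        # Stand-in for the MobileNet convolutional stack: three stride-2 stages
        # reduce the 16x40 input to a 2x5 feature map with x1 channels.
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # -> 8x20
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # -> 4x10
            nn.Conv2d(32, x1, 3, stride=2, padding=1), nn.ReLU(),  # -> 2x5
        )
        self.pool = nn.AdaptiveAvgPool2d(1)       # 2x5 feature map -> 1x1 per channel
        self.fc = nn.Linear(x1, num_classes)      # fully connected + softmax head

    def forward(self, x):
        f = self.pool(self.conv(x)).flatten(1)    # x1 feature values per image
        return torch.softmax(self.fc(f), dim=1)   # probability vector over grip gestures

probs = GripClassifier()(torch.randn(1, 1, 16, 40))  # e.g. [[p_right, p_left, p_both]]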
In another example, an m*n*1 (assume 16*40*1) touch image is segmented to obtain a first segmented image and a second segmented image, where the first segmented image is an image consisting of the pixels along the four edges of the touch image. For example, the number of pixels in the first segmented image is (16+40)*2*4 = (112*4), and the number of pixels in the second segmented image is (16-2*4)*(40-2*4) = (8*32), where 16 represents the width of the touch image, 40 represents its length, 2 represents that the touch image has two long edges and two short edges, and 4 represents the segmentation width of the touch image (that is, the first edge of the touch image is moved inward by 4 pixels to segment a first sub-image of width 4; the second edge is moved inward by 4 pixels to segment a second sub-image of width 4; the third edge is moved inward by 4 pixels to segment a third sub-image of width 4; the fourth edge is moved inward by 4 pixels to segment a fourth sub-image of width 4; and the first segmented image includes the first sub-image, the second sub-image, the third sub-image, and the fourth sub-image).
The first segmented image is input into a neural network (such as ResNet, ResNeXt, MobileNet, or ShuffleNet), and Y1 feature values are computed. Taking MobileNet as an example: the convolution layers (conv) of MobileNet process the 112*4*1 first segmented image to obtain a 28*1*Y1 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 28*1*Y1 result to obtain 1*1*Y1 feature values (that is, Y1 feature values), and the 1*1*Y1 feature values can represent the 112*4*1 first segmented image.
In addition, the second segmented image is input into a neural network (such as ResNet, ResNeXt, MobileNet, or ShuffleNet), and Y2 feature values are computed. Taking MobileNet as an example: the convolution layers (conv) of MobileNet process the 8*32*1 second segmented image to obtain a 2*5*Y2 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 2*5*Y2 result to obtain 1*1*Y2 feature values (that is, Y2 feature values), and the 1*1*Y2 feature values can represent the 8*32*1 second segmented image.
The 1*1*Y1 feature values and the 1*1*Y2 feature values are concatenated to obtain a 1*1*(Y1+Y2) concatenated vector, and the 1*1*(Y1+Y2) concatenated vector is input into the fully connected + softmax layer of MobileNet to obtain a probability vector. The probability vector represents the gesture with which the user holds the electronic display device, such as holding the electronic display device with the right hand, holding it with the left hand, or holding it with both hands.
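A hedged PyTorch sketch of this two-branch variant follows; stacking the four border strips into a 112x4 image is one assumed way to build the first segmented image, the two backbones are small stand-ins, and the channel counts Y1 = Y2 = 32 are assumed purely for illustration.

import torch
from torch import nn

def split_touch_image(img: torch.Tensor, border: int = 4):
    """Split a (B, 1, 16, 40) touch image batch into edge strips and a centre crop.

    The four border strips of width `border` are concatenated into a (B, 1, 112, 4)
    first segmented image; the remaining interior forms the (B, 1, 8, 32) second one.
    """
    top    = img[:, :, :border, :]                       # (B, 1, 4, 40)
    bottom = img[:, :, -border:, :]                      # (B, 1, 4, 40)
    left   = img[:, :, :, :border].transpose(2, 3)       # (B, 1, 4, 16)
    right  = img[:, :, :, -border:].transpose(2, 3)      # (B, 1, 4, 16)
    edge = torch.cat([top, bottom, left, right], dim=3)  # (B, 1, 4, 112)
    edge = edge.transpose(2, 3)                          # (B, 1, 112, 4)
    centre = img[:, :, border:-border, border:-border]   # (B, 1, 8, 32)
    return edge, centre

class TwoBranchGripClassifier(nn.Module):
    def __init__(self, y1: int = 32, y2: int = 32, num_classes: int = 3):
        super().__init__()
        # Stand-ins for the two convolutional backbones described above.
        self.edge_branch = nn.Sequential(
            nn.Conv2d(1, y1, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())       # -> (B, Y1)
        self.centre_branch = nn.Sequential(
            nn.Conv2d(1, y2, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())       # -> (B, Y2)
        self.head = nn.Linear(y1 + y2, num_classes)      # fully connected + softmax head

    def forward(self, img):
        edge, centre = split_touch_image(img)
        feats = torch.cat([self.edge_branch(edge), self.centre_branch(centre)], dim=1)
        return torch.softmax(self.head(feats), dim=1)    # grip-gesture probabilities

probs = TwoBranchGripClassifier()(torch.randn(2, 1, 16, 40))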
In yet another example, an m*n*1 (assume 16*40*1) touch image is segmented to obtain a first segmented image, where the first segmented image is an image consisting of the pixels along the four edges of the touch image. For example, the number of pixels in the first segmented image is (16+40)*2*4 = (112*4), where 16 represents the width of the touch image, 40 represents its length, 2 represents that the touch image has two long edges and two short edges, and 4 represents the segmentation width of the touch image (that is, the first edge of the touch image is moved inward by 4 pixels to segment a first sub-image of width 4; the second edge is moved inward by 4 pixels to segment a second sub-image of width 4; the third edge is moved inward by 4 pixels to segment a third sub-image of width 4; the fourth edge is moved inward by 4 pixels to segment a fourth sub-image of width 4; and the first segmented image includes the first sub-image, the second sub-image, the third sub-image, and the fourth sub-image).
The first segmented image is input into a neural network (such as ResNet, ResNeXt, MobileNet, or ShuffleNet), and Y1 feature values are computed. Taking MobileNet as an example: the convolution layers (conv) of MobileNet process the 112*4*1 first segmented image to obtain a 28*1*Y1 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 28*1*Y1 result to obtain 1*1*Y1 feature values (that is, Y1 feature values), and the 1*1*Y1 feature values can represent the 112*4*1 first segmented image.
In addition, the m*n*1 (assume 16*40*1) touch image itself is input into a neural network (such as ResNet, ResNeXt, MobileNet, or ShuffleNet), and Y2 feature values are computed. Taking MobileNet as an example: the convolution layers (conv) of MobileNet process the 16*40*1 touch image to obtain a 2*5*Y2 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 2*5*Y2 result to obtain 1*1*Y2 feature values (that is, Y2 feature values), and the 1*1*Y2 feature values can represent the 16*40*1 touch image.
The 1*1*Y1 feature values and the 1*1*Y2 feature values are concatenated to obtain a 1*1*(Y1+Y2) concatenated vector, and the 1*1*(Y1+Y2) concatenated vector is input into the fully connected + softmax layer of MobileNet to obtain a probability vector. The probability vector represents the gesture with which the user holds the electronic display device, such as holding the electronic display device with the right hand, holding it with the left hand, or holding it with both hands. This variant is analogous to the two-branch sketch above, with the centre crop replaced by the full touch image.
In another example, an m*n*1 (assume 18*40*1) touch image is input into a target detection model (for example, a model based on ResNet, ResNeXt, MobileNet, or ShuffleNet), and x2 feature values are computed. A first fully connected + softmax layer computes the position (x, y, w, h) of each anchor box and the class probability values corresponding to each anchor box. The classes include classes such as accidental finger touch, accidental touch by the web between the thumb and index finger, left-hand press, and right-hand press. A second fully connected layer + softmax computes a total probability vector according to the position (x, y, w, h) of each anchor box and the class probability values corresponding to each anchor box. The total probability vector represents the gesture with which the user holds the electronic display device, such as holding the electronic display device with the right hand, holding it with the left hand, or holding it with both hands.
An anchor box refers to one of a plurality of prior boxes with different aspect ratios that are predefined by the algorithm and centered on an anchor point in a target detection algorithm.
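The PyTorch sketch below illustrates how per-anchor detections could be aggregated into a single grip-gesture probability vector by a second fully connected layer; the anchor count, feature dimension and class lists are assumptions for the example and are not values given by the disclosure.

import torch
from torch import nn

NUM_ANCHORS = 50      # assumed number of anchor boxes
ANCHOR_CLASSES = 4    # finger mis-touch, thumb-web mis-touch, left-hand press, right-hand press
GRIP_CLASSES = 3      # right-hand grip, left-hand grip, two-hand grip

class AnchorGripHead(nn.Module):
    def __init__(self, feat_dim: int = 128):
        super().__init__()
        # First head: per-anchor box position (x, y, w, h) and per-anchor class probabilities.
        self.box_head = nn.Linear(feat_dim, NUM_ANCHORS * 4)
        self.cls_head = nn.Linear(feat_dim, NUM_ANCHORS * ANCHOR_CLASSES)
        # Second head: total grip-gesture probability computed from all anchor outputs.
        self.grip_head = nn.Linear(NUM_ANCHORS * (4 + ANCHOR_CLASSES), GRIP_CLASSES)

    def forward(self, feats):                        # feats: (B, feat_dim) backbone features
        boxes = self.box_head(feats).view(-1, NUM_ANCHORS, 4)
        cls = torch.softmax(
            self.cls_head(feats).view(-1, NUM_ANCHORS, ANCHOR_CLASSES), dim=2)
        per_anchor = torch.cat([boxes, cls], dim=2).flatten(1)
        return torch.softmax(self.grip_head(per_anchor), dim=1)  # total probability vector

probs = AnchorGripHead()(torch.randn(2, 128))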
When the electronic display device is a large-screen electronic display device such as a touch-enabled television or an electronic blackboard used for teaching, one implementation of determining the display area of the floating window interface on the screen of the electronic display device may be as follows:
the position of the user relative to the electronic display device is acquired, and an operable area convenient for the user to operate is determined based on the position of the user relative to the electronic display device, or the gesture with which the user supports the electronic display device with one hand is determined so as to determine the operable area convenient for the user to operate, and the display area of the floating window interface is determined according to the operable area. For example, assuming that the user is detected standing near the lower right corner of the electronic display device, the lower right corner of the screen of the electronic display device may be determined as the operable area convenient for the user to operate.
Optionally, the electronic display device in the present disclosure may include an electromagnetic wave energy absorption ratio sensor, that is, a specific absorption rate (SAR) sensor, and the touch data may include sensor data collected by the SAR sensor. One implementation of determining, according to the touch data, the gesture with which the user holds the electronic display device may be as follows:
a contact area between the user and the electronic display device is determined according to the sensor data, and the gesture with which the user holds the electronic display device is determined according to the position distribution of the contact area on the electronic display device. For example, assuming that multiple contact areas are located on the right rear side and the right front side of the electronic display device, it can be determined that the gesture with which the user holds the electronic display device is holding the device with the left hand.
The SAR sensor can determine the contact state between the user and the electronic display device (such as a mobile phone) by detecting capacitance.
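The following Python sketch shows one assumed way to turn such SAR-sensor contact readings into a grip estimate; the sensor naming, the threshold and the left/right rule are illustrative only and follow the example in the text above.

def grip_from_sar(readings, threshold=0.5):
    """Infer the holding hand from SAR-sensor capacitance readings.

    readings: dict mapping a sensor location (e.g. 'right_front', 'right_rear',
              'left_front', 'left_rear') to a normalised capacitance value.
    Contact concentrated on the right-hand side of the device is taken to indicate
    a left-hand grip, and vice versa.
    """
    contacts = {loc for loc, value in readings.items() if value >= threshold}
    right = {loc for loc in contacts if loc.startswith("right")}
    left = {loc for loc in contacts if loc.startswith("left")}
    if right and left:
        return "two_hand"
    if right:
        return "left_hand"
    if left:
        return "right_hand"
    return "unknown"

print(grip_from_sar({"right_front": 0.8, "right_rear": 0.7, "left_front": 0.1}))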
In some implementations, the display control method for the electronic display device may further include:
if the first one-hand touch operation performed by the user in the floating window interface is not detected within a preset duration, controlling the floating window interface to enter a hidden state. For example, the floating window interface is hidden in an area at the edge of the screen that is not visible to the user, and by dragging the floating window interface hidden at the edge of the screen, the user can make the floating window interface reappear in the user's visible area. For another example, the floating window interface is changed into a floating ball (floating point) of smaller area, and by tapping the floating ball (floating point), the user can change the floating ball (floating point) back into the floating window interface.
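A minimal state-machine sketch of this timeout behaviour is given below in Python; the state names and the preset duration are assumptions made for the example.

import time

class FloatingWindowState:
    """Hide the floating window when no touch is detected for `timeout` seconds."""
    def __init__(self, timeout: float = 10.0):
        self.timeout = timeout
        self.state = "visible"           # 'visible' or 'hidden'
        self.last_touch = time.monotonic()

    def on_touch(self):
        self.last_touch = time.monotonic()
        self.state = "visible"           # e.g. tapping the floating ball restores the window

    def tick(self):
        if self.state == "visible" and time.monotonic() - self.last_touch > self.timeout:
            self.state = "hidden"        # e.g. collapse into a floating ball at the screen edge
        return self.state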
In some implementations, the display control method for the electronic display device may further include:
if it is recognized that the gesture with which the user holds the electronic display device is a two-hand grip, closing the floating window interface or controlling the floating window interface to enter a hidden state; and
if it is detected that the user changes the gesture of holding the electronic display device, adjusting the display position of the floating window interface according to the changed gesture.
In some implementations, another way of determining the display area of the floating window interface on the screen of the electronic display device is to determine a fixed target display area based on the user's selection or a preset setting.
FIG. 2 is a block diagram of a display control apparatus for an electronic display device according to an exemplary embodiment. Referring to FIG. 2, the apparatus 200 includes:
a generation module 210 configured to, in response to detecting an instruction to start a one-handed operation mode, generate a floating window interface according to a display interface in a current display state of the electronic display device, the floating window interface being a size-reduced version of the display interface;
a display module 220 configured to display the floating window interface on an upper layer of the display interface within a target display area on the screen of the electronic display device, wherein the target display area is determined based on a gesture with which the user holds the electronic display device with one hand;
a response module 230 configured to, in a case that it is detected that the user performs a first one-hand touch operation in the floating window interface, control a first target interface to respond to the first one-hand touch operation, the first one-hand touch operation being an operation performed by the hand with which the user holds the electronic display device; and
a synchronization module 240 configured to control a second target interface to change synchronously according to a change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
With the apparatus 200, in response to detecting an instruction to start the one-handed operation mode, the electronic display device generates a floating window interface according to the display interface in its current display state, the floating window interface being a size-reduced version of the display interface. The floating window interface is displayed on the upper layer of the display interface within the target display area on the screen of the electronic display device, where the target display area is determined based on the gesture with which the user holds the electronic display device with one hand. In a case that it is detected that the user performs the first one-hand touch operation in the floating window interface, the first target interface is controlled to respond to the first one-hand touch operation, the first one-hand touch operation being an operation performed by the hand with which the user holds the electronic display device. In addition, the second target interface is controlled to change synchronously according to the change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the two. Because the floating window interface is a size-reduced version of the display interface, a one-hand touch operation performed by the user in the floating window interface achieves a one-hand touch operation on the display interface; and because the floating window interface is smaller than the display interface, it is more convenient for the user to perform one-hand touch operations in the floating window interface than in the display interface. The apparatus of the present disclosure therefore solves the problem in the related art that one-hand touch operations are inconvenient because the screen of a large-screen electronic display device is too large, for example, the problem that one-hand touch operations are inconvenient on the screen of a large-screen electronic display device.
可选地,所述第一目标界面配置有应用程序的接口,所述响应模块230包括:Optionally, the first target interface is configured with an application program interface, and the response module 230 includes:
第一确定子模块,被配置为用于根据所述第一单手触控操作,确定目标应用程序;A first determination sub-module configured to determine a target application based on the first one-hand touch operation;
调用模块,被配置为用于通过调用所述目标应用程序的接口,以控制所述第一目标界面响应所述第一单手触控操作。The calling module is configured to control the first target interface to respond to the first one-hand touch operation by calling the interface of the target application program.
可选地,所述装置200还包括:Optionally, the device 200 also includes:
控制模块,被配置为用于在检测到用户在所述显示界面内进行第二单手触控操作的情况下,控制所述第一目标界面响应所述第二单手触控操作。The control module is configured to control the first target interface to respond to the second one-hand touch operation when it is detected that the user performs a second one-hand touch operation in the display interface.
Optionally, the response module 230 is configured to:
if the first target interface is the display interface and the second target interface is the floating window interface, map the first one-handed touch operation onto the display interface, and control the display interface to respond to the result of mapping the first one-handed touch operation.
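The mapping described above is essentially a change of coordinates from the size-reduced floating window back to the full display interface. The sketch below shows one straightforward way to perform it, reusing the Rect type from the earlier sketch; the linear scaling it assumes is an illustration, not a statement of how the disclosure performs the mapping.

```python
def map_touch_to_display(point, window: Rect, display: Rect):
    """Map a touch point inside the floating window onto the display interface."""
    x, y = point
    # Normalized position of the touch within the floating window.
    u = (x - window.left) / window.width
    v = (y - window.top) / window.height
    # The same relative position on the full-size display interface.
    return (display.left + u * display.width,
            display.top + v * display.height)

# Example: the centre of a 540x1200 window maps to the centre of a 1080x2400 display.
window = Rect(540, 1200, 540, 1200)
display = Rect(0, 0, 1080, 2400)
print(map_touch_to_display((810, 1800), window, display))  # -> (540.0, 1200.0)
```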
Optionally, the display module 220 includes:
an acquisition sub-module configured to acquire touch data of the user on the electronic display device;
a second determination sub-module configured to determine, according to the touch data, the gesture with which the user holds the electronic display device with one hand;
a third determination sub-module configured to determine, according to the gesture with which the user holds the electronic display device with one hand, an operation area on the screen of the electronic display device in which the hand holding the electronic display device performs one-handed touch operations (an illustrative sketch follows this list);
a fourth determination sub-module configured to determine the target display area according to the operation area;
a display sub-module configured to display the floating window interface on top of the display interface within the target display area.
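As a sketch of how the third and fourth determination sub-modules might derive the target display area, the code below approximates the thumb's operation area as a quarter-circle around the bottom corner on the holding side and returns the largest inscribed square inside that reach, reusing the Rect type from the earlier sketch. The reach radius and the left/right handedness encoding are assumptions of this sketch, not details from the disclosure.

```python
def target_display_area(screen_w: float, screen_h: float,
                        holding_hand: str, thumb_reach: float = 900.0) -> Rect:
    """Approximate the reachable operation area and derive a target display area.

    `holding_hand` is "left" or "right"; `thumb_reach` is an assumed reach radius
    in pixels measured from the bottom corner on the holding side.
    """
    radius = max(0.0, min(thumb_reach, screen_w, screen_h))
    # Use the inscribed square of the quarter-circle of reach so every point of
    # the target area stays within the assumed thumb reach.
    side = radius / (2 ** 0.5)
    if holding_hand == "right":
        return Rect(screen_w - side, screen_h - side, side, side)
    return Rect(0.0, screen_h - side, side, side)

print(target_display_area(1080, 2400, "right"))
```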
Optionally, the touch data includes a touch image, and the second determination sub-module is configured to:
input the touch image into a trained gesture recognition model to obtain a recognition result output by the gesture recognition model that characterizes the gesture with which the user holds the electronic display device with one hand.
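The following sketch only illustrates how this inference step might be wired: a touch image (for example, a capacitance map sampled from the touch panel) is passed to a trained model whose output labels the one-handed holding gesture. The `GestureModel` stub below is a stand-in, not the model described in the disclosure; a real implementation would load actual trained weights.

```python
from typing import List, Protocol

TouchImage = List[List[float]]  # e.g. a capacitance map sampled from the touch panel

class GestureModel(Protocol):
    def predict(self, image: TouchImage) -> str: ...

class StubGestureModel:
    """Stand-in for a trained model: labels the grip by where the touch mass lies."""
    def predict(self, image: TouchImage) -> str:
        left = sum(v for row in image for v in row[: len(row) // 2])
        right = sum(v for row in image for v in row[len(row) // 2 :])
        return "left_hand_hold" if left >= right else "right_hand_hold"

def recognize_holding_gesture(image: TouchImage, model: GestureModel) -> str:
    """Feed the touch image to the gesture recognition model and return its result."""
    return model.predict(image)

# Example with a tiny 2x4 touch image whose mass sits on the right half.
print(recognize_holding_gesture([[0, 0, 0.8, 0.9], [0, 0.1, 0.7, 1.0]], StubGestureModel()))
```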
Optionally, the electronic display device includes an electromagnetic wave energy absorption ratio sensor, the touch data includes sensor data collected by the electromagnetic wave energy absorption ratio sensor, and the second determination sub-module is configured to:
determine a contact area between the user and the electronic display device according to the sensor data;
determine the gesture with which the user holds the electronic display device according to the position distribution of the contact area on the electronic display device.
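As a minimal sketch of the position-distribution step, the code below takes contact regions reported on the device (each tagged with the edge it touches) and decides the holding hand from which side carries more contact area. The edge labels and the simple majority rule are illustrative assumptions, not details from the disclosure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ContactRegion:
    edge: str    # "left", "right", "back", ... (labels assumed for this sketch)
    area: float  # contact area reported by the sensor, arbitrary units

def holding_gesture_from_contacts(contacts: List[ContactRegion]) -> str:
    """Infer the holding hand from the position distribution of contact regions.

    A right-handed grip typically places the palm along the right edge and the
    fingertips along the left edge, so more contact area on the right edge is
    taken to indicate a right-hand hold (a heuristic assumed for this sketch).
    """
    left = sum(c.area for c in contacts if c.edge == "left")
    right = sum(c.area for c in contacts if c.edge == "right")
    return "right_hand_hold" if right >= left else "left_hand_hold"

contacts = [ContactRegion("right", 6.0), ContactRegion("left", 1.5), ContactRegion("back", 8.0)]
print(holding_gesture_from_contacts(contacts))  # -> right_hand_hold
```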
With regard to the apparatus in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method and is not elaborated here.
The present disclosure also provides a computer-readable storage medium on which computer program instructions are stored; when the program instructions are executed by a processor, the steps of the display control method for an electronic display device provided by the present disclosure are implemented.
FIG. 3 is a block diagram of an electronic display device 800 according to an exemplary embodiment. For example, the electronic display device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to FIG. 3, the electronic display device 800 may include one or more of the following components: a processing component 802, a memory 804, a power supply component 806, a multimedia component 808, an audio component 810, an input/output interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls the overall operation of the electronic display device 800, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 802 may include one or more processors 820 to execute instructions to complete all or part of the steps of the display control method for an electronic display device described above. In addition, the processing component 802 may include one or more modules that facilitate interaction between the processing component 802 and the other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation of the electronic display device 800. Examples of such data include instructions for any application or method operating on the electronic display device 800, contact data, phonebook data, messages, pictures, videos, and so on. The memory 804 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc.
The power supply component 806 provides power to the various components of the electronic display device 800. The power supply component 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic display device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic display device 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the electronic display device 800 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC) that is configured to receive external audio signals when the electronic display device 800 is in an operating mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may be further stored in the memory 804 or sent via the communication component 816. In some embodiments, the audio component 810 also includes a speaker for outputting audio signals.
The input/output interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 814 includes one or more sensors for providing status assessments of various aspects of the electronic display device 800. For example, the sensor component 814 can detect the open/closed state of the electronic display device 800 and the relative positioning of components, such as the display and keypad of the electronic display device 800; the sensor component 814 can also detect a change in position of the electronic display device 800 or one of its components, the presence or absence of user contact with the electronic display device 800, the orientation or acceleration/deceleration of the electronic display device 800, and a change in temperature of the electronic display device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic display device 800 and other devices. The electronic display device 800 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic display device 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the display control method for an electronic display device described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions, such as the memory 804 including instructions, is also provided; the instructions are executable by the processor 820 of the electronic display device 800 to perform the display control method for an electronic display device described above. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In another exemplary embodiment, a computer program product is also provided. The computer program product includes a computer program executable by a programmable apparatus, and the computer program has code portions for performing the display control method for an electronic display device described above when executed by the programmable apparatus.
Other embodiments of the present disclosure will be readily apparent to those skilled in the art from consideration of the specification and practice of the disclosure. The present application is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

  1. A display control method for an electronic display device, characterized in that the method comprises:
    in response to detecting an instruction to start a one-handed operation mode, generating a floating window interface according to the display interface in the current display state of the electronic display device, the floating window interface being a size-reduced version of the display interface;
    displaying the floating window interface on top of the display interface within a target display area on the screen of the electronic display device, wherein the target display area is determined based on the gesture with which the user holds the electronic display device with one hand;
    when a first one-handed touch operation by the user is detected in the floating window interface, controlling a first target interface to respond to the first one-handed touch operation, the first one-handed touch operation being an operation performed by the hand with which the user holds the electronic display device; and
    controlling a second target interface to change synchronously according to the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
  2. The method according to claim 1, characterized in that the first target interface is configured with an interface of an application program, and the controlling a first target interface to respond to the first one-handed touch operation comprises:
    determining a target application according to the first one-handed touch operation;
    controlling the first target interface to respond to the first one-handed touch operation by calling the interface of the target application.
  3. The method according to claim 1, characterized in that the method further comprises:
    when a second one-handed touch operation by the user is detected in the display interface, controlling the first target interface to respond to the second one-handed touch operation.
  4. The method according to claim 1, characterized in that, if the first target interface is the display interface and the second target interface is the floating window interface, the controlling a first target interface to respond to the first one-handed touch operation when a first one-handed touch operation by the user is detected in the floating window interface comprises:
    mapping the first one-handed touch operation onto the display interface;
    controlling the display interface to respond to the result of mapping the first one-handed touch operation.
  5. The method according to any one of claims 1 to 4, characterized in that the displaying the floating window interface on top of the display interface within a target display area on the screen of the electronic display device comprises:
    acquiring touch data of the user on the electronic display device;
    determining, according to the touch data, the gesture with which the user holds the electronic display device with one hand;
    determining, according to the gesture with which the user holds the electronic display device with one hand, an operation area on the screen of the electronic display device in which the hand holding the electronic display device performs one-handed touch operations;
    determining the target display area according to the operation area;
    displaying the floating window interface on top of the display interface within the target display area.
  6. The method according to claim 5, characterized in that the touch data comprises a touch image, and the determining, according to the touch data, the gesture with which the user holds the electronic display device with one hand comprises:
    inputting the touch image into a trained gesture recognition model to obtain a recognition result output by the gesture recognition model that characterizes the gesture with which the user holds the electronic display device with one hand.
  7. The method according to claim 5, characterized in that the electronic display device comprises an electromagnetic wave energy absorption ratio sensor, the touch data comprises sensor data collected by the electromagnetic wave energy absorption ratio sensor, and the determining, according to the touch data, the gesture with which the user holds the electronic display device with one hand comprises:
    determining a contact area between the user and the electronic display device according to the sensor data;
    determining the gesture with which the user holds the electronic display device according to the position distribution of the contact area on the electronic display device.
  8. A display control apparatus for an electronic display device, characterized in that the apparatus comprises:
    a generation module configured to, in response to detecting an instruction to start a one-handed operation mode, generate a floating window interface according to the display interface in the current display state of the electronic display device, the floating window interface being a size-reduced version of the display interface;
    a display module configured to display the floating window interface on top of the display interface within a target display area on the screen of the electronic display device, wherein the target display area is determined based on the gesture with which the user holds the electronic display device with one hand;
    a response module configured to, when a first one-handed touch operation by the user is detected in the floating window interface, control a first target interface to respond to the first one-handed touch operation, the first one-handed touch operation being an operation performed by the hand with which the user holds the electronic display device;
    a synchronization module configured to control a second target interface to change synchronously according to the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
  9. A computer-readable storage medium on which a computer program is stored, characterized in that, when the program is executed by a processor, the steps of the method according to any one of claims 1 to 7 are implemented.
  10. An electronic display device, characterized by comprising:
    a memory on which a computer program is stored; and
    a processor configured to execute the computer program in the memory to implement the steps of the method according to any one of claims 1 to 7.