WO2022161432A1 - Procédé et appareil de commande d'affichage, dispositif électronique et support - Google Patents

Procédé et appareil de commande d'affichage, dispositif électronique et support

Info

Publication number
WO2022161432A1
WO2022161432A1 (PCT/CN2022/074264)
Authority
WO
WIPO (PCT)
Prior art keywords
input
interface
interfaces
wearable device
target
Prior art date
Application number
PCT/CN2022/074264
Other languages
English (en)
Chinese (zh)
Inventor
刘琨
陈喆
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2022161432A1 publication Critical patent/WO2022161432A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • the embodiments of the present application relate to the field of communication technologies, and in particular, to a display control method, apparatus, electronic device, and medium.
  • Augmented Reality (AR) glasses are a type of wearable device.
  • Through the AR glasses, the user can see an interface formed by superimposing virtual objects (for example, text information or image information) rendered by the AR glasses onto the real environment. If the user is not satisfied with the interface, the user can press a function button on the AR glasses to trigger the AR glasses to operate on the virtual objects in the interface and obtain a new interface.
  • However, when the user has other operational requirements for the interface displayed by the AR glasses, the operation functions of the AR glasses are relatively simple and cannot meet the user's needs, so the user may need another device (such as a remote control) to control the AR glasses and operate the interface they display, which makes operating the interface displayed by the AR glasses very inconvenient.
  • the purpose of the embodiments of the present application is to provide a display control method, device, electronic device, and medium, which can solve the problem of very inconvenient operation of an interface displayed by AR glasses.
  • an embodiment of the present application provides a display control method.
  • The method includes: displaying M interfaces on a virtual screen of a wearable device, where the M interfaces are interfaces sent by T target devices to the wearable device; receiving a first input of a user, where the first input is an input of a first hovering gesture; and in a case that the first hovering gesture matches a first preset gesture, performing, in response to the first input, target processing on a first interface among the M interfaces, where the target processing is used to adjust display parameters of the first interface on the virtual screen.
  • M and T are both positive integers.
  • an embodiment of the present application provides a display control device, where the display control device includes: a display module, a receiving module, and a processing module.
  • the display module is used for displaying M interfaces on the virtual screen of the wearable device, where the M interfaces are interfaces sent by the T target devices to the wearable device.
  • the receiving module is configured to receive the first input of the user, where the first input is the input of the first hovering gesture.
  • The processing module is configured to perform target processing on a first interface among the M interfaces in response to the first input when the first hovering gesture matches a first preset gesture, where the target processing is used to adjust display parameters of the first interface on the virtual screen.
  • M and T are both positive integers.
  • An embodiment of the present application provides an electronic device. The electronic device includes a processor, a memory, and a program or instruction stored in the memory and executable on the processor, where the program or instruction, when executed by the processor, implements the steps of the method in the first aspect above.
  • an embodiment of the present application provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, the steps of the method in the foregoing first aspect are implemented.
  • an embodiment of the present application provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used for running a program or instruction to implement the method in the first aspect.
  • In the embodiments of the present application, M interfaces may be displayed on the virtual screen of the wearable device, where the M interfaces are interfaces sent by the T target devices to the wearable device; a first input of the user is received, where the first input is an input of a first hovering gesture; and in a case that the first hovering gesture matches a first preset gesture, target processing is performed on a first interface among the M interfaces in response to the first input, where the target processing is used to adjust display parameters of the first interface on the virtual screen.
  • M and T are both positive integers.
  • Since the virtual screen of the wearable device can display the M interfaces sent by the T target devices to the wearable device, when the user has operational requirements for the interface displayed by the AR glasses, the user can directly use a gesture input that matches the first preset gesture to trigger target processing on the first interface among the M interfaces. The user therefore does not need to control the AR glasses through a remote control or another device in order to operate the interface displayed by the AR glasses, so operating that interface is more convenient.
  • FIG. 1 is a schematic diagram of a display control method provided by an embodiment of the present application.
  • FIG. 2 is a first schematic diagram of a gesture input interface provided by an embodiment of the present application.
  • FIG. 3 is a first schematic diagram of interface display control provided by an embodiment of the present application.
  • FIG. 4 is a second schematic diagram of a gesture input interface provided by an embodiment of the present application.
  • FIG. 5 is a third schematic diagram of a gesture input interface provided by an embodiment of the present application.
  • FIG. 6 is a second schematic diagram of interface display control provided by an embodiment of the present application.
  • FIG. 7 is a third schematic diagram of interface display control provided by an embodiment of the present application.
  • FIG. 8 is a fourth schematic diagram of a gesture input interface provided by an embodiment of the present application.
  • FIG. 9 is a fourth schematic diagram of interface display control provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a display control device provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 12 is a schematic hardware diagram of an electronic device provided by an embodiment of the present application.
  • In the embodiments of the present application, words such as “exemplarily” or “for example” are used to represent examples, illustrations, or explanations. Any embodiment or design described in the embodiments of the present application as “exemplary” or “for example” should not be construed as being more preferred or advantageous than other embodiments or designs. Rather, the use of words such as “exemplarily” or “for example” is intended to present the related concepts in a specific manner.
  • Embodiments of the present application provide a display control method, device, electronic device, and medium, which can display M interfaces on a virtual screen of a wearable device, where the M interfaces are interfaces sent by a target device to the wearable device, and M is a positive integer; and, when it is detected that the user's gesture matches the first preset gesture, perform the target operation on the first interface among the M interfaces.
  • the target operation includes at least one of the following: grabbing, zooming in, zooming out, moving, and typesetting.
  • an embodiment of the present application provides a display control method, and the method includes the following steps 101 to 104 .
  • Step 101 The display control apparatus displays M interfaces on the virtual screen of the wearable device.
  • M interfaces are interfaces sent by the T target devices to the wearable device, and M and T are both positive integers.
  • the wearable device may be a device using AR technology, or a device using other possible smart technologies.
  • the wearable device is AR glasses or AR helmet.
  • The above-mentioned M interfaces may include at least one of the following: a picture interface, a webpage interface, or an application program interface.
  • the M interfaces can also be other possible types of interfaces. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
  • the interface content of each of the above-mentioned M interfaces may be the same or different.
  • Preferably, the interface content of each of the M interfaces is different from that of the others.
  • the display forms of each of the M interfaces may be the same or different. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
  • Depending on the value of M, the number T of target devices may also differ accordingly.
  • In a case that M is greater than 1, the following two possible situations may be included:
  • the M interfaces can be different interfaces sent by one target device.
  • the M interfaces can be different interfaces sent by a target device at the same time; or different interfaces sent by a target device at different times.
  • the M interfaces may be interfaces respectively sent by the T target devices.
  • Since the M interfaces are interfaces displayed on the virtual screen of the wearable device, the M interfaces are virtual interfaces.
  • Each of the M interfaces can exist independently, without excluding or interfering with the others.
  • Step 102 The display control apparatus receives a first input from the user.
  • the above-mentioned first input is the input of the first hovering gesture.
  • the user can perform an air gesture operation on the interface displayed on the virtual screen.
  • the first input is an input of a zoom gesture, an input of a swipe gesture, an input of a drag gesture, an input of a grab gesture, or an input of other gestures.
  • Take the case where the first input is the input of a zoom gesture as an example.
  • the user may drag a frame of an interface on the virtual screen by dragging gestures with both hands to zoom in or out horizontally and vertically.
  • Alternatively, take the case where the first input is the input of a grab gesture as an example.
  • the user can move the border of an interface on the virtual screen through a grasping gesture.
  • the display control method provided in this embodiment of the present application may further include: detecting whether the first hovering gesture matches the first preset gesture. After that, if yes, execute the following step 103; if not, execute the following step 104.
  • Step 103 In the case that the first hovering gesture matches the first preset gesture, the display control apparatus performs target processing on the first interface among the M interfaces in response to the first input.
  • the above-mentioned target processing is used to adjust the display parameters of the first interface in the virtual screen.
  • the above-mentioned target processing may include at least one of the following: enlargement, reduction, movement, and typesetting.
  • the display parameters may include display size, display position or typesetting format, and the like.
  • When the target processing is enlargement, the display size of the first interface on the virtual screen can be enlarged; when the target processing is reduction, the display size of the first interface on the virtual screen can be reduced; when the target processing is movement, the display position of the first interface on the virtual screen can be moved; and when the target processing is typesetting, the typesetting format of the first interface on the virtual screen can be adjusted, as illustrated in the sketch below.
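  • For illustration only, the correspondence between each type of target processing and the display parameters it adjusts can be sketched in Python as follows. This is a minimal sketch and not the application's implementation; the class name DisplayParams and its fields are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DisplayParams:
    """Hypothetical display parameters of one interface on the virtual screen."""
    scale: float = 1.0             # display size factor
    position: tuple = (0.0, 0.0)   # display position on the virtual screen
    layout: str = "portrait"       # typesetting format

def apply_target_processing(params: DisplayParams, processing: str) -> DisplayParams:
    """Adjust the display parameters according to the target processing."""
    if processing == "enlarge":        # enlargement: increase the display size
        params.scale *= 1.25
    elif processing == "reduce":       # reduction: decrease the display size
        params.scale *= 0.8
    elif processing == "move":         # movement: shift the display position
        x, y = params.position
        params.position = (x + 0.1, y)
    elif processing == "typeset":      # typesetting: change the layout format
        params.layout = "landscape" if params.layout == "portrait" else "portrait"
    return params

# Usage: enlarging an interface increases its display size on the virtual screen.
params = apply_target_processing(DisplayParams(), "enlarge")
```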
  • the above-mentioned first preset gesture may be a factory setting of the wearable device, or a user-defined setting in advance. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
  • the first preset gesture is a one-hand drag gesture, or a two-hand drag gesture.
  • the user can drag the border of the interface by dragging gestures with both hands to trigger the horizontal enlargement of the interface displayed on the virtual screen, or the vertical reduction of the interface displayed on the virtual screen.
  • a user can trigger movement of an interface through grasping and moving gestures.
  • the wearable device may be preset with multiple gestures, the multiple gestures include a first preset gesture, and one gesture corresponds to one processing method. In this way, the wearable device can be triggered to perform different processing on the virtual interface through different gestures of the user.
  • Step 104 In the case that the first hovering gesture does not match the first preset gesture, the display control apparatus does not perform any processing on the M interfaces.
  • target processing in the foregoing embodiments are only several exemplary descriptions provided by the embodiments of the present application.
  • target processing may also include other types of processing. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
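  • Steps 101 to 104 can be read as a match-and-dispatch flow: the first hovering gesture carried by the first input is compared with the preset gestures; a match triggers the corresponding target processing on the first interface, and a non-matching gesture leaves the M interfaces unchanged. The following self-contained Python sketch only illustrates this flow under assumed gesture names and a simplified display-parameter representation; it is not taken from this application.

```python
# Hypothetical mapping from preset hovering gestures to target processing.
PRESET_GESTURES = {
    "two_hand_drag": "enlarge",
    "grab_and_move": "move",
}

def handle_first_input(gesture: str, interfaces: list, first: int) -> None:
    """Steps 102-104: match the first hovering gesture and process the first interface."""
    processing = PRESET_GESTURES.get(gesture)
    if processing is None:
        return  # Step 104: no preset gesture matches, so the M interfaces are left unchanged.
    params = interfaces[first]      # Step 103: target the first interface among the M interfaces.
    if processing == "enlarge":
        params["scale"] *= 1.25     # enlarge its display size on the virtual screen
    elif processing == "move":
        params["x"] += 0.1          # shift its display position

# Usage: three interfaces are displayed on the virtual screen; the gesture enlarges the first one.
interfaces = [{"scale": 1.0, "x": 0.0} for _ in range(3)]
handle_first_input("two_hand_drag", interfaces, first=0)
```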
  • For example, assume that the display control device is AR glasses and the first preset gesture is a zoom gesture.
  • Assume that the user wears the AR glasses. As shown in (a) of FIG. 3, the user can see the three interfaces displayed on the virtual screen 01 through the AR glasses. If the user wants to zoom in on the interface 02 among the three interfaces, as shown in (a) of FIG. 3, the user can perform a zoom gesture (i.e., the first input) toward the interface 02 along the arrow.
  • When the camera of the AR glasses detects that the zoom gesture matches the first preset gesture, the interface 02 is enlarged, so that, as shown in (b) of FIG. 3, the AR glasses display the enlarged interface 03 on the virtual screen.
  • Take the first preset gesture being a finger drag gesture as an example.
  • As shown in FIG. 4, it is assumed that interface A and interface B are displayed on the virtual screen.
  • Interface B can be moved to the right, so that the enlarged interface A is not affected.
  • An embodiment of the present application provides a display control method.
  • In a case that a user wears a wearable device, M interfaces sent by T target devices to the wearable device can be displayed on the virtual screen of the wearable device. When the user has operational requirements for the interface displayed by the AR glasses, the user can directly use a gesture input matching the first preset gesture to trigger target processing on the first interface among the M interfaces, so that the user does not need to control the AR glasses through a remote control or another device in order to operate the interface displayed by the AR glasses. In this way, operating the interface displayed by the AR glasses is more convenient.
  • the T target devices have established connections with the wearable device.
  • the display control method provided by this embodiment of the present application may further include steps 105 to 107 .
  • Step 105 Receive a second input from the user.
  • the above-mentioned second input is the input of the second hovering gesture.
  • The second hovering gesture of the second input is different from the first hovering gesture of the first input.
  • the second input may be a zoom gesture input, a slide gesture input, a drag gesture input, or the like.
  • the user may perform input of a two-finger swipe gesture on the wearable device.
  • the display control method provided in this embodiment of the present application may further include: detecting whether the second hovering gesture matches the second preset gesture. After that, if yes, execute the following step 106; if no, do nothing.
  • the display control apparatus detects whether the second hovering gesture matches the second preset gesture.
  • That the T target devices are within the field of view of the camera of the wearable device means the following: when the user wears the wearable device, the user can turn the head to drive the camera of the wearable device to perform detection. When the camera of the wearable device captures images of the T target devices, it can be determined that the T target devices are within the field of view of the camera of the wearable device.
  • Step 106 In the case that the second hovering gesture matches the second preset gesture, the display control apparatus sends a request message to each of the T target devices in response to the second input.
  • the above request message is used to request to acquire the second interface currently displayed by each target device.
  • the above-mentioned second preset gesture may be a factory setting of the wearable device, or a user-defined setting in advance.
  • the second preset gesture is a gesture among a plurality of gestures preset by the wearable device. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
  • the second preset gesture is a sliding gesture.
  • the display control method provided in this embodiment of the present application may further include: each target device receives a request message and sends a second interface to the wearable device.
  • the request message is used to request to obtain the second interface currently displayed by each target device.
  • Step 107 The display control apparatus receives the second interface sent by each target device. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
  • the second interface sent by each target device may be the same or different.
  • the display control method provided in this embodiment of the present application may further include: the display control apparatus displays at least one target interface on the virtual screen.
  • The content of each of the above target interfaces is the same as the content of the second interface; that is, one target interface is an interface among the M interfaces. It can be understood that a target interface displayed on the virtual screen may be the same as or different from the second interface displayed by the target device. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
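  • Steps 105 to 107 amount to a simple request/response exchange: a second hovering gesture that matches the second preset gesture makes the wearable device send each target device a request message, each target device replies with the second interface it is currently displaying, and the wearable device then shows the received interfaces as target interfaces on the virtual screen. The sketch below is only an illustration of this exchange; the TargetDevice class and the message name are assumptions, not the application's protocol.

```python
class TargetDevice:
    """Hypothetical target device (for example, a mobile phone) connected to the wearable device."""

    def __init__(self, name: str, current_interface: str):
        self.name = name
        self.current_interface = current_interface

    def handle_request(self, message: str) -> str:
        # The target device receives the request message and returns the
        # second interface it is currently displaying.
        assert message == "GET_CURRENT_INTERFACE"
        return self.current_interface

def mirror_current_interfaces(target_devices: list, virtual_screen: list) -> None:
    """Steps 106-107: request and then display the interface currently shown by each target device."""
    for device in target_devices:
        second_interface = device.handle_request("GET_CURRENT_INTERFACE")  # Step 106: send a request message
        virtual_screen.append(second_interface)                            # Step 107: receive and display it

# Usage: two connected target devices, a phone and a tablet.
virtual_screen = []
mirror_current_interfaces(
    [TargetDevice("phone", "photo viewer"), TargetDevice("tablet", "web page")],
    virtual_screen,
)
```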
  • For example, the user can first perform a two-finger zoom operation on the screen of the target device to trigger the target device to zoom its interface, and the mobile phone displays the zoomed interface; then, as shown in (c) of Figure 5, the user can perform a two-finger sliding gesture on the virtual screen of the wearable device to trigger sending to the target device a request message requesting the interface currently displayed by the target device. After that, the target device can send the currently displayed interface to the wearable device, so that the wearable device can obtain and display that interface; that is, the interface currently displayed by the target device can be directly mapped onto the virtual screen of the wearable device.
  • After the wearable device displays the at least one target interface on the virtual screen, when the user wears the wearable device and turns the head, the at least one target interface can move following the user's viewing angle, while the position of the wearable device relative to the virtual screen (or the at least one target interface) remains the same, as illustrated in the sketch below.
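  • One simple way to keep the target interfaces at a fixed position relative to the wearable device while the head turns is to re-anchor the virtual screen to the current head pose every frame. The fragment below is a minimal yaw-only sketch of that idea under assumed geometry; it is not the application's rendering method.

```python
import math

def interface_world_position(head_yaw_rad: float, offset_forward: float = 1.0) -> tuple:
    """Place the virtual screen a fixed distance in front of the wearer.

    Because the offset is applied in the head frame, the target interface keeps
    the same position relative to the wearable device as the head turns.
    """
    x = offset_forward * math.sin(head_yaw_rad)
    z = offset_forward * math.cos(head_yaw_rad)
    return (x, z)

# Usage: as the head turns, the interface follows the user's viewing angle.
for yaw_deg in (0, 30, 60):
    print(yaw_deg, interface_world_position(math.radians(yaw_deg)))
```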
  • For example, assume that the display control device is AR glasses, the target device is a mobile phone, and the second preset gesture is a two-finger sliding gesture.
  • an interface 05 is displayed on the screen of the mobile phone 04 (ie, the target device).
  • If the user wants to view the interface 05 through the AR glasses, as shown in (b) of Figure 6, the user can perform a two-finger swipe gesture on the AR glasses (i.e., the second input).
  • the AR glasses can send a request message to the mobile phone 04 for requesting to obtain the interface 05 (ie, the second interface) currently displayed by the mobile phone 04 .
  • the mobile phone 04 can receive the request message and send the interface 05 to the AR glasses.
  • the AR glasses can receive the interface 05, and as shown in (c) of FIG. 6 , the AR glasses display the interface 06 with the same content as the interface 05 on the virtual screen 01 .
  • In this way, when the target device and the wearable device are connected, the user can, by inputting a hovering gesture matching the second preset gesture, trigger sending to the target device a request message requesting the second interface currently displayed by the target device, so that the wearable device can receive the second interface sent by the target device. That is, an interaction mode between the target device and the wearable device is provided, so that the interface of the target device can be mapped to and displayed on the wearable device.
  • the display control method provided by this embodiment of the present application may further include step 108 and step 109 .
  • Step 108 The display control apparatus receives a third input from the user to the first object and the third interface among the M interfaces.
  • The above-mentioned first object is an object in a fourth interface among the M interfaces, or an object in the pasteboard of the wearable device.
  • the third input is a third floating touch input.
  • the above-mentioned first object may include at least one of the following: an icon, a text, a picture, a link, a table, etc., or other possible objects.
  • the above-mentioned third input may be a user's hovering gesture input on the first object and the third interface, an air touch input, or other possible input. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
  • For example, the third input may be an input that includes selecting, in space, the first object in one interface on the virtual screen, and moving, in space, the first object to the third interface.
  • the third input may be an input of sliding the pasteboard icon to the third interface in space.
  • Step 109 The display control apparatus adds the first object to the third interface in response to the third input.
  • Optionally, in a case that the first object is an object in a fourth interface among the M interfaces, the third input includes a first sub-input and a second sub-input. That is, the first sub-input is an input of the user moving the first object in the fourth interface to the pasteboard of the wearable device, and the second sub-input is an input of moving the pasteboard icon to the third interface.
  • the above-mentioned step 109 can be implemented by the following steps 109A and 109B.
  • Step 109A In response to the first sub-input, the display control apparatus saves the first object to the pasteboard of the wearable device.
  • Step 109B the display control apparatus adds the first object in the pasteboard to the third interface in response to the second sub-input.
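  • Sub-steps 109A and 109B can be read as a copy-then-paste sequence through the pasteboard of the wearable device. The sketch below only illustrates that sequence; the list-based representation of interfaces and the pasteboard is an assumption made for brevity.

```python
def save_to_pasteboard(pasteboard: list, source_interface: list, first_object: str) -> None:
    """Step 109A: in response to the first sub-input, save the first object to the pasteboard."""
    if first_object in source_interface:
        pasteboard.append(first_object)

def paste_from_pasteboard(pasteboard: list, third_interface: list) -> None:
    """Step 109B: in response to the second sub-input, add the pasteboard object to the third interface."""
    if pasteboard:
        third_interface.append(pasteboard.pop())

# Usage: copy a picture object from the fourth interface into the third interface.
fourth_interface = ["text_block", "picture_08"]
third_interface = ["table"]
pasteboard = []
save_to_pasteboard(pasteboard, fourth_interface, "picture_08")
paste_from_pasteboard(pasteboard, third_interface)   # third_interface now also contains picture_08
```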
  • Take the display control device being AR glasses as an example.
  • Assume that the user wears the AR glasses. As shown in (a) of FIG. 7, the user can see the three interfaces displayed on the virtual screen 01 through the AR glasses. If the user wants to paste the object 08 of the interface 07 among these three interfaces into another interface, the user's finger can make a circle gesture toward the object 08 in the interface 07 and slide the object 08 in the air to the paste icon 09 (i.e., the first sub-input). After receiving the first sub-input, in response to the first sub-input, the AR glasses select the object 08 and save the object 08 in the pasteboard of the wearable device indicated by the paste icon 09.
  • the user can drag the paste icon 09 into the interface 10 (ie, the second sub-input) in space.
  • After the wearable device receives the second sub-input, in response to the second sub-input, as shown in (b) of FIG. 7, the AR glasses display, on the virtual screen, the interface 10 to which the object 08 has been added.
  • In this way, the user can, according to actual needs, first trigger copying some objects in one interface of the M interfaces to the pasteboard of the wearable device, and then trigger adding those objects to another interface of the M interfaces. That is, an implementation of copying some objects from one interface to another interface is provided.
  • Optionally, in a case that a second object is a partial object in one interface of the M interfaces, the third input includes a third sub-input and a fourth sub-input.
  • The user can first trigger a screenshot of the second object through the third sub-input; then, through the fourth sub-input, the user can trigger pasting the second object into another interface of the M interfaces.
  • the above step 109 can be implemented by the following steps 109C and 109D.
  • Step 109C In response to the third sub-input, the display control apparatus takes a screenshot of the second object in one interface of the M interfaces, and saves the second object to the pasteboard of the wearable device.
  • Step 109D In response to the fourth sub-input, the display control apparatus adds the second object in the pasteboard to another interface of the M interfaces.
  • In this way, the user can trigger, through an input (for example, a hovering gesture input), adding a part of an object or a part of an interface in one of the M interfaces, or an object in the pasteboard of the wearable device, to another interface of the M interfaces. By operating across the multiple interfaces, the user can thus exchange content among them.
  • the display control method provided by this embodiment of the present application may further include steps 110 to 112.
  • Step 110 The display control apparatus displays at least one device identifier on the virtual screen.
  • each device identifier in the above at least one device identifier indicates a device
  • the one device is a device that is within the field of view of the camera of the wearable device and has established a connection with the wearable device.
  • The identifiers in this application are words, symbols, images, or the like used to indicate information; controls or other containers may be used as carriers for displaying the information, and the identifiers include but are not limited to text identifiers, symbol identifiers, and image identifiers.
  • the above at least one device identifier corresponds to at least one device, and the at least one device may include a device in the T target devices.
  • Step 111 The display control apparatus receives a fourth input from the user.
  • The above-mentioned fourth input is a hovering touch input to the fifth interface among the M interfaces and the first device identifier among the at least one device identifier.
  • the above-mentioned fourth input may be an air gesture input, an air touch input or other possible input by the user to the fifth interface. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
  • The fourth input may include a gesture input of grabbing the fifth interface in the air and a sliding gesture input.
  • the fourth input may include an input on the fifth interface by air-clicking and an input on the fifth interface by air-drag.
  • In a case that the fourth input is the user's air gesture input, the spatial position of the fourth input can be determined according to depth-of-field information collected by the camera, and it can then be determined that the air gesture input is an input directed at the fifth interface.
  • Step 112 In response to the fourth input, the display control apparatus sends a fifth interface to the first device indicated by the first device identifier.
  • For example, the fourth input may include one sub-input of grabbing the fifth interface from among the M interfaces, and another sub-input of moving the fifth interface to the first device identifier. The above step 112 may specifically include: in response to the one sub-input, the display control device selects, from the M interfaces, the fifth interface corresponding to the spatial position of that sub-input; and in response to the other sub-input, sends the fifth interface to the first device indicated by the first device identifier, as illustrated in the sketch below.
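  • Steps 110 to 112 combine a spatial hit test (deciding which of the M interfaces the grabbing sub-input points at, for example from the depth-of-field information collected by the camera) with sending the selected interface to the device indicated by the chosen device identifier. The sketch below is a simplified illustration under assumed data structures; the nearest-position hit test is an assumption, not the application's actual selection rule.

```python
import math

def pick_interface(interfaces: dict, input_position: tuple) -> str:
    """Select the interface whose spatial position is closest to the spatial position of the input."""
    return min(interfaces, key=lambda name: math.dist(interfaces[name], input_position))

def send_to_device(devices: dict, device_id: str, interface_name: str) -> None:
    """Step 112: send the fifth interface to the first device indicated by the first device identifier."""
    devices[device_id].append(interface_name)   # the receiving device then displays this interface

# Usage: three interfaces at known positions on the virtual screen; the grab
# sub-input lands near interface_03, which is then sent to device_11.
interfaces = {"interface_01": (0.0, 0.0), "interface_02": (1.0, 0.0), "interface_03": (2.0, 0.0)}
devices = {"device_11": [], "device_12": []}
picked = pick_interface(interfaces, input_position=(1.9, 0.1))
send_to_device(devices, "device_11", picked)
```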
  • For example, the user can first trigger selecting and grabbing an interface displayed on the virtual screen of the wearable device by inputting a finger-pinch gesture, and the interface then follows the movement of the user's finger; as shown in (b) of Figure 8, the user can then use a two-finger swipe gesture input to trigger mapping of the interface onto another device that has established a connection with the wearable device and is within the field of view of the wearable device.
  • the first device may be a device in the T target devices, or other devices that have established connections with the wearable device except the T target devices. Specifically, it can be determined according to the actual situation, which is not limited in this embodiment of the present application.
  • Take the first device being one of the T target devices as an example.
  • When the T target devices are one device, the first device is that one device; and when the T target devices are multiple devices, the first device is a device among the multiple devices.
  • the display control method provided in this embodiment of the present application may further include: the first device receives a fifth interface, and updates the currently displayed interface to the fifth interface.
  • Take the display control device being AR glasses as an example, and assume that the user wears the AR glasses.
  • The user can see the three interfaces displayed on the virtual screen 01 through the AR glasses. If the user wants to put the interface 03 of these three interfaces onto another device, the user can turn the head while wearing the AR glasses, so that the AR glasses detect devices that are within the field of view of the camera of the wearable device and have established connections with the AR glasses, and the AR glasses then display the device identifier 11, the device identifier 12, and the device identifier 13 on the virtual screen 01.
  • The user can make a finger-pinching gesture in the air toward the interface 03, and move the interface 03 in the air to the device identifier 11 along the arrow direction (that is, the fourth input).
  • After receiving the fourth input, the AR glasses select, from the three interfaces, the interface 03 (i.e., the fifth interface) corresponding to the spatial position of the fourth input, and send the interface 03 to the device 11 indicated by the device identifier 11. After receiving the interface 03, the device 11 updates the interface 14 shown in (b) of FIG. 9 to the interface 15 shown in (c) of FIG. 9, and the content of the interface 15 is consistent with the content of the interface 03.
  • In this way, the user can, through an input, trigger the wearable device to send one interface among the M interfaces to any device that is within the field of view of the camera of the wearable device and has established a connection with the wearable device, so that that device can display the interface. An interaction mode between multiple devices is thereby realized, and an interface can be switched and displayed among the multiple devices.
  • the execution subject may be a display control device, or a control module in the display control device for executing the display control method.
  • the display control device provided by the embodiment of the present application is described by taking the display control device executing the display control method as an example.
  • an embodiment of the present application provides a display control apparatus 200
  • the display control apparatus may include a display module 201 , a receiving module 202 and a processing module 203 .
  • the display module 201 may be configured to display M interfaces on the virtual screen of the wearable device, where the M interfaces are interfaces sent by the T target devices to the wearable device.
  • the receiving module 202 may be configured to receive a first input from the user, where the first input is an input of the first hovering gesture.
  • The processing module 203 can be configured to perform target processing on the first interface among the M interfaces in response to the first input received by the receiving module 202 when the first hovering gesture matches the first preset gesture, where the target processing is used to adjust display parameters of the first interface on the virtual screen.
  • M and T are both positive integers.
  • the display control apparatus may further include a sending module 204 and a transceiver module 205 .
  • the receiving module 202 may also be configured to receive a second input from the user, where the second input is an input of a second hovering gesture.
  • The sending module 204 can be configured to send a request message to each of the T target devices in response to the second input received by the receiving module 202 when the second hovering gesture matches the second preset gesture, where the request message is used to request to obtain the second interface currently displayed by each target device.
  • the transceiver module 205 can also be used to receive the second interface sent by each target device.
  • the receiving module 202 can also be configured to receive the user's third input on the first object and the third interface in the M interfaces after the display module 201 displays the M interfaces on the virtual screen of the wearable device.
  • the first object is an object in a fourth interface of the M interfaces, or an object in a pasteboard of a wearable device
  • the third input is a third floating touch input.
  • The processing module 203 may also be configured to add the first object to the third interface in response to the third input received by the receiving module 202.
  • Optionally, the first object is an object in a fourth interface among the M interfaces, and the third input may include a first sub-input and a second sub-input.
  • the processing module 203 can be specifically configured to save the first object in the pasteboard of the wearable device in response to the first sub-input received by the receiving module 202; and in response to the second sub-input received by the receiving module 202, The first object in the pasteboard is added to the third interface.
  • the display module 201 can also be used to display at least one device identifier in the virtual screen after displaying M interfaces on the virtual screen of the wearable device, and each device identifier in the at least one device identifier indicates a device.
  • the one device is within the field of view of the camera of the wearable device and has established a connection with the wearable device.
  • the receiving module 202 may also be configured to receive a fourth input from the user, where the fourth input is a floating touch input to the fifth interface in the M interfaces and the first device identification in the at least one device identification.
  • The sending module 204 may also be configured to send, in response to the fourth input received by the receiving module 202, the fifth interface to the first device indicated by the first device identifier.
  • Optionally, in a case that M is greater than 1: when T=1, the M interfaces may be different interfaces sent by one target device; or, when T>1, the M interfaces may be interfaces respectively sent by the T target devices.
  • An embodiment of the present application provides a display control device.
  • In a case that a user wears a wearable device, M interfaces sent by T target devices to the wearable device can be displayed on the virtual screen of the wearable device. When the user has operational requirements for the interface displayed by the AR glasses, the user can directly use a gesture input matching the first preset gesture to trigger the display control device to perform target processing on the first interface among the M interfaces, so that the user does not need to control the AR glasses through a remote control or another device in order to operate the interface displayed by the AR glasses. In this way, operating the interface displayed by the AR glasses is more convenient.
  • the display control device in this embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal.
  • the apparatus may be a mobile electronic device or a non-mobile electronic device.
  • Exemplarily, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
  • The non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like.
  • the display control device in the embodiment of the present application may be a device with an operating system.
  • The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
  • the display control apparatus provided by the embodiment of the present application can implement each process implemented by the method embodiments in FIG. 1 to FIG. 9 , and to avoid repetition, details are not repeated here.
  • An embodiment of the present application further provides an electronic device 300, including a processor 301, a memory 302, and a program or instruction stored in the memory 302 and executable on the processor 301. When the program or instruction is executed by the processor 301, each process of the above display control method embodiment is implemented, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
  • the electronic devices in the embodiments of the present application include the aforementioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 12 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
  • The electronic device 400 includes, but is not limited to: a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, a processor 410, and other components.
  • The electronic device 400 may also include a power source (such as a battery) for supplying power to the various components, and the power source may be logically connected to the processor 410 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
  • The structure of the electronic device shown in FIG. 12 does not constitute a limitation on the electronic device. The electronic device may include more or fewer components than shown, combine some components, or have a different arrangement of components, which will not be repeated here.
  • the display unit 406 may be configured to display M interfaces on the virtual screen of the wearable device, where the M interfaces are interfaces sent by the T target devices to the wearable device.
  • the user input unit 407 may be configured to receive a first input from a user, where the first input is an input of a first hovering gesture.
  • the processor 410 can be configured to perform target processing on the first interface among the M interfaces in response to the first input received by the user input unit 407 when the first hovering gesture matches the first preset gesture , the target processing is used to adjust the display parameters of the first interface in the virtual screen.
  • M and T are both positive integers.
  • the T target devices have established connections with the wearable device.
  • the user input unit 407 is further configured to receive a second input from the user, where the second input is an input of a second hovering gesture.
  • The radio frequency unit 401 can be configured to send a request message to each of the T target devices in response to the second input received by the user input unit 407 when the second hovering gesture matches the second preset gesture, where the request message is used to request to obtain the second interface currently displayed by each target device; and to receive the second interface sent by each target device.
  • The user input unit 407 can also be configured to receive the user's third input on the first object and the third interface among the M interfaces after the display unit 406 displays the M interfaces on the virtual screen of the wearable device.
  • the first object is an object in a fourth interface among the M interfaces, or an object in a pasteboard of a wearable device
  • the third input is a third floating touch input.
  • the processor 410 may be configured to add the first object to the third interface in response to the third input received by the user input unit 407 .
  • Optionally, the first object is an object in a fourth interface among the M interfaces, and the third input includes a first sub-input and a second sub-input.
  • the processor 410 may be specifically configured to save the first object to the pasteboard of the wearable device in response to the first sub-input received by the user input unit 407; and in response to the second sub-input received by the user input unit 407 Input to add the first object in the pasteboard to the third interface.
  • the display unit 406 is configured to display at least one device identifier in the virtual screen after displaying M interfaces on the virtual screen of the wearable device, and each device identifier in the at least one device identifier indicates a device, the A device is a device that is within the field of view of the camera of the wearable device and has established a connection with the wearable device.
  • the user input unit 407 may be configured to receive a fourth input from the user, where the fourth input is a hovering touch input to the fifth interface among the M interfaces and the first device identifier of the at least one device identifier.
  • the radio frequency unit 401 may be configured to, in response to the fourth input received by the user input unit 407, send a fifth interface to the first device indicated by the first device identifier.
  • An embodiment of the present application provides an electronic device.
  • In a case that a user wears a wearable device, M interfaces sent by T target devices to the wearable device can be displayed on the virtual screen of the wearable device. The user can directly use a gesture input matching the first preset gesture to trigger the electronic device to perform target processing on the first interface among the M interfaces, so that the user does not need to control the AR glasses through a remote control or another device in order to operate the interface displayed by the AR glasses. In this way, operating the interface displayed by the AR glasses is more convenient.
  • The input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042. The graphics processing unit 4041 processes image data of still pictures or videos obtained by an image capture device (such as a camera).
  • the display unit 406 may include a display panel 4061, which may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 407 includes a touch panel 4071 and other input devices 4072 .
  • the touch panel 4071 is also called a touch screen.
  • the touch panel 4071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 4072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which are not described herein again.
  • Memory 409 may be used to store software programs as well as various data including, but not limited to, application programs and operating systems.
  • The processor 410 may integrate an application processor and a modem processor, where the application processor mainly processes the operating system, user interface, application programs, and the like, and the modem processor mainly processes wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 410.
  • Embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored on the readable storage medium. When the program or instruction is executed by a processor, each process of the foregoing display control method embodiment is implemented, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
  • a readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, and the like.
  • An embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above display control method embodiment and achieve the same technical effect. To avoid repetition, details are not repeated here.
  • It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
  • It can be understood that the methods of the above embodiments can be implemented by means of software plus a necessary general hardware platform, and of course can also be implemented by hardware, but in many cases the former is a better implementation.
  • The technical solution of the present application, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or a CD-ROM) and includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods in the various embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a display control method and apparatus, an electronic device, and a medium, which belong to the field of communication technologies. The method includes: displaying M interfaces on a virtual screen of a wearable device, where the M interfaces are interfaces sent to the wearable device by T target devices; receiving a first input of a user, where the first input is an input of a first hovering gesture; and when the first hovering gesture matches a first preset gesture, performing, in response to the first input, target processing on a first interface among the M interfaces, where the target processing is used to adjust a display parameter of the first interface on the virtual screen. M and T are both positive integers.
PCT/CN2022/074264 2021-01-28 2022-01-27 Procédé et appareil de commande d'affichage, dispositif électronique et support WO2022161432A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110116791.4 2021-01-28
CN202110116791.4A CN112947825A (zh) 2021-01-28 2021-01-28 显示控制方法、装置、电子设备及介质

Publications (1)

Publication Number Publication Date
WO2022161432A1 true WO2022161432A1 (fr) 2022-08-04

Family

ID=76238501

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/074264 WO2022161432A1 (fr) 2021-01-28 2022-01-27 Procédé et appareil de commande d'affichage, dispositif électronique et support

Country Status (2)

Country Link
CN (1) CN112947825A (fr)
WO (1) WO2022161432A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112947825A (zh) * 2021-01-28 2021-06-11 维沃移动通信有限公司 显示控制方法、装置、电子设备及介质
CN115509476A (zh) * 2021-06-23 2022-12-23 华为技术有限公司 一种屏幕共享方法、系统和虚拟显示设备
CN113687721A (zh) * 2021-08-23 2021-11-23 Oppo广东移动通信有限公司 设备控制方法、装置、头戴显示设备及存储介质
CN113703592A (zh) * 2021-08-31 2021-11-26 维沃移动通信有限公司 安全输入方法和装置
CN114063778A (zh) * 2021-11-17 2022-02-18 北京蜂巢世纪科技有限公司 一种利用ar眼镜模拟图像的方法、装置、ar眼镜及介质
CN114510171B (zh) * 2022-02-14 2023-10-24 广州塔普鱼网络科技有限公司 一种基于图像处理技术的便捷式三维交互系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106803988A (zh) * 2017-01-03 2017-06-06 苏州佳世达电通有限公司 信息传送系统以及信息传送方法
CN106997242A (zh) * 2017-03-28 2017-08-01 联想(北京)有限公司 界面管理方法及头戴式显示设备
US20200162851A1 (en) * 2018-11-20 2020-05-21 Navitaire Llc Systems and methods for sharing information between augmented reality devices
CN111190488A (zh) * 2019-12-30 2020-05-22 华为技术有限公司 设备控制方法、通信装置及存储介质
CN112947825A (zh) * 2021-01-28 2021-06-11 维沃移动通信有限公司 显示控制方法、装置、电子设备及介质
CN113515192A (zh) * 2021-05-14 2021-10-19 闪耀现实(无锡)科技有限公司 用于可穿戴设备的信息处理方法、装置及可穿戴设备

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160357263A1 (en) * 2014-07-22 2016-12-08 Augumenta Ltd Hand-gesture-based interface utilizing augmented reality
CN104914999A (zh) * 2015-05-27 2015-09-16 广东欧珀移动通信有限公司 一种控制设备的方法及可穿戴设备
CN107450717B (zh) * 2016-05-31 2021-05-18 联想(北京)有限公司 一种信息处理方法及穿戴式设备
CN109725947A (zh) * 2017-10-30 2019-05-07 华为技术有限公司 一种未读消息的处理方法及终端
CN111031471A (zh) * 2019-11-25 2020-04-17 维沃移动通信有限公司 一种数据传输方法、终端及基站


Also Published As

Publication number Publication date
CN112947825A (zh) 2021-06-11

Similar Documents

Publication Publication Date Title
WO2022161432A1 (fr) Procédé et appareil de commande d'affichage, dispositif électronique et support
US20220377128A1 (en) File transfer display control method and apparatus, and corresponding terminal
WO2021083052A1 (fr) Procédé de partage d'objet et dispositif électronique
US20200059500A1 (en) Simultaneous input system for web browsers and other applications
WO2020259651A1 (fr) Procédé de commande d'interface utilisateur et dispositif électronique
WO2020063091A1 (fr) Procédé de traitement d'image et dispositif terminal
WO2022063022A1 (fr) Procédé et appareil de prévisualisation vidéo et dispositif électronique
CN109002243B (zh) 一种图像参数调节方法及终端设备
US20140078091A1 (en) Terminal Device and Method for Quickly Starting Program
US20170199662A1 (en) Touch operation method and apparatus for terminal
CN110737374B (zh) 操作方法及电子设备
CN110221885B (zh) 一种界面显示方法及终端设备
US20210357067A1 (en) Device and method for processing user input
WO2015161653A1 (fr) Procédé d'exploitation de terminal et dispositif terminal
CN109857306B (zh) 截屏方法及终端设备
US11546457B2 (en) Electronic device and method of operating electronic device in virtual reality
CN111638837B (zh) 一种消息处理方法及电子设备
WO2020001604A1 (fr) Procédé d'affichage et dispositif terminal
CN112099707A (zh) 显示方法、装置和电子设备
WO2020181956A1 (fr) Procédé d'affichage d'identifiant d'application et appareil terminal
CN112911147B (zh) 显示控制方法、显示控制装置及电子设备
WO2022135409A1 (fr) Procédé de traitement d'affichage, appareil de traitement d'affichage et dispositif portable
WO2023045885A1 (fr) Procédé et appareil de synchronisation
WO2020173316A1 (fr) Procédé d'affichage d'images, terminal et terminal mobile
US20140195935A1 (en) Information processing device, information processing method, and information processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22745305

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22745305

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.01.2024)