WO2023173726A1 - Interaction method, device and storage medium - Google Patents

Interaction method, device and storage medium

Info

Publication number
WO2023173726A1
Authority
WO
WIPO (PCT)
Prior art keywords
trigger request
trigger
action
executed
interface
Prior art date
Application number
PCT/CN2022/123489
Other languages
English (en)
French (fr)
Inventor
周超
Original Assignee
北京字跳网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字跳网络技术有限公司 filed Critical 北京字跳网络技术有限公司
Priority to EP22799838.2A priority Critical patent/EP4273670A4/en
Priority to US17/998,718 priority patent/US20240176456A1/en
Publication of WO2023173726A1 publication Critical patent/WO2023173726A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/60Software deployment
    • G06F8/65Updates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Definitions

  • the present disclosure relates to virtual reality technology, and in particular, to an interactive method, device and storage medium.
  • Virtual Reality (VR) technology is a virtual environment created using modern computer technology. Users can use specific human-computer interaction equipment and devices to interact with the virtual environment to create an immersive experience.
  • Extended Reality (XR) is a further development of virtual reality technology. Extended reality refers to the use of computer technology and wearable devices to generate a real and virtual environment that can be interacted with by humans.
  • Interaction with virtual objects is an essential part of the XR world. It is divided into two interaction methods according to distance: one is near-field interaction, where the user touches an object with a finger click or a handle; the other is far-field interaction, where the user generally uses rays to detect collisions with virtual objects and thereby completes the entire trigger cycle.
  • the present disclosure provides an interaction method, device and storage medium to manage the interaction between multiple input devices and UI in an XR scenario.
  • embodiments of the present disclosure provide an interaction method applied to a target execution component, where the target execution component is one of multiple execution components, and one execution component corresponds to one input device.
  • the method includes:
  • the trigger request is sent by a target input device.
  • the target execution component corresponds to the target input device.
  • the status of the target execution component is an active state, and the status of each remaining execution component among the multiple execution components, except the target execution component, is an inactive state;
  • after trigger signal collision detection, a trigger request is obtained from the preset action queue as a trigger request to be executed according to a preset sequence;
  • the corresponding action execution interface is called to process the trigger event, and the processing result of the action execution interface is displayed on the UI.
  • before calling the corresponding action execution interface to process the trigger event according to the trigger request to be executed, the method further includes:
  • Calling the corresponding action execution interface to process the trigger event according to the trigger request to be executed includes:
  • the corresponding action execution interface is called to process the trigger event according to the trigger request to be executed.
  • the method further includes:
  • Calling the corresponding action execution interface to process the trigger event according to the trigger request to be executed includes:
  • the corresponding action execution interface is called to process the trigger event according to the to-be-executed trigger request;
  • if the action triggered by the to-be-executed trigger request is mutually exclusive with the action triggered by the currently-executed trigger request, then after the execution of the currently-executed trigger request is completed, the corresponding action execution interface is called according to the to-be-executed trigger request to process the trigger event.
  • the trigger request carries an activation identifier, and the activation identifier is used to indicate whether the action triggered by the trigger request is an activation action;
  • Determining whether the action triggered by the currently executed trigger request is within the action cycle includes:
  • if the action triggered by the currently executed trigger request is an activation action, it is determined that the action triggered by the currently executed trigger request is within the action period.
  • determining whether the action triggered by the to-be-executed trigger request and the action triggered by the currently executed trigger request are mutually exclusive includes:
  • the method further includes:
  • the corresponding hover processing interface is called to process the hover event, and the processing results of the hover processing interface are displayed on the UI.
  • the hover processing interface includes a hover entry interface, a hover stay interface, and a hover end interface;
  • calling the corresponding action execution interface to process the trigger event according to the trigger request to be executed includes:
  • the corresponding action execution interface is called to process the trigger event.
  • the target execution component interacts with the management system
  • the receiving trigger request includes:
  • the trigger request is sent by the management system after the management system receives it from the target input device.
  • the trigger request is used to instruct the management system to set the status of the target execution component to an active state, and to set the status of each remaining execution component among the plurality of execution components, except the target execution component, to an inactive state.
  • the trigger signal is a ray.
  • inventions of the present disclosure provide an interaction device, which is applied to a target execution component.
  • the target execution component is one of multiple execution components, and one execution component corresponds to an input device.
  • the device includes:
  • a receiving module configured to receive a trigger request.
  • the trigger request is sent by a target input device.
  • the target execution component corresponds to the target input device.
  • the status of the target execution component is an active state.
  • the status of each remaining execution component among the plurality of execution components, except the target execution component, is inactive;
  • a processing module configured to cache the trigger request in a preset action queue, and after receiving the update instruction, perform trigger signal collision detection based on the UI;
  • An acquisition module configured to acquire a trigger request from the preset action queue as a trigger request to be executed according to a preset order after trigger signal collision detection
  • the calling module is configured to call the corresponding action execution interface to process the trigger event according to the trigger request to be executed, and display the processing result of the action execution interface on the UI.
  • the calling module is specifically used to:
  • the corresponding action execution interface is called to process the trigger event according to the trigger request to be executed.
  • the calling module is specifically used to:
  • the corresponding action execution interface is called to process the trigger event according to the to-be-executed trigger request;
  • if the action triggered by the to-be-executed trigger request is mutually exclusive with the action triggered by the currently-executed trigger request, then after the execution of the currently-executed trigger request is completed, the corresponding action execution interface is called according to the to-be-executed trigger request to process the trigger event.
  • the trigger request carries an activation identifier, and the activation identifier is used to indicate whether the action triggered by the trigger request is an activation action;
  • the calling module is specifically used for:
  • if the action triggered by the currently executed trigger request is an activation action, it is determined that the action triggered by the currently executed trigger request is within the action period.
  • the calling module is specifically used to:
  • the acquisition module acquires a trigger request from the preset action queue as a trigger request to be executed according to a preset order, and is also used for:
  • calling the corresponding hover processing interface to process the hover event and displaying the processing results of the hover processing interface on the UI.
  • the hover processing interface includes a hover entry interface, a hover stay interface, and a hover end interface;
  • the acquisition module is specifically used for:
  • the calling module is specifically used to:
  • the corresponding action execution interface is called to process the trigger event.
  • the target execution component interacts with the management system.
  • the receiving module is specifically used for:
  • the trigger request is sent by the management system after the management system receives it from the target input device.
  • the trigger request is used to instruct the management system to set the status of the target execution component to an active state, and to set the status of each remaining execution component among the plurality of execution components, except the target execution component, to an inactive state.
  • the trigger signal is a ray.
  • a target execution component including:
  • the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method according to the first aspect.
  • embodiments of the present disclosure provide a computer-readable storage medium that stores a computer program, and the computer program causes a server to execute the method described in the first aspect.
  • embodiments of the present disclosure provide a computer program product, including computer instructions, and the computer instructions are used by a processor to execute the method described in the first aspect.
  • embodiments of the present disclosure provide a computer program.
  • the computer program is stored in a readable storage medium.
  • At least one processor of an electronic device can read the computer program from the readable storage medium.
  • At least one processor executes the computer program, so that the electronic device executes the method described in the first aspect.
  • the interactive method, device and storage medium receive a trigger request sent by an input device through an execution component.
  • the execution component corresponds to the input device.
  • the status of the execution component is the active state, and the status of the other execution components is the inactive state, so that only one input device is active at the same time (i.e., can trigger events) and the UI only responds to one trigger at a time.
  • trigger signal collision detection and event processing are performed, and the processing results are displayed on the UI.
  • the problem of how the UI responds correctly when multiple input devices interact with the UI in an XR scenario is solved, ensuring that the UI interaction in the XR scenario is correct, fast and complete.
  • after receiving the trigger request sent by the management system, the execution component caches the trigger request in the preset action queue. In this way, on the one hand, trigger requests with a mutually exclusive relationship can be managed so that their action cycles do not overlap; on the other hand, within the action cycle, even if the trigger request is not sent every frame, the execution component is still guaranteed to generate the trigger event every frame, so that the UI can completely process the corresponding event.
  • Figure 1 is a schematic diagram of the interactive system architecture provided by an embodiment of the present disclosure
  • Figure 2 is a schematic flowchart of an interaction method provided by an embodiment of the present disclosure
  • Figure 3 is a schematic diagram of hover event processing provided by an embodiment of the present disclosure.
  • Figure 4 is a schematic diagram of trigger event processing provided by an embodiment of the present disclosure.
  • Figure 5 is a schematic flowchart of another interaction method provided by an embodiment of the present disclosure.
  • Figure 6 is a schematic flowchart of yet another interaction method provided by an embodiment of the present disclosure.
  • Figure 7 is a schematic structural diagram of an interactive device provided by an embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram of the basic hardware architecture of a target execution component provided by an embodiment of the present disclosure.
  • XR technology includes augmented reality (Augmented Reality, AR), VR, and mixed reality (Mixed Reality, MR), which uses hardware devices combined with a variety of technical means to integrate virtual content with real scenes.
  • near-field interaction where the user uses finger clicks or handles to touch objects
  • far-field interaction, where users generally use rays to detect collisions with virtual objects and further complete the entire trigger cycle.
  • for the latter, more than one input device may exist in the XR scene at the same time, while the UI in the scene can only respond to one trigger at a time. Therefore, a mechanism is needed to manage the interaction of multiple input devices with the UI.
  • embodiments of the present disclosure propose an interaction method that receives a trigger request sent by an input device through an execution component.
  • the execution component corresponds to the input device.
  • the status of the execution component is the active state, and the status of other execution components is the inactive state.
  • the interaction method provided by the embodiment of the present disclosure can be applied to the interaction system as shown in Figure 1.
  • the interactive system architecture may include an execution component 101 .
  • the execution component 101 is one of multiple execution components, and one execution component corresponds to one input device.
  • an execution component is bound to an input device, and the execution component only processes relevant information sent by the input device bound to it.
  • input devices include A, B, and C.
  • Execution component 1 corresponds to input device A.
  • Execution component 1 receives the trigger request sent by input device A, and can perform trigger signal collision detection, event handling, and so on based on the trigger request.
  • the execution component 101 receives the trigger request sent by the corresponding input device, performs trigger signal collision detection and event processing based on the trigger request, and displays the processing results in the UI, which solves the problem of how the UI responds correctly when multiple input devices interact with it in an XR scenario. Here, the state of the execution component 101 is the active state, and the state of the other execution components is the inactive state. In this way, only one input device is active (can trigger events) at the same time, so that the UI only responds to one trigger at a time.
  • the above-mentioned interactive system architecture may include a management system 102, and the execution component 101 may interact with the management system 102 to manage the interaction between multiple input devices and the UI.
  • the management system 102 can receive each trigger request from each input device. If the user sends a trigger request through a certain input device, the management system 102 receives the trigger request and manages multiple execution components according to it, for example, setting the status of the execution component corresponding to that input device to the active state and setting the status of the other execution components to the inactive state, so that only one input device is active at the same time and the UI only responds to one trigger at a time. The management system 102 can then send the trigger request to the execution component corresponding to that input device.
  • that execution component is the above-mentioned execution component 101, and the execution component 101 performs trigger signal collision detection and event processing based on the trigger request, ensuring correct, fast and complete interaction between multiple input devices and the UI in XR scenarios.
  • Figure 2 is a schematic flow chart of an interaction method provided by an embodiment of the present disclosure.
  • the execution subject of this embodiment takes the execution component in Figure 1 as an example.
  • the specific execution subject can be determined according to the actual application scenario, and is not specifically limited in this embodiment of the present disclosure.
  • the execution component mentioned above in Figure 2 can be understood as a target execution component.
  • the target execution component is one of multiple execution components, and one execution component corresponds to one input device.
  • the interaction method provided by the embodiment of the present disclosure may include the following steps:
  • S201 Receive a trigger request.
  • the trigger request is sent by the target input device.
  • the above-mentioned target execution component corresponds to the target input device.
  • the status of the above-mentioned target execution component is an active state, and the status of each remaining execution component among the above-mentioned multiple execution components, except the target execution component, is an inactive state.
  • the target input device can be any input device.
  • the input device corresponds to an execution component, that is, it is bound to an execution component.
  • the execution component processes the information sent by the target input device.
  • the above-mentioned trigger request may be a trigger request that the target execution component receives from the management system, that is, the management system receives the trigger request sent by the target input device and then forwards it to the target execution component, so that the target execution component can accurately process the relevant information sent by its corresponding input device.
  • the above-mentioned management system can manage multiple execution components including the above-mentioned target execution component.
  • for example, the management system controls the forwarding of corresponding information to each execution component, and controls the status of the execution components: the status of the target execution component is set to the active state, and the status of each remaining execution component except the target execution component is set to the inactive state, so that only one input device is active at the same time (events can be triggered) and the UI only responds to one trigger at a time.
  • the trigger request may carry the identity of the target input device.
  • the two input devices are a left controller and a right controller respectively, and the target input device is a right controller, then the above trigger request can carry the identity of the right controller.
  • the above management system receives each trigger request from each input device.
  • the management system can determine the target execution component corresponding to the target input device according to the identity of the target input device carried in the trigger request, then set the status of that target execution component to the active state and set the status of each remaining managed execution component, except the target execution component, to the inactive state, so that only one input device is active at the same time (can trigger events) and the UI only responds to one trigger at a time, meeting actual application needs.
  • the above-mentioned management system can pre-store the correspondence between the identities of input devices and execution components. In this way, after receiving the trigger request from the target input device, it can determine, based on that correspondence and the identity of the target input device carried by the trigger request, the target execution component corresponding to the target input device, which is simple and convenient.
  • the above-mentioned identity identifier can be the name, number or other information of the input device that uniquely identifies the device.
  • if the input device corresponding to an inactive execution component is triggered, for example, the execution component corresponding to the left handle is in the inactive state, the execution component corresponding to the right handle is active, and the user operates the left handle, the management system will first cancel and clear all trigger events of the currently active handle (the right handle), and then set the execution component corresponding to the left handle to the active state so that it responds to the corresponding trigger. This meets actual application needs, and the user has no awareness of which handle is active, improving the user experience.
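  • The following is a minimal, non-normative sketch of the single-active-device bookkeeping described above, assuming a simple in-memory mapping from device identity to execution component; the class and method names (ManagementSystem, ExecutionComponent, cancel_all_trigger_events, and so on) are illustrative assumptions rather than the disclosure's actual interfaces:

```python
from dataclasses import dataclass


@dataclass
class TriggerRequest:
    device_id: str               # identity of the input device that sent the request
    action_type: str = "click"
    is_activation: bool = True   # activation identifier: True while e.g. a handle button is held


class ExecutionComponent:
    def __init__(self, device_id: str):
        self.device_id = device_id
        self.active = False
        self.pending = []        # cached trigger requests (the preset action queue)

    def cancel_all_trigger_events(self):
        self.pending.clear()

    def receive(self, request: TriggerRequest):
        self.pending.append(request)


class ManagementSystem:
    """Keeps the device-identity -> execution-component mapping and the single-active rule."""

    def __init__(self, components):
        self.components = components  # pre-stored correspondence: device identity -> component

    def on_trigger_request(self, request: TriggerRequest):
        target = self.components[request.device_id]
        # Cancel and clear the trigger events of any other, currently active component first.
        for component in self.components.values():
            if component is not target and component.active:
                component.cancel_all_trigger_events()
                component.active = False
        target.active = True       # only the target execution component is active at any moment
        target.receive(request)    # forward the trigger request to its bound execution component


# Usage: the right handle is active; a press on the left handle switches activity to it.
system = ManagementSystem({"left": ExecutionComponent("left"),
                           "right": ExecutionComponent("right")})
system.on_trigger_request(TriggerRequest(device_id="right"))
system.on_trigger_request(TriggerRequest(device_id="left"))
```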
  • S202 Cache the above trigger request in the preset action queue, and after receiving the update instruction, perform trigger signal collision detection based on the UI.
  • the above-mentioned target execution component caches the trigger request in the preset action queue. On the one hand, this makes it possible to manage trigger requests with a mutually exclusive relationship and ensure that their action periods do not overlap. On the other hand, within the action period, even if the trigger request is not sent every frame, the execution component is still guaranteed to generate trigger events every frame, so that the UI can completely handle the corresponding events.
  • the above-mentioned trigger signal may be a collision trigger signal, such as a ray.
  • the above-mentioned target execution component can track the position and attitude of the target input device to determine the starting point and direction of the ray, update the position and direction of the ray, and perform ray collision detection in each frame to determine the UI nodes hit by the ray.
  • the target execution component can also store the node where the trigger signal collides with the UI, so as to facilitate subsequent processing of related events based on the stored collided node.
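  • As a rough illustration of S202, the sketch below caches trigger requests in an action queue and redoes ray-against-UI collision detection on every update instruction; the flat ray/rectangle hit test and names such as UINode and on_update are simplifying assumptions, not part of the disclosure:

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class UINode:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        # Treat the node as an axis-aligned rectangle on the UI plane.
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


class TargetExecutionComponent:
    def __init__(self, ui_nodes):
        self.action_queue = deque()  # preset action queue holding cached trigger requests
        self.ui_nodes = ui_nodes
        self.hit_nodes = []          # nodes the trigger signal (ray) collided with this frame

    def receive_trigger_request(self, request):
        self.action_queue.append(request)   # cache the request; it is not executed immediately

    def on_update(self, ray_x: float, ray_y: float):
        """Per-frame update instruction: redo collision detection of the ray against the UI."""
        self.hit_nodes = [node for node in self.ui_nodes if node.contains(ray_x, ray_y)]
        # The collided nodes are stored so later hover/trigger handling can reuse them.


ui = [UINode("button_ok", 0, 0, 1, 1), UINode("slider_volume", 2, 0, 3, 1)]
component = TargetExecutionComponent(ui)
component.receive_trigger_request({"device_id": "right", "action_type": "click"})
component.on_update(0.5, 0.5)
print([node.name for node in component.hit_nodes])  # ['button_ok']
```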
  • S203 After trigger signal collision detection, obtain a trigger request from the above-mentioned preset action queue as a trigger request to be executed according to the preset sequence.
  • the above-mentioned preset sequence can be set according to the actual situation, for example first-in-first-out: after trigger signal collision detection, the target execution component obtains a trigger request from the preset action queue as the trigger request to be executed in first-in-first-out order, meeting various application needs.
  • a trigger request is obtained from the preset action queue as the trigger request to be executed.
  • the execution component can also update the length of the ray rendering, and update the visibility and position of the cursor at the intersection of the ray and the UI.
  • the corresponding hover processing interface is called to process the hover event, and the processing results of the hover processing interface are displayed in the UI, which solves hover event handling when multiple input devices interact with the UI in the XR scene.
  • the above-mentioned hover processing interface includes a hover entry interface, a hover stay interface, and a hover end interface.
  • the above target execution component can compare the nodes where the trigger signal collided with the UI in the previous frame with the nodes where the trigger signal collides with the UI in the current frame. According to the comparison result, for a node that is among the current frame's collided nodes but not among the previous frame's collided nodes, the hover entry interface is called to process the hover event; for a node that is among both the previous frame's and the current frame's collided nodes, the hover stay interface is called to process the hover event; for a node that is among the previous frame's collided nodes but not among the current frame's collided nodes, the hover end interface is called to process the hover event. That is, different events call different interfaces, so that the corresponding event can be successfully processed through the called interface.
  • for example, the nodes where the trigger signal collided with the UI in the previous frame are A, B and C, and the nodes where the trigger signal collides with the UI in the current frame are A, C and D. Then the hover stay interface is called at A and C to handle the hover event, the hover end interface is called at B, and the hover entry interface is called at D.
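  • A minimal sketch of this hover dispatch, expressed as set differences between the previous frame's and the current frame's collided nodes; the interface names hover_enter, hover_stay and hover_end are illustrative placeholders:

```python
def dispatch_hover_events(prev_hits, curr_hits):
    """Map each UI node to the hover interface that would be called for it this frame."""
    calls = {}
    for node in curr_hits - prev_hits:
        calls[node] = "hover_enter"   # hit this frame but not last frame -> hover entry interface
    for node in curr_hits & prev_hits:
        calls[node] = "hover_stay"    # hit in both frames -> hover stay interface
    for node in prev_hits - curr_hits:
        calls[node] = "hover_end"     # hit last frame but not this frame -> hover end interface
    return calls


# Reproduces the example above: previous frame {A, B, C}, current frame {A, C, D}.
print(dispatch_hover_events({"A", "B", "C"}, {"A", "C", "D"}))
# e.g. {'D': 'hover_enter', 'A': 'hover_stay', 'C': 'hover_stay', 'B': 'hover_end'}
```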
  • the target execution component may also determine, before calling the corresponding action execution interface to process the trigger event according to the trigger request to be executed, whether the action triggered by the currently executed trigger request is within its action cycle. If it is not within the action cycle, the action triggered by the currently executed trigger request has been completed or the user has sent an end request, and there will be no mutual exclusion with the action triggered by the trigger request to be executed; in this case, the target execution component can call the corresponding action execution interface to process the trigger event according to the trigger request to be executed.
  • otherwise, the above-mentioned target execution component can further determine whether the action triggered by the trigger request to be executed and the action triggered by the currently executed trigger request are mutually exclusive. If they are not mutually exclusive, the corresponding action execution interface is called to process the trigger event according to the trigger request to be executed; if they are mutually exclusive, the corresponding action execution interface is called to process the trigger event according to the trigger request to be executed only after the execution of the currently executed trigger request is completed.
  • in the former case, the target execution component can process the two trigger requests in parallel to meet application needs; in the latter case, the target execution component processes the first trigger request before processing the other one, ensuring the integrity of the action cycle.
  • when the above-mentioned target execution component determines whether the action triggered by the trigger request to be executed and the action triggered by the currently executed trigger request are mutually exclusive, it can obtain the type of the action triggered by the trigger request to be executed and the type of the action triggered by the currently executed trigger request, and then determine whether the two types are consistent. If they are consistent, it is judged that the action triggered by the trigger request to be executed and the action triggered by the currently executed trigger request are not mutually exclusive; if they are inconsistent, it is judged that the two actions are mutually exclusive. Subsequent steps are then executed based on this judgment, ensuring that subsequent processing proceeds normally.
  • the trigger request may carry the type of the triggered action.
  • the target execution component obtains the type of the action triggered by the trigger request to be executed and the type of the action triggered by the currently executed trigger request from the types of triggered actions carried by the two trigger requests.
  • the trigger request may carry an activation identifier, which is used to indicate whether the action triggered by the trigger request is an activation action.
  • when the above target execution component determines whether the action triggered by the currently executed trigger request is within the action cycle, it can determine, based on the activation identifier carried by the currently executed trigger request, whether the action triggered by that request is an activation action. If it is an activation action, it is judged that the action triggered by the currently executed trigger request is within the action cycle. For example, taking a button on the handle, the user sends a trigger request by pressing the button, and the trigger request carries an activation identifier: when the button is pressed, the activation identifier carried by the trigger request indicates that the action triggered by the trigger request is an activation action; when the button is released, the activation identifier carried by the trigger request indicates that the action triggered by the trigger request is not an activation action.
  • the above-mentioned management system receives the trigger request and sends it to the corresponding execution component, and the execution component determines, based on the activation identifier, whether the action triggered by the trigger request is an activation action, thereby accurately determining whether the action triggered by the trigger request is within the action cycle.
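  • The sketch below illustrates one possible dequeue-and-execute step combining the activation identifier (action cycle) and the type-based mutual-exclusion check described above; all names, and the assumption that "mutually exclusive" simply means "different action type", are illustrative and not taken verbatim from the disclosure:

```python
from collections import deque
from dataclasses import dataclass
from typing import Optional


@dataclass
class TriggerRequest:
    action_type: str      # e.g. "click", "joystick"
    is_activation: bool   # activation identifier carried by the trigger request


class ActionDispatcher:
    def __init__(self):
        self.queue = deque()                           # preset action queue
        self.current: Optional[TriggerRequest] = None  # currently executed trigger request

    def in_action_cycle(self) -> bool:
        # The current action stays within its action cycle while its activation
        # identifier is set (e.g. the handle button is still pressed down).
        return self.current is not None and self.current.is_activation

    def mutually_exclusive(self, pending: TriggerRequest) -> bool:
        # Assumption: same action type -> not mutually exclusive; different type -> exclusive.
        return self.current is not None and pending.action_type != self.current.action_type

    def process_next(self):
        if not self.queue:
            return
        pending = self.queue[0]
        if self.in_action_cycle() and self.mutually_exclusive(pending):
            return                     # defer until the current action cycle has completed
        self.queue.popleft()
        self.current = pending
        self.call_action_interface(pending)  # call the corresponding action execution interface

    def call_action_interface(self, request: TriggerRequest):
        print(f"executing {request.action_type} (activation={request.is_activation})")


dispatcher = ActionDispatcher()
dispatcher.queue.extend([TriggerRequest("click", True), TriggerRequest("joystick", True)])
dispatcher.process_next()  # the click starts its action cycle
dispatcher.process_next()  # joystick is mutually exclusive with click -> deferred
dispatcher.current = TriggerRequest("click", False)  # button released: cycle over
dispatcher.process_next()  # now the deferred joystick request is executed
```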
  • the above-mentioned target execution component caches the received trigger request in a queue and then takes one trigger request out of the queue as the trigger request to be executed. Under the condition of ensuring the integrity of the action cycle, it calls the corresponding action execution interface to process the trigger event and displays the processing result of the action execution interface in the UI.
  • when the above-mentioned target execution component calls the corresponding action execution interface to process the trigger event, it can obtain the type of the action triggered by the trigger request to be executed, determine from that type the corresponding action execution interface to be called, and then call that action execution interface to process the trigger event, ensuring that subsequent processing is carried out accurately.
  • the above-mentioned type may include click, joystick, etc.
  • the above-mentioned target execution component determines the corresponding action execution interface (for click, joystick and other actions) according to the type, and calls that interface to process the click, joystick and other events.
  • the above-mentioned action execution interface may be provided by interactive components, such as ordinary buttons (Button), progress bars (Slider), single-check buttons (Tab), scroll views (ScrollView) and other UI components that carry specific functions; these interfaces implement their respective functional logic, respond to the trigger of the input device, and complete the whole UI interaction link.
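  • A hedged sketch of how the action type carried by a trigger request might be mapped to the interactive component whose action execution interface is called; the Button/Slider stand-ins and the click/joystick mapping are assumptions for illustration only:

```python
class Button:
    def on_trigger(self, event):
        print("button clicked:", event)


class Slider:
    def on_trigger(self, event):
        print("slider moved:", event)


# Assumed registry: which interactive component handles which trigger-action type.
ACTION_INTERFACES = {
    "click": Button(),
    "joystick": Slider(),
}


def handle_trigger(action_type: str, event: dict):
    interface = ACTION_INTERFACES.get(action_type)
    if interface is None:
        return                      # no component registered for this action type
    interface.on_trigger(event)     # the processing result would then be rendered on the UI


handle_trigger("click", {"node": "button_ok"})
handle_trigger("joystick", {"node": "slider_volume", "delta": 0.2})
```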
  • the trigger request sent by the input device is received through the execution component.
  • the execution component corresponds to the input device.
  • the status of the execution component is the active state, and the status of the other execution components is the inactive state, so that only one input device is active at the same time (events can be triggered) and the UI only responds to one trigger at a time.
  • trigger signal collision detection and event processing are performed, and the processing results are displayed on the UI, solving the problem of how the UI responds correctly when multiple input devices interact with it in XR scenarios and ensuring that the UI interaction in the XR scene is correct, fast and complete.
  • after receiving the trigger request sent by the management system, the execution component caches the trigger request in the preset action queue. In this way, on the one hand, trigger requests with a mutually exclusive relationship can be managed so that their action cycles do not overlap; on the other hand, within the action cycle, even if the trigger request is not sent every frame, the execution component is still guaranteed to generate the trigger event every frame, so that the UI can completely process the corresponding event.
  • FIG. 5 shows a flow diagram of another interaction method, that is, the flow of the interaction method from the management system's perspective. The relevant description is as in Figure 2 and will not be repeated here. As shown in Figure 5, the method may include:
  • S501 Receive a trigger request from the target input device.
  • S502 According to the above trigger request, set the status of the target execution component to the active state, and set the status of each remaining execution component among the multiple managed execution components, except the target execution component, to the inactive state, where one execution component corresponds to one input device, and the target execution component corresponds to the target input device.
  • an input device corresponds to an execution component, that is, is bound to an execution component.
  • the above management system can set the status of the target execution component corresponding to the target input device to the active state and set the status of the other execution components to the inactive state, which allows only one input device to be active (able to trigger events) at the same time, so that the UI only responds to one trigger at a time, meeting actual application needs.
  • S503 Send the above-mentioned trigger request to the above-mentioned target execution component.
  • the above-mentioned trigger request is used to instruct the target execution component to cache the trigger request in the preset action queue and, after receiving the update instruction, to perform trigger signal collision detection based on the UI; after trigger signal collision detection, a trigger request is obtained from the preset action queue as the trigger request to be executed according to the preset order; then the corresponding action execution interface is called to process the trigger event, and the processing results of the action execution interface are displayed in the UI.
  • the above-mentioned management system sends the trigger request to the target execution component, so that the target execution component caches the received trigger request in a queue, then takes one trigger request out of the queue as the trigger request to be executed, calls the corresponding action execution interface to process the trigger event, and displays the processing result of the action execution interface in the UI.
  • the embodiment of the present disclosure manages the interaction between multiple input devices and the UI by interacting with multiple execution components through a management system.
  • one execution component corresponds to an input device.
  • after receiving the trigger request of an input device, the management system sets the status of the execution component corresponding to that input device to the active state and sets the status of the other execution components to the inactive state, so that only one input device is active (can trigger events) at the same time; the trigger request is then sent to the corresponding execution component, which performs trigger signal collision detection and event processing and displays the processing results in the UI, ensuring that multiple input devices interact with the UI correctly, quickly and completely.
  • Figure 6 provides a schematic flow chart of yet another interaction method according to an embodiment of the present disclosure. This embodiment illustrates the flow of the interaction method from the perspective of interaction between the management system and the execution component. As shown in Figure 6 , the method may include:
  • S601 The management system receives a trigger request from the target input device.
  • based on the above trigger request, the management system sets the status of the target execution component to the active state and sets the status of each remaining managed execution component, except the target execution component, to the inactive state, where one execution component corresponds to one input device and the target execution component corresponds to the target input device.
  • the above-mentioned target execution component receives the above-mentioned trigger request, caches the above-mentioned trigger request in the preset action queue, and after receiving the update instruction, performs trigger signal collision detection based on the UI.
  • the above-mentioned target execution component caches the trigger request in the preset action queue. On the one hand, this makes it possible to manage trigger requests with a mutually exclusive relationship and ensure that their action periods do not overlap. On the other hand, within the action period, even if the trigger request is not sent every frame, the execution component is still guaranteed to generate trigger events every frame.
  • after trigger signal collision detection, the above-mentioned target execution component obtains a trigger request from the preset action queue as the trigger request to be executed according to a preset sequence.
  • the target execution component can also, according to the nodes where the trigger signal collided with the UI in the previous frame and the nodes where the trigger signal collides with the UI in the current frame as determined by trigger signal collision detection, call the corresponding hover processing interface to process the hover event, and display the processing result of the hover processing interface in the UI.
  • the above-mentioned hover processing interface includes a hover entry interface, a hover stay interface, and a hover end interface.
  • the above target execution component can compare the nodes where the trigger signal collided with the UI in the previous frame with the nodes where the trigger signal collides with the UI in the current frame. According to the comparison result, for a node that is among the current frame's collided nodes but not among the previous frame's collided nodes, the hover entry interface is called to process the hover event; for a node that is among both frames' collided nodes, the hover stay interface is called to process the hover event; for a node that is among the previous frame's collided nodes but not among the current frame's collided nodes, the hover end interface is called to process the hover event.
  • the above-mentioned target execution component calls the corresponding action execution interface to process the trigger event according to the above-mentioned trigger request to be executed, and displays the processing result of the action execution interface on the UI.
  • the above-mentioned target execution component can first determine whether the action triggered by the currently executed trigger request is within the action cycle. If it is not within the action cycle, the corresponding action execution interface is called to process the trigger event according to the trigger request to be executed.
  • otherwise, the above-mentioned target execution component can further determine whether the action triggered by the trigger request to be executed and the action triggered by the currently executed trigger request are mutually exclusive. If they are not mutually exclusive, the corresponding action execution interface is called to process the trigger event according to the trigger request to be executed; if they are mutually exclusive, the corresponding action execution interface is called to process the trigger event according to the trigger request to be executed after the execution of the currently executed trigger request is completed.
  • the target execution component can obtain the type of the action triggered by the trigger request to be executed and the type of the action triggered by the currently executed trigger request, and determine whether the two actions are mutually exclusive by checking whether the two types are consistent: if the types are consistent, it is judged that the action triggered by the trigger request to be executed and the action triggered by the currently executed trigger request are not mutually exclusive; otherwise, it is judged that the two actions are mutually exclusive.
  • the trigger request carries an activation identifier, which is used to indicate whether the action triggered by the trigger request is an activation action.
  • when the above target execution component determines whether the action triggered by the currently executed trigger request is within the action cycle, it can determine, based on the activation identifier carried by the currently executed trigger request, whether the action triggered by that request is an activation action. If it is an activation action, it is judged that the action triggered by the currently executed trigger request is within the action cycle.
  • when the target execution component calls the corresponding action execution interface to process the trigger event, it can obtain the type of the action triggered by the trigger request to be executed, determine from that type the corresponding action execution interface to be called, and then call that action execution interface to process the trigger event.
  • the embodiments of the present disclosure use a management system to interact with multiple execution components to manage the interaction between multiple input devices and UI.
  • one execution component corresponds to an input device.
  • after receiving the trigger request of an input device, the management system sets the status of the execution component corresponding to that input device to the active state and sets the status of the other execution components to the inactive state, so that only one input device is active (can trigger events) at a time and the UI only responds to one trigger at a time. Then, the management system sends the trigger request to the execution component corresponding to the input device.
  • after receiving the trigger request sent by the management system, the execution component caches it in the preset action queue, so that, on the one hand, trigger requests with a mutually exclusive relationship can be managed and their action cycles do not overlap; on the other hand, within the action cycle, even if the trigger request is not sent every frame, the execution component is still guaranteed to generate a trigger event every frame.
  • the execution component performs trigger signal collision detection and event processing, and displays the processing results in the UI to ensure that UI interaction in the XR scene is correct, fast and complete.
  • FIG. 7 is a schematic structural diagram of an interaction device provided by an embodiment of the present disclosure.
  • the interactive device 70 includes: a receiving module 701, a processing module 702, an obtaining module 703, and a calling module 704.
  • the interactive device here may be the above-mentioned target execution component itself, or a chip or integrated circuit that implements the functions of the target execution component. It should be noted that the division into a receiving module, a processing module, an acquisition module and a calling module is only a division of logical functions; physically, these modules can be integrated or independent.
  • the receiving module 701 is used to receive a trigger request, where the trigger request is sent by a target input device, the target execution component corresponds to the target input device, the state of the target execution component is an active state, and the status of each remaining execution component among the multiple execution components, except the target execution component, is an inactive state.
  • the processing module 702 is configured to cache the trigger request in the preset action queue, and after receiving the update instruction, perform trigger signal collision detection based on the UI.
  • the acquisition module 703 is configured to acquire a trigger request from the preset action queue as a trigger request to be executed according to a preset order after trigger signal collision detection.
  • the calling module 704 is configured to call the corresponding action execution interface to process the trigger event according to the trigger request to be executed, and display the processing result of the action execution interface on the UI.
  • the calling module 704 is specifically used to:
  • the corresponding action execution interface is called to process the trigger event according to the trigger request to be executed.
  • the calling module 704 is specifically used to:
  • the corresponding action execution interface is called to process the trigger event according to the to-be-executed trigger request;
  • if the action triggered by the to-be-executed trigger request is mutually exclusive with the action triggered by the currently-executed trigger request, then after the execution of the currently-executed trigger request is completed, the corresponding action execution interface is called according to the to-be-executed trigger request to process the trigger event.
  • the trigger request carries an activation identifier
  • the activation identifier is used to indicate whether the action triggered by the trigger request is an activation action.
  • the calling module 704 is specifically used for:
  • if the action triggered by the currently executed trigger request is an activation action, it is determined that the action triggered by the currently executed trigger request is within the action period.
  • the calling module 704 is specifically used to:
  • the acquisition module 703 acquires a trigger request from the preset action queue in accordance with a preset order as a trigger request to be executed, and is also used for:
  • calling the corresponding hover processing interface to process the hover event and displaying the processing results of the hover processing interface on the UI.
  • the hover processing interface includes a hover entry interface, a hover stay interface, and a hover end interface;
  • the acquisition module 703 is specifically used for:
  • the calling module 704 is specifically used to:
  • the corresponding action execution interface is called to process the trigger event.
  • the target execution component interacts with the management system.
  • the receiving module 701 is specifically used for:
  • the trigger request is sent by the management system after the management system receives it from the target input device.
  • the trigger request is used to instruct the management system to set the status of the target execution component to an active state, and to set the status of each remaining execution component among the plurality of execution components, except the target execution component, to an inactive state.
  • the device provided by the embodiment of the present disclosure can be used to execute the technical solution of the method embodiment shown in Figures 2 to 4. Its implementation principles and technical effects are similar, and the embodiments of the present disclosure will not be repeated here.
  • FIG. 8 schematically provides a possible basic hardware architecture diagram of the target execution component described in this disclosure.
  • the target execution component includes at least one processor 801 and a communication interface 803 . Further optionally, a memory 802 and a bus 804 may also be included.
  • the number of processors 801 in the target execution component may be one or more, and FIG. 8 only illustrates one of the processors 801 .
  • the processor 801 can be a central processing unit (CPU), a graphics processing unit (GPU) or a digital signal processor (DSP). If the target execution component has multiple processors 801, the multiple processors 801 may be of different types, or may be the same. Optionally, multiple processors 801 of the target execution component can also be integrated into a multi-core processor.
  • the memory 802 stores computer instructions and data; the memory 802 can store the computer instructions and data required to implement the above-mentioned interactive method provided by the present disclosure.
  • the memory 802 stores instructions for implementing the steps of the above-mentioned interactive method.
  • the memory 802 can be any one or any combination of the following storage media: non-volatile memory (such as read-only memory (Read-Only Memory, ROM), solid state disk (Solid State Disk, SSD), hard disk drive (Hard Disk Drive, HDD), or optical disc) and volatile memory.
  • Communication interface 803 may provide information input/output to the at least one processor. It may also include any one or any combination of the following devices: network interfaces (such as Ethernet interfaces), wireless network cards and other devices with network access functions.
  • the communication interface 803 can also be used for data communication between the target execution component and other computing devices or terminals.
  • Figure 8 uses a thick line to represent bus 804.
  • Bus 804 may connect processor 801 with memory 802 and communication interface 803 .
  • the processor 801 can access the memory 802, and can also use the communication interface 803 to interact with other computing devices or terminals.
  • the target execution component executes the computer instructions in the memory 802, so that the target execution component implements the above-mentioned interaction method provided by the present disclosure, or causes the target execution component to deploy the above-mentioned interaction device.
  • the memory 802 may include a receiving module 701, a processing module 702, an obtaining module 703, and a calling module 704.
  • the inclusion here only relates to the functions of the receiving module, the processing module, the acquiring module and the calling module that can be realized respectively when the instructions stored in the memory are executed, and is not limited to the physical structure.
  • the present disclosure provides a computer-readable storage medium storing computer instructions, where the computer instructions instruct a computing device to execute the above interaction method provided by the present disclosure.
  • the present disclosure provides a computer program product, which includes computer instructions, and the computer instructions are used by a processor to execute the above interaction method.
  • the present disclosure provides a computer program.
  • the computer program is stored in a readable storage medium.
  • at least one processor of an electronic device can read the computer program from the readable storage medium, and the at least one processor executes the computer program, causing the electronic device to execute the method provided in any of the above embodiments.
  • the present disclosure provides a chip including at least one processor and a communication interface, the communication interface providing information input and/or output to the at least one processor. Further, the chip may also include at least one memory, which is used to store computer instructions. The at least one processor is used to call and run the computer instructions to execute the above interaction method provided by the present disclosure.
  • the disclosed devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in various embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
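
As a rough illustration of the module decomposition noted above (receiving module 701, processing module 702, obtaining module 703, and calling module 704), the following Python sketch shows one possible way those functions could be organized when the stored instructions are executed. All class, method, and attribute names here are illustrative assumptions and are not part of the original disclosure.

```python
from collections import deque


class InteractionApparatus:
    """Minimal sketch of an interaction apparatus; not the actual implementation."""

    def __init__(self, ui):
        self.ui = ui                      # assumed UI object with a display() method
        self.action_queue = deque()       # preset action queue
        self.handlers = {}                # action type -> action execution interface

    def receive(self, trigger_request):
        """Receiving module 701: accept a trigger request from the bound input device."""
        self.process(trigger_request)

    def process(self, trigger_request):
        """Processing module 702: cache the request; collision detection runs on update."""
        self.action_queue.append(trigger_request)

    def obtain(self):
        """Obtaining module 703: take one pending request in preset (FIFO) order."""
        return self.action_queue.popleft() if self.action_queue else None

    def call(self, pending_request):
        """Calling module 704: invoke the execution interface and display its result."""
        handler = self.handlers.get(pending_request.get("action_type"))
        if handler is not None:
            self.ui.display(handler(pending_request))
```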

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides an interaction method, apparatus and storage medium. In the method, an execution component receives a trigger request sent by an input device, where the execution component corresponds to the input device, the state of the execution component is an active state, and the states of the remaining execution components are inactive states, so that only one input device is active (able to trigger events) at any given moment and the UI responds to only one trigger at a time. Then, based on the trigger request, trigger-signal collision detection and event processing are performed, and the processing result is displayed on the UI. This solves the problem of how the UI should respond correctly when multiple input devices interact with the UI in an XR scene, and ensures that UI interaction in the XR scene is correct, fast and complete.

Description

交互方法、装置及存储介质
相关申请的交叉引用
本申请要求于2022年3月16日提交中国专利局、申请号为202210260731.4、申请名称为“交互方法、装置及存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本公开涉及虚拟现实技术,尤其涉及一种交互方法、装置及存储介质。
背景技术
虚拟现实(Virtual Reality,VR)技术是利用现代计算机技术创建的虚拟环境,用户可以使用特定的人机交互设备和装置与虚拟环境进行互动,产生身临其境的感受。扩展现实(Extended Reality,XR)是虚拟现实技术的进一步发展,扩展现实是指通过计算机技术和可穿戴设备产生一个真实与虚拟组合的、可人机交互的环境。
与虚拟物体的交互是XR世界必不可少的环节,按照距离划分为两种交互方式:一种是近场交互,用户使用手指点击或手柄等接触到物体;另一种是远场交互,用户一般使用射线对虚拟物体进行碰撞检测,进一步完成整个触发周期。
其中,对于远场交互,在XR场景下可能会出现不止一个输入设备同时存在的情况,而场景中的用户界面(User Interface,UI)同一时刻只能响应一个触发。因此如何在XR场景下管理多个输入设备与UI的交互成为一个急需解决的问题。
发明内容
本公开提供一种交互方法、装置及存储介质,以在XR场景下管理多个输入设备与UI的交互。
第一方面,本公开实施例提供一种交互方法,应用于目标执行组件,所述目标执行组件为多个执行组件中的一个,一个执行组件对应一个输入设备,所述方法包括:
接收触发请求,所述触发请求是目标输入设备发送的,所述目标执行组件与所述目标输入设备对应,所述目标执行组件的状态为活跃状态,所述多个执行组件中除所述目标执行组件外剩余的各个执行组件的状态为非活跃状态;
将所述触发请求缓存在预设动作队列中,并在接收到更新指令后,基于UI,进行触发信号碰撞检测;
在触发信号碰撞检测后,按照预设顺序,从所述预设动作队列中获取一触发请求作为待执行触发请求;
根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,并在所述UI显示所述动作执行接口的处理结果。
在一种可能的实现方式中,在所述根据所述待执行触发请求,调用相应的动作执行接口处理触发事件之前,还包括:
判断当前执行的触发请求触发的动作是否在动作周期内;
所述根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,包括:
若所述当前执行的触发请求触发的动作不在动作周期内,则根据所述待执行触发请求,调用相应的动作执行接口处理触发事件。
在一种可能的实现方式中,在所述判断当前执行的触发请求触发的动作是否在动作周期内之后,还包括:
若所述当前执行的触发请求触发的动作在动作周期内,则判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作是否互斥;
所述根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,包括:
若所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作不互斥,则根据所述待执行触发请求,调用相应的动作执行接口处理触发事件;
若所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作互斥,则在所述当前执行的触发请求执行完成后,根据所述待执行触发请求,调用相应的动作执行接口处理触发事件。
在一种可能的实现方式中,触发请求携带激活标识,所述激活标识用于表示触发请求触发的动作是否为激活动作;
所述判断当前执行的触发请求触发的动作是否在动作周期内,包括:
根据所述当前执行的触发请求携带的激活标识,确定所述当前执行的触发请求触发的动作是否为激活动作;
若所述当前执行的触发请求触发的动作为激活动作,则判断所述当前执行的触发请求触发的动作在动作周期内。
在一种可能的实现方式中,所述判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作是否互斥,包括:
获取所述待执行触发请求触发的动作的类型,以及所述当前执行的触发请求触发的动作的类型;
判断所述待执行触发请求触发的动作的类型与所述当前执行的触发请求触发的动作的类型是否一致;
若所述待执行触发请求触发的动作的类型与所述当前执行的触发请求触发的动作的类型一致,则判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作不互斥。
在一种可能的实现方式中,在所述触发信号碰撞检测后,所述按照预设顺序,从所述预设动作队列中,获取一触发请求作为待执行触发请求之前,还包括:
根据触发信号碰撞检测到的上一帧中触发信号与所述UI碰撞到的节点,以及当前帧中触发信号与所述UI碰撞到的节点,调用相应的悬停处理接口处理悬停事件,并在所述UI显示所述悬停处理接口的处理结果。
在一种可能的实现方式中,所述悬停处理接口包括悬停进入接口、悬停停留接口和悬停结束接口;
所述根据触发信号碰撞检测到的上一帧中触发信号与所述UI碰撞到的节点,以及当前帧中触发信号与所述UI碰撞到的节点,调用相应的悬停处理接口处理悬停事件,包括:
将所述上一帧中触发信号与所述UI碰撞到的节点,与所述当前帧中触发信号与所述UI碰撞到的节点进行比较;
根据比较结果中,没有在所述上一帧中触发信号与所述用户界面碰撞到的节点中,但在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停进入接口处理悬停事件;
根据所述比较结果中,在所述上一帧中触发信号与所述用户界面碰撞到的节点中,且在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停停留接口处理悬停事件;
根据所述比较结果中,在所述上一帧中触发信号与所述用户界面碰撞到的节点中,但没有在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停结束接口处理悬停事件。
在一种可能的实现方式中,所述根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,包括:
获取所述待执行触发请求触发的动作的类型;
根据所述待执行触发请求触发的动作的类型,确定待调用的相应的动作执行接口;
调用所述相应的动作执行接口处理触发事件。
在一种可能的实现方式中,所述目标执行组件与管理系统进行交互;
所述接收触发请求,包括:
接收所述管理系统发送的所述触发请求,所述触发请求是所述管理系统接收所述目标输入设备发送的,所述触发请求用于指示所述管理系统将所述目标执行组件的状态设置为活跃状态,并将所述多个执行组件中除与所述目标执行组件外剩余的各个执行组件的状态设置为非活跃状态。
在一种可能的实现方式中,所述触发信号为射线。
第二方面,本公开实施例提供一种交互装置,应用于目标执行组件,所述目标执行组件为多个执行组件中的一个,一个执行组件对应一个输入设备,所述装置包括:
接收模块,用于接收触发请求,所述触发请求是目标输入设备发送的,所述目标执行组件与所述目标输入设备对应,所述目标执行组件的状态为活跃状态,所述多个执行组件中除所述目标执行组件外剩余的各个执行组件的状态为非活跃状态;
处理模块,用于将所述触发请求缓存在预设动作队列中,并在接收到更新指令后,基于UI,进行触发信号碰撞检测;
获取模块,用于在触发信号碰撞检测后,按照预设顺序,从所述预设动作队列中获取一触发请求作为待执行触发请求;
调用模块,用于根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,并在所述UI显示所述动作执行接口的处理结果。
在一种可能的实现方式中,所述调用模块,具体用于:
判断当前执行的触发请求触发的动作是否在动作周期内;
若所述当前执行的触发请求触发的动作不在动作周期内,则根据所述待执行触发请求,调用相应的动作执行接口处理触发事件。
在一种可能的实现方式中,所述调用模块,具体用于:
若所述当前执行的触发请求触发的动作在动作周期内,则判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作是否互斥;
若所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作不互斥,则根据所述待执行触发请求,调用相应的动作执行接口处理触发事件;
若所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作互斥,则在所述当前执行的触发请求执行完成后,根据所述待执行触发请求,调用相应的动作执行接口处理触发事件。
在一种可能的实现方式中,触发请求携带激活标识,所述激活标识用于表示触发请求触发的动作是否为激活动作;
所述调用模块,具体用于:
根据所述当前执行的触发请求携带的激活标识,确定所述当前执行的触发请求触发的动作是否为激活动作;
若所述当前执行的触发请求触发的动作为激活动作,则判断所述当前执行的触发请求触发的动作在动作周期内。
在一种可能的实现方式中,所述调用模块,具体用于:
获取所述待执行触发请求触发的动作的类型,以及所述当前执行的触发请求触发的动作的类型;
判断所述待执行触发请求触发的动作的类型与所述当前执行的触发请求触发的动作的类型是否一致;
若所述待执行触发请求触发的动作的类型与所述当前执行的触发请求触发的动作的类型一致,则判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作不互斥。
在一种可能的实现方式中,所述获取模块在所述触发信号碰撞检测后,所述按照预设顺序,从所述预设动作队列中,获取一触发请求作为待执行触发请求之前,还用于
根据触发信号碰撞检测到的上一帧中触发信号与所述UI碰撞到的节点,以及当前帧中触发信号与所述UI碰撞到的节点,调用相应的悬停处理接口处理悬停事件,并在所述UI显示所述悬停处理接口的处理结果。
在一种可能的实现方式中,所述悬停处理接口包括悬停进入接口、悬停停留接口和悬停结束接口;
所述获取模块,具体用于:
将所述上一帧中触发信号与所述UI碰撞到的节点,与所述当前帧中触发信号与所述UI碰撞到的节点进行比较;
根据比较结果中,没有在所述上一帧中触发信号与所述用户界面碰撞到的节点中,但在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停进入接口处理悬停事件;
根据所述比较结果中,在所述上一帧中触发信号与所述用户界面碰撞到的节点中,且在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停停留接口处理悬停事件;
根据所述比较结果中,在所述上一帧中触发信号与所述用户界面碰撞到的节点中,但没有在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停结束接口处理悬停事件。
在一种可能的实现方式中,所述调用模块,具体用于:
获取所述待执行触发请求触发的动作的类型;
根据所述待执行触发请求触发的动作的类型,确定待调用的相应的动作执行接口;
调用所述相应的动作执行接口处理触发事件。
在一种可能的实现方式中,所述目标执行组件与管理系统进行交互。
所述接收模块,具体用于:
接收所述管理系统发送的所述触发请求,所述触发请求是所述管理系统接收所述目标输入设备发送的,所述触发请求用于指示所述管理系统将所述目标执行组件的状态设置为活跃状态,并将所述多个执行组件中除与所述目标执行组件外剩余的各个执行组件的状态设置为非活跃状态。
在一种可能的实现方式中,所述触发信号为射线。
第三方面,本公开实施例提供一种目标执行组件,包括:
处理器;
存储器;以及
计算机程序;
其中,所述计算机程序被存储在所述存储器中,并且被配置为由所述处理器执行,所述计算机程序包括用于执行如第一方面所述的方法的指令。
第四方面,本公开实施例提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序使得服务器执行第一方面所述的方法。
第五方面,本公开实施例提供一种计算机程序产品,包括计算机指令,所述计算机指令被处理器执行第一方面所述的方法。
第六方面,本公开实施例提供一种计算机程序,所述计算机程序存储在可读存储介质中,电子设备的至少一个处理器可以从所述可读存储介质中读取上述计算机程序,所述至少一个处理器执行所述计算机程序,使得所述电子设备执行上述第一方面所述的方法。
本公开实施例提供的交互方法、装置及存储介质,该方法通过执行组件接收输入设备发送的触发请求,该执行组件与输入设备对应,该执行组件的状态为活跃状态,其它执行组件的状态为非活跃状态,使得同一时刻只有一个输入设备是活跃的(可以触发事件),以便同一时刻UI只响应一个触发,然后,根据上述触发请求,执行触发信号碰撞检测和事件处理,并在UI显示处理结果,解决了在XR场景下多个输入设备与UI的交互,UI如何正确响应的问题,保证XR场景中的UI交互正确、快速和完整。而且,本公开实施例中执行组件在接收到管理系统发送的触发请求后,将该触发请求缓存在预设动作队列中,这样一方面可以管理有互斥关系的触发请求,保证两者动作周期不重叠,另一方面在动作周期内,即使触发请求不会每帧发送,也可以保证执行组件每帧去生成触发事件,使得UI能够完整的处理相应事件。
附图说明
为了更清楚地说明本公开实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本公开的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1为本公开实施例提供的交互系统架构示意图;
图2为本公开实施例提供的一种交互方法的流程示意图;
图3为本公开实施例提供的一种悬停事件处理示意图;
图4为本公开实施例提供的一种触发事件处理示意图;
图5为本公开实施例提供的另一种交互方法的流程示意图;
图6为本公开实施例提供的再一种交互方法的流程示意图;
图7为本公开实施例提供的一种交互装置的结构示意图;
图8为本公开实施例提供的一种目标执行组件的基本硬件架构示意图。
具体实施方式
下面将结合本公开实施例中的附图,对本公开实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本公开一部分实施例,而不是全部的实施例。基于本公开中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本公开保护的范围。
本公开的说明书和权利要求书及上述附图中的术语“第一”、“第二”、“第三”及“第四”等(如果存在)是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便这里描述的本公开的实施例能够以除了在这里图示或描述的那些以外的顺序实施。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元的过程、方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
XR技术包含了增强现实(Augmented Reality,AR)、VR、混合现实(Mixed Reality,MR),利用硬件设备结合多种技术手段,将虚拟的内容和真实场景融合。在XR场景下,按照距离划分为两种交互方式:一种是近场交互,用户使用手指点击或手柄等接触到物体;另一种是远场交互,用户一般使用射线对虚拟物体进行碰撞检测,进一步完成整个触发周期。其中,对于后者,XR场景下可能会出现不止一个输入设备同时存在的情况,而场景中的UI同一时刻只能响应一个触发。因此需要一个机制去管理多输入设备对UI的交互。
因此,本公开实施例提出一种交互方法,通过执行组件接收输入设备发送的触发请求,该执行组件与输入设备对应,该执行组件的状态为活跃状态,其它执行组件的状态为非活跃状态,使得同一时刻只有一个输入设备是活跃的(可以触发事件),以便同一时刻UI只响应一个触发,然后根据上述触发请求,执行触发信号碰撞检测和事件处理,并 在UI显示处理结果,保证XR场景中多个输入设备与UI交互的正确、快速和完整。
可选地,本公开实施例提供的交互方法可以应用于如图1所示的交互系统中。在图1中,该交互系统架构可以包括执行组件101。其中,执行组件101为多个执行组件中的一个,一个执行组件对应一个输入设备。这里,对应可以理解一个执行组件与一个输入设备绑定,该执行组件只处理与其绑定的输入设备发送的有关信息。如执行组件有1、2和3,输入设备有A、B和C,执行组件1对应输入设备A,执行组件1接收输入设备A发送的触发请求,并可以基于该触发请求,执行触发信号碰撞检测和事件处理等。
在具体实现过程中,执行组件101接收对应的输入设备发送的触发请求,基于上述触发请求,执行触发信号碰撞检测和事件处理,并在UI显示处理结果,解决了在XR场景下多个输入设备与UI的交互,UI如何正确响应的问题。其中,执行组件101的状态为活跃状态,其它执行组件的状态为非活跃状态,这样,使得同一时刻只有一个输入设备是活跃的(可以触发事件),以便同一时刻UI只响应一个触发。
另外,上述交互系统架构可以包括管理系统102,执行组件101可以与管理系统102进行交互,以管理多个输入设备与UI的交互。
其中,管理系统102可以接收各输入设备的各触发请求。如用户通过某一输入设备发送触发请求,管理系统102接收该触发请求,根据该触发请求,对多个执行组件进行管理,如将与上述输入设备对应的执行组件的状态设置为活跃状态,其它执行组件的状态设置为非活跃状态等,使得同一时刻只有一个输入设备是活跃的,以便同一时刻UI只响应一个触发。然后,管理系统102还可以将上述触发请求发送给与上述输入设备对应的执行组件,如该执行组件为上述执行组件101,由执行组件101基于上述触发请求,执行触发信号碰撞检测和事件处理等,保证XR场景中多个输入设备与UI交互的正确、快速和完整。
需要进行说明的是,本公开实施例描述的系统架构以及业务场景是为了更加清楚的说明本公开实施例的技术方案,并不构成对于本公开实施例提供的技术方案的限定,本领域普通技术人员可知,随着系统架构的演变和新业务场景的出现,本公开实施例提供的技术方案对于类似的技术问题,同样适用。
下面以几个实施例为例对本公开的技术方案进行描述,对于相同或相似的概念或过程可能在某些实施例不再赘述。
图2为本公开实施例提供的一种交互方法的流程示意图,本实施例的执行主体以图1中的执行组件为例,具体执行主体可以根据实际应用场景确定,本公开实施例对此不做特别限制。为了方便描述,在图2中上述执行组件可以理解为目标执行组件,该目标执行组件为多个执行组件中的一个,一个执行组件对应一个输入设备。如图2所示,本公开实施例提供的交互方法可以包括如下步骤:
S201:接收触发请求,该触发请求是目标输入设备发送的,上述目标执行组件与目标输入设备对应,上述目标执行组件的状态为活跃状态,上述多个执行组件中除上述目标执行组件外剩余的各个执行组件的状态设置为非活跃状态。
这里,用户可以通过目标输入设备发送触发请求,上述目标输入设备可以为任意一个输入设备,该输入设备对应一个执行组件,即与一个执行组件绑定,由该执行组件处理上述目标输入设备发送的有关信息。
其中,上述触发请求可以为上述目标执行组件接收的管理系统发送的触发请求,即管理系统接收目标输入设备发送的触发请求,然后,将该触发请求转发给上述目标执行组件,以使目标执行组件能够准确处理对应的输入设备发送的有关信息。这里,上述管理系统可以对包括上述目标执行组件在内的多个执行组件进行管理,如上述控制转发相应的信息至每一执行组件,又如控制执行组件的状态,将上述目标执行组件的状态设置为活跃状态,并将除与上述目标执行组件外剩余的各个执行组件的状态设置为非活跃状态,使得同一时刻只有一个输入设备是活跃的(可以触发事件),以便同一时刻UI只响应一个触发。
这里,上述触发请求可以携带上述目标输入设备的身份标识。以两个输入设备为例,假如两个输入设备分别为左手柄和右手柄,上述目标输入设备为右手柄,则上述触发请求可以携带右手柄的身份标识。上述管理系统接收各输入设备的各触发请求。在接收到上述目标输入设备的触发请求后,上述管理系统可以根据该触发请求携带的目标输入设备的身份标识,确定与上述目标输入设备对应的目标执行组件,进而,将目标执行组件的状态设置为活跃状态,并将所管理的多个执行组件中,除与目标执行组件外剩余的各个执行组件的状态设置为非活跃状态,使得同一时刻只有一个输入设备是活跃的(可以触发事件),以便同一时刻UI只响应一个触发,满足实际应用需要。
在本公开实施例中,上述管理系统可以预存输入设备的身份标识与执行组件的对应关系,这样,在接收到上述目标输入设备的触发请求后,可以根据上述对应关系,以及上述触发请求携带的目标输入设备的身份标识,确定与上述目标输入设备对应的目标执行组件,简单、方便。其中,上述身份标识可以为输入设备的名称、编号等可以唯一标识设备身份的信息。
另外,与非活跃状态的执行组件对应的输入设备被触发,如上述左手柄对应的执行组件处于非活跃状态,右手柄对应的执行组件处于活跃状态,当左手柄被触发,上述管理系统会先取消并清空当前活跃手柄(右手柄)的所有触发事件,同时将左手柄对应的执行组件设置为活跃状态,以响应相应的触发,满足实际应用需要,且用户对手柄是否活跃无感知,提高用户体验。
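
The paragraphs above describe the management system mapping the identity carried in a trigger request to the execution component bound to that input device, setting that component to the active state, setting the remaining components to the inactive state, and first cancelling and clearing the trigger events of the previously active handle. The Python sketch below illustrates that switching logic under those assumptions; names such as ManagementSystem, ExecutionComponent, and device_id are invented for illustration only.

```python
class ExecutionComponent:
    def __init__(self, name):
        self.name = name
        self.active = False
        self.pending_events = []

    def clear_events(self):
        # Cancel and clear all trigger events of this component.
        self.pending_events.clear()


class ManagementSystem:
    def __init__(self, binding):
        # binding: input-device identity -> the execution component bound to it
        self.binding = binding

    def on_trigger_request(self, request):
        target = self.binding[request["device_id"]]
        for component in self.binding.values():
            if component is not target and component.active:
                component.clear_events()       # drop events of the previously active device
                component.active = False       # only one component stays active at a time
        target.active = True
        target.pending_events.append(request)  # forward the request to the bound component


# Example: the right controller is active; triggering the left controller switches control.
left, right = ExecutionComponent("left"), ExecutionComponent("right")
system = ManagementSystem({"left_handle": left, "right_handle": right})
system.on_trigger_request({"device_id": "right_handle", "action_type": "click"})
system.on_trigger_request({"device_id": "left_handle", "action_type": "click"})
```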
S202:将上述触发请求缓存在预设动作队列中,并在接收到更新指令后,基于UI,进行触发信号碰撞检测。
这里,上述目标执行组件将上述触发请求缓存在预设动作队列中,一方面可以管理有互斥关系的触发请求,保证两者动作周期不重叠,另一方面在动作周期内,即使触发请求不会每帧发送,也可以保证执行组件每帧去生成触发事件,使得UI能够完整的处理相应事件。
在本公开实施例中,上述触发信号可以为碰撞触发信号,如射线。上述目标执行组件在接收到更新指令后,可以对目标输入设备的位置和姿态进行跟踪,从而确定射线的起点和方向,更新射线的位置方向等,并在每一帧进行射线碰撞检测,确定射线与UI碰撞到的节点,并在UI显示碰撞到的节点,以便用户能够了解输入设备的状态。
另外,在上述确定触发信号与UI碰撞到的节点后,上述目标执行组件还可以存储触发信号与UI碰撞到的节点,方便后续基于存储的碰撞到的节点处理相关事件。
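
To make the per-frame ray handling described above more concrete, the sketch below refreshes the ray origin and direction from the tracked pose of the input device on each update instruction, performs collision detection against the UI, displays the hit nodes, and stores them for later hover and trigger processing. The helpers input_device.pose(), ui.raycast(), and ui.show_hits() are assumptions for this sketch only.

```python
def update_frame(executor, input_device, ui):
    """One frame of ray-based collision detection for the active execution component."""
    origin, direction = input_device.pose()    # assumed: tracked position and orientation
    hit_nodes = ui.raycast(origin, direction)  # assumed: nodes the ray collides with
    executor.prev_hits = getattr(executor, "curr_hits", set())
    executor.curr_hits = set(hit_nodes)        # stored for hover/trigger event handling
    ui.show_hits(executor.curr_hits)           # display the collided nodes on the UI
    return executor.curr_hits
```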
S203:在触发信号碰撞检测后,按照预设顺序,从上述预设动作队列中获取一触发 请求作为待执行触发请求。
这里,上述预设顺序可以根据实际情况设置,例如为先入先出,即上述目标执行组件在触发信号碰撞检测后,按照先入先出的顺序,从上述预设动作队列中获取一触发请求作为待执行触发请求,满足多种应用需要。
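
Under the first-in-first-out order mentioned above, each frame would cache any newly received trigger requests and, after collision detection, take exactly one pending request from the preset action queue. A minimal sketch, with illustrative function names:

```python
from collections import deque

action_queue = deque()          # preset action queue

def cache_request(request):
    action_queue.append(request)            # every received trigger request is cached

def next_pending_request():
    # After trigger-signal collision detection, retrieve one request in FIFO order.
    return action_queue.popleft() if action_queue else None
```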
示例性的,以上述触发信号为射线为例,如图3所示,在射线碰撞检测后,按照预设顺序,从预设动作队列中,获取一触发请求作为待执行触发请求之前,上述目标执行组件还可以更新射线渲染的长度,并更新射线与UI交点光标的显隐和位置,在目标执行组件的状态为活跃状态时,根据射线碰撞检测到的上一帧中射线与UI交点光标,以及当前帧中射线与UI交点光标,调用相应的悬停处理接口处理悬停事件,并在UI显示悬停处理接口的处理结果,解决了在XR场景下多个输入设备与UI交互的悬停事件处理。
其中,上述悬停处理接口包括悬停进入接口、悬停停留接口和悬停结束接口。上述目标执行组件可以将上述上一帧中触发信号与UI碰撞到的节点,与当前帧中触发信号与UI碰撞到的节点进行比较,根据比较结果中,没有在上一帧中触发信号与UI碰撞到的节点中,但在当前帧中触发信号与UI碰撞到的节点中的节点,调用悬停进入接口处理悬停事件;根据上述比较结果中,在上一帧中触发信号与UI碰撞到的节点中,且在当前帧中触发信号与UI碰撞到的节点中的节点,调用悬停停留接口处理悬停事件;根据上述比较结果中,在上一帧中触发信号与UI碰撞到的节点中,但没有在当前帧中触发信号与UI碰撞到的节点中的节点,调用悬停结束接口处理悬停事件,即不同事件调用不同的接口处理,以通过调用的接口成功处理相应的事件。例如,上一帧触发信号与UI碰撞到的节点有:A、B、C,当前帧中触发信号与UI碰撞到的节点有:A、C、D,则在A、C调用悬停停留接口处理悬停事件,在B调用悬停结束接口处理悬停事件,在D调用悬停进入接口处理悬停事件。
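
The hover handling above reduces to a set comparison between the nodes hit in the previous frame and those hit in the current frame, with the worked example of {A, B, C} against {A, C, D}. Below is a minimal Python sketch of that classification; the three hover interfaces are assumed to be plain callables with these illustrative names.

```python
def dispatch_hover(prev_hits, curr_hits, on_hover_enter, on_hover_stay, on_hover_exit):
    """Compare last frame's and this frame's hit nodes and call the hover interfaces."""
    prev_hits, curr_hits = set(prev_hits), set(curr_hits)
    for node in curr_hits - prev_hits:   # hit now but not before  -> hover enter
        on_hover_enter(node)
    for node in curr_hits & prev_hits:   # hit in both frames      -> hover stay
        on_hover_stay(node)
    for node in prev_hits - curr_hits:   # hit before but not now  -> hover exit
        on_hover_exit(node)


# With prev = {"A", "B", "C"} and curr = {"A", "C", "D"}: enter fires on D,
# stay fires on A and C, and exit fires on B, matching the example above.
```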
S204:根据上述待执行触发请求,调用相应的动作执行接口处理触发事件,并在UI显示上述动作执行接口的处理结果。
示例性的,如图4所示,上述目标执行组件在根据上述待执行触发请求,调用相应的动作执行接口处理触发事件之前,还可以判断当前执行的触发请求触发的动作是否在动作周期内。如果当前执行的触发请求触发的动作不在动作周期内,说明当前执行的触发请求触发的动作可能执行完成或者用户发送了结束的请求,不会存在与上述待执行触发请求触发的动作互斥的情况,此时上述目标执行组件可以根据上述待执行触发请求,调用相应的动作执行接口处理触发事件。
这里,如果当前执行的触发请求触发的动作在动作周期内,则上述目标执行组件可以进一步判断上述待执行触发请求触发的动作与当前执行的触发请求触发的动作是否互斥,如果不互斥,则根据上述待执行触发请求,调用相应的动作执行接口处理触发事件,如果互斥,则在当前执行的触发请求执行完成后,根据上述待执行触发请求,调用相应的动作执行接口处理触发事件。
其中,对于不互斥的触发请求,上述目标执行组件可以并行处理,满足应用需要。对于互斥的触发请求,上述目标执行组件在处理先触发请求后,再处理另一触发请求,保证动作周期完整性。
在本公开实施例中,上述目标执行组件在判断上述待执行触发请求触发的动作与当 前执行的触发请求触发的动作是否互斥时,可以获取上述待执行触发请求触发的动作的类型,以及当前执行的触发请求触发的动作的类型,进而判断上述待执行触发请求触发的动作的类型与当前执行的触发请求触发的动作的类型是否一致,如果一致,则判断上述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作不互斥,如果不一致,则判断上述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作互斥,从而,基于上述判断结果执行后续步骤,保证后续处理正常进行。其中,触发请求可以携带触发的动作的类型,上述目标执行组件根据上述待执行触发请求携带的触发的动作的类型和当前执行的触发请求携带的触发的动作的类型,获取上述待执行触发请求触发的动作的类型,以及当前执行的触发请求触发的动作的类型。
另外,触发请求可以携带激活标识,该激活标识用于表示触发请求触发的动作是否为激活动作。上述目标执行组件在判断当前执行的触发请求触发的动作是否在动作周期内时,可以根据当前执行的触发请求携带的激活标识,确定当前执行的触发请求触发的动作是否为激活动作,如果当前执行的触发请求触发的动作为激活动作,则判断当前执行的触发请求触发的动作在动作周期内。例如以手柄上的按钮为例,用户通过按下该按钮发送一触发请求,该触发请求携带激活标识,该激活标识用于表示触发请求触发的动作为激活动作(即按钮按下,触发请求携带的激活标识表示触发请求触发的动作为激活动作,按钮抬起,触发请求携带的激活标识表示触发请求触发的动作不为激活动作)。上述管理系统接收上述触发请求,将上述触发请求发送至对应的执行组件,该执行组件根据上述激活标识,确定上述触发请求触发的动作是否为激活动作,从而,能够准确判断上述触发请求触发的动作是否在动作周期内。
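
The checks described in the preceding paragraphs — whether the currently executing request is still inside its action period (judged from the activation flag it carries, e.g. button pressed versus released), whether the pending action's type matches the current action's type (the same type is treated as not mutually exclusive), and deferring mutually exclusive actions until the current one completes — can be sketched as follows. Field names such as "activated" and "action_type" are assumptions for illustration.

```python
def in_action_period(current_request):
    # The activation flag indicates whether the triggered action is an activation
    # action (e.g. button pressed); if so, the action is still inside its action period.
    return current_request is not None and current_request.get("activated", False)


def mutually_exclusive(pending_request, current_request):
    # Actions of the same type are considered not mutually exclusive.
    return pending_request["action_type"] != current_request["action_type"]


def dispatch(pending_request, current_request, execute_now, defer):
    if not in_action_period(current_request):
        execute_now(pending_request)    # nothing conflicting is in progress
    elif not mutually_exclusive(pending_request, current_request):
        execute_now(pending_request)    # non-exclusive requests can be processed in parallel
    else:
        defer(pending_request)          # wait until the current action period completes
```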
在本公开实施例中,上述目标执行组件将接收的触发请求缓存在一个队列中,然后会从该队列中取出一个触发请求作为待执行触发请求处理,在保证动作周期完整性的情况下,调用相应的动作执行接口处理触发事件,并在UI显示动作执行接口的处理结果。
其中,上述目标执行组件在调用相应的动作执行接口处理触发事件时,可以获取上述待执行触发请求触发的动作的类型,然后,根据该类型,确定待调用的相应的动作执行接口,进而,调用相应的动作执行接口处理触发事件,保证后续处理准确进行。这里,上述类型可以包括单击、摇杆等,上述目标执行组件根据该类型,确定单击、摇杆等动作执行接口,从而,调用单击、摇杆等动作执行接口处理单击、摇杆等事件。
这里,上述动作执行接口中可以包括可交互组件,例如普通按钮(Button)、进度条(Slider)、单复选按钮(Tab)、滚动视图(ScrollView)等承载了具体功能的UI组件,以便在上述接口中实现各自的功能逻辑,响应输入设备的触发,打通完整UI交互链路。
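
The two paragraphs above describe determining the action execution interface from the action type carried by the pending request (for example a click or joystick action), and letting interactable components such as Button, Slider, Tab, and ScrollView implement the concrete behaviour behind those interfaces. The hedged sketch below illustrates such a lookup; the handler names and the ui.display() call are invented for this example.

```python
class Button:
    """Illustrative interactable component implementing a click execution interface."""

    def __init__(self, label):
        self.label = label

    def on_click(self, request):
        return f"{self.label} clicked by {request['device_id']}"


def build_handlers(button):
    # Map each action type to the action execution interface to be called.
    return {
        "click": button.on_click,
        "joystick": lambda request: f"joystick moved: {request.get('axis')}",
    }


def handle_pending_request(request, handlers, ui):
    action_type = request["action_type"]    # type carried by the pending trigger request
    handler = handlers.get(action_type)     # determine the corresponding execution interface
    if handler is not None:
        ui.display(handler(request))        # show the processing result on the UI
```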
本公开实施例,通过执行组件接收输入设备发送的触发请求,该执行组件与输入设备对应,该执行组件的状态为活跃状态,其它执行组件的状态为非活跃状态,使得同一时刻只有一个输入设备是活跃的(可以触发事件),以便同一时刻UI只响应一个触发,然后,根据上述触发请求,执行触发信号碰撞检测和事件处理,并在UI显示处理结果,解决了在XR场景下多个输入设备与UI的交互,UI如何正确响应的问题,保证XR场景中的UI交互正确、快速和完整。而且,本公开实施例中执行组件在接收到管理系统发送的触发请求后,将该触发请求缓存在预设动作队列中,这样一方面可以管理有互斥关系的触发请求,保证两者动作周期不重叠,另一方面在动作周期内,即使触发请求不会每 帧发送,也可以保证执行组件每帧去生成触发事件,使得UI能够完整的处理相应事件。
另外,以目标执行组件与管理系统进行交互,管理系统接收目标输入设备发送的触发请求,并将该触发请求转发至上述目标执行组件为例,图5给出了另一种交互方法的流程示意图,即管理系统对应的交互方法的流程,其中相关描述参见图2,此处不再赘述,如图5所示,该方法可以包括:
S501:接收目标输入设备的触发请求。
S502:根据上述触发请求,将目标执行组件的状态设置为活跃状态,并将所管理的多个执行组件中,除与上述目标执行组件外剩余的各个执行组件的状态设置为非活跃状态,其中,一个执行组件对应一个输入设备,上述目标执行组件与上述目标输入设备对应。
这里,一个输入设备对应一个执行组件,即与一个执行组件绑定,这样,上述管理系统可以将与目标输入设备对应的目标执行组件的状态设置为活跃状态,并将其它执行组件的状态设置为非活跃状态,使得同一时刻只有一个输入设备是活跃的(可以触发事件),以便同一时刻UI只响应一个触发,满足实际应用需要。
S503:将上述触发请求发送给上述目标执行组件,上述触发请求用于指示上述目标执行组件将上述触发请求缓存在预设动作队列中,并在接收到更新指令后,基于UI,进行触发信号碰撞检测,在触发信号碰撞检测后,按照预设顺序,从上述预设动作队列中获取一触发请求作为待执行触发请求,根据该待执行触发请求,调用相应的动作执行接口处理触发事件,并在UI显示动作执行接口的处理结果。
其中,上述管理系统在设置执行组件的活跃状态后,将上述触发请求发送给上述目标执行组件,以使上述目标执行组件将接收的触发请求缓存在一个队列中,然后会从该队列中取出一个触发请求作为待执行触发请求处理,在保证动作周期完整性的情况下,调用相应的动作执行接口处理触发事件,并在UI显示动作执行接口的处理结果。
本公开实施例通过管理系统与多个执行组件进行交互,管理多个输入设备与UI的交互。其中,一个执行组件对应一个输入设备,管理系统在接收输入设备的触发请求后,将与输入设备对应的执行组件的状态设置为活跃状态,其它执行组件的状态设置为非活跃状态,使得同一时刻只有一个输入设备是活跃的(可以触发事件),然后将该触发请求发送给对应的执行组件,由该执行组件执行触发信号碰撞检测和事件处理,并在UI显示处理结果,保证XR场景中多个输入设备与UI交互的正确、快速和完整。
另外,图6为本公开实施例提供了再一种交互方法的流程示意图,本实施例从管理系统和执行组件两端交互说明交互方法的流程,如图6所示,该方法可以包括:
S601:管理系统接收目标输入设备的触发请求。
S602:管理系统根据上述触发请求,将目标执行组件的状态设置为活跃状态,并将所管理的多个执行组件中,除与上述目标执行组件外剩余的各个执行组件的状态设置为非活跃状态,其中,一个执行组件对应一个输入设备,上述目标执行组件与上述目标输入设备对应。
S603:将上述触发请求发送给上述目标执行组件。
S604:上述目标执行组件接收上述触发请求,将上述触发请求缓存在预设动作队列中,并在接收到更新指令后,基于UI,进行触发信号碰撞检测。
其中,上述目标执行组件将上述触发请求缓存在预设动作队列中,一方面可以管理有互斥关系的触发请求,保证两者动作周期不重叠,另一方面在动作周期内,即使触发请求不会每帧发送,也可以保证执行组件每帧去生成触发事件。
S605:在触发信号碰撞检测后,上述目标执行组件按照预设顺序,从上述预设动作队列中获取一触发请求作为待执行触发请求。
这里,在所述触发信号碰撞检测后,按照预设顺序,从预设动作队列中,获取一触发请求作为待执行触发请求之前,上述目标执行组件还可以根据触发信号碰撞检测到的上一帧中触发信号与UI碰撞到的节点,以及当前帧中触发信号与UI碰撞到的节点,调用相应的悬停处理接口处理悬停事件,并在UI显示悬停处理接口的处理结果。
其中,上述悬停处理接口包括悬停进入接口、悬停停留接口和悬停结束接口。上述目标执行组件可以将上一帧中触发信号与UI碰撞到的节点,与当前帧中触发信号与UI碰撞到的节点进行比较;根据比较结果中,没有在上一帧中触发信号与UI碰撞到的节点中,但在当前帧中触发信号与UI碰撞到的节点中的节点,调用悬停进入接口处理悬停事件;根据上述比较结果中,在上一帧中触发信号与UI碰撞到的节点中,且在当前帧中触发信号与UI碰撞到的节点中的节点,调用悬停停留接口处理悬停事件;根据上述比较结果中,在上一帧中触发信号与UI碰撞到的节点中,但没有在当前帧中触发信号与UI碰撞到的节点中的节点,调用悬停结束接口处理悬停事件。
S606:上述目标执行组件根据上述待执行触发请求,调用相应的动作执行接口处理触发事件,并在UI显示动作执行接口的处理结果。
其中,上述目标执行组件可以先判断当前执行的触发请求触发的动作是否在动作周期内,如果当前执行的触发请求触发的动作不在动作周期内,则根据上述待执行触发请求,调用相应的动作执行接口处理触发事件。
这里,如果当前执行的触发请求触发的动作在动作周期内,则上述目标执行组件可以进一步判断上述待执行触发请求触发的动作与当前执行的触发请求触发的动作是否互斥,如果不互斥,则根据上述待执行触发请求,调用相应的动作执行接口处理触发事件;如果互斥,则在当前执行的触发请求执行完成后,根据上述待执行触发请求,调用相应的动作执行接口处理触发事件。
在本公开实施例中,上述目标执行组件可以获取上述待执行触发请求触发的动作的类型,以及当前执行的触发请求触发的动作的类型,通过判断两者类型是否一致,确定上述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作是否互斥。如上述待执行触发请求触发的动作的类型与当前执行的触发请求触发的动作的类型一致,则上述目标执行组件判断上述待执行触发请求触发的动作与当前执行的触发请求触发的动作不互斥,否则,判断上述待执行触发请求触发的动作与当前执行的触发请求触发的动作互斥。
另外,触发请求携带激活标识,该激活标识用于表示触发请求触发的动作是否为激活动作。
上述目标执行组件在判断当前执行的触发请求触发的动作是否在动作周期内时,可以根据当前执行的触发请求携带的激活标识,确定当前执行的触发请求触发的动作是否为激活动作,如果当前执行的触发请求触发的动作为激活动作,则判断当前执行的触发 请求触发的动作在动作周期内。
上述目标执行组件在调用相应的动作执行接口处理触发事件时,可以获取上述待执行触发请求触发的动作的类型,然后,根据该类型,确定待调用的相应的动作执行接口,进而,调用相应的动作执行接口处理触发事件。
综上所述,本公开实施例通过管理系统与多个执行组件进行交互,管理多个输入设备与UI的交互。其中,一个执行组件对应一个输入设备,管理系统在接收输入设备的触发请求后,将与输入设备对应的执行组件的状态设置为活跃状态,其它执行组件的状态设置为非活跃状态,使得同一时刻只有一个输入设备是活跃的(可以触发事件),以便同一时刻UI只响应一个触发。然后,管理系统将上述触发请求发送给与输入设备对应的执行组件,执行组件在接收到管理系统发送的触发请求后,将该触发请求缓存在预设动作队列中,这样一方面可以管理有互斥关系的触发请求,保证两者动作周期不重叠,另一方面在动作周期内,即使触发请求不会每帧发送,也可以保证执行组件每帧去生成触发事件。另外,执行组件执行触发信号碰撞检测和事件处理,并在UI显示处理结果,保证XR场景中的UI交互正确、快速和完整。
对应于上文实施例的交互方法,图7为本公开实施例提供的交互装置的结构示意图。为了便于说明,仅示出了与本公开实施例相关的部分。图7为本公开实施例提供的一种交互装置的结构示意图,该交互装置70包括:接收模块701、处理模块702、获取模块703以及调用模块704。这里的交互装置可以是上述目标执行组件本身,或者是实现目标执行组件的功能的芯片或者集成电路。这里需要说明的是,接收模块、处理模块、获取模块以及调用模块的划分只是一种逻辑功能的划分,物理上两者可以是集成的,也可以是独立的。
其中,接收模块701,用于接收触发请求,所述触发请求是目标输入设备发送的,所述目标执行组件与所述目标输入设备对应,所述目标执行组件的状态为活跃状态,所述多个执行组件中除所述目标执行组件外剩余的各个执行组件的状态为非活跃状态。
处理模块702,用于将所述触发请求缓存在预设动作队列中,并在接收到更新指令后,基于UI,进行触发信号碰撞检测。
获取模块703,用于在触发信号碰撞检测后,按照预设顺序,从所述预设动作队列中获取一触发请求作为待执行触发请求。
调用模块704,用于根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,并在所述UI显示所述动作执行接口的处理结果。
在一种可能的实现方式中,所述调用模块704,具体用于:
判断当前执行的触发请求触发的动作是否在动作周期内;
若所述当前执行的触发请求触发的动作不在动作周期内,则根据所述待执行触发请求,调用相应的动作执行接口处理触发事件。
在一种可能的实现方式中,所述调用模块704,具体用于:
若所述当前执行的触发请求触发的动作在动作周期内,则判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作是否互斥;
若所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作不互斥,则根据所述待执行触发请求,调用相应的动作执行接口处理触发事件;
若所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作互斥,则在所述当前执行的触发请求执行完成后,根据所述待执行触发请求,调用相应的动作执行接口处理触发事件。
在一种可能的实现方式中,触发请求携带激活标识,所述激活标识用于表示触发请求触发的动作是否为激活动作。
所述调用模块704,具体用于:
根据所述当前执行的触发请求携带的激活标识,确定所述当前执行的触发请求触发的动作是否为激活动作;
若所述当前执行的触发请求触发的动作为激活动作,则判断所述当前执行的触发请求触发的动作在动作周期内。
在一种可能的实现方式中,所述调用模块704,具体用于:
获取所述待执行触发请求触发的动作的类型,以及所述当前执行的触发请求触发的动作的类型;
判断所述待执行触发请求触发的动作的类型与所述当前执行的触发请求触发的动作的类型是否一致;
若所述待执行触发请求触发的动作的类型与所述当前执行的触发请求触发的动作的类型一致,则判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作不互斥。
在一种可能的实现方式中,所述获取模块703在所述触发信号碰撞检测后,所述按照预设顺序,从所述预设动作队列中,获取一触发请求作为待执行触发请求之前,还用于
根据触发信号碰撞检测到的上一帧中触发信号与所述UI碰撞到的节点,以及当前帧中触发信号与所述UI碰撞到的节点,调用相应的悬停处理接口处理悬停事件,并在所述UI显示所述悬停处理接口的处理结果。
在一种可能的实现方式中,所述悬停处理接口包括悬停进入接口、悬停停留接口和悬停结束接口;
所述获取模块703,具体用于:
将所述上一帧中触发信号与所述UI碰撞到的节点,与所述当前帧中触发信号与所述UI碰撞到的节点进行比较;
根据比较结果中,没有在所述上一帧中触发信号与所述用户界面碰撞到的节点中,但在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停进入接口处理悬停事件;
根据所述比较结果中,在所述上一帧中触发信号与所述用户界面碰撞到的节点中,且在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停停留接口处理悬停事件;
根据所述比较结果中,在所述上一帧中触发信号与所述用户界面碰撞到的节点中,但没有在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停结束接口处理悬停事件。
在一种可能的实现方式中,所述调用模块704,具体用于:
获取所述待执行触发请求触发的动作的类型;
根据所述待执行触发请求触发的动作的类型,确定待调用的相应的动作执行接口;
调用所述相应的动作执行接口处理触发事件。
在一种可能的实现方式中,所述目标执行组件与管理系统进行交互。
所述接收模块701,具体用于:
接收所述管理系统发送的所述触发请求,所述触发请求是所述管理系统接收所述目标输入设备发送的,所述触发请求用于指示所述管理系统将所述目标执行组件的状态设置为活跃状态,并将所述多个执行组件中除与所述目标执行组件外剩余的各个执行组件的状态设置为非活跃状态。
本公开实施例提供的装置,可用于执行上述图2-图4所示方法实施例的技术方案,其实现原理和技术效果类似,本公开实施例此处不再赘述。
可选地,图8示意性地提供本公开所述目标执行组件的一种可能的基本硬件架构示意图。
参见图8,目标执行组件包括至少一个处理器801以及通信接口803。进一步可选的,还可以包括存储器802和总线804。
其中,目标执行组件中,处理器801的数量可以是一个或多个,图8仅示意了其中一个处理器801。可选地,处理器801,可以是中央处理器(central processing unit,CPU)、图形处理器(graphics processing unit,GPU)或者数字信号处理器(digital signal processor,DSP)。如果目标执行组件具有多个处理器801,多个处理器801的类型可以不同,或者可以相同。可选地,目标执行组件的多个处理器801还可以集成为多核处理器。
存储器802存储计算机指令和数据;存储器802可以存储实现本公开提供的上述交互方法所需的计算机指令和数据,例如,存储器802存储用于实现上述交互方法的步骤的指令。存储器802可以是以下存储介质的任一种或任一种组合:非易失性存储器(例如只读存储器(Read-Only Memory,ROM)、固态硬盘(Solid State Disk,SSD)、硬盘(Hard Disk Drive,HDD)、光盘),易失性存储器。
通信接口803可以为所述至少一个处理器提供信息输入/输出。也可以包括以下器件的任一种或任一种组合:网络接口(例如以太网接口)、无线网卡等具有网络接入功能的器件。
可选的,通信接口803还可以用于目标执行组件与其它计算设备或者终端进行数据通信。
进一步可选的,图8用一条粗线表示总线804。总线804可以将处理器801与存储器802和通信接口803连接。这样,通过总线804,处理器801可以访问存储器802,还可以利用通信接口803与其它计算设备或者终端进行数据交互。
在本公开中,目标执行组件执行存储器802中的计算机指令,使得目标执行组件实现本公开提供的上述交互方法,或者使得目标执行组件部署上述的交互装置。
从逻辑功能划分来看,示例性的,如图8所示,存储器802中可以包括接收模块701、处理模块702、获取模块703以及调用模块704。这里的包括仅仅涉及存储器中所存储的指令被执行时可以分别实现接收模块、处理模块、获取模块以及调用模块的功能,而不限定是物理上的结构。
本公开提供一种计算机可读存储介质,所述计算机程序产品包括计算机指令,所述计算机指令指示计算设备执行本公开提供的上述交互方法。
本公开提供一种计算机程序产品,包括计算机指令,所述计算机指令被处理器执行上述交互方法。
本公开提供一种计算机程序,所述计算机程序存储在可读存储介质中,电子设备的至少一个处理器可以从所述可读存储介质中读取所述计算机程序,所述至少一个处理器执行所述计算机程序,使得所述电子设备执行上述任一实施例提供的方法。
本公开提供一种芯片,包括至少一个处理器和通信接口,所述通信接口为所述至少一个处理器提供信息输入和/或输出。进一步,所述芯片还可以包含至少一个存储器,所述存储器用于存储计算机指令。所述至少一个处理器用于调用并运行该计算机指令,以执行本公开提供的上述交互方法。
在本公开所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本公开各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用硬件加软件功能单元的形式实现。

Claims (15)

  1. 一种交互方法,其中,应用于目标执行组件,所述目标执行组件为多个执行组件中的一个,一个执行组件对应一个输入设备,所述方法包括:
    接收触发请求,所述触发请求是目标输入设备发送的,所述目标执行组件与所述目标输入设备对应,所述目标执行组件的状态为活跃状态,所述多个执行组件中除所述目标执行组件外剩余的各个执行组件的状态为非活跃状态;
    将所述触发请求缓存在预设动作队列中,并在接收到更新指令后,基于用户界面,进行触发信号碰撞检测;
    在触发信号碰撞检测后,按照预设顺序,从所述预设动作队列中获取一触发请求作为待执行触发请求;
    根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,并在所述用户界面显示所述动作执行接口的处理结果。
  2. 根据权利要求1所述的方法,其中,在所述根据所述待执行触发请求,调用相应的动作执行接口处理触发事件之前,还包括:
    判断当前执行的触发请求触发的动作是否在动作周期内;
    所述根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,包括:
    若所述当前执行的触发请求触发的动作不在动作周期内,则根据所述待执行触发请求,调用相应的动作执行接口处理触发事件。
  3. 根据权利要求2所述的方法,其中,在所述判断当前执行的触发请求触发的动作是否在动作周期内之后,还包括:
    若所述当前执行的触发请求触发的动作在动作周期内,则判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作是否互斥;
    所述根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,包括:
    若所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作不互斥,则根据所述待执行触发请求,调用相应的动作执行接口处理触发事件;
    若所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作互斥,则在所述当前执行的触发请求执行完成后,根据所述待执行触发请求,调用相应的动作执行接口处理触发事件。
  4. 根据权利要求2或3所述的方法,其中,触发请求携带激活标识,所述激活标识用于表示触发请求触发的动作是否为激活动作;
    所述判断当前执行的触发请求触发的动作是否在动作周期内,包括:
    根据所述当前执行的触发请求携带的激活标识,确定所述当前执行的触发请求触发的动作是否为激活动作;
    若所述当前执行的触发请求触发的动作为激活动作,则判断所述当前执行的触发请求触发的动作在动作周期内。
  5. 根据权利要求3所述的方法,其中,所述判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作是否互斥,包括:
    获取所述待执行触发请求触发的动作的类型,以及所述当前执行的触发请求触发的动作的类型;
    判断所述待执行触发请求触发的动作的类型与所述当前执行的触发请求触发的动作的类型是否一致;
    若所述待执行触发请求触发的动作的类型与所述当前执行的触发请求触发的动作的类型一致,则判断所述待执行触发请求触发的动作与所述当前执行的触发请求触发的动作不互斥。
  6. 根据权利要求1至5中任一项所述的方法,其中,在所述触发信号碰撞检测后,所述按照预设顺序,从所述预设动作队列中,获取一触发请求作为待执行触发请求之前,还包括:
    根据触发信号碰撞检测到的上一帧中触发信号与所述用户界面碰撞到的节点,以及当前帧中触发信号与所述用户界面碰撞到的节点,调用相应的悬停处理接口处理悬停事件,并在所述用户界面显示所述悬停处理接口的处理结果。
  7. 根据权利要求6所述的方法,其中,所述悬停处理接口包括悬停进入接口、悬停停留接口和悬停结束接口;
    所述根据触发信号碰撞检测到的上一帧中触发信号与所述用户界面碰撞到的节点,以及当前帧中触发信号与所述用户界面碰撞到的节点,调用相应的悬停处理接口处理悬停事件,包括:
    将所述上一帧中触发信号与所述用户界面碰撞到的节点,与所述当前帧中触发信号与所述用户界面碰撞到的节点进行比较;
    根据比较结果中,没有在所述上一帧中触发信号与所述用户界面碰撞到的节点中,但在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停进入接口处理悬停事件;
    根据所述比较结果中,在所述上一帧中触发信号与所述用户界面碰撞到的节点中,且在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停停留接口处理悬停事件;
    根据所述比较结果中,在所述上一帧中触发信号与所述用户界面碰撞到的节点中,但没有在所述当前帧中触发信号与所述用户界面碰撞到的节点中的节点,调用所述悬停结束接口处理悬停事件。
  8. 根据权利要求1至7中任一项所述的方法,其中,所述根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,包括:
    获取所述待执行触发请求触发的动作的类型;
    根据所述待执行触发请求触发的动作的类型,确定待调用的相应的动作执行接口;
    调用所述相应的动作执行接口处理触发事件。
  9. 根据权利要求1至8中任一项所述的方法,其中,所述目标执行组件与管理系统进行交互;
    所述接收触发请求,包括:
    接收所述管理系统发送的所述触发请求,所述触发请求是所述管理系统接收所述目标输入设备发送的,所述触发请求用于指示所述管理系统将所述目标执行组件的状态设置为活跃状态,并将所述多个执行组件中除与所述目标执行组件外剩余的各个执行组件的状态设置为非活跃状态。
  10. 一种交互装置,其中,应用于目标执行组件,所述目标执行组件为多个执行组件中的一个,一个执行组件对应一个输入设备,所述装置包括:
    接收模块,用于接收触发请求,所述触发请求是目标输入设备发送的,所述目标执行组件与所述目标输入设备对应,所述目标执行组件的状态为活跃状态,所述多个执行组件中除所述目标执行组件外剩余的各个执行组件的状态为非活跃状态;
    处理模块,用于将所述触发请求缓存在预设动作队列中,并在接收到更新指令后,基于用户界面,进行触发信号碰撞检测;
    获取模块,用于在触发信号碰撞检测后,按照预设顺序,从所述预设动作队列中获取一触发请求作为待执行触发请求;
    调用模块,用于根据所述待执行触发请求,调用相应的动作执行接口处理触发事件,并在所述用户界面显示所述动作执行接口的处理结果。
  11. 根据权利要求10所述的装置,其中,所述调用模块,具体用于:
    判断当前执行的触发请求触发的动作是否在动作周期内;
    若所述当前执行的触发请求触发的动作不在动作周期内,则根据所述待执行触发请求,调用相应的动作执行接口处理触发事件。
  12. 一种目标执行组件,包括:
    处理器;
    存储器;以及
    计算机程序;
    其中,所述计算机程序被存储在所述存储器中,并且被配置为由所述处理器执行,所述计算机程序包括用于执行如权利要求1至9任一项所述的方法的指令。
  13. 一种计算机可读存储介质,其中,所述计算机可读存储介质存储有计算机程序,所述计算机程序使得服务器执行如权利要求1至9任一项所述的方法。
  14. 一种计算机程序产品,其中,包括计算机指令,所述计算机指令被处理器执行如权利要求1至9任一项所述的方法。
  15. 一种计算机程序,其中,所述计算机程序在被处理器执行时实现如权利要求1至9任一项所述的方法。
PCT/CN2022/123489 2022-03-16 2022-09-30 交互方法、装置及存储介质 WO2023173726A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP22799838.2A EP4273670A4 (en) 2022-03-16 2022-09-30 INTERACTION METHOD AND APPARATUS, AND STORAGE MEDIUM
US17/998,718 US20240176456A1 (en) 2022-03-16 2022-09-30 Interaction method and apparatus, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210260731.4 2022-03-16
CN202210260731.4A CN114625253A (zh) 2022-03-16 2022-03-16 交互方法、装置及存储介质

Publications (1)

Publication Number Publication Date
WO2023173726A1 true WO2023173726A1 (zh) 2023-09-21

Family

ID=81902592

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/123489 WO2023173726A1 (zh) 2022-03-16 2022-09-30 交互方法、装置及存储介质

Country Status (4)

Country Link
US (1) US20240176456A1 (zh)
EP (1) EP4273670A4 (zh)
CN (1) CN114625253A (zh)
WO (1) WO2023173726A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114625253A (zh) * 2022-03-16 2022-06-14 北京字跳网络技术有限公司 交互方法、装置及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109782920A (zh) * 2019-01-30 2019-05-21 上海趣虫科技有限公司 一种用于扩展现实的人机交互方法及处理终端
US20200043243A1 (en) * 2018-07-31 2020-02-06 Splunk Inc. Precise manipulation of virtual object position in an extended reality environment
CN111586452A (zh) * 2020-04-30 2020-08-25 北京盛世辉科技有限公司 用于跨设备互动的方法及装置、播放设备
CN111656256A (zh) * 2018-03-21 2020-09-11 三星电子株式会社 利用注视跟踪和焦点跟踪的系统和方法
CN114625253A (zh) * 2022-03-16 2022-06-14 北京字跳网络技术有限公司 交互方法、装置及存储介质

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180359448A1 (en) * 2017-06-07 2018-12-13 Digital Myths Studio, Inc. Multiparty collaborative interaction in a virtual reality environment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111656256A (zh) * 2018-03-21 2020-09-11 三星电子株式会社 利用注视跟踪和焦点跟踪的系统和方法
US20200043243A1 (en) * 2018-07-31 2020-02-06 Splunk Inc. Precise manipulation of virtual object position in an extended reality environment
CN109782920A (zh) * 2019-01-30 2019-05-21 上海趣虫科技有限公司 一种用于扩展现实的人机交互方法及处理终端
CN111586452A (zh) * 2020-04-30 2020-08-25 北京盛世辉科技有限公司 用于跨设备互动的方法及装置、播放设备
CN114625253A (zh) * 2022-03-16 2022-06-14 北京字跳网络技术有限公司 交互方法、装置及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4273670A4 *

Also Published As

Publication number Publication date
EP4273670A4 (en) 2023-12-06
EP4273670A1 (en) 2023-11-08
CN114625253A (zh) 2022-06-14
US20240176456A1 (en) 2024-05-30

Similar Documents

Publication Publication Date Title
CN108287657B (zh) 技能施加方法及装置、存储介质、电子设备
JP2018535459A (ja) ロボットによるプロセス自動化
US9300520B2 (en) Mobile network application test
US10831331B2 (en) Window control for simultaneously running applications
CN105474160A (zh) 高性能触摸拖放
US10055388B2 (en) Declarative style rules for default touch behaviors
CN109446832A (zh) 一种截屏方法及装置
JP6250151B2 (ja) タッチパッド操作およびダブルタップ・ズーミングに対する独立ヒット・テスト
CN110109598A (zh) 一种动画交互实现方法、装置及电子设备
US8943373B1 (en) Keyboard, video and mouse switch identifying and displaying nodes experiencing a problem
US20200278823A1 (en) Screen sharing system, and information processing apparatus
US9383908B2 (en) Independent hit testing
WO2023173726A1 (zh) 交互方法、装置及存储介质
CN112073301B (zh) 删除聊天群组成员的方法、设备及计算机可读介质
US9830184B2 (en) Systems and methods for determining desktop readiness using interactive measures
US10346031B2 (en) View activation via hit testing in an asynchronous windowing system
CN104024991A (zh) 使用单个输入源支持不同的事件模型
US9950542B2 (en) Processing digital ink input subject to monitoring and intervention by an application program
CN117762303A (zh) 基于手势识别的图片动态移动控制方法、装置及存储介质
Chen et al. MSA: A Novel App Development Framework for Transparent Multi-Screen Support on Android Apps
CN114168194A (zh) 用于控制计算设备的方法、计算设备和计算机存储介质
CN112908329A (zh) 语音控制方法及装置、电子设备和介质
Andreychuk Evaluating the Performance of Direct Injection and TUIO-based Protocols for Multi-Touch Data Transfer

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 17998718

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2022799838

Country of ref document: EP

Effective date: 20221114