WO2024041429A1 - Task continuation method, device, and system (一种任务接续方法、设备及系统)

Info

Publication number: WO2024041429A1
Authority: WO (WIPO (PCT))
Prior art keywords: task, electronic device, user, icon, interface
Application number: PCT/CN2023/113299
Other languages: English (en), French (fr)
Inventor: 倪银堂 (Ni Yintang)
Original assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024041429A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/54 Interprogram communication

Definitions

  • The present application relates to the field of terminal technologies, and in particular to a task continuation method, device, and system.
  • Augmented reality (AR) devices can provide users with services that overlay virtual images on the real world based on AR technology. AR devices include, for example, AR glasses, AR helmets, and electronic devices equipped with AR applications (such as mobile phones and tablets). AR devices can also support user interaction with AR objects in the AR field of view. For example, the user can interact with an AR object through direct contact between the hand and the AR object, by pointing an extension line of the hand at the AR object, through voice interaction, or through body-movement interaction (such as head-movement interaction or gesture interaction).
  • However, the above interaction methods often fail to bring users a good interactive experience. For example, direct hand contact with the AR object and pointing at the AR object along an extension line of the hand both require the user's hand to stay in the air for a long time, which is difficult without support, so these methods are not suitable for long-term, continuous operation. In some scenarios the user does not want to make a sound or a body movement, so voice interaction and body-movement interaction are usually not welcomed by users either.
  • This application provides a task continuation method, device, and system, which can provide users with a more convenient, easier-to-operate AR interactive experience.
  • In a first aspect, a task continuation method includes: when the AR device recognizes that a preset condition is met, displaying, in the AR field of view, icons of one or more AR tasks being executed by the AR device. The icons of the AR tasks include a first task icon corresponding to a first AR task; in response to the user's first operation on the first task icon, the AR device instructs a first electronic device to continue performing the first AR task in place of the AR device. The purpose of the first operation is to switch the first AR task from the AR device to the first electronic device. The first AR task may include an interface display task and/or an audio playback task.
  • That is, when the AR device performs an AR task, such as displaying an AR interface and/or playing AR audio, it can provide task icons in the AR field of view for the user to select once the user's continuation intention is recognized. After receiving the user's operation of selecting an AR task and the user's operation of determining the target device, the AR device may instruct the target device (such as the first electronic device) to continue performing the AR task selected by the user. Moreover, when the target device continues performing the user-selected AR task in place of the AR device, there will be no problems such as horizontal stripes or display jitter, so the display effect is better.
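  • As a minimal illustration of the handover instruction described above (the message fields, transport, and function names here are assumptions for the sketch, not details taken from this application):

```python
import json
import socket

def build_continuation_request(task_id: str, task_type: str, state: dict) -> bytes:
    # Hypothetical handover message from the AR device to the target device;
    # the field names are illustrative only.
    message = {
        "action": "continue_task",
        "task_id": task_id,       # the first AR task selected by the user
        "task_type": task_type,   # "interface_display" and/or "audio_playback"
        "state": state,           # enough context for the target device to resume
    }
    return json.dumps(message).encode("utf-8")

def instruct_target_device(target_addr: tuple, request: bytes) -> None:
    # Assumes the AR device and the target device share an established
    # network connection; any existing communication channel would do.
    with socket.create_connection(target_addr, timeout=3.0) as conn:
        conn.sendall(request)

request = build_continuation_request(
    task_id="music-1", task_type="audio_playback", state={"position_s": 42.5}
)
# instruct_target_device(("192.168.1.20", 9000), request)  # example target address
```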
  • The above-mentioned AR device recognizing that a preset condition is met includes: the AR device recognizing a first electronic device that satisfies the preset condition.
  • For example, the AR device identifying the first electronic device that meets the preset condition includes: the AR device receiving the user's operation of selecting the first electronic device on a first interface of the AR device, where the first interface includes a list of multiple electronic devices that have established communication connections with the AR device.
  • Alternatively, the AR device identifying the first electronic device that meets the preset condition includes: the AR device identifying a first electronic device located within a preset range of the AR device. It can be understood that if the user intends to continue an AR task on the AR device to a certain electronic device (such as the first electronic device), or intends to perform operations related to the AR task through that electronic device, the user usually picks up the first electronic device; in this case, the first electronic device falls within the preset range of the AR device. Based on this, in this embodiment of the present application, the AR device can determine the user's task continuation intention based on whether a first electronic device is recognized within the preset range of the AR device.
  • Optionally, the above-mentioned first electronic device is located within the AR field of view of the AR device. It can be understood that if the user intends to continue an AR task on the AR device to the first electronic device, the user can pick up the first electronic device and bring it into the AR field of view of the AR device. Based on this, the AR device can determine the user's task continuation intention based on whether the first electronic device is recognized within the AR field of view of the AR device.
  • Alternatively, the above-mentioned first electronic device satisfies a preset spatial posture. It can be understood that if the user intends to continue an AR task on the AR device to the first electronic device, the user can pick up the first electronic device, and the spatial posture of the first electronic device when picked up usually follows certain rules, such as matching the posture of a device held by the user. Based on this, in this embodiment of the present application, the AR device can determine the user's task continuation intention based on whether the first electronic device satisfies the preset spatial posture.
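  • For illustration, a device could judge a "held by the user" posture from a single accelerometer reading, as in the following sketch (the pitch window is an assumed threshold, not a value from this application):

```python
import math

def is_held_posture(ax: float, ay: float, az: float,
                    min_pitch_deg: float = 20.0, max_pitch_deg: float = 70.0) -> bool:
    # (ax, ay, az) approximates the gravity vector in the device frame when the
    # device is roughly at rest, so the tilt (pitch) of the screen can be read
    # off it; hand-held phones are typically tilted somewhere between flat and
    # upright, which the assumed 20-70 degree window tries to capture.
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g < 1e-6:
        return False
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, ay / g))))
    return min_pitch_deg <= pitch <= max_pitch_deg

print(is_held_posture(0.0, 6.0, 7.8))  # pitch ~38 degrees -> plausibly held
```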
  • The above-mentioned AR device identifying the first electronic device located within the preset range of the AR device includes: the AR device identifying the first electronic device based on automatic positioning technology; or the AR device identifying the first electronic device based on captured real-world image information; or the AR device identifying the first electronic device based on motion data of the first electronic device. This application does not limit the specific method of identifying electronic devices located within the preset range of the AR device.
  • For example, based on automatic positioning technology such as ultrasonic positioning, the AR device can emit an ultrasonic signal through a first module (such as a speaker) provided in it, and receive, through a second module (such as a microphone) provided in it, echo signals of the ultrasonic signal from other electronic devices. Based on the transmission path of the sent signal, the transmission path of the received signal, and the relative positional relationship between the first module and the second module, the AR device can determine the specific locations of multiple electronic devices in the environment and then determine which electronic devices are within its preset range.
  • As another example, the AR device can determine which electronic devices are located within its preset range based on the image information about those electronic devices included in the real-world image information captured by its camera.
  • As another example, the AR device can determine the specific position of an electronic device based on motion data such as its speed, acceleration, angular velocity, and angular acceleration, and then determine which electronic devices are within its preset range.
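  • As a toy example of the ultrasonic approach, the round-trip time of an echo gives a distance estimate, which can then be compared against the preset range (the 0.8 m "arm's length" threshold below is an assumption for illustration):

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at about 20 degrees C

def echo_distance_m(t_emit_s: float, t_echo_s: float) -> float:
    # The signal travels to the other device and back, so the one-way
    # distance is half the round-trip path.
    return SPEED_OF_SOUND_M_S * (t_echo_s - t_emit_s) / 2.0

def within_preset_range(t_emit_s: float, t_echo_s: float,
                        preset_range_m: float = 0.8) -> bool:
    return echo_distance_m(t_emit_s, t_echo_s) <= preset_range_m

# A 3.5 ms round trip corresponds to about 0.6 m:
print(within_preset_range(0.0, 0.0035))  # True
```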
  • The user's first operation on the first task icon includes any of the following: an operation of sliding the first task icon toward the first electronic device; an operation of pinching and dragging the first task icon. In other words, the task continuation process provided by this application is based on easy-to-perform operations that simultaneously select the task to be switched and trigger a quick, convenient selection of the target device.
  • The above method further includes: the AR device switching the current focus icon according to the user's operation. The user's operation may be a left-right sliding operation, an up-down sliding operation, etc., which is not limited in this application. Based on this, users can switch the focus icon coherently, quickly, and conveniently; the design is easy to operate and provides a good user experience.
  • The above method further includes: the AR device performing one or more AR tasks and displaying, in the AR field of view, AR interfaces corresponding to the one or more AR tasks. Based on this, the AR device can provide users with services corresponding to the one or more AR tasks through the AR field of view.
  • The AR interfaces corresponding to the one or more AR tasks are task cards corresponding to the one or more AR tasks, where a task card includes abbreviated information of the corresponding AR task. For example, task cards corresponding to one or more AR tasks include: a music card including information about the music currently being played; a document card including information about the document currently being edited; a map card including the user's location information and destination location information in the real world; a chat card including the user's current chat partner and chat content; a smart home card including smart home device information and current status information (such as online or offline status); a memo card including the time, date, and specific event information of a memo event; and so on.
  • In some cases, one task card corresponds to one AR task running on one device. In other cases, one task card corresponds to multiple AR tasks running on multiple devices; such a task card is also called a fusion card.
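  • One way to picture the card/fusion-card distinction is a small data structure like the following (field names are illustrative assumptions):

```python
from dataclasses import dataclass, field

@dataclass
class TaskCard:
    card_id: str
    title: str                                      # e.g. "Music", "Map", "Memo"
    summary: dict                                   # abbreviated info shown on the card
    task_ids: list = field(default_factory=list)    # AR tasks backing the card
    device_ids: list = field(default_factory=list)  # devices running those tasks

    @property
    def is_fusion_card(self) -> bool:
        # A fusion card aggregates multiple AR tasks running on multiple devices.
        return len(self.task_ids) > 1 and len(self.device_ids) > 1

music = TaskCard("c1", "Music", {"track": "...", "position_s": 42.5},
                 task_ids=["t-play"], device_ids=["phone"])
print(music.is_fusion_card)  # False: a single task on a single device
```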
  • The arrangement of the icons of the one or more AR tasks is consistent with the arrangement of the AR interfaces corresponding to those tasks in the AR field of view. Based on this, users can quickly and accurately locate the task icon corresponding to the AR task they want to continue.
  • At least one of the applications corresponding to the one or more AR tasks runs on the AR device; and/or at least one of them runs on one or more electronic devices that have a communication connection with the AR device. For example, the applications corresponding to the one or more AR tasks may all run on the AR device; or they may run on one or more other electronic devices; or they may run partly on the AR device and partly on one or more other electronic devices.
  • The above-mentioned displaying, within the AR field of view, of icons of one or more AR tasks being performed by the AR device includes: displaying the icons of the one or more AR tasks in a task icon display area of the first electronic device, where the task icon display area is located on the upper side, lower side, left side, right side, or periphery of the first electronic device, or in the drop-down menu bar or lock screen interface of the first electronic device.
  • the arrangement of the icons of the one or more AR tasks is related to the positional relationship between the electronic device running the application corresponding to the one or more AR tasks and the AR device.
  • For example, the arrangement of the icons of the one or more AR tasks includes any of the following: a single-row arrangement, a double-row arrangement, a circular arrangement, or a free arrangement.
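  • For example, icon positions for the single-row and circular arrangements could be computed as follows (a sketch; coordinates and spacing are arbitrary):

```python
import math

def straight_layout(n, x0, y, spacing):
    # Single-row ("straight-line") arrangement: icons evenly spaced on one line.
    return [(x0 + i * spacing, y) for i in range(n)]

def circular_layout(n, cx, cy, radius):
    # Circular arrangement: icons evenly distributed on a circle around (cx, cy).
    return [(cx + radius * math.cos(2 * math.pi * i / n),
             cy + radius * math.sin(2 * math.pi * i / n)) for i in range(n)]

print(straight_layout(3, 0, 0, 64))   # [(0, 0), (64, 0), (128, 0)]
print(circular_layout(4, 0.0, 0.0, 100.0))
```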
  • The above method further includes: the first electronic device continuing to perform the first AR task in place of the AR device according to the instruction of the AR device.
  • Based on this solution, users can interact with highly operable, easy-to-operate target devices such as mobile phones, tablets, and personal computers to interact with AR objects in the AR field of view, which provides a more convenient, easier-to-operate AR interactive experience. Moreover, devices such as mobile phones, tablets, and personal computers usually have higher display resolutions than AR devices and are not affected by the external environment (such as lighting or refraction from spatial objects), so when the target device continues performing the user-selected AR task, there will be no problems such as horizontal stripes or display jitter, and the display effect is better.
  • When the above-mentioned first AR task includes an interface display task, the first electronic device continuing to perform the first AR task according to the instruction of the AR device includes: the first electronic device displaying an interface corresponding to the first AR task, where the interface corresponding to the first AR task is the task card corresponding to the first AR task, or includes interface elements from that task card.
  • That is, the target device can display the interface corresponding to the original first AR task, and can also perform interface adaptation, such as interface size adaptation and interface layout adaptation, to obtain a better display effect.
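  • A minimal sketch of the size-adaptation step (real adaptation would also re-flow the layout; the margin value is an assumption):

```python
def adapt_card_size(card_w: int, card_h: int,
                    screen_w: int, screen_h: int, margin: int = 24):
    # Scale the task card to fit the target screen while preserving its
    # aspect ratio, leaving an assumed margin on all sides.
    avail_w = screen_w - 2 * margin
    avail_h = screen_h - 2 * margin
    scale = min(avail_w / card_w, avail_h / card_h)
    return int(card_w * scale), int(card_h * scale)

# A 400x300 card continued onto a 1080x2340 phone screen:
print(adapt_card_size(400, 300, 1080, 2340))  # (1032, 774)
```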
  • When the above-mentioned first AR task includes an audio playback task, the first electronic device continuing to perform the first AR task according to the instruction of the AR device includes: the first electronic device playing the audio corresponding to the first AR task; or the first electronic device instructing an audio playback device connected to it to play the audio corresponding to the first AR task.
  • That is, the target device can play the audio received from the AR device through its own audio playback module, or it can play that audio through other audio playback peripherals (such as speakers).
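  • The routing choice can be pictured with stub devices (all class and method names here are hypothetical, not an actual device API):

```python
class Speaker:
    def play(self, stream):
        print(f"speaker playing {stream}")

class Phone:
    def __init__(self, speakers):
        self.connected_speakers = speakers

    def play(self, stream):
        print(f"phone playing {stream}")

def continue_audio(device, stream):
    # Prefer a connected playback peripheral (e.g. a speaker); otherwise
    # fall back to the device's own audio playback module.
    target = device.connected_speakers[0] if device.connected_speakers else device
    target.play(stream)

continue_audio(Phone([Speaker()]), "ar-audio-42")  # the speaker plays the stream
```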
  • The above method further includes: in response to the user's second operation on the first task icon, the AR device instructing a second electronic device to continue performing the first AR task in place of the first electronic device. Based on this, AR tasks can be freely continued between devices.
  • The above method further includes: in response to the user's third operation on the first task icon, the AR device continuing to perform the first AR task in place of the first electronic device. Based on this, AR tasks can be freely continued between devices.
  • In a second aspect, a task continuation method includes: when the AR device recognizes that a preset condition is met, displaying, in the AR field of view, icons of one or more AR tasks being executed by the AR device. The icons of the AR tasks include a first task icon corresponding to a first AR task; in response to the user's first operation on the first task icon, the AR device instructs a first electronic device to continue performing the first AR task in place of the AR device. The purpose of the first operation is to switch the first AR task from the AR device to the first electronic device. The first AR task may include an interface display task and/or an audio playback task.
  • That is, when the AR device performs an AR task, such as displaying an AR interface and/or playing AR audio, it can provide task icons in the AR field of view for the user to select once the user's continuation intention is recognized. After receiving the user's operation of selecting an AR task and the user's operation of determining the target device, the AR device may instruct the target device (such as the first electronic device) to continue performing the AR task selected by the user. Moreover, when the target device continues performing the user-selected AR task in place of the AR device, there will be no problems such as horizontal stripes or display jitter, so the display effect is better.
  • The above-mentioned AR device recognizing that a preset condition is met includes: the AR device recognizing a first electronic device that satisfies the preset condition.
  • For example, the AR device identifying the first electronic device that meets the preset condition includes: the AR device receiving the user's operation of selecting the first electronic device on a first interface of the AR device, where the first interface includes a list of multiple electronic devices that have established communication connections with the AR device.
  • Alternatively, the AR device identifying the first electronic device that meets the preset condition includes: the AR device identifying a first electronic device located within a preset range of the AR device. It can be understood that if the user intends to continue an AR task on the AR device to the first electronic device, the user can pick up the first electronic device and bring it into the preset range of the AR device. Based on this, in this embodiment of the present application, the AR device can determine the user's task continuation intention based on whether a first electronic device is recognized within the preset range of the AR device.
  • Optionally, the above-mentioned first electronic device is located within the AR field of view of the AR device. It can be understood that if the user intends to continue an AR task on the AR device to the first electronic device, the user can pick up the first electronic device and bring it into the AR field of view of the AR device. Based on this, the AR device can determine the user's task continuation intention based on whether the first electronic device is recognized within the AR field of view of the AR device.
  • Alternatively, the above-mentioned first electronic device satisfies a preset spatial posture. It can be understood that if the user intends to continue an AR task on the AR device to the first electronic device, the user can pick up the first electronic device, and the spatial posture of the first electronic device when picked up usually follows certain rules, such as matching the posture of a device held by the user. Based on this, the AR device can determine the user's task continuation intention based on whether the first electronic device satisfies the preset spatial posture.
  • The above-mentioned AR device identifying the first electronic device located within the preset range of the AR device includes: the AR device identifying the first electronic device based on automatic positioning technology; or the AR device identifying the first electronic device based on captured real-world image information; or the AR device identifying the first electronic device based on motion data of the first electronic device. This application does not limit the specific method of identifying electronic devices located within the preset range of the AR device.
  • For example, based on automatic positioning technology such as ultrasonic positioning, the AR device can emit an ultrasonic signal through a first module (such as a speaker) provided in it, and receive, through a second module (such as a microphone) provided in it, echo signals of the ultrasonic signal from other electronic devices. Based on the transmission path of the sent signal, the transmission path of the received signal, and the relative positional relationship between the first module and the second module, the AR device can determine the specific locations of multiple electronic devices in the environment and then determine which electronic devices are within its preset range.
  • As another example, the AR device can determine which electronic devices are located within its preset range based on the image information about those electronic devices included in the real-world image information captured by its camera.
  • As another example, the AR device can determine the specific position of an electronic device based on motion data such as its speed, acceleration, angular velocity, and angular acceleration, and then determine which electronic devices are within its preset range.
  • The user's first operation on the first task icon includes any of the following: an operation of sliding the first task icon toward the first electronic device; an operation of pinching and dragging the first task icon. In other words, the task continuation process provided by this application is based on easy-to-perform operations that simultaneously select the task to be switched and trigger a quick, convenient selection of the target device.
  • The above method further includes: the AR device switching the current focus icon according to the user's operation. The user's operation may be a left-right sliding operation, an up-down sliding operation, etc., which is not limited in this application. Based on this, users can switch the focus icon coherently, quickly, and conveniently; the design is easy to operate and provides a good user experience.
  • The above method further includes: the AR device performing one or more AR tasks and displaying, in the AR field of view, AR interfaces corresponding to the one or more AR tasks. Based on this, the AR device can provide users with services corresponding to the one or more AR tasks through the AR field of view.
  • The AR interfaces corresponding to the one or more AR tasks are task cards corresponding to the one or more AR tasks, where a task card includes abbreviated information of the corresponding AR task. For example, task cards corresponding to one or more AR tasks include: a music card including information about the music currently being played; a document card including information about the document currently being edited; a map card including the user's location information and destination location information in the real world; a chat card including the user's current chat partner and chat content; a smart home card including smart home device information and current status information (such as online or offline status); a memo card including the time, date, and specific event information of a memo event; and so on.
  • In some cases, one task card corresponds to one AR task running on one device. In other cases, one task card corresponds to multiple AR tasks running on multiple devices; such a task card is also called a fusion card.
  • The arrangement of the icons of the one or more AR tasks is consistent with the arrangement of the AR interfaces corresponding to those tasks in the AR field of view. Based on this, users can quickly and accurately locate the task icon corresponding to the AR task they want to continue.
  • At least one of the applications corresponding to the one or more AR tasks runs on the AR device; and/or at least one of them runs on one or more electronic devices that have a communication connection with the AR device. For example, the applications corresponding to the one or more AR tasks may all run on the AR device; or they may run on one or more other electronic devices; or they may run partly on the AR device and partly on one or more other electronic devices.
  • The above-mentioned displaying, within the AR field of view, of icons of one or more AR tasks being performed by the AR device includes: displaying the icons of the one or more AR tasks in a task icon display area of the first electronic device, where the task icon display area is located on the upper side, lower side, left side, right side, or periphery of the first electronic device, or in the drop-down menu bar or lock screen interface of the first electronic device.
  • The arrangement of the icons of the one or more AR tasks is related to the positional relationship between the electronic devices running the applications corresponding to the one or more AR tasks and the AR device.
  • For example, the arrangement of the icons of the one or more AR tasks includes any of the following: a single-row arrangement, a double-row arrangement, a circular arrangement, or a free arrangement.
  • The above method further includes: in response to the user's second operation on the first task icon, the AR device instructing a second electronic device to continue performing the first AR task in place of the first electronic device. Based on this, AR tasks can be freely continued between devices.
  • The above method further includes: in response to the user's third operation on the first task icon, the AR device continuing to perform the first AR task in place of the first electronic device. Based on this, AR tasks can be freely continued between devices.
  • In a third aspect, a task continuation method includes: the first electronic device continuing, according to an instruction of the AR device, to perform a first AR task in place of the AR device.
  • The solution provided by the above third aspect allows users to interact with highly operable, easy-to-operate target devices such as mobile phones, tablets, and personal computers to interact with AR objects in the AR field of view, which provides a more convenient, easier-to-operate AR interactive experience. Moreover, since devices such as mobile phones, tablets, and personal computers usually have higher display resolutions than AR devices and are not affected by the external environment (such as lighting or refraction from spatial objects), when the target device continues performing the user-selected AR task there will be no problems such as horizontal stripes or display jitter, and the display effect is better.
  • When the above-mentioned first AR task includes an interface display task, the first electronic device continuing to perform the first AR task according to the instruction of the AR device includes: the first electronic device displaying an interface corresponding to the first AR task, where the interface corresponding to the first AR task is the task card corresponding to the first AR task, or includes interface elements from that task card.
  • That is, the target device can display the interface corresponding to the original first AR task, and can also perform interface adaptation, such as interface size adaptation and interface layout adaptation, to obtain a better display effect.
  • When the above-mentioned first AR task includes an audio playback task, the first electronic device continuing to perform the first AR task according to the instruction of the AR device includes: the first electronic device playing the audio corresponding to the first AR task; or the first electronic device instructing an audio playback device connected to it to play the audio corresponding to the first AR task. That is, the target device can play the audio received from the AR device through its own audio playback module, or through other audio playback peripherals (such as speakers).
  • In a fourth aspect, an AR device includes: an optical module for imaging within the field of view of the AR device; a memory for storing computer program instructions; and a processor for executing the instructions, so that the AR device implements the method described in any possible implementation manner of the second aspect.
  • In a fifth aspect, an electronic device is provided. The electronic device includes: a memory for storing computer program instructions; and a processor for executing the instructions, so that the electronic device implements the method described in any possible implementation manner of the third aspect.
  • In a sixth aspect, an AR system is provided, which includes the AR device described in the fourth aspect and the electronic device described in the fifth aspect.
  • In a seventh aspect, a computer-readable storage medium is provided. Computer-readable instructions are stored on the computer-readable storage medium; when the instructions are executed, the method in any possible implementation manner of the second aspect or the third aspect is implemented.
  • In an eighth aspect, a chip system includes a processor and a memory in which instructions are stored; when the instructions are executed by the processor, the method in any possible implementation manner of the second aspect or the third aspect is implemented. The chip system can be composed of chips, or include chips and other discrete devices.
  • In a ninth aspect, a computer program product is provided, which includes computer-readable instructions. When the computer-readable instructions are run on a computer, the method in any possible implementation manner of the second aspect or the third aspect is implemented.
  • Figure 1 is a schematic diagram of an AR task continuation scenario provided by an embodiment of the present application;
  • Figure 2 is a schematic diagram of another AR task continuation scenario provided by an embodiment of the present application;
  • Figure 3 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application;
  • Figure 4 is a structural diagram of AR glasses provided by an embodiment of the present application;
  • Figure 5 is a schematic diagram of the display effect of AR technology applied to a navigation scene according to an embodiment of the present application;
  • Figure 6 is a first flowchart of the AR task continuation method provided by an embodiment of the present application;
  • Figure 7 is a schematic diagram of displaying task cards in an AR field of view provided by an embodiment of the present application;
  • Figure 8 is a schematic diagram of a scene used to represent a user's task continuation intention provided by an embodiment of the present application;
  • Figure 9 is a first schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 10 is a schematic diagram of a principle for determining the spatial posture of an electronic device provided by an embodiment of the present application;
  • Figure 11 is a second schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 12 is another schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 13 is a third schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 14 is a fourth schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 15 is a fifth schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 16 is a sixth schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 17 is a seventh schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 18 is an eighth schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 19 is a ninth schematic diagram of displaying task icons in the AR field of view provided by an embodiment of the present application;
  • Figure 20A is a first diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 20B is a second diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 21 is a third diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 22 is a fourth diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 23 is a fifth diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 24A is a sixth diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 24B is a seventh diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 24C is an eighth diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 24D is a ninth diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 24E is a tenth diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 25 is a schematic diagram of an AR task continuation scenario provided by an embodiment of the present application;
  • Figure 26 is an eleventh diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 27 is a twelfth diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 28 is a thirteenth diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 29 is a fourteenth diagram of an AR task continuation example provided by an embodiment of the present application;
  • Figure 30A is a schematic diagram of the display effect of a continued task provided by an embodiment of the present application;
  • Figure 30B is a schematic diagram of a task card temporary storage area provided by an embodiment of the present application;
  • Figure 30C is a schematic diagram of two other task card temporary storage areas provided by an embodiment of the present application;
  • Figure 31 is a second flowchart of the AR task continuation method provided by an embodiment of the present application;
  • Figure 32 is a third flowchart of the AR task continuation method provided by an embodiment of the present application;
  • Figure 33 is a fourth flowchart of the AR task continuation method provided by an embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the quantity of the indicated technical features. Therefore, features defined as "first" or "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments, unless otherwise specified, "plurality" means two or more.
  • Based on the task continuation method provided by the embodiments of the present application, the AR device can continue AR objects in the AR field of view to any electronic device (hereinafter referred to as the target device) according to the actual needs of the user. After the AR objects in the AR field of view are continued to the target device, the target device can serve as the display device corresponding to the AR field of view and display those AR objects to the user.
  • Further, users can interact with AR objects in the AR field of view by interacting with the target device. For example, the user can interact with target devices such as mobile phones, tablets, and personal computers (PCs) through easy-to-operate, fast interaction methods to realize interaction with the AR field of view. The above-mentioned interaction methods can be click operations (such as single-click or double-click), sliding operations, long-press operations, preset gesture operations, etc. Therefore, the task continuation method provided by the embodiments of the present application can provide users with a more convenient, easier-to-operate AR interactive experience.
  • It should be noted that AR is only one application scenario of this application; the method may also apply to other extended reality (XR) scenarios, such as mixed reality (MR) and virtual reality (VR). This application does not limit this. For ease of description, the following embodiments only take AR task continuation in an AR scenario as an example.
  • In addition, based on the task continuation method provided by the embodiments of the present application, the AR device can display objects in the AR field of view with the help of another display device. Such a display device is usually not affected by the external environment (such as lighting or refraction from spatial objects) and exhibits no problems such as horizontal stripes or display jitter, so the display effect is better. Moreover, when the AR device is a wearable device such as AR glasses or an AR helmet, relying solely on the AR device for AR interface display requires the user to wear it all the time; wearing an AR device for a long time may be uncomfortable and will also affect the user's normal activities. If the subsequent AR interface display does not need to rely on the AR device for image collection, the user can take off the AR device and the user experience will be much better.
  • In the embodiments of the present application, AR task continuation may be continuation of an AR interface, such as continuation of AR object display in the AR field of view; it may be continuation of AR audio, such as continuation of virtual audio signals; or it may include both AR interface continuation and AR audio continuation. This is not specifically limited in the embodiments of the present application.
  • Figure 1 shows an example of a task continuation scenario provided by an embodiment of the present application, in which AR glasses (i.e., the AR device) continue an AR interface to a mobile phone (i.e., the target device). The AR task continuation shown in Figure 1, on the one hand, makes it easy for the user to interact with the mobile phone in an easy-to-control manner to interact with AR objects in the field of view of the AR glasses; on the other hand, the display effect is good and not easily affected by the external environment. In other words, the mobile phone replaces the AR glasses for AR interface display to improve the display effect of the AR interface.
  • Figure 2 shows another example of an AR task continuation scenario provided by an embodiment of the present application, in which AR glasses (i.e., the AR device) continue an AR interface and AR audio to a mobile phone (i.e., the target device). The continuation shown in Figure 2 likewise yields a display effect that is good and not easily affected by the external environment. In other words, on the one hand, the mobile phone can replace the AR glasses to display the AR interface and improve its display effect; on the other hand, a speaker with good audio playback effect can be used instead of the AR glasses to play the AR audio.
  • In the embodiments of the present application, the AR device can continue the AR interface and/or AR audio to the target device. The target device can also continue the AR interface and/or AR audio back to the AR device according to the user's actual needs, for example in response to the user's device switching operation. For example, the mobile phone can, in response to the user's operation of switching the AR interface and/or AR audio to the AR device, continue the AR interface and/or AR audio from the mobile phone back to the AR device. The target device can likewise continue the AR interface and/or AR audio to other electronic devices; for example, the mobile phone can, in response to the user's operation of switching the AR interface and/or AR audio to a laptop computer, continue the AR interface and/or AR audio from the mobile phone to the laptop computer.
  • In the embodiments of the present application, the AR device has the function of providing AR information display. For example, the AR device may be AR glasses, an AR helmet, or an electronic device with an AR application installed. Electronic devices with AR applications installed may include, but are not limited to, mobile phones (such as folding-screen mobile phones, including inward-folding and outward-folding ones), netbooks, tablets, vehicle-mounted devices, wearable devices (such as smart watches, smart bracelets, and smart glasses), cameras (such as SLR cameras and compact cameras), PCs (including desktop and laptop computers), handheld computers, personal digital assistants (PDAs), portable multimedia players (PMPs), projection devices, smart screen devices, augmented reality (AR)/virtual reality (VR) devices, mixed reality (MR) devices, televisions, motion-sensing game consoles in human-computer interaction scenarios, and so on. This application does not limit the specific functions and structure of the AR device.
  • The target device has interface display and/or audio playback capabilities. For example, target devices may include, but are not limited to, mobile phones (such as folding-screen mobile phones), netbooks, tablets, vehicle-mounted devices, wearable devices (such as smart watches and smart bracelets), cameras (such as SLR cameras and compact cameras), PCs (including desktop and laptop computers), handheld computers, PDAs, PMPs, projection devices, smart screen devices, AR/VR devices, MR devices, televisions, motion-sensing game consoles in human-computer interaction scenarios, and other electronic devices with interface display and audio playback functions. The target device can also be an electronic device with an audio playback function, such as a speaker or a headset. This application does not limit the specific functions and structure of the target device.
  • FIG. 3 shows a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application.
  • the electronic device may be an AR device or a target device.
  • As shown in Figure 3, the electronic device may include a processor 310, an external memory interface 320, an internal memory 321, a universal serial bus (USB) interface 330, a charging management module 340, a power management module 341, a battery 342, antenna 1, antenna 2, a mobile communication module 350, a wireless communication module 360, an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, a headphone interface 370D, a sensor module 380, a button 390, a motor 391, an indicator 392, a camera 393, a display screen 394, etc. The sensor module 380 may include a pressure sensor 380A, a gyroscope sensor 380B, an air pressure sensor 380C, a magnetic sensor 380D, an acceleration sensor 380E, a distance sensor 380F, a proximity light sensor 380G, a fingerprint sensor 380H, a temperature sensor 380J, a touch sensor 380K, an ambient light sensor, etc.
  • the structures illustrated in the embodiments of the present application do not constitute specific limitations on the electronic equipment.
  • The electronic device may include more or fewer components than shown in the figures, or some components may be combined, or some components may be separated, or the components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 310 may include one or more processing units. For example, the processor 310 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), an audio processor, a controller, a memory, a video codec, an audio codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • The controller can be the nerve center and command center of the electronic device. The controller can generate operation control signals based on instruction opcodes and timing signals to complete the control of instruction fetching and execution.
  • The processor 310 may also be provided with a memory for storing instructions and data. In some embodiments, the memory in the processor 310 is a cache. This memory may hold instructions or data that the processor 310 has just used or uses cyclically. If the processor 310 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 310, thus improving system efficiency.
  • In this embodiment of the present application, the processor 310 may be configured to perform virtual-real fusion according to AR input information and output AR information (such as an AR interface and/or AR audio).
  • In some embodiments, the processor 310 may include one or more interfaces. Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module interface, and/or a universal serial bus interface, etc.
  • The I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 310 may include multiple sets of I2C buses.
  • the processor 310 can couple the touch sensor 380K, the microphone, the camera 393, etc. respectively through different I2C bus interfaces.
  • the processor 310 can be coupled to the touch sensor 380K through an I2C interface, so that the processor 310 and the touch sensor 380K communicate through the I2C bus interface to implement the touch function of the electronic device.
  • In this embodiment of the present application, the processor 310 can obtain, through the I2C bus interface, touch operations detected by the touch sensor 380K, such as click operations, long-press operations, preset gesture operations, or drag operations on the interface, thereby determining the specific intention corresponding to the touch operation and then responding to it, such as selecting the AR task to be continued or selecting the target device.
  • It can be understood that the interface connection relationships between the modules illustrated in the embodiments of the present application are only schematic illustrations and do not constitute structural limitations on the electronic device. In other embodiments, the electronic device may also adopt interface connection methods different from those in the above embodiments, or a combination of multiple interface connection methods.
  • the charge management module 340 is used to receive charging input from the charger.
  • the power management module 341 is used to connect the battery 342, the charging management module 340 and the processor 310.
  • The power management module 341 receives input from the battery 342 and/or the charging management module 340, and supplies power to the processor 310, the internal memory 321, the external memory, the display screen 394, the camera 393, the wireless communication module 360, etc.
  • the wireless communication function of the electronic device can be implemented through antenna 1, antenna 2, mobile communication module 350, wireless communication module 360, modem processor and baseband processor, etc.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the mobile communication module 350 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • The wireless communication module 360 can provide wireless communication solutions applied to the electronic device, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), BeiDou navigation satellite system (BDS), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other solutions.
  • The electronic device implements display functions through a graphics processing unit (GPU), the display screen 394, and the application processor. The GPU is a microprocessor for image processing and is connected to the display screen 394 and the application processor; the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 310 may include one or more GPUs that execute program instructions to generate or alter display information.
  • The display screen 394 is used to display images, videos, etc. The display screen 394 includes a display panel. The display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a quantum dot light-emitting diode (QLED), etc.
  • the display screen 394 may be a micro display screen.
  • the GPU can be used for interface rendering.
  • Display 394 may be used to display the interface.
  • the above interface may include but is not limited to application interface (such as browser interface, office application interface, mailbox interface, news application interface, social application interface, etc.), functional interface, applet interface and other interfaces.
  • the interface displayed on the display screen 394 after the GPU renders the interface may include the real world and may also include virtual images.
  • the GPU can be used to render the corresponding AR interface (such as a task card, etc.) according to the AR task to be continued selected by the user, and the display screen 394 can be used to display the above-mentioned AR interface.
  • the electronic device can realize the shooting function through an image signal processor (image signal processor, ISP), camera 393, video codec, GPU, display screen 394 and application processor.
  • the camera 393 may include a front camera and a rear camera of an electronic device, which may be an optical zoom lens, etc. This application does not limit this.
  • the ISP may be set in the camera 393, which is not limited in this application.
  • Camera 393 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge couple device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard image signals in three primary colors (red green blue, RGB), YUV and other formats.
  • the electronic device may include one or more cameras 393, such as at least one front camera and rear camera, multiple front cameras or multiple rear cameras, etc.
• if the electronic device is an AR device, the electronic device usually includes multiple cameras, each with a different division of labor.
  • some cameras can be used to provide image collection based on simultaneous localization and mapping (SLAM), some cameras are used for interactive gesture recognition, and some cameras are used for daily photography and video recording.
  • the electronic device can collect real environment information within its field of view through a camera.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic devices may support one or more video codecs. In this way, electronic devices can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (NN) computing processor.
  • the external memory interface 320 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
  • the external memory card communicates with the processor 310 through the external memory interface 320 to implement the data storage function. For example, save audio, video and other files in an external memory card.
  • the external memory card can be used to save the real-world image information captured by the electronic device through the camera, the information of the task card displayed in the electronic device, etc.
  • Internal memory 321 may be used to store executable program code for computer programs.
• computer programs may include operating system programs and application programs. Operating systems may include, but are not limited to, OS and other operating systems. The executable program code includes instructions.
  • the processor 310 executes instructions stored in the internal memory 321 to execute various functional applications and data processing of the electronic device.
  • the internal memory 321 may include a program storage area and a storage data area. Among them, the stored program area can store the operating system, at least one application program required for the function, etc.
  • the storage data area can store data created during the use of the electronic device (such as task cards, etc.).
  • the internal memory 321 may include high-speed random access memory, and may also include non-volatile memory, for example, at least one disk storage device, a flash memory device, universal flash storage (UFS), etc.
  • the electronic device can implement audio functions through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the headphone interface 370D, and the application processor. For example, audio playback, recording, etc.
  • the audio module 370 is used to convert digital audio information into analog signal output, and is also used to convert analog audio input into digital audio signals. Audio module 370 may also be used to encode and decode audio signals. In some embodiments, the audio module 370 may be provided in the processor 310 , or some functional modules of the audio module 370 may be provided in the processor 310 .
• Speaker 370A, also called a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device can enable the user to listen to audio through speaker 370A, or listen to hands-free calls, etc.
  • the electronic device can play audio, such as AR audio, through the speaker 370A, where the AR audio can include sounds in the real world and can also include virtual sounds.
  • Receiver 370B also called “earpiece” is used to convert audio electrical signals into sound signals. When the electronic device answers a call or a voice message, the voice can be heard by placing the receiver 370B against the human ear.
• Microphone 370C, also called a "mike" or "sound transmitter", is used to convert sound signals into electrical signals.
  • the user can speak close to the microphone 370C with the human mouth and input the sound signal to the microphone 370C.
  • the electronic device may be provided with at least two microphones 370C, such as local microphones or wireless microphones.
  • the electronic device can be equipped with three, four or more microphones 370C to implement functions such as sound signal collection and noise reduction.
  • the electronic device can collect sound signals in the real world through the microphone 370C.
  • Touch sensor 380K also called “touch panel”.
  • the touch sensor 380K can be disposed on the display screen 394.
• the touch sensor 380K and the display screen 394 form a touch screen, also called a "touch-controlled screen".
  • Touch sensor 380K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation (including information such as touch location, touch strength, contact area, and touch duration) to the processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 394.
  • the touch sensor 380K may also be disposed on the surface of the electronic device at a different location from the display screen 394 .
• the touch operation detected by the touch sensor 380K may be an operation performed by the user on or near the touch screen with a finger, or an operation performed on or near the touch screen with a stylus, touch pen, touch ball, or other touch auxiliary tool; this is not limited in this application.
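As a non-limiting illustration (this sketch is an editorial addition, not part of the original disclosure), the following Python fragment shows one way the touch attributes listed above (location, strength, contact area, duration) could be mapped to a touch event type; the thresholds and field names are assumptions.

```python
# Hypothetical sketch: classify a touch event from the attributes the touch
# sensor is said to report. Thresholds are illustrative, not normative.
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float           # touch location (pixels)
    y: float
    pressure: float    # touch strength, normalized 0.0 to 1.0
    area: float        # contact area (square pixels)
    duration_ms: int   # how long the contact lasted

def classify_touch(sample: TouchSample, moved_px: float) -> str:
    """Map raw touch attributes to a coarse touch event type."""
    if moved_px > 20:              # noticeable travel: a slide/drag
        return "slide"
    if sample.duration_ms >= 500:  # held in place: a long press
        return "long_press"
    return "tap"                   # short, stationary contact
```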
  • the structures illustrated in the embodiments of the present application do not constitute specific limitations on the electronic equipment.
  • the electronic device may include more or less components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the electronic device may also include a subscriber identification module (subscriber identification module, SIM) card interface 395.
• if the electronic device is AR glasses, as shown in Figure 4, in addition to the camera installed on the frame, the electronic device also includes a processor (such as a CPU), a wireless communication module, a sensor, a display screen (such as a micro display screen), and other modules that are not shown (refer to the schematic structural diagram shown in Figure 3), and may also include an optical module.
  • the optical module is mainly responsible for the imaging work of AR glasses.
  • optical modules include optical combiners, optical waveguides/prisms/free-form surfaces, etc.
  • AR glasses need to provide a see-through function, which means that the AR glasses must not only support the user to see the real environment through the AR glasses, but also support the user to see virtual information through the AR glasses. Therefore, the imaging system of AR glasses cannot block the front of the user's line of sight.
• optical combiners integrate virtual information and the real environment in a superimposed form so that they complement and enhance each other; examples of optical combiners include curved mirrors and semi-reflective Birdbath designs.
  • optical waveguides, prisms and free-form surfaces can reflect, refract, and diffract the light beam emitted by the micro display screen, and finally project it onto the user's retina to form an image.
• the prism, a transparent object bounded by two intersecting but non-parallel planes, can split or refract the light beam of the image source before it enters the human eye to form an image.
  • the free-form surface can refract the light beam from the image source through an optical surface without translational or rotational symmetry before entering the human eye to form an image.
  • the optical waveguide can be a high-refractive index transparent substrate.
  • a specific coupling structure is provided on the side of the optical waveguide substrate.
• the coupling structure can couple in the beam of the image source; the coupled beam propagates within the optical waveguide substrate by total internal reflection until, near the human eye, it is coupled out through a specific coupling structure and enters the human eye to form an image.
  • optical waveguides such as geometric reflective waveguides, diffraction etched grating waveguides, holographic grating waveguides, etc.
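As a non-limiting aside (an editorial addition for clarity), the beam stays inside such a waveguide only while it strikes the substrate boundary beyond the critical angle; for a substrate of refractive index $n$ in air,

$$\theta_c = \arcsin\left(\frac{1}{n}\right),$$

so a high-refractive-index substrate of, say, $n \approx 1.7$ gives $\theta_c \approx 36^\circ$, allowing a wide range of ray angles to propagate by total internal reflection until the out-coupling structure releases them.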
  • a task connection method provided by the embodiment of the present application can be applied to office scenes, game scenes, navigation scenes, medical scenes, education scenes and other scenes based on AR technology.
  • a task connection method provided by embodiments of the present application will be specifically introduced below with reference to the accompanying drawings, taking the target device to connect the AR task in the AR device as an example.
  • the application corresponding to the AR task can be installed and run in the AR device.
  • the application corresponding to the AR task can be installed and run in other electronic devices (such as the first electronic device), but the AR interface corresponding to the AR task is displayed by the AR device and/or the AR interface corresponding to the AR task AR audio is played by the AR device.
  • the AR device can connect the display task (that is, the AR task) to the target device (such as the first electronic device).
  • the application corresponding to the AR task can be installed and run in the first electronic device, and the AR device is responsible for display and/or playback.
  • the AR device can connect the display task and/or audio playback task to the first electronic device.
• the application corresponding to the AR task can be installed and run in the first electronic device, and the AR device is responsible for display and/or playback.
  • the AR device can continue the display task and/or audio playback task to the second electronic device.
  • the application corresponding to the AR task can be installed and run in the first electronic device, and the AR device is responsible for display and playback.
  • the AR device can continue the display task to the first electronic device, and the audio playback task to the second electronic device.
  • a communication connection is established between the AR device and the first electronic device, and between the AR device and the second electronic device.
  • the communication connection follows the wireless transmission protocol.
  • the wireless transmission protocol may include but is not limited to Bluetooth (BT) transmission protocol or WiFi transmission protocol.
  • the communication connection is a Bluetooth connection, a point-to-point WiFi (peer to peer, P2P) connection, etc.
• devices (such as the AR device and the first electronic device, the AR device and the second electronic device, etc.) can discover one or more connectable devices nearby through device discovery technology (such as automatic proximity discovery or near-field discovery technology) or other methods, and perform authentication and pairing with them to establish a communication connection.
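As a non-limiting illustration (an editorial sketch, not the patent's protocol), nearby-device discovery before pairing could look like the following Python fragment; the port, probe payload, and reply format are assumptions.

```python
# Hypothetical sketch: find connectable devices nearby with a UDP broadcast
# probe, collecting responders as candidates for authentication and pairing.
import socket

DISCOVERY_PORT = 37020            # illustrative port, not from the patent
PROBE = b"AR_DISCOVERY_PROBE"

def discover_nearby(timeout_s: float = 1.0) -> list[tuple[str, bytes]]:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout_s)
    sock.sendto(PROBE, ("255.255.255.255", DISCOVERY_PORT))
    found = []
    try:
        while True:
            data, (addr, _port) = sock.recvfrom(1024)  # device identity reply
            found.append((addr, data))
    except socket.timeout:
        pass                       # no more responders within the window
    return found                   # candidates for authentication and pairing
```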
  • the AR task connection method provided by the embodiment of the present application may include S601-S605:
  • S601 The AR device performs the first AR task.
  • the first AR task may be an interface display task and/or an AR audio playback task.
  • the AR device performs the first AR task, that is, the AR device displays the first AR interface and/or plays the first AR audio.
  • the first AR task can be an AR task corresponding to an application, an AR task corresponding to a functional module, an AR task corresponding to a mini program, etc., which are not limited in this application.
  • the corresponding task interface can be displayed in the AR field of view.
  • the task interface ie, AR interface
  • a task card includes abbreviated information corresponding to the AR task.
  • the AR device may provide multiple task cards shown in Figure 7 in the AR field of view.
  • card 1 shown in Figure 7 is a music card, and card 1 includes information about the music currently being played;
  • card 2 is a document card, and card 2 includes information about the document currently being edited;
  • card 3 is a map card, and card 3 includes user information.
  • Card 4 is a chat card, which includes the user’s current chat partner and chat content;
• Card 5 is a smart home card, and card 5 includes smart home device information and current status information (such as online status or offline status, etc.);
  • card 6 is a memo card, and card 6 includes the time of the memo event and specific event information.
• the application (including quick application, light application, etc.)/functional module/mini program corresponding to the first AR task can run in the AR device.
  • a map application may be run in the AR device.
  • the AR device displays a map interface (ie, AR interface) and/or plays navigation audio (ie, AR audio).
• the application (including quick application, light application, etc.)/functional module/mini program corresponding to the first AR task can run in another electronic device other than the AR device, with the AR device providing the AR interface display/AR audio playback service. For example, a map application can run on a mobile phone, and a communication connection is established between the mobile phone and the AR device. When the mobile phone runs the map application, the mobile phone can send the map interface (i.e., AR interface) and/or navigation audio (i.e., AR audio) to the AR device, so as to provide the user with the corresponding interface display and/or audio playback service through the AR device.
  • the above-mentioned map interface (i.e., AR interface) can not only include buildings, roads, vehicles, trees and other things in the real world, as well as the user's location in the real world, but also include virtual information such as navigation routes, prompt signs, etc.
  • the AR device can perform corresponding AR tasks based on the information from the mobile phone, that is, display the map interface (i.e., AR interface) and/or play navigation audio (i.e., AR audio).
  • the electronic device running the application can directly generate the corresponding task card and send it to the AR device.
  • the electronic device running the application can send the interface configuration information of the relevant process to the AR device, and the AR device generates the corresponding task card on its own.
• the AR device can call its card generation module to generate the corresponding task card according to the above interface configuration information.
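As a non-limiting illustration (an editorial sketch; the field names are assumptions), a card generation module that turns interface configuration information into the abbreviated task card shown in the AR field of view might look like this:

```python
# Hypothetical sketch: generate a task card holding only the abbreviated
# information of the AR task (e.g. current song, document name, memo events).
from dataclasses import dataclass, field

@dataclass
class TaskCard:
    task_id: str
    title: str
    summary_lines: list[str] = field(default_factory=list)
    source_device: str = "local"

def generate_task_card(interface_config: dict, source_device: str) -> TaskCard:
    return TaskCard(
        task_id=interface_config["task_id"],
        title=interface_config.get("app_name", "unknown"),
        summary_lines=interface_config.get("summary", [])[:3],  # keep it brief
        source_device=source_device,
    )
```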
• the AR device can respectively provide task cards corresponding to the application running in the first electronic device and the application running in the second electronic device.
  • a memo application is running on both the first electronic device and the second electronic device.
  • the memo application on the first electronic device contains memo event 1 and memo event 2
• the memo application on the second electronic device contains memo event 3.
  • the AR device can provide card A and card B respectively in the AR field of view, where card A includes memo event 1 and memo event 2, and card B includes memo event 3.
  • the AR device can integrate the applications running in the first electronic device and the second electronic device to provide a task card in the AR field of view.
  • the task card is also called a fusion card.
• the AR device can provide the card 6 shown in Figure 7 in the AR field of view, where card 6 includes memo event 1, memo event 2, and memo event 3.
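As a non-limiting illustration (an editorial sketch reusing the TaskCard shape above), merging per-device cards into one fusion card, as in the card 6 example, could be done as follows:

```python
# Hypothetical sketch: fuse task cards from several devices into one card,
# de-duplicating shared entries (memo events 1 and 2 from one device,
# memo event 3 from another).
def fuse_cards(cards: list[TaskCard], fused_id: str) -> TaskCard:
    lines: list[str] = []
    for card in cards:
        for line in card.summary_lines:
            if line not in lines:          # skip events both devices hold
                lines.append(line)
    return TaskCard(task_id=fused_id, title=cards[0].title,
                    summary_lines=lines, source_device="fused")
```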
  • S602 The AR device recognizes that the preset conditions are met.
  • the AR device can determine that a preset condition is met based on the user's device selection operation on the AR device, where the preset condition is used to characterize the user's intention to continue.
  • the user's device selection operation on the AR device is such as the user's operation of selecting a first electronic device on a first interface, where the first interface includes a list of multiple electronic devices that have established communication connection relationships with the AR device, Included in this list is the first electronic device.
  • the AR device can determine that the preset conditions are met based on the recognized user's preset voice instructions.
• the AR device recognizing that the preset conditions are met may mean that the AR device recognizes that an electronic device (such as the first electronic device) is located within the preset range of the AR device. For example, the AR device can determine that the electronic device (such as the first electronic device) is located within the preset range of the AR device through one or more of the following methods 1 to 3:
  • the AR device can obtain the electronic device (such as the first electronic device) located within the preset range of the AR device based on automatic positioning technology.
  • the above-mentioned automatic positioning technology is such as ultrasonic positioning technology.
• the AR device can, based on ultrasonic positioning technology, emit ultrasonic signals through its speaker and receive echo signals of the ultrasonic signals from other electronic devices through its microphone. Furthermore, based on triangulation positioning technology, the AR device can determine the specific positions of multiple electronic devices in the environment from the transmission path of the emitted signal, the transmission path of the received signal, and the relative position of the speaker and microphone, and then determine which electronic devices are located within the preset range of the AR device. If the first electronic device is located within the preset range of the AR device, the AR device can determine that the user has a need for cross-device AR task continuation, and the target device is the first electronic device.
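As a non-limiting illustration (an editorial sketch; a real implementation would triangulate with the speaker/microphone geometry as described above), the range check alone can be derived from the round-trip time of an ultrasonic ping:

```python
# Hypothetical sketch: estimate the distance to a responding device from the
# round-trip time of an ultrasonic signal, then test the preset range.
SPEED_OF_SOUND_M_S = 343.0   # in air at roughly 20 degrees Celsius

def distance_from_round_trip(t_emit_s: float, t_echo_s: float) -> float:
    """One-way distance = speed * round-trip time / 2."""
    return SPEED_OF_SOUND_M_S * (t_echo_s - t_emit_s) / 2.0

def within_preset_range(t_emit_s: float, t_echo_s: float,
                        preset_range_m: float = 1.5) -> bool:
    return distance_from_round_trip(t_emit_s, t_echo_s) <= preset_range_m
```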
  • the AR device can determine the electronic device located within the preset range of the AR device based on the image information about the electronic device included in the captured real-world image information.
• the AR device can collect real-world image information through a camera. If the image information includes image information of the first electronic device, the AR device can determine the spatial position of the first electronic device by analyzing the image information, and then determine whether the first electronic device is within the preset range of the AR device. If the AR device determines, based on the image information of the first electronic device, that the first electronic device is located within the preset range of the AR device, the AR device can determine that the user has a need for cross-device AR task continuation, and the target device is the first electronic device.
  • the AR device can determine the spatial position of the electronic device based on the motion data of other electronic devices, and then determine the electronic device located within the preset range of the AR device based on the spatial position of the electronic device.
  • the motion data of the electronic device can be measured by the electronic device through a motion sensor (such as an acceleration sensor, a gyroscope sensor, etc.) during the movement of the electronic device.
  • a motion sensor such as an acceleration sensor, a gyroscope sensor, etc.
  • the motion data of the electronic device includes the speed, acceleration, angular velocity and angular acceleration of the electronic device, etc.
• the AR device recognizing that the preset conditions are met may mean that the AR device recognizes that an electronic device (such as the first electronic device) is located within the preset range of the AR device and that the electronic device is in a preset spatial posture.
  • the AR device can determine that the electronic device is within the preset range of the AR device and that the electronic device is in the preset spatial posture through the following methods A and/or B:
  • the AR device can determine the electronic device located within the preset range of the AR device and in the preset spatial posture based on the image information about the electronic device included in the captured real-world image information.
  • the user usually picks up the first electronic device (as shown in Figure 8).
  • the first electronic device will fall within the preset range of the AR device, and the camera of the AR device can capture the image information of the first electronic device.
  • the AR device can determine that the first electronic device is located within the preset range of the AR device based on the image information about the first electronic device included in the real-world image information captured by the camera.
• the AR device can perform spatial posture recognition on the first electronic device located within the preset range of the AR device based on the image information about the first electronic device included in the real-world image information captured by the camera, and then determine whether the spatial posture of the first electronic device satisfies the preset spatial posture. If the AR device determines, based on the image information of the first electronic device, that the spatial posture of the first electronic device satisfies the preset spatial posture, the AR device can determine that the user has a need for cross-device AR task continuation, and the target device is the first electronic device.
  • the first electronic device is located within a preset range of the AR device, such as the first electronic device is located within the AR field of view of the AR device.
• the AR device can obtain the spatial posture of the first electronic device by analyzing the image information of the first electronic device collected by the camera. As shown in Figure 8, assuming that the AR device is AR glasses, the AR glasses collect real-world image information through the camera. After analyzing the image information, the AR glasses determine that the mobile phone (i.e., the first electronic device) is within the preset range of the AR glasses and that the spatial posture of the mobile phone (i.e., the first electronic device), namely the spatial posture shown in Figure 8, is the preset spatial posture. In this case, the AR glasses can determine that the user has a need to continue AR tasks across devices, and the target device is the mobile phone.
• Method B: The AR device can determine the spatial pose (including spatial position and spatial attitude) of the electronic device based on the motion data of other electronic devices, and then determine the electronic device located within the preset range of the AR device and in the preset spatial posture.
• when the user has the intention to connect the AR task on the AR device to a certain electronic device (such as the first electronic device), or performs operations related to AR task connection through a certain electronic device (such as the first electronic device), the user usually picks up the first electronic device (as shown in Figure 8).
  • the motion sensor of the first electronic device collects motion data of the first electronic device.
  • the AR device can determine that the first electronic device is located within the preset range of the AR device based on the motion data of the first electronic device.
• the AR device can perform spatial posture recognition on the first electronic device located within the preset range of the AR device based on the motion data of the first electronic device, and then determine whether the spatial posture of the first electronic device satisfies the preset spatial posture. If the AR device determines, based on the motion data of the first electronic device, that the spatial posture of the first electronic device satisfies the preset spatial posture, the AR device can determine that the user has a need for cross-device AR task continuation, and the target device is the first electronic device.
  • the motion data of the electronic device may be acceleration data and angular velocity data during the motion of the electronic device (such as when it is picked up by the user to the state shown in Figure 8).
• the motion data of the electronic device can be expressed by the three-axis angular velocities ωx, ωy, ωz and the three-axis accelerations ax, ay, az.
  • the three-axis angular velocity can be understood as the angular velocity of the electronic device around the three axes of x, y, and z
  • the three-axis acceleration can be understood as the acceleration of the electronic device on the three axes of x, y, and z.
• the movement of the electronic device in the x, y, z three-dimensional coordinate system may include three translational motions and three rotational motions. The three translational motions are the translational motions of the electronic device along the x, y, and z axes, and each of the three axes corresponds to an acceleration, namely ax, ay, and az. The three rotational motions include the rotational motion around the x-axis (the rotation angle is called the pitch angle, Pitch), the rotational motion around the y-axis (the rotation angle is called the roll angle, Roll), and the rotational motion around the z-axis (the rotation angle is called the yaw angle, Yaw). The posture changes of the electronic device during movement can therefore be represented by the three-axis angular velocities ωx, ωy, ωz and the three-axis accelerations ax, ay, az of the electronic device on the x, y, and z axes.
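As a non-limiting illustration (an editorial sketch), the pitch/roll/yaw posture can be tracked by integrating the three-axis angular velocities over time; production systems additionally fuse the accelerations ax, ay, az to correct gyroscope drift, which this fragment omits:

```python
# Hypothetical sketch: integrate gyroscope angular velocities (rad/s) sampled
# at a fixed period dt_s to track pitch (about x), roll (about y), yaw (about z).
def integrate_attitude(samples, dt_s):
    pitch = roll = yaw = 0.0
    for wx, wy, wz in samples:
        pitch += wx * dt_s
        roll  += wy * dt_s
        yaw   += wz * dt_s
    return pitch, roll, yaw

# A steady 0.5 rad/s rotation about x for one second ends near pitch = 0.5 rad.
print(integrate_attitude([(0.5, 0.0, 0.0)] * 100, 0.01))
```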
  • the above-mentioned method of determining that preset conditions are met based on automatic positioning technology and analyzing the image information collected by the camera is only an example.
  • the AR device can also identify the user's connection intention based on other methods, which are not limited by the embodiments of this application.
  • the AR device provides one or more task icons in the AR field of view, and the one or more task icons include the first task icon.
  • one or more task icons provided by the AR device in the AR field of view are icons corresponding to one or more AR tasks being executed by the AR device.
  • the AR device provides the one or more task icons in the AR field of view to facilitate the user to select subsequent tasks and determine the target device.
  • One or more task icons provided by the AR device in the AR field of view respectively correspond to the task interface (ie, AR interface).
  • the AR device may provide one or more task icons by displaying virtual task icons in the AR field of view and near the first electronic device (eg, within a preset range).
• For example, as shown in Figure 9, the AR glasses can display multiple task icons (icon 1, icon 2, icon 3, icon 4, icon 5, and icon 6) in the AR field of view of the AR glasses, on the upper side of the mobile phone recognized by the AR glasses.
  • one or more task icons provided by the AR device in the AR field of view include task icons corresponding to applications/function modules/applets, etc. running in the AR device.
• one or more task icons provided by the AR device in the AR field of view include task icons corresponding to applications/functional modules/mini programs, etc. running in other electronic devices (such as the first electronic device and the second electronic device) that have established communication connections with the AR device.
  • Figure 8 is only an example of a way in which an AR device displays a task icon, and the embodiment of the present application does not limit the relative position of the task icon and the first electronic device.
• the task icon display area can also be located on the left side, right side, or lower side of the mobile phone (i.e., the first electronic device).
  • the AR device can display multiple task icons in the AR field of view of the AR glasses and on the left/right/lower side of the first electronic device recognized by the AR glasses.
  • the AR device can also display multiple task icons through the drop-down menu bar of the first electronic device, that is, the task icon display area is located in the drop-down menu bar of the first electronic device.
  • the AR device can also display multiple task icons through the lock screen interface of the first electronic device, that is, the task icon display area is located on the lock screen interface of the first electronic device.
  • the task icon display area can also be located on the side of the AR device.
• the AR device may display multiple task icons on the side of the first electronic device. As shown in Figure 14, assuming that the first electronic device is a folding screen device, the task icon display area is located on the folded side of the first electronic device.
  • FIG. 9 is only an example of an arrangement manner of task icons, and the embodiment of the present application does not limit the specific arrangement rules of task icons.
  • the AR device in addition to displaying multiple task icons in a one-line arrangement as shown in Figure 9, can also display multiple task icons in a double-row arrangement as shown in Figure 15(b), or in a circular arrangement or a free-form arrangement. Display multiple task icons in an arrangement, etc.
• the AR device can also display one or more of the task icons in a hidden display form (in the form of an ellipsis, as shown in Figure 16). As shown in Figure 16, the user can restore the hidden task icons to normal display through operations such as finger sliding.
  • the AR device can also display task icons corresponding to multiple task cards in the AR field of view based on the relative positions of the multiple task cards. Based on this, it is convenient for users to quickly and accurately determine the specific location of the task icon corresponding to the AR task they want to continue. For example, as shown in (b) of Figure 15 , assuming that the AR device provides multiple task cards shown in (a) of Figure 15 in the AR field of view, the AR device can display the task icons as shown in the figure.
• That is, for card 1 (i.e., music card), card 2 (i.e., document card), card 3 (i.e., map card), card 4 (i.e., chat card), card 5 (i.e., smart home card), and card 6 (i.e., memo card), the AR device displays the corresponding task icon 1, icon 2, icon 3, icon 4, icon 5, and icon 6 at matching relative positions.
  • the AR device can also display the task icon corresponding to the task card at the corresponding position according to the orientation (such as spatial posture) of the AR device.
• Assume that the AR device provides multiple task cards as shown in (a) in Figure 17 in the AR field of view. If the AR device is facing the left, when displaying task icons, the AR device can display the task icons corresponding to card 1 and card 4 on the left; if the AR device is facing the right, when displaying task icons, the AR device can display the task icons corresponding to card 3 and card 6 on the right; if the AR device is facing forward, when displaying task icons, the AR device can display the task icons corresponding to card 2 and card 5 in the middle.
  • the AR device can also arrange corresponding task icons based on the relative positional relationship between the devices on which the applications run. For example, assume that the applications corresponding to icons 1 and 2 are running in the second electronic device, the applications corresponding to icons 3 and 4 are running in the AR device, and the applications corresponding to icons 5 and 6 are running in the third electronic device.
• the positional relationship between the second electronic device, the third electronic device, and the AR device is shown in (a) in Figure 18: the second electronic device is located on the left side of the AR device, and the third electronic device is located on the right side of the AR device. Then, as shown in Figure 18, when the AR device displays task icons, it can display icon 1 and icon 2 of the applications running in the second electronic device on the left side of the task icon display area, icon 3 and icon 4 of the applications running in the AR device in the middle of the task icon display area, and icon 5 and icon 6 of the applications running in the third electronic device on the right side of the task icon display area.
  • the AR device can also display task icons of applications running in the electronic device facing the AR device according to the orientation (such as spatial posture) of the AR device. For example, assume that the applications corresponding to icons 1 and 2 are running in the second electronic device, the applications corresponding to icons 3 and 4 are running in the AR device, and the applications corresponding to icons 5 and 6 are running in the third electronic device.
• the positional relationship between the second electronic device, the third electronic device, and the AR device is shown in (a) in Figure 19: the second electronic device is located on the left side of the AR device, and the third electronic device is located on the right side of the AR device. Then, as shown in Figure 19, if the AR device faces the second electronic device, when displaying task icons it can display icon 1 and icon 2 of the applications running in the second electronic device on the left side of the task icon display area; if the AR device faces the third electronic device, when displaying task icons it can display icon 5 and icon 6 of the applications running in the third electronic device on the left side of the task icon display area; and if the AR device faces forward, when displaying task icons it can display icon 3 and icon 4 of the applications running in the AR device on the left side of the task icon display area.
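As a non-limiting illustration (an editorial sketch of the Figure 18 style arrangement; the names are assumptions), grouping icons by where their host device sits relative to the AR device could look like:

```python
# Hypothetical sketch: place each application's task icons on the side of the
# display area matching the position of the device running that application.
def arrange_icons(icons_by_device: dict[str, list[str]],
                  device_side: dict[str, str]) -> dict[str, list[str]]:
    layout = {"left": [], "middle": [], "right": []}
    for device, icons in icons_by_device.items():
        side = device_side.get(device, "middle")   # the AR device itself: middle
        layout[side].extend(icons)
    return layout

print(arrange_icons(
    {"second_device": ["icon 1", "icon 2"],
     "ar_device": ["icon 3", "icon 4"],
     "third_device": ["icon 5", "icon 6"]},
    {"second_device": "left", "ar_device": "middle", "third_device": "right"}))
```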
  • the AR task connection instruction is used to instruct the first electronic device to continue the AR device to perform the first AR task currently being executed by the AR device. For example, assuming that the first AR task currently being executed by the AR device is to display the AR interface, the AR task continuation instruction is used to instruct the first electronic device to continue the AR device to display the above-mentioned AR interface. For another example, assuming that the first AR task currently being executed by the AR device is to play AR audio, the AR task continuation instruction is used to instruct the first electronic device to continue playing the AR audio by the AR device.
  • the task connection method provided by the embodiment of the present application can support various forms of operations for switching the first AR task from the AR device to the first electronic device.
  • the following is a detailed introduction through several examples:
• the operation of the user to switch the first AR task from the AR device to the first electronic device may be the user sliding, among the one or more task icons provided by the AR device, the task icon corresponding to the first AR task toward the first electronic device. For example, the user's operation of switching the first AR task from the AR device to the first electronic device is the user sliding icon 4 toward the mobile phone (i.e., the first electronic device).
  • FIG. 20A only takes as an example that the AR device always provides multiple task icons in the AR field of view.
• In other embodiments, in response to the user's operation of sliding icon 4 toward the mobile phone (i.e., the first electronic device) shown in (a) in Figure 21, the AR device may also stop displaying the task icons after completing the AR task continuation, as shown in Figure 21.
• the user's operation of switching the first AR task from the AR device to the first electronic device may be the user pinching, among the one or more task icons provided by the AR device, the task icon corresponding to the first AR task and dragging it toward the first electronic device.
  • the user's operation of switching the first AR task from the AR device to the first electronic device is as follows: The user pinches the icon 4 and continues dragging it toward the mobile phone (ie, the first electronic device).
• the user's operation of switching the first AR task from the AR device to the first electronic device may be a preset operation performed by the user when the user's current focus icon is the task icon corresponding to the first AR task (i.e., the first task icon).
  • the preset operations include preset sliding operations, click (such as single click, double click, etc.) operations, long press operations, etc., which are not limited in the embodiments of this application.
  • the current focus icon can be switched by the user through operations.
  • the operation may be a left-right sliding operation, an up-down sliding operation, etc., which are not limited in the embodiments of this application.
  • the AR device switches the current focus icon from icon 3 to icon 4.
• As shown in (b) in Figure 22, when the current focus icon is icon 4 (i.e., the task icon corresponding to the first AR task), the user's operation of switching the first AR task from the AR device to the first electronic device is operation 2202 of sliding rightward from the left edge of the mobile phone (i.e., the first electronic device).
• the user's operation of switching the first AR task from the AR device to the first electronic device may be the user's operation of selecting the first electronic device among multiple candidate device options.
• the AR device can display virtual icons of multiple candidate devices for the user to select the target device that is about to take over the AR task. Further, when receiving the user's operation of dragging the task icon corresponding to the first AR task to the virtual icon of a candidate device (such as the first electronic device) while maintaining the above preset gesture, the AR device can determine that the target device is that candidate device (i.e., the first electronic device).
• the user's operation of switching the first AR task from the AR device to the first electronic device is thus a continuous action: maintaining the preset gesture from pinching the task icon until dragging it to the virtual icon of the target device.
  • preset gestures include pinch gestures, long press gestures, etc.
  • the multiple candidate devices mentioned above all have communication connections established with the AR device.
• For example, as shown in (a) in Figure 24A, the AR device detects the user's operation of pinching icon 4 (i.e., the task icon corresponding to the first AR task) in the AR field of view. In response to this operation, the AR device displays the AR field of view shown in (b) in Figure 24A, which includes virtual icons of multiple candidate devices such as a tablet, a mobile phone, and a laptop. Further, in response to the user dragging icon 4 to the virtual icon of the mobile phone while maintaining the pinch gesture, the AR device determines that the target device is the mobile phone (i.e., the first electronic device), and determines that the user's intention is to connect the AR task corresponding to icon 4 (i.e., the first AR task) to the mobile phone (i.e., the first electronic device).
• the multiple candidate devices corresponding to the multiple virtual icons displayed by the AR device are all devices capable of carrying the first AR task. For example, if the first AR task includes an interface display task, the multiple candidate devices may include devices with display screens, such as mobile phones and tablets; if the first AR task includes an audio playback task, the multiple candidate devices may include audio-capable devices such as speakers.
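As a non-limiting illustration (an editorial sketch; the capability flags are assumptions), filtering the candidate list by what the first AR task needs could look like:

```python
# Hypothetical sketch: keep only candidate devices able to carry the task,
# i.e. display-capable for interface display, audio-capable for playback.
def candidate_devices(devices: list[dict], needs_display: bool,
                      needs_audio: bool) -> list[str]:
    result = []
    for dev in devices:
        if needs_display and not dev.get("has_display"):
            continue
        if needs_audio and not dev.get("has_audio"):
            continue
        result.append(dev["name"])
    return result

print(candidate_devices(
    [{"name": "phone", "has_display": True, "has_audio": True},
     {"name": "speaker", "has_display": False, "has_audio": True}],
    needs_display=True, needs_audio=False))   # -> ['phone']
```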
• In another example, in response to the user's operation of pinching icon 4 (i.e., the task icon corresponding to the first AR task) in the AR field of view, the AR device displays the AR field of view shown in (b) in Figure 24B, which includes multiple candidate devices such as tablets, mobile phones, and laptops; as shown in (b) in Figure 24B, the irrelevant icons other than icon 4 disappear. Further, the AR device determines that the target device is the mobile phone (i.e., the first electronic device), and determines that the user's intention is to connect the AR task corresponding to icon 4 (i.e., the first AR task) to the mobile phone (i.e., the first electronic device).
• the AR device displays multiple candidate devices according to the user's operation (such as the user's operation of pinching icon 4 in the AR field of view as shown in (a) in Figure 24A or (a) in Figure 24B), so that the user can quickly and conveniently select the target device.
• this design is easy for the user to operate, more intuitive, and provides a better user experience.
  • the user's operation of switching the first AR task from the AR device to the first electronic device may be the user's operation of selecting the first electronic device on the HyperTerminal interface.
  • the HyperTerminal interface includes multiple candidate devices that can be used for task collaboration or task connection.
  • the AR device can send instructions to the mobile phone to enter the AR drag state.
  • the mobile phone can display a hyper terminal interface, where the hyper terminal interface includes icons of multiple candidate devices such as tablets, mobile phones, and laptops.
• Further, the AR device determines that the target device is the mobile phone (i.e., the first electronic device), and determines that the user's intention is to connect the AR task corresponding to icon 4 (i.e., the first AR task) to the mobile phone (i.e., the first electronic device).
• the multiple candidate devices included on the above-mentioned hyper terminal interface may be identified by the mobile phone and/or the AR device based on positioning technology (such as automatic positioning technology, for example, ultrasonic positioning technology), may be captured by the mobile phone and/or the AR device through a camera, or may be determined by combining the above two methods; this is not limited in the embodiments of this application.
  • FIG. 24C only takes the target device selected by the user as a device that displays the HyperTerminal interface (ie, a mobile phone) as an example.
  • users can select any device as the target device in the HyperTerminal interface.
• For example, if the user selects the tablet on the HyperTerminal interface, the AR device sends an AR task continuation instruction to the tablet according to the user's selection, to instruct the tablet to continue the first AR task that the AR device is currently executing and that the user has selected to be continued.
• the electronic device displays a hyper terminal interface including multiple candidate devices according to the user's operation (as shown in (a) in Figure 24C, the user pinches icon 4 and continues to drag it toward the phone), so that the user can quickly and conveniently select the target device.
• this design is easy for the user to operate, more intuitive, and provides a better user experience.
• In other embodiments, the user's operation of switching the first AR task from the AR device to the first electronic device may be a selection operation performed in the AR field of view through an auxiliary input device (such as a mouse, stylus, touch pen, touch ball, keyboard, etc.) to switch the first AR task to the first electronic device.
• For example, assume that the AR field of view includes a laptop and multiple task icons displayed on the upper side of the laptop (as shown in Figure 24D: icon 1, icon 2, icon 3, icon 4, icon 5, and icon 6). In response to the user dragging icon 4 toward the laptop using the mouse connected to the laptop, the mobile phone displays the HyperTerminal interface shown in (b) in Figure 24D, where the HyperTerminal interface includes multiple candidate devices such as tablets, mobile phones, and laptops. Further, in response to the user's operation of selecting the mobile phone on the HyperTerminal interface as shown in (b) in Figure 24D, the AR device determines that the target device is the mobile phone (i.e., the first electronic device), and determines that the user's intention is to connect the AR task corresponding to icon 4 (i.e., the first AR task) to the mobile phone (i.e., the first electronic device).
• In another example, assume that the AR field of view includes a laptop, a mobile phone, and multiple task icons displayed on the upper side of the laptop (as shown in Figure 24E: icon 1, icon 2, icon 3, icon 4, icon 5, and icon 6). In response to the user using the mouse connected to the laptop to drag icon 4 to the mobile phone, the AR device determines that the target device is the mobile phone (i.e., the first electronic device), and determines that the user's intention is to connect the AR task corresponding to icon 4 (i.e., the first AR task) to the mobile phone (i.e., the first electronic device).
  • the mouse cursor is determined by the AR device itself and displayed in the AR field of view in the form of virtual information.
  • the AR device can determine the display position of the mouse cursor outside the range of the laptop based on the movement trajectory of the mouse cursor sent by the laptop and the user's dragging and moving operation of the mouse.
  • the AR device can determine the display position of the mouse cursor outside the range of the laptop based on the movement trajectory of the mouse cursor acquired through the camera and the information of the user dragging the mouse.
  • the embodiments of the present application do not limit the specific method of displaying the position of the mouse cursor outside the range of the laptop computer.
• the AR device provides a mouse cursor display function so that the user can directly drag a task icon with the mouse to select the task to be continued and the target device, allowing the user to make the selection quickly and conveniently.
• this design is easy for the user to operate, more intuitive, and provides a better user experience.
• the AR task continuation instruction may carry relevant information of the first AR task, for the first electronic device to perform the AR interface display and/or AR audio playback corresponding to the first AR task.
  • the AR task continuation instruction carries task card information of the first AR task.
• the AR task continuation instruction may carry the identification (such as the application ID) of the application corresponding to the first AR task, used by the first electronic device to continue the first AR task of the AR device, display the application interface corresponding to the first AR task, and/or play the AR audio.
  • the AR task continuation instruction may also carry relevant information of the first AR task for the first electronic device to perform the corresponding AR task.
  • the AR task continuation instruction carries task card information of the first AR task.
  • the task card of the first AR task may be a fusion card.
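As a non-limiting illustration (an editorial sketch; every field name is an assumption, since the text does not define a wire format), one possible shape of the AR task continuation instruction is:

```python
# Hypothetical sketch: serialize a continuation instruction that names the
# task, says whether display and/or audio must be continued, and carries
# either the application ID or the task card information described above.
import json
from typing import Optional

def build_continuation_instruction(task_id: str, continue_display: bool,
                                   continue_audio: bool,
                                   app_id: Optional[str] = None,
                                   card_info: Optional[dict] = None) -> bytes:
    msg = {
        "type": "ar_task_continuation",
        "task_id": task_id,
        "continue_display": continue_display,
        "continue_audio": continue_audio,
        "app_id": app_id,        # set when the app runs on the target device
        "card_info": card_info,  # set when the target renders from card data
    }
    return json.dumps(msg).encode("utf-8")  # sent over the BT / WiFi P2P link
```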
  • S605 The first electronic device continues the AR device to perform the first AR task according to the AR task connection instruction.
• For example, assuming that the first AR task currently being executed by the AR device is to display the AR interface, the first electronic device may continue the AR device to display the above-mentioned AR interface according to the AR task continuation instruction. For another example, assuming that the first AR task currently being executed by the AR device is to play AR audio, the first electronic device can continue the AR device to play the AR audio according to the AR task continuation instruction. For another example, assuming that the first AR task currently being executed by the AR device is to display the AR interface and play AR audio, the first electronic device can continue the AR device to display the AR interface and play the AR audio according to the AR task continuation instruction.
• For example, the first electronic device continues the AR device to perform the first AR task according to the AR task continuation instruction as shown in (b) in Figure 20A, (b) in Figure 20B, (b) in Figure 21, (c) in Figure 22, (c) in Figure 23, (c) in Figure 24A, (c) in Figure 24B, (c) in Figure 24C, (c) in Figure 24D, or (b) in Figure 24E: the mobile phone (i.e., the first electronic device) continues the AR device according to the AR task continuation instruction and displays the interface of the chat task (i.e., the first AR task) corresponding to icon 4.
• For example, assuming that the first AR task is to display the AR interface and play AR audio, and the first electronic device has established a communication connection (such as a Bluetooth connection or WiFi P2P connection) with an audio playback device (such as a speaker), then as shown in Figure 25, the first electronic device (the mobile phone shown in Figure 25) can continue the AR device according to the AR task continuation instruction to display the AR interface corresponding to the first AR task (the music interface shown in Figure 25), and play the AR audio corresponding to the first AR task (Music 1 shown in Figure 25) through the audio playback device (the speaker shown in Figure 25).
• For example, if the AR task continuation instruction carries relevant information of the first AR task, such as the task card information of the first AR task, the first electronic device can display the AR interface of the first AR task according to the task card information of the first AR task carried in the AR task continuation instruction.
  • the AR task continuation instruction carries the identification (such as application ID) of the application corresponding to the first AR task, and the application corresponding to the first AR task is running in the first electronic device.
• the first electronic device may display the corresponding application interface according to the application corresponding to the first AR task running in the first electronic device.
• the AR interface may be the AR interface corresponding to the first AR task, for example, it may be the AR card corresponding to the first AR task (as shown in (b) in Figure 20A, (b) in Figure 20B, (b) in Figure 21, (c) in Figure 22, (c) in Figure 23, (c) in Figure 24A, (c) in Figure 24B, (c) in Figure 24C, (c) in Figure 24D, or (b) in Figure 24E).
  • the AR interface may be slightly different from the AR interface corresponding to the first AR task.
  • the AR interface may include the same interface elements as the AR card corresponding to the first AR task, but the layout of the interface elements is different.
  • the application interface corresponding to the first AR task displayed by the first electronic device is more adapted to the display screen size of the first electronic device.
  • the first electronic device can obtain the interface configuration information of the relevant process locally, and then display the corresponding application interface (that is, the AR interface of the first AR task).
• for the specific method and process by which a device displays an application interface based on the relevant interface configuration information when running an application, refer to conventional techniques; details are not described here.
• the AR interface can also be displayed in a non-full-screen window (as shown in Figure 26).
• when the first electronic device continues to display the AR interface corresponding to the first AR task, the first electronic device can also display the AR interface in a split-screen format together with the other interface currently being displayed by the first electronic device (i.e., the first interface).
• For example, as shown in Figure 27, if the mobile phone (i.e., the first electronic device) is currently displaying the smart home application interface (i.e., the first interface), then when it continues to display the chat interface (i.e., the AR interface corresponding to the first AR task), it can display the smart home application interface and the chat interface in split screen.
  • the mobile phone in Figure 27 can be a single-screen mobile phone or a folding-screen mobile phone in a folded state.
• For another example, assume that the first electronic device (the mobile phone in Figure 28) is a folding screen mobile phone in an unfolded state, the smart home application interface (i.e., the first interface) is displayed on the first screen of the folding screen phone, and the mobile phone desktop is displayed on the second screen. When the folding screen phone continues the AR device to display the chat interface (i.e., the AR interface corresponding to the first AR task), it can display the chat interface on the second screen.
• In some embodiments, the AR device can display a virtual second screen for the folding screen phone. When the folding screen phone continues the AR device to display the chat interface (i.e., the AR interface corresponding to the first AR task), it can display the chat interface on the virtual screen; that is, the folding screen mobile phone displays the chat interface (i.e., the AR interface corresponding to the first AR task) through the virtual screen and displays the smart home application interface (i.e., the first interface) through the real screen. In other embodiments, when the screen of the folding screen mobile phone is unfolded, the folding screen mobile phone includes a first screen and a second screen, and the folding screen mobile phone can display the smart home application interface (i.e., the first interface) through the first screen and display the chat interface (i.e., the AR interface corresponding to the first AR task) through the second screen.
• the AR device can also provide the function of temporarily storing (or pinning) the task card corresponding to an AR task at a fixed location (also called the temporary storage area) according to the user's operation, so as to facilitate the user to subsequently connect this AR task arbitrarily between multiple electronic devices. After the AR device temporarily stores (or pins) the task card at the fixed location, the task card can always be displayed at that fixed location.
  • the fixed location for temporarily storing (or Pin) task cards can be located within the preset range of the AR device, or near the mobile phone that is within the preset range of the AR device and satisfies the preset spatial posture.
  • Default location area also called temporary storage area.
  • the fixed position for temporarily storing (or Pin) the task card can be located within the preset range of the AR device, or on an electronic device located within the preset range of the AR device and meeting the preset spatial posture.
  • the default location area also called the temporary storage area
  • the AR device can also provide the function of temporarily storing (or Pin) the task icon corresponding to the AR task to a fixed location according to the user's operation, so as to facilitate subsequent users to target the AR task.
  • the AR device can also provide the function of temporarily storing (or Pin) the task icon corresponding to the AR task to a fixed location according to the user's operation, so as to facilitate subsequent users to target the AR task.
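A minimal sketch of a pinning/staging area, assuming a simple registry keyed by task id; `StagingArea` is a hypothetical helper, not a component named by the application:

```python
class StagingArea:
    """Keeps pinned task cards/icons at a fixed location until unpinned."""

    def __init__(self, anchor: str):
        self.anchor = anchor          # e.g. "phone_sidebar", "screen_bottom_right"
        self._pinned: dict[str, str] = {}

    def pin(self, task_id: str, kind: str = "card") -> None:
        self._pinned[task_id] = kind  # the card or icon stays at the anchor

    def unpin(self, task_id: str) -> None:
        self._pinned.pop(task_id, None)

    def render(self) -> list[str]:
        # The AR device re-renders everything pinned here every frame, so
        # pinned items are always displayed at the fixed location.
        return [f"{kind}:{tid}@{self.anchor}" for tid, kind in self._pinned.items()]

area = StagingArea(anchor="phone_sidebar")
area.pin("chat", kind="card")
area.pin("music", kind="icon")
print(area.render())
```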
It can be understood that, with the task continuation method provided by the embodiments of this application, when the AR device is performing an AR task, for example displaying an AR interface and/or playing AR audio, it can, upon recognizing the user's continuation intention, provide task icons in the AR field of view for the user to select the AR task to be continued. Further, after receiving the user's operation of selecting an AR task and the user's operation of determining the target device, the AR device can instruct the target device to take over execution of the user-selected AR task from the AR device. With this method, users can interact with highly operable and easy-to-use target devices such as mobile phones, tablets and personal computers in order to interact with AR objects in the AR field of view, which provides a more convenient and easy-to-operate AR interactive experience. Moreover, given that devices such as phones, tablets and personal computers usually have higher display resolution than AR devices and are not affected by the external environment (such as lighting or refraction by objects in the scene), when the target device takes over the user-selected AR task from the AR device, problems such as horizontal stripes or display jitter do not occur, so the display effect is better. In addition, when the target device is a mobile phone, the user can, in any scenario, interact with AR objects in the AR field of view through the highly operable and convenient phone carried with them, which is very convenient. Also, after the target device takes over the user-selected AR task from the AR device, in some cases, for example when the subsequent AR interface display does not rely on the AR device for image collection, the user can take off the AR device, and the user experience is much better.
It should be noted that the above embodiments of this application mainly take the first electronic device taking over AR interface display and/or AR audio playback from the AR device as the example for introducing the task continuation method provided by the embodiments of this application. For the case where the AR task includes both AR interface display and AR audio playback, in some embodiments the first electronic device can take over the AR interface display from the AR device while another electronic device (such as a fourth electronic device) takes over the AR audio playback. As one example, the fourth electronic device may be selected by the user. As another example, the fourth electronic device may be determined by the user-selected target device (e.g., the first electronic device) itself; for instance, the fourth electronic device may be an audio playback device that has established a communication connection with the first electronic device (such as the speaker shown in Figure 25). A sketch of this display/audio split follows below.
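A hedged sketch of routing the display and audio subtasks to different devices; `route_subtasks` and the device names are illustrative assumptions:

```python
from typing import Optional

def route_subtasks(task: dict, display_device: str,
                   paired_audio_device: Optional[str]) -> dict:
    """Send the display subtask to the chosen target device and the audio
    subtask to a paired audio device when one is available."""
    routes = {}
    if task.get("has_display"):
        routes["display"] = display_device
    if task.get("has_audio"):
        # Prefer a connected speaker (the "fourth electronic device");
        # otherwise the display device also plays the audio.
        routes["audio"] = paired_audio_device or display_device
    return routes

music_task = {"has_display": True, "has_audio": True}
print(route_subtasks(music_task, display_device="phone",
                     paired_audio_device="speaker"))
```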
Optionally, after the first electronic device takes over execution of the first AR task from the AR device, the AR device can further, in response to the user's operation of switching the first AR task from the first electronic device to another electronic device (such as the second electronic device), send an AR task continuation instruction to the second electronic device, instructing the second electronic device to take over from the first electronic device the first AR task that the first electronic device is currently executing.
For example, as shown in Figure 31, after the first electronic device takes over execution of the first AR task from the AR device according to the AR task continuation instruction (i.e., S605), the method provided by the embodiments of this application further includes S3101-S3104:

S3101: The AR device recognizes that the second electronic device meets the preset conditions.

S3102: The AR device provides one or more task icons in the AR field of view, the one or more task icons including the first task icon.

S3103: In response to the user's operation of switching the first AR task from the first electronic device to the second electronic device (such as the second operation), the AR device sends an AR task continuation instruction to the second electronic device.

S3104: The second electronic device takes over execution of the first AR task from the first electronic device according to the AR task continuation instruction.

For the specific processes of S3101, S3102, S3103 and S3104, refer respectively to the descriptions of S602, S603, S604 and S605 above. A sketch of this instruction flow follows below.
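A minimal sketch of this instruction flow, assuming a toy device fleet and hypothetical `ContinuationInstruction`/`Device` types (none of these names, nor the app id, come from the application):

```python
from dataclasses import dataclass

@dataclass
class ContinuationInstruction:
    task_id: str
    take_over_from: str   # device currently executing the task
    payload: dict         # e.g. task card info or an application identifier

class Device:
    def __init__(self, name: str):
        self.name = name
        self.running: set[str] = set()

    def handle(self, instr: ContinuationInstruction, fleet: dict) -> None:
        # S3104: take over execution of the task from the previous executor.
        fleet[instr.take_over_from].running.discard(instr.task_id)
        self.running.add(instr.task_id)
        print(f"{self.name} took over {instr.task_id} from {instr.take_over_from}")

fleet = {n: Device(n) for n in ("ar_glasses", "phone_1", "phone_2")}
fleet["phone_1"].running.add("chat")  # result of the earlier S601-S605 flow
# S3103: the AR device sends the instruction to the second electronic device.
instr = ContinuationInstruction("chat", take_over_from="phone_1",
                                payload={"app_id": "com.example.chat"})  # hypothetical id
fleet["phone_2"].handle(instr, fleet)
```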
It should be noted that the AR task continuation method shown in Figure 31 takes as an example the case where, after the first electronic device takes over execution of the first AR task from the AR device according to the AR task continuation instruction, the AR device no longer provides the task icons. For the case where the AR device always provides multiple task icons in the AR field of view, as shown in Figure 32, after the first electronic device takes over execution of the first AR task (i.e., S605), the method provided by the embodiments of this application further includes S3103 and S3104.
Or, optionally, after the first electronic device takes over execution of the first AR task from the AR device, the AR device can further, in response to the user's operation of switching the first AR task from the first electronic device back to the AR device, send an AR task reverse-continuation instruction to the first electronic device, so as to hand the first AR task back from the first electronic device to the AR device. As shown in Figure 33, after the first electronic device takes over execution of the first AR task according to the AR task continuation instruction (i.e., S605), the method provided by the embodiments of this application further includes S3301 and S3302:

S3301: In response to the user's operation of switching the first AR task from the first electronic device to the AR device (such as the third operation), the AR device sends an AR task reverse-continuation instruction to the first electronic device.

S3302: The AR device takes over execution of the first AR task from the first electronic device.
It can be understood that, based on the task continuation method provided by the embodiments of this application, the AR device can, according to the user's instructions, switch AR tasks arbitrarily among multiple electronic devices, for example from one electronic device (such as the first electronic device) to another electronic device (such as the second electronic device), or from the target device (such as the first electronic device) back to the AR device. This method can support users in achieving multi-device collaboration according to actual needs in office, gaming, navigation, medical, education and other scenarios.
Taking an office scenario as an example: while the AR glasses (i.e., the AR device) are performing the first AR task, if the AR glasses recognize that a laptop in the AR field of view is within the preset range of the AR device, or that the laptop satisfies the preset spatial posture, the AR glasses determine the user's continuation intention. After determining that intention, the AR glasses provide one or more task icons in the AR field of view for the user to select the AR task to be continued, the one or more task icons including the first task icon. Further, in response to the user's operation of switching the first AR task from the AR glasses to the laptop (i.e., the first electronic device), the AR glasses send an AR task continuation instruction to the laptop, instructing the laptop to take over execution of the first AR task from the AR glasses. Further still, while the laptop (i.e., the first electronic device) is carrying on the first AR task, in response to the user's operation of switching the first AR task from the laptop to the mobile phone (i.e., the second electronic device), the AR glasses send an AR task continuation instruction to the phone, instructing the phone (i.e., the second electronic device) to take over execution of the first AR task from the laptop (i.e., the first electronic device). The sketch below walks through this chain of hand-offs.
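A hedged end-to-end sketch of this scenario as a tiny ownership state machine (glasses to laptop to phone, then back), with all names invented for illustration:

```python
class TaskOwnership:
    """Tracks which device currently executes an AR task (one owner at a time)."""

    def __init__(self, task_id: str, owner: str):
        self.task_id = task_id
        self.owner = owner

    def continue_on(self, target: str) -> None:
        # Continuation instruction: the target takes over from the current owner.
        print(f"{self.task_id}: {self.owner} -> {target}")
        self.owner = target

task = TaskOwnership("document_editing", owner="ar_glasses")
task.continue_on("laptop")      # user slides the task icon toward the laptop
task.continue_on("phone")       # second operation: laptop -> phone (S3103/S3104)
task.continue_on("ar_glasses")  # third operation: reverse continuation (S3301/S3302)
```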
It should be understood that each module in an electronic device (such as the AR device, the first electronic device, the second electronic device, or the third electronic device) can be implemented in the form of software and/or hardware; there is no specific limitation on this. In other words, the electronic device is presented in the form of functional modules. A "module" here may refer to an application-specific integrated circuit (ASIC), a circuit, a processor and memory executing one or more software or firmware programs, an integrated logic circuit, and/or other devices that can provide the above functions.
In an optional implementation, when software is used to implement data transmission, it may be implemented wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of this application are implemented in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrating one or more available media. The available media may be magnetic media (such as floppy disks, hard disks, or magnetic tape), optical media (such as digital video discs (DVD)), or semiconductor media (such as solid-state drives (SSD)).
The steps of the methods or algorithms described in conjunction with the embodiments of this application can be implemented in hardware, or by a processor executing software instructions. The software instructions can be composed of corresponding software modules, and the software modules can be stored in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art. An exemplary storage medium is coupled to the processor so that the processor can read information from, and write information to, the storage medium. Of course, the storage medium can also be an integral part of the processor. The processor and the storage medium may be located in an ASIC, and the ASIC may be located in an electronic device. Of course, the processor and the storage medium may also exist as discrete components in an electronic device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application discloses a task continuation method, device and system, relating to the field of terminal technology, which can provide users with a more convenient and easy-to-operate augmented reality (AR) interactive experience. In this application, while an AR device is performing an AR task, the AR device can, upon recognizing the user's continuation intention, provide task icons in the AR field of view for the user to select the AR task to be continued. Further, after receiving the user's operation of selecting an AR task and the user's operation of determining a target device, the AR device can instruct the target device (such as a first electronic device) to take over execution of the user-selected AR task from the AR device. With this method, the user can interact with highly operable and easy-to-use target devices such as mobile phones, tablets and personal computers in order to interact with AR objects in the AR field of view, providing a more convenient and easy-to-operate AR interactive experience.

Description

一种任务接续方法、设备及系统
本申请要求于2022年8月22日提交国家知识产权局、申请号为202211008771.6、申请名称为“一种任务接续方法、设备及系统”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及一种任务接续方法、设备及系统。
背景技术
随着增强现实(augmented reality,AR)技术的发展,AR技术可以广泛应用于人们工作生活的各个方面。AR设备可以基于AR技术为用户提供在真实世界中叠加显示虚拟画面的服务。其中,AR设备如AR眼镜、AR头盔、安装有AR应用的电子设备(如手机、平板等)。
为了为用户提供更加沉浸式的AR体验,通常AR设备还可以支持用户与AR视场中的AR物体交互。例如,用户可以通过手部与AR物体直接接触的方式、手部延长线指向AR物体的方式、语音交互方式或肢体动作交互方式(如头动交互方式、手势交互方式等)与AR物体交互。
但是,上述交互方式常常不能为用户带来良好的交互体验。例如,上述通过手部与AR物体直接接触的方式与通过手部延长线指向AR物体的方式需要用户手部在空中长时间停留,但是人手很难在没有支撑的情况下在空中停留过长时间,因此该方法不适合长时间、连续性的操作。又如,在一些情况下,尤其是在用户处于公共场合时,用户并不希望发出声音或者做出肢体动作,因此上述语音交互方式与肢体动作交互方式通常不被用户欢迎。
发明内容
本申请提供一种任务接续方法、设备及系统,可以为用户提供更加便捷、易操作的AR交互体验。
为达到上述目的,本申请实施例采用如下技术方案:
第一方面,提供一种任务接续方法,该方法包括:AR设备识别到满足预设条件时,在AR视场内显示AR设备正在执行的一个或多个AR任务的图标,该一个或多个AR任务的图标包括第一任务图标,第一任务图标对应于第一AR任务;响应于用户对第一任务图标的第一操作,AR设备指示第一电子设备接续AR设备执行第一AR任务。
其中,第一操作的意图是将第一任务由AR设备切换至第一电子设备。
示例性地,第一任务可以包括界面显示任务和/或音频播放任务。
上述第一方面提供的方案,在AR设备执行AR任务时,例如显示AR界面和/或播放AR音频时,可以在识别用户的接续意图时,在AR视场中提供任务图标供用户选择即将要接续的AR任务。进一步的,在接收到用户选择AR任务的操作以及用户确定目标设备的操作之后,AR设备可以指示目标设备(如第一电子设备)接续AR设备执行用户选择的AR任务。基于该方法,用户可以与手机、平板、个人电脑等操作性强且操作方便的目标设备交互,实现与AR视场中AR物体的交互,可以为用户提供更加便捷、易操作的AR交互体验。并且,假设手机、平板、个人电脑等设备相比于AR设备通常显示分辨率较高且不受外界环境(如光照、空间物体折射等)的影响,那么在目标设备接续AR设备执行用户选择的AR任务时,不会出现横纹或显示抖动等问题,因此显示效果更好。
在一种可能的实现方式中,上述AR设备识别到满足预设条件,包括:AR设备识别到满足预设条件的第一电子设备。
示例性地,上述AR设备识别到满足预设条件的第一电子设备,包括:AR设备接收到用户在AR设备的第一界面上选择第一电子设备的操作,其中第一界面上包括与AR设备建立有通信连接的多个电子设备的列表。
或者,示例性地,上述AR设备识别到满足预设条件的第一电子设备,包括:AR设备识别到位于AR设备预设范围内的第一电子设备。可以理解,通常,若用户有将AR设备上的AR任务接续至某一电子设备(如第一电子设备)的意图时,或者有通过某一电子设备(如第一电子设备)进行AR任务接续相关操作时,用户通常会将第一电子设备拿起。在这种情况下,第一电子设备便会落入AR设备的预设范围内。基于此,在本申请实施例中,AR设备可以根据是否识别到位于AR设备预设范围内,确定用户的任务接续意图。
在一种可能的实现方式中,上述第一电子设备位于AR设备的AR视场内。可以理解,若用户有将AR设备上的AR任务接续至第一电子设备的意图时,用户可以将第一电子设备拿起并落入AR设备的AR视场内。基于此,在本申请实施例中,AR设备可以根据是否识别到位于AR设备的AR视场内,确定用户的任务接续意图。
在一种可能的实现方式中,上述第一电子设备满足预设空间姿态。可以理解,若用户有将AR设备上的AR任务接续至第一电子设备的意图时,用户可以将第一电子设备拿起。其中,用户拿起第一电子设备时第一电子设备的空间姿态通常具备一定的规律,如满足被用户握持时的空间姿态。基于此,在本申请实施例中,AR设备可以根据第一电子设备是否满足预设空间姿态,确定用户的任务接续意图。
在一种可能的实现方式中,上述AR设备识别到位于AR设备预设范围内的第一电子设备,包括:AR设备基于自动定位技术,识别到位于AR设备预设范围内的第一电子设备;或者,AR设备根据捕获的真实世界的图像信息,识别到位于AR设备预设范围内的第一电子设备;AR设备根据第一电子设备的运动数据,识别到位于AR设备预设范围内的第一电子设备。本申请不限定识别位于AR设备预设范围内的电子设备的具体方法。
示例性地,AR设备可以基于自动定位技术如超声波定位技术,通过设置在其中的第一模块(如扬声器发)射超声波信号,通过设置在其中的第二模块(如麦克风)接收来自其它电子设备的超声波信号的回波信号,并根据发出信号的传输路径、接收信号的传输路径,结合第一模块和第二模块的相对位置关系,确定多个电子设备在环境中的具体位置,进而确定哪些电子设备位于AR设备预设范围内。
或者示例性地,AR设备可以根据摄像头捕获的真实世界的图像信息中所包括的关于电子设备的图像信息,进而确定哪些电子设备位于AR设备预设范围内。
或者示例性地,AR设备可以根据电子设备的速度、加速度、角速度和角加速度等运动数据确定电子设备的具体位置等,进而确定哪些电子设备位于AR设备预设范围内。
在一种可能的实现方式中,上述用户对第一任务图标的第一操作包括以下中的任一种:用户将第一任务图标滑向第一电子设备的操作;用户捏合第一任务图标拖拽至第一电子设备的操作;用户在第一任务图标为当前焦点图标时的预设操作;用户将第一任务图标滑向第一电子设备的虚拟图标的操作;用户将第一任务图标滑向超级终端界面上的第一电子设备图标的操作;用户通过辅助输入设备将第一任务图标拖拽至第一电子设备的操作。本申请提供的任务接续过程可以基于易于操作的方式,同时实现待切换任务选择并触发快速、便捷的目标设备选择。
在一种可能的实现方式中,上述方法还包括:AR设备根据用户的操作,切换当前焦点图标。示例性地,用户的操作可以是左右滑动操作、上下滑动操作等,本申请不做限定。基于此,用户可以连贯、快速、便捷地进行焦点图标切换。该设计对用户来说易于操作,用户体验度高。
在一种可能的实现方式中,在AR设备识别到满足预设条件之前,上述方法还包括:AR设备执行一个或多个AR任务,在AR视场中显示上述一个或多个AR任务对应的AR界面。基于此,AR设备可以通过AR视场为用户提供上述一个或多个AR任务对应的服务。
在一种可能的实现方式中,上述一个或多个AR任务对应的AR界面为一个或多个AR任务对应的任务卡片,其中任务卡片中包括对应AR任务的缩略信息。示例性地,一个或多个AR任务对应的任务卡片如包括当前正在播放的音乐的信息的音乐卡片、包括当前正在编辑的文档信息的文档卡片、包括用户在真实世界中的位置信息以及目的地的位置信息的地图卡片、包括用户当前的聊天对象以及聊天内容的聊天卡片、包括智能家居的设备信息和当前状态信息(如在线状态或者离线状态等)的智能家居卡片、包括备忘录事件的时间和具体事件信息的备忘录卡片等。
作为一种示例,一个任务卡片对应于运行在一个设备中的一个AR任务。
作为另一种示例,一个任务卡片对应于运行在多个设备中的多个AR任务。该任务卡片也称融合性卡片。
在一种可能的实现方式中,上述一个或多个AR任务的图标的排布方式与一个或多个AR任务对应的AR界面在AR视场中的排布方式一致。基于此,可以方便用户在快速、准确地确定想要接续的AR任务对应的任务图标的具体位置。
在一种可能的实现方式中,上述一个或多个AR任务对应的应用中至少一个运行在AR设备中;和/或,一个或多个AR任务对应的应用中至少一个运行在与AR设备建立有通信关系的一个或多个电子设备中。例如,上述一个或多个AR任务对应的应用均运行在AR设备中。又如,上述一个或多个AR任务对应的应用运行在其它一个或多个电子设备中。又如,上述一个或多个AR任务对应的应用分别运行在AR设备和其它一个或多个电子设备中。
在一种可能的实现方式中,上述在AR视场内显示AR设备正在执行的一个或多个AR任务的图标,包括:在第一电子设备的任务图标显示区显示上述一个或多个AR任务的图标;其中,任务图标显示区位于第一电子设备的上侧、下侧、左侧、右侧或者侧边,或者任务图标显示区位于第一电子设备下拉菜单栏或者锁屏界面。
在一种可能的实现方式中,上述一个或多个AR任务的图标的排布方式与运行一个或多个AR任务对应的应用的电子设备与AR设备的位置关系相关。
在一种可能的实现方式中,上述一个或多个AR任务的图标的排布方式包括以下任一种:一字型排布方式、双排排布方式、环形排布方式、自由排布方式。
在一种可能的实现方式中,上述方法还包括:第一电子设备根据AR设备的指示接续AR设备执行第一AR任务。基于此,用户可以与手机、平板、个人电脑等操作性强且操作方便的目标设备交互,实现与AR视场中AR物体的交互,可以为用户提供更加便捷、易操作的AR交互体验。并且,假设手机、平板、个人电脑等设备相比于AR设备通常显示分辨率较高且不受外界环境(如光照、空间物体折射等)的影响,那么在目标设备接续AR设备执行用户选择的AR任务时,不会出现横纹或显示抖动等问题,因此显示效果更好。
在一种可能的实现方式中,上述第一AR任务包括界面显示任务,第一电子设备根据AR设备的指示接续AR设备执行第一AR任务,包括:第一电子设备显示第一AR任务对应的界面;其中,第一AR任务对应的界面是第一AR任务对应的任务卡片,或者第一AR任务对应的界面包括第一AR任务对应的任务卡片中的界面元素。对于目标设备接续AR设备执行界面显示任务时,目标设备可以显示原第一AR任务对应的界面,也可以进行界面自适应,如界面尺寸自适应、界面布局自适应等,以获得更好的显示效果。
在一种可能的实现方式中,上述第一AR任务包括音频播放任务,第一电子设备根据AR设备的指示接续AR设备执行第一AR任务,包括:第一电子设备播第一AR任务对应的音频;或者,第一电子设备指示与第一电子设备连接的音频播放设备播放第一AR任务对应的音频。对于目标设备接续AR设备执行音频播放任务时,目标设备可以通过其自身的音频播放模块播放从AR设备接续来的音频,也可以通过其它音频播放外设(如音箱等)播放从AR设备接续来的音频。
在一种可能的实现方式中,在第一电子设备根据AR设备的指示接续AR设备执行第一AR任务之后,上述方法还包括:响应于用户对第一任务图标的第二操作,AR设备指示第二电子设备接续第一电子设备执行第一AR任务。基于此,可以实现AR任务在设备之间的自由接续。
在一种可能的实现方式中,在第一电子设备根据AR设备的指示接续AR设备执行第一AR任务之后,上述方法还包括:响应于用户对第一任务图标的第三操作,AR设备接续第一电子设备执行第一AR任务。基于此,可以实现AR任务在设备之间的自由接续。
第二方面,提供一种任务接续方法,该方法包括:AR设备识别到满足预设条件时,在AR视场内显示AR设备正在执行的一个或多个AR任务的图标,该一个或多个AR任务的图标包括第一任务图标,第一任务图标对应于第一AR任务;响应于用户对第一任务图标的第一操作,AR设备指示第一电子设备接续AR设备执行第一AR任务。
其中,第一操作的意图是将第一任务由AR设备切换至第一电子设备。
示例性地,第一任务可以包括界面显示任务和/或音频播放任务。
上述第二方面提供的方案,在AR设备执行AR任务时,例如显示AR界面和/或播放AR音频时,可以在识别用户的接续意图时,在AR视场中提供任务图标供用户选择即将要接续的AR任务。进一步的,在接收到用户选择AR任务的操作以及用户确定目标设备的操作之后,AR设备可以指示目标设备(如第一电子设备)接续AR设备执行用户选择的AR任务。基于该方法,用户可以与手机、平板、个人电脑等操作性强且操作方便的目标设备交互,实现与AR视场中AR物体的交互,可以为用户提供更加便捷、易操作的AR交互体验。并且,假设手机、平板、个人电脑等设备相比于AR设备通常显示分辨率较高且不受外界环境(如光照、空间物体折射等)的影响,那么在目标设备接续AR设备执行用户选择的AR任务时,不会出现横纹或显示抖动等问题,因此显示效果更好。
在一种可能的实现方式中,上述AR设备识别到满足预设条件,包括:AR设备识别到满足预设条件的第一电子设备。
示例性地,上述AR设备识别到满足预设条件的第一电子设备,包括:AR设备接收到用户在AR设备的第一界面上选择第一电子设备的操作,其中第一界面上包括与AR设备建立有通信连接的多个电子设备的列表。
或者,示例性地,上述AR设备识别到满足预设条件的第一电子设备,包括:AR设备识别到位于AR设备预设范围内的第一电子设备。可以理解,若用户有将AR设备上的AR任务接续至第一电子设备的意图时,用户可以将第一电子设备拿起并落入AR设备的预设范围内。基于此,在本申请实施例中,AR设备可以根据是否识别到位于AR设备的预设范围内,确定用户的任务接续意图。
在一种可能的实现方式中,上述第一电子设备位于AR设备的AR视场内。可以理解,若用户有将AR设备上的AR任务接续至第一电子设备的意图时,用户可以将第一电子设备拿起并落入AR设备的AR视场内。基于此,在本申请实施例中,AR设备可以根据是否识别到位于AR设备的AR视场内,确定用户的任务接续意图。
在一种可能的实现方式中,上述第一电子设备满足预设空间姿态。可以理解,若用户有将AR设备上的AR任务接续至第一电子设备的意图时,用户可以将第一电子设备拿起。其中,用户拿起第一电子设备时第一电子设备的空间姿态通常具备一定的规律,如满足被用户握持时的空间姿态。基于此,在本申请实施例中,AR设备可以根据第一电子设备是否满足预设空间姿态,确定用户的任务接续意图。
在一种可能的实现方式中,上述AR设备识别到位于AR设备预设范围内的第一电子设备,包括:AR设备基于自动定位技术,识别到位于AR设备预设范围内的第一电子设备;或者,AR设备根据捕获的真实世界的图像信息,识别到位于AR设备预设范围内的第一电子设备;AR设备根据第一电子设备的运动数据,识别到位于AR设备预设范围内的第一电子设备。本申请不限定识别位于AR设备预设范围内的电子设备的具体方法。
示例性地,AR设备可以基于自动定位技术如超声波定位技术,通过设置在其中的第一模块(如扬声器发)射超声波信号,通过设置在其中的第二模块(如麦克风)接收来自其它电子设备的超声波信号的回波信号,并根据发出信号的传输路径、接收信号的传输路径,结合第一模块和第二模块的相对位置关系,确定多个电子设备在环境中的具体位置,进而确定哪些电子设备位于AR设备预设范围内。
或者示例性地,AR设备可以根据摄像头捕获的真实世界的图像信息中所包括的关于电子设备的图像信息,进而确定哪些电子设备位于AR设备预设范围内。
或者示例性地,AR设备可以根据电子设备的速度、加速度、角速度和角加速度等运动数据确定电子设备的具体位置等,进而确定哪些电子设备位于AR设备预设范围内。
在一种可能的实现方式中,上述用户对第一任务图标的第一操作包括以下中的任一种:用户将第一任务图标滑向第一电子设备的操作;用户捏合第一任务图标拖拽至第一电子设备的操作;用户在第一任务图标为当前焦点图标时的预设操作;用户将第一任务图标滑向第一电子设备的虚拟图标的操作;用户将第一任务图标滑向超级终端界面上的第一电子设备图标的操作;用户通过 辅助输入设备将第一任务图标拖拽至第一电子设备的操作。本申请提供的任务接续过程可以基于易于操作的方式,同时实现待切换任务选择并触发快速、便捷的目标设备选择。
在一种可能的实现方式中,上述方法还包括:AR设备根据用户的操作,切换当前焦点图标。示例性地,用户的操作可以是左右滑动操作、上下滑动操作等,本申请不做限定。基于此,用户可以连贯、快速、便捷地进行焦点图标切换。该设计对用户来说易于操作,用户体验度高。
在一种可能的实现方式中,在AR设备识别到满足预设条件之前,上述方法还包括:AR设备执行一个或多个AR任务,在AR视场中显示上述一个或多个AR任务对应的AR界面。基于此,AR设备可以通过AR视场为用户提供上述一个或多个AR任务对应的服务。
在一种可能的实现方式中,上述一个或多个AR任务对应的AR界面为一个或多个AR任务对应的任务卡片,其中任务卡片中包括对应AR任务的缩略信息。示例性地,一个或多个AR任务对应的任务卡片如包括当前正在播放的音乐的信息的音乐卡片、包括当前正在编辑的文档信息的文档卡片、包括用户在真实世界中的位置信息以及目的地的位置信息的地图卡片、包括用户当前的聊天对象以及聊天内容的聊天卡片、包括智能家居的设备信息和当前状态信息(如在线状态或者离线状态等)的智能家居卡片、包括备忘录事件的时间和具体事件信息的备忘录卡片等。
作为一种示例,一个任务卡片对应于运行在一个设备中的一个AR任务。
作为另一种示例,一个任务卡片对应于运行在多个设备中的多个AR任务。该任务卡片也称融合性卡片。
在一种可能的实现方式中,上述一个或多个AR任务的图标的排布方式与一个或多个AR任务对应的AR界面在AR视场中的排布方式一致。基于此,可以方便用户在快速、准确地确定想要接续的AR任务对应的任务图标的具体位置。
在一种可能的实现方式中,上述一个或多个AR任务对应的应用中至少一个运行在AR设备中;和/或,一个或多个AR任务对应的应用中至少一个运行在与AR设备建立有通信关系的一个或多个电子设备中。例如,上述一个或多个AR任务对应的应用均运行在AR设备中。又如,上述一个或多个AR任务对应的应用运行在其它一个或多个电子设备中。又如,上述一个或多个AR任务对应的应用分别运行在AR设备和其它一个或多个电子设备中。
在一种可能的实现方式中,上述在AR视场内显示AR设备正在执行的一个或多个AR任务的图标,包括:在第一电子设备的任务图标显示区显示上述一个或多个AR任务的图标;其中,任务图标显示区位于第一电子设备的上侧、下侧、左侧、右侧或者侧边,或者任务图标显示区位于第一电子设备下拉菜单栏或者锁屏界面。
在一种可能的实现方式中,上述一个或多个AR任务的图标的排布方式与运行一个或多个AR任务对应的应用的电子设备与AR设备的位置关系相关。
在一种可能的实现方式中,上述一个或多个AR任务的图标的排布方式包括以下任一种:一字型排布方式、双排排布方式、环形排布方式、自由排布方式。
在一种可能的实现方式中,在第一电子设备根据AR设备的指示接续AR设备执行第一AR任务之后,上述方法还包括:响应于用户对第一任务图标的第二操作,AR设备指示第二电子设备接续第一电子设备执行第一AR任务。基于此,可以实现AR任务在设备之间的自由接续。
在一种可能的实现方式中,在第一电子设备根据AR设备的指示接续AR设备执行第一AR任务之后,上述方法还包括:响应于用户对第一任务图标的第三操作,AR设备接续第一电子设备执行第一AR任务。基于此,可以实现AR任务在设备之间的自由接续。
第三方面,提供一种任务接续方法,该方法包括:第一电子设备根据AR设备的指示接续AR设备执行第一AR任务。
上述第三方面提供的方案,用户可以与手机、平板、个人电脑等操作性强且操作方便的目标设备交互,实现与AR视场中AR物体的交互,可以为用户提供更加便捷、易操作的AR交互体验。并且,假设手机、平板、个人电脑等设备相比于AR设备通常显示分辨率较高且不受外界环境(如光照、空间物体折射等)的影响,那么在目标设备接续AR设备执行用户选择的AR任务时,不会出现横纹或显示抖动等问题,因此显示效果更好。
在一种可能的实现方式中,上述第一AR任务包括界面显示任务,第一电子设备根据AR设 备的指示接续AR设备执行第一AR任务,包括:第一电子设备显示第一AR任务对应的界面;其中,第一AR任务对应的界面是第一AR任务对应的任务卡片,或者第一AR任务对应的界面包括第一AR任务对应的任务卡片中的界面元素。对于目标设备接续AR设备执行界面显示任务时,目标设备可以显示原第一AR任务对应的界面,也可以进行界面自适应,如界面尺寸自适应、界面布局自适应等,以获得更好的显示效果。
在一种可能的实现方式中,上述第一AR任务包括音频播放任务,第一电子设备根据AR设备的指示接续AR设备执行第一AR任务,包括:第一电子设备播第一AR任务对应的音频;或者,第一电子设备指示与第一电子设备连接的音频播放设备播放第一AR任务对应的音频。对于目标设备接续AR设备执行音频播放任务时,目标设备可以通过其自身的音频播放模块播放从AR设备接续来的音频,也可以通过其它音频播放外设(如音箱等)播放从AR设备接续来的音频。
第四方面,提供一种AR设备,该AR设备包括:光学模组,用于在AR设备的视场内成像;存储器,用于存储计算机程序指令;处理器,用于执行所述指令,使得AR设备实现如第二方面任一种可能的实现方式中所述的方法。
第五方面,提供一种电子设备,该电子设备包括:存储器,用于存储计算机程序指令;处理器,用于执行所述指令,使得电子设备实现如第三方面任一种可能的实现方式中所述的方法。
第六方面,提供一种AR系统,该AR系统包括:如第四方面所述的AR设备,以及如第五方面所述的电子设备。
第七方面,提供一种计算机可读存储介质,该计算机可读存储介质上存储有计算机可读指令,该计算机可读指令被处理器执行时实现如第二方面或第三方面任一种可能的实现方式中的方法。
第八方面,提供一种芯片系统,该芯片系统包括处理器、存储器,存储器中存储有指令;所述指令被所述处理器执行时,实现如第二方面或第三方面任一种可能的实现方式中的方法。该芯片系统可以由芯片构成,也可以包含芯片和其他分立器件。
第九方面,提供一种计算机程序产品,包括计算机可读指令,当该计算机可读指令在计算机上运行时,使得实现如第二方面或第三方面任一种可能的实现方式中的方法。
附图说明
图1为本申请实施例提供的一种AR任务接续场景示意图;
图2为本申请实施例提供的另一种AR任务接续场景示意图;
图3为本申请实施例提供的一种电子设备的硬件结构示意图;
图4为本申请实施例提供的一种AR眼镜结构图;
图5为本申请实施例提供的一种AR技术应用于导航场景的显示效果示意图;
图6为本申请实施例提供的AR任务接续方法流程图一;
图7为本申请实施例提供的一种在AR视场中显示任务卡片的示意图;
图8为本申请实施例提供的一种用于表示用户任务接续意图的场景示意图;
图9为本申请实施例提供的在AR视场中显示任务图标的示意图一;
图10为本申请实施例提供的一种确定电子设备空间姿态的原理示意图;
图11为本申请实施例提供的在AR视场中显示任务图标的示意图二;
图12为本申请实施例提供的另外一种在AR视场中显示任务图标的示意图;
图13为本申请实施例提供的在AR视场中显示任务图标的示意图三;
图14为本申请实施例提供的在AR视场中显示任务图标的示意图四;
图15为本申请实施例提供的在AR视场中显示任务图标的示意图五;
图16为本申请实施例提供的在AR视场中显示任务图标的示意图六;
图17为本申请实施例提供的在AR视场中显示任务图标的示意图七;
图18为本申请实施例提供的在AR视场中显示任务图标的示意图八;
图19为本申请实施例提供的在AR视场中显示任务图标的示意图九;
图20A为本申请实施例提供的AR任务接续示例图一;
图20B为本申请实施例提供的AR任务接续示例图二;
图21为本申请实施例提供的AR任务接续示例图三;
图22为本申请实施例提供的AR任务接续示例图四;
图23为本申请实施例提供的AR任务接续示例图五;
图24A为本申请实施例提供的AR任务接续示例图六;
图24B为本申请实施例提供的AR任务接续示例图七;
图24C为本申请实施例提供的AR任务接续示例图八;
图24D为本申请实施例提供的AR任务接续示例图九;
图24E为本申请实施例提供的AR任务接续示例图十;
图25为本申请实施例提供的一种AR任务接续场景示意图;
图26为本申请实施例提供的AR任务接续示例图十一;
图27为本申请实施例提供的AR任务接续示例图十二;
图28为本申请实施例提供的AR任务接续示例图十三;
图29为本申请实施例提供的AR任务接续示例图十四;
图30A为本申请实施例提供的接续任务显示效果示意图;
图30B为本申请实施例提供的一种任务卡片暂存区示意图;
图30C为本申请实施例提供的另外两种任务卡片暂存区示意图;
图31为本申请实施例提供的AR任务接续方法流程图二;
图32为本申请实施例提供的AR任务接续方法流程图三;
图33为本申请实施例提供的AR任务接续方法流程图四。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;本文中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
为了为用户提供更加便捷、易操作的AR交互体验,本申请实施例提供一种任务接续方法,基于该方法,AR设备可以根据用户的实际需求,将AR视场里的AR物体接续到任意电子设备(以下称为目标设备)。在将AR视场里的AR物体接续到目标设备之后,目标设备可以作为AR视场对应的显示设备,向用户展示AR视场中的AR物体。进一步的,用户可以通过与目标设备交互,实现与AR视场中AR物体的交互。
可以理解,基于本申请实施例提供的一种任务接续方法,用户可以通过易操作且快捷的交互方式与手机、平板、个人电脑(personal computer,PC)等目标设备交互,实现与AR视场中AR物体的交互。例如,上述易操作且快捷的交互方式可以是点击(如单击、双击等)操作、滑动操作、长按操作、预设手势操作等。因此,基于本申请实施例提供的一种任务接续方法,可以为用户提供更加便捷、易操作的AR交互体验。
可以理解的是,AR只是本申请的一种应用场景,还可能是XR中的其他场景,例如混合现实(mixed reality,MR)和虚拟现实(virtual reality,VR),本申请对此不做限定。本申请以下实施例仅以AR场景下的AR任务接续作为示例。
进一步地,对于任务接续涉及AR界面接续的情况,基于本申请实施例提供的一种任务接续方法,AR设备可以借助其他显示设备显示AR视场中的物体,其中显示设备相比于AR设备通常不受外界环境(如光照、空间物体折射等)的影响,不会出现横纹或显示抖动等问题,因此显示效果更好。
进一步的,可以理解,对于AR设备为穿戴设备,如AR眼镜、AR头盔的情况,若仅依靠AR设备进行AR界面显示,则需要用户一直穿戴AR设备,长时间穿戴AR设备可能会让用户不舒服,还会影响用户的正常活动。而基于本申请实施例提供的一种任务接续方法,在AR界面和/或AR音频由AR设备接续至目标设备之后,在一些情况下,例如后续AR界面显示无需依靠AR设备进行图像采集的情况,用户可以脱下AR设备,用户体验会好很多。
其中,在本申请一些实施例中,AR任务接续可以是AR界面接续,如AR视场中的AR物体显示的接续。
在本申请另一些实施例中,AR任务接续也可以是AR音频的接续,如虚拟音频信号的接续。
在本申请另一些实施例中,AR任务接续也可以既包括AR界面接续,又包括AR音频的接续,本申请实施例不做具体限定。
作为一种示例,请参考图1,图1示出了本申请实施例提供的一种任务接续场景示例。如图1中S1所示,AR眼镜(即AR设备)可以根据用户的实际需求,将AR视场里的AR物体接续到图1所示手机(即目标设备)中。通过图1所示AR任务接续,一方面可以便于用户通过易于操控的方式通过与手机交互,实现与AR眼镜视场中AR物体的交互;另一方面可以通过显示效果好且不易受外界环境影响的手机代替AR眼镜进行AR界面显示,以提高AR界面的显示效果。
作为另一种示例,请参考图2,图2示出了本申请实施例提供的另一种AR任务接续场景示例。
如图2所示,AR眼镜(即AR设备)可以根据用户的实际需求,将AR视场里的AR物体接续到图2所示手机(即目标设备)中,以及将AR音频接续到图2所示音箱中。通过图2所示AR任务接续,一方面可以便于用户通过易于操控的方式通过与手机交互,实现与AR眼镜视场中AR物体的交互;另一方面可以通过显示效果好且不易受外界环境影响的手机代替AR眼镜进行AR界面显示,以提高AR界面的显示效果;再一方面可以通过音频播放效果好的音箱代替AR眼镜进行AR音频的播放。
可以理解,基于本申请实施例提供的任务接续方法,AR设备可以将AR界面和/或AR音频接续至目标设备。相反的,目标设备也可以根据用户的实际需求,如响应于用户的设备切换操作,将AR界面和/或AR音频由目标设备反接续回AR设备。如图1中S2a所示,手机可以响应于用户将AR界面和/或AR音频切换至AR设备的操作,将AR界面和/或AR音频由手机反接续回AR设备。
或者,可选的,目标设备也可以根据用户的实际需求,如响应于用户的设备切换操作,将AR界面和/或AR音频由目标设备接续至其它电子设备。如图1中S2b所示,手机可以响应于用户将AR界面和/或AR音频切换至笔记本电脑的操作,将AR界面和/或AR音频由手机接续至笔记本电脑。
其中,在本申请实施例中,AR设备具备提供AR信息展示的功能。示例性地,AR设备可以是AR眼镜、AR头盔、安装有AR应用的电子设备等。
例如,安装有AR应用的电子设备可以包括但不限于手机(如折叠屏手机,包括内折折叠屏手机和外折折叠屏手机)、上网本、平板电脑、车载设备、可穿戴设备(如智能手表、智能手环、智能眼镜等)、相机(如单反相机、卡片式相机等)、PC(包括台式电脑或者笔记本电脑)、掌上电脑、个人数字助理(personal digital assistant,PDA)、便携式多媒体播放器(portable multimedia player,PMP)、投影设备、智慧屏设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、混合现实(mixed reality,MR)设备、电视机或人机交互场景中的体感游戏机等。本申请对AR设备的具体功能和结构不做限定。
目标设备具备界面显示和/或音频播放功能。
例如,目标设备可以包括但不限于手机(如折叠屏手机)、上网本、平板电脑、车载设备、可穿戴设备(如智能手表、智能手环等)、相机(如单反相机、卡片式相机等)、PC(包括台式电脑或者笔记本电脑)、掌上电脑、PDA、PMP、投影设备、智慧屏设备、车载设备、AR/VR设备、MR设备、电视机或人机交互场景中的体感游戏机等具备界面显示和音频播放功能的电子设备。
又如,目标设备还可以是音箱、耳机等具备音频播放功能的电子设备。本申请对目标设备的 具体功能和结构不做限定。
作为一种示例,请参考图3,图3示出了本申请实施例提供的一种电子设备的硬件结构示意图。该电子设备可以是AR设备,也可以是目标设备。
如图3所示,电子设备可以包括处理器310,外部存储器接口320,内部存储器321,通用串行总线(universal serial bus,USB)接口330,充电管理模块340,电源管理模块341,电池342,天线1,天线2,移动通信模块350,无线通信模块360,音频模块370,扬声器370A,受话器370B,麦克风370C,耳机接口370D,传感器模块380,按键390,马达391,指示器392,摄像头393,显示屏394等。其中传感器模块380可以包括压力传感器380A,陀螺仪传感器380B,气压传感器380C,磁传感器380D,加速度传感器380E,距离传感器380F,接近光传感器380G,指纹传感器380H,温度传感器380J,触摸传感器380K,环境光传感器380L,骨传导传感器380M等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备的具体限定。在本申请另一些实施例中,电子设备可以包括比图示更多或更少的不见,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件、软件或软件和硬件的组合实现。
处理器310可以包括一个或多个处理单元,例如:处理器310可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processiong,GPU),图像信号处理器(image signal processor,ISP),音频处理器/数字处理器(the audio processor),控制器、存储器、视频编解码器、音频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器器,和/或神经网络出合理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备的神经中枢和指挥中心。控制器可以根据用户操作指令的操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。例如,处理器310中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器310中的存储器为高速缓冲存储器。该存储器可以保存处理器310刚用过或循环使用的指令或数据。如果处理器310需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器310的等待事件,因而提高了系统的效率。
在本申请实施例中,若电子设备是AR设备,处理器310可以用于根据AR输入信息进行虚实融合并输出AR输入信息(如AR界面和/或AR音频)。
在一些实施例中,处理器310可以包括一个或多个接口。接口可以包括集成电路(inter-intergrated circuit,I2C)接口,集成电路内置音频(inter-intergrated circuit sound,I2S)接口,脉冲编码调制(pluse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO),用户标识模块接口,和/或通用串行总线接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器310可以包含多组I2C总线。处理器310可以通过不同的I2C总线接口分别耦合触摸传感器380K,麦克风,摄像头393等。例如,处理器310可以通过I2C接口耦合触摸传感器380K,使处理器310与触摸传感器380K通过I2C总线接口通信,实现电子设备的触摸功能。
在本申请实施例中,处理器310可以通过I2C总线接口获取触摸传感器380K检测到的用户在界面上进行点击操作、长按操作、预设手势操作或者拖拽操作等触摸操作,从而确定触摸操作所对应的具体意图,进而响应该触摸操作,如选择待接续AR任务、选择目标设备等。
应理解,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备的结构限定。在本申请另一些实施例中,电子设备也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块340用于从充电器接收充电输入。电源管理模块341用于连接电池342,充电管理模块340与处理器310。电源管理模块341接收电池342和/或充电管理模块340的输入,为处理器310,内部存储器321,外部存储器,显示屏394,摄像头393,和无线通信模块363等供 电。
电子设备的无线通信功能可以通过天线1,天线2,移动通信模块350,无线通信模块360,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。移动通信模块350可以提供应用在电子设备上的包括2G/3G/4G/5G等无线通信的解决方案。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接受的电磁波信号解调为低频基带信号。无线通信模块360可以提供应用在电子设备上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,WI-FI)网络),蓝牙(bluetooth,BT),北斗卫星导航系统(BeiDou navigation satellite system,BDS),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。
电子设备通过图形处理器(graphics processing unit,GPU),显示屏394,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏394和应用处理器。GPU用于执行数据和几何计算,用于图形渲染。处理器310可以包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏394用于显示图像,视频等。显示屏394包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),量子点发光二极管(quantum dot light emitting diodes,QLED)等。
在本申请实施例中,若电子设备是AR眼镜,显示屏394可以是微型显示屏。
在本申请实施例中,GPU可以用于进行界面渲染。显示屏394可以用于显示界面。示例性地,上述界面可以包括但不限于应用界面(如浏览器界面、办公应用界面、邮箱界面、新闻应用界面、社交应用界面等)、功能界面、小程序界面等界面。
在本申请实施例中,若电子设备是AR设备,GPU进行界面渲染后通过显示屏394显示的界面上可以包括真实世界,还可以包括虚拟画面。若电子设备是目标设备,GPU可以用于根据用户选择的待接续AR任务渲染对应的AR界面(如任务卡片等),显示屏394可以用于显示上述AR界面。
电子设备可以通过图像信号处理器(image signal processor,ISP),摄像头393,视频编解码器,GPU,显示屏394以及应用处理器等实现拍摄功能。在本申请中,摄像头393可以包括电子设备的前置摄像头和后置摄像头,其可以是光学变焦镜头等,本申请对此不作限定。
在一些实施例中,ISP可以设置在摄像头393中,本申请对此不作限定。
摄像头393用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以使电荷耦合器件(charge couple device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的三原色(red green blue,RGB),YUV等格式的图像信号。
其中,在本申请实施例中,电子设备可以包括一个或多个摄像头393,比如至少一个前置摄像头和后置摄像头、多个前置摄像头或多个后置摄像头等。
在本申请实施例中,若电子设备是AR设备,电子设备通常包括多个摄像头,并且每个摄像头之间的分工是不一样的。例如,有的摄像头可以用于提供基于即时定位与地图构建(simultaneous localization and mapping,SLAM)的图像采集,有的摄像头用于进行交互手势识别,有的摄像头用于日常拍照和录像等。示例性地,在本申请实施例中,电子设备可以通过摄像头采集其视场范围内的真实环境信息。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其它数字信号。例如,当电子设备在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备可以支持一种或多种视频编解码器。这样,电子设备可以播放或录制多种编码格式的视频,例如,动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备的智能认知等应用,例如,图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口320可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备的存储能力。外部存储卡通过外部存储器接口320与处理器310通信,实现数据存储功能。例如将音频、视频等文件保存在外部存储卡中。
在本申请实施例中,若电子设备是AR设备,外部存储卡可以用于保存电子设备通过摄像头捕获的真实世界的图像信息,电子设备中显示的任务卡片的信息等。
内部存储器321可以用于存储计算机程序的可执行程序代码。示例性地,计算机程序可以包括操作系统程序和应用程序。操作系统可包括但不限于 OS等操作系统。其中,可执行程序代码包括指令。处理器310通过运行存储在内部存储器321的指令,从而执行电子设备的各种功能应用以及数据处理。内部存储器321可以包括存储程序区和存储数据。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序等。存储数据区可存储电子设备使用过程中所创建的数据(比如任务卡片等)等。此外,内部存储器321可以包括高速随机存取存储器,还可以包括非易失性存储器,例如,至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备可以通过音频模块370,扬声器370A,受话器370B,麦克风370C,耳机接口370D,以及应用处理器等实现音频功能。例如,音频播放,录音等。
音频模块370用于数字音频信息转换为模拟信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块370还可以用于对音频信号编码和解码。在一些实施例中,音频模块370可以设置于处理器310中,或将音频模块370的部分功能模块设置于处理器310中。
扬声器370A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备可以通过扬声器370A使用户收听音频,或收听免提通话等。在本申请实施例中,电子设备可以通过扬声器370A播放音频,例如AR音频,其中AR音频中可以包括真实世界中的声音,还可以包括虚拟的声音。
受话器370B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备接听电话或语音信息时,可以通过将受话器370B靠人耳接听语音。
麦克风370C,也称“话筒”、“传声器”,用于将声音信号转换成电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风370C发声,将声音信号输入到麦克风370C。在本申请中,电子设备可以设置至少两个麦克风370C,例如本机麦克风或者无线麦克风。在另一些实施例中,电子设备可以设置三个,四个或更多麦克风370C,实现采集声音信号,降噪等功能。在本申请实施例中,电子设备可以通过麦克风370C采集真实世界中的声音信号。
触摸传感器380K,也称“触控面板”。触摸传感器380K可以设置于显示屏394,由触摸传感器380K与显示屏394组成触摸屏,也称“触控屏”。触摸传感器380K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作(包括触摸位置、触摸力度、接触面积和触摸时长等信息)传递给处理器,以确定触摸事件类型。可以通过显示屏394提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器380K也可以设置于电子设备的表面,与显示屏394所处的位置不同。
在本申请实施例中,触摸传感器380K检测到的触摸操作可以是用户通过手指在触摸屏上或附近的操作,也可以是用户使用手写笔、触控笔、触控球等触控辅助工具在触摸屏上或附近的操作,本申请不做限定。
可以理解的是,本申请实施例示意的结构并不构成对电子设备的具体限定。在本申请另一些实施例中,电子设备可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
例如,若电子设备是手机,电子设备还可以包括用户标识模块(subscriber identification module,SIM)卡接口395。
又如,若电子设备是AR眼镜,如图4所示,电子设备除了包括安装在镜架上的摄像头、处理器(如CPU等)等(其中无线通信模块、传感器、显示屏(如微型显示屏)等其它模块未示出,可参考图3所示结构示意图),还可以包括光学模组。其中,光学模组主要用于负责AR眼镜的成像工作。
其中,光学模组包括光学组合器(optical combiner)、光波导/棱镜/自由曲面等。
可以理解,由于AR眼镜需要提供透视(see-through)功能,也就是说AR眼镜既要支持用户透过该AR眼镜看到真实环境,也要支持用户通过该AR眼镜看到虚拟信息。因此,AR眼镜的成像系统不能挡在用户视线前方。这就需要通过光学组合器(optical combiner)以“层叠”的形式,将虚拟信息和真实环境融为一体,互相补充,互相“增强。示例性地,光学组合器如曲面镜半反射Birdbath。
光波导、棱镜和自由曲面作为三种光学镜片组件,可以通过将微型显示屏幕发出的光线束进行反射、折射、衍射,最终投射到用户的视网膜上以形成图像。
其中,棱镜可以通过由两两相交但彼此均不平行的平面围成的透明物体对图像源的光束进行分光或使光束发生衍射后进入人眼,形成图像。
自由曲面可以通过不具有平移对称性或旋转对称性的光学曲面对图像源的光束进行折射后进入人眼,形成图像。
光波导可以是一片高折射率透明基底,光波导的基底侧边设置有特定耦入结构,该耦入结构可以耦合图像源的光束,耦合到的图像源的光束可以在光波导的基底内进行全反射传播,直至靠近人眼时通过特定耦出结构耦出,进入人眼,形成图像。示例性地,光波导如几何反射波导、衍射刻蚀光栅波导、全息光栅波导等。
关于AR眼镜的具体机构和各个模块的具体工作原理等相关介绍,可以参考常规技术,本申请实施例不做赘述。另外,图4仅作为一种AR眼镜结构示例,本申请对AR眼镜的具体结构不做限定。
需要说明的是,本申请实施例不限定任务接续方法可以适用的具体场景。示例性地,本申请实施例提供的一种任务接续方法可以适用于基于AR技术的办公场景、游戏场景、导航场景、医疗场景、教育场景等场景中。
以AR眼镜基于AR技术应用于导航场景为例,如图5所示,当用户戴上AR眼镜时,用户不仅可以看到真实世界中的建筑物、道路、车辆、树木等,还可以看到由AR眼镜在AR视场(AR眼镜的显示屏提供AR视场)中提供的虚拟信息,如图5所示导航路线、会议室入口提示标志。
以下将结合附图,以目标设备接续AR设备中的AR任务为例,对本申请实施例提供的一种任务接续方法进行具体介绍。
以AR任务为应用运行时的界面展示和/或音频播放任务为例,在一些实施例中,AR任务对应的应用可以安装并运行在AR设备中。
在另一些实施例中,AR任务对应的应用可以安装并运行在其它电子设备(如第一电子设备)中,但是该AR任务对应的AR界面由AR设备负责显示和/或该AR任务对应的AR音频由AR设备负责播放。AR设备可以将显示任务(即AR任务)接续至目标设备(如第一电子设备)中。
例如,AR任务对应的应用可以安装并运行在第一电子设备中,且由AR设备负责显示和/或播放,AR设备可以将该显示任务和/或音频播放任务接续至第一电子设备。又如,AR任务可以安装并运行在第一电子设备中,且由AR设备负责显示和/或播放,AR设备可以将该显示任务和/或音频播放任务接续至第二电子设备。又如,AR任务对应的应用可以安装并运行在第一电子设备中,且由AR设备负责显示和播放,AR设备可以将显示任务接续至第一电子设备,以及将音频播放任务接续至第二电子设备。
其中,AR设备与第一电子设备之间,以及AR设备与第二电子设备之间建立有通信连接。该通信连接遵循无线传输协议。该无线传输协议可以包含但不限于蓝牙(BT)传输协议或WiFi传输协议等。示例性的,该通信连接如蓝牙连接、点对点WiFi(peer to peer,P2P)连接等。
作为一种示例,在本申请实施例中,设备之间(如AR设备与第一电子设备之间、AR设备与第二电子设备之间等)可以通过“碰一碰”、“扫一扫”(如扫描二维码或条形码)、基于设备发现技术(如靠近自动发现、近场发现技术)等方式发现周围可连接的其它一个或多个设备,并进行鉴权配对以与其建立通信连接。关于设备之间建立通信连接的具体方法和具体过程,可以参考常规技术,本申请实施例不做赘述。
如图6所示,本申请实施例提供的AR任务接续方法可以包括S601-S605:
S601:AR设备执行第一AR任务。
示例性地,第一AR任务可以是界面显示任务和/或AR音频播放任务。对应的,AR设备执行第一AR任务即AR设备显示第一AR界面和/或播放第一AR音频。
其中,第一AR任务可以是应用对应的AR任务、功能模块对应的AR任务、小程序对应的AR任务等,本申请不做限定。
可以理解,在本申请一些实施例中,AR设备在执行AR任务时,可以在AR视场中显示对应的任务界面(即AR界面)。示例性地,任务界面(即AR界面)可以是任务卡片的形式。其中,一个任务卡片包括对应AR任务的缩略信息。
示例性地,AR设备可以在AR视场中提供图7所示多个任务卡片。其中,图7所示卡片1为音乐卡片,卡片1中包括当前正在播放的音乐的信息;卡片2为文档卡片,卡片2包括当前正在编辑的文档信息;卡片3为地图卡片,卡片3包括用户在真实世界中的位置信息以及目的地的位置信息;卡片4为聊天卡片,卡片4包括用户当前的聊天对象以及聊天内容;卡片5为智能家居卡片,卡片5包括智能家居的设备信息和当前状态信息(如在线状态或者离线状态等);卡片6为备忘录卡片,卡片6包括备忘录事件的时间和具体事件信息。
在一些实施例中,第一AR任务对应的应用(包括快应用、轻应用等)/功能模块/小程序等可以运行在AR设备中。示例性地,AR设备中可以运行有地图应用,在AR设备运行该地图应用时,AR设备显示地图界面(即AR界面)和/或播放导航音频(即AR音频)。例如,上述地图界面(即AR界面)上不仅可以包括真实世界中的建筑物、道路、车辆、树木等事物,以及用户在真实世界中的位置,还可以包括导航路线、提示标志等虚拟信息,如图5所示。
在另一些实施例中,第一AR任务对应的应用(包括快应用、轻应用等)/功能模块/小程序等可以运行在除AR设备以外的其它电子设备中,且由AR设备提供AR界面显示/AR音频播放服务。示例性地,手机中可以运行有地图应用,手机与AR设备之间建立有通信连接,在手机运行该地图应用时,手机可以将地图界面(即AR界面)和/或播放导航音频(即AR音频)发送至AR设备,以通过AR设备向用户提供相应界面显示和/或音频播放服务。其中,上述地图界面(即AR界面)上不仅可以包括真实世界中的建筑物、道路、车辆、树木等事物,以及用户在真实世界中的位置,还可以包括导航路线、提示标志等虚拟信息,如图5所示。对应的,AR设备可以根据来自手机的信息执行相应的AR任务,即显示地图界面(即AR界面)和/或播放导航音频(即AR音频)。
其中,对于AR任务对应的应用运行在其它电子设备中的情况,作为一种可能的示例,运行应用的电子设备可以直接生成对应的任务卡片并发送给AR设备。作为另一种可能的示例,运行应用的电子设备可以向AR设备发送相关进程的界面配置信息,由AR设备自行生成对应的任务卡片,例如AR设备可以调用其卡片生成模块,以根据上述界面配置信息生成任务卡片。
对于第一电子设备和第二电子设备中均运行有AR设备在AR视场中提供的某一任务图标对应的应用的情况,作为一种可能的实现方式,AR设备可以在AR视场中分别提供与第一电子设备中运行的应用和第二电子设备中运行的应用对应的任务卡片。例如,假设第一电子设备和第二电子设备上均运行有备忘录应用,其中第一电子设备上的备忘录应用中备注有备忘录事件1和备忘录事件2,第二电子设备上的备忘录应用中备注有备忘录事件3,AR设备可以在AR视场中分别提供卡片A和卡片B,其中卡片A中包括备忘录事件1和备忘录事件2、卡片B中包括备忘录事件3。
作为另一种可能的实现方式,AR设备可以综合第一电子设备和第二电子设备对中运行的应用,在AR视场中提供一个任务卡片,该任务卡片也称融合性卡片。例如,假设第一电子设备和第二 电子设备上均运行有备忘录应用,其中第一电子设备上的备忘录应用中备注有备忘录事件1和备忘录事件2,第二电子设备上的备忘录应用中备注有备忘录事件3,AR设备可以在AR视场中提供图7所示卡片6,其中卡片6中既包括备忘录事件1、备忘录事件2,又包括备忘录事件3。
S602:AR设备识别到满足预设条件。
作为一种可能的实现方法,AR设备可以根据用户在AR设备上的设备选择操作,确定满足预设条件,其中该预设条件用于表征用户的接续意图。示例性地,用户在AR设备上的设备选择操作如用户在第一界面上选择第一电子设备的操作,其中第一界面上包括与AR设备建立有通信连接关系的多个电子设备的列表,该列表中包括第一电子设备。
作为另一种可能的实现方式,AR设备可以根据识别到的用户的预设语音指示,确定满足预设条件。
为了提供更加智能化的用户接续意图确定,作为另一种可能的实现方式,AR设备识别到满足预设条件可以是指AR设备识别到电子设备(如第一电子设备)位于AR设备预设范围内。示例性地,AR设备可以通过以下方式1-方式3中的一项或多项确定电子设备(如第一电子设备)位于AR设备预设范围内:
方式1:AR设备可以基于自动定位技术,获取位于AR设备预设范围内的电子设备(如第一电子设备)。
示例性地,上述自动定位技术如超声波定位技术。AR设备可以基于超声波定位技术,通过设置在其中的扬声器发射超声波信号,通过设置在其中的麦克风接收来自其它电子设备的超声波信号的回波信号。进一步的,AR设备可以基于三角定位技术,根据发出信号的传输路径、接收信号的传输路径,结合扬声器和麦克风的相对位置关系,确定多个电子设备在环境中的具体位置,进而确定哪些电子设备位于AR设备预设范围内。若第一电子设备位于AR设备预设范围内,则AR设备可以确定用户有跨设备AR任务接续的需求,且目标设备为该第一电子设备。
方式2:AR设备可以根据捕获的真实世界的图像信息中所包括的关于电子设备的图像信息,确定位于AR设备预设范围内的电子设备。
示例性地,AR设备可以通过摄像头采集真实世界的图像信息,若该图像信息中包括第一电子设备的图像信息,AR设备可以通过分析该图像信息确定第一电子设备的空间位置,进而判断第一电子设备是否位于AR设备预设范围内的电子设备。若AR设备根据该第一电子设备的图像信息确定第一电子设备位于AR设备预设范围内,则AR设备可以确定用户有跨设备AR任务接续的需求,且目标设备为该第一电子设备。
方式3:AR设备可以根据其它电子设备的运动数据确定电子设备的空间位置,进而根据电子设备的空间位置确定位于AR设备预设范围内的电子设备。
示例性地,电子设备的运动数据可以由电子设备通过运动传感器(如加速度传感器、陀螺仪传感器等)在电子设备运动的过程中测量得到。其中,电子设备的运动数据如电子设备的速度、加速度、角速度和角加速度等。
优选的,为了更加准确的进行用户接续意图判断,作为另一种可能的实现方式,AR设备识别到满足预设条件可以是指AR设备识别到电子设备(如第一电子设备)位于AR设备预设范围内,且电子设备处于预设空间姿态。示例性地,AR设备可以通过以下方式A和/或方式B确定电子设备位于AR设备预设范围内,且电子设备处于预设空间姿态:
方式A:AR设备可以根据捕获的真实世界的图像信息中所包括的关于电子设备的图像信息,确定位于AR设备预设范围内且处于预设空间姿态的电子设备。
可以理解,通常,若用户有将AR设备上的AR任务接续至某一电子设备(如第一电子设备)的意图时,或者有通过某一电子设备(如第一电子设备)进行AR任务接续相关操作时,用户通常会将第一电子设备拿起(如图8所示)。在这种情况下,第一电子设备便会落入AR设备的预设范围内,AR设备的摄像头也便可以捕获第一电子设备的图像信息。基于此,在本申请实施例中,AR设备可以根据摄像头捕获的真实世界的图像信息中所包括的关于第一电子设备的图像信息,确定第一电子设备位于AR设备预设范围内。以及,AR设备可以根据摄像头捕获的真实世界的图像信息中所包括的关于第一电子设备的图像信息,对位于AR设备预设范围内的第一电子设备进行 空间姿态识别,进而判断第一电子设备的空间姿态是否满足预设空间姿态。若AR设备根据该第一电子设备的图像信息确定第一电子设备的空间姿态满足预设空间姿态,则AR设备可以确定用户有跨设备AR任务接续的需求,且目标设备为该第一电子设备。
可选的,第一电子设备位于AR设备的预设范围内如第一电子设备位于AR设备的AR视场范围内。
示例性地,AR设备可以通过分析摄像头采集的第一电子设备的图像信息,获取第一电子设备的空间姿态。如图8所示,假设AR设备是AR眼镜,AR眼镜通过摄像头采集真实世界的图像信息,经过图像信息分析,AR眼镜确定手机(即第一电子设备)位于AR眼镜预设范围内且手机(即第一电子设备)的空间姿态(如图8所示空间姿态)为预设空间姿态,对于这种情况,AR眼镜可以确定用户有跨设备AR任务接续的需求,且目标设备为手机。
方式B:AR设备可以根据其它电子设备的运动数据确定电子设备的空间位姿(包括空间位置以及空间姿态),进而确定位于AR设备预设范围内且处于预设空间姿态的电子设备。
可以理解,若用户有将AR设备上的AR任务接续至某一电子设备(如第一电子设备)的意图时,或者有通过某一电子设备(如第一电子设备)进行AR任务接续相关操作时,用户通常会将第一电子设备拿起(如图8所示)。在这个过程中,第一电子设备的运动传感器会采集第一电子设备的运动数据。基于此,在本申请实施例中,AR设备可以根据第一电子设备的运动数据确定第一电子设备位于AR设备预设范围内。以及,AR设备可以根据第一电子设备的运动数据对位于AR设备预设范围内的第一电子设备进行空间姿态识别,进而判断第一电子设备的空间姿态是否满足预设空间姿态。若AR设备根据该第一电子设备的运动数据确定第一电子设备的空间姿态满足预设空间姿态,则AR设备可以确定用户有跨设备AR任务接续的需求,且目标设备为该第一电子设备。
在一种可能的实现方式中,电子设备的运动数据可以是电子设备运动过程中(如被用户拿起至图8所示状态过程中)的加速度数据和角速度数据。可选地,在x,y,z三维坐标系中,电子设备的运动数据可以用三轴角速度ωx,ωy,ωz和三轴加速度ax,ay,az表示。其中,三轴角速度可以理解为电子设备围绕x,y,z三个轴的角速度,三轴加速度可以理解为电子设备在x,y,z三个轴上的加速度。
示例性,如图10所示,电子设备在x,y,z三维坐标系中的运动可以包括三个平移运动和三个旋转运动,其中,三个平移运动包括电子设备在x轴上进行的向左、向右平移运动,在y轴上进行的向前、向后平移运动以及在z轴上进行的向上、向下平移运动。三个旋转运动包括电子设备围绕x轴的旋转运动(旋转的角度也称俯仰角Pitch),围绕y轴的旋转运动(旋转的角度也称横滚角Roll),围绕z轴的旋转运动(旋转的角度也称偏航角Yaw)。
可以理解,电子设备的运动包括在x,y,z三个轴上各自的平移运动时,在x,y,z三个轴上各自会对应一个加速度即ax,ay,az。电子设备的运动包括围绕x,y,z三个轴上各自的旋转运动时,在x,y,z三个轴上各自会对应一个角速度即ωx,ωy,ωz。由于电子设备在三维空间中的运动可以分解为在x,y,z三个轴上的平移和/或运动。因此,电子设备在运动过程中产生的位姿变化,可以通过电子设备在x,y,z三个轴上的三轴角速度ωx,ωy,ωz和三轴加速度ax,ay,az来表示。
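A hedged sketch of turning such motion data into a posture check: integrate the gyroscope samples to track approximate pitch/roll and compare against a preset "held by the user" posture window. The thresholds and names are illustrative assumptions:

```python
import math

def integrate_orientation(samples, dt):
    """Euler-integrate body-rate gyro samples (wx, wy, wz in rad/s) into
    approximate pitch/roll angles; good enough for a coarse posture check."""
    pitch = roll = 0.0
    for wx, wy, _wz in samples:
        pitch += wx * dt   # rotation about x accumulates as pitch
        roll += wy * dt    # rotation about y accumulates as roll
    return math.degrees(pitch), math.degrees(roll)

def matches_held_posture(pitch_deg, roll_deg,
                         pitch_range=(20.0, 70.0), roll_limit=15.0):
    # Preset spatial posture: tilted toward the user's face, roughly level.
    lo, hi = pitch_range
    return lo <= pitch_deg <= hi and abs(roll_deg) <= roll_limit

# 0.5 s of the phone being raised: ~1 rad/s pitch-up, negligible roll.
gyro = [(1.0, 0.02, 0.0)] * 50
pitch, roll = integrate_orientation(gyro, dt=0.01)
print(matches_held_posture(pitch, roll))  # pitch ~28.6 deg -> True
```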
需要说明的是,上述基于自动定位技术和通过分析摄像头采集的图像信息确定满足预设条件的方法仅作为示例,AR设备还可以基于其它方法识别用户的接续意图,本申请实施例不做限定。
S603:AR设备在AR视场中提供一个或多个任务图标,该一个或多个任务图标包括第一任务图标。
其中,AR设备在AR视场中提供的一个或多个任务图标是AR设备正在执行的一个或多个AR任务对应的图标。AR设备在AR视场中提供该一个或多个任务图标用于方便用户进行接续任务选择以及目标设备确定。AR设备在AR视场中提供的一个或多个任务图标分别与任务界面(即AR界面)对应。
其中,作为一种示例,AR设备可以通过在AR视场中、第一电子设备附近(如预设范围内)显示虚拟任务图标的方式提供一个或多个任务图标。
示例性地,假设AR设备是AR眼镜,第一电子设备是手机,如图9所示,在识别用户的接续意图之后,AR眼镜可以在AR眼镜的AR视场中、AR眼镜识别到的手机的上侧显示多个任务图标(如图9所示图标1、图标2、图标3、图标4、图标5和图标6)。
在一些实施例中,AR设备在AR视场中提供的一个或多个任务图标包括运行在AR设备中的应用/功能模块/小程序等对应的任务图标。
在另一些实施例中,AR设备在AR视场中提供的一个或多个任务图标包括运行在与AR设备之间建立有通信连接的其它电子设备(如第一电子设备和第二电子设备)上的应用/功能模块/小程序等对应的任务图标。
需要说明的是,图8仅作为一种AR设备展示任务图标的方式示例,本申请实施例不限定任务图标与第一电子设备的相对位置。
例如,如图11中的(a)、图11中的(b)和图11中的(c)所示,任务图标显示区还可以位于手机(即第一电子设备)的左侧/右侧/下侧。AR设备可以在AR眼镜的AR视场中、AR眼镜识别到的第一电子设备的左侧/右侧/下侧显示多个任务图标。
又如,如图12所示,AR设备还可以通过第一电子设备的下拉菜单栏显示多个任务图标,即任务图标显示区位于第一电子设备的下拉菜单栏。
又如,如图13所示,AR设备还可以通过第一电子设备的锁屏界面显示多个任务图标,即任务图标显示区位于第一电子设备的锁屏界面。
又如,任务图标显示区还可以位于AR设备的侧边。AR设备可以在第一电子设备的侧边显示多个任务图标。如图14所示,假设第一电子设备为折叠屏设备,任务图标显示区位于AR设备的折叠侧边。
另外,需要说明的是,图9仅作为一种任务图标排布方式示例,本申请实施例也不限定任务图标的具体排布规则。例如,AR设备除了以图9所示一字型排布方式显示多个任务图标之外,还可以以图15中的(b)所示双排排布方式,或以环形排布方式、自由排布方式等方式显示多个任务图标。
又如,对于任务图标较多的情况,AR设备还可以以隐藏显示的形式(如图16所示以省略号的形式)显示其中一个或多个任务图标。如图16所示,用户可以通过手指滑动等操作,控制隐藏显示的任务图标正常显示。
又如,AR设备还可以根据多个任务卡片的相对位置在AR视场中展示多个任务卡片所对应的任务图标。基于此,可以方便用户在快速、准确地确定想要接续的AR任务对应的任务图标的具体位置。示例性地,如图15中的(b)所示,假设AR设备在AR视场中提供图15中的(a)所示多个任务卡片,AR设备在进行任务图标展示时,可以按照图15中的(a)所示卡片1(即音乐卡片)、卡片2(即文档卡片)、卡片3(即地图卡片)、卡片4(即聊天卡片)、卡片5(即智能家居卡片)和卡片6(即备忘录卡片)的相对位置展示其对应的任务图标1、图标2、图标3、图标4、图标5和图标6。
又如,AR设备还可以根据AR设备的朝向(如空间姿态)显示对应位置的任务卡片所对应的任务图标。示例性地,假设AR设备在AR视场中提供图17中的(a)所示多个任务卡片,如图17中的(b)所示,若AR设备朝向左侧,在进行任务图标展示时,AR设备可以展示左侧卡片1和卡片4所对应的任务图标。;若AR设备朝向右侧,在进行任务图标展示时,AR设备可以展示右侧卡片3和卡片6所对应的任务图标;若AR设备朝向正前方,在进行任务图标展示时,AR设备可以展示中间卡片2和卡片5所对应的任务图标。
又如,对于多个AR任务对应的应用分别运行在多个设备中的情况,AR设备还可以根据应用所运行的设备之间的相对位置关系排布对应的任务图标。示例性地,假设图标1和图标2对应的应用运行在第二电子设备中,图标3和图标4对应的应用运行在AR设备中,图标5和图标6对应的应用运行在第三电子设备中,第二电子设备、第三电子设备和AR设备的位置关系如图18中的(a)所示,其中第二电子设备位于AR设备左侧,第三电子设备位于AR设备右侧,如图18中的(b)所示,AR设备在进行任务图标展示时,可以在任务图标显示区域的左侧显示第二电子设备中运行的应用的图标1和图标2,在任务图标显示区域的中间显示AR设备中运行的应用的图 标3和图标4,在任务图标显示区域的右侧显示第三电子设备中运行的应用的图标5和图标6。
或者,对于多个AR任务对应的应用分别运行在多个设备中的情况,AR设备还可以根据AR设备的朝向(如空间姿态)显示运行在AR设备朝向的电子设备中的应用的任务图标。示例性地,假设图标1和图标2对应的应用运行在第二电子设备中,图标3和图标4对应的应用运行在AR设备中,图标5和图标6对应的应用运行在第三电子设备中,第二电子设备、第三电子设备和AR设备的位置关系如图19中的(a)所示,其中第二电子设备位于AR设备左侧,第三电子设备位于AR设备右侧,如图19中的(b)所示,假设AR设备朝向第二电子设备,AR设备在进行任务图标展示时,可以在任务图标显示区域的左侧显示第二电子设备中运行的应用的图标1和图标2;假设AR设备朝向第三电子设备,AR设备在进行任务图标展示时,可以在任务图标显示区域的左侧显示第三电子设备中运行的应用的图标5和图标6;假设AR设备朝向正前方,AR设备在进行任务图标展示时,可以在任务图标显示区域的左侧显示AR设备中运行的应用的图标3和图标4。
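A hedged sketch of the orientation-dependent icon arrangement described above: show the icon group of whichever device the AR glasses currently face. The names, bearings and tolerance are illustrative assumptions:

```python
def icons_for_heading(heading_deg, device_icons, tolerance_deg=30.0):
    """Pick the icon group of the device the AR glasses are facing.

    device_icons maps a device's bearing (degrees, 0 = straight ahead,
    negative = left) to the task icons of the apps running on it.
    """
    for bearing, icons in device_icons.items():
        if abs(heading_deg - bearing) <= tolerance_deg:
            return icons
    return []

device_icons = {
    -60.0: ["icon1", "icon2"],  # second electronic device, to the left
    0.0:   ["icon3", "icon4"],  # the AR device itself, straight ahead
    60.0:  ["icon5", "icon6"],  # third electronic device, to the right
}
print(icons_for_heading(-55.0, device_icons))  # facing left -> icon1, icon2
```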
S604:响应于用户将第一AR任务由AR设备切换至第一电子设备的操作,AR设备向第一电子设备发送AR任务接续指示。
其中,AR任务接续指示用于指示第一电子设备接续AR设备执行AR设备当前正在执行的第一AR任务。示例性地,假设AR设备当前正在执行的第一AR任务是显示AR界面,则AR任务接续指示用于指示第一电子设备接续AR设备显示上述AR界面。又如,假设AR设备当前正在执行的第一AR任务是播放AR音频,则AR任务接续指示用于指示第一电子设备接续AR设备播放上述AR音频。
本申请实施例提供的任务接续方法可以支持多种形式的用于将第一AR任务由AR设备切换至第一电子设备的操作。以下通过几个示例做具体介绍:
(1)、作为一种示例,在本申请实施例中,用户将第一AR任务由AR设备切换至第一电子设备的操作可以是用户在AR设备提供的一个或多个任务图标中将第一AR任务所对应的任务图标滑向第一电子设备的操作。
例如,以图20A中的(a)所示AR视场为例,假设第一AR任务为图标4所对应的AR任务,用户将第一AR任务由AR设备切换至第一电子设备的操作如用户将图标4向手机(即第一电子设备)滑动的操作。
可以理解,基于类似图20A中的(a)所示将图标4向目标设备滑动的操作,用户可以同时实现快速、便捷地待接续任务选择和目标设备选择。该设计对用户来说易于操作,用户体验度高。
需要说明的是,图20A仅以AR设备始终在AR视场中提供多个任务图标作为示例。在本申请另一些实施例中,AR设备还可以在完成AR任务接续之后,响应于图21中的(a)所示用户将图标4向手机(即第一电子设备)滑动的操作,不继续显示任务图标,如图21所示。
(2)、作为另一种示例,在本申请实施例中,用户将第一AR任务由AR设备切换至第一电子设备的操作可以是用户在AR设备提供的一个或多个任务图标中捏合第一AR任务所对应的任务图标向第一电子设备拖拽的操作。
例如,以图20B中的(a)所示AR视场为例,假设第一AR任务为图标4所对应的AR任务,用户将第一AR任务由AR设备切换至第一电子设备的操作如用户捏合图标4并持续向手机(即第一电子设备)拖拽的操作。
可以理解,基于类似图20B中的(a)所示捏合图标4并持续向目标设备拖拽的操作,用户可以同时实现快速、便捷地待接续任务选择和目标设备选择。该设计对用户来说易于操作,用户体验度高。
(3)、作为另一种示例,在本申请实施例中,用户将第一AR任务由AR设备切换至第一电子设备的操作可以是用户在当前焦点图标为第一AR任务对应的任务图标(即第一任务图标)时的预设操作。
示例性地,预设操作如预设滑动操作、点击(如单击、双击等)操作、长按操作等,本申请实施例不做限定。
其中,当前焦点图标可以由用户通过操作切换。该操作可以是左右滑动操作、上下滑动操作等,本申请实施例不做限定。
例如,以图22中的(a)所示AR视场为例,在当前焦点图标为图标3时,在接收到用户在手机(即第一电子设备)左侧边缘从上向下滑动的操作2201之后,如图22中的(b)所示,AR设备将当前焦点图标由图标3切换为图标4。用户将第一AR任务由AR设备切换至第一电子设备的操作如图22中的(b)所示用户在当前焦点图标为图标4(即第一AR任务对应的任务图标)时,从手机(即第一电子设备)左侧边缘向右滑动的操作2202。
又如,以图23中的(a)所示AR视场为例,在当前焦点图标为图标3时,在接收到用户在手机(即第一电子设备)上边缘从右向左滑动的操作2301之后,如图23中的(b)所示,AR设备将当前焦点图标由图标3切换为图标4。用户将第一AR任务由AR设备切换至第一电子设备的操作如图23中的(b)所示用户在当前焦点图标为图标4(即第一AR任务对应的任务图标)时,从手机(即第一电子设备)上边缘向下滑动的操作2302。
可以理解,基于类似图22所示连续的操作2201和操作2202,或者图23所示连续的操作2301和操作2302,用户可以连贯、快速、便捷地进行焦点图标切换以及任务接续。该设计对用户来说易于操作,用户体验度高。
(4)、作为另一种示例,在本申请实施例中,用户将第一AR任务由AR设备切换至第一电子设备的操作可以是用户在多个候选设备选项中选择第一电子设备的操作。
例如,AR设备可以在接收到用户对第一AR任务对应的任务图标的预设手势后,展示多个候选设备的虚拟图标供用户从中选择即将接续AR任务的目标设备。进一步的,AR设备可以在接收到用户保持上述预设手势的同时将第一AR任务对应的任务图标拖拽至某一候选设备(如第一电子设备)的虚拟图标的操作时,确定目标设备为该候选设备(即第一电子设备)。用户将第一AR任务由AR设备切换至第一电子设备的操作即上述保持预设手势直至拖拽任务图标的连续性动作。示例性地,预设手势如捏合手势、长按手势等。上述多个候选设备均与AR设备之间建立有通信连接。
如图24A中的(a)所示,AR设备检测用户在AR视场中捏合图标4(即第一AR任务对应的任务图标)的操作。进一步的,响应于用户在AR视场中捏合图标4(即第一AR任务对应的任务图标)的操作,AR设备显示图24A中的(b)所示AR视场,其中包括平板电脑、手机、笔记本电脑等多个候选设备的虚拟图标。进一步的,响应于接收到用户保持上述捏合手势将图标4向手机(即第一电子设备)的虚拟图标拖拽的操作,AR设备确定目标设备是手机(即第一电子设备),以及确定用户的意图是将图标4对应的AR任务(即第一AR任务)接续至手机(即第一电子设备)。
可选的,在一些实施例中,在接收到用户对第一AR任务对应的任务图标的预设手势后,AR设备展示的多个虚拟图标所对应的多个候选设备可以与第一AR任务相关。示例性地,若第一AR任务包括界面显示任务,该多个候选设备可以包括具有显示屏的设备,如手机、平板电脑等;若第一AR任务包括音频播放任务,该多个候选设备可以包括具有音频能力的设备,如音箱等。
可选的,在一些实施例中,如图24B中的(b)所示,响应于接收到如24B中的(a)所示用户在AR视场中捏合图标4(即第一AR任务对应的任务图标)的操作,AR设备显示图24B中的(b)所示AR视场,其中包括平板电脑、手机、笔记本电脑等多个候选设备。其中,如图24B中的(b)所示,除图标4以外的其他无关图标消失。进一步的,响应于接收到用户保持上述捏合手势将图标4向手机(即第一电子设备)拖拽的操作,AR设备确定目标设备是手机(即第一电子设备),以及确定用户的意图是将图标4对应的AR任务(即第一AR任务)接续至手机(即第一电子设备)。
可以理解,AR设备通过根据用户操作(如图24A中的(a)或图24B中的(a)所述用户在AR视场中捏合图标4的操作)显示多个候选设备,以便用户可以快速、便捷地进行目标设备选择。该设计对用户来说易于操作,且更加直观,用户体验度高。
(5)、作为另一种示例,在本申请实施例中,用户将第一AR任务由AR设备切换至第一电子设备的操作可以是用户在超级终端界面上选择第一电子设备的操作。其中,超级终端界面上包括可以用于进行任务协同或者任务接续的多个候选设备。
示例性地,响应于接收到图24C中的(a)所示用户捏合图标4并持续向手机拖拽的操作, AR设备可以向手机发送进入AR拖拽状态的指示信息。在手机根据该指示信息进入AR拖拽状态时,手机可以显示超级终端界面,其中超级终端界面上包括平板电脑、手机、笔记本电脑等多个候选设备的图标。进一步的,响应于用户在如图24C中的(b)所示超级终端界面上选择手机图标的操作,AR设备确定目标设备是手机(即第一电子设备),以及确定用户的意图是将图标4对应的AR任务(即第一AR任务)接续至手机(即第一电子设备)。
示例性地,上述超级终端界面上包括的多个候选设备可以是手机和/或AR设备基于定位技术(如自动定位技术)识别的,还可以是手机和/或AR设备通过摄像头捕获的,还可以是综合上述两种方式确定的,本申请实施例不做限定。其中,定位技术如超声波定位技术,本申请不做限定。
其中,图24C仅以用户选择的目标设备为显示超级终端界面的设备(即手机)为例。实际上,用户可以在超级终端界面选择任意设备作为目标设备。例如,假设用户在超级终端界面上选择平板电脑作为目标设备,则AR设备根据用户的选择向平板电脑发送AR任务接续指示,以指示平板电脑接续AR设备执行AR设备当前正在执行,且用户选择的待接续的第一AR任务。
可以理解,电子设备通过根据用户操作(如图24C中的(a)所示用户捏合图标4并持续向手机拖拽的操作)显示包括有多个候选设备的超级终端界面,以便用户可以快速、便捷地进行目标设备选择。该设计对用户来说易于操作,且更加直观,用户体验度高。
(6)、作为另一种示例,在本申请实施例中,用户将第一AR任务由AR设备切换至第一电子设备的操作可以是用户通过辅助输入设备(如鼠标、手写笔、触控笔、触控球、键盘、头目等)在AR视场中选择将第一AR任务切换至第一电子设备的操作。
示例性地,如图24D中的(a)所示,假设AR视场中包括笔记本电脑以及显示在笔记本电脑上侧的多个任务图标(如图24D所示图标1、图标2、图标3、图标4、图标5和图标6),响应于用户使用与笔记本电脑连接的鼠标将图标4向笔记本电脑拖拽的操作,手机显示图24D中的(b)所示超级终端界面,其中超级终端界面上包括平板电脑、手机、笔记本电脑等多个候选设备。进一步的,响应于用户在如图24D中的(b)所示超级终端界面上选择手机的操作,AR设备确定目标设备是手机(即第一电子设备),以及确定用户的意图是将图标4对应的AR任务(即第一AR任务)接续至手机(即第一电子设备)。
示例性地,如图24E中的(a)所示,假设AR视场中包括笔记本电脑、手机以及显示在笔记本电脑上侧的多个任务图标(如图24E所示图标1、图标2、图标3、图标4、图标5和图标6),响应于用户使用与笔记本电脑连接的鼠标将图标4向手机拖拽的操作,AR设备确定目标设备是手机(即第一电子设备),以及确定用户的意图是将图标4对应的AR任务(即第一AR任务)接续至手机(即第一电子设备)。
其中,需要说明的是,在图24D中的(a)或图24E中的(a)所示用户使用与笔记本电脑连接的鼠标拖拽图标4的过程中,若鼠标光标位置不在笔记本电脑屏幕上,则该鼠标光标是AR设备自行确定并以虚拟信息的形式显示在AR视场中的。示例性地,AR设备可以根据笔记本电脑发送的鼠标光标的运动轨迹和用户拖拽鼠标移动的操作确定笔记本电脑范围之外的鼠标光标的显示位置。或者,AR设备可以根据通过摄像头获取的鼠标光标的运动轨迹和用户拖拽鼠标移动的信息确定笔记本电脑范围之外的鼠标光标的显示位置。本申请实施例不限定处于笔记本电脑范围之外的鼠标光标显示位置的具体方法。
可以理解,AR设备通过提供鼠标光标显示功能,以便用户可以通过鼠标直接拖拽任务图标的方式进行待接续任务选择和目标设备选择,以便用户可以快速、便捷地进行目标设备选择。该设计对用户来说易于操作,且更加直观,用户体验度高。
其中,在本申请一些实施例中,例如对于第一AR任务对应的应用未运行在第一电子设备中的情况,AR任务接续指示中可以携带有第一AR任务的相关信息,用于第一电子设备进行AR任务对应的AR界面显示和/或AR音频播放。示例性地,AR任务接续指示中携带有第一AR任务的任务卡片信息。
在本申请另一些实施例中,例如对于第一AR任务对应的应用运行在第一电子设备中的情况,AR任务接续指示中可以携带有第一AR任务对应的应用的标识(如应用ID),用于第一电子设备接续AR设备的第一AR任务,显示与第一AR任务对应的应用界面和/或播放AR音频。
可选的,对于第一AR任务对应的应用运行在第一电子设备中的情况,AR任务接续指示中也可以携带有第一AR任务的相关信息,用于第一电子设备进行AR任务对应的AR界面显示和/或AR音频播放。示例性地,AR任务接续指示中携带有第一AR任务的任务卡片信息。可选的,第一AR任务的任务卡片可能是融合性卡片。
S605:第一电子设备根据AR任务接续指示接续AR设备执行第一AR任务。
示例性地,假设AR设备当前正在执行的第一AR任务是显示AR界面,则第一电子设备可以根据AR任务接续指示接续AR设备显示上述AR界面。又如,假设AR设备当前正在执行的第一AR任务是播放AR音频,则第一电子设备可以根据AR任务接续指示接续AR设备播放上述AR音频。又如,假设AR设备当前正在执行的第一AR任务是显示AR界面且播放AR音频,则第一电子设备可以根据AR任务接续指示接续AR设备显示上述AR界面以及播放上述AR音频。
示例性地,第一电子设备根据AR任务接续指示接续AR设备执行第一AR任务如图20A中的(b)、图20B中的(b)、图21中的(b)、图22中的(c)、图23中的(c)、图24A中的(c)、图24B中的(c)、图24C中的(c)、图24D中的(c)或者图24E中的(b)所示手机(即第一电子设备)根据AR任务接续指示接续AR设备显示图标4对应的聊天任务(即第一AR任务)界面。
可选的,在一些实施例中,假设第一AR任务是显示AR界面且播放AR音频,第一电子设备与音频播放设备(如音箱等)建立有通信连接(如蓝牙连接、WiFi P2P连接等),如图25所示,第一电子设备(如图25所示手机)可以根据AR设备的AR任务接续指示接续AR设备显示第一AR任务对应的AR界面(如图25所示音乐界面),并通过音频播放设备(如图25所示音箱)播放第一AR任务对应的AR音频(如图25所示音乐1)。
在一些实施例中,AR任务接续指示中携带有第一AR任务的相关信息,例如携带有第一AR任务的任务卡片信息,对于这种情况,第一电子设备可以根据AR任务接续指示中携带的第一AR任务的任务卡片信息显示第一AR任务的AR界面。
在另一些实施例中,AR任务接续指示中携带有第一AR任务对应的应用的标识(如应用ID),且第一电子设备中运行有第一AR任务对应的应用,对于这种情况,第一电子设备可以根据运行在其中的第一AR任务对应的应用进行界面显示。
需要说明的是,在第一电子设备接续AR设备显示AR界面时,该AR界面可以是第一AR任务对应的AR界面,例如可以是第一AR任务对应的AR卡片(如图20A中的(b)、图20B中的(b)、图21中的(b)、图22中的(c)、图23中的(c)、图24A中的(c)、图24B中的(c)、图24C中的(c)、图24D中的(c)或者图24E中的(b)所示)。
或者,在第一电子设备接续AR设备显示AR界面时,该AR界面可以与第一AR任务对应的AR界面略有不同。例如,该AR界面可以包括与第一AR任务对应的AR卡片相同的界面元素,但是界面元素的布局不同。其中,第一电子设备显示的与第一AR任务对应的应用界面更加适配于第一电子设备的显示屏尺寸。示例性地,第一电子设备可以从本地获取相关进程的界面配置信息,进而显示对应的应用界面(即第一AR任务的AR界面)。关于设备根据应用运行时的相关界面配置信息显示应用界面的具体方法和过程,可以参考常规技术,这里不做赘述。
需要说明的是,上述图20A中的(b)、图20B中的(b)、图21中的(b)、图22中的(c)、图23中的(c)、图24A中的(c)、图24B中的(c)、图24C中的(c)、图24D中的(c)或者图24E中的(b)所示示例仅以第一电子设备全屏显示第一AR任务对应的AR界面座位示例。
可选的,基于本申请实施例提供的方法中,第一电子设备在接续AR设备显示第一AR任务对应的AR界面时,还可以以非全屏窗口的方式显示该AR界面(如图26所示)。
或者,可选的,假设第一电子设备正在显示其它界面(如第一界面),第一电子设备在接续AR设备显示第一AR任务对应的AR界面时,还可以以第一AR任务对应的AR界面与第一电子设备正在显示其它界面(即第一界面)分屏显示的形式显示该AR界面。如图27所示,假设手机(即第一电子设备)正在显示智能家居应用界面(即第一界面),手机在接续AR设备显示聊天界面(即第一AR任务对应的AR界面)时,可以分屏显示智能家居应用界面和聊天界面。
需要说明的是,图27中的手机可以是单屏手机,也可以是处于折叠状态的折叠屏手机。作为 一种可能的情况,如图28所示,若第一电子设备(如图28中的手机)为处于展开状态的折叠屏手机,且折叠屏手机的第一屏上显示有智能家居应用界面(即第一界面),折叠屏手机的第二屏上显示有手机桌面,折叠屏手机在接续AR设备显示聊天界面(即第一AR任务对应的AR界面)时,可以在第二屏上显示聊天界面(即第一AR任务对应的AR界面)。
或者,若第一电子设备(如图29中的手机)为处与折叠状态的折叠屏手机,且折叠屏手机的屏幕(记作第一屏)上显示有智能家居应用界面(即第一界面),AR设备的AR视场中可以显示折叠屏手机的第二屏的虚拟屏幕,折叠屏手机在接续AR设备显示聊天界面(即第一AR任务对应的AR界面)时,可以在虚拟屏幕上显示聊天界面(即第一AR任务对应的AR界面),真实屏幕(即第一屏)上继续显示智能家居应用界面(即第一界面)。
可选的,进一步的,如图30A所示,在折叠屏手机通过虚拟屏幕显示聊天界面(即第一AR任务对应的AR界面),通过真实屏幕显示智能家居应用界面(即第一界面)时,响应于折叠屏手机由折叠状态切换为展开状态(如用户展开折叠屏手机的操作),折叠屏手机的屏幕展开,折叠屏手机包括第一屏和第二屏。以及,折叠屏手机通过第一屏显示智能家居应用界面(即第一界面),通过第二屏显示智能家居应用界面(即第一界面)。
需要说明的是,本申请上述实施例仅以用户直接对AR设备提供的任务图标操作,以向AR设备指示待接续AR任务和目标设备作为示例。在本申请另一些实施例中,AR设备还可以提供根据用户的操作将AR任务对应的任务卡片暂存(或者Pin)至某一固定位置(也称暂存区)的功能,以方便后续用户能针对该AR任务在多个电子设备之间任意接续。其中,在AR设备将任务卡片暂存(或者Pin)至某一固定位置之后,该任务卡片可以始终显示在该固定位置。
示例性地,如图30B所示,用于暂存(或者Pin)任务卡片的固定位置可以位于AR设备预设范围内,或者位于AR设备预设范围内且满足预设空间姿态的手机附近的预设位置区域(也称暂存区)。
或者,如图30C所示,用于暂存(或者Pin)任务卡片的固定位置可以在位于AR设备预设范围内,或者位于AR设备预设范围内且满足预设空间姿态的电子设备上的预设位置区域(也称暂存区)(如图30C所示手机的侧边栏、图30C所示手机的屏幕右下角)。
或者,在本申请另一些实施例中,AR设备还可以提供根据用户的操作将AR任务对应的任务图标暂存(或者Pin)至某一固定位置的功能,以方便后续用户能针对该AR任务在多个电子设备之间任意接续。其中,在AR设备将任务图标暂存(或者Pin)至某一固定位置之后,该任务图标可以始终显示在该固定位置。
可以理解,基于本申请实施例提供的一种任务接续方法,在AR设备执行AR任务时,例如显示AR界面和/或播放AR音频时,可以在识别用户的接续意图时,在AR视场中提供任务图标供用户选择即将要接续的AR任务。进一步的,在接收到用户选择AR任务的操作以及用户确定目标设备的操作之后,AR设备可以指示目标设备接续AR设备执行用户选择的AR任务。基于该方法,用户可以与手机、平板、个人电脑等操作性强且操作方便的目标设备交互,实现与AR视场中AR物体的交互,可以为用户提供更加便捷、易操作的AR交互体验。并且,假设手机、平板、个人电脑等设备相比于AR设备通常显示分辨率较高且不受外界环境(如光照、空间物体折射等)的影响,那么在目标设备接续AR设备执行用户选择的AR任务时,不会出现横纹或显示抖动等问题,因此显示效果更好。
以及,在本申请上述实施例中,对于目标设备是手机的情况,基于本申请实施例提供的一种任务接续方法,用户可以在任意场景下,通过随身携带的操作性强且操作方便的手机,实现与AR视场中AR物体的交互,这对于用户来说非常方便。
另外,基于本申请实施例提供的一种任务接续方法,在目标设备接续AR设备执行用户选择的AR任务之后,在一些情况下,例如后续AR界面显示无需依靠AR设备进行图像采集的情况,用户可以脱下AR设备,用户体验会好很多。
需要说明的是,本申请上述实施例主要以第一电子设备接续AR设备显示AR界面和/或播放AR音频为例,介绍本申请实施例提供的一种任务接续方法。对于AR任务既包括AR界面显示,又包括AR音频播放的情况,在一些实施例中,还可以由第一电子设备接续AR设备显示AR界面, 由其它电子设备(如第四电子设备)接续AR设备播放AR音频。
作为一种示例,第四电子设备可以由用户选择确定。
作为另一种示例,第四电子设备可以由用户选择的目标设备(如第一电子设备)自行确定。例如,第四电子设备可以是与第一电子设备建立有通信连接的音频播放设备(如图25所示音箱)。
可选的,在第一电子设备接续AR设备执行第一AR任务之后,进一步的,AR设备还可以根据用户选择将第一AR任务由第一电子设备切换至其它电子设备(如第二电子设备)的操作,向第一电子设备发送AR任务接续指示,以指示第二电子设备接续第一电子设备执行第一电子设备当前正在执行的第一AR任务。
示例性地,如图31所示,在第一电子设备根据AR任务接续指示接续AR设备执行第一AR任务(即S605)之后,本申请实施例提供的方法还包括S3101-S3104:
S3101:AR设备识别第二电子设备满足预设条件。
S3102:AR设备在AR视场中提供一个或多个任务图标,该一个或多个任务图标包括第一任务图标。
S3103:响应于用户将第一AR任务由第一电子设备切换至第二电子设备的操作(如第二操作),AR设备向第二电子设备发送AR任务接续指示。
S3104:第二电子设备根据AR任务接续指示接续第一电子设备执行第一AR任务。
其中,关于S3101、S3102、S3103和S3104的具体过程,可以分别参考上文中对S602、S603、S604和S605的介绍。
需要说明的是,图31所示AR任务接续方法仅以在第一电子设备根据AR任务接续指示接续AR设备执行第一AR任务之后,AR设备中不再继续提供任务图标作为示例。对于AR设备始终在AR视场中提供多个任务图标的情况,如图32所示,在第一电子设备根据AR任务接续指示接续AR设备执行第一AR任务(即S605)之后,本申请实施例提供的方法还包括S3103和S3104。
或者,可选的,在第一电子设备接续AR设备执行第一AR任务之后,进一步的,AR设备还可以根据用户选择将第一AR任务由第一电子设备反接续回AR设备的操作,向第一电子设备发送AR任务反接续指示,以将第一AR任务由第一电子设备反接续回AR设备。如图33所示,在第一电子设备根据AR任务接续指示接续AR设备执行第一AR任务(即S605)之后,本申请实施例提供的方法还包括S3301和S3302:
S3301:响应于用户将第一AR任务由第一电子设备切换至AR设备的操作(如第三操作),AR设备向第一电子设备发送AR任务反接续指示。
S3302:AR设备接续第一电子设备执行第一AR任务。
其中,关于S3301和S3302的具体过程,可以分别参考上文中对S604和S605的介绍。
可以理解,基于本申请实施例提供的任务接续方法,AR设备可以根据用户的指示,实现AR任务在多个电子设备之间的任意切换,例如由一个电子设备(如第一电子设备)切换至另一个电子设备(如第二电子设备),又如由目标设备(如第一电子设备)切换回AR设备。该方法可以支持用户在办公场景、游戏场景、导航场景、医疗场景、教育场景等场景中根据实际需求实现多设备协同。
以办公场景为例,示例性地,在AR眼镜(即AR设备)执行第一AR任务时,若AR眼镜(即AR设备)识别位于AR视场中的笔记本电脑位于AR设备预设范围内,或者笔记本电脑满足预设空间姿态时,AR眼镜(即AR设备)确定用户的接续意图。在AR眼镜(即AR设备)确定用户的接续意图后,AR眼镜(即AR设备)在AR视场中提供一个或多个任务图标供用户选择待接续的AR任务,该一个或多个任务图标包括第一任务图标。进一步的,响应于用户将第一AR任务由AR眼镜(即AR设备)切换至笔记本电脑(即第一电子设备)的操作,AR眼镜(即AR设备)向笔记本电脑(即第一电子设备)发送AR任务接续指示,以指示笔记本电脑(即第一电子设备)接续AR眼镜(即AR设备)执行第一AR任务。
进一步的,在笔记本电脑(即第一电子设备)接续AR眼镜(即AR设备)执行第一AR任务的过程中,响应于用户将第一AR任务由笔记本电脑(即第一电子设备)切换至手机(即第二电子设备)的操作,AR眼镜(即AR设备)向手机(即第二电子设备)发送AR任务接续指示,以 指示手机(即第二电子设备)接续笔记本电脑(即第一电子设备)执行第一AR任务。
应理解,电子设备(如AR设备、第一电子设备、第二电子设备、第三电子设备等)中的各个模块可以通过软件和/或硬件形式实现,对此不作具体限定。换言之,智能跳绳/电子设备是以功能模块的形式来呈现。这里的“模块”可以指特定应用集成电路ASIC、电路、执行一个或多个软件或固件程序的处理器和存储器、集成逻辑电路,和/或其它可以提供上述功能的器件。
在一种可选的方式中,当使用软件实现数据传输时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地实现本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其它可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线((digital subscriber line,DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如软盘、硬盘、磁带)、光介质(例如数字化视频光盘(digital video disk,DVD))、或者半导体介质(例如固态硬盘solid state disk(SSD))等。
结合本申请实施例所描述的方法或者算法的步骤可以硬件的方式来实现,也可以是由处理器执行软件指令的方式来实现。软件指令可以由相应的软件模块组成,软件模块可以被存放于RAM存储器、闪存、ROM存储器、EPROM存储器、EEPROM存储器、寄存器、硬盘、移动硬盘、CD-ROM或者本领域熟知的任何其它形式的存储介质中。一种示例性地存储介质耦合至处理器,从而使处理器能够从该存储介质读取信息,且可向该存储介质写入信息。当然,存储介质也可以是处理器的组成部分。处理器和存储介质可以位于ASIC中。另外,该ASIC可以位于电子设备中。当然,处理器和存储介质也可以作为分立组件存在于电子设备中。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。

Claims (24)

  1. 一种任务接续方法,其特征在于,所述方法包括:
    增强现实AR设备识别到满足预设条件时,在AR视场内显示所述AR设备正在执行的一个或多个AR任务的图标,所述一个或多个AR任务的图标包括第一任务图标,所述第一任务图标对应于第一AR任务;
    响应于用户对所述第一任务图标的第一操作,所述AR设备指示第一电子设备接续所述AR设备执行所述第一AR任务。
  2. 根据权利要求1所述的方法,其特征在于,所述AR设备识别到满足预设条件,包括:
    所述AR设备识别到满足预设条件的第一电子设备。
  3. 根据权利要求2所述的方法,其特征在于,所述AR设备识别到满足预设条件的第一电子设备,包括:
    所述AR设备识别到位于所述AR设备预设范围内的所述第一电子设备;或者,
    所述AR设备接收到用户在所述AR设备的第一界面上选择所述第一电子设备的操作,其中所述第一界面上包括与所述AR设备建立有通信连接的多个电子设备的列表。
  4. 根据权利要求3所述的方法,其特征在于,所述第一电子设备位于所述AR视场内。
  5. 根据权利要求4所述的方法,其特征在于,所述第一电子设备满足预设空间姿态。
  6. 根据权利要求3-5中任一项所述的方法,其特征在于,所述AR设备识别到位于所述AR设备预设范围内的所述第一电子设备,包括:
    所述AR设备基于自动定位技术,识别到位于所述AR设备预设范围内的所述第一电子设备;或者,
    所述AR设备根据捕获的真实世界的图像信息,识别到位于所述AR设备预设范围内的所述第一电子设备;
    所述AR设备根据所述第一电子设备的运动数据,识别到位于所述AR设备预设范围内的所述第一电子设备。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,所述用户对所述第一任务图标的第一操作包括以下中的任一种:
    用户将所述第一任务图标滑向所述第一电子设备的操作;
    用户捏合所述第一任务图标拖拽至所述第一电子设备的操作;
    用户在所述第一任务图标为当前焦点图标时的预设操作;
    用户将所述第一任务图标滑向所述第一电子设备的虚拟图标的操作;
    用户将所述第一任务图标滑向超级终端界面上的第一电子设备图标的操作;
    用户通过辅助输入设备将所述第一任务图标拖拽至所述第一电子设备的操作。
  8. 根据权利要求7所述的方法,其特征在于,所述方法还包括:
    所述AR设备根据用户的操作,切换当前焦点图标。
  9. 根据权利要求1-8中任一项所述的方法,其特征在于,在所述AR设备识别到满足所述预设条件之前,所述方法还包括:
    所述AR设备执行所述一个或多个AR任务,在所述AR视场中显示所述一个或多个AR任务对应的AR界面。
  10. 根据权利要求9所述的方法,其特征在于,所述一个或多个AR任务对应的AR界面为所述一个或多个AR任务对应的任务卡片,其中任务卡片中包括对应AR任务的缩略信息。
  11. 根据权利要求9或10所述的方法,其特征在于,所述一个或多个AR任务的图标的排布方式与所述一个或多个AR任务对应的AR界面在所述AR视场中的排布方式一致。
  12. 根据权利要求1-11中任一项所述的方法,其特征在于,
    所述一个或多个AR任务对应的应用中至少一个运行在所述AR设备中;和/或,
    所述一个或多个AR任务对应的应用中至少一个运行在与所述AR设备建立有通信关系的一个或多个电子设备中。
  13. 根据权利要求1-12中任一项所述的方法,其特征在于,所述在AR视场内显示所述AR设备正在执行的一个或多个AR任务的图标,包括:
    在所述第一电子设备的任务图标显示区显示所述一个或多个AR任务的图标;
    其中,所述任务图标显示区位于所述第一电子设备的上侧、下侧、左侧、右侧或者侧边,或者所述任务图标显示区位于所述第一电子设备下拉菜单栏或者锁屏界面。
  14. 根据权利要求13所述的方法,其特征在于,
    所述一个或多个AR任务的图标的排布方式与运行所述一个或多个AR任务对应的应用的电子设备与所述AR设备的位置关系相关。
  15. 根据权利要求13所述的方法,其特征在于,
    所述一个或多个AR任务的图标的排布方式包括以下任一种:一字型排布方式、双排排布方式、环形排布方式、自由排布方式。
  16. 根据权利要求1-15中任一项所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备根据所述AR设备的指示接续所述AR设备执行所述第一AR任务。
  17. 根据权利要求16所述的方法,其特征在于,所述第一AR任务包括界面显示任务,所述第一电子设备根据所述AR设备的指示接续所述AR设备执行所述第一AR任务,包括:
    所述第一电子设备显示所述第一AR任务对应的界面;
    其中,所述第一AR任务对应的界面是所述第一AR任务对应的任务卡片,或者所述第一AR任务对应的界面包括所述第一AR任务对应的任务卡片中的界面元素。
  18. 根据权利要求16或17所述的方法,其特征在于,所述第一AR任务包括音频播放任务,所述第一电子设备根据所述AR设备的指示接续所述AR设备执行所述第一AR任务,包括:
    所述第一电子设备播放所述第一AR任务对应的音频;或者,
    所述第一电子设备指示与所述第一电子设备连接的音频播放设备播放所述第一AR任务对应的音频。
  19. 根据权利要求16-18中任一项所述的方法,其特征在于,在所述第一电子设备根据所述AR设备的指示接续所述AR设备执行所述第一AR任务之后,所述方法还包括:
    响应于用户对所述第一任务图标的第二操作,所述AR设备指示第二电子设备接续所述第一电子设备执行所述第一AR任务。
  20. 根据权利要求16-18中任一项所述的方法,其特征在于,在所述第一电子设备根据所述AR设备的指示接续所述AR设备执行所述第一AR任务之后,所述方法还包括:
    响应于用户对所述第一任务图标的第三操作,所述AR设备接续所述第一电子设备执行所述第一AR任务。
  21. 一种AR设备,其特征在于,所述AR设备包括:
    光学模组,用于在所述AR设备的视场内成像;
    存储器,用于存储计算机程序指令;
    处理器,用于执行所述指令,使得所述AR设备执行如权利要求1-15任一项所述的方法。
  22. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质上存储有计算机可读指令,所述计算机可读指令被处理电路执行时实现如权利要求1-20中任一项所述的方法。
  23. 一种芯片系统,其特征在于,所述芯片系统包括处理电路、存储介质,所述存储介质中存储有指令;所述指令被所述处理电路执行时,实现如权利要求1-20中任一项所述的方法。
  24. 一种计算机程序产品,其特征在于,所述计算机程序产品包括计算机可读指令,所述计算机可读指令被执行时,以实现如权利要求1-20中任一项所述的方法。
PCT/CN2023/113299 2022-08-22 2023-08-16 一种任务接续方法、设备及系统 WO2024041429A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211008771.6A CN117667433A (zh) 2022-08-22 2022-08-22 一种任务接续方法、设备及系统
CN202211008771.6 2022-08-22

Publications (1)

Publication Number Publication Date
WO2024041429A1 true WO2024041429A1 (zh) 2024-02-29

Family

ID=90012554

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/113299 WO2024041429A1 (zh) 2022-08-22 2023-08-16 一种任务接续方法、设备及系统

Country Status (2)

Country Link
CN (1) CN117667433A (zh)
WO (1) WO2024041429A1 (zh)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180121432A1 (en) * 2016-11-02 2018-05-03 Microsoft Technology Licensing, Llc Digital assistant integration with music services
CN108628449A (zh) * 2018-04-24 2018-10-09 北京小米移动软件有限公司 设备控制方法、装置、电子设备及计算机可读存储介质
CN114924682A (zh) * 2019-10-24 2022-08-19 华为终端有限公司 一种内容接续方法及电子设备
CN114706664A (zh) * 2020-01-08 2022-07-05 华为技术有限公司 跨设备任务处理的交互方法、电子设备及存储介质

Also Published As

Publication number Publication date
CN117667433A (zh) 2024-03-08

Similar Documents

Publication Publication Date Title
CN109917956B (zh) 一种控制屏幕显示的方法和电子设备
WO2021057830A1 (zh) 一种信息处理方法及电子设备
WO2021017836A1 (zh) 控制大屏设备显示的方法、移动终端及第一系统
WO2021244443A1 (zh) 分屏显示方法、电子设备及计算机可读存储介质
US11954200B2 (en) Control information processing method and apparatus, electronic device, and storage medium
EP3299933B1 (en) Method for displaying a navigator associated with content and electronic device for implementing the same
WO2023010940A1 (zh) 分屏显示方法和装置
CN109922356B (zh) 视频推荐方法、装置和计算机可读存储介质
CN112527174B (zh) 一种信息处理方法及电子设备
WO2021115103A1 (zh) 显示控制方法和终端设备
CN112835445B (zh) 虚拟现实场景中的交互方法、装置及系统
CN112527222A (zh) 一种信息处理方法及电子设备
JP2018032440A (ja) 制御可能なヘッドセットコンピュータディスプレイ
US20230119849A1 (en) Three-dimensional interface control method and terminal
WO2022057644A1 (zh) 设备交互方法、电子设备及交互系统
CN114594923A (zh) 车载终端的控制方法、装置、设备及存储介质
CN111437600A (zh) 剧情展示方法、装置、设备及存储介质
CN113160031B (zh) 图像处理方法、装置、电子设备及存储介质
WO2021052488A1 (zh) 一种信息处理方法及电子设备
CN113132668A (zh) 显示设备、移动设备、由显示设备执行的视频呼叫方法以及由移动设备执行的视频呼叫方法
WO2024041429A1 (zh) 一种任务接续方法、设备及系统
US20230076068A1 (en) Systems for interpreting a digit-to-digit gesture by a user differently based on roll values of a wrist-wearable device worn by the user, and methods of use thereof
CN113391775A (zh) 一种人机交互方法及设备
WO2022111690A1 (zh) 一种共享输入设备的方法、电子设备及系统
WO2022228004A1 (zh) 多屏协同过程中恢复窗口的方法、电子设备和系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23856519

Country of ref document: EP

Kind code of ref document: A1