WO2021136266A1 - Virtual image synchronization method and wearable device - Google Patents
Virtual image synchronization method and wearable device
- Publication number
- WO2021136266A1 (PCT/CN2020/140836)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wearable device
- display
- virtual screen
- input
- screen
- Prior art date
Classifications
- G—PHYSICS › G06—COMPUTING; CALCULATING OR COUNTING › G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F1/163—Wearable computers, e.g. on a belt
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/1454—Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
Definitions
- the embodiments of the present invention relate to the field of communication technology, and in particular, to a virtual screen synchronization method and a wearable device.
- AR: augmented reality.
- Embodiments of the present invention provide a virtual picture synchronization method and a wearable device, which can solve the problem in the related art of long delays between AR devices caused by the lengthy storage and download process when virtual information is shared between AR devices.
- In a first aspect, an embodiment of the present invention provides a virtual picture synchronization method. The method includes: receiving a first input from a user; and, in response to the first input, synchronizing a virtual picture of a first virtual screen of a first wearable device to a second virtual screen of a second wearable device for display; wherein the second wearable device is determined based on a target area in the shooting preview screen of the camera of the first wearable device, and the target area is the area selected by the first input.
- An embodiment of the present invention also provides a first wearable device. The first wearable device includes a receiving module and a synchronization module. The receiving module is configured to receive a first input from a user. The synchronization module is configured to, in response to the first input received by the receiving module, synchronize a virtual picture of a first virtual screen of the first wearable device to a second virtual screen of a second wearable device for display; wherein the second wearable device is determined based on a target area in the shooting preview screen of the camera of the first wearable device, and the target area is the area selected by the first input.
- An embodiment of the present invention further provides an electronic device, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor. When the computer program is executed by the processor, the steps of the virtual picture synchronization method according to the first aspect are implemented.
- An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the steps of the virtual picture synchronization method according to the first aspect are implemented.
- In the embodiments of the present invention, when the target area of the shooting preview screen of the first wearable device includes the second wearable device, the first wearable device synchronizes the virtual picture of its first virtual screen to the second virtual screen of the second wearable device for display, so that screen sharing of the first wearable device can be realized conveniently and quickly.
- FIG. 1 is a schematic structural diagram of a possible operating system provided by an embodiment of the present invention
- FIG. 2 is a schematic flowchart of a method for synchronizing a virtual picture provided by an embodiment of the present invention
- FIG. 3 is the first schematic diagram of an interface applied by a virtual picture synchronization method provided by an embodiment of the present invention;
- FIG. 4 is a second schematic diagram of an interface applied by a virtual screen synchronization method provided by an embodiment of the present invention.
- FIG. 5 is the third schematic diagram of an interface applied by a virtual screen synchronization method provided by an embodiment of the present invention.
- FIG. 6 is a fourth schematic diagram of an interface applied by a virtual screen synchronization method provided by an embodiment of the present invention.
- FIG. 7 is a fifth schematic diagram of an interface applied by a virtual screen synchronization method provided by an embodiment of the present invention.
- FIG. 8 is a schematic structural diagram of a first wearable device according to an embodiment of the present invention.
- FIG. 9 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention.
- In this document, "A/B" can mean A or B. The term "and/or" describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" can mean: A alone, both A and B, or B alone.
- The words "first", "second", etc. are used to distinguish between identical or similar items having substantially the same function or effect. Those skilled in the art will understand that words such as "first" and "second" do not limit quantity or execution order.
- the first wearable device and the second wearable device are used to distinguish different wearable devices, rather than to describe a specific order of wearable devices.
- The execution subject of the virtual picture synchronization method provided by the embodiments of the present invention may be the first wearable device, or a functional module and/or functional entity in the first wearable device capable of implementing the method; the specifics may be determined according to actual use requirements, which is not limited in the embodiments of the present invention.
- The wearable device in the embodiments of the present invention may be AR glasses, an AR helmet, a smart bracelet, a smart watch, etc. It should be noted that the first wearable device and the second wearable device may be the same type of wearable device (for example, both are AR glasses) or different types (for example, the first wearable device is AR glasses and the second wearable device is a mobile phone), which is not limited in the embodiments of the present invention.
- the virtual screen in the embodiment of the present invention may be any carrier that can be used to display the content projected by the projection device when the AR technology is used to display content.
- the projection device may be a projection device using AR technology, such as an electronic device, a wearable device, or an AR device in the embodiment of the present invention.
- Specifically, the projection device can project a virtual scene acquired by (or internally integrated in) the projection device, or a combination of a virtual scene and a real scene, onto the virtual screen, so that the virtual screen displays the content, thereby showing the user the effect of the real scene superimposed with the virtual scene.
- the virtual screen can usually be any possible carrier such as the display screen of an electronic device (such as a mobile phone), the lens of AR glasses, the windshield of a car, the wall of a room, and so on.
- the following takes the virtual screen as the display screen of the electronic device, the lens of the AR glasses, and the windshield of the car as examples to illustrate the process of displaying content on the virtual screen by using the AR technology.
- the projection device may be the electronic device.
- Specifically, the electronic device can capture the real scene in the area where it is located through its camera and display the real scene on its display screen; the electronic device can then project the acquired (or internally integrated) virtual scene onto the display screen, so that the virtual scene is superimposed on the real scene, and the user sees the superimposed effect of the two through the display screen of the electronic device.
- the projection device may be the AR glasses.
- When the user wears the AR glasses, the user can see the real scene in the area where he is located through the lenses of the AR glasses, and the AR glasses can project the acquired (or internally integrated) virtual scene onto the lenses, so that the user sees the display effect of the real scene superimposed with the virtual scene through the lenses of the AR glasses.
- Taking the virtual screen as the windshield of a car as an example, the projection device may be any electronic device. The user can see the real scene in the area through the windshield, and the projection device can project the acquired (or internally integrated) virtual scene onto the windshield, so that the user sees the display effect of the real scene superimposed with the virtual scene through the windshield of the car.
- Of course, the specific form of the virtual screen is not limited; for example, it may be real space without any carrier. In that case, when the user is in the real space, the user can directly see the real scene, and the projection device can project the acquired (or internally integrated) virtual scene into the real space, so that the user sees the display effect of the real scene superimposed with the virtual scene in the real space.
- the wearable device in the embodiment of the present invention may be a wearable device with an operating system.
- The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
- the following uses an operating system as an example to introduce the software environment to which the virtual screen synchronization method provided by the embodiment of the present invention is applied.
- As shown in FIG. 1, it is a schematic structural diagram of a possible operating system provided by an embodiment of the present invention.
- the architecture of the operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
- the application layer includes various applications in the operating system (including system applications and third-party applications).
- the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
- the system runtime layer includes a library (also called a system library) and an operating system runtime environment.
- the library mainly provides various resources required by the operating system.
- the operating system operating environment is used to provide a software environment for the operating system.
- The kernel layer is the core layer of the operating system and belongs to the lowest level of the operating system software.
- the kernel layer provides core system services and hardware-related drivers for the operating system based on the Linux kernel.
- Developers can develop, based on the system architecture of the operating system shown in FIG. 1, a software program that implements the virtual picture synchronization method provided by the embodiments of the present invention, so that the method can run on the operating system shown in FIG. 1. That is, the processor or the wearable device can implement the virtual picture synchronization method by running the software program in the operating system.
- FIG. 2 is a schematic flowchart of a virtual screen synchronization method provided by an embodiment of the present invention.
- the virtual picture synchronization method provided by the embodiment of the invention includes the following steps 201 and 202:
- Step 201 The first wearable device receives the first input of the user.
- The above-mentioned first input may include: a specific gesture input by the user, a voice input by the user to the first wearable device, an input by the user on a specific button of the first wearable device, or a specific posture of the user.
- the above-mentioned specific gesture may be: a specific gesture input by the user in the shooting area of the camera of the first wearable device.
- Exemplarily, the above-mentioned specific gesture may be any one of a sliding gesture, a palm hovering gesture, a click gesture, a long-press gesture, an area change gesture, a single-finger hovering gesture, and a multi-finger hovering gesture, which is not limited in the embodiments of the present invention.
- Step 202: In response to the first input, the first wearable device synchronizes the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display.
- the second wearable device is determined based on the target area in the shooting preview screen of the camera of the first wearable device, and the target area is the area selected by the first input.
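- To make steps 201 and 202 concrete, the following is a minimal Python sketch of the claimed flow, assuming a camera object that supplies the preview and a detector that reports wearable positions; the names `capture_preview`, `detected_wearables`, and `send_picture` are illustrative assumptions, not part of the disclosed embodiment.

```python
from dataclasses import dataclass


@dataclass
class Region:
    """Target area selected by the first input, in preview-screen coordinates."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


class FirstWearableDevice:
    def __init__(self, camera, transport):
        self.camera = camera        # supplies the shooting preview screen
        self.transport = transport  # hypothetical channel to other devices

    def on_first_input(self, target_region: Region, virtual_picture) -> None:
        """Steps 201-202: in response to the first input, determine the second
        wearable device(s) inside the target area of the preview and sync."""
        preview = self.camera.capture_preview()
        for device in preview.detected_wearables():   # assumed detector API
            px, py = device.position_in_preview
            if target_region.contains(px, py):        # device is in the area
                self.transport.send_picture(device.id, virtual_picture)
```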
- It should be noted that the target area selected by the first input may also be determined by the user through a button on the first wearable device, or by the user controlling the first wearable device through voice input.
- the shooting preview picture of the first wearable device includes: a real picture taken by a camera of the first wearable device and a virtual picture generated based on the real picture.
- the aforementioned virtual picture is: virtual information generated by the aforementioned first wearable device according to a target object photographed by a camera set on the first wearable device.
- For example, the virtual picture may include the length, width, and height of a table, displayed on the first virtual screen in the form of labels.
- Exemplarily, after the first wearable device synchronizes the virtual picture of its first virtual screen to the second virtual screen of the second wearable device for display, when the second virtual screen of the second wearable device displays the target object, the second wearable device will mark the received virtual information on the target object.
- Example 1: In the case that the second wearable device receives the length, width, and height labels of the table synchronized by the first wearable device, when the second virtual screen of the second wearable device includes the table, the second wearable device will display the length, width, and height labels of the table on the second virtual screen.
- Example 2: Taking both the first wearable device and the second wearable device as AR glasses as an example, a user uses AR glasses 1 to play a game (for example, making a cake on a table). After the game is over, the user sends the virtual picture on AR glasses 1 (for example, the finished cake) to the AR glasses 2 of another user. When the above table appears in the shooting preview screen of AR glasses 2, the virtual picture synchronized by AR glasses 1 is displayed in the shooting preview screen of AR glasses 2 (for example, the cake made by the user of AR glasses 1 is displayed on the table).
- In another possible case, the second wearable device may directly display the virtual picture on the second virtual screen.
- Example 3: Taking both the first wearable device and the second wearable device as AR glasses as an example, a user uses AR glasses 1 to play a game (for example, building a house with virtual building blocks). After the game is over, the user sends the aforementioned virtual picture on AR glasses 1 (for example, the house built by the user) to the AR glasses 2 of another user, and AR glasses 2 can directly display the house built by the user of AR glasses 1.
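- A hedged sketch of the kind of payload Examples 1-3 imply: virtual information (e.g. a table's dimensions, or a finished virtual model) carried with a reference to the real object it should attach to, or no anchor at all when the content is drawn directly. The field names are assumptions for illustration.

```python
from dataclasses import dataclass, field


@dataclass
class VirtualAnnotation:
    """One piece of virtual information to re-display on the receiving device."""
    anchor_object: str | None        # recognized real object, e.g. "table";
                                     # None for free-standing content (Example 3)
    labels: dict = field(default_factory=dict)  # e.g. {"length_cm": 120}
    model_id: str | None = None      # optional 3D content, e.g. "cake_v1"


def apply_annotations(preview_objects: set[str], annotations):
    """On the second device: keep annotations whose anchor object appears in
    the preview (Examples 1 and 2), plus unanchored content drawn directly."""
    to_render = []
    for ann in annotations:
        if ann.anchor_object is None or ann.anchor_object in preview_objects:
            to_render.append(ann)
    return to_render
```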
- the first wearable device determines the target area in the display area of its first virtual screen, and determines the wearable device within the range of the target area as the second wearable device.
- After the first wearable device determines the second wearable device within the range of the target area, in order to synchronize the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display, the first wearable device needs to establish a connection with the second wearable device.
- the first wearable device may establish a communication connection between the first wearable device and the second wearable device before synchronizing the virtual screen.
- the virtual screen synchronization method provided in the embodiment of the present invention may further include the following steps 301 and 302:
- Step 301: The first wearable device sends an infrared signal to the second wearable device through an infrared emitting device. The infrared signal includes the device feature code of the first wearable device, so that the second wearable device can establish a connection with the first wearable device through the device feature code.
- the first wearable device may include: an infrared transmitting device; the second wearable device may further include: an infrared receiving device.
- Step 302 After receiving the infrared signal sent by the first wearable device, the infrared receiving device of the second wearable device establishes a connection with the first wearable device through the device feature code.
- the first wearable device synchronizes the virtual screen of the first virtual screen to the second virtual screen of the second wearable device for display.
- It should be noted that the infrared signal sent by the first wearable device to the second wearable device is transmitted directionally.
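- A minimal sketch of the handshake in steps 301 and 302, assuming an IR driver object with a `transmit` call and a hypothetical `resolve_device` lookup that maps a feature code to a network address; none of these names come from the patent, and the feature-code format is invented for illustration.

```python
import socket

DEVICE_FEATURE_CODE = "WD1-0001"  # hypothetical feature-code format


def resolve_device(feature_code: str) -> tuple[str, int]:
    """Hypothetical lookup: map a received feature code to the emitter's
    network address (e.g. via a local directory or prior registration)."""
    return ("192.168.1.10", 9000)  # placeholder for a real discovery step


def step_301_emit(ir_emitter) -> None:
    """First device: directionally emit its device feature code over IR."""
    ir_emitter.transmit(DEVICE_FEATURE_CODE.encode("ascii"))


def step_302_on_ir(payload: bytes) -> socket.socket:
    """Second device: on receiving the IR signal, use the feature code to
    establish a connection back to the first device, e.g. over Wi-Fi."""
    feature_code = payload.decode("ascii")
    host, port = resolve_device(feature_code)
    conn = socket.create_connection((host, port))
    conn.sendall(feature_code.encode("ascii"))  # identify which device paired
    return conn
```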
- When the user wants to enable the virtual picture synchronization function provided by the embodiment of the present invention, as shown in FIG. 3(A), the shooting preview screen of the first wearable device includes the virtual control 30 and other users wearing wearable devices. The user can start the virtual picture synchronization method by overlapping a finger of one hand with the aforementioned virtual control 30 for 2 seconds.
- It should be noted that the first wearable device provided in the embodiment of the present invention is provided with an image acquisition device (for example, a camera). The image acquisition device can capture a real-time picture image in front of the user wearing the first wearable device, so that the first wearable device can recognize, from the collected real-time picture images, the user's specific gesture input in the shooting area of the camera.
- the virtual screen synchronization method provided by the embodiment of the present invention can be applied in a variety of scenarios, and it is convenient for the user to select the second wearable device with which the virtual screen needs to be synchronized by using different selection methods.
- For example, when the user needs to synchronize the virtual picture with multiple second wearable devices at the same time, the user can create a selection box to select multiple second wearable devices and perform virtual picture synchronization with them to achieve the above purpose.
- For another example, in order to select the second wearable device quickly and reduce the data processing load of the first wearable device, the first wearable device may create a selection box to select the second wearable device.
- step 202 may further include the following steps 202a and 202b:
- Step 202a In response to the first input, the first wearable device creates a selection box on the shooting preview screen.
- Step 202b: The first wearable device determines the wearable device selected by the selection box as the second wearable device, and synchronizes the virtual picture to the second virtual screen of the second wearable device for display.
- the shape of the foregoing selection frame may be any possible shape such as a circle, a rectangle, a triangle, a diamond, a ring, or a polygon, which may be specifically determined according to actual usage requirements, and is not limited in the embodiment of the present invention.
- Taking both the first wearable device and the second wearable device as AR glasses as an example, as shown in Figure 3, the first virtual screen of the first wearable device displays a virtual control for the virtual picture synchronization function (30 in Figure 3). The user moves a finger to the position of the virtual control 30 and keeps it there for 2 seconds to turn on the virtual picture synchronization function. After the function is turned on, as shown in Figure 3, the first wearable device creates a rectangular selection box (31 in Figure 3) on the display screen, where the selection box 31 is used to select wearable devices in the shooting preview screen of the first wearable device.
- the first wearable device only needs to identify the wearable device in the selection box, which reduces the data processing load of the first wearable device and at the same time reduces the energy consumption of the first wearable device.
- It should be noted that the first wearable device can expand or reduce the range of the selection box according to the user's specific input, to suit actual needs.
- the foregoing step 202a may include the following steps 202a1 and 202a2:
- Step 202a1: In the case where the first sub-input of the user's first hand and second hand is detected in the shooting preview screen of the first wearable device, the first wearable device displays a selection box on the shooting preview screen based on the diagonal of the target rectangle.
- Step 202a2 upon receiving the second sub-input of the user, the first wearable device updates the display of the selection box.
- the first sub-input is used to trigger the first wearable device to create a selection box
- the second sub-input is used to trigger the first wearable device to adjust the size of the selection box.
- the diagonal line of the target rectangle is the line between the first part of the first hand and the second part of the second hand.
- the first sub-input of the first hand and the second hand may be a gesture input of the user extending the fingers of both hands.
- When the first wearable device recognizes the gesture input, it displays, on its shooting preview screen, a selection box for selecting the second wearable device.
- The above-mentioned second sub-input may be a gesture input in which the user, with one finger of each hand extended, spreads the fingers apart along the diagonal (that is, the diagonal of the above-mentioned selection box) or draws them together toward the middle.
- When the first wearable device recognizes the gesture of one finger extended on each hand in the shooting preview screen, it starts to create a frame selection area, creates a rectangular selection box with the line between the fingertips as its diagonal, and adjusts the size of the selection box as the fingers of both hands move.
- For example, after the first wearable device has turned on the picture synchronization function and recognizes the user's gesture with one finger of each hand extended, it creates a rectangular selection box of a preset size at the target position of the display screen. The user can move the fingers of both hands to the two end points of the rectangular selection box 41, and adjust the size and position of the selection box 41 by moving the fingers. For example, the rectangular selection box 41 of a preset size created by the first wearable device can be adjusted in this way into the selection box 42 shown in FIG. 4(B).
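- The geometry of steps 202a1-202a2 reduces to keeping a rectangle whose diagonal is the segment between the two tracked fingertips. A small sketch follows; the coordinates and the fingertip positions are assumed to come from an upstream hand-tracking stage.

```python
def selection_box_from_fingertips(p1, p2):
    """Return (x, y, w, h) of the rectangle whose diagonal is p1 -> p2.

    p1, p2: (x, y) fingertip positions of the first and second hand,
    in preview-screen coordinates.
    """
    x = min(p1[0], p2[0])
    y = min(p1[1], p2[1])
    w = abs(p1[0] - p2[0])
    h = abs(p1[1] - p2[1])
    return (x, y, w, h)


# Second sub-input: as the fingers move apart or together, simply recompute.
box = selection_box_from_fingertips((120, 80), (480, 360))  # -> (120, 80, 360, 280)
box = selection_box_from_fingertips((100, 60), (520, 400))  # enlarged box
```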
- In this way, the user can display a selection box on the shooting preview screen of the first wearable device through gesture input, and can adjust the size of the selection box through gestures, so that the wearable device whose virtual picture is to be synchronized is included in the selection box.
- After the adjustment is completed, the size of the selection box needs to be locked, so that the selection box is not changed unintentionally when the user moves his fingers out of the shooting preview screen.
- the virtual screen synchronization method provided in the embodiment of the present invention may further include the following step 202a3:
- Step 202a3 In the case of receiving the user's third sub-input, the first wearable device locks the display of the selection box.
- the above-mentioned third sub-input may be a gesture input of the user spreading the palms of both hands.
- the first wearable device detects the gesture input of spreading the palms of the user's hands in the shooting preview screen, the size of the selection box is locked.
- the above-mentioned locking the size of the selection box includes: fixing the size and position of the selection box.
- At this point, the creation of the selection box is completed, and the wearable devices contained in the selection box are determined as the second wearable devices.
- Taking both the first wearable device and the second wearable device as AR glasses as an example, as shown in FIG. 5, when the AR glasses recognize that the user's gesture in the display screen becomes a gesture of spreading the palms of both hands, the creation of the selection box 31 is completed; the AR glasses then recognize whether users wearing AR glasses are included in the selection box, search for the AR glasses within the selection box, and synchronize the displayed picture to them.
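- Putting the three sub-inputs together, the selection-box lifecycle can be sketched as a small state machine, reusing the `selection_box_from_fingertips` helper from the previous sketch; the gesture labels are assumed to come from an upstream recognizer and are illustrative only.

```python
class SelectionBoxController:
    """States: IDLE -> ADJUSTING (first sub-input) -> LOCKED (third sub-input)."""

    def __init__(self):
        self.state = "IDLE"
        self.box = None  # (x, y, w, h)

    def on_gesture(self, gesture, fingertips=None):
        if self.state == "IDLE" and gesture == "two_index_fingers":
            # First sub-input: create the box from the fingertip diagonal.
            self.box = selection_box_from_fingertips(*fingertips)
            self.state = "ADJUSTING"
        elif self.state == "ADJUSTING" and gesture == "two_index_fingers":
            # Second sub-input: fingers moved; update size and position.
            self.box = selection_box_from_fingertips(*fingertips)
        elif self.state == "ADJUSTING" and gesture == "palms_spread":
            # Third sub-input: lock the size and position of the box.
            self.state = "LOCKED"
        return self.box
```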
- In a possible implementation, after the first wearable device determines the second wearable device from the selection box, it needs to mark the location of the second wearable device in the selection box.
- the virtual screen synchronization method provided in the embodiment of the present invention further includes the following step 202b1:
- Step 202b1: The first wearable device displays the first identifier in the first area on the shooting preview screen.
- the first identifier is used to mark the second wearable device, and the first area is the area where the second wearable device is located.
- the above-mentioned first area is the area where the second wearable device is located in the selection frame in the shooting preview screen of the first wearable device.
- To remind the user of the location of the second wearable device in the shooting preview screen, the first wearable device displays the first identifier within the area where the second wearable device is located.
- For example, the first identifier may be a red dot, and the red dot is displayed in the first area to mark the second wearable device in the shooting preview screen of the first wearable device.
- For example, after the creation of the selection box 31 is completed, the first wearable device recognizes the users wearing AR glasses in the selection box 31, searches for the AR glasses within the selection box 31, and displays a dot identifier (that is, the above-mentioned first identifier, 32 in FIG. 5) at the positions of the two AR glasses in the selection box 31, thereby marking the positions of the AR glasses.
- the user can clearly see the position of the second wearable device in the selection frame of the shooting preview screen of the first wearable device, which is convenient for the user to perform secondary screening.
- When the selection range is large, it is likely to include wearable devices with which the user does not want to synchronize the virtual picture. In this case, the user can filter the second wearable devices in the selection box.
- the virtual picture synchronization method provided in the embodiment of the present invention may include the following step 202b2:
- Step 202b2: When the second wearable device is blocked by a target object, the first wearable device cancels the display of the first identifier.
- the aforementioned target object may be the palm of the user, or other opaque objects.
- Using the target object to block the second wearable device prevents the blocked wearable device from appearing in the selection box of the shooting preview screen of the first wearable device, so the first wearable device does not determine it as a second wearable device.
- the above-mentioned target object is also used to block the infrared signal sent by the first wearable device to the second wearable device.
- Taking both the first wearable device and the second wearable device as AR glasses as an example, when the user completes the creation of the selection box and the selection box contains AR glasses with which the user does not want to synchronize the virtual picture, then, as shown in FIG. 6, the user covers those AR glasses with an open-palm gesture of one hand for 2 seconds. The object hidden by the open-palm gesture is deselected, and the first identifier at the position of those AR glasses disappears, so that the displayed picture of the first wearable device will not be synchronized to the display area of the second wearable device hidden by the gesture.
- In this way, the second wearable device can be blocked to prevent virtual picture synchronization with it.
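- The occlusion-based filtering described above amounts to dropping any candidate whose marked region stays covered by the detected palm region for a dwell time (2 seconds in the example). A hedged sketch, with the overlap test and timing deliberately simplified:

```python
import time


def rects_overlap(a, b):
    """Axis-aligned overlap test for (x, y, w, h) rectangles."""
    return not (a[0] + a[2] < b[0] or b[0] + b[2] < a[0] or
                a[1] + a[3] < b[1] or b[1] + b[3] < a[1])


def filter_occluded(candidates, palm_region, covered_since, dwell_s=2.0):
    """Drop candidates covered by the palm for at least `dwell_s` seconds;
    their first identifier disappears and they are not synchronized.

    candidates: dict of device_id -> (x, y, w, h) in the preview.
    covered_since: dict of device_id -> timestamp when coverage began.
    """
    now = time.monotonic()
    kept = {}
    for device_id, region in candidates.items():
        if palm_region and rects_overlap(region, palm_region):
            start = covered_since.setdefault(device_id, now)
            if now - start >= dwell_s:
                continue  # deselected
        else:
            covered_since.pop(device_id, None)  # coverage interrupted
        kept[device_id] = region
    return kept
```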
- In a possible implementation, the user can turn his head to drive the first wearable device to rotate, so as to align the second identifier displayed on the first virtual screen of the first wearable device with the wearable device to which the virtual picture of the first wearable device is to be synchronized.
- the virtual screen synchronization method provided in the embodiment of the present invention may include the following step 202c1:
- Step 202c1: The first wearable device displays the second identifier at the target position of the target area.
- the foregoing step 202 may further include the following step 202c2:
- Step 202c2: The first wearable device determines the wearable device in the second area containing the second identifier as the second wearable device, and synchronizes the virtual picture to the second virtual screen of the second wearable device for display.
- the second identifier is displayed at the target location of the target area.
- the target area may be the entire display area of the shooting preview screen of the first wearable device, and the target position may be the center position of the aforementioned target area.
- the above-mentioned second mark may be a cross auxiliary line, or may be an image, or may be another mark used to mark the second wearable device, which is not limited in the embodiment of the present invention.
- the first wearable device determines the wearable device within the second area range as the second wearable device.
- the foregoing second area is an area where the second wearable device is located.
- In this way, the wearable device can be determined as the second wearable device, and the virtual picture can be synchronized with it.
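- Step 202c2 is essentially a hit test: whichever detected device's area contains the second identifier becomes the synchronization target. A minimal sketch, where the device areas are assumed detector output:

```python
def device_under_identifier(identifier_pos, device_areas):
    """Return the id of the device whose area contains the second identifier.

    identifier_pos: (x, y) of the second identifier (e.g. crosshair center).
    device_areas: dict of device_id -> (x, y, w, h) in preview coordinates.
    """
    ix, iy = identifier_pos
    for device_id, (x, y, w, h) in device_areas.items():
        if x <= ix <= x + w and y <= iy <= y + h:
            return device_id
    return None  # nothing aligned yet; the user keeps turning the head
```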
- the user can control the rotation of the first wearable device by rotating the head, and align the second mark displayed on the shooting preview screen with the wearable device with which the user wants to synchronize the virtual screen.
- the virtual screen synchronization method provided in the embodiment of the present invention may include the following steps 202c3 and 202c4:
- Step 202c3 The first wearable device obtains the rotation direction and the rotation angle of the first wearable device
- Step 202c4 The first wearable device updates the display position of the second identifier based on the rotation direction and the rotation angle.
- Exemplarily, when the first wearable device is a head-mounted device (e.g., AR glasses or an AR helmet), the user can control the rotation of the first wearable device by rotating the head or body. When the shooting preview screen of the first wearable device contains the wearable device with which the user wants to synchronize the virtual picture but the second identifier is not aligned with the area of that wearable device, the user of the first wearable device can align the second identifier with the wearable device by turning the head.
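- For steps 202c3-202c4, one plausible reading is a pinhole-style mapping from the device's yaw and pitch (as reported by its motion sensors) to an on-screen offset of the second identifier; the focal length and sign conventions below are illustrative assumptions, not values from the patent.

```python
import math


def update_identifier_position(center, yaw_deg, pitch_deg, focal_px=800.0):
    """Map the device's rotation to a new on-screen identifier position.

    center: (x, y) reference position of the second identifier.
    yaw_deg / pitch_deg: rotation angles obtained in step 202c3.
    focal_px: assumed focal length of the preview camera, in pixels.
    """
    dx = focal_px * math.tan(math.radians(yaw_deg))
    dy = focal_px * math.tan(math.radians(pitch_deg))
    return (center[0] + dx, center[1] + dy)


# e.g. turning the head 5 degrees to the right shifts the identifier ~70 px
new_pos = update_identifier_position((640, 360), yaw_deg=5.0, pitch_deg=0.0)
```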
- For example, the first virtual screen of the first wearable device displays a virtual control for the virtual picture synchronization function (60 in FIG. 7). When the user wants to turn on the virtual picture synchronization function, the user moves a finger to the position of the virtual control 60 and keeps it there for 2 seconds. After the function is turned on, as shown in FIG. 7, the first wearable device creates a cross auxiliary line 61 on the shooting preview screen, and the user can turn the head or move the body to move the cross auxiliary line 61 onto the wearable device with which the user wants to synchronize the virtual picture.
- The virtual picture of the shooting preview screen of the above-mentioned first wearable device is then synchronized to the second virtual screen of the wearable device whose position overlaps the second identifier for display, and an indication that the transmission is complete is displayed on the shooting preview screen of the first wearable device.
- the user can continue to turn his head and continue to select the next wearable device that needs to be synchronized with the virtual screen.
- In this way, the user can determine the second wearable device by turning the head or moving the body without using the hands, avoiding the situation where a user holding objects in both hands has to put them down in order to manipulate the first wearable device through gestures.
- Optionally, the user can also use voice input, press a physical button on the first wearable device, or use other gesture input methods to synchronize the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display.
- For example, the user moves a single-finger hovering gesture to the location of the virtual control, and after hovering for a preset time, the virtual picture of the first virtual screen of the first wearable device is synchronized to the second virtual screen of the second wearable device for display.
- Optionally, the aforementioned first wearable device may use the 4th generation mobile communication technology (4G), the 5th generation mobile communication technology (5G), or wireless fidelity (Wi-Fi) to transmit the synchronized virtual picture.
- According to the virtual picture synchronization method provided by the embodiment of the present invention, by acquiring the shooting preview screen of the first wearable device, if the second wearable device is included in the target area of the shooting preview screen, the virtual picture of the first virtual screen of the first wearable device is synchronized to the second virtual screen of the second wearable device for display. This realizes screen sharing of the first wearable device conveniently and quickly, and to a certain extent avoids the long delays between AR devices in the traditional technology caused by the lengthy storage and download process when virtual information is shared between AR devices.
- It should be noted that the virtual picture synchronization methods shown in the above method figures are illustrated by way of example in conjunction with one figure of the embodiments of the present invention. In specific implementation, the methods can also be implemented in combination with any other figures that can be combined as illustrated in the above embodiments, and details are not repeated here.
- FIG. 8 is a schematic diagram of a possible structure for implementing a first wearable device provided by an embodiment of the present invention.
- the first wearable device 500 includes: a receiving module 501 and a synchronization module 502, wherein:
- the receiving module 501 is configured to receive the first input of the user.
- The synchronization module 502 is configured to, in response to the first input received by the receiving module 501, synchronize the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display.
- the second wearable device is determined based on the target area in the shooting preview picture of the camera of the first wearable device, and the target area is the area selected by the first input.
- the first wearable device 500 further includes: a creation module 503 and a determination module 504.
- the creation module 503 is used to respond to the first input and create a selection box on the shooting preview screen.
- the determining module 504 is configured to determine the wearable device selected by the selection box created by the creating module 503 as the second wearable device.
- The synchronization module 502 is further configured to synchronize the virtual picture to the second virtual screen of the second wearable device determined by the determining module 504 for display.
- the first wearable device only needs to identify the wearable device in the selection box, which reduces the data processing load of the first wearable device and at the same time reduces the energy consumption of the first wearable device.
- the first wearable device 500 further includes: a display module 505.
- the first input includes: a first sub-input of the user's first hand and second hand; the first sub-input is used to trigger the first wearable device to create a selection box.
- The display module 505 is configured to display a selection box on the shooting preview screen, based on the diagonal of the target rectangle, when the first sub-input is detected in the shooting preview screen.
- the display module 505 is also configured to update the display of the selection box when the receiving module 501 receives the second sub-input of the user.
- the diagonal of the target rectangle is the line between the first part of the first hand and the second part of the second hand.
- In this way, the user can display a selection box on the shooting preview screen of the first wearable device through gesture input, and can adjust the size of the selection box through gestures, so that the wearable device whose virtual picture is to be synchronized is included in the selection box.
- the display module 505 is further configured to lock the display of the selection box when the receiving module 501 receives the third sub-input of the user.
- the display module 505 is further configured to display a first identifier in the first area on the shooting preview screen, the first identifier is used to mark the second wearable device, and the first area is the area where the second wearable device is located.
- the user can clearly see the position of the second wearable device in the selection frame of the shooting preview screen of the first wearable device, which is convenient for the user to perform secondary screening.
- Optionally, the display module 505 is further configured to cancel the display of the first identifier when the second wearable device is blocked by the target object.
- In this way, the second wearable device can be blocked to prevent virtual picture synchronization with it.
- the display module 505 is further configured to display the second identifier at the target position of the target area.
- the determining module 504 is further configured to determine the wearable device in the second area including the second identifier as the second wearable device.
- the display module 505 is also configured to synchronize the virtual screen to the second virtual screen of the second wearable device determined by the determining module 504 for display.
- In this way, the wearable device can be determined as the second wearable device, and the virtual picture can be synchronized with it.
- the first wearable device further includes: an obtaining module 506.
- the obtaining module 506 is used to obtain the rotation direction and the rotation angle of the first wearable device.
- the display module 505 is further configured to update the display position of the second indicator based on the rotation direction and the rotation angle acquired by the acquisition module 506.
- In this way, the user can determine the second wearable device by turning the head or moving the body without using the hands, avoiding the situation where a user holding objects in both hands has to put them down in order to manipulate the first wearable device through gestures.
- According to the wearable device provided by the embodiment of the present invention, by acquiring the shooting preview screen of the first wearable device, when the second wearable device is included in the target area of the shooting preview screen, the virtual picture of the first virtual screen of the first wearable device is synchronized to the second virtual screen of the second wearable device for display. This realizes screen sharing of the first wearable device conveniently and quickly, and to a certain extent avoids the problem in the traditional technology that the storage and download process takes a long time when virtual information is shared between AR devices, causing large delays between AR devices.
- the electronic device provided in the embodiment of the present invention can implement each process implemented by the wearable device in the foregoing method embodiment, and to avoid repetition, details are not described herein again.
- the electronic device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, and a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, a camera assembly 112 and other components.
- Those skilled in the art can understand that the electronic device 100 may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
- the aforementioned camera component 112 includes a camera.
- the electronic device 100 includes, but is not limited to, a mobile phone
- the processor 110 can recognize the user's first sub-input and send a first instruction to the display unit 106 according to the first sub-input.
- The display unit 106, in response to the first instruction sent by the processor 110, displays a selection box on the shooting preview screen of the first wearable device based on the diagonal of the target rectangle, where the diagonal of the target rectangle is the line between the first part of the first hand and the second part of the second hand.
- the processor 110 may also recognize the user's second sub-input, and send a second instruction to the display unit 106 according to the second sub-input.
- The display unit 106, in response to the second instruction sent by the processor 110, updates the display of the selection box on the shooting preview screen of the first wearable device based on the diagonal of the target rectangle.
- The electronic device provided by the embodiment of the present invention obtains the shooting preview screen of the first wearable device and, in the case that the second wearable device is included in the target area of the shooting preview screen, synchronizes the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display. This realizes screen sharing of the first wearable device conveniently and quickly, and to a certain extent avoids the long delays between AR devices in the traditional technology caused by the lengthy storage and download process of virtual information.
- It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 can be used to receive and send signals in the process of sending and receiving information or during a call. Specifically, downlink data from a base station is received and passed to the processor 110 for processing, and uplink data is sent to the base station.
- the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
- the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
- the electronic device 100 provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
- the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (for example, call signal reception sound, message reception sound, etc.).
- the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
- the electronic device 100 obtains a real-time picture taken by the camera in the camera component 112 (for example, a photographed preview picture of the first wearable device), and displays it on the display unit 106.
- the input unit 104 is used to receive audio or video signals.
- the input unit 104 may include a graphics processing unit (GPU) 1041, a microphone 1042, and an image capture device 1043.
- The graphics processor 1041 is configured to process image data of still pictures or video obtained by the image capture device in the video capture mode or the image capture mode.
- the processed image frame can be displayed on the display unit 106.
- the image frames processed by the graphics processor 1041 can be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
- the microphone 1042 can receive sound, and can process such sound into audio data.
- the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output in the case of a telephone call mode.
- the electronic device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
- the light sensor includes an ambient light sensor and a proximity sensor.
- the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
- The proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear.
- The accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the electronic device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tapping detection). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
- the display unit 106 is used to display information input by the user or information provided to the user.
- the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
- the user input unit 107 may be used to receive input numeric or character information, and generate key signal input related to user settings and function control of the electronic device 100.
- the user input unit 107 includes a touch panel 1071 and other input devices 1072.
- The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
- the touch panel 1071 may include two parts: a touch detection device and a touch controller.
- The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110.
- the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
- the user input unit 107 may also include other input devices 1072.
- other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), trackball, mouse, and joystick, which will not be repeated here.
- the touch panel 1071 can be overlaid on the display panel 1061.
- When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
- Although in FIG. 9 the touch panel 1071 and the display panel 1061 are shown as two independent components that implement the input and output functions of the electronic device 100, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the electronic device 100, which is not specifically limited here.
- the interface unit 108 is an interface for connecting an external device with the electronic device 100.
- the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
- The interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic device 100, or can be used to transfer data between the electronic device 100 and the external device.
- the memory 109 can be used to store software programs and various data.
- the memory 109 may mainly include a program storage area and a data storage area.
- The program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), etc.; the data storage area may store data created according to the use of the mobile phone (such as audio data or a phone book), etc.
- In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
- The processor 110 is the control center of the electronic device 100. It uses various interfaces and lines to connect the various parts of the entire electronic device 100, and, by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, executes the various functions of the electronic device 100 and processes data, so as to monitor the electronic device 100 as a whole.
- The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 110.
- The electronic device 100 may also include a power supply 111 (such as a battery) for supplying power to the various components. Optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to realize functions such as managing charging, discharging, and power consumption through the power management system.
- the electronic device 100 includes some functional modules not shown, which will not be repeated here.
- Optionally, an embodiment of the present invention further provides an AR device, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor. When the computer program is executed by the processor, each process of the foregoing virtual picture synchronization method embodiment is implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
- the electronic device in the foregoing embodiment may be an AR device.
- the AR device may include all or part of the functional modules in the foregoing electronic device.
- the AR device may also include functional modules not included in the above electronic device.
- when the electronic device in the foregoing embodiment is an AR device, the electronic device may be an electronic device integrated with AR technology.
- the above-mentioned AR technology refers to a technology that realizes the combination of a real scene and a virtual scene.
- the use of AR technology can augment human visual functions, so that a person can experience the combination of real scenes and virtual scenes through AR technology and thereby obtain a more immersive experience.
- for example, when the user wears AR glasses, the scene that the user sees is generated through AR processing; that is, a virtual scene can be superimposed on the real scene and displayed through AR technology.
- when the user manipulates the content displayed by the AR glasses, the user can see the AR glasses "peel off" the real scene, thereby presenting a view closer to reality.
- for example, when observing a carton with the naked eye, a user can only see the carton's shell; but when wearing AR glasses, the user can directly observe the internal structure of the carton through the glasses.
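- purely as an illustrative sketch (not part of the original disclosure), the superimposition described above can be expressed in a few lines of Python; the Detection type and the idea of a per-object virtual layer are assumptions for illustration, and only the Pillow compositing calls are real library API:

```python
# A minimal sketch, assuming a hypothetical object detector: superimpose a
# virtual layer (e.g. a rendering of a carton's internal structure) on the
# region of the camera frame where the real object was detected.
from dataclasses import dataclass
from typing import Dict, List
from PIL import Image

@dataclass
class Detection:
    label: str   # e.g. "carton" (hypothetical detector output)
    x: int       # top-left corner of the detected region, in pixels
    y: int
    w: int       # region size, in pixels
    h: int

def compose_ar_frame(camera_frame: Image.Image,
                     detections: List[Detection],
                     virtual_layers: Dict[str, Image.Image]) -> Image.Image:
    """Overlay virtual content on the detected real-world objects."""
    frame = camera_frame.copy()
    for det in detections:
        layer = virtual_layers.get(det.label)
        if layer is None:
            continue
        # Scale the virtual layer to the detected region and composite it
        # over the real scene: the superimposition described above.
        frame.paste(layer.resize((det.w, det.h)), (det.x, det.y))
    return frame
```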
- the above-mentioned AR device may include a camera, so that the AR device can display and interact with a virtual screen based on the image captured by the camera.
- the AR device can synchronize the virtual picture generated while the user uses the AR device for entertainment to the display screens of other AR devices, so that the virtual picture can be shared between AR devices.
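- as a hedged sketch of how such sharing between devices might be wired up (the length-prefixed JPEG stream below is an assumption for illustration, not a transport defined by this application):

```python
# Hypothetical transport for mirroring one device's virtual picture to a
# peer over a TCP socket: each frame is sent as a 4-byte big-endian length
# followed by its JPEG bytes.
import io
import socket
import struct
from PIL import Image

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes or raise if the peer disconnects."""
    data = bytearray()
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("peer closed the connection")
        data.extend(chunk)
    return bytes(data)

def send_virtual_frame(sock: socket.socket, frame: Image.Image) -> None:
    """First device: encode the current virtual picture and push it to the peer."""
    buf = io.BytesIO()
    frame.save(buf, format="JPEG")
    payload = buf.getvalue()
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_virtual_frame(sock: socket.socket) -> Image.Image:
    """Second device: receive one frame and decode it for display."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return Image.open(io.BytesIO(_recv_exact(sock, length)))
```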
- an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the foregoing virtual picture synchronization method embodiment is implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
- the computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
- the technical solution of this application, in essence or in the part that contributes to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that cause an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the various embodiments of the present application.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (18)
- A virtual picture synchronization method, applied to a first wearable device, the method comprising: receiving a first input of a user; and in response to the first input, synchronizing a virtual picture of a first virtual screen of the first wearable device to a second virtual screen of a second wearable device for display; wherein the second wearable device is determined based on a target area in a shooting preview image of a camera of the first wearable device, and the target area is an area selected by the first input.
- The method according to claim 1, wherein the synchronizing, in response to the first input, the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display comprises: in response to the first input, creating a selection box on the shooting preview image; and determining the wearable device framed by the selection box as the second wearable device, and synchronizing the virtual picture to the second virtual screen of the second wearable device for display.
- The method according to claim 2, wherein the first input comprises a first sub-input of a first hand and a second hand of the user, the first sub-input being used to trigger the first wearable device to create the selection box; and the creating, in response to the first input, a selection box on the shooting preview image comprises: in a case that the shooting preview image contains the first sub-input, displaying a selection box on the shooting preview image based on a target rectangle diagonal; and in a case that a second sub-input of the user is received, updating display of the selection box; wherein the target rectangle diagonal is a connecting line between a first part of the first hand and a second part of the second hand.
- The method according to claim 3, wherein after the updating display of the selection box, the method further comprises: in a case that a third sub-input of the user is received, locking the display of the selection box.
- The method according to claim 2, wherein after the determining the wearable device framed by the selection box as the second wearable device, the method further comprises: displaying a first identifier in a first area on the shooting preview image, wherein the first identifier is used to mark the second wearable device, and the first area is an area where the second wearable device is located.
- The method according to claim 5, wherein after the displaying the first identifier on the shooting preview image, the method further comprises: in a case that the second wearable device is blocked by a target object, canceling display of the first identifier.
- The method according to claim 1 or 2, wherein before the synchronizing the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display, the method further comprises: displaying a second identifier at a target position of the target area; and the synchronizing the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display comprises: determining a wearable device in a second area including the second identifier as the second wearable device, and synchronizing the virtual picture to the second virtual screen of the second wearable device for display.
- The method according to claim 7, wherein after the displaying the second identifier at the target position of the target area, the method further comprises: acquiring a rotation direction and a rotation angle of the first wearable device; and updating a display position of the second identifier based on the rotation direction and the rotation angle.
- A first wearable device, comprising a receiving module and a synchronization module, wherein the receiving module is configured to receive a first input of a user; and the synchronization module is configured to, in response to the first input received by the receiving module, synchronize a virtual picture of a first virtual screen of the first wearable device to a second virtual screen of a second wearable device for display; wherein the second wearable device is determined based on a target area in a shooting preview image of a camera of the first wearable device, and the target area is an area selected by the first input.
- The first wearable device according to claim 9, further comprising a creation module and a determination module, wherein the creation module is configured to create a selection box on the shooting preview image in response to the first input; the determination module is configured to determine the wearable device framed by the selection box created by the creation module as the second wearable device; and the synchronization module is specifically configured to synchronize the virtual picture to the second virtual screen of the second wearable device determined by the determination module for display.
- The first wearable device according to claim 10, further comprising a display module, wherein the first input comprises a first sub-input of a first hand and a second hand of the user, the first sub-input being used to trigger the first wearable device to create the selection box; the display module is configured to, in a case that the shooting preview image contains the first sub-input, display a selection box on the shooting preview image based on a target rectangle diagonal; the display module is further configured to update display of the selection box in a case that the receiving module receives a second sub-input of the user; and the target rectangle diagonal is a connecting line between a first part of the first hand and a second part of the second hand.
- The first wearable device according to claim 11, wherein the display module is further configured to lock the display of the selection box in a case that the receiving module receives a third sub-input of the user.
- The first wearable device according to claim 10, further comprising a display module, wherein the display module is configured to display a first identifier in a first area on the shooting preview image, the first identifier is used to mark the second wearable device, and the first area is an area where the second wearable device is located.
- The first wearable device according to claim 13, wherein the display module is further configured to cancel display of the first identifier in a case that the second wearable device is blocked by a target object.
- The first wearable device according to claim 14, wherein the display module is further configured to display a second identifier at a target position of the target area; the determination module is further configured to determine a wearable device in a second area including the second identifier as the second wearable device; and the display module is further configured to synchronize the virtual picture to the second virtual screen of the second wearable device determined by the determination module for display.
- The first wearable device according to claim 15, further comprising an acquisition module, wherein the acquisition module is configured to acquire a rotation direction and a rotation angle of the first wearable device; and the display module is further configured to update a display position of the second identifier based on the rotation direction and the rotation angle acquired by the acquisition module.
- An electronic device, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the virtual picture synchronization method according to any one of claims 1 to 8.
- A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the virtual picture synchronization method according to any one of claims 1 to 8.
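- for illustration only, the geometry recited in claims 2 and 3 — a selection box whose diagonal is the line between a tracked part of each hand, used to pick out the second wearable device in the preview image — can be sketched as follows; all names, types, and coordinates are hypothetical, not an implementation defined by this application:

```python
# Hypothetical sketch of the claimed selection-box construction: the box is
# the axis-aligned rectangle whose diagonal joins one tracked point per hand,
# and devices whose preview-image positions fall inside it are "framed".
from dataclasses import dataclass
from typing import Dict, List

@dataclass(frozen=True)
class Point:
    x: float
    y: float

@dataclass(frozen=True)
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, p: Point) -> bool:
        return self.left <= p.x <= self.right and self.top <= p.y <= self.bottom

def selection_box(first_hand: Point, second_hand: Point) -> Rect:
    """Rectangle whose diagonal is the line between the two tracked hand parts."""
    return Rect(min(first_hand.x, second_hand.x), min(first_hand.y, second_hand.y),
                max(first_hand.x, second_hand.x), max(first_hand.y, second_hand.y))

def framed_devices(box: Rect, positions: Dict[str, Point]) -> List[str]:
    """Candidate second wearable devices: those whose preview position is inside the box."""
    return [name for name, pos in positions.items() if box.contains(pos)]
```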
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911418240.2 | 2019-12-31 | ||
CN201911418240.2A CN111124136A (en) | 2019-12-31 | 2019-12-31 | Virtual picture synchronization method and wearable device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021136266A1 true WO2021136266A1 (en) | 2021-07-08 |
Family
ID=70506863
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/140836 WO2021136266A1 (en) | 2019-12-31 | 2020-12-29 | Virtual image synchronization method and wearable device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111124136A (en) |
WO (1) | WO2021136266A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111124136A (en) * | 2019-12-31 | 2020-05-08 | 维沃移动通信有限公司 | Virtual picture synchronization method and wearable device |
CN111813220A (en) * | 2020-06-19 | 2020-10-23 | 深圳增强现实技术有限公司 | Interactive system based on augmented reality or virtual reality intelligent head-mounted equipment |
CN112256121B (en) * | 2020-09-10 | 2024-12-24 | 苏宁智能终端有限公司 | Implementation method and device of input method based on AR technology |
CN112631677A (en) * | 2020-12-21 | 2021-04-09 | 上海影创信息科技有限公司 | Resource support prompting method and system |
CN113301506B (en) * | 2021-05-27 | 2023-07-25 | 维沃移动通信有限公司 | Information sharing method, device, electronic equipment and medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130194304A1 (en) * | 2012-02-01 | 2013-08-01 | Stephen Latta | Coordinate-system sharing for augmented reality |
CN103460256A (en) * | 2011-03-29 | 2013-12-18 | 高通股份有限公司 | Anchoring virtual images to real world surfaces in augmented reality systems |
CN106796344A (en) * | 2014-10-07 | 2017-05-31 | 艾尔比特系统有限公司 | The wear-type of the enlarged drawing being locked on object of interest shows |
CN108513165A (en) * | 2017-02-28 | 2018-09-07 | 三星电子株式会社 | The method of shared content and the electronic equipment for supporting this method |
CN109074772A (en) * | 2016-01-25 | 2018-12-21 | 艾维赛特有限公司 | Content based on sight shares dynamic self-organization network |
CN111124136A (en) * | 2019-12-31 | 2020-05-08 | 维沃移动通信有限公司 | Virtual picture synchronization method and wearable device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3324270A1 (en) * | 2016-11-16 | 2018-05-23 | Thomson Licensing | Selection of an object in an augmented reality environment |
KR20190056523A (en) * | 2017-11-17 | 2019-05-27 | 삼성에스디에스 주식회사 | System and method for synchronizing display of virtual reality content |
- 2019
  - 2019-12-31 CN CN201911418240.2A patent/CN111124136A/en active Pending
- 2020
  - 2020-12-29 WO PCT/CN2020/140836 patent/WO2021136266A1/en active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103460256A (en) * | 2011-03-29 | 2013-12-18 | 高通股份有限公司 | Anchoring virtual images to real world surfaces in augmented reality systems |
US20130194304A1 (en) * | 2012-02-01 | 2013-08-01 | Stephen Latta | Coordinate-system sharing for augmented reality |
CN106796344A (en) * | 2014-10-07 | 2017-05-31 | 艾尔比特系统有限公司 | The wear-type of the enlarged drawing being locked on object of interest shows |
CN109074772A (en) * | 2016-01-25 | 2018-12-21 | 艾维赛特有限公司 | Content based on sight shares dynamic self-organization network |
CN108513165A (en) * | 2017-02-28 | 2018-09-07 | 三星电子株式会社 | The method of shared content and the electronic equipment for supporting this method |
CN111124136A (en) * | 2019-12-31 | 2020-05-08 | 维沃移动通信有限公司 | Virtual picture synchronization method and wearable device |
Also Published As
Publication number | Publication date |
---|---|
CN111124136A (en) | 2020-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021136266A1 (en) | Virtual image synchronization method and wearable device | |
CN109952757B (en) | Method for recording video based on virtual reality application, terminal equipment and storage medium | |
WO2021098678A1 (en) | Screencast control method and electronic device | |
WO2019228163A1 (en) | Speaker control method and mobile terminal | |
WO2019196929A1 (en) | Video data processing method and mobile terminal | |
US11954200B2 (en) | Control information processing method and apparatus, electronic device, and storage medium | |
CN109032486B (en) | Display control method and terminal equipment | |
WO2019149028A1 (en) | Application download method and terminal | |
CN109862258A (en) | A kind of image display method and terminal device | |
CN109151546A (en) | A kind of method for processing video frequency, terminal and computer readable storage medium | |
CN108628515B (en) | Multimedia content operation method and mobile terminal | |
WO2020233323A1 (en) | Display control method, terminal device, and computer-readable storage medium | |
US20220326764A1 (en) | Video trimming method and head-mounted device | |
WO2021136330A1 (en) | Bullet screen display control method and electronic device | |
CN110798621A (en) | Image processing method and electronic equipment | |
CN110866465A (en) | Control method of electronic equipment and electronic equipment | |
WO2019184902A1 (en) | Method for controlling icon display, and terminal | |
WO2021104162A1 (en) | Display method and electronic device | |
CN109547696B (en) | Shooting method and terminal equipment | |
CN109814825B (en) | Display screen control method and mobile terminal | |
WO2020168859A1 (en) | Photographing method and terminal device | |
CN111273885A (en) | AR image display method and AR equipment | |
CN111178306A (en) | Display control method and electronic equipment | |
CN111352505A (en) | Operation control method, head-mounted device, and medium | |
WO2021136265A1 (en) | Unlocking method and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20911074 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 20911074 Country of ref document: EP Kind code of ref document: A1 |
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 05.01.2023) |