WO2021136266A1 - Virtual image synchronization method and wearable device - Google Patents

Virtual image synchronization method and wearable device

Info

Publication number
WO2021136266A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearable device
display
virtual screen
input
screen
Prior art date
Application number
PCT/CN2020/140836
Other languages
French (fr)
Chinese (zh)
Inventor
凌深宏
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2021136266A1 publication Critical patent/WO2021136266A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Definitions

  • the embodiments of the present invention relate to the field of communication technology, and in particular, to a virtual screen synchronization method and a wearable device.
  • AR: augmented reality.
  • Embodiments of the present invention provide a virtual screen synchronization method and a wearable device, which can solve the problem in the related art that, when virtual information is shared between AR devices, the long storage and download process causes a long delay between the AR devices.
  • In a first aspect, an embodiment of the present invention provides a virtual screen synchronization method.
  • The method includes: receiving a first input from a user; and, in response to the first input, synchronizing a virtual picture of a first virtual screen of a first wearable device to a second virtual screen of a second wearable device for display; wherein the second wearable device is determined based on a target area in a shooting preview screen of a camera of the first wearable device, and the target area is the area selected by the first input.
  • In a second aspect, an embodiment of the present invention further provides a first wearable device. The first wearable device includes a receiving module and a synchronization module. The receiving module is configured to receive a first input from a user. The synchronization module is configured to, in response to the first input received by the receiving module, synchronize a virtual picture of a first virtual screen of the first wearable device to a second virtual screen of a second wearable device for display; wherein the second wearable device is determined based on a target area in the shooting preview screen of a camera of the first wearable device, and the target area is the area selected by the first input.
  • In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the virtual screen synchronization method according to the first aspect.
  • In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the virtual screen synchronization method according to the first aspect.
  • In the embodiments of the present invention, when the display screen of the target area of the first wearable device includes the second wearable device, the first wearable device synchronizes the virtual picture of its first virtual screen to the display area of the second wearable device, so that screen sharing by the first wearable device can be realized conveniently and quickly.
  • FIG. 1 is a schematic structural diagram of a possible operating system provided by an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of a method for synchronizing a virtual picture provided by an embodiment of the present invention
  • FIG. 3 is the first schematic diagram of an interface applied by a virtual screen synchronization method provided by an embodiment of the present invention.
  • FIG. 4 is a second schematic diagram of an interface applied by a virtual screen synchronization method provided by an embodiment of the present invention.
  • FIG. 5 is the third schematic diagram of an interface applied by a virtual screen synchronization method provided by an embodiment of the present invention.
  • FIG. 6 is a fourth schematic diagram of an interface applied by a virtual screen synchronization method provided by an embodiment of the present invention.
  • FIG. 7 is a fifth schematic diagram of an interface applied by a virtual screen synchronization method provided by an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a first wearable device according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention.
  • A/B may mean A or B. The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone.
  • The words "first", "second", and the like are used to distinguish between items that are the same or similar and have basically the same function or effect. Those skilled in the art can understand that words such as "first" and "second" do not limit quantity or execution order.
  • the first wearable device and the second wearable device are used to distinguish different wearable devices, rather than to describe a specific order of wearable devices.
  • The execution subject of the virtual screen synchronization method provided by the embodiments of the present invention may be the first wearable device, or a functional module and/or functional entity in the first wearable device capable of implementing the method; the specifics may be determined according to actual use requirements, which is not limited in the embodiments of the present invention.
  • The wearable device in the embodiments of the present invention may be AR glasses, an AR helmet, a smart bracelet, a smart watch, or the like. It should be noted that the first wearable device and the second wearable device may be the same type of wearable device (for example, both are AR glasses) or different types (for example, the first wearable device is AR glasses and the second wearable device is a mobile phone), which is not limited in the embodiments of the present invention.
  • the virtual screen in the embodiment of the present invention may be any carrier that can be used to display the content projected by the projection device when the AR technology is used to display content.
  • the projection device may be a projection device using AR technology, such as an electronic device, a wearable device, or an AR device in the embodiment of the present invention.
  • The projection device can project a virtual scene it has acquired (or internally integrated), or a virtual scene combined with a real scene, onto the virtual screen, so that the virtual screen displays the content and thereby shows the user the effect of the real scene and the virtual scene superimposed.
  • the virtual screen can usually be any possible carrier such as the display screen of an electronic device (such as a mobile phone), the lens of AR glasses, the windshield of a car, the wall of a room, and so on.
  • the following takes the virtual screen as the display screen of the electronic device, the lens of the AR glasses, and the windshield of the car as examples to illustrate the process of displaying content on the virtual screen by using the AR technology.
  • the projection device may be the electronic device.
  • The electronic device can capture the real scene in its area through its camera and display it on its display screen; the electronic device can then project the acquired (or internally integrated) virtual scene onto the display screen, so that the virtual scene is superimposed on the real scene and the user sees the superimposed effect of the two through the display screen of the electronic device.
  • the projection device may be the AR glasses.
  • When the user wears the AR glasses, the user can see the real scene in the area through the lenses, and the AR glasses can project the acquired (or internally integrated) virtual scene onto the lenses, so that the user sees the display effect of the real scene and the virtual scene superimposed through the lenses of the AR glasses.
  • the projection device may be any electronic device.
  • The user can see the real scene in the area through the windshield of the car, and the projection device can project the acquired (or internally integrated) virtual scene onto the windshield, so that the user sees the display effect of the real scene and the virtual scene superimposed through the windshield of the car.
  • the specific form of the virtual screen may not be limited, for example, it may be a non-carrier real space.
  • When the user is in a real space, the user can directly see the real scene in that space, and the projection device can project the acquired (or internally integrated) virtual scene into the real space, so that the user sees the display effect of the real scene and the virtual scene superimposed in the real space.
  • the wearable device in the embodiment of the present invention may be a wearable device with an operating system.
  • the operating system may be an Android operating system, an ios operating system, or other possible operating systems, which is not specifically limited in the embodiment of the present invention.
  • the following uses an operating system as an example to introduce the software environment to which the virtual screen synchronization method provided by the embodiment of the present invention is applied.
  • FIG. 1 shows a schematic structural diagram of a possible operating system provided by an embodiment of the present invention.
  • the architecture of the operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications in the operating system (including system applications and third-party applications).
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime layer includes a library (also called a system library) and an operating system runtime environment.
  • the library mainly provides various resources required by the operating system.
  • the operating system operating environment is used to provide a software environment for the operating system.
  • the kernel layer is the operating system layer of the operating system and belongs to the lowest level of the operating system software.
  • the kernel layer provides core system services and hardware-related drivers for the operating system based on the Linux kernel.
  • Developers can develop, based on the system architecture of the operating system shown in FIG. 1, a software program that implements the virtual screen synchronization method provided by the embodiments of the present invention, so that the method can run on the operating system shown in FIG. 1. That is, a processor or a wearable device can implement the virtual screen synchronization method by running the software program in the operating system.
  • FIG. 2 is a schematic flowchart of a virtual screen synchronization method provided by an embodiment of the present invention.
  • the virtual picture synchronization method provided by the embodiment of the invention includes the following steps 201 and 202:
  • Step 201 The first wearable device receives the first input of the user.
  • The above-mentioned first input may include: a specific gesture input by the user, a voice input by the user to the first wearable device, the user's input on a specific button of the first wearable device, or a specific posture of the user.
  • the above-mentioned specific gesture may be: a specific gesture input by the user in the shooting area of the camera of the first wearable device.
  • the above-mentioned specific gesture may be any one of a sliding gesture, a palm hovering gesture, a click gesture, a long-press gesture, an area change gesture, a single-finger hovering gesture, and a multi-finger hovering gesture. There is no restriction on this.
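The gesture set listed above can be sketched as a simple dispatch check; the gesture names and the function below are illustrative placeholders, not identifiers from the patent:

```python
# Hypothetical sketch: which recognized gestures count as the "first input"
# that triggers virtual screen synchronization. Names are illustrative.
FIRST_INPUT_GESTURES = {
    "slide", "palm_hover", "click", "long_press",
    "area_change", "single_finger_hover", "multi_finger_hover",
}

def is_first_input(gesture: str) -> bool:
    """Return True if the recognized gesture should start Step 202."""
    return gesture in FIRST_INPUT_GESTURES
```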
  • Step 202: In response to the first input, the first wearable device synchronizes the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display.
  • the second wearable device is determined based on the target area in the shooting preview screen of the camera of the first wearable device, and the target area is the area selected by the first input.
  • The target area selected by the user's first input may be a target area determined by the user through a button on the first wearable device, or one determined by the user controlling the first wearable device through voice input.
  • the shooting preview picture of the first wearable device includes: a real picture taken by a camera of the first wearable device and a virtual picture generated based on the real picture.
  • the aforementioned virtual picture is: virtual information generated by the aforementioned first wearable device according to a target object photographed by a camera set on the first wearable device.
  • For example, the virtual picture may include the length, width, and height of a table, displayed on the first virtual screen in the form of identifiers.
  • After the first wearable device synchronizes the virtual picture of its first virtual screen to the second virtual screen of the second wearable device for display, when the second virtual screen displays the target object, the second wearable device marks the received virtual information on the target object.
  • Example 1: In a case that the second wearable device has received the length, width, and height identifiers of the table synchronized by the first wearable device, when the second virtual screen of the second wearable device includes the table, the second wearable device displays the length, width, and height of the table on the second virtual screen.
  • Example 2: Taking the case where the first wearable device and the second wearable device are both AR glasses as an example, a user uses AR glasses 1 to play a game (for example, making a cake on a table). After the game is over, the user sends the virtual picture of AR glasses 1 (for example, the finished cake) to another user's AR glasses 2. When the table appears in the shooting preview screen of AR glasses 2, the virtual picture synchronized by AR glasses 1 is displayed in that preview screen (for example, the cake made by the user of AR glasses 1 is displayed on the table).
  • Alternatively, the second wearable device may display the virtual picture directly on the second virtual screen.
  • Example 3: Taking the case where the first wearable device and the second wearable device are both AR glasses as an example, a user uses AR glasses 1 to play a game (for example, building a house with virtual building blocks). After the game is over, the user sends the virtual picture of AR glasses 1 (for example, the house built by the user) to another user's AR glasses 2, and the AR glasses 2 can directly display the house that the user built with AR glasses 1.
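The two receiving behaviors described above (marking synchronized virtual information on a target object once it appears, as in Examples 1 and 2, or displaying it directly, as in Example 3) can be sketched as follows; every class and field name here is a hypothetical illustration, not from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualPicture:
    """Virtual information anchored to a real-world target object."""
    anchor_object: str                       # e.g. "table"
    annotations: dict = field(default_factory=dict)

class SecondWearable:
    """Sketch of the receiving side of the synchronization."""

    def __init__(self):
        self.pending = []    # VirtualPictures awaiting their anchor object
        self.overlay = {}    # what the second virtual screen currently shows

    def receive_sync(self, picture):
        self.pending.append(picture)

    def on_preview_frame(self, visible_objects):
        # Mark received virtual info on its target object once it is seen
        # in the shooting preview screen.
        for pic in self.pending:
            if pic.anchor_object in visible_objects:
                self.overlay[pic.anchor_object] = pic.annotations
```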
  • the first wearable device determines the target area in the display area of its first virtual screen, and determines the wearable device within the range of the target area as the second wearable device.
  • After the first wearable device determines the second wearable device within the range of the target area, in order to synchronize the virtual picture of its first virtual screen to the second virtual screen of the second wearable device for display, the first wearable device needs to establish a connection with the second wearable device.
  • the first wearable device may establish a communication connection between the first wearable device and the second wearable device before synchronizing the virtual screen.
  • the virtual screen synchronization method provided in the embodiment of the present invention may further include the following steps 301 and 302:
  • Step 301: The first wearable device sends an infrared signal to the second wearable device through its infrared emitting device, where the infrared signal includes the device feature code of the first wearable device, so that the second wearable device can establish a connection with the first wearable device through the device feature code.
  • the first wearable device may include: an infrared transmitting device; the second wearable device may further include: an infrared receiving device.
  • Step 302 After receiving the infrared signal sent by the first wearable device, the infrared receiving device of the second wearable device establishes a connection with the first wearable device through the device feature code.
  • the first wearable device synchronizes the virtual screen of the first virtual screen to the second virtual screen of the second wearable device for display.
  • the infrared signal sent by the first wearable device to the second wearable device is sent by directional transmission.
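Steps 301 and 302 amount to a one-way handshake keyed by the device feature code. The sketch below is an assumption about how the receiving side might use that code; the signal layout and names are illustrative, not the patent's protocol details:

```python
class InfraredReceiver:
    """Hypothetical sketch of the second wearable device's infrared
    receiving side (Step 302)."""

    def __init__(self):
        self.peer_code = None   # feature code of the connected first device

    def on_signal(self, signal: dict) -> bool:
        # Establish a connection using the sender's device feature code.
        code = signal.get("device_feature_code")
        if code is None:
            return False        # malformed signal: no connection established
        self.peer_code = code
        return True
```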
  • When the user wants to enable the virtual screen synchronization function provided by the embodiment of the present invention, as shown in FIG. 3(A), the shooting preview screen of the first wearable device includes the virtual control 30 and other users who are using wearable devices. The user can start the virtual screen synchronization method by keeping a finger of one hand overlapping the virtual control 30 for 2 seconds.
  • the first wearable device provided in the embodiment of the present invention is provided with an image acquisition device (for example, a camera).
  • The image capture device can capture a real-time image of the scene in front of the user wearing the first wearable device, so that the first wearable device can recognize, from the captured image, the user's specific gesture input in the camera's shooting area.
  • the virtual screen synchronization method provided by the embodiment of the present invention can be applied in a variety of scenarios, and it is convenient for the user to select the second wearable device with which the virtual screen needs to be synchronized by using different selection methods.
  • When the user needs to synchronize the virtual picture with multiple second wearable devices at the same time, the user can create a selection box to select multiple second wearable devices and perform virtual screen synchronization with them.
  • In order to select the second wearable device quickly and reduce the data processing load of the first wearable device, the first wearable device may create a selection box to select the second wearable device.
  • step 202 may further include the following steps 202a and 202b:
  • Step 202a In response to the first input, the first wearable device creates a selection box on the shooting preview screen.
  • Step 202b The first wearable device determines the wearable device selected by the selection box as the second wearable device, and synchronizes the virtual screen to the second virtual screen of the second wearable device for display.
  • the shape of the foregoing selection frame may be any possible shape such as a circle, a rectangle, a triangle, a diamond, a ring, or a polygon, which may be specifically determined according to actual usage requirements, and is not limited in the embodiment of the present invention.
  • Taking the case where the first wearable device and the second wearable device are both AR glasses as an example, as shown in FIG. 3, the first virtual screen of the first wearable device displays a virtual control for the virtual screen synchronization function (30 in FIG. 3).
  • The user's finger moves to the position of the virtual control 30 and stays there for 2 seconds to turn on the virtual screen synchronization function.
  • After the screen synchronization function is turned on, as shown in FIG. 3, the first wearable device creates a rectangular selection box (31 in FIG. 3) on the display screen. The selection box 31 is used to select the wearable device in the shooting preview screen of the first wearable device.
  • the first wearable device only needs to identify the wearable device in the selection box, which reduces the data processing load of the first wearable device and at the same time reduces the energy consumption of the first wearable device.
  • The first wearable device can expand or shrink the range of the selection box according to the user's specific input, as actual needs require.
  • the foregoing step 202a may include the following steps 202a1 and 202a2:
  • Step 202a1: In a case that a first hand and a second hand are included in the shooting preview screen of the first wearable device, the first wearable device displays a selection box on the shooting preview screen based on the diagonal of a target rectangle.
  • Step 202a2 upon receiving the second sub-input of the user, the first wearable device updates the display of the selection box.
  • the first sub-input is used to trigger the first wearable device to create a selection box
  • the second sub-input is used to trigger the first wearable device to adjust the size of the selection box.
  • the diagonal line of the target rectangle is the line between the first part of the first hand and the second part of the second hand.
  • the first sub-input of the first hand and the second hand may be a gesture input of the user extending the fingers of both hands.
  • When the first wearable device recognizes the gesture input, it displays, on its shooting preview screen, a selection box for selecting the second wearable device.
  • the above-mentioned second sub-input may be a gesture input in which the user extends a finger in both hands and expands to both sides along the diagonal line (that is, the diagonal line of the above-mentioned selection box) or gathers in the middle.
  • the first wearable device recognizes the gesture of extending a finger with both hands in the shooting preview screen, it starts to create a frame selection area, creates a rectangular selection frame based on the diagonal line of the finger, and adjusts the selection by moving the fingers of both hands The size of the frame.
  • For example, after the first wearable device has turned on the screen synchronization function and recognizes the user's gesture of extending one finger on each hand, it creates a rectangular selection box of a preset size at a target position on the display screen. The user can move the fingers of both hands to the two end points of the rectangular selection box 41 and adjust its size and position by moving the fingers. For example, the preset-size rectangular selection box 41 created by the first wearable device can be adjusted to the selection box 42 shown in FIG. 4(B) in the above manner.
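Creating a rectangular selection box from the diagonal between two fingertips, and resizing it as the fingers move, reduces to recomputing an axis-aligned rectangle from two points. A minimal sketch, with coordinates assumed to be preview-screen pixels and all names hypothetical:

```python
def selection_box(p1, p2):
    """Axis-aligned rectangle (left, top, right, bottom) whose diagonal
    joins the two fingertip points p1 and p2."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

# Resizing: as either fingertip moves, recompute the box from the new points.
```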
  • In this way, the user can display a selection box on the shooting preview screen of the first wearable device through gesture input, and can adjust its size through gestures, so that the wearable device whose screen is to be synchronized is included in the selection box.
  • The size of the selection box then needs to be locked, so that the user can no longer adjust it, for example when the user's fingers move out of the shooting preview screen.
  • the virtual screen synchronization method provided in the embodiment of the present invention may further include the following step 202a3:
  • Step 202a3 In the case of receiving the user's third sub-input, the first wearable device locks the display of the selection box.
  • the above-mentioned third sub-input may be a gesture input of the user spreading the palms of both hands.
  • When the first wearable device detects, in the shooting preview screen, the gesture input of the user spreading the palms of both hands, the size of the selection box is locked.
  • the above-mentioned locking the size of the selection box includes: fixing the size and position of the selection box.
  • the creation of the selection box is completed, and the wearable device contained in the selection box is determined as the second wearable device.
  • Taking the case where the first wearable device and the second wearable device are both AR glasses as an example, as shown in FIG. 5, when the AR glasses recognize that the user's gesture in the display screen has become one of spreading the palms of both hands, the creation of the selection box 31 is completed; the AR glasses then recognize whether users wearing AR glasses are included in the selection box, search for the AR glasses in the selection box, and synchronously display the screen to them.
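Once the box is locked, determining the second wearable devices reduces to a point-in-rectangle test over the device positions recognized in the preview screen. A sketch under assumed data shapes (box as `(left, top, right, bottom)`, positions as pixel coordinates):

```python
def devices_in_box(box, device_positions):
    """Return names of devices whose preview-screen position lies inside
    the locked selection box (left, top, right, bottom)."""
    left, top, right, bottom = box
    return [
        name
        for name, (x, y) in device_positions.items()
        if left <= x <= right and top <= y <= bottom
    ]
```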
  • After the first wearable device determines the second wearable device from the selection box, it needs to mark the location of the second wearable device in the selection box.
  • the virtual screen synchronization method provided in the embodiment of the present invention further includes the following step 202b1:
  • Step 202b1: The first wearable device displays a first identifier in a first area on the shooting preview screen.
  • the first identifier is used to mark the second wearable device, and the first area is the area where the second wearable device is located.
  • the above-mentioned first area is the area where the second wearable device is located in the selection frame in the shooting preview screen of the first wearable device.
  • To remind the user of the location of the second wearable device in the shooting preview screen, the first wearable device displays the first identifier within the area where the second wearable device is located.
  • For example, the first identifier may be a red dot displayed in the first area to mark the second wearable device in the shooting preview screen of the first wearable device.
  • As shown in FIG. 5, the first wearable device recognizes the users wearing AR glasses in the selection box 31, searches for the AR glasses within the selection box 31, and displays a dot mark (that is, the above-mentioned first identifier, 32 in FIG. 5) at the position of each of the two AR glasses in the selection box 31, thereby marking the positions of the AR glasses.
  • the user can clearly see the position of the second wearable device in the selection frame of the shooting preview screen of the first wearable device, which is convenient for the user to perform secondary screening.
  • When the selection range is large, it is likely to include wearable devices with which the user does not want to synchronize the virtual screen; at this time, the user can filter the second wearable devices in the selection box.
  • the virtual picture synchronization method provided in the embodiment of the present invention may include the following step 202b2:
  • Step 202b2: When the second wearable device is blocked by a target object, the first wearable device cancels the display of the first identifier.
  • the aforementioned target object may be the palm of the user, or other opaque objects.
  • The purpose of using the target object to block the second wearable device is to prevent the blocked wearable device from appearing in the selection frame of the shooting preview screen of the first wearable device. In this way, the first wearable device does not determine it as the second wearable device.
  • the above-mentioned target object is also used to block the infrared signal sent by the first wearable device to the second wearable device.
  • Take the case where both the first wearable device and the second wearable device are AR glasses as an example. After the user completes the creation of the selection box, if the selection box contains AR glasses with which the user does not want to synchronize the virtual screen, then, as shown in Fig. 6, the user can use a one-hand open-palm gesture to cover those AR glasses for 2 seconds. The object blocked by the gesture is deselected and the first identifier at the position of those AR glasses disappears, so that the display screen of the first wearable device will not be synchronized to the display area of the second wearable device blocked by the open-palm gesture.
  • In this way, a second wearable device can be blocked to prevent the virtual screen from being synchronized with it.
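The marking and blocking behaviour of steps 202b1 and 202b2 can be sketched as a small selection routine. This is an illustrative sketch only, not the patented implementation: the input format (device positions in the selection box and a set of occluded device ids reported by the headset's recognition pipeline) is an assumption.

```python
def update_markers(devices_in_box, occluded_ids):
    """Return {device_id: marker_position} for the devices that keep
    their first identifier; a device blocked by the target object
    (e.g. an open palm) is deselected and its marker is removed."""
    markers = {}
    for dev_id, position in devices_in_box.items():
        if dev_id in occluded_ids:
            continue  # blocked: cancel display of the first identifier
        markers[dev_id] = position
    return markers

# Two pairs of AR glasses detected; the user covers glasses_b with a palm:
devices = {"glasses_a": (120, 80), "glasses_b": (300, 95)}
markers = update_markers(devices, occluded_ids={"glasses_b"})
```

Only the devices that still carry a marker would then receive the synchronized virtual screen.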
  • The user can drive the first wearable device to rotate by turning the head, so as to align the second identifier displayed on the first virtual screen of the first wearable device with the wearable device with which the user wants to synchronize the virtual screen.
  • the virtual screen of the first wearable device is synchronized to the second wearable device.
  • the virtual screen synchronization method provided in the embodiment of the present invention may include the following step 202c1:
  • Step 202c1 the first wearable device displays the second identifier at the target location of the target area.
  • the foregoing step 202 may further include the following step 202c2:
  • Step 202c2: the first wearable device determines the wearable device in the second area containing the second identifier as the second wearable device, and synchronizes the virtual screen to the second virtual screen of the second wearable device for display.
  • the second identifier is displayed at the target location of the target area.
  • the target area may be the entire display area of the shooting preview screen of the first wearable device, and the target position may be the center position of the aforementioned target area.
  • The above-mentioned second identifier may be a cross auxiliary line, or may be an image, or may be another mark used to mark the second wearable device, which is not limited in the embodiment of the present invention.
  • the first wearable device determines the wearable device within the second area range as the second wearable device.
  • the foregoing second area is an area where the second wearable device is located.
  • the wearable device can be determined as the second wearable device, and the virtual screen can be synchronized with it.
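Steps 202c1 and 202c2 amount to a hit test: the wearable device whose on-screen area contains the second identifier becomes the synchronization target. A minimal sketch, assuming device areas are reported as (left, top, right, bottom) rectangles in preview-screen coordinates (the coordinate convention is an assumption, not stated in the source):

```python
def select_by_identifier(identifier_pos, device_areas):
    """Return the id of the wearable device whose area contains the
    second identifier, or None if the identifier overlaps no device."""
    x, y = identifier_pos
    for dev_id, (left, top, right, bottom) in device_areas.items():
        if left <= x <= right and top <= y <= bottom:
            return dev_id  # this device is determined as the second wearable device
    return None

areas = {"glasses_a": (100, 50, 200, 150), "glasses_b": (300, 50, 400, 150)}
target = select_by_identifier((160, 90), areas)  # hits glasses_a's area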
  • the user can control the rotation of the first wearable device by rotating the head, and align the second mark displayed on the shooting preview screen with the wearable device with which the user wants to synchronize the virtual screen.
  • the virtual screen synchronization method provided in the embodiment of the present invention may include the following steps 202c3 and 202c4:
  • Step 202c3: the first wearable device obtains the rotation direction and the rotation angle of the first wearable device.
  • Step 202c4: the first wearable device updates the display position of the second identifier based on the rotation direction and the rotation angle.
  • the user can control the rotation of the first wearable device by rotating the head or body.
  • a head-mounted device eg, AR glasses, AR helmet, etc.
  • If the shooting preview screen of the first wearable device contains the wearable device with which the user wants to synchronize the virtual screen, but the second identifier is not aligned with the area of that wearable device, the user of the first wearable device can align the second identifier with the wearable device by turning the head.
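Steps 202c3 and 202c4 update the identifier's display position from the head rotation reported by the device's motion sensors. The sketch below assumes a simple linear mapping from rotation angle to screen pixels; the `px_per_deg` calibration constant and the direction names are made up for illustration.

```python
def update_identifier_position(pos, direction, angle_deg, px_per_deg=10.0):
    """Shift the second identifier's on-screen position according to the
    rotation direction and angle of the first wearable device."""
    x, y = pos
    dx = {"left": -1, "right": 1}.get(direction, 0) * angle_deg * px_per_deg
    dy = {"up": -1, "down": 1}.get(direction, 0) * angle_deg * px_per_deg
    return (x + dx, y + dy)

# Turning the head 2 degrees to the right moves the identifier rightward:
new_pos = update_identifier_position((100, 100), "right", 2.0)
```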
  • For example, the first virtual screen of the first wearable device displays a virtual control for the virtual screen synchronization function (60 in Fig. 7). When the user wants to turn on the virtual screen synchronization function, the user moves a finger to the position of the synchronization function virtual control 60 and keeps it there for 2 seconds to turn on the screen synchronization function.
  • After the screen synchronization function is turned on, as shown in Fig. 7, the first wearable device creates a cross auxiliary line 61 on the shooting preview screen, and the user can turn the head or move the body to point the cross auxiliary line 61 at the wearable device with which the user wants to synchronize the virtual screen.
  • The virtual screen of the above-mentioned first wearable device is then synchronized to the wearable device whose position overlaps with the second identifier. When the second virtual screen of that wearable device is seen displaying the synchronized content in the shooting preview screen of the first wearable device, the transmission is complete.
  • the user can continue to turn his head and continue to select the next wearable device that needs to be synchronized with the virtual screen.
  • In this way, the user can determine the second wearable device by turning the head or moving the body without raising the hands, which avoids the situation where a user holding objects in both hands has to put them down in order to manipulate the first wearable device through gestures.
  • The user can also use voice input, a press of a physical button on the first wearable device, or other gesture input methods to synchronize the virtual screen of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display.
  • For example, the user moves a single-finger hovering gesture to the location of the virtual control, and after hovering for a preset time, the virtual screen of the first virtual screen of the first wearable device is synchronized to the second virtual screen of the second wearable device for display.
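The "hover for a preset time" trigger described above is a dwell-gesture check. A minimal sketch, assuming fingertip positions arrive as timestamped samples from the gesture tracker (this input format, and the 2-second default matching the example in the text, are assumptions):

```python
def dwell_triggered(samples, control_rect, preset_time=2.0):
    """samples: list of (timestamp_seconds, (x, y)) fingertip positions.
    Returns True once the fingertip has stayed inside the virtual
    control's rectangle continuously for at least preset_time."""
    left, top, right, bottom = control_rect
    start = None
    for t, (x, y) in samples:
        inside = left <= x <= right and top <= y <= bottom
        if inside:
            start = t if start is None else start
            if t - start >= preset_time:
                return True  # dwell complete: trigger synchronization
        else:
            start = None  # fingertip left the control: reset the timer
    return False
```

Leaving the control's area at any point resets the dwell timer, so an accidental pass over the control does not trigger synchronization.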
  • The aforementioned first wearable device may adopt the 4th generation mobile communication technology (4G), the 5th generation mobile communication technology (5G), or wireless fidelity (Wi-Fi) for transmission.
  • With the virtual screen synchronization method provided by the embodiment of the present invention, by acquiring the shooting preview screen of the first wearable device, if the second wearable device is included in the target area of the shooting preview screen, the virtual screen of the first virtual screen of the first wearable device is synchronized to the second virtual screen of the second wearable device for display. This can easily and quickly realize screen sharing by the first wearable device, and to a certain extent avoids the problem in the traditional technology that the time-consuming storage and download process for sharing virtual information between AR devices causes large delays between the AR devices.
  • The virtual screen synchronization methods shown in the above method figures are each illustrated with reference to one figure in the embodiments of the present invention. In specific implementation, the virtual screen synchronization methods shown in the above method figures may also be implemented in combination with any other combinable figures illustrated in the above embodiments, and details are not repeated here.
  • FIG. 8 is a schematic diagram of a possible structure for implementing a first wearable device provided by an embodiment of the present invention.
  • the first wearable device 500 includes: a receiving module 501 and a synchronization module 502, wherein:
  • the receiving module 501 is configured to receive the first input of the user.
  • The synchronization module 502 is configured to, in response to the first input received by the receiving module 501, synchronize the virtual screen of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display.
  • the second wearable device is determined based on the target area in the shooting preview picture of the camera of the first wearable device, and the target area is the area selected by the first input.
  • the first wearable device 500 further includes: a creation module 503 and a determination module 504.
  • the creation module 503 is used to respond to the first input and create a selection box on the shooting preview screen.
  • the determining module 504 is configured to determine the wearable device selected by the selection box created by the creating module 503 as the second wearable device.
  • The synchronization module 502 is further configured to synchronize the virtual screen to the second virtual screen of the second wearable device determined by the determining module 504 for display.
  • the first wearable device only needs to identify the wearable device in the selection box, which reduces the data processing load of the first wearable device and at the same time reduces the energy consumption of the first wearable device.
  • the first wearable device 500 further includes: a display module 505.
  • the first input includes: a first sub-input of the user's first hand and second hand; the first sub-input is used to trigger the first wearable device to create a selection box.
  • The display module 505 is configured to display a selection box on the shooting preview screen based on the diagonal of the target rectangle in a case that the first part of the first hand and the second part of the second hand are included in the shooting preview screen.
  • the display module 505 is also configured to update the display of the selection box when the receiving module 501 receives the second sub-input of the user.
  • the diagonal of the target rectangle is the line between the first part of the first hand and the second part of the second hand.
  • In this way, the user can display a selection box on the shooting preview screen of the first wearable device through gesture input, and can adjust the size of the selection box through gestures, so that the wearable device with which the user wants to synchronize the virtual screen is included in the selection box.
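The selection box built by the creation module can be illustrated as follows: its diagonal runs between the tracked point on the first hand and the tracked point on the second hand, and the devices whose preview-screen positions fall inside the box become synchronization candidates. A hedged sketch; the 2-D point inputs are an assumed representation of the hand parts detected in the preview screen.

```python
def selection_box(p1, p2):
    """Build the selection box whose diagonal is the line between the
    first part of the first hand (p1) and the second part of the
    second hand (p2). Returns (left, top, right, bottom)."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

def devices_in_box(box, device_positions):
    """Return the ids of the devices whose positions lie inside the box."""
    left, top, right, bottom = box
    return {d for d, (x, y) in device_positions.items()
            if left <= x <= right and top <= y <= bottom}

# A second sub-input that moves either hand simply rebuilds the box,
# which is how the display of the selection box would be updated:
box = selection_box((50, 200), (250, 40))
candidates = devices_in_box(box, {"glasses_a": (100, 100), "glasses_b": (400, 100)})
```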
  • the display module 505 is further configured to lock the display of the selection box when the receiving module 501 receives the third sub-input of the user.
  • the display module 505 is further configured to display a first identifier in the first area on the shooting preview screen, the first identifier is used to mark the second wearable device, and the first area is the area where the second wearable device is located.
  • the user can clearly see the position of the second wearable device in the selection frame of the shooting preview screen of the first wearable device, which is convenient for the user to perform secondary screening.
  • The display module 505 is further configured to cancel the display of the first identifier when the second wearable device is blocked by the target object.
  • In this way, a second wearable device can be blocked to prevent the virtual screen from being synchronized with it.
  • the display module 505 is further configured to display the second identifier at the target position of the target area.
  • the determining module 504 is further configured to determine the wearable device in the second area including the second identifier as the second wearable device.
  • The synchronization module 502 is also configured to synchronize the virtual screen to the second virtual screen of the second wearable device determined by the determining module 504 for display.
  • the wearable device can be determined as the second wearable device, and the virtual screen can be synchronized with it.
  • the first wearable device further includes: an obtaining module 506.
  • the obtaining module 506 is used to obtain the rotation direction and the rotation angle of the first wearable device.
  • The display module 505 is further configured to update the display position of the second identifier based on the rotation direction and the rotation angle acquired by the obtaining module 506.
  • In this way, the user can determine the second wearable device by turning the head or moving the body without raising the hands, which avoids the situation where a user holding objects in both hands has to put them down in order to manipulate the first wearable device through gestures.
  • With the wearable device, by acquiring the shooting preview screen of the first wearable device, when the second wearable device is included in the target area of the shooting preview screen, the virtual screen of the first virtual screen of the first wearable device is synchronized to the second virtual screen of the second wearable device for display. This can easily and quickly realize screen sharing by the first wearable device, and to a certain extent avoids the problem in the traditional technology that the time-consuming storage and download process for sharing virtual information between AR devices causes large delays between the AR devices.
  • the electronic device provided in the embodiment of the present invention can implement each process implemented by the wearable device in the foregoing method embodiment, and to avoid repetition, details are not described herein again.
  • the electronic device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, and a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, a camera assembly 112 and other components.
  • The electronic device 100 may include more or fewer components than those shown in the figure, or a combination of certain components, or different component arrangements.
  • the aforementioned camera component 112 includes a camera.
  • the electronic device 100 includes, but is not limited to, a mobile phone
  • the processor 110 can recognize the user's first sub-input and send a first instruction to the display unit 106 according to the first sub-input.
  • In response to the first instruction sent by the processor 110, the display unit 106 displays a selection frame on the shooting preview screen of the first wearable device based on the diagonal of the target rectangle, where the diagonal of the target rectangle is the line between the first part of the first hand and the second part of the second hand.
  • the processor 110 may also recognize the user's second sub-input, and send a second instruction to the display unit 106 according to the second sub-input.
  • In response to the second instruction sent by the processor 110, the display unit 106 updates the display of the selection box on the shooting preview screen of the first wearable device based on the diagonal of the target rectangle.
  • The electronic device provided by the embodiment of the present invention obtains the shooting preview screen of the first wearable device, and in the case that the second wearable device is included in the target area of the shooting preview screen, synchronizes the virtual screen of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display. This can easily and quickly realize screen sharing by the first wearable device, and to a certain extent avoids the long delay between AR devices caused in the traditional technology by the long storage and download process for virtual information.
  • The radio frequency unit 101 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and handed to the processor 110 for processing; in addition, uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device 100 provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • In the embodiment of the present invention, the electronic device 100 obtains a real-time picture taken by the camera in the camera component 112 (for example, the shooting preview screen of the first wearable device), and displays it on the display unit 106.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041, a microphone 1042, and an image capture device 1043.
  • The graphics processor 1041 is configured to process image data of still pictures or videos obtained by the image capture device in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frames processed by the graphics processor 1041 can be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • In the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and then output.
  • the electronic device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • The proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear.
  • As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as horizontal/vertical screen switching, related games, magnetometer attitude calibration) and for vibration recognition related functions (such as pedometer, tapping); the sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 107 may be used to receive input numeric or character information, and generate key signal input related to user settings and function control of the electronic device 100.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 110; it also receives and executes the commands sent by the processor 110.
  • the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • After the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides corresponding visual output on the display panel 1061 according to the type of the touch event.
  • Although in the figure the touch panel 1071 and the display panel 1061 are used as two independent components to implement the input and output functions of the electronic device 100, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the electronic device 100, which is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device with the electronic device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • The interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic device 100, or can be used to transfer data between the electronic device 100 and an external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • The program storage area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.), and the like.
  • The memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The processor 110 is the control center of the electronic device 100. It uses various interfaces and lines to connect the various parts of the entire electronic device 100, and executes the various functions of the electronic device 100 and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the electronic device 100 as a whole.
  • The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 110.
  • The electronic device 100 may also include a power source 111 (such as a battery) for supplying power to the various components. Optionally, the power source 111 may be logically connected to the processor 110 through a power management system, so as to manage charging, discharging, power consumption, and other functions through the power management system.
  • the electronic device 100 includes some functional modules not shown, which will not be repeated here.
  • An embodiment of the present invention further provides an AR device, including a processor, a memory, and a computer program stored in the memory and runnable on the processor. When the computer program is executed by the processor, each process of the foregoing virtual screen synchronization method embodiment is realized and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the electronic device in the foregoing embodiment may be an AR device.
  • the AR device may include all or part of the functional modules in the foregoing electronic device.
  • the AR device may also include functional modules not included in the above electronic device.
  • the electronic device in the foregoing embodiment is an AR device
  • the electronic device may be an electronic device integrated with AR technology.
  • the above-mentioned AR technology refers to a technology that realizes the combination of a real scene and a virtual scene.
  • the use of AR technology can restore human visual functions, so that humans can experience the combination of real scenes and virtual scenes through AR technology, so that humans can better experience the immersive feelings.
  • the AR device when the user wears the AR glasses, the scene that the user sees is generated through AR technology processing, that is, the virtual scene can be superimposed and displayed in the real scene through the AR technology.
  • the user manipulates the content displayed by the AR glasses, the user can see the AR glasses "peel off" the real scene, thereby showing the user a more realistic side.
  • the user can only observe the carton shell when observing a carton with naked eyes, but when the user wears AR glasses, the user can directly observe the internal structure of the carton through the AR glasses.
  • the above-mentioned AR device may include a camera, so that the AR device can display and interact with a virtual screen based on the image captured by the camera.
  • the AR device can synchronize the virtual screen information generated when the user uses the AR device for entertainment activities to the display screen of other AR devices, so that the virtual screen can be shared between AR devices.
  • The embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the above virtual screen synchronization method embodiment is realized and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium such as read-only memory (Read-Only Memory, ROM for short), random access memory (Random Access Memory, RAM for short), magnetic disk, or optical disk, etc.
  • The technical solution of this application, in essence or in the part that contributes to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to make an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) execute the methods described in the various embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention provide a virtual image synchronization method and a wearable device. The method comprises: receiving a first input of a user; and in response to the first input, synchronizing a virtual image of a first virtual screen of a first wearable device to a second virtual screen of a second wearable device for display, wherein the second wearable device is determined based on a target area in a photographing preview image of a camera of the first wearable device, and the target area is an area selected by the first input.

Description

虚拟画面同步方法及穿戴式设备Virtual screen synchronization method and wearable equipment
相关申请的交叉引用Cross-references to related applications
本申请主张在2019年12月31日在中国提交的中国专利申请号No.201911418240.2的优先权,其全部内容通过引用包含于此。This application claims the priority of Chinese Patent Application No. 201911418240.2 filed in China on December 31, 2019, the entire content of which is incorporated herein by reference.
技术领域Technical field
本发明实施例涉及通信技术领域,尤其涉及一种虚拟画面同步方法及穿戴式设备。The embodiments of the present invention relate to the field of communication technology, and in particular, to a virtual screen synchronization method and a wearable device.
背景技术Background technique
随着增强现实(augmented reality,AR)技术的不断发展,用户使用AR设备的频率越来越高。以AR眼镜为例,在AR眼镜的使用中,由于工作交流、游戏分享等原因,当用户想要将AR眼镜内的虚拟信息与其他使用AR眼镜的用户共享时,通常需要用户将AR眼镜内的虚拟信息(例如,AR眼镜的显示画面)上传到服务器中,再由其他用户下载。With the continuous development of augmented reality (AR) technology, users use AR devices more and more frequently. Take AR glasses as an example. In the use of AR glasses, due to work communication, game sharing and other reasons, when users want to share the virtual information in the AR glasses with other users who use AR glasses, the user is usually required to put the AR glasses into the AR glasses. The virtual information (for example, the display screen of AR glasses) is uploaded to the server, and then downloaded by other users.
然而,由于上述存储和下载过程费时较长,导致用户的AR眼镜之间的画面存在较大延时。However, due to the time-consuming process of storing and downloading as described above, the picture between the AR glasses of the user has a large delay.
Summary
Embodiments of the present invention provide a virtual picture synchronization method and a wearable device, which can solve the problem in the related art that, when virtual information is shared between AR devices, the time-consuming storage and download process causes a large delay between the AR devices.
To solve the above technical problem, the present application is implemented as follows.
In a first aspect, an embodiment of the present invention provides a virtual picture synchronization method. The method includes: receiving a first input from a user; and in response to the first input, synchronizing a virtual picture of a first virtual screen of a first wearable device to a second virtual screen of a second wearable device for display, wherein the second wearable device is determined based on a target area in a shooting preview picture of a camera of the first wearable device, and the target area is an area selected by the first input.
In a second aspect, an embodiment of the present invention further provides a first wearable device. The first wearable device includes a receiving module and a synchronization module. The receiving module is configured to receive a first input from a user. The synchronization module is configured to, in response to the first input received by the receiving module, synchronize a virtual picture of a first virtual screen of the first wearable device to a second virtual screen of a second wearable device for display, wherein the second wearable device is determined based on a target area in a shooting preview picture of a camera of the first wearable device, and the target area is an area selected by the first input.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the virtual picture synchronization method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the virtual picture synchronization method according to the first aspect.
In the embodiments of the present invention, when the display picture of the target area of the first wearable device contains the second wearable device, the first wearable device synchronizes the virtual picture of the first virtual screen of the first wearable device to the display area of the second wearable device, so that the picture of the first wearable device can be shared conveniently and quickly.
Brief Description of the Drawings
FIG. 1 is a schematic architectural diagram of a possible operating system according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a virtual picture synchronization method according to an embodiment of the present invention;
FIG. 3 is a first schematic diagram of an interface to which a virtual picture synchronization method according to an embodiment of the present invention is applied;
FIG. 4 is a second schematic diagram of an interface to which a virtual picture synchronization method according to an embodiment of the present invention is applied;
FIG. 5 is a third schematic diagram of an interface to which a virtual picture synchronization method according to an embodiment of the present invention is applied;
FIG. 6 is a fourth schematic diagram of an interface to which a virtual picture synchronization method according to an embodiment of the present invention is applied;
FIG. 7 is a fifth schematic diagram of an interface to which a virtual picture synchronization method according to an embodiment of the present invention is applied;
FIG. 8 is a schematic structural diagram of a first wearable device according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings of the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
It should be noted that "/" herein means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate three cases: A exists alone, both A and B exist, and B exists alone.
It should be noted that "a plurality of" herein means two or more than two.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, an illustration, or a description. Any embodiment or design scheme described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as being preferred over, or more advantageous than, other embodiments or design schemes. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.
It should be noted that, to clearly describe the technical solutions of the embodiments of the present invention, words such as "first" and "second" are used in the embodiments of the present invention to distinguish between identical or similar items having substantially the same function or effect. A person skilled in the art will understand that words such as "first" and "second" do not limit a quantity or an execution order. For example, "first wearable device" and "second wearable device" are used to distinguish different wearable devices, not to describe a specific order of the wearable devices.
The virtual picture synchronization method provided in the embodiments of the present invention may be executed by the first wearable device, or by a functional module and/or a functional entity in the first wearable device capable of implementing the method, which may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
The wearable device in the embodiments of the present invention may be AR glasses, an AR helmet, a smart band, a smart watch, or the like. It should be noted that the first wearable device and the second wearable device in the embodiments of the present invention may be the same kind of wearable device (for example, both are AR glasses) or different kinds of devices (for example, the first wearable device is AR glasses and the second wearable device is a mobile phone), which is not limited in the embodiments of the present invention.
The virtual screen in the embodiments of the present invention may be any carrier that can be used to display content projected by a projection device when content is displayed using AR technology. The projection device may be a projection device using AR technology, such as the electronic device, the wearable device, or an AR device in the embodiments of the present invention.
When content is displayed on a virtual screen using AR technology, the projection device may project a virtual scene that it has acquired (or internally integrated), or a virtual scene together with a real scene, onto the virtual screen, so that the virtual screen can display the content and thereby present to the user the effect of the real scene overlaid with the virtual scene.
In different application scenarios of AR technology, the virtual screen may generally be any possible carrier, such as the display screen of an electronic device (for example, a mobile phone), the lenses of AR glasses, the windshield of a car, or the wall of a room.
The following describes, by way of example, the process of displaying content on a virtual screen using AR technology, taking the virtual screen being the display screen of an electronic device, the lenses of AR glasses, and the windshield of a car as examples.
In one example, when the virtual screen is the display screen of an electronic device, the projection device may be the electronic device itself. The electronic device may capture, through its camera, the real scene in the area where the electronic device is located and display the real scene on its display screen; the electronic device may then project the virtual scene it has acquired (or internally integrated) onto the display screen, so that the virtual scene is overlaid on the real scene and the user can see the combined effect of the real scene and the virtual scene through the display screen of the electronic device.
In another example, when the virtual screen is the lenses of AR glasses, the projection device may be the AR glasses. When the user wears the glasses, the user can see, through the lenses, the real scene in the area where the user is located, and the AR glasses can project the virtual scene they have acquired (or internally integrated) onto the lenses, so that the user can see the display effect of the real scene overlaid with the virtual scene through the lenses of the AR glasses.
In yet another example, when the virtual screen is the windshield of a car, the projection device may be any electronic device. When the user is inside the car, the user can see the real scene in the surrounding area through the windshield, and the projection device can project the virtual scene it has acquired (or internally integrated) onto the windshield, so that the user can see the display effect of the real scene overlaid with the virtual scene through the windshield.
Of course, the embodiments of the present invention do not limit the specific form of the virtual screen; for example, it may be a real space without a carrier. In this case, when the user is in the real space, the user can directly see the real scene in the real space, and the projection device can project the virtual scene it has acquired (or internally integrated) into the real space, so that the user can see the display effect of the real scene overlaid with the virtual scene in the real space.
The wearable device in the embodiments of the present invention may be a wearable device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
The following uses an operating system as an example to introduce the software environment to which the virtual picture synchronization method provided in the embodiments of the present invention is applied.
FIG. 1 is a schematic architectural diagram of a possible operating system according to an embodiment of the present invention. In FIG. 1, the architecture of the operating system includes four layers: an application layer, an application framework layer, a system runtime library layer, and a kernel layer (which may specifically be a Linux kernel layer).
The application layer includes the applications in the operating system (including system applications and third-party applications).
The application framework layer is the framework of the applications. Developers can develop applications based on the application framework layer while complying with its development principles.
The system runtime library layer includes libraries (also called system libraries) and the operating system runtime environment. The libraries mainly provide the various resources required by the operating system. The operating system runtime environment provides the software environment for the operating system.
The kernel layer is the operating system layer of the operating system and is the lowest layer of the operating system software. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the operating system.
Taking an operating system as an example, in the embodiments of the present invention, developers can develop, based on the system architecture of the operating system shown in FIG. 1, a software program that implements the virtual picture synchronization method provided in the embodiments of the present invention, so that the method can run on the operating system shown in FIG. 1. That is, a processor or a wearable device can implement the virtual picture synchronization method provided in the embodiments of the present invention by running the software program in the operating system.
The virtual picture synchronization method according to the embodiments of the present invention is described below with reference to the flowchart shown in FIG. 2. FIG. 2 is a schematic flowchart of a virtual picture synchronization method according to an embodiment of the present invention. As shown in FIG. 2, the method includes the following steps 201 and 202.
Step 201: A first wearable device receives a first input from a user.
Exemplarily, the first input may include: a specific gesture input by the user, a voice input by the user to the first wearable device, an input by the user on a specific button of the first wearable device, or a specific posture of the user. In one example, the specific gesture may be a gesture input by the user within the shooting area of the camera of the first wearable device.
Exemplarily, the specific gesture may be any one of a slide gesture, a palm-hover gesture, a tap gesture, a long-press gesture, an area-change gesture, a single-finger-hover gesture, and a multi-finger-hover gesture, which is not limited in the embodiments of the present invention.
Step 202: In response to the first input, the first wearable device synchronizes a virtual picture of a first virtual screen of the first wearable device to a second virtual screen of a second wearable device for display.
The second wearable device is determined based on a target area in a shooting preview picture of a camera of the first wearable device, and the target area is an area selected by the first input.
Exemplarily, the target area selected by the user's first input may be a target area determined by the user through a button on the first wearable device, or by the user controlling the first wearable device through voice input.
Exemplarily, the shooting preview picture of the first wearable device includes a real picture captured by the camera of the first wearable device and a virtual picture generated based on the real picture.
Exemplarily, the virtual picture is virtual information generated by the first wearable device according to a target object captured by the camera provided on the first wearable device. For example, when the picture captured by the camera of the first wearable device contains a table, the virtual picture may include length, width, and height labels of the table displayed on the first virtual screen in the form of markers.
In one example, after the first wearable device synchronizes the virtual picture of its first virtual screen to the second virtual screen of the second wearable device for display, when the target object is displayed on the second virtual screen of the second wearable device, the second wearable device marks the received virtual information on the target object.
Example 1: When the second wearable device has received the length, width, and height labels of the table synchronized by the first wearable device, and the display picture of the second virtual screen of the second wearable device contains the table, the second wearable device displays the length, width, and height labels of the table on the second virtual screen.
Example 2: Taking both the first wearable device and the second wearable device being AR glasses as an example, a user plays a game using AR glasses 1 (for example, making a cake on a table). After the game ends, the user sends the above virtual picture of AR glasses 1 (for example, the finished cake) to AR glasses 2 of another user. When the table appears in the shooting preview picture of AR glasses 2, the virtual picture synchronized by AR glasses 1 is displayed in the shooting preview picture of AR glasses 2 (for example, the cake made by the user of AR glasses 1 is displayed on the table).
In another example, after the first wearable device synchronizes the virtual picture of its first virtual screen to the second virtual screen of the second wearable device for display, the second wearable device may directly display the virtual picture on the second virtual screen.
Example 3: Taking both the first wearable device and the second wearable device being AR glasses as an example, a user plays a game using AR glasses 1 (for example, building a house with virtual blocks). After the game ends, the user sends the above virtual picture of AR glasses 1 (for example, the house the user has built) to AR glasses 2 of another user, and AR glasses 2 can directly display the house built by the user of AR glasses 1.
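The anchoring behavior described in Example 1 above can be sketched in code. This is a minimal illustrative sketch only, not part of the disclosed embodiments: the class name `SecondWearableDevice`, the dictionary-based picture format, and the method names are all assumptions introduced for illustration.

```python
from typing import Dict, List, Optional

class SecondWearableDevice:
    """Receiver side of the synchronization in step 202 (illustrative)."""

    def __init__(self) -> None:
        # Synced virtual information, keyed by the object it is bound to.
        self.received: Dict[str, str] = {}

    def on_sync(self, virtual_picture: Dict[str, str]) -> None:
        # Store the annotation under the target object it describes
        # (e.g. the length/width/height labels of the table in Example 1).
        self.received[virtual_picture["object"]] = virtual_picture["annotation"]

    def render_preview(self, recognized_objects: List[str]) -> Dict[str, Optional[str]]:
        # Whenever a recognized object appears in this device's preview,
        # overlay the synced annotation on it; otherwise show nothing.
        return {obj: self.received.get(obj) for obj in recognized_objects}

peer = SecondWearableDevice()
peer.on_sync({"object": "table", "annotation": "120cm x 60cm x 75cm"})
print(peer.render_preview(["table", "chair"]))
# {'table': '120cm x 60cm x 75cm', 'chair': None}
```

Under this sketch, the virtual information stays bound to the object it was generated from, so it reappears only when that object enters the second device's preview, matching Examples 1 and 2.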
Exemplarily, in response to the user's first input, the first wearable device determines the target area in the display area of its first virtual screen and determines the wearable device within the target area as the second wearable device.
Exemplarily, when the first wearable device has determined the second wearable device within the target area, in order to synchronize the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display, the first wearable device needs to establish a connection with the second wearable device.
Optionally, the first wearable device may establish a communication connection with the second wearable device before synchronizing the virtual picture. Exemplarily, before step 202, the virtual picture synchronization method provided in this embodiment of the present invention may further include the following steps 301 and 302.
Step 301: The first wearable device sends an infrared signal to the second wearable device through an infrared emitting apparatus, where the infrared signal includes a device feature code of the first wearable device, so that the second wearable device can establish a connection with the first wearable device through the device feature code.
Exemplarily, the first wearable device may include an infrared transmitting apparatus, and the second wearable device may further include an infrared receiving apparatus.
Step 302: After receiving the infrared signal sent by the first wearable device, the infrared receiving apparatus of the second wearable device establishes a connection with the first wearable device through the device feature code.
Exemplarily, after the connection between the first wearable device and the second wearable device is established, the first wearable device synchronizes the virtual picture of the first virtual screen to the second virtual screen of the second wearable device for display.
Exemplarily, to prevent a wearable device not selected by the user from receiving the infrared signal, the infrared signal sent by the first wearable device to the second wearable device is sent in a directional manner.
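The exchange in steps 301 and 302 can be sketched as follows. This is a hedged illustration only: the disclosure states merely that the infrared signal carries the first wearable device's device feature code, so the payload layout used here (a magic byte, a length byte, the feature code, and a checksum) is an assumption introduced for the sketch.

```python
MAGIC = 0xA5  # assumed start-of-frame marker, not from the disclosure

def encode_ir_payload(feature_code: bytes) -> bytes:
    """First device (step 301): wrap its device feature code for IR transmission."""
    checksum = sum(feature_code) & 0xFF
    return bytes([MAGIC, len(feature_code)]) + feature_code + bytes([checksum])

def decode_ir_payload(payload: bytes):
    """Second device (step 302): validate a received frame and return the
    feature code used to establish the connection, or None if invalid."""
    if len(payload) < 3 or payload[0] != MAGIC:
        return None
    length = payload[1]
    if len(payload) < 3 + length:
        return None
    feature_code = payload[2:2 + length]
    if (sum(feature_code) & 0xFF) != payload[2 + length]:
        return None
    return feature_code

frame = encode_ir_payload(b"WD1-FEATURE")
assert decode_ir_payload(frame) == b"WD1-FEATURE"
# A corrupted frame is rejected rather than triggering a connection,
# which complements the directional transmission described above.
assert decode_ir_payload(frame[:-1] + b"\x00") is None
```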
For example, when the user wants to enable the virtual picture synchronization function provided in this embodiment of the present invention: FIG. 3(A) shows the shooting preview picture of the first wearable device, which includes a virtual control 30 and other users using wearable devices. The user can enable the virtual picture synchronization method provided in this embodiment of the present invention by holding a finger of one hand over the virtual control 30 for 2 seconds.
Exemplarily, the first wearable device provided in this embodiment of the present invention is provided with an image capture apparatus (for example, a camera). Taking the first wearable device being AR glasses as an example, the image capture apparatus can capture real-time picture images in front of the user wearing the first wearable device, so that the first wearable device can recognize, from the captured real-time picture images, the user's specific gesture input within the shooting area of the camera of the first wearable device.
The virtual picture synchronization method provided in the embodiments of the present invention can be applied in a variety of scenarios, allowing the user to select, in different ways, the second wearable device with which the virtual picture is to be synchronized.
In a first possible scenario:
In this scenario, when the user needs to synchronize the virtual picture with multiple second wearable devices at the same time, the user can achieve this by creating a selection box, using it to frame multiple second wearable devices within a certain range, and synchronizing the virtual picture with the second wearable devices contained within the selection box.
示例性的,上述步骤202还可以包括以下步骤202a和步骤202b:Exemplarily, the foregoing step 202 may further include the following steps 202a and 202b:
步骤202a、响应第一输入,第一穿戴式设备在拍摄预览画面上,创建选择框。Step 202a: In response to the first input, the first wearable device creates a selection box on the shooting preview screen.
步骤202b、第一穿戴式设备将选择框所框选的穿戴式设备确定为第二穿戴 式设备,并将虚拟画面同步至第二穿戴式设备的第二虚拟屏幕显示。Step 202b: The first wearable device determines the wearable device selected by the selection box as the second wearable device, and synchronizes the virtual screen to the second virtual screen of the second wearable device for display.
示例性的,上述的选择框的形状可以为圆形、矩形、三角形、菱形、圆环或者多边形等任意可能的形状,具体可以根据实际使用需求确定,本发明实施例不作限定。Exemplarily, the shape of the foregoing selection frame may be any possible shape such as a circle, a rectangle, a triangle, a diamond, a ring, or a polygon, which may be specifically determined according to actual usage requirements, and is not limited in the embodiment of the present invention.
举例说明,以矩形选择框为例。当第一穿戴式设备和第二穿戴式设备均为AR眼镜的情况下,如图3所示,第一穿戴式设备的第一虚拟屏幕中显示有虚拟画面同步功能虚拟控件(如图3中的30),当用户想要开启虚拟画面同步功能,则用户手指移动至同步功能虚拟控件30所在的位置,并停留2秒,从而开启虚拟画面同步功能,在画面同步功能开启后,如图3所示,第一穿戴式设备会在显示画面上创建一个矩形选择框(如图3中的31)。其中,选择框31用于选择第一穿戴式设备的拍摄预览画面中的穿戴式设备。For example, take a rectangular selection box as an example. When the first wearable device and the second wearable device are both AR glasses, as shown in Figure 3, the first virtual screen of the first wearable device displays a virtual screen synchronization function virtual control (as shown in Figure 3). 30), when the user wants to turn on the virtual screen synchronization function, the user's finger moves to the position where the synchronization function virtual control 30 is located and stays for 2 seconds to turn on the virtual screen synchronization function. After the screen synchronization function is turned on, as shown in Figure 3 As shown, the first wearable device will create a rectangular selection box (31 in Figure 3) on the display screen. Wherein, the selection box 31 is used to select the wearable device in the shooting preview screen of the first wearable device.
如此,第一穿戴式设备只需要识别选择框内的穿戴式设备,减少了第一穿戴式设备的数据处理负荷,同时也可以减少第一穿戴式设备的能耗。In this way, the first wearable device only needs to identify the wearable device in the selection box, which reduces the data processing load of the first wearable device and at the same time reduces the energy consumption of the first wearable device.
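The framing in steps 202a and 202b amounts to a hit test of the selection box against the devices visible in the preview picture. The sketch below is illustrative only: device detection itself is outside its scope, so detected device positions are assumed to be given, and the function name and data shapes are assumptions.

```python
def select_second_devices(selection_box, detected_devices):
    """Hit-test detected wearable devices against the selection box.

    selection_box: (x1, y1, x2, y2) corners in preview coordinates.
    detected_devices: mapping of device id -> (x, y) position in the preview.
    Returns the ids framed by the box, i.e. the second wearable devices.
    """
    x1, y1, x2, y2 = selection_box
    left, right = sorted((x1, x2))
    top, bottom = sorted((y1, y2))
    return [dev_id for dev_id, (x, y) in detected_devices.items()
            if left <= x <= right and top <= y <= bottom]

devices = {"glasses-2": (150, 120), "glasses-3": (500, 90)}
print(select_second_devices((100, 100, 300, 200), devices))
# ['glasses-2']
```

Because only positions inside the box are tested, devices outside the box are never examined further, which reflects the reduced processing load noted above.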
进一步可选的,在第一穿戴式设备创建选择框后,第一穿戴式设备可以用户的特定输入,根据实际需求扩大或缩小选择框的范围。Further optionally, after the first wearable device creates the select box, the first wearable device can expand or reduce the range of the select box according to actual needs according to the specific input of the user.
示例性的,在上述第一输入包括:用户第一手部和第二手部的第一子输入的情况下,上述步骤202a可以包括以下步骤202a1和步骤202a2:Exemplarily, in the case where the foregoing first input includes: the first sub-input of the user's first hand and second hand, the foregoing step 202a may include the following steps 202a1 and 202a2:
步骤202a1、在第一穿戴式设备的拍摄预览画面中包含第一子输入的情况下,基于目标矩形对角线,第一穿戴式设备在拍摄预览画面上,显示选择框。Step 202a1. In the case where the first sub-input is included in the shooting preview screen of the first wearable device, based on the diagonal of the target rectangle, the first wearable device displays a selection box on the shooting preview screen.
步骤202a2、在接收到用户的第二子输入的情况下,第一穿戴式设备更新选择框的显示。Step 202a2, upon receiving the second sub-input of the user, the first wearable device updates the display of the selection box.
其中,上述第一子输入用于触发第一穿戴式设备创建选择框,上述第二子输入用于触发第一穿戴式设备调节选择框的大小。上述目标矩形对角线为第一手部的第一部位和第二手部的第二部位之间的连线。Wherein, the first sub-input is used to trigger the first wearable device to create a selection box, and the second sub-input is used to trigger the first wearable device to adjust the size of the selection box. The diagonal line of the target rectangle is the line between the first part of the first hand and the second part of the second hand.
示例性的,上述第一手部和第二手部的第一子输入可以为用户伸出双手手指的手势输入,当第一穿戴式设备识别出该手势输入后,在其拍摄换面上显示用于选择第二穿戴式设备的选择框。Exemplarily, the first sub-input of the first hand and the second hand may be a gesture input of the user extending the fingers of both hands. When the first wearable device recognizes the gesture input, it displays on its shooting surface A selection box for selecting the second wearable device.
示例性的,上述第二子输入可以为用户双手伸出一根手指并沿对角线(即 上述选择框的对角线)向两边展开或向中间聚集的手势输入。当第一穿戴式设备识别出拍摄预览画面内双手伸出一根手指的手势时,即开始创建框选区域,根据手指经过的对角线创建矩形的选择框,并通过移动双手手指来调节选择框的范围大小。Exemplarily, the above-mentioned second sub-input may be a gesture input in which the user extends a finger in both hands and expands to both sides along the diagonal line (that is, the diagonal line of the above-mentioned selection box) or gathers in the middle. When the first wearable device recognizes the gesture of extending a finger with both hands in the shooting preview screen, it starts to create a frame selection area, creates a rectangular selection frame based on the diagonal line of the finger, and adjusts the selection by moving the fingers of both hands The size of the frame.
In one example, as shown in Figure 3(B), after the first wearable device recognizes the gesture of the user extending one finger on each hand, it creates a rectangular selection box 31, using the line connecting the two extended fingertips as the rectangle's diagonal. The user then resizes rectangular selection box 31 by moving the fingers; throughout, the two corner points of rectangular selection box 31 track the movement of the fingertips.

In another example, as shown in Figure 4(A), after the screen synchronization function is enabled and the first wearable device recognizes the gesture of the user extending one finger on each hand, it creates a rectangular selection box of a preset size at a target position on the display. The user can move the two fingertips to the two corner points of rectangular selection box 41 and adjust its size and position by moving the fingers. For example, the preset-size rectangular selection box 41 created by the first wearable device can be adjusted in this way into selection box 42 shown in Figure 4(B).
In this way, the user can display a selection box on the shooting preview screen of the first wearable device through gesture input and resize it with gestures, so that the wearable devices intended for screen synchronization can be brought within the selection box.
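The patent does not give an implementation for deriving the box from the fingertip diagonal. The following is a minimal sketch of that step, assuming the gesture recognizer reports the two fingertip positions in preview-screen pixel coordinates; the names `SelectionBox` and `box_from_fingertips` are illustrative and not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class SelectionBox:
    """Axis-aligned selection box defined by two diagonal corner points."""
    x0: float
    y0: float
    x1: float
    y1: float

def box_from_fingertips(tip_a, tip_b):
    """Create a rectangular selection box whose diagonal is the line
    between the two detected fingertips (preview-screen coordinates)."""
    (xa, ya), (xb, yb) = tip_a, tip_b
    return SelectionBox(min(xa, xb), min(ya, yb), max(xa, xb), max(ya, yb))

# Re-deriving the box each frame makes its corners track the fingertips,
# so the box grows or shrinks as the fingers move apart or together.
box = box_from_fingertips((120, 80), (40, 200))
```

Recomputing per frame, rather than mutating corners, keeps the box consistent no matter which hand moves.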
Further optionally, after the user has adjusted the selection box to a suitable size, the size of the selection box needs to be locked, so that the box can no longer be resized once the user moves the fingers out of the shooting preview screen.

For example, after the user finishes adjusting the selection box, that is, after step 202a2, the virtual screen synchronization method provided by this embodiment of the present invention may further include the following step 202a3:
Step 202a3: Upon receiving a third sub-input from the user, the first wearable device locks the display of the selection box.

For example, the third sub-input may be a gesture in which the user spreads both palms open. When the first wearable device detects this open-palms gesture in the shooting preview screen, it locks the size of the selection box.

For example, locking the size of the selection box includes fixing both the size and the position of the selection box.

For example, when the first wearable device detects the gesture of the user spreading both palms, it completes the creation of the selection box and determines the wearable devices contained in it as the second wearable devices. For instance, taking the case where both the first wearable device and the second wearable device are AR glasses, as shown in Figure 5, when the AR glasses recognize that the user's gesture in the display changes to the open-palms gesture, the creation of selection box 31 is completed; the device then identifies whether users wearing AR glasses are within the selection box, searches for the AR glasses inside the box, and synchronizes its display to them.

In this way, once the user locks the size of the selection box with the open-palms gesture, the user can safely lower both hands without affecting the selection box.
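Steps 202a1 through 202a3 amount to a small gesture-driven state machine: the finger gesture creates and resizes the box, and the open-palms gesture freezes it. A hedged sketch follows; the gesture labels (`"one_finger_each_hand"`, `"both_palms_open"`) are illustrative strings, not identifiers from the patent.

```python
class SelectionController:
    """Tracks the selection box through the create/resize/lock gestures."""

    def __init__(self):
        self.box = None      # (x0, y0, x1, y1) or None before creation
        self.locked = False

    def on_gesture(self, gesture, tip_a=None, tip_b=None):
        if self.locked:
            return  # third sub-input already received: ignore further resizes
        if gesture == "one_finger_each_hand":   # first/second sub-input
            (xa, ya), (xb, yb) = tip_a, tip_b
            self.box = (min(xa, xb), min(ya, yb), max(xa, xb), max(ya, yb))
        elif gesture == "both_palms_open":      # third sub-input
            self.locked = True
```

Once `locked` is set, lowering the hands (or any further finger movement) leaves the box untouched, matching the behavior described above.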
Optionally, after the selection box in the shooting preview screen of the first wearable device is locked and the first wearable device has determined the second wearable device within the selection box, the position of the second wearable device needs to be marked within the selection box.

For example, after the wearable device framed by the selection box is determined to be the second wearable device in step 202b, the virtual screen synchronization method provided by this embodiment of the present invention further includes the following step 202b1:

Step 202b1: The first wearable device displays a first identifier in a first region of the shooting preview screen.

Here, the first identifier is used to mark the second wearable device, and the first region is the region where the second wearable device is located.

For example, the first region is the region occupied by the second wearable device inside the selection box of the first wearable device's shooting preview screen. To indicate to the user the position of the second wearable device in the shooting preview screen, the first wearable device displays the first identifier within that region.

For example, the first identifier may be a red dot; displaying a red dot in the first region marks the second wearable device in the shooting preview screen of the first wearable device.

For instance, taking the case where both the first wearable device and the second wearable device are AR glasses, referring to Figure 5: after the user completes the creation of selection box 31, the first wearable device identifies the users wearing AR glasses inside selection box 31 and, at the same time, searches for AR glasses within the box, displaying a dot identifier (that is, the first identifier, 32 in Figure 5) at the position of each pair of AR glasses, thereby marking their positions.

In this way, the user can clearly see the positions of the second wearable devices within the selection box of the first wearable device's shooting preview screen, which makes a secondary screening convenient for the user.
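Deciding which detected devices receive the first identifier reduces to a point-in-rectangle test against the locked box. A minimal sketch, assuming each candidate device has already been localized to a point in preview coordinates (the function name is illustrative):

```python
def devices_in_box(box, device_positions):
    """Return the devices whose detected position lies inside the locked
    selection box; each of these is marked with the 'first identifier'.

    box              -- (x0, y0, x1, y1) with x0 <= x1 and y0 <= y1
    device_positions -- mapping of device id -> (x, y) in preview coords
    """
    x0, y0, x1, y1 = box
    return [dev for dev, (x, y) in device_positions.items()
            if x0 <= x <= x1 and y0 <= y <= y1]
```

Devices outside the box are simply never marked, so they are excluded from synchronization without any extra bookkeeping.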
Optionally, because the selection box is rectangular and its coverage may be large, it may well include wearable devices with which the user does not want to synchronize the virtual screen. In that case, the user can filter the second wearable devices inside the selection box.

For example, after step 202b1, the virtual screen synchronization method provided by this embodiment of the present invention may include the following step 202b2:

Step 202b2: When the second wearable device is occluded by a target object, the first wearable device cancels the display of the first identifier.

For example, the target object may be the user's palm, or another opaque object.

For example, occluding the second wearable device with the target object ensures that the occluded wearable device does not appear in the selection box of the first wearable device's shooting preview screen; as a result, the first wearable device will not determine it as a second wearable device.

For example, the target object also blocks the infrared signal sent directionally from the first wearable device to the second wearable device.

For example, taking the case where both the first wearable device and the second wearable device are AR glasses: after the user completes the creation of the selection box, if the box contains AR glasses with which the user does not want to synchronize the virtual screen, then, as shown in Figure 6, the user covers those AR glasses with a one-hand open-palm gesture for 2 seconds. The object covered by the gesture is deselected, and the first identifier at the position of those AR glasses disappears, so that the display of the first wearable device will not be synchronized to the display region of the second wearable device covered by the gesture.

In this way, when the selection box of the first wearable device's shooting preview screen contains a second wearable device that the user does not want to synchronize with, occluding that device prevents virtual screen synchronization with it.
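The 2-second occlusion rule can be sketched as a per-device timer: a device is deselected only after it has been continuously covered for the hold duration, and the timer resets if the occlusion ends early. The function and argument names below are illustrative, not from the patent.

```python
def update_marks(marked, occluded_since, device, occluded_now, now, hold=2.0):
    """Deselect a marked device once it has been covered (e.g. by a
    one-hand open-palm gesture) for `hold` continuous seconds.

    marked         -- set of currently marked device ids (mutated in place)
    occluded_since -- dict mapping device id -> time occlusion began
    occluded_now   -- whether this device is covered in the current frame
    now            -- current timestamp in seconds
    """
    if occluded_now:
        start = occluded_since.setdefault(device, now)
        if now - start >= hold:
            marked.discard(device)       # identifier disappears; no sync
    else:
        occluded_since.pop(device, None)  # occlusion ended: reset the timer
    return marked
```

Calling this once per preview frame gives the described behavior: a brief pass of the hand does nothing, while a sustained cover removes the device from the synchronization set.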
In a second possible scenario:

In this scenario, when it is inconvenient for the user to create a selection box with gestures, the user can turn the head to rotate the first wearable device and aim a second identifier, displayed on the first virtual screen of the first wearable device, at the second wearable device, thereby synchronizing the virtual screen of the first wearable device to the second wearable device.

Optionally, before step 202, the virtual screen synchronization method provided by this embodiment of the present invention may include the following step 202c1:

Step 202c1: The first wearable device displays a second identifier at a target position of the target region.

For example, in combination with the above step 202c1, after step 202c1, step 202 may further include the following step 202c2:

Step 202c2: The first wearable device determines the wearable device in a second region containing the second identifier as the second wearable device, and synchronizes the virtual screen to the second virtual screen of the second wearable device for display.

For example, after the first wearable device determines the target region, it displays the second identifier at the target position of that region. The target region may be the entire display area of the first wearable device's shooting preview screen, and the target position may be the center of the target region.

For example, the second identifier may be a crosshair guide, an image, or any other identifier used to mark the second wearable device; this is not limited in the embodiments of the present invention.

For example, when the second identifier falls within the second region, the first wearable device determines the wearable device located in the second region as the second wearable device.

For example, the second region is the region where the second wearable device is located.

In this way, when the user wants to synchronize the virtual screen with a particular wearable device, the user only needs to aim the second identifier displayed on the first wearable device's shooting preview screen at that device (that is, make the two fully or partially overlap); the device is then determined as the second wearable device, and virtual screen synchronization with it proceeds.
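The aim test in step 202c2 reduces to checking whether the second identifier's position falls inside a device's region. A minimal sketch, assuming detected device regions are axis-aligned rectangles in preview coordinates (the function name is illustrative):

```python
def aimed_device(identifier_pos, device_regions):
    """Return the device whose region contains the second identifier
    (e.g. the crosshair center), or None if no region contains it.

    identifier_pos -- (x, y) position of the second identifier
    device_regions -- mapping of device id -> (x0, y0, x1, y1)
    """
    cx, cy = identifier_pos
    for device, (x0, y0, x1, y1) in device_regions.items():
        if x0 <= cx <= x1 and y0 <= cy <= y1:
            return device
    return None
```

In practice the check would run continuously, and (as described below for Figure 7) synchronization would only trigger once the overlap persists for a dwell time.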
Optionally, the user can control the rotation of the first wearable device by turning the head, thereby aiming the second identifier displayed on its shooting preview screen at the wearable device with which the user wants to synchronize the virtual screen.

For example, after the above step 202c1, the virtual screen synchronization method provided by this embodiment of the present invention may include the following steps 202c3 and 202c4:

Step 202c3: The first wearable device obtains its rotation direction and rotation angle.

Step 202c4: The first wearable device updates the display position of the second identifier based on the rotation direction and rotation angle.

For example, when the first wearable device is a head-mounted device (e.g., AR glasses or an AR helmet), the user can control its rotation by turning the head or body.

For example, when the first wearable device's shooting preview screen contains the wearable device with which the user wants to synchronize the virtual screen, but the second identifier is not aimed at the region of that device, the user of the first wearable device can align the second identifier with that device by turning the head.

As an illustration, taking the case where both the first wearable device and the second wearable device are AR glasses, as shown in Figure 7, the first virtual screen of the first wearable device displays a virtual control for the virtual screen synchronization function (60 in Figure 7). When the user wants to enable the function, the user moves a finger to the position of virtual control 60 and holds it there for 2 seconds, which enables screen synchronization. Once the function is enabled, as shown in Figure 7, the first wearable device creates a crosshair guide 61 on the shooting preview screen, and the user can turn the head or move the body to aim crosshair guide 61 at the wearable device with which the user wants to synchronize the virtual screen.

For example, when the second identifier overlaps the second wearable device continuously for 1 second, the virtual screen of the first wearable device's shooting preview is synchronized to the second virtual screen of the second wearable device whose position overlaps the second identifier. After synchronization completes, a "√" is displayed on the first wearable device's shooting preview screen to indicate that transmission is complete; the user can then continue turning the head to select the next wearable device that needs virtual screen synchronization.

In this way, the user can determine the second wearable device by turning the head or moving the body, without raising the hands, avoiding the situation in which a user holding objects in both hands must put them down in order to control the first wearable device through gestures.
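Steps 202c3 and 202c4 leave the mapping from head rotation to identifier position unspecified. One plausible sketch maps the yaw/pitch change reported by the device's motion sensors to a pixel offset on the preview; `px_per_degree` is an assumed calibration constant tied to the camera's field of view, and all names here are illustrative.

```python
def update_identifier_position(pos, yaw_deg, pitch_deg, px_per_degree=12.0):
    """Shift the second identifier on the preview screen by an offset
    proportional to the head rotation since the last update.

    pos       -- current (x, y) of the identifier in preview pixels
    yaw_deg   -- rotation to the right since last update, in degrees
    pitch_deg -- rotation upward since last update, in degrees
    """
    x, y = pos
    # Turning right moves the identifier right; pitching up moves it up
    # (screen y grows downward, hence the minus sign).
    return (x + yaw_deg * px_per_degree, y - pitch_deg * px_per_degree)
```

Whether the identifier moves with the rotation or stays centered while the scene moves behind it is an implementation choice; the patent text only requires that its display position be updated from the rotation direction and angle.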
For example, after the user determines the second wearable device by the above method, the user can synchronize the virtual screen of the first wearable device's first virtual screen to the second virtual screen of the second wearable device for display through voice input, by pressing a physical button on the first wearable device, or through another gesture input.

Optionally, after the user has determined the second wearable device that needs screen synchronization, the user moves a single-finger hover gesture to the position of the above virtual control; after hovering for a preset time, the virtual screen of the first wearable device's first virtual screen is synchronized to the second virtual screen of the second wearable device for display.

For example, the first wearable device may synchronize the virtual screen of the first virtual screen to the second virtual screen of the second wearable device for display via the fourth-generation mobile communication technology (the 4th generation mobile communication technology, 4G), the fifth-generation mobile communication technology (the 5th generation mobile communication technology, 5G), or wireless fidelity (WiFi).
The virtual screen synchronization method provided by this embodiment of the present invention obtains the shooting preview screen of the first wearable device and, when a second wearable device is included in the target region of the shooting preview screen, synchronizes the virtual screen of the first wearable device's first virtual screen to the second virtual screen of the second wearable device for display. This enables convenient and fast screen sharing from the first wearable device, and to some extent avoids the large inter-device latency that arises in conventional AR information sharing because the storage and download processes are time-consuming.

It should be noted that, in the embodiments of the present invention, the virtual screen synchronization methods shown in the above method figures are each described by way of example with reference to one figure of the embodiments of the present invention. In specific implementations, the virtual screen synchronization methods shown in the above method figures may also be implemented in combination with any other combinable figures illustrated in the above embodiments, which will not be repeated here.
Figure 8 is a schematic diagram of a possible structure of a first wearable device provided by an embodiment of the present invention. As shown in Figure 8, the first wearable device 500 includes a receiving module 501 and a synchronization module 502, where:

The receiving module 501 is configured to receive a first input from the user. The synchronization module 502 is configured to, in response to the first input received by the receiving module 501, synchronize the virtual screen of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display. The second wearable device is determined based on a target region in the shooting preview screen of the first wearable device's camera, and the target region is the region selected by the first input.

Optionally, the first wearable device 500 further includes a creation module 503 and a determination module 504, where: the creation module 503 is configured to create a selection box on the shooting preview screen in response to the first input; the determination module 504 is configured to determine the wearable device framed by the selection box created by the creation module 503 as the second wearable device; and the synchronization module 502 is further configured to synchronize the virtual screen to the second virtual screen of the second wearable device determined by the determination module 504 for display.

In this way, the first wearable device only needs to identify the wearable devices inside the selection box, which reduces the data processing load of the first wearable device and also reduces its energy consumption.

Optionally, the first wearable device 500 further includes a display module 505. The first input includes a first sub-input of the user's first hand and second hand; the first sub-input is used to trigger the first wearable device to create the selection box. The display module 505 is configured to display the selection box on the shooting preview screen, based on the target rectangle diagonal, when the shooting preview screen contains the first sub-input. The display module 505 is further configured to update the display of the selection box when the receiving module 501 receives the user's second sub-input. The target rectangle diagonal is the line connecting the first part of the first hand and the second part of the second hand.

In this way, the user can display a selection box on the shooting preview screen of the first wearable device through gesture input and resize it with gestures, so that the wearable devices intended for screen synchronization can be brought within the selection box.
Optionally, the display module 505 is further configured to lock the display of the selection box when the receiving module 501 receives the user's third sub-input.

In this way, once the user locks the size of the selection box with the open-palms gesture, the user can safely lower both hands without affecting the selection box.

Optionally, the display module 505 is further configured to display a first identifier in a first region of the shooting preview screen; the first identifier is used to mark the second wearable device, and the first region is the region where the second wearable device is located.

In this way, the user can clearly see the positions of the second wearable devices within the selection box of the first wearable device's shooting preview screen, which makes a secondary screening convenient for the user.

Optionally, the display module 505 is further configured to cancel the display of the first identifier when the second wearable device is occluded by a target object.

In this way, when the selection box of the first wearable device's shooting preview screen contains a second wearable device that the user does not want to synchronize with, occluding that device prevents virtual screen synchronization with it.
Optionally, the display module 505 is further configured to display a second identifier at the target position of the target region. The determination module 504 is further configured to determine the wearable device in a second region containing the second identifier as the second wearable device. The display module 505 is further configured to synchronize the virtual screen to the second virtual screen of the second wearable device determined by the determination module 504 for display.

In this way, when the user wants to synchronize the virtual screen with a particular wearable device, the user only needs to aim the second identifier displayed on the first wearable device's shooting preview screen at that device (that is, make the two fully or partially overlap); the device is then determined as the second wearable device, and virtual screen synchronization with it proceeds.

Optionally, the first wearable device further includes an acquisition module 506, configured to obtain the rotation direction and rotation angle of the first wearable device. The display module 505 is further configured to update the display position of the second identifier based on the rotation direction and rotation angle acquired by the acquisition module 506.

In this way, the user can determine the second wearable device by turning the head or moving the body, without raising the hands, avoiding the situation in which a user holding objects in both hands must put them down in order to control the first wearable device through gestures.

The wearable device provided by this embodiment of the present invention obtains the shooting preview screen of the first wearable device and, when a second wearable device is included in the target region of the shooting preview screen, synchronizes the virtual screen of the first wearable device's first virtual screen to the second virtual screen of the second wearable device for display. This enables convenient and fast screen sharing from the first wearable device, and to some extent avoids the large inter-device latency that arises in conventional AR information sharing because the storage and download processes are time-consuming.
The electronic device provided by this embodiment of the present invention can implement each process implemented by the wearable device in the above method embodiments; to avoid repetition, details are not described here again.

Figure 9 is a schematic diagram of the hardware structure of an electronic device implementing the embodiments of the present application. The electronic device 100 includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and a camera assembly 112. Those skilled in the art will understand that the structure of the electronic device 100 shown in Figure 9 does not limit the electronic device; the electronic device 100 may include more or fewer components than shown, combine certain components, or arrange the components differently. The camera assembly 112 includes a camera. In the embodiments of the present invention, the electronic device 100 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The processor 110 can recognize the user's first sub-input and, according to the first sub-input, send a first instruction to the display unit 106; in response to the first instruction sent by the processor 110, the display unit 106 displays a selection box on the shooting preview screen of the first wearable device, based on the target rectangle diagonal. Here, the target rectangle diagonal is the line connecting the first part of the first hand and the second part of the second hand. The processor 110 can also recognize the user's second sub-input and, according to the second sub-input, send a second instruction to the display unit 106; in response to the second instruction sent by the processor 110, the display unit 106 updates the display of the selection box on the shooting preview screen of the first wearable device, based on the target rectangle diagonal.

The electronic device provided by this embodiment of the present invention obtains the shooting preview screen of the first wearable device and, when a second wearable device is included in the target region of the shooting preview screen, synchronizes the virtual screen of the first wearable device's first virtual screen to the second virtual screen of the second wearable device for display. This enables convenient and fast screen sharing from the first wearable device, and to some extent avoids the large inter-device latency that arises in conventional AR information sharing because the storage and download processes are time-consuming.
It should be understood that, in this embodiment of the present invention, the radio frequency unit 101 can be used to receive and send signals during information transmission or a call. Specifically, it receives downlink data from a base station and passes it to the processor 110 for processing, and it sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can communicate with a network and other devices through a wireless communication system.
The electronic device 100 provides the user with wireless broadband Internet access through the network module 102, for example helping the user send and receive e-mail, browse web pages, and access streaming media.
The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the electronic device 100 (for example, a call-signal reception sound or a message reception sound). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The electronic device 100 obtains the real-time picture captured by the camera in the camera assembly 112 (for example, the shooting preview screen of the first wearable device) and displays it on the display unit 106.
The input unit 104 is used to receive audio or video signals. The input unit 104 may include a graphics processing unit (GPU) 1041, a microphone 1042, and an image capture device 1043. The graphics processor 1041 processes the image data of still pictures or video obtained by the image capture device in video capture mode or image capture mode. The processed image frames can be displayed on the display unit 106. Image frames processed by the graphics processor 1041 can be stored in the memory 109 (or another storage medium) or sent via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data. In telephone-call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output.
The electronic device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in each direction (generally along three axes) and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the electronic device (for example, landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.
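As a side illustration of how a static three-axis accelerometer sample yields device posture from the direction of gravity, the standard pitch/roll estimate can be written as below. This formula is a well-known technique, offered here as an assumption about one way such posture detection may work; it is not taken from the patent text.

```python
# Illustrative only: estimate device pitch and roll (degrees) from one
# stationary accelerometer sample (ax, ay, az), using the gravity direction.
import math

def pitch_roll(ax, ay, az):
    """Standard static tilt estimate from a three-axis accelerometer reading."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

print(pitch_roll(0, 0, 1))  # (0.0, 0.0) -- device lying flat, face up
```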
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 107 can be used to receive input numeric or character information and to generate key-signal input related to the user settings and function control of the electronic device 100. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 can be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, not described in detail here.
Further, the touch panel 1071 can cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides the corresponding visual output on the display panel 1061 according to the type of the touch event. Although in FIG. 9 the touch panel 1071 and the display panel 1061 are implemented as two independent components to realize the input and output functions of the electronic device 100, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize those input and output functions; this is not specifically limited here.
The interface unit 108 is an interface for connecting an external device to the electronic device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The interface unit 108 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the electronic device 100, or to transfer data between the electronic device 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area. The program storage area can store an operating system and an application program required by at least one function (such as a sound playback function or an image playback function); the data storage area can store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 109 may include high-speed random access memory and may also include non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.
The processor 110 is the control center of the electronic device 100. It uses various interfaces and lines to connect the parts of the entire electronic device 100, and it performs the various functions of the electronic device 100 and processes data by running or executing the software programs and/or modules stored in the memory 109 and invoking the data stored in the memory 109, thereby monitoring the electronic device 100 as a whole. The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (such as a battery) that supplies power to the components. Optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, which are not described in detail here.
Optionally, an embodiment of the present invention further provides an AR device, including a processor, a memory, and a computer program stored in the memory and executable on the processor. When executed by the processor, the computer program implements each process of the above embodiments of the virtual picture synchronization method and can achieve the same technical effect; to avoid repetition, details are not repeated here.
Optionally, in the embodiments of the present invention, the electronic device in the above embodiments may be an AR device. Specifically, when the electronic device in the above embodiments (for example, the electronic device shown in FIG. 3) is an AR device, the AR device may include all or some of the functional modules of the above electronic device. Of course, the AR device may also include functional modules not included in the above electronic device.
It can be understood that, in the embodiments of the present invention, when the electronic device in the above embodiments is an AR device, the electronic device may be an electronic device integrated with AR technology. AR technology refers to a technology that combines a real scene with a virtual scene. With AR technology, what a person sees can be reconstructed so that the person experiences the combination of the real scene and the virtual scene, and thus a more immersive feeling.
Taking AR glasses as an example of the AR device: when the user wears the AR glasses, the scene the user sees is generated through AR processing, that is, AR technology can superimpose a virtual scene on the real scene for display. When the user operates on the content displayed by the AR glasses, the user can see the AR glasses "peel open" the real scene, thereby showing the user a more revealing view. For example, with the naked eye a user observing a carton can only see its outer shell, but when wearing AR glasses the user can directly observe the internal structure of the carton through the glasses.
The above AR device may include a camera, so that the AR device can display and interact with a virtual picture on the basis of the picture captured by the camera. For example, in the embodiments of the present invention, the AR device can synchronize the virtual-picture information generated while the user engages in entertainment activities with the AR device to the display screens of other AR devices, so that virtual pictures can be shared among AR devices.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above embodiments of the virtual picture synchronization method and can achieve the same technical effect; to avoid repetition, details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, in this document, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes that element.
From the description of the above implementations, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of this application, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to cause an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of this application.
The embodiments of this application are described above with reference to the accompanying drawings, but this application is not limited to the specific implementations above, which are merely illustrative rather than restrictive. Inspired by this application, those of ordinary skill in the art can devise many other forms without departing from the purpose of this application and the scope protected by the claims, all of which fall within the protection of this application.

Claims (18)

  1. A virtual picture synchronization method, applied to a first wearable device, the method comprising:
    receiving a first input of a user; and
    in response to the first input, synchronizing a virtual picture of a first virtual screen of the first wearable device to a second virtual screen of a second wearable device for display;
    wherein the second wearable device is determined based on a target area in a shooting preview screen of a camera of the first wearable device, and the target area is an area selected by the first input.
  2. The method according to claim 1, wherein the synchronizing, in response to the first input, a virtual picture of a first virtual screen of the first wearable device to a second virtual screen of a second wearable device for display comprises:
    in response to the first input, creating a selection box on the shooting preview screen; and
    determining the wearable device framed by the selection box as the second wearable device, and synchronizing the virtual picture to the second virtual screen of the second wearable device for display.
  3. The method according to claim 2, wherein the first input comprises a first sub-input of a first hand and a second hand of the user, the first sub-input being used to trigger the first wearable device to create the selection box; and
    the creating a selection box on the shooting preview screen in response to the first input comprises:
    in a case where the shooting preview screen contains the first sub-input, displaying a selection box on the shooting preview screen based on a target rectangle diagonal; and
    in a case where a second sub-input of the user is received, updating the display of the selection box;
    wherein the target rectangle diagonal is the line connecting a first part of the first hand and a second part of the second hand.
  4. The method according to claim 3, wherein after the updating the display of the selection box, the method further comprises: in a case where a third sub-input of the user is received, locking the display of the selection box.
  5. The method according to claim 2, wherein after the determining the wearable device framed by the selection box as the second wearable device, the method further comprises:
    displaying a first identifier in a first area on the shooting preview screen, wherein the first identifier is used to mark the second wearable device, and the first area is the area in which the second wearable device is located.
  6. The method according to claim 5, wherein after the displaying a first identifier on the shooting preview screen, the method further comprises:
    in a case where the second wearable device is blocked by a target object, canceling the display of the first identifier.
  7. The method according to claim 1 or 2, wherein before the synchronizing the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display, the method further comprises:
    displaying a second identifier at a target position of the target area; and
    the synchronizing the virtual picture of the first virtual screen of the first wearable device to the second virtual screen of the second wearable device for display comprises:
    determining the wearable device in a second area containing the second identifier as the second wearable device, and synchronizing the virtual picture to the second virtual screen of the second wearable device for display.
  8. The method according to claim 7, wherein after the displaying a second identifier at the target position of the target area, the method further comprises:
    obtaining a rotation direction and a rotation angle of the first wearable device; and
    updating the display position of the second identifier based on the rotation direction and the rotation angle.
  9. A first wearable device, comprising a receiving module and a synchronization module, wherein:
    the receiving module is configured to receive a first input of a user; and
    the synchronization module is configured to, in response to the first input received by the receiving module, synchronize a virtual picture of a first virtual screen of the first wearable device to a second virtual screen of a second wearable device for display;
    wherein the second wearable device is determined based on a target area in a shooting preview screen of a camera of the first wearable device, and the target area is an area selected by the first input.
  10. The first wearable device according to claim 9, further comprising a creation module and a determination module, wherein:
    the creation module is configured to create a selection box on the shooting preview screen in response to the first input;
    the determination module is configured to determine the wearable device framed by the selection box created by the creation module as the second wearable device; and
    the synchronization module is specifically configured to synchronize the virtual picture to the second virtual screen of the second wearable device determined by the determination module for display.
  11. The first wearable device according to claim 10, further comprising a display module, wherein:
    the first input comprises a first sub-input of a first hand and a second hand of the user, the first sub-input being used to trigger the first wearable device to create the selection box;
    the display module is configured to, in a case where the shooting preview screen contains the first sub-input, display a selection box on the shooting preview screen based on a target rectangle diagonal;
    the display module is further configured to update the display of the selection box in a case where the receiving module receives a second sub-input of the user; and
    the target rectangle diagonal is the line connecting a first part of the first hand and a second part of the second hand.
  12. The first wearable device according to claim 11, wherein
    the display module is further configured to lock the display of the selection box in a case where the receiving module receives a third sub-input of the user.
  13. The first wearable device according to claim 10, further comprising a display module, wherein
    the display module is configured to display a first identifier in a first area on the shooting preview screen, the first identifier being used to mark the second wearable device, and the first area being the area in which the second wearable device is located.
  14. The first wearable device according to claim 13, wherein
    the display module is further configured to cancel the display of the first identifier in a case where the second wearable device is blocked by a target object.
  15. The first wearable device according to claim 14, wherein:
    the display module is further configured to display a second identifier at a target position of the target area;
    the determination module is further configured to determine the wearable device in a second area containing the second identifier as the second wearable device; and
    the display module is further configured to synchronize the virtual picture to the second virtual screen of the second wearable device determined by the determination module for display.
  16. The first wearable device according to claim 15, further comprising an acquisition module, wherein:
    the acquisition module is configured to obtain a rotation direction and a rotation angle of the first wearable device; and
    the display module is further configured to update the display position of the second identifier based on the rotation direction and the rotation angle obtained by the acquisition module.
  17. An electronic device, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the virtual picture synchronization method according to any one of claims 1 to 8.
  18. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the virtual picture synchronization method according to any one of claims 1 to 8.
PCT/CN2020/140836 2019-12-31 2020-12-29 Virtual image synchronization method and wearable device WO2021136266A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911418240.2 2019-12-31
CN201911418240.2A CN111124136A (en) 2019-12-31 2019-12-31 Virtual picture synchronization method and wearable device

Publications (1)

Publication Number Publication Date
WO2021136266A1 true WO2021136266A1 (en) 2021-07-08

Family

ID=70506863

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/140836 WO2021136266A1 (en) 2019-12-31 2020-12-29 Virtual image synchronization method and wearable device

Country Status (2)

Country Link
CN (1) CN111124136A (en)
WO (1) WO2021136266A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111124136A (en) * 2019-12-31 2020-05-08 维沃移动通信有限公司 Virtual picture synchronization method and wearable device
CN111813220A (en) * 2020-06-19 2020-10-23 深圳增强现实技术有限公司 Interactive system based on augmented reality or virtual reality intelligent head-mounted equipment
CN112256121A (en) * 2020-09-10 2021-01-22 苏宁智能终端有限公司 Implementation method and device based on AR (augmented reality) technology input method
CN112631677A (en) * 2020-12-21 2021-04-09 上海影创信息科技有限公司 Resource support prompting method and system
CN113301506B (en) * 2021-05-27 2023-07-25 维沃移动通信有限公司 Information sharing method, device, electronic equipment and medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130194304A1 (en) * 2012-02-01 2013-08-01 Stephen Latta Coordinate-system sharing for augmented reality
CN103460256A (en) * 2011-03-29 2013-12-18 高通股份有限公司 Anchoring virtual images to real world surfaces in augmented reality systems
CN106796344A (en) * 2014-10-07 2017-05-31 艾尔比特系统有限公司 The wear-type of the enlarged drawing being locked on object of interest shows
CN108513165A (en) * 2017-02-28 2018-09-07 三星电子株式会社 The method of shared content and the electronic equipment for supporting this method
CN109074772A (en) * 2016-01-25 2018-12-21 艾维赛特有限公司 Content based on sight shares dynamic self-organization network
CN111124136A (en) * 2019-12-31 2020-05-08 维沃移动通信有限公司 Virtual picture synchronization method and wearable device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3324270A1 (en) * 2016-11-16 2018-05-23 Thomson Licensing Selection of an object in an augmented reality environment
KR20190056523A (en) * 2017-11-17 2019-05-27 삼성에스디에스 주식회사 System and method for synchronizing display of virtual reality content

Also Published As

Publication number Publication date
CN111124136A (en) 2020-05-08

Similar Documents

Publication Publication Date Title
WO2021136266A1 (en) Virtual image synchronization method and wearable device
WO2021098678A1 (en) Screencast control method and electronic device
CN109952757B (en) Method for recording video based on virtual reality application, terminal equipment and storage medium
WO2019228163A1 (en) Speaker control method and mobile terminal
CN109218648B (en) Display control method and terminal equipment
US11954200B2 (en) Control information processing method and apparatus, electronic device, and storage medium
WO2019196929A1 (en) Video data processing method and mobile terminal
CN109032486B (en) Display control method and terminal equipment
WO2020233323A1 (en) Display control method, terminal device, and computer-readable storage medium
WO2019149028A1 (en) Application download method and terminal
CN109151546A (en) Video processing method, terminal, and computer-readable storage medium
WO2021136330A1 (en) Bullet screen display control method and electronic device
WO2021136329A1 (en) Video editing method and head-mounted device
CN110798621A (en) Image processing method and electronic equipment
WO2019184902A1 (en) Method for controlling icon display, and terminal
WO2020063136A1 (en) Application program start-up method and terminal device
CN110866465A (en) Control method of electronic equipment and electronic equipment
WO2021104162A1 (en) Display method and electronic device
WO2020168859A1 (en) Photographing method and terminal device
CN109547696B (en) Shooting method and terminal equipment
CN109814825B (en) Display screen control method and mobile terminal
CN111273885A (en) AR image display method and AR equipment
CN111352505A (en) Operation control method, head-mounted device, and medium
WO2021136265A1 (en) Unlocking method and electronic device
WO2021083085A1 (en) Content display method and electronic device

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 20911074

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry into the European phase

Ref document number: 20911074

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: Public notification in the EP Bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 05.01.2023)
