CN111208903B - Information transmission method, wearable device and medium


Info

Publication number
CN111208903B
Authority
CN
China
Prior art keywords
wearable device
virtual object
input
information
virtual
Prior art date
Legal status
Active
Application number
CN201911415431.3A
Other languages
Chinese (zh)
Other versions
CN111208903A (en)
Inventor
凌深宏
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911415431.3A
Publication of CN111208903A
Application granted
Publication of CN111208903B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F 13/31 Communication aspects specific to video games, e.g. between several handheld game devices at close range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the invention disclose an information transmission method, a wearable device, and a medium. The information transmission method includes: receiving a first input of a first wearable device user to a first virtual object in a virtual screen; and, in response to the first input, sending first information to a second wearable device, where the first information is information associated with the first virtual object. The embodiments of the invention can solve the problem that the information transmission process in the prior art is cumbersome.

Description

Information transmission method, wearable device and medium
Technical Field
The embodiments of the invention relate to the technical field of information transmission, and in particular to an information transmission method, a wearable device, and a medium.
Background
At present, when a user wants to use an electronic device to transmit information to another user's electronic device, the user must first search for that user's social account in a social application on the electronic device, and only then can information be transmitted through the found social account.
Disclosure of Invention
The embodiments of the invention provide an information transmission method, a wearable device, and a medium, which can solve the problem that the information transmission process in the prior art is cumbersome.
To solve this technical problem, the embodiments of the invention are realized as follows:
in a first aspect, an embodiment of the present invention provides an information transmission method, applied to a first wearable device, including:
receiving a first input of a first wearable device user to a first virtual object in a virtual screen;
in response to the first input, sending first information to a second wearable device;
the first information is information associated with the first virtual object.
In a second aspect, an embodiment of the present invention provides a wearable device, including:
a first input receiving module, configured to receive a first input of a first wearable device user to a first virtual object in a virtual screen;
a first information sending module, configured to send first information to a second wearable device in response to the first input;
the first information is information associated with the first virtual object.
In a third aspect, an embodiment of the present invention provides a wearable device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the information transmission method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the information transmission method according to the first aspect.
In the embodiments of the invention, after the first wearable device receives a first input of the first wearable device user to a first virtual object in the virtual screen, it can, in response to the first input, send first information associated with the first virtual object to a second wearable device. Information transmission between the two devices is thus achieved without adding a social account, which improves both the convenience and the efficiency of data transmission.
Drawings
Fig. 1 is a schematic flowchart of an information transmission method according to an embodiment of the present invention;
Fig. 2 is a diagram illustrating a process of moving a first virtual object according to an embodiment of the present invention;
Fig. 3 is an interface diagram of an object selection interface according to an embodiment of the present invention;
Fig. 4 is an interface diagram of an object selection interface according to another embodiment of the present invention;
Fig. 5 is an interface diagram of an object selection interface according to yet another embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a wearable device according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of the hardware structure of a wearable device implementing various embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
At present, when a user wants to use an electronic device to transmit information to another user's electronic device, the user must first search for that user's social account in a social application on the electronic device, and only then can information be transmitted through the found social account.
In order to solve the above technical problem, the invention provides an information transmission scheme in which a user of a first wearable device can transmit information directly to a second wearable device using the first wearable device.
The following describes the screen of the wearable device provided in the embodiments of the present invention. The screen of the wearable device in the embodiments of the invention may be a virtual screen: when content is displayed using Augmented Reality (AR) technology, the virtual screen may be any carrier onto which a projection device can project that content. The projection device may be a projection device using AR technology, such as the wearable device in the embodiments of the invention.
When displaying content on the virtual screen using AR technology, the projection device may project a virtual scene it has acquired (or one integrated within it), or a virtual scene together with a real scene, onto the virtual screen, so that the virtual screen displays the content and presents to the user the effect of the real scene overlaid with the virtual scene.
Depending on the application scenario of the AR technology, the virtual screen may generally be any feasible carrier, such as the display screen of an electronic device (e.g. a mobile phone), a lens of AR glasses, the windshield of a car, or the wall of a room.
The following illustrates the process of displaying content on a virtual screen using AR technology, taking as examples a virtual screen that is the display screen of an electronic device, a lens of AR glasses, or the windshield of an automobile.
In one example, when the virtual screen is the display screen of an electronic device, the projection device may be the electronic device itself. The electronic device can capture the real scene in its area through its camera and display that real scene on its display screen. The electronic device can then project the virtual scene it has acquired (or one integrated within it) onto the display screen, so that the virtual scene is displayed superimposed on the real scene, and the user sees the combined effect of the two through the display screen.
In another example, when the virtual screen is a lens of AR glasses, the projection device may be the AR glasses themselves. When the user wears the glasses, the user sees the real scene of the area through the lenses, and the AR glasses project the acquired (or internally integrated) virtual scene onto the lenses, so that the user sees the real and virtual scenes superimposed through the lenses.
In yet another example, when the virtual screen is the windshield of an automobile, the projection device may be any electronic device. When the user is in the automobile, the user sees the real scene of the area through the windshield, and the projection device projects the acquired (or internally integrated) virtual scene onto the windshield, so that the user sees the real and virtual scenes superimposed through the windshield.
Of course, the embodiments of the invention do not limit the specific form of the virtual screen; for example, it may be real space with no carrier at all. In that case, when the user is in the real space, the user directly sees the real scene, and the projection device projects the acquired (or internally integrated) virtual scene into the real space, so that the user sees the real and virtual scenes superimposed within it.
The virtual object provided by the embodiments of the invention may be an AR object. It should be noted that an AR object may be understood as follows: an AR device analyzes a real object to obtain its feature information, such as the type of the real object, its appearance (structure, color, shape, and so on), and its position in space, and then constructs an AR model within the AR device from this feature information.
Optionally, in the embodiment of the present invention, the virtual object may specifically be a virtual image, a virtual pattern, a virtual character, and the like.
Next, the information transmission process of the first wearable device will be described in detail.
Fig. 1 is a flowchart illustrating an information transmission method according to an embodiment of the present invention. The method shown in Fig. 1 may be performed by the first wearable device described above. The aforementioned first wearable device and second wearable device may be head-mounted AR devices, such as AR glasses or an AR helmet.
As shown in fig. 1, the information transmission method may include:
step 110, receiving a first input of a first wearable device user to a first virtual object in a virtual screen;
step 120, in response to the first input, sending first information to the second wearable device;
the first information is information associated with the first virtual object. In the embodiment of the invention, after the first wearable device receives the first input of the first wearable device user to the first virtual object in the virtual screen, the first information associated with the first virtual object can be sent to the second wearable device in response to the first input, information transmission between the two devices can be realized without adding a social account number, and the convenience of data transmission and the efficiency of data transmission are improved.
The information transmission method provided by the embodiments of the invention can be applied to scenarios in which several users play games. For example, when two users want to play the same game together, the first wearable device user can, through the first wearable device, quickly share a game file or game invitation information with the second wearable device user using the second wearable device, improving the convenience of multi-user gaming.
The method can likewise be applied to scenarios of information sharing among several users. For example, when two users want to share information, the first wearable device user can, through the first wearable device, quickly share the information or files to be shared with the second wearable device user using the second wearable device, improving the convenience of information sharing among multiple users.
In the embodiments of the invention, once the first wearable device user wears the first wearable device, the user can simultaneously view the real environment in front of his or her eyes and the virtual objects in the virtual screen of the first wearable device.
Specifically, the first wearable device may be provided with a front camera whose image capture area corresponds to the visual field of the first wearable device user's eyes.
In the embodiments of the present invention, the received first input may be a movement input on the first virtual object or a selection input on the first virtual object. The two cases are described in detail separately below.
First case
In some embodiments of the present invention, the specific method of step 110 may include:
a first input is received that a first wearable device user moves a first virtual object in a virtual screen to a first target location.
Specifically, if there is at least one first virtual object, the first input is an input that moves the first virtual object selected by the first wearable device user to the first target position.
In some embodiments of the invention, the front camera of the first wearable device may capture gestures made by the first wearable device user within its image capture area. When the front camera captures the user making a grabbing gesture, the first wearable device may identify the object grabbed by the gesture. If the grabbed object is identified as one of the at least one first virtual object and the grabbing gesture then moves to the first target location, the first wearable device may determine that a first input moving the first virtual object to the first target location has been received.
The moving direction of the grabbing gesture may be any direction in space, so the first wearable device user can move the first virtual object in any direction in space.
In the embodiment of the present invention, the virtual object may be a three-dimensional image or an AR object.
Fig. 2 is a schematic diagram illustrating the process of moving a first virtual object according to an embodiment of the present invention. As shown in Fig. 2, the football 201 is a first virtual object that the first wearable device user sees through the first wearable device; the person 202 wearing a wearable device is a real person seen through the first wearable device; and the hand 203 making the grabbing gesture is the user's own real hand seen through the first wearable device. When the first wearable device detects that the hand 203 making the grabbing gesture grabs the football 201, moves it to the person 202 wearing the wearable device, and stops at that person's position, it may determine that a first input moving the football 201 to the position of the person 202 has been received.
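The following sketch illustrates one way the grab-gesture logic described above might be reduced to code, assuming an upstream gesture classifier that labels each camera frame as "grab" or "open" and reports a 3-D hand position; the `Frame` structure and all names are hypothetical.

```python
from dataclasses import dataclass
from math import dist


@dataclass
class Frame:
    gesture: str  # assumed classifier output: "grab" or "open"
    hand_pos: tuple[float, float, float]


def detect_move_input(frames, obj_pos, grab_radius=0.1):
    """Return the release point (the first target location) if a grab
    starts on the first virtual object and ends elsewhere, else None."""
    grabbing = False
    for f in frames:
        if not grabbing:
            # a grab only counts if the hand closes on the object itself
            if f.gesture == "grab" and dist(f.hand_pos, obj_pos) < grab_radius:
                grabbing = True
        elif f.gesture == "grab":
            obj_pos = f.hand_pos  # the object follows the hand in any direction
        else:
            return f.hand_pos     # hand opened: release point reached
    return None


# A hand grabs the object at the origin and releases it one metre away:
frames = [Frame("grab", (0.0, 0.0, 0.05)), Frame("grab", (0.5, 0.0, 0.5)),
          Frame("open", (1.0, 0.0, 1.0))]
print(detect_move_input(frames, (0.0, 0.0, 0.0)))  # (1.0, 0.0, 1.0)
```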
In some embodiments of the present invention, the specific method of step 120 may include:
in response to the first input, searching for a wearable device within a first target spatial range that includes the first target location;
determining a second wearable device according to the searched wearable device;
determining first information based on the first virtual object moved to the first target location;
and sending the first information to the second wearable device.
In these embodiments, the first target spatial range is a region of preset size centered on the first target position. Its shape may be, for example, a rectangular parallelepiped, an octahedron, or a sphere, which is not limited herein.
In some embodiments, the wearable device may be searched for within the first target spatial range using near field communication. Specifically, the first wearable device searches the first target spatial range for a near field communication signal sent by a wearable device; if such a signal is found, it determines that a wearable device has been found.
In other embodiments, an image of the first target spatial range may be captured by the front camera and image recognition performed on it to determine whether a wearable device appears in the image; if so, it is determined that a wearable device has been found.
Optionally, the specific method of searching for the wearable device within the first target space range including the first target location may further include:
and under the condition that the first time length of the first virtual object staying at the first target position reaches a first preset time length, searching the wearable device in a first target space range including the first target position.
Taking the first predetermined time length as 2s as an example, after the first wearable device user moves the first virtual object to the first target position, if the first time length of stay at the first target position reaches the first predetermined time length, it may be determined that the first wearable device user wishes to send the first virtual object to the second wearable device at the first target position. Thus, the search for the wearable device can be triggered when the first duration of the first virtual object staying at the first target position reaches 2 s.
In some embodiments, if the number of the searched wearable devices is 1, the searched wearable device may be determined as the second wearable device. In other embodiments, if the number of the searched wearable devices is 2 or more than 2, the wearable device with the smallest distance to the first target location may be determined as the second wearable device.
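As a rough sketch of the dwell-then-search-then-select logic of this case, assuming a platform-provided search routine (near field communication based or image based) passed in as `search_fn`; the 2 s threshold mirrors the example above, and all names are illustrative.

```python
from math import dist


def pick_second_device(first_target, dwell_seconds, search_fn,
                       dwell_threshold=2.0, radius=1.0):
    """Search only after the object has stayed at the first target location
    for the first predetermined duration; among the devices found, the one
    closest to the target becomes the second wearable device."""
    if dwell_seconds < dwell_threshold:
        return None  # the user has not "dropped" the object yet
    found = search_fn(first_target, radius)  # NFC- or image-based search
    if not found:
        return None
    return min(found, key=lambda d: dist(d[1], first_target))[0]


# Two devices answer the search; the nearer one is chosen:
devices = [("glasses-A", (0.2, 0.0, 0.0)), ("glasses-B", (0.9, 0.0, 0.0))]
print(pick_second_device((0.0, 0.0, 0.0), 2.5, lambda c, r: devices))
# -> glasses-A
```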
In an embodiment of the present invention, the first information may be at least one of the following:
invitation information associated with the first virtual object;
file information of the first virtual object;
the fetch address of the first virtual object.
For example, if the first virtual object is a football game, the first information may be invitation information inviting the second wearable device user using the second wearable device to play the football game together; if that user accepts the invitation, the first and second wearable device users can enter the football game together online.
For another example, if the first virtual object is a game video, the first information may be the video file information of the game video or an acquisition address of the game video, so that the second wearable device user using the second wearable device can obtain the game video through the first information and watch the same game video together with the first wearable device user.
In the embodiment of the invention, the first wearable device can send the first information to the second wearable device in a near field communication mode.
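The mapping from the moved first virtual object to its first information might look like the following sketch, where the object representation and the `nfc_send` callable are assumptions standing in for the device's actual data model and near field communication API.

```python
def determine_first_info(obj: dict) -> dict:
    """Map the first virtual object to its first information: invitation
    information, file information, or an acquisition address."""
    if obj.get("kind") == "game":
        return {"type": "invitation", "game": obj["name"]}
    if "file" in obj:
        return {"type": "file", "data": obj["file"]}
    return {"type": "address", "url": obj["url"]}


def send_first_info(obj: dict, nfc_send) -> None:
    # nfc_send stands in for the platform's near field communication API.
    nfc_send(determine_first_info(obj))


# A football game object yields invitation information:
send_first_info({"kind": "game", "name": "football game"}, print)
# A game video object might instead yield its acquisition address:
send_first_info({"kind": "video", "name": "game video",
                 "url": "rtsp://example/video"}, print)
```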
In some embodiments of the present invention, before step 110, the information transmission method may further include:
receiving a third input of the first wearable device user to a second virtual object in the virtual screen;
displaying at least one first virtual object associated with a second virtual object in response to a third input;
wherein the first virtual object comprises at least an object of a different object type than the second virtual object.
In the case where the first wearable device is networked, the first virtual object may be an object obtained from the network that includes at least an object type different from that of the second virtual object.
Specifically, after seeing the second virtual object in the virtual screen, the first wearable device user may make a third input to the second virtual object, so that the first wearable device, in response to the third input, acquires at least one first virtual object associated with the second virtual object from the network and displays the acquired at least one first virtual object in the virtual screen.
In some embodiments, the third input is a grab input on a second virtual object in the virtual screen, and the first wearable device may display at least one first virtual object associated with the second virtual object when the duration for which the first wearable device user grabs the second virtual object reaches a second predetermined duration.
For example, the television showing a football game is a second virtual object that the first wearable device user sees through the first wearable device, and the hand making the grabbing gesture is the user's own real hand seen through the device. When the first wearable device detects that the duration for which that hand has grabbed the television reaches the second predetermined duration, for example exceeding 2 s, it determines that a third input has been received. At this point, the first wearable device recognizes, via the network, the content of the virtual object grabbed by the hand and searches for first virtual objects associated with that content; for example, the first virtual objects for the television showing a football game may include an AR television, a football game, and a game video.
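A sketch of the long-grab trigger and the networked lookup of associated first virtual objects, assuming the network search is available as a callable; the in-memory `catalog` merely simulates the television example and is not part of the disclosure.

```python
def on_grab(obj_name, grab_seconds, search_network,
            second_predetermined_duration=2.0):
    """Once the grab on the second virtual object lasts at least the second
    predetermined duration, fetch associated first virtual objects (including
    at least one of a different object type) from the network."""
    if grab_seconds < second_predetermined_duration:
        return []  # not yet a third input
    return search_network(obj_name)


# A stand-in for the networked lookup used in the television example:
catalog = {"television showing a football game":
           ["AR television", "football game", "game video"]}
print(on_grab("television showing a football game", 2.5,
              lambda name: catalog.get(name, [])))
```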
Fig. 3 is a schematic interface diagram illustrating an object selection interface according to an embodiment of the present invention. As shown in Fig. 3, the AR television 204, the football game 205, and the game video 206 displayed in the left area are the first virtual objects found by the search, while the television 207 showing the football game is displayed in the right area. The first wearable device user may grab any of the first virtual objects and move it to the position of the person 202 wearing a wearable device shown in Fig. 2. For example, if the football game 205 is moved, the originally selected television 207 showing the football game is replaced by the newly selected football game 205, which is of a different object type, and invitation information corresponding to the football game 205 is sent to the second wearable device.
In some embodiments of the present invention, a specific process of the information transmission method in the first case may include:
the first wearable device can display a "television showing a football game" virtual object in a virtual screen, if a first wearable device user sees the "television showing a football game" virtual object, a right hand can be used for making a gesture of grabbing the "television showing a football game" virtual object, and when the duration of the gesture of grabbing the "television showing a football game" virtual object, which is made by the first wearable device user, is more than 2s, the first wearable device searches virtual objects such as "AR television", "football game" and "game video" associated with the "television showing a football game" virtual object in a networked manner, and displays the virtual objects in a manner shown in FIG. 3. After seeing the interface shown in fig. 3, the first wearable device user can keep grabbing the "television showing a football game" virtual object with the right hand, grab the "football game" virtual object with the left hand, drag the "football game" virtual object to the position where the user wearing the wearable device in the field of view is located, as shown in fig. 2, in the case that the first wearable device collects that the first wearable device user drags and stays the "football game" virtual object at the position where the user wearing the wearable device in the field of view is located for 2s, search for the wearable device in a spatial range including the position where the user wearing the wearable device in the field of view is located, and send invitation information corresponding to the "football game" virtual object to the wearable device that is searched. If the wearing device user using the searched wearing device accepts the invitation information corresponding to the virtual object of the football game, the two users can start to enter the scene corresponding to the virtual object of the football game together in an online manner.
It should be noted that, if the first wearable device user selects the "game video" virtual object instead, the invitation information corresponding to it may be sent to the found wearable device; if the user of that device accepts the invitation, the two users can watch the same game video synchronously.
Second case
In some embodiments of the present invention, a target identifier and at least one first virtual object are displayed in the virtual screen, the target identifier having a corresponding selected region.
In these embodiments, the specific method of step 110 may include:
receiving a first input that the first wearable device user moves the selected first virtual object to the selected area corresponding to the target identifier.
Fig. 4 is an interface diagram of an object selection interface according to another embodiment of the present invention. As shown in Fig. 4, the AR television 301, the football game 302, and the game video 303 are displayed in a horizontal arrangement in the scroll display area 304 and can be moved left or right within it. The middle of the scroll display area 304 is the selected area 305, and a target identifier 306 is displayed at the bottom of the selected area 305. The first input may be an input that moves the football game 302 into the selected area 305.
Fig. 5 is an interface diagram of an object selection interface according to yet another embodiment of the present invention. As shown in Fig. 5, the AR television 301, the football game 302, and the game video 303 are evenly distributed around the circumference of a dial 307, and a droplet-shaped target identifier 308 is displayed at the top of the dial 307; the region containing the position pointed to by the tip of the droplet is the selected area. The first input may be an input that rotates the dial so that the football game 302 lands in the selected area.
Further, in some embodiments of the invention, the first information may be sent to the second wearable device in response to the first input when the third duration for which the selected first virtual object stays in the selected area reaches a third predetermined duration.
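The dwell confirmation in the selected area could be sketched as a simple polling loop, assuming a callable that reports whether the selected first virtual object is still in the selected area; the 2 s default mirrors the third predetermined duration in the examples.

```python
import time


def confirm_selection(in_selected_area, third_predetermined_duration=2.0,
                      clock=time.monotonic, poll=0.05):
    """Trigger the send only after the selected first virtual object has
    stayed in the selected area for the third predetermined duration.
    `in_selected_area` reports whether the object is still in the area."""
    entered = clock()
    while in_selected_area():
        if clock() - entered >= third_predetermined_duration:
            return True   # confirmed: step 120 may fire
        time.sleep(poll)
    return False          # the object left the area; nothing is sent


# An object that never leaves the area is confirmed after the dwell time:
print(confirm_selection(lambda: True, third_predetermined_duration=0.1))
```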
In some embodiments of the present invention, before step 110, the information transmission method may further include:
receiving a third input of the first wearable device user to a second virtual object in the virtual screen;
displaying at least one first virtual object associated with a second virtual object in response to a third input;
wherein the first virtual object comprises at least an object of a different object type than the second virtual object.
In the case where the first wearable device is networked, the first virtual object may be an object obtained from the network that includes at least an object type different from that of the second virtual object.
Specifically, after seeing the second virtual object in the virtual screen, the first wearable device user may make a third input to the second virtual object, so that the first wearable device, in response to the third input, acquires at least one first virtual object associated with the second virtual object from the network and displays the acquired at least one first virtual object in the virtual screen.
In some embodiments of the invention, the second wearable device may be determined after the first virtual object is selected. In these embodiments, the third input may be a grab input on a second virtual object in the virtual screen, and the first wearable device may display at least one first virtual object associated with the second virtual object when the duration for which the first wearable device user grabs the second virtual object reaches a second predetermined duration.
In these embodiments, before sending the first information to the second wearable device in step 120, the information transmission method may further include:
receiving a second input that the first wearable device user moves the selected first virtual object to a second target location.
At this time, the specific method of sending the first information to the second wearable device in step 120 may include:
in response to the second input, searching for the wearable device within a second target spatial range that includes a second target location;
determining a second wearable device according to the searched wearable device;
and sending first information associated with the selected first virtual object to the second wearable device.
Specifically, the principles of moving the first virtual object to the second target position, searching for a wearable device within the second target spatial range, and determining the second wearable device and the first information are similar to those of the first case and are not repeated here.
Optionally, the specific method of searching for the wearable device within the second target space range including the second target location may further include:
and under the condition that the fourth time length for the first virtual object to stay at the second target position reaches a fourth preset time length, searching the wearable device in a second target space range including the second target position.
In other embodiments of the present invention, the second wearable device may be determined prior to selecting the first virtual object.
In these embodiments, the third input may be an input that moves a second virtual object in the virtual screen to a third target location. In addition to displaying at least one first virtual object associated with the second virtual object, the first wearable device further performs:
in response to a third input, searching for the wearable device within a third target spatial range that includes a third target location;
and determining a second wearable device according to the searched wearable device.
Specifically, the specific methods of searching for the wearable device in the third target space range and determining the second wearable device are similar to the principle of the related content in the first case, and are not described herein again.
Optionally, the specific method of searching for the wearable device within the third target space range including the third target location may further include:
and under the condition that the fifth time length for which the second virtual object stays at the third target position reaches a fifth preset time length, searching the wearable device in a third target space range including the third target position.
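To summarize the two orderings described in this case, the following sketch contrasts the object-first flow (the second wearable device is found after the first virtual object is selected) with the device-first flow (the third target location fixes the device first); the callables are placeholders for the steps described above.

```python
def object_first_flow(select_object, find_device, send):
    """First case, and the first variant of the second case: the first
    virtual object is chosen before the second wearable device is found
    (via the first or second target location)."""
    obj = select_object()
    device = find_device()
    send(device, obj)


def device_first_flow(select_object, find_device, send):
    """Second variant of the second case: the third input moves the second
    virtual object to a third target location, fixing the second wearable
    device before a first virtual object is chosen."""
    device = find_device()
    obj = select_object()
    send(device, obj)
```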
In some embodiments of the present invention, a specific process of the information transmission method in the second case may include:
The first wearable device may display a "television showing a football game" virtual object in the virtual screen. On seeing it, the first wearable device user can grab it and drag it to the position of the person wearing a wearable device in the field of view. When the first wearable device detects that the user has kept the dragged object at that position for 2 s, it searches for a wearable device within a spatial range including that position; at the same time, it searches the network for virtual objects associated with the "television showing a football game" object, such as "AR television", "football game", and "game video", and displays them as shown in Fig. 5. After seeing the interface of Fig. 5, the user can make a gesture of rotating the dial; on capturing this gesture, the first wearable device rotates the dial and stops the rotation when the gesture ends. If, when the dial stops, the "football game" virtual object sits in the selected area pointed to by the tip of the droplet and stays there for more than 2 s, the first wearable device sends the invitation information corresponding to the "football game" object to the wearable device found. If the user of the found wearable device accepts the invitation, the two users can enter the scene corresponding to the "football game" virtual object together online.
It should be noted that, if the first wearable device user selects the "game video" virtual object instead, the invitation information corresponding to it may be sent to the found wearable device; if the user of that device accepts the invitation, the two users can watch the same game video synchronously.
In summary, the information transmission method of the embodiments of the invention can be combined with AR technology. When a user uses a wearable device, for example a head-mounted AR device, the device can identify the second virtual object selected by the user and acquire first virtual objects of different object types. After the user selects a first virtual object, information about this new type of virtual object can be transmitted to other users' head-mounted AR devices. Different types of AR information can thus be transmitted between users, helping a user quickly derive, from existing information, new information with which to interact with others, and improving both the collaborative efficiency and the entertainment value of head-mounted AR devices.
Fig. 6 shows a schematic structural diagram of a wearable device provided by an embodiment of the invention. The device shown in Fig. 6 may be the first wearable device described above. The first wearable device and the second wearable device may be head-mounted AR devices such as AR glasses or an AR helmet.
As shown in fig. 6, the wearable device may include:
a first input receiving module 410, configured to receive a first input of a first wearable device user to a first virtual object in a virtual screen;
a first information sending module 420 for sending first information to the second wearable device in response to the first input;
the first information is information associated with the first virtual object.
In the embodiments of the invention, after the wearable device receives the first input of the first wearable device user to the first virtual object in the virtual screen, it can, in response to the first input, send the first information associated with the first virtual object to the second wearable device. Information transmission between the two devices is thus achieved without adding a social account, which improves both the convenience and the efficiency of data transmission.
In the embodiments of the invention, once the first wearable device user wears the first wearable device, the user can simultaneously view the real environment in front of his or her eyes and the virtual objects in the virtual screen of the first wearable device.
In some embodiments of the present invention, the first input receiving module 410 may be specifically configured to:
a first input is received that a first wearable device user moves a first virtual object in a virtual screen to a first target location.
In some embodiments of the invention, the wearable device may further comprise:
a first device search module to search for a wearable device within a first target spatial range including a first target location in response to a first input;
the first equipment determining module is used for determining second wearable equipment according to the searched wearable equipment;
a first information determination module for determining first information based on the first virtual object moved to the first target position;
the first information sending module 420 may be configured to send the first information to the second wearable device.
In some embodiments of the present invention, the first device search module may be specifically configured to:
and under the condition that the first time length of the first virtual object staying at the first target position reaches a first preset time length, searching the wearable device in a first target space range including the first target position.
In some embodiments of the present invention, a target identifier and at least one first virtual object are displayed in the virtual screen.
In these embodiments, the first input receiving module 410 may be further specifically configured to:
receiving a first input that the first wearable device user moves the selected first virtual object to the selected area corresponding to the target identifier.
In these embodiments, optionally, the wearable device may further comprise:
the second input receiving module is used for receiving second input that the first wearable device user moves the selected first virtual object to the second target position before the first information is sent to the second wearable device;
a second device search module to search for the wearable device within a second target space range including a second target location in response to a second input;
the second equipment determining module is used for determining second wearable equipment according to the searched wearable equipment;
wherein, the first information sending module 420 may be configured to send the first information associated with the selected first virtual object to the second wearable device.
In some embodiments of the invention, the wearable device may further comprise:
the third input receiving module is used for receiving a third input of the first wearable device user to the second virtual object in the virtual screen before receiving the first input of the first wearable device user to the first virtual object in the virtual screen;
a virtual object display module to display at least one first virtual object associated with a second virtual object in response to a third input; wherein the first virtual object comprises at least an object of a different object type than the second virtual object.
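Purely as an illustrative sketch, the module decomposition above might be captured as a record of required and optional callables; the field names mirror the reference numerals and module names of the description and are not an actual API.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class WearableDeviceModules:
    # Required modules of Fig. 6:
    first_input_receiving_module: Callable        # module 410
    first_information_sending_module: Callable    # module 420
    # Optional modules described in the embodiments above:
    first_device_search_module: Optional[Callable] = None
    first_device_determining_module: Optional[Callable] = None
    first_information_determining_module: Optional[Callable] = None
    second_input_receiving_module: Optional[Callable] = None
    third_input_receiving_module: Optional[Callable] = None
    virtual_object_display_module: Optional[Callable] = None
```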
The wearable device provided by the embodiments of the present invention can implement each process and effect implemented by the first wearable device in the foregoing method embodiments; the implementation principles are similar and, to avoid repetition, are not described again here.
Fig. 7 is a schematic diagram of a hardware structure of a wearable device for implementing various embodiments of the present invention. As shown in fig. 7, the wearable device 500 includes but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the wearable device configuration shown in fig. 7 does not constitute a limitation of a wearable device, which may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the wearable device may be a head-mounted AR device such as AR glasses or an AR helmet.
The processor 510 is configured to control the display unit 506 to display the first virtual object on its virtual screen; the user input unit 507 is configured to receive a first input of the first wearable device user to the first virtual object in the virtual screen; and the processor 510 is further configured to send, in response to the first input, first information to the second wearable device, where the first information is information associated with the first virtual object.
The wearable device provided by the embodiment of the invention can realize each process realized by the wearable device in the method embodiment, and is not repeated here to avoid repetition.
In the embodiments of the invention, after the wearable device receives the first input of the first wearable device user to the first virtual object in the virtual screen, it can, in response to the first input, send the first information associated with the first virtual object to the second wearable device. Information transmission between the two devices is thus achieved without adding a social account, which improves both the convenience and the efficiency of data transmission.
It should be understood that, in the embodiments of the invention, the radio frequency unit 501 may be used to receive and send signals during messaging or a call; specifically, it receives downlink data from a base station and forwards it to the processor 510 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 may also communicate with a network and other devices through a wireless communication system, for example with other wearable devices through near field communication.
The first wearable device provides wireless broadband internet access to the user via the network module 502, such as assisting the user in sending and receiving e-mails, browsing web pages, accessing streaming media, and the like.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output as sound. Also, the audio output unit 503 may also provide audio output related to a specific function performed by the wearable device 500 (e.g., a call signal receiving sound, a message receiving sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive audio or video signals. The input unit 504 may include a Graphics Processing Unit (GPU) 5041 and a microphone 5042; the graphics processor 5041 processes image data of still pictures or video obtained by an image capture device (e.g. a camera) in a video capture or image capture mode. The camera may include a front camera, so that the first wearable device can display and interact through the virtual screen on the basis of the pictures captured by the front camera. The processed image frames may be displayed on the display unit 506, stored in the memory 509 (or another storage medium), or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sound and process it into audio data; in the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 501.
The wearable device 500 also includes at least one sensor 505, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of light projected by the projector 5061 according to the brightness of ambient light, and a proximity sensor that can turn off the projector 5061 when the wearable device 500 is moved to the ear. As one of the motion sensors, the accelerometer sensor may detect the magnitude of acceleration in each direction (generally, three axes), may detect the magnitude and direction of gravity when stationary, and may be used to identify the first wearable device posture (e.g., horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (e.g., pedometer, tapping), and the like; the sensors 505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 506 is used to display information input by the user or information provided to the user. The display unit 506 may include a projector 5061 for projecting light corresponding to the information to be displayed, and an optical assembly for reflecting that light so that it is projected onto the user's retina, forming a virtual screen on the reflecting surface of the optical assembly in front of the user's eyes.
The user input unit 507 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the first wearable device. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, can collect touch operations by a user on or near it (e.g. operations with a finger, a stylus, or any suitable object or attachment). The touch panel 5071 may include a touch detection device and a touch controller. The touch detection device detects the user's touch position and the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 510, and receives and executes commands sent by the processor 510. The touch panel 5071 may be implemented as a resistive, capacitive, infrared, or surface acoustic wave panel, among other types. Besides the touch panel 5071, the user input unit 507 may include other input devices 5072, which may include, but are not limited to, a physical keyboard, function keys (e.g. volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not described in detail here.
Further, when the touch panel 5071 detects a touch operation on or near the touch panel, the touch operation is transmitted to the processor 510 to determine the type of the touch event, and then the processor 510 provides a corresponding visual output on the projector 5061 according to the type of the touch event.
The interface unit 508 is an interface for connecting an external device to the wearable device 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g. data or power) from an external device and transmit it to one or more elements within the wearable device 500, or to transmit data between the wearable device 500 and an external device.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 510 is a control center of the first wearable device, connects various parts of the whole first wearable device by using various interfaces and lines, and executes various functions and processes data of the first wearable device by running or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring on the first wearable device. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The wearable device 500 may further include a power source 511 (e.g., a battery) for supplying power to various components, and preferably, the power source 511 may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system.
In addition, the wearable device 500 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a first wearable device, including a processor 510, a memory 509, and a computer program stored in the memory 509 and executable on the processor 510. When executed by the processor 510, the computer program implements the processes of the information transmission method embodiments above and achieves the same technical effects; to avoid repetition, the details are not described again here.
It is understood that the first wearable device in the above embodiments integrates AR technology. AR technology combines a real scene with a virtual scene. By adopting AR technology, the human visual experience can be reproduced, so that a person perceives the combination of the real scene and the virtual scene and gains a stronger sense of being present in the scene.
Taking the first wearable device as AR glasses as an example, when the user wears the AR glasses, the scene the user views is generated by AR processing; that is, the virtual scene can be displayed superimposed on the real scene through AR technology. When the user operates the content displayed by the AR glasses, the AR glasses appear to peel back the real scene to show the user a more revealing view. For example, when observing a carton with the naked eye, a user can only see the carton's outer shell; after the user puts on the AR glasses, the AR technology can superimpose a virtual scene of the carton's internal structure on the real outer shell, so that the user can directly observe the internal structure of the carton through the AR glasses.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, it implements each process of the information transmission method embodiment described above and can achieve the same technical effects; to avoid repetition, the details are not repeated here. The computer-readable storage medium may be, for example, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware; in many cases, however, the former is the better implementation. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for causing a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods of the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. An information transmission method applied to a first wearable device, characterized by comprising the following steps:
receiving a first input of a first wearable device user to a first virtual object in a virtual screen;
in response to the first input, sending first information to a second wearable device;
wherein the first information is information associated with the first virtual object;
wherein the receiving a first input of a first wearable device user to a first virtual object in a virtual screen comprises:
receiving a first input by the first wearable device user to move the first virtual object in the virtual screen to a first target location;
wherein said sending first information to a second wearable device in response to the first input comprises:
in response to the first input, searching for a wearable device within a first target spatial range that includes the first target location;
determining the second wearable device according to the searched wearable device;
determining the first information based on the first virtual object moved to the first target location;
and sending the first information to the second wearable device.
2. The method of claim 1, wherein searching for a wearable device within a first target spatial range including the first target location comprises:
searching for a wearable device within the first target spatial range including the first target location if a first duration for which the first virtual object stays at the first target location reaches a first predetermined duration.
3. The method of claim 1, wherein a target identifier and at least one first virtual object are displayed in the virtual screen;
wherein the receiving a first input of a first wearable device user to a first virtual object in a virtual screen comprises:
receiving a first input that the first wearable device user moves the selected first virtual object to a selected area corresponding to the target identifier.
4. The method of claim 3, wherein, before the sending of the first information to the second wearable device, the method further comprises:
receiving a second input that the first wearable device user moves the selected first virtual object to a second target location;
wherein the sending of the first information to the second wearable device comprises:
in response to the second input, searching for a wearable device within a second target spatial range that includes the second target location;
determining the second wearable device according to the searched wearable device;
sending the first information associated with the selected first virtual object to the second wearable device.
5. The method of any one of claims 1 to 3, wherein, before the receiving of the first input of the first wearable device user to the first virtual object in the virtual screen, the method further comprises:
receiving a third input of the first wearable device user to a second virtual object in the virtual screen;
displaying at least one of the first virtual objects associated with the second virtual object in response to the third input;
wherein the first virtual object comprises at least an object of an object type different from that of the second virtual object.
6. A wearable device, comprising:
a first input receiving module for receiving a first input of a first wearable device user to a first virtual object in a virtual screen;
a first information sending module for sending first information to a second wearable device in response to the first input;
wherein the first information is information associated with the first virtual object;
the first input receiving module is specifically configured to receive a first input that the first wearable device user moves the first virtual object in the virtual screen to a first target location;
the wearable device further comprises:
a first device search module for searching for a wearable device within a first target spatial range including the first target location in response to the first input;
a first device determination module for determining the second wearable device according to the searched wearable device;
a first information determination module for determining the first information based on the first virtual object moved to the first target location;
and the first information sending module is further configured to send the first information to the second wearable device.
7. A wearable device, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the information transmission method according to any one of claims 1 to 5.
8. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the information transmission method according to any one of claims 1 to 5.
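As a non-normative illustration of the flow recited in claims 1 and 2 (move the first virtual object to a target location, wait for a dwell time, search the spatial range around that location, determine the second wearable device, and send the associated information), the following Python sketch may help. Every name in it (VirtualObject, discover_devices_near, send, and so on) is hypothetical; the claims do not prescribe any particular implementation or transport.

import time
from dataclasses import dataclass
from typing import Tuple

Location = Tuple[float, float, float]

@dataclass
class VirtualObject:
    name: str
    payload: bytes  # the "first information" associated with this object

def discover_devices_near(location, radius_m):
    # Stub: return the wearable devices found within the target spatial
    # range that includes `location` (e.g. via short-range radio scanning).
    return []

def send_on_drop(obj, target, dwell_s=1.0, radius_m=2.0):
    # After the user moves `obj` to `target`, require the dwell time of
    # claim 2, search the surrounding spatial range, determine the second
    # wearable device, and send the first information to it.
    time.sleep(dwell_s)                    # first predetermined duration
    candidates = discover_devices_near(target, radius_m)
    if not candidates:
        return False                       # no second wearable device found
    second_device = candidates[0]          # determine the second wearable device
    second_device.send(obj.payload)        # hypothetical transport call
    return True

# With the stub discovery above nothing is found, so nothing is sent.
print(send_on_drop(VirtualObject("photo.jpg", b"..."), (1.0, 0.5, 2.0)))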
CN201911415431.3A 2019-12-31 2019-12-31 Information transmission method, wearable device and medium Active CN111208903B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911415431.3A CN111208903B (en) 2019-12-31 2019-12-31 Information transmission method, wearable device and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911415431.3A CN111208903B (en) 2019-12-31 2019-12-31 Information transmission method, wearable device and medium

Publications (2)

Publication Number Publication Date
CN111208903A CN111208903A (en) 2020-05-29
CN111208903B (en) 2022-07-05

Family

ID=70789878

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911415431.3A Active CN111208903B (en) 2019-12-31 2019-12-31 Information transmission method, wearable device and medium

Country Status (1)

Country Link
CN (1) CN111208903B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105892650A (en) * 2016-03-28 2016-08-24 联想(北京)有限公司 Information processing method and electronic equipment
CN106790553A (en) * 2016-12-24 2017-05-31 珠海市魅族科技有限公司 The interface sharing method and device of virtual reality device
CN106997281A (en) * 2017-04-10 2017-08-01 北京小米移动软件有限公司 The method and smart machine of shared virtual objects
CN108479060A (en) * 2018-03-29 2018-09-04 联想(北京)有限公司 A kind of display control method and electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10250720B2 (en) * 2016-05-05 2019-04-02 Google Llc Sharing in an augmented and/or virtual reality environment
US20190102946A1 (en) * 2017-08-04 2019-04-04 Magical Technologies, Llc Systems, methods and apparatuses for deployment and targeting of context-aware virtual objects and behavior modeling of virtual objects based on physical principles

Also Published As

Publication number Publication date
CN111208903A (en) 2020-05-29

Similar Documents

Publication Publication Date Title
CN107707817B (en) video shooting method and mobile terminal
JP6340301B2 (en) Head mounted display, portable information terminal, image processing apparatus, display control program, display control method, and display system
CN107786827B (en) Video shooting method, video playing method and device and mobile terminal
CN110174993B (en) Display control method, terminal equipment and computer readable storage medium
CN110213440B (en) Image sharing method and terminal
CN108628515B (en) Multimedia content operation method and mobile terminal
WO2021136266A1 (en) Virtual image synchronization method and wearable device
CN112817453A (en) Virtual reality equipment and sight following method of object in virtual reality scene
CN111083354A (en) Video recording method and electronic equipment
CN111258420A (en) Information interaction method, head-mounted device and medium
CN111177420A (en) Multimedia file display method, electronic equipment and medium
CN111182211B (en) Shooting method, image processing method and electronic equipment
CN109656636A (en) A kind of application starting method and device
CN109164908B (en) Interface control method and mobile terminal
CN109002245B (en) Application interface operation method and mobile terminal
CN108924413B (en) Shooting method and mobile terminal
CN108156386B (en) Panoramic photographing method and mobile terminal
CN111443805B (en) Display method and wearable electronic equipment
CN111093033B (en) Information processing method and device
CN111240471B (en) Information interaction method and wearable device
CN111178306B (en) Display control method and electronic equipment
CN109547696B (en) Shooting method and terminal equipment
CN111208903B (en) Information transmission method, wearable device and medium
CN111258482B (en) Information sharing method, head-mounted device and medium
CN109547773B (en) Control method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant