Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, of the embodiments of the present invention. All other embodiments that a person skilled in the art can derive from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used in the examples of the present invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well; "a" and "an" generally cover at least two instances, but do not exclude the case of at least one, unless the context clearly dictates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter associated objects are in an "or" relationship.
The word "if", as used herein, may be interpreted as "when" or "upon" or "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrases "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (a stated condition or event) is detected", or "in response to detecting (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a good or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such good or system. Without further limitation, an element preceded by the phrase "comprising a/an" does not exclude the presence of other like elements in the good or system that includes the element.
In the prior art, when other external devices need to be connected while a VR/AR device is in use, the user must manually search for connectable peripheral devices and select which one to connect to, which is inefficient. Moreover, when the user is already wearing the VR/AR device, manual operation is further restricted, making the connection of an external device even more cumbersome. To overcome these drawbacks, the present invention provides a connection method for Bluetooth devices, which intelligently selects a target Bluetooth device and establishes a communication connection with it. The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a method for connecting a bluetooth device according to an embodiment of the present invention, and with reference to fig. 1, the method includes:
Step 101, in response to a request to connect an external device, scanning for at least one connectable Bluetooth device.
Step 102, acquiring state information associated with a target virtual scene.
Step 103, determining a target Bluetooth device from the at least one connectable Bluetooth device according to the state information associated with the target virtual scene.
Step 104, establishing a communication connection with the target Bluetooth device.
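Steps 101 to 104 can be sketched as a minimal control flow. The helper callables below are hypothetical placeholders standing in for the operations the embodiment describes, not part of the disclosed implementation:

```python
# Minimal sketch of steps 101-104; all helpers are hypothetical placeholders
# injected as callables so the flow itself stays self-contained.

def connect_external_device(scan, get_scene_state, choose_target, connect):
    """Run the four-step connection flow with injected helper callables."""
    devices = scan()                         # step 101: scan for connectable devices
    state = get_scene_state()                # step 102: state of the target virtual scene
    target = choose_target(devices, state)   # step 103: pick the target device
    return connect(target)                   # step 104: establish the connection
```

The point of the sketch is only the ordering of the four steps; each placeholder is elaborated by the later embodiments.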
This embodiment is applicable to a VR/AR device. In step 101, an external device refers to another device located around the VR/AR device that can establish a connection with the VR/AR device and is used to extend or improve its functions, for example, a handle, a steering wheel, a flight joystick, a motion-sensing glove, or the like.
The request to connect an external device may be initiated by a user. In response to the request, the VR/AR device scans its surroundings for at least one connectable Bluetooth device, i.e., one or more devices that broadcast Bluetooth signals and can be discovered by the VR/AR device.
In step 102, the target virtual scene refers to a virtual scene to be displayed by the VR/AR device, such as a racing game scene, a flying game scene, a running game scene, or a teaching scene.
The state information associated with the target virtual scene refers to a current state of a parameter related to the target virtual scene in the VR/AR device, for example, a first parameter related to the target virtual scene is currently in a set state, a second parameter related to the target virtual scene is currently in an unset state, and the like.
Next, in step 103, a target Bluetooth device can be determined from the at least one connectable Bluetooth device based on the state information associated with the target virtual scene. There may be one or more target Bluetooth devices. After the target Bluetooth device is determined, a communication connection may be established with it.
In this embodiment, when a request to connect an external device is received, state information associated with the target virtual scene is acquired, a target Bluetooth device is selected from the at least one connectable Bluetooth device obtained by scanning, and a communication connection is established with the target Bluetooth device. Compared with the prior art, the technical solution of this embodiment automatically selects the target Bluetooth device and actively establishes a communication connection with it, thereby making the communication connection process between the VR/AR device and external Bluetooth devices intelligent and simplifying the operation of the VR/AR device.
In the above embodiment, optionally, the state information associated with the target virtual scene may include: the target virtual scene is in a selected state or an unselected state.
When the target virtual scene is in the selected state, the user or the VR/AR device has already selected the virtual scene to be presented; at this time, the scene setting parameters related to the target virtual scene are in an effective setting state in the VR/AR device. When the target virtual scene is in the unselected state, the virtual scene to be presented has not yet been determined; at this time, the scene setting parameters may be null, which may indicate that the user has not actively selected a scene or is waiting for the VR/AR device to automatically select the virtual scene to be displayed. The present invention provides different alternative embodiments for these two cases, which are described below with reference to fig. 2 and fig. 3, respectively.
Fig. 2 is a flowchart of a method for connecting a bluetooth device according to another embodiment of the present invention, and with reference to fig. 2, the method includes:
Step 201, in response to a request to connect an external device, scanning for at least one connectable Bluetooth device.
Step 202, acquiring state information associated with the target virtual scene: the target virtual scene is in a selected state.
Step 203, acquiring the target virtual scene selected by the user.
Step 204, determining, according to a pre-established correspondence between Bluetooth devices and virtual scenes, whether a Bluetooth device corresponding to the target virtual scene exists among the at least one connectable Bluetooth device; if not, go to step 205; if so, go to step 206.
Step 205, prompting the user that no Bluetooth device corresponding to the target virtual scene exists.
Step 206, determining whether a Bluetooth device uniquely corresponding to the target virtual scene exists among the at least one connectable Bluetooth device; if so, go to step 207; if not, go to step 208.
Step 207, taking the Bluetooth device uniquely corresponding to the target virtual scene as the target Bluetooth device, and going to step 210.
Step 208, calculating the relative positions of the plurality of connectable Bluetooth devices corresponding to the target virtual scene.
Step 209, selecting a target Bluetooth device from the plurality of connectable Bluetooth devices according to their relative positions.
Step 210, establishing a communication connection with the target Bluetooth device.
In step 201, the user may optionally initiate the request to connect an external device via a gesture, a head movement, an eye movement, and/or the pressing of a specific button.
When the VR/AR device detects the user's gesture, head movement, eye movement, and/or press of a specific button, it determines that the user is requesting to connect an external device. In response to the request, the VR/AR device scans for external Bluetooth signals and obtains at least one connectable Bluetooth device from the scanned signals. Before scanning, the VR/AR device may first check whether its Bluetooth scanning function is enabled; if so, it scans directly, and if not, it enables the Bluetooth scanning function and then scans for external Bluetooth signals.
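The scan flow just described (check whether scanning is enabled, enable it if needed, then scan and keep the connectable devices) can be sketched as follows. The `adapter` object and its methods are hypothetical placeholders, not a real Bluetooth API:

```python
# Sketch of the scan flow in step 201. The adapter interface is assumed, for
# illustration only; a real implementation would use the platform's Bluetooth stack.

def scan_connectable_devices(adapter):
    """Ensure scanning is enabled, scan, and return names of connectable devices."""
    if not adapter.is_scanning_enabled():
        adapter.enable_scanning()          # turn the Bluetooth scanning function on first
    signals = adapter.scan()               # collect external Bluetooth broadcasts
    # keep only devices that advertise themselves as connectable
    return [s["name"] for s in signals if s.get("connectable")]
```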
In steps 202 to 203, when the acquired state information indicates that the target virtual scene is in the selected state, the target virtual scene selected by the user can be acquired.
In step 204, the correspondence between Bluetooth devices and virtual scenes is pre-established. Optionally, in this embodiment, when any virtual scene is imported or created, the Bluetooth devices supported or required by that virtual scene may be recorded in advance on the VR/AR device; specifically, the name and/or MAC (Media Access Control) address of each such Bluetooth device may be recorded. For example, the Bluetooth devices supported by a racing game scene may include a gamepad named H1 and a steering wheel named H2; the Bluetooth devices supported by a flight game scene may include a gamepad with MAC address A1 and a flight joystick with MAC address A2.
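The correspondence table and the lookup in step 204 can be sketched as follows, reusing the H1/H2 and A1/A2 examples from the text; the data layout is an assumption made for illustration:

```python
# Illustrative scene-to-device correspondence table, keyed by scene name and
# listing recorded device names and/or MAC addresses (layout is hypothetical).
SCENE_DEVICE_MAP = {
    "racing_game": [{"name": "H1"}, {"name": "H2"}],
    "flight_game": [{"mac": "A1"}, {"mac": "A2"}],
}

def devices_for_scene(scene, scanned):
    """Return the scanned devices recorded for `scene`, matched by name or MAC."""
    wanted = SCENE_DEVICE_MAP.get(scene, [])
    keys = {d.get("name") or d.get("mac") for d in wanted}
    return [s for s in scanned if s.get("name") in keys or s.get("mac") in keys]
```

An empty result corresponds to the "no matching Bluetooth device" branch that leads to the prompt in step 205.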
On this basis, after the target virtual scene selected by the user is obtained, the correspondence between Bluetooth devices and virtual scenes can be queried to determine whether a Bluetooth device corresponding to the target virtual scene exists among the at least one connectable Bluetooth device obtained by scanning. If not, step 205 is executed to prompt the user that no Bluetooth device corresponding to the target virtual scene exists. Optionally, the prompt may be given by voice broadcast or by displaying prompt text on the display screen of the VR/AR device, which is not limited in this embodiment.
In steps 206 to 209, when only one Bluetooth device corresponding to the target virtual scene exists among the at least one connectable Bluetooth device, that Bluetooth device may be used directly as the target Bluetooth device.
When a plurality of Bluetooth devices corresponding to the target virtual scene exist among the at least one connectable Bluetooth device, their relative positions may be calculated respectively, and the target Bluetooth device may be selected based on those relative positions. Here, a relative position refers to the position of a Bluetooth device relative to the VR/AR device (the same applies hereinafter). One or more target Bluetooth devices may be selected.
Optionally, in this embodiment, to calculate the relative position of a Bluetooth device, at least two Bluetooth modules may be installed on the VR/AR device; these modules can transmit Bluetooth signals and can also receive Bluetooth signals sent by other Bluetooth devices. In an alternative embodiment, the distance d from any Bluetooth device to each Bluetooth module installed on the VR/AR device may be calculated by the following formula:
d = 10^{[abs(RSSI) - A] / (10 * n)}
where abs() denotes the absolute value function, RSSI denotes the Bluetooth signal strength received by the Bluetooth module installed on the VR/AR device, A denotes the signal strength when the distance between the Bluetooth transmitting end and receiving end is 1 m, and n is a spatial obstacle attenuation factor.
After the distance from a Bluetooth device to each Bluetooth module installed on the VR/AR device is calculated, the Bluetooth device may be located by combining these distances with the known distances between the Bluetooth modules, yielding the relative position of the Bluetooth device.
In an alternative embodiment, to reduce the weight and volume of the VR/AR device, two Bluetooth modules separated by a distance d0 may be mounted on the VR/AR device, and the relative position of an external Bluetooth device may be determined by triangulation. Specifically, for a Bluetooth device, the distances d1 and d2 between the device and the two Bluetooth modules are determined first; the two Bluetooth modules are then taken as two vertices of a triangle, with d0, d1, and d2 as its three sides; finally, the position of the third vertex of the triangle is determined from the two known vertices and the three sides, and that position is taken as the relative position of the Bluetooth device.
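The triangle construction above can be sketched by placing the two on-device modules at (0, 0) and (d0, 0) and recovering the third vertex from the three side lengths via the law of cosines; the coordinate frame is an assumption made for illustration:

```python
import math

# Sketch of the two-module positioning described above: modules at (0, 0) and
# (d0, 0); the external device is the third vertex of a triangle with sides
# d0, d1, d2 (law of cosines, then Pythagoras for the perpendicular offset).

def locate_device(d0, d1, d2):
    """Return (x, y) of the device relative to the first Bluetooth module."""
    x = (d1 ** 2 + d0 ** 2 - d2 ** 2) / (2 * d0)
    y_sq = d1 ** 2 - x ** 2
    if y_sq < 0:
        raise ValueError("distances do not form a triangle")
    # y is returned as non-negative; distances alone cannot resolve on which
    # side of the line joining the two modules the device lies
    return x, math.sqrt(y_sq)
```

Note the inherent front/back ambiguity with only two modules, which is why the text also mentions installing more than two.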
Then, a target Bluetooth device may be selected from the plurality of connectable Bluetooth devices according to their calculated relative positions. For example, a Bluetooth device located within a set distance (e.g., 1 m) of the VR/AR device, and/or closest to the VR/AR device, and/or in front of the VR/AR device may be selected as the target Bluetooth device. In practice, other position-based criteria may be chosen according to actual requirements; the embodiments of the present invention include but are not limited to the above cases.
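One of the position-based criteria above (within a set distance, then closest) can be sketched as follows; the 1 m default mirrors the example in the text, and the position format is an assumption:

```python
import math

# Sketch of the selection in step 209: given device names mapped to relative
# positions (x, y) in metres, keep those within a set radius and pick the
# closest. The 1 m threshold follows the example given in the embodiment.

def pick_target(positions, max_range=1.0):
    """Return the name of the closest in-range device, or None if none qualify."""
    in_range = {name: math.hypot(x, y)
                for name, (x, y) in positions.items()
                if math.hypot(x, y) <= max_range}
    if not in_range:
        return None
    return min(in_range, key=in_range.get)
```

A "in front of the device" criterion would additionally filter on the sign or angle of the position vector, which this sketch omits.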
After the target bluetooth device is selected, a communication request may be initiated to the target bluetooth device and a communication connection may be established therewith in step 210. Furthermore, the user can interact with the virtual scene through the connected Bluetooth device while watching the virtual scene.
In this embodiment, when a request to connect an external device is received, a target Bluetooth device corresponding to the target virtual scene selected by the user is chosen from the at least one connectable Bluetooth device obtained by scanning, and a communication connection is established with it. This achieves automatic connection with the target Bluetooth device, simplifies the operation of the VR/AR device, and improves the user experience.
Fig. 3 is a flowchart of a method for connecting a bluetooth device according to another embodiment of the present invention, and in conjunction with fig. 3, the method includes:
step 301, in response to a request for connecting an external device, scanning for at least one connectable bluetooth device.
Step 302, acquiring state information associated with the target virtual scene: the target virtual scene is in an unselected state.
Step 303, calculating the relative position of the at least one connectable bluetooth device.
Step 304, selecting a target Bluetooth device from the at least one connectable Bluetooth device according to the relative position of the at least one connectable Bluetooth device.
Step 305, determining, according to a pre-established correspondence between Bluetooth devices and virtual scenes, whether a target virtual scene corresponding to the target Bluetooth device exists; if not, go to step 306; if so, go to step 307.
Step 306, prompting the user that no target virtual scene corresponding to the target Bluetooth device exists.
Step 307, determining whether a virtual scene uniquely corresponding to the target Bluetooth device exists; if so, go to step 308; if not, go to step 309.
Step 308, taking the virtual scene uniquely corresponding to the target Bluetooth device as the target virtual scene, and going to step 311.
Step 309, displaying a plurality of selectable virtual scenes corresponding to the target Bluetooth device to the user.
Step 310, receiving the user's selection among the plurality of selectable virtual scenes, taking the virtual scene selected by the user as the target virtual scene, and going to step 311.
Step 311, displaying the target virtual scene.
In this embodiment, the acquired state information associated with the target virtual scene indicates that the target virtual scene is in an unselected state. That is, the virtual scene to be presented is unknown, so it is not possible to choose which Bluetooth device to connect to based on the target virtual scene. In this case, steps 303 and 304 may be performed to calculate the relative position of the at least one connectable Bluetooth device and select a target Bluetooth device based on that relative position. Optionally, for the method of calculating the relative position and selecting the target Bluetooth device accordingly, reference may be made to the description of the embodiment corresponding to fig. 2, which is not repeated here.
In step 305, optionally, after the target Bluetooth device is determined, the pre-established correspondence between Bluetooth devices and virtual scenes may be queried to determine whether a virtual scene corresponding to the target Bluetooth device exists. If not, the user is prompted that no target virtual scene corresponding to the target Bluetooth device exists. Optionally, the prompt may be given by voice broadcast or by displaying prompt text on the display screen of the VR/AR device, which is not limited in this embodiment.
In steps 307 to 310, if a virtual scene uniquely corresponding to the target Bluetooth device exists, that virtual scene is taken as the target virtual scene. For example, if the target Bluetooth device is a flight joystick and, among the virtual scenes stored in the VR/AR device, only one flight game scene supports and requires the flight joystick, that flight game scene is taken as the target virtual scene.
If a plurality of selectable virtual scenes corresponding to the target Bluetooth device exist, the plurality of selectable virtual scenes are displayed for the user to choose from. For example, when the target Bluetooth device is a gamepad and multiple games stored in the VR/AR device support and require the gamepad, the user may select which game scene to enter. Optionally, the selectable virtual scenes may be presented by listing their names, by showing their cover screenshots in floating windows, by cycling through dynamic thumbnails, and so on; this embodiment includes but is not limited to these manners.
After the plurality of selectable virtual scenes are presented, the user's selection among them may be detected. For example, the user may select the virtual scene to be viewed by pointing a finger, nodding, or turning their eyes in a certain direction. After receiving the user's selection, the VR/AR device takes the selected virtual scene as the target virtual scene.
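The reverse lookup of steps 305 to 310 (from target device to scene) can be sketched as follows; the mapping layout and the chooser callback are hypothetical placeholders:

```python
# Sketch of steps 305-310: look up the scenes recorded for the target device;
# auto-select when the correspondence is unique, otherwise defer to the user's
# choice. `device_scene_map` and `ask_user` are assumed interfaces.

def resolve_scene(device, device_scene_map, ask_user):
    """Return the target virtual scene for `device`, or None if none matches."""
    scenes = device_scene_map.get(device, [])
    if not scenes:
        return None                      # step 306: prompt "no matching scene"
    if len(scenes) == 1:
        return scenes[0]                 # steps 307-308: unique correspondence
    return ask_user(scenes)              # steps 309-310: let the user choose
```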
In step 311, after the target virtual scene is determined, the target virtual scene may be presented for viewing by the user. And the user can interact with the virtual scene through the connected Bluetooth device in the process of watching the virtual scene.
In this embodiment, when a request to connect an external device is received, a target Bluetooth device is selected for connection based on the relative position of the at least one connectable Bluetooth device obtained by scanning, and after the connection is established, the virtual scene corresponding to the target Bluetooth device is displayed. This achieves automatic connection with the target Bluetooth device and automatic switching of the virtual scene, simplifies the operation of the VR/AR device, and improves the user experience.
It should be noted that, alternatively, the execution subject of the above or below embodiments may be a bluetooth connection service built in the VR/AR device. The bluetooth connection service may be embodied as a service program capable of executing the methods provided by the embodiments of the present invention on the VR/AR device.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of step 201 to step 203 may be device a; for another example, the execution subject of steps 201 and 202 may be device a, and the execution subject of step 203 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 101, 102, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
It is also noted that, in the systems and methods of the present invention, individual components or steps may obviously be decomposed and/or recombined. Such decompositions and/or recombinations are to be regarded as equivalents of the present invention. Also, the steps of the series of processes described above may naturally be executed chronologically in the order described, but need not be; some steps may be performed in parallel or independently of each other.
An alternative embodiment of the Bluetooth device connection method has been described above. In practice, the method may be implemented by a Bluetooth device connection apparatus; as shown in fig. 4a, the apparatus includes:
a scanning module 401, configured to scan for at least one connectable Bluetooth device in response to a request to connect an external device; an information obtaining module 402, configured to acquire state information associated with a target virtual scene; a determining module 403, configured to determine a target Bluetooth device from the at least one connectable Bluetooth device according to the state information associated with the target virtual scene; and a connection module 404, configured to establish a communication connection with the target Bluetooth device.
Further optionally, the state information associated with the target virtual scene includes: the target virtual scene is in a selected state; or, the target virtual scene is in an unselected state.
Further optionally, when the target virtual scene is in the selected state, the determining module 403 is specifically configured to: acquiring a target virtual scene selected by a user; and selecting the Bluetooth device corresponding to the target virtual scene as the target Bluetooth device from the at least one connectable Bluetooth device according to the pre-established corresponding relationship between the Bluetooth device and the virtual scene.
Further optionally, the determining module 403 is specifically configured to: when the target virtual scene corresponds to a plurality of connectable Bluetooth devices according to the corresponding relation between the Bluetooth devices and the virtual scene, calculating the relative positions of the plurality of connectable Bluetooth devices; selecting a target Bluetooth device from the plurality of connectable Bluetooth devices according to the relative positions of the plurality of connectable Bluetooth devices.
Further optionally, when the target virtual scene is in the unselected state, the determining module 403 is specifically configured to: calculating a relative position of the at least one connectable bluetooth device; selecting a target Bluetooth device from the at least one connectable Bluetooth device according to the relative position of the at least one connectable Bluetooth device.
Further optionally, as shown in fig. 4b, the apparatus further includes a display module 405, configured to determine a target virtual scene corresponding to the target bluetooth device according to a pre-established correspondence between the bluetooth device and the virtual scene; and displaying the target virtual scene.
Further optionally, the display module 405 is specifically configured to: when the target Bluetooth device is determined to correspond to a plurality of selectable virtual scenes according to the corresponding relation between the Bluetooth device and the virtual scenes, displaying the plurality of selectable virtual scenes to a user; receiving selections made by the user for the plurality of selectable virtual scenes; and taking the virtual scene selected by the user as the target virtual scene.
Further optionally, the scanning module 401 is specifically configured to: when detecting gesture action, head action, eyeball action and/or trigger action of a specific button of a user, determining that the user requests to connect the external equipment; scanning an external Bluetooth signal in response to the request to connect the external device; and acquiring the at least one connectable Bluetooth device according to the scanned external Bluetooth signal.
The Bluetooth device connecting device can execute the Bluetooth device connecting method provided by the embodiment of the application, and has the corresponding functional modules and beneficial effects of the executing method. For technical details that are not described in detail in this embodiment, reference may be made to the method provided in the embodiment of the present application, and details are not described again.
Fig. 4a and 4b illustrate the internal structure and functions of the Bluetooth device connection apparatus. In practice, the apparatus may be implemented as an electronic device; as shown in fig. 5, the electronic device includes: memory 501, processor 502, input device 503, and output device 504.
The memory 501, the processor 502, the input device 503, and the output device 504 may be connected by a bus or other means, and fig. 5 illustrates the bus connection as an example.
The memory 501 is used to store one or more computer instructions and may be configured to store other various data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device.
The memory 501 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
In some embodiments, memory 501 may optionally include memory located remotely from processor 502, which may be connected to the electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
A processor 502, coupled to the memory 501, for executing the one or more computer instructions for performing the bluetooth device connection method provided by the corresponding embodiments of fig. 1-3. The input device 503 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus. The output device 504 may include a display device such as a display screen.
Further, as shown in fig. 5, the electronic device also includes a power supply component 505, which provides power to the various components of the electronic device. The power supply component may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power.
The electronic device can execute the Bluetooth device connection method provided by the embodiment of the application, and has the corresponding functional modules and beneficial effects of the execution method. For technical details that are not described in detail in this embodiment, reference may be made to the method provided in the embodiment of the present application, and details are not described again.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.