CN106341165B - Beam presenting method and device - Google Patents

Beam presenting method and device

Info

Publication number
CN106341165B
CN106341165B (Application CN201510788820.6A)
Authority
CN
China
Prior art keywords
parameter information
information
augmented reality
reality view
determining
Prior art date
Legal status
Active
Application number
CN201510788820.6A
Other languages
Chinese (zh)
Other versions
CN106341165A (en)
Inventor
徐然 (Xu Ran)
魏娜 (Wei Na)
Current Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd filed Critical Beijing Zhigu Ruituo Technology Services Co Ltd
Priority to CN201510788820.6A priority Critical patent/CN106341165B/en
Publication of CN106341165A publication Critical patent/CN106341165A/en
Application granted granted Critical
Publication of CN106341165B publication Critical patent/CN106341165B/en

Classifications

    • H ELECTRICITY · H04 ELECTRIC COMMUNICATION TECHNIQUE · H04B TRANSMISSION
    • H04B7/00 Radio transmission systems, i.e. using radiation field
        • H04B7/02 Diversity systems; Multi-antenna systems, i.e. transmission or reception using multiple antennas
            • H04B7/022 Site diversity; Macro-diversity
            • H04B7/043 Power distribution using best eigenmode, e.g. beam forming or beam steering
            • H04B7/0617 Simultaneous transmission of weighted versions of same signal for beam forming
            • H04B7/0897 Space-time diversity using beamforming per multi-path, e.g. to cope with different directions of arrival [DOA] at different multi-paths

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Power Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a beam presenting method and device, and relates to the field of beamforming technologies. The method comprises: acquiring first parameter information of a beam sent by a beamforming device and sending position information of the beamforming device, and presenting a first augmented reality view according to the first parameter information and the sending position information, wherein the first augmented reality view comprises a first real-world view of the surrounding environment and a simulated image of the beam. The method and device realize a visual presentation of the beam, so that the user can intuitively learn information such as the coverage area of the beam, which improves the user experience.

Description

Beam presenting method and device
Technical Field
The present application relates to the field of beamforming technologies, and in particular, to a beam presenting method and device.
Background
With the development of electronic devices, portable devices such as smart glasses, smart phones, and tablet computers have gradually become popular and are now an essential part of people's lives. Taking the smart phone as an example, people can conveniently use it to make calls, send messages, hail rides, register for appointments, make reservations, pay, and so on.
Near Field Communication (NFC) is one way for electronic devices to communicate with other devices. Since the wireless signals of electronic devices propagate in all directions, communication signals may be intercepted by non-target devices, which poses a security risk. Bluetooth pairing is another scenario electronic devices often face: when many Bluetooth devices are present nearby, an overcrowded pairing list causes trouble for users.
Beamforming technology can make a wireless signal propagate directionally, which overcomes the above problems to a certain extent. However, because the wireless signal is invisible, the user cannot intuitively determine whether the beam covers the target device, so using the beam remains inconvenient.
Disclosure of Invention
The purpose of this application is to provide a beam presenting method and device, so that a user can intuitively learn the coverage area of a beam and conveniently interact with a target device.
According to a first aspect of at least one embodiment of the present application, there is provided a method comprising:
acquiring first parameter information of a beam sent by a beam forming device and sending position information of the beam forming device;
presenting, according to the first parameter information and the sending position information, a first augmented reality view comprising a first real-world view of the surrounding environment and a simulated image of the beam.
With reference to any one of the possible implementation manners of the first aspect, in a second possible implementation manner, the first parameter information includes at least one of the direction, intensity, and width of the beam.
With reference to any one of the possible implementation manners of the first aspect, in a third possible implementation manner, the method further includes:
determining receiving position information of a target device;
and controlling the beam forming device to send the beam according to the receiving position information.
With reference to any one of the possible implementation manners of the first aspect, in a fourth possible implementation manner, the determining the receiving location information of a target device includes:
and determining the receiving position information of the target device according to the gaze point of the user.
With reference to any one of the possible implementation manners of the first aspect, in a fifth possible implementation manner, the controlling the beam forming device to send the beam according to the receiving position information includes:
determining the first parameter information at least according to the receiving position information;
and controlling the beam forming device to send the beam according to the first parameter information.
With reference to any one of the possible implementation manners of the first aspect, in a sixth possible implementation manner, the determining the first parameter information according to at least the receiving location information includes:
determining second parameter information of a beam to be transmitted at least according to the receiving position information;
presenting a second augmented reality view according to the second parameter information and the sending position information, wherein the second augmented reality view comprises a second real-world view of the surrounding environment and a simulated image of the beam to be sent;
determining that the second parameter information is the first parameter information, or adjusting the second parameter information according to the second augmented reality view, and determining the adjusted second parameter information as the first parameter information.
With reference to any one of the possible implementation manners of the first aspect, in a seventh possible implementation manner, the method further includes: presenting, in the first augmented reality view, modulation information of the beam corresponding to the simulated image of the beam.
With reference to any one of the possible implementation manners of the first aspect, in an eighth possible implementation manner, the color of the simulated image of the beam corresponds to the first parameter information.
According to a second aspect of at least one embodiment of the present application, there is provided a beam presenting apparatus, the apparatus including:
an acquisition module, configured to acquire first parameter information of a beam sent by a beam forming device and sending position information of the beam forming device;
a presenting module, configured to present, according to the first parameter information and the sending position information, a first augmented reality view comprising a first real-world view of the surrounding environment and the simulated image of the beam.
With reference to any one of the possible implementation manners of the second aspect, in a second possible implementation manner, the apparatus further includes:
a position determining module, configured to determine the receiving position information of a target device;
and a control module, configured to control the beam forming device to send the beam according to the receiving position information.
With reference to any one of the possible implementation manners of the second aspect, in a third possible implementation manner, the location determining module is configured to determine the receiving location information of the target device according to a gaze point of a user.
With reference to any one of the possible implementation manners of the second aspect, in a fourth possible implementation manner, the control module includes:
a determining unit, configured to determine the first parameter information at least according to the receiving location information;
a control unit, configured to control the beam forming device to transmit the beam according to the first parameter information.
With reference to any one of the possible implementation manners of the second aspect, in a fifth possible implementation manner, the determining unit includes: a first subunit and a second subunit;
the first subunit is configured to determine, at least according to the receiving location information, second parameter information of a beam to be transmitted;
the presenting module is further configured to present a second augmented reality view according to the second parameter information and the sending position information, where the second augmented reality view includes a second real-world view of a surrounding environment and a simulated image of the beam to be sent;
the second subunit is configured to determine that the second parameter information is the first parameter information, or adjust the second parameter information according to the second augmented reality view, and determine the adjusted second parameter information as the first parameter information.
With reference to any one of the possible implementation manners of the second aspect, in a sixth possible implementation manner, the presenting module is further configured to present modulation information of the beam corresponding to the simulated image of the beam.
With reference to any one of the possible implementations of the second aspect, in a seventh possible implementation, the beam presenting device includes the beam forming device.
According to a third aspect of at least one embodiment of the present application, there is provided a user equipment, including:
a memory for storing instructions;
a processor to execute the memory-stored instructions, the instructions to cause the processor to:
acquiring first parameter information of a beam sent by a beam forming device and sending position information of the beam forming device;
presenting, according to the first parameter information and the sending position information, a first augmented reality view comprising a first real-world view of the surrounding environment and a simulated image of the beam.
The method and device of the embodiments of the application acquire first parameter information of a beam sent by a beamforming device and sending position information of the beamforming device, and then present a first augmented reality view according to the first parameter information and the sending position information, wherein the first augmented reality view comprises a first real-world view of the surrounding environment and a simulated image of the beam. This realizes a visual presentation of the beam, enables the user to intuitively learn information such as the coverage area of the beam, and improves the user experience.
Drawings
Fig. 1 is a flowchart of a beam presenting method according to an embodiment of the present application;
Fig. 2 is a schematic illustration of the first augmented reality view in one embodiment;
Fig. 3 is a schematic illustration of the first augmented reality view in another embodiment;
Fig. 4 is a schematic block diagram of a beam presenting apparatus according to an embodiment of the present application;
Fig. 5 is a block diagram of a beam presenting apparatus according to an embodiment of the present application;
Fig. 6 is a block diagram of the control module according to an embodiment of the present application;
Fig. 7 is a block diagram of the determining unit according to an embodiment of the present application;
Fig. 8 is a schematic diagram of the hardware structure of a user equipment according to an embodiment of the present application.
Detailed Description
The following detailed description of embodiments of the present application will be made with reference to the accompanying drawings and examples. The following examples are intended to illustrate the present application but are not intended to limit the scope of the present application.
Those skilled in the art will understand that, in the embodiments of the present application, the sequence numbers of the steps described below do not imply an execution order; the execution order of the steps is determined by their functions and internal logic, and the numbering should not limit the implementation of the embodiments in any way.
The inventor found during research that, when a device interacts with a target device based on beamforming technology, the user cannot determine whether the beam covers the target device because the beam is invisible. Thus, if the user can be shown a simulated image of the beam in space, the user can easily determine whether the beam covers the target device.
Fig. 1 is a flowchart of a beam presenting method according to an embodiment of the present application; the method may be implemented on, for example, a beam presenting device. As shown in fig. 1, the method includes:
s120: acquiring first parameter information of a beam sent by a beam forming device and sending position information of the beam forming device;
s140: presenting, according to the first parameter information and the sending position information, a first augmented reality view comprising a first real-world view of the surrounding environment and a simulated image of the beam.
The method of the embodiments of the application acquires first parameter information of a beam sent by a beamforming device and sending position information of the beamforming device, and then presents a first augmented reality view according to the first parameter information and the sending position information, wherein the first augmented reality view comprises a first real-world view of the surrounding environment and a simulated image of the beam. This realizes a visual presentation of the beam, enables the user to intuitively learn information such as the coverage area of the beam, and improves the user experience.
The functions of steps S120 and S140 will be described in detail below with reference to specific embodiments.
S120: acquiring first parameter information of a beam sent by the beam forming device and sending position information of the beam forming device.
Beamforming enables directional transmission of information and can be used, for example, for pushing advertisement information. A user only needs the communication module on a smart device to receive the signal and obtain rich information content.
The beamforming device may include an antenna array composed of multiple antennas, and the parameters of the beams it transmits may be adjusted by adjusting the phase, angle, power, and so on of the antenna array. The first parameter information is the parameter information corresponding to the beam, which may include at least one of the direction, intensity, and width of the beam. The direction of the beam may, for example, be the extension direction of the beam's main lobe; the intensity of the beam is proportional to the beam's radiation distance; and the width of the beam may, for example, be the maximum extent of the beam perpendicular to the extension direction.
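As a minimal illustrative sketch of how such first parameter information might be represented in software (the class name, field names, and proportionality constant below are assumptions for illustration, not part of the disclosure):

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class BeamParameters:
    """Hypothetical container for the first parameter information."""
    direction: np.ndarray  # unit vector along the main lobe's extension direction
    intensity: float       # described above as proportional to the radiation distance
    width: float           # maximum extent perpendicular to the extension direction

    def radiation_distance(self, k: float = 1.0) -> float:
        # Intensity is proportional to the radiation distance; k is an
        # assumed proportionality constant, not a value from the patent.
        return k * self.intensity
```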
The sending position information may be the position of the beamforming device, which may be an absolute or a relative position.
In one embodiment, the device performing the method is separate from the beamforming device; that is, they are two independent devices. The sending position information may be the position coordinates of the beamforming device in space, or the position of the beamforming device relative to the performing device.
In another embodiment, the device performing the method includes the beamforming device, i.e. both belong to the same device; for example, the performing device is a pair of smart glasses capable of transmitting directional beams. In this case, the sending position information may be the position coordinates of the beamforming device (or the performing device) in space.
S140: presenting, according to the first parameter information and the sending position information, a first augmented reality view comprising a first real-world view of the surrounding environment and a simulated image of the beam.
The first augmented reality view is the augmented reality view presented to the user. It includes both the first real-world view, which reflects the real surrounding environment, and the simulated image of the beam superimposed on that view. The first augmented reality view may be presented through a screen, with the effect that the user feels the beam appears in the real world. The screen may be see-through or non-see-through. When the screen is see-through, the first real-world view is the real world as seen by the user through the screen; when the screen is non-see-through, the first real-world view is a real-world image captured by an image sensor, such as a camera. For simplicity, the following embodiments are mostly described with a see-through screen.
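The handling of see-through versus non-see-through screens can be sketched as follows; the screen and camera objects and their methods are assumed interfaces, not ones defined by the patent:

```python
def present_first_augmented_view(screen, camera, beam_overlay):
    """Draw the first augmented reality view (a sketch under assumed interfaces)."""
    if not screen.is_see_through:
        # Non-see-through screen: the first real-world view is an image
        # captured by an image sensor such as a camera.
        screen.draw(camera.capture_frame())
    # See-through screen: the real world is what the user sees through the
    # screen, so only the simulated image of the beam is drawn on top.
    screen.draw(beam_overlay)
```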
Fig. 2 is a schematic diagram of the first augmented reality view in one embodiment. As shown in fig. 2, the beamforming device is a smartphone 210, and the smartphone 210 transmits a beam 250 (the beam 250 is not actually visible and is drawn in fig. 2 for clarity) toward a target device 220 to interact with it, for example to complete a wireless payment. Meanwhile, the presenting device that presents the first augmented reality view is a pair of smart glasses 230 worn by the user. The smart glasses 230 may communicate with the smartphone 210, for example wirelessly, to obtain the first parameter information of the beam and the position information of the smartphone 210, and then present the first augmented reality view 240 on their lenses according to the first parameter information and the position information. As can be seen, the first augmented reality view 240 includes both a first real-world view and a simulated image of the beam 250 superimposed on it. The first real-world view includes the smartphone 210 and the target device 220, i.e. the real world as seen by the user through the lenses. From the first augmented reality view 240, the user can visually see the coverage of the beam 250, which helps to conveniently and quickly complete the interaction between the smartphone 210 and the target device 220.
Fig. 3 is a schematic diagram of the first augmented reality view in another embodiment. As shown in fig. 3, the beamforming device and the presenting device are integrated into a pair of smart glasses 310; that is, the smart glasses 310 can both send directional beams and present views. The smart glasses 310 transmit a beam 350 (the beam 350 is not actually visible and is drawn in fig. 3 for clarity) toward a target device 320 to interact with it, for example to complete a wireless payment. Since the beam 350 is transmitted by the smart glasses 310 themselves, the smart glasses 310 can easily acquire the first parameter information of the beam and their own position information, and then present the first augmented reality view 340 on their lenses according to the first parameter information and the position information. As can be seen, the first augmented reality view 340 includes both a first real-world view and a simulated image of the beam 350 superimposed on it. The first real-world view includes the target device 320, i.e. the real world as seen by the user through the lenses. From the first augmented reality view 340, the user can visually see the coverage of the beam 350, which helps to conveniently and quickly complete the interaction between the smart glasses 310 and the target device 320.
Those skilled in the art will appreciate that if the smart glasses 310 in the embodiment shown in fig. 3 are replaced with a smart phone whose screen is not transparent, the target device in the first augmented reality view is the image of the target device captured by the camera of the smart phone.
The user can judge from the first augmented reality view whether the beam covers the target device. If it does not, the user can change the position of the beamforming device through actions such as turning the head, turning the body, or moving a step, or can adjust the parameters of the beam so that it covers the target device, completing the interaction as soon as possible.
In one embodiment, the method may further comprise:
s100: determining receiving position information of a target device;
s110: and controlling the beam forming device to send the beam according to the receiving position information.
The receiving position information, i.e. the position information of the target device, may be determined based on sensors of the device performing the method. For example, in one embodiment, the receiving position information may be determined using a depth sensor on the performing device. As another example, in another embodiment, the receiving position may be determined from the wave reflected by the target device, using an ultrasonic sensor on the performing device.
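For the ultrasonic variant, the distance to the target device follows from the round-trip time of the reflected wave; a minimal sketch, assuming a speed of sound of roughly 343 m/s in air:

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 degrees C


def target_distance(round_trip_seconds: float) -> float:
    # The reflected wave travels to the target device and back, so the
    # one-way distance is half the total path length.
    return SPEED_OF_SOUND_M_S * round_trip_seconds / 2.0
```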
In another embodiment, the step S100 may include:
s100': and determining the receiving position information of the target equipment according to the fixation point of the user.
Generally, when a user wants to interact with the target device, the user naturally looks at it, so the method can naturally determine the receiving position information from the user's gazing behavior. With a little practice, the user can simply look at the target device first, and the method determines the receiving position information of the target device from the gaze point.
In one embodiment, the user's gaze point may be determined as follows: acquire an image of the user's fundus in real time; adjust the imaging parameters of the optical path between the eye and the image acquisition position until the clearest image is captured; process the captured image to obtain the equivalent focal length and the line-of-sight direction of the eye at the moment the clearest image is captured; obtain the actual focus distance of the eye from the equivalent focal length; and obtain the position of the gaze point from the line-of-sight direction and the actual focus distance. The method may determine the position of the gaze point as the position of the target device.
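The last two steps of this procedure reduce to simple vector geometry; in the sketch below, the function and argument names are assumptions rather than notation from the patent:

```python
import numpy as np


def gaze_point(eye_position: np.ndarray,
               sight_direction: np.ndarray,
               focus_distance: float) -> np.ndarray:
    """Project the actual focus distance along the line of sight from the
    eye position; the result is taken as the target device's position."""
    unit = sight_direction / np.linalg.norm(sight_direction)
    return eye_position + focus_distance * unit
```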
In one embodiment, the step S110 may include:
s111: determining the first parameter information at least according to the receiving position information;
s112: and controlling the beam forming equipment to send the beam according to the first parameter information.
As described earlier, the first parameter information includes at least one of the direction, intensity, and width of the beam. When the beamforming device sends the beam according to the first parameter information, the aim is for the beam to accurately cover the target device. Therefore, the first parameter information needs to be determined according to the receiving position information.
In one embodiment, the device performing the method is separate from the beamforming device. In this case, besides the receiving position information, the performing device may also need the sending position information to determine the first parameter information. That is, the performing device needs to know the position of the target device relative to the beamforming device in order to determine the first parameter information.
In another case, the device performing the method includes the beamforming device, so the performing device can obtain the position of the target device relative to the beamforming device directly from the receiving position information, and can then determine the first parameter information.
For example, in the application scenario shown in fig. 2, the user naturally gazes at the target device 220 first, and the smart glasses 230 may determine the position information of the target device 220, i.e. the receiving position information, from the user's gaze point. The smart glasses 230 may acquire the position information of the smart phone, i.e. the sending position information, by communicating with the smart phone 210. From the receiving position information and the sending position information, the smart glasses 230 may obtain the position of the target device relative to the smart phone 210 and then determine the first parameter information. The first parameter information may then be sent to the smart phone 210, which sends out the beam accordingly.
As another example, in the application scenario shown in fig. 3, the user naturally gazes at the target device 320 first. The smart glasses 310 may determine the position information of the target device 320, i.e. the receiving position information, from the user's gaze point, determine the first parameter information from the receiving position information, and then directly send the beam according to the first parameter information.
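A sketch of how the first parameter information might be derived from the sending and receiving positions; the rules mapping distance to intensity and target size to width are illustrative assumptions, since the patent does not prescribe exact formulas:

```python
import numpy as np


def derive_first_parameters(sending_position, receiving_position,
                            target_radius_m: float = 0.1):
    """Aim the beam from the beamforming device at the target device."""
    offset = np.asarray(receiving_position) - np.asarray(sending_position)
    distance = np.linalg.norm(offset)
    direction = offset / distance     # main lobe points at the target
    intensity = distance              # proportional to the required reach
    width = 2.0 * target_radius_m     # just wide enough to cover the device
    return direction, intensity, width
```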
In another embodiment, the step S111 may further include:
s1111: determining second parameter information of a beam to be transmitted at least according to the receiving position information;
s1112: presenting a second augmented reality view according to the second parameter information and the sending position information, wherein the second augmented reality view comprises a second real-world view of the surrounding environment and a simulated image of the beam to be sent;
s1113: determining that the second parameter information is the first parameter information, or adjusting the second parameter information according to the second augmented reality view, and determining the adjusted second parameter information as the first parameter information.
In this embodiment, before the beam is sent, an image of the beam to be sent is simulated to generate a preview. From the preview, the user can determine whether the beam to be sent meets the requirements and, if it does not, adjust the parameters of the beam to be sent until it does.
In step S1111, the beam to be sent can be understood as a beam that may be sent: if it meets the requirements, the user sends it; otherwise, the user needs to adjust it. The second parameter information is the parameter information corresponding to the beam to be sent, and it is determined in a manner similar to that of the first parameter information in the foregoing embodiments.
In step S1112, the manner of presenting the second augmented reality view is similar to that of presenting the first augmented reality view in the foregoing embodiments, except that the simulated image of the beam to be sent in the second augmented reality view is not generated from a beam that actually exists, but is derived from information such as the parameters of the beam to be sent. The second augmented reality view can be understood as a preview shown before the beam is sent.
The second augmented reality view may be presented on the same screen as the first augmented reality view. Similarly, when the screen is see-through, the second real-world view is the real world as seen by the user through the screen; when the screen is non-see-through, the second real-world view is a real-world image captured by an image sensor, such as a camera.
When the second parameter information of the beam to be sent is determined at least according to the receiving position information, the beam to be sent may fail to meet the actual requirements because of positioning errors and other influences; for example, it may not cover the target device, which is generally caused by unreasonable settings in the second parameter information. Through steps S1111 and S1112, the user can spot such problems in the second augmented reality view and adjust the parameters before the beam is actually sent, so that the finally sent beam meets the requirements.
In step S1113, if the beam to be sent meets the requirement, the second parameter information can be directly determined as the first parameter information; if it does not, the second parameter information can be adjusted according to the second augmented reality view, and the adjusted second parameter information is determined as the first parameter information. This step can be understood as determining the first parameter information from the second augmented reality view and the second parameter information. The requirement may be set according to the user's needs; for example, it may simply require that the beam cover the target device, or additionally require that the beam cover the target device while the width of the beam is smaller than a predetermined value.
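Steps S1111 to S1113 amount to a preview-and-adjust loop, sketched below; render_preview, meets_requirement, the widening step, and the round limit are assumed placeholders:

```python
def choose_first_parameters(second_params, render_preview, meets_requirement,
                            widen_step: float = 0.05, max_rounds: int = 10):
    """Preview the beam to be sent and adjust the second parameter
    information until it meets the requirement (e.g. covers the target
    device); the result is taken as the first parameter information."""
    direction, intensity, width = second_params
    for _ in range(max_rounds):
        preview = render_preview(direction, intensity, width)  # 2nd AR view
        if meets_requirement(preview):
            break
        width += widen_step  # one possible adjustment: widen the beam
    return direction, intensity, width
```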
Those skilled in the art understand that after the first parameter information is determined, the beam may be actually transmitted according to the first parameter information.
In addition, those skilled in the art understand that the method may also send the beam directly, determine from the first augmented reality view whether the beam meets the requirement, and, if it does not, adjust the first parameter information appropriately. This approach is more direct and effectively shortens the processing time. With the approach of adjusting the beam to be sent according to the second augmented reality view, however, no beam is actually sent during the adjustment, which significantly reduces the power consumption of the beamforming device and improves the security of the interaction.
In one embodiment, the method may further comprise:
s150: and presenting, in the first augmented reality view, modulation information of the beam corresponding to the simulated image of the beam.
The modulation information may be determined according to the application executed by the beamforming device. For example, when the beamforming device runs a payment application, modulation information such as the payment object and the payment amount may be presented; as another example, when the beamforming device runs an access control application, modulation information such as the access time and the user name may be presented.
In one embodiment, the color of the simulated image of the beam may correspond to the first parameter information, so that the user can roughly understand the first parameter information from the color. For example, when the intensity of the beam is greater than a first threshold, the beam is shown in red; when the intensity of the beam is less than the first threshold, it is shown in green.
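This color rule is a one-line mapping; a sketch in which the threshold value is an assumption:

```python
def beam_color(intensity: float, first_threshold: float = 1.0) -> str:
    # Red when the beam is stronger than the first threshold, green
    # otherwise (the threshold value itself is assumed, not specified).
    return "red" if intensity > first_threshold else "green"
```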
Furthermore, embodiments of the present application also provide a computer-readable medium comprising computer-readable instructions that, when executed, perform the operations of steps S120 and S140 of the method in the embodiment shown in fig. 1.
In summary, when the beamforming device interacts with the target device, the method can present a simulated image of the beam within the real environment, automatically determine the parameters of the beam from the user's gazing behavior, and generate a preview with which the user can adjust the beam parameters, thereby effectively improving the efficiency and security of the interaction, reducing power consumption, and improving the user experience.
Fig. 4 is a schematic block diagram of the beam presenting device according to an embodiment of the present application. The beam presenting device may be deployed as a functional module in a smart wearable device such as a smart watch, a smart phone, or smart glasses, or may operate as an independent device that implements the corresponding functions by communicating with other electronic devices. The apparatus 400 may include:
an obtaining module 410, configured to obtain first parameter information of a beam sent by a beamforming device and sending position information of the beamforming device;
a presenting module 420, configured to present, according to the first parameter information and the sending position information, a first augmented reality view comprising a first real-world view of the surrounding environment and the simulated image of the beam.
The device of the embodiments of the application acquires first parameter information of a beam sent by a beamforming device and sending position information of the beamforming device, and then presents a first augmented reality view according to the first parameter information and the sending position information, wherein the first augmented reality view comprises a first real-world view of the surrounding environment and a simulated image of the beam. The beam is thereby presented visually, making it convenient for the user to interact between devices through the directional beam.
The functions of the obtaining module 410 and the presenting module 420 are described in detail below with reference to specific embodiments.
The obtaining module 410 is configured to obtain first parameter information of a beam sent by a beamforming device and sending position information of the beamforming device.
The beamforming device may include an antenna array composed of multiple antennas, and the parameters of the beams it transmits may be adjusted by adjusting the phase, angle, power, and so on of the antenna array. The first parameter information is the parameter information corresponding to the beam, which may include at least one of the direction, intensity, and width of the beam. The direction of the beam may, for example, be the extension direction of the beam's main lobe; the intensity of the beam is proportional to the beam's radiation distance; and the width of the beam may, for example, be the maximum extent of the beam perpendicular to the extension direction.
The sending position information may be the position of the beamforming device, which may be an absolute or a relative position.
In one embodiment, the beam presenting device is separate from the beamforming device; that is, they are two independent devices. The sending position information may be the position coordinates of the beamforming device in space, or the position of the beamforming device relative to the beam presenting device.
In another embodiment, the beam presenting device includes the beamforming device, i.e. both belong to the same device; for example, the beam presenting device is a pair of smart glasses capable of transmitting directional beams. In this case, the sending position information may be the position coordinates of the beamforming device (or the beam presenting device) in space.
The presenting module 420 is configured to present a first augmented reality view according to the first parameter information and the sending position information, where the first augmented reality view includes a first real-world view of the surrounding environment and the simulated image of the beam.
The first augmented reality view is the augmented reality view presented to the user. It includes both the first real-world view, which reflects the real surrounding environment, and the simulated image of the beam superimposed on that view. The first augmented reality view may be presented through a screen, with the effect that the user feels the beam appears in the real world. The screen may be see-through or non-see-through. When the screen is see-through, the first real-world view is the real world as seen by the user through the screen; when the screen is non-see-through, the first real-world view is a real-world image captured by an image sensor, such as a camera. For simplicity, the following embodiments are mostly described with a see-through screen.
The user can judge from the first augmented reality view whether the beam covers the target device. If it does not, the user can change the position of the beamforming device through actions such as turning the head, turning the body, or moving a step, or can adjust the parameters of the beam so that it covers the target device, completing the interaction as soon as possible.
In one embodiment, referring to fig. 5, the apparatus 400 further comprises:
a location determining module 430, configured to determine receiving location information of a target device;
a control module 440, configured to control the beam forming device to emit the beam according to the receiving location information.
The receiving position information, i.e. the position information of the target device, may be determined based on sensors of the device. For example, in one embodiment, the position determining module 430 may determine the receiving position information via a depth sensor. As another example, in another embodiment, the position determining module 430 may determine the receiving position from the wave reflected by the target device, using an ultrasonic sensor.
In another embodiment, the location determining module 430 is configured to determine the receiving location information of the target device according to a user's gaze point.
In general, when a user wants to interact with the target device, the user naturally looks at it, so the device 400 can determine the receiving position information from the user's gazing behavior. With a little practice, the user can simply look at the target device first, and the device determines the receiving position information of the target device from the gaze point.
In one embodiment, the position determining module 430 may include: an acquisition unit, configured to acquire an image of the user's fundus in real time; an adjusting unit, configured to adjust the imaging parameters of the optical path between the eye and the image acquisition position until the clearest image is captured; a processing unit, configured to process the captured image to obtain the equivalent focal length and the line-of-sight direction of the eye at the moment the clearest image is captured; a distance unit, configured to obtain the actual focus distance of the eye from the equivalent focal length; and a position unit, configured to obtain the position of the gaze point from the line-of-sight direction and the actual focus distance. The position of the gaze point is the position of the target device.
In one embodiment, referring to fig. 6, the control module 440 includes:
a determining unit 441, configured to determine the first parameter information according to at least the receiving location information;
a control unit 442, configured to control the beam forming device to transmit the beam according to the first parameter information.
As described earlier, the first parameter information includes at least one of the direction, intensity, and width of the beam. When the beamforming device sends the beam according to the first parameter information, the aim is for the beam to accurately cover the target device. Therefore, the first parameter information needs to be determined according to the receiving position information.
In one embodiment, the beam presenting device is separate from the beamforming device. In this case, besides the receiving position information, the beam presenting device may also need the sending position information to determine the first parameter information. That is, the beam presenting device needs to know the position of the target device relative to the beamforming device in order to determine the first parameter information.
In another case, the beam presenting device includes the beamforming device, so it can obtain the position of the target device relative to the beamforming device directly from the receiving position information, and can then determine the first parameter information.
In another embodiment, referring to fig. 7, the determining unit 441 includes: a first sub-unit 4411 and a second sub-unit 4412;
the first subunit 4411 is configured to determine, according to at least the receiving position information, second parameter information of a beam to be transmitted;
the presenting module 420 is further configured to present a second augmented reality view according to the second parameter information and the sending position information, where the second augmented reality view includes a second real world view of a surrounding environment and a simulated image of the beam to be sent;
the second subunit 4412 is configured to determine that the second parameter information is the first parameter information, or adjust the second parameter information according to the second augmented reality view, and determine the adjusted second parameter information as the first parameter information.
In this embodiment, before the beam is sent, an image of the beam to be sent is simulated to generate a preview. From the preview, the user can determine whether the beam to be sent meets the requirements and, if it does not, adjust the parameters of the beam to be sent until it does.
The beam to be sent can be understood as a beam that may be sent: if it meets the requirements, the user sends it; otherwise, the user needs to adjust it. The second parameter information is the parameter information corresponding to the beam to be sent, and it is determined in a manner similar to that of the first parameter information in the foregoing embodiments.
The manner of presenting the second augmented reality view is similar to that of presenting the first augmented reality view in the foregoing embodiments, except that the simulated image of the beam to be sent in the second augmented reality view is not generated from a beam that actually exists, but is derived from information such as the parameters of the beam to be sent. The second augmented reality view can be understood as a preview shown before the beam is sent.
The second augmented reality view may be presented on the same screen as the first augmented reality view. Similarly, when the screen is see-through, the second real-world view is the real world as seen by the user through the screen; when the screen is non-see-through, the second real-world view is a real-world image captured by an image sensor, such as a camera.
When the second parameter information of the beam to be sent is determined at least according to the receiving position information, the beam to be sent may fail to meet the actual requirements because of positioning errors and other influences; for example, it may not cover the target device, which is generally caused by unreasonable settings in the second parameter information. Through the processing of the first subunit 4411 and the presenting module 420, the user can spot such problems in the second augmented reality view and adjust the parameters of the beam before it is actually sent, so that the finally sent beam meets the requirements.
If the beam to be sent meets the requirement, the second parameter information can be directly determined as the first parameter information; if it does not, the second parameter information can be adjusted according to the second augmented reality view, and the adjusted second parameter information is determined as the first parameter information. The second subunit 4412 is thus configured to determine the first parameter information from the second augmented reality view and the second parameter information. The requirement may be set according to the user's needs; for example, it may simply require that the beam cover the target device, or additionally require that the beam cover the target device while the width of the beam is smaller than a predetermined value.
Those skilled in the art understand that after the first parameter information is determined, the beam may be actually transmitted according to the first parameter information.
In addition, those skilled in the art understand that the device may also send the beam directly, determine from the first augmented reality view whether the beam meets the requirement, and, if it does not, adjust the first parameter information appropriately. This approach is more direct and effectively shortens the processing time. With the approach of adjusting the beam to be sent according to the second augmented reality view, however, no beam is actually sent during the adjustment, which significantly reduces the power consumption of the beamforming device and improves the security of the interaction.
In one embodiment, the presenting module 420 is further configured to present modulation information of the beam corresponding to the simulated image of the beam.
The modulation information may be determined according to the application executed by the beamforming device. For example, when the beamforming device runs a payment application, modulation information such as the payment object and the payment amount may be presented; as another example, when the beamforming device runs an access control application, modulation information such as the access time and the user name may be presented.
In one embodiment, the color of the simulated image of the beam may correspond to the first parameter information, so that the user can roughly understand the first parameter information from the color.
In conclusion, when the beamforming device interacts with the target device, the device can present a simulated image of the beam within the real environment, automatically determine the parameters of the beam from the user's gazing behavior, and generate a preview with which the user can adjust the beam parameters, thereby effectively improving the efficiency and security of the interaction, reducing power consumption, and improving the user experience.
Fig. 8 shows the hardware structure of a user equipment according to an embodiment of the present application. The specific embodiments of the present application do not limit the specific implementation of the user equipment. Referring to fig. 8, the user equipment 800 may include:
a processor 810, a communication interface 820, a memory 830, and a communication bus 840, wherein:
processor 810, communication interface 820, and memory 830 communicate with one another via a communication bus 840.
A communication interface 820 for communicating with other network elements.
The processor 810 is configured to execute the program 832, and may specifically perform the relevant steps in the method embodiment shown in fig. 1.
In particular, the program 832 may include program code comprising computer operational instructions.
The processor 810 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 830 stores a program 832. The memory 830 may comprise high-speed RAM and may also include non-volatile memory, such as at least one disk memory. The program 832 may specifically perform the following steps:
acquiring first parameter information of a beam sent by a beam forming device and sending position information of the beam forming device;
presenting, according to the first parameter information and the sending position information, a first augmented reality view comprising a first real-world view of the surrounding environment and a simulated image of the beam.
For the specific implementation of each step in the program 832, reference may be made to the corresponding steps or modules in the above embodiments, which are not described again here. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the devices and modules described above may refer to the corresponding process descriptions in the foregoing method embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the portion thereof that substantially contributes over the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a controller, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are merely illustrative and not restrictive. Those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the present application, so all equivalent technical solutions also fall within the scope of the present application, which is defined by the appended claims.

Claims (9)

1. A method for beam rendering, the method comprising:
acquiring first parameter information of a beam sent by a beam forming device and sending position information of the beam forming device, wherein the first parameter information comprises at least one of the direction, the intensity, and the width of the beam;
presenting, according to the first parameter information and the sending position information, a first augmented reality view comprising a first real-world view of the surrounding environment and a simulated image of the beam.
2. The method of claim 1, wherein the method further comprises:
determining receiving position information of a target device; and
controlling the beamforming device to transmit the beam according to the receiving position information.
3. The method of claim 2, wherein the determining receiving position information of a target device comprises:
determining the receiving position information of the target device according to a gaze point of a user.
4. The method of claim 2, wherein the controlling the beamforming device to transmit the beam according to the receiving position information comprises:
determining the first parameter information according to at least the receiving position information; and
controlling the beamforming device to transmit the beam according to the first parameter information.
5. The method of claim 4, wherein the determining the first parameter information according to at least the receiving position information comprises:
determining second parameter information of a beam to be transmitted according to at least the receiving position information;
presenting a second augmented reality view according to the second parameter information and the transmission location information, wherein the second augmented reality view comprises a second real-world view of the surrounding environment and a simulated image of the beam to be transmitted; and
determining the second parameter information as the first parameter information, or adjusting the second parameter information according to the second augmented reality view and determining the adjusted second parameter information as the first parameter information.
6. A beam presenting device, characterized in that the device comprises:
an acquisition module configured to acquire first parameter information of a beam transmitted by a beamforming device and transmission location information of the beamforming device, wherein the first parameter information comprises at least one of a direction, a strength, and a width of the beam; and
a presenting module configured to present a first augmented reality view according to the first parameter information and the transmission location information, the first augmented reality view comprising a first real-world view of the surrounding environment and a simulated image of the beam.
7. The device of claim 6, wherein the device further comprises:
a position determining module configured to determine receiving position information of a target device; and
a control module configured to control the beamforming device to transmit the beam according to the receiving position information.
8. A user device, characterized in that the user device comprises the beam presenting device of any one of claims 6 to 7.
9. A user equipment, the user equipment comprising:
a memory configured to store instructions; and
a processor configured to execute the instructions stored in the memory, wherein the instructions cause the processor to perform:
acquiring first parameter information of a beam transmitted by a beamforming device, and transmission location information of the beamforming device, wherein the first parameter information comprises at least one of a direction, a strength, and a width of the beam;
presenting, according to the first parameter information and the transmission location information, a first augmented reality view comprising a first real-world view of the surrounding environment and a simulated image of the beam.
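Purely as an illustration of the control flow recited in claims 2 to 5, the following sketch reuses present_beam_ar_view from the sketch above; every other name (gaze_tracker, plan_beam, user_confirms, and so on) is again a hypothetical stand-in, not anything defined in this disclosure.

    def steer_beam_with_preview(beamformer, ar_display, gaze_tracker):
        # Claim 3: derive the receiving position information of the target
        # device from the user's gaze point.
        rx_position = gaze_tracker.position_at_gaze_point()

        # Claim 5: determine tentative "second parameter information" for the
        # beam to be transmitted, from the receiving position information.
        second_params = beamformer.plan_beam(target=rx_position)
        tx_position = beamformer.query_position()

        # Claim 5: present a second augmented reality view containing a
        # simulated image of the beam to be transmitted.
        present_beam_ar_view(ar_display, second_params, tx_position)

        # Claim 5: either take the second parameters directly as the first
        # parameter information, or adjust them against the preview until
        # the user confirms.
        while not ar_display.user_confirms():
            second_params = ar_display.user_adjustment(second_params)
            present_beam_ar_view(ar_display, second_params, tx_position)

        # Claims 2 and 4: control the beamforming device to transmit the
        # beam according to the determined first parameter information.
        beamformer.transmit(second_params)

The point of the loop is that the beam to be transmitted is rendered before any energy is radiated, so the user can verify that the simulated beam covers the gazed-at target device and correct its direction, strength, or width first. Claims 6 and 7 package the same steps as acquisition, presenting, position determining, and control modules.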

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510788820.6A CN106341165B (en) 2015-11-17 2015-11-17 Beam presenting method and device

Publications (2)

Publication Number Publication Date
CN106341165A CN106341165A (en) 2017-01-18
CN106341165B (en) 2020-02-04

Family

ID=57827186

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510788820.6A Active CN106341165B (en) 2015-11-17 2015-11-17 Beam presenting method and device

Country Status (1)

Country Link
CN (1) CN106341165B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11700043B2 (en) * 2018-11-13 2023-07-11 Nokia Solutions And Networks Oy Beamforming monitoring apparatus

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102196608A (en) * 2010-03-17 2011-09-21 Fujitsu Ltd. Wireless base station and communication method
CN102667912A (en) * 2009-11-21 2012-09-12 Douglas Peter Magyari Head mounted display device
CN102753104A (en) * 2009-11-09 2012-10-24 SonoSite, Inc. Systems and methods for beam enhancement
CN102798847A (en) * 2011-05-23 2012-11-28 Sony Corporation Beam forming device and method
CN103067088A (en) * 2011-10-24 2013-04-24 RIT Technologies Ltd. Method and system for indoor wireless optical link
CN103576148A (en) * 2012-08-07 2014-02-12 Institute of Electronics, Chinese Academy of Sciences Method for simulating satellite-borne SAR range ambiguity noise image
CN103828257A (en) * 2011-09-29 2014-05-28 Samsung Electronics Co., Ltd. Method and apparatus for short handover latency in wireless communication system using beam forming
CN104079748A (en) * 2013-03-29 2014-10-01 Konica Minolta, Inc. Job information display device
CN104463270A (en) * 2014-11-12 2015-03-25 Huizhou TCL Mobile Communication Co., Ltd. Intelligent terminal, financial card and financial management system based on RFID
CN104918020A (en) * 2015-06-17 2015-09-16 Song Kefan Unmanned aerial vehicle driving simulation display unit

Similar Documents

Publication Publication Date Title
US10948983B2 (en) System and method for utilizing gaze tracking and focal point tracking
US20210011550A1 (en) Machine learning based gaze estimation with confidence
US10620699B2 (en) Head mounted display, mobile information terminal, image processing apparatus, display control program, display control method, and display system
EP3293723A1 (en) Method, storage medium, and electronic device for displaying images
EP3341818B1 (en) Method and apparatus for displaying content
CN111556305B (en) Image processing method, VR device, terminal, display system and computer-readable storage medium
EP3729378B1 (en) Head-mounted display device and method thereof
US20230230334A1 (en) Electronic device and method for displaying notification about external object
CN112262361A (en) Method and system for gaze-based control of mixed reality content
EP3521978B1 (en) Apparatus and method for tracking a focal point in a head mounted display system
KR20150007643A (en) Method and apparatus For Connecting Devices Using Eye-tracking
WO2017021587A1 (en) Sharing mediated reality content
EP3086216A1 (en) Mobile terminal and controlling method thereof
CN112596247A (en) Image display method and device and head-mounted display equipment
EP3974950A1 (en) Interactive method and apparatus in virtual reality scene
EP4060646A1 (en) Augmented reality display for macular degeneration
JP6341759B2 (en) Head-mounted information display device and control method for head-mounted information display device
CN106341165B (en) Beam presenting method and device
US20200257360A1 (en) Method for calculating a gaze convergence distance
CN108140080B (en) Display method, device and system
JP6598575B2 (en) Method and program for controlling head mounted display system
CN116724548A (en) Electronic device for displaying content and method of operating the same
JP2018160249A (en) Head-mount display system, head-mount display, display control program, and display control method
CN107850654B (en) Initiating execution of active scan
JP6941715B2 (en) Display device, display program, display method and display system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant