WO2024051579A1 - Control method for AR glasses screen display and components thereof - Google Patents

Control method for AR glasses screen display and components thereof (一种AR眼镜画面显示的控制方法及其组件)

Info

Publication number
WO2024051579A1
WO2024051579A1 (PCT/CN2023/116257)
Authority
WO
WIPO (PCT)
Prior art keywords
target
rendering
glasses
viewing angle
rendering device
Prior art date
Application number
PCT/CN2023/116257
Other languages
English (en)
French (fr)
Inventor
史高建
张超
Original Assignee
歌尔科技有限公司 (Goertek Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 歌尔科技有限公司 (Goertek Technology Co., Ltd.)
Publication of WO2024051579A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Definitions

  • This application relates to the technical field of smart wearable devices, and in particular to a control method and components for AR glasses screen display.
  • Augmented Reality (AR) technology is a technology that cleverly integrates virtual information with the real world.
  • AR technology is widely used in many aspects of daily life, such as transportation, medical care, home, and education.
  • For example, in interior design such as house renovation, AR technology can present the final effect through AR equipment before the renovation begins.
  • AR glasses equipment has developed from bulky all-in-one machines to compact devices with the size and appearance of myopia glasses.
  • As AR glasses become lighter and smaller, the memory size and CPU computing power available in AR glasses devices are correspondingly limited, causing some complex scenes to lag during rendering and presentation, or even fail to display, giving users a poor experience.
  • In view of this, the purpose of this application is to provide a control method and components for AR glasses screen display, so as to avoid the inability to render complex scenes when the AR glasses' own computing power is insufficient, and to improve the user experience.
  • To this end, this application provides a control method for AR glasses screen display, including: allocating corresponding viewing angle ranges to preset rendering devices; obtaining the glasses' own current pose data; judging, according to the current pose data, whether a viewing angle range meets a preset condition; and, if so, sending a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that it renders the current picture to obtain a target display picture.
  • After each of the target display pictures is obtained, the target display pictures are fused and displayed.
  • Preferably, determining whether the viewing angle range meets a preset condition based on the current pose data includes: judging, according to the current pose data, whether the viewing angle range intersects the glasses' own field of view.
  • Preferably, the method for controlling screen display of AR glasses further includes: if it is determined that the field of view does not intersect the viewing angle range, sending a sleep instruction to the first target rendering device and returning to the step of obtaining the glasses' own current pose data.
  • Preferably, rendering the current picture includes: determining the percentage of the viewing angle range occupied by the intersection of the field of view and the viewing angle range; determining, according to the percentage, the proportion of the current picture to render; and rendering the current picture according to the proportion.
  • Preferably, before the target display pictures are fused and displayed, the method further includes: determining whether each target display picture meets display conditions, the display conditions at least including the clarity reaching a preset value and the number of obtained target display pictures reaching a target number; if the conditions are met, entering the fusing and displaying step.
  • If the conditions are not met, viewing angle ranges are reallocated to the rendering devices, and the step of obtaining the glasses' own current pose data is entered.
  • Preferably, the method for controlling screen display of AR glasses further includes:
  • upon receiving a target trigger instruction, reallocating corresponding viewing angle ranges to second target rendering devices; the target trigger instruction is an instruction generated when any of the rendering devices disconnects and/or terminates its rendering task and/or runs low on battery, and a second target rendering device is a device that has not triggered the target trigger instruction.
  • Preferably, allocating corresponding viewing angle ranges to the preset rendering devices includes: obtaining the performance parameters (at least including memory and computing power) and the current application scenario of each rendering device, and selecting a target number of rendering devices according to the current application scenario.
  • After the target number of rendering devices is sorted by performance, viewing angle ranges from large to small are assigned to the corresponding rendering devices in order of performance from high to low.
  • Correspondingly, this application also provides a control device for AR glasses screen display, including:
  • an allocation module, used to allocate corresponding viewing angle ranges to preset rendering devices;
  • an acquisition module, used to obtain the glasses' own current pose data;
  • a judgment module, used to judge, according to the current pose data, whether a viewing angle range meets the preset condition, and if so, to call the sending module;
  • the sending module is configured to send rendering instructions to the first target rendering device whose viewing angle range meets preset conditions, so that the first target rendering device renders the current picture to obtain the target display picture;
  • a fusion module is configured to fuse and display each of the target display images after acquiring each of the target display images.
  • Correspondingly, this application also provides AR glasses, including a memory for storing a computer program;
  • a processor configured to implement the steps of the control method for AR glasses screen display when executing the computer program.
  • The present application also provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the control method for AR glasses screen display are implemented.
  • A method for controlling screen display of AR glasses includes: allocating corresponding viewing angle ranges to preset rendering devices; obtaining the glasses' own current pose data; judging, according to the acquired pose data, whether a viewing angle range meets the preset condition; if so, sending the rendering command to the first target rendering device whose viewing angle range meets the preset condition, so that it renders the current picture to obtain a target display picture; and, after each target display picture is obtained, fusing and displaying the target display pictures.
  • Thus, when the viewing angle range corresponding to a preset rendering device meets the preset condition, the rendering device completes the picture rendering work of the AR glasses, and the target display pictures rendered by the first target rendering devices, once obtained, are fused and displayed. This avoids the picture lag or rendering failure caused by insufficient computing power when the AR glasses render by themselves, improving picture rendering capability while enhancing the user experience.
  • In addition, this application also provides a control device for AR glasses screen display, AR glasses, and a medium, corresponding to the above control method for AR glasses screen display and having the same effects.
  • Figure 1 is a flow chart of a control method for AR glasses screen display provided by an embodiment of the present application
  • Figure 2 is a schematic diagram of a viewing angle range provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of AR glasses screen rendering provided by an embodiment of the present application.
  • Figure 4 is a structural diagram of a control device for AR glasses screen display provided by an embodiment of the present application.
  • Figure 5 is a structural diagram of AR glasses provided by another embodiment of the present application.
  • In this application, unless otherwise clearly specified, terms such as "connection" and "fixing" should be understood in a broad sense.
  • For example, "fixing" can be a fixed connection, a detachable connection, or an integral body; it can be a mechanical connection or an electrical connection; it can be a direct connection or an indirect connection through an intermediate medium; and it can be an internal connection between two elements or an interactive relationship between two elements, unless otherwise clearly limited.
  • The core of this application is to provide a control method and components for AR glasses screen display: viewing angle ranges are allocated to preset rendering devices; when a viewing angle range meets the preset condition, a rendering instruction is sent to the corresponding rendering device, which completes the rendering work of the AR glasses; and after the target display pictures rendered by the rendering devices are obtained, they are fused and displayed, overcoming the problem that AR glasses cannot render complex scenes due to insufficient computing power.
  • AR technology is widely used in many aspects of daily life, such as transportation, medical care, home, and education.
  • For example, when carrying out interior design such as house renovation, AR equipment can present the final effect in advance, before the renovation begins.
  • AR glasses equipment has developed from bulky all-in-one machines to compact devices with the size and appearance of myopia glasses.
  • As AR glasses become lighter and smaller, the memory size and CPU computing power available in AR glasses devices are correspondingly limited, causing some complex scenes to lag during rendering and presentation, or even fail to display.
  • To solve this, embodiments of the present application provide a method for controlling the screen display of AR glasses: preset rendering devices assist in completing the rendering tasks of the AR glasses, and the AR glasses fuse and display the target display pictures rendered by each rendering device, thereby avoiding the problem of the glasses themselves being unable to render complex scenes due to insufficient computing power.
  • Figure 1 is a flow chart of a method for controlling screen display of AR glasses provided by an embodiment of the present application. As shown in Figure 1, the method includes:
  • S10: Allocate corresponding viewing angle ranges to preset rendering devices. In a specific embodiment, after the AR glasses start, the 6DOF positioning and tracking system of the AR glasses establishes a three-dimensional Euler coordinate system with the AR glasses as the origin.
  • In addition, the glasses connect to the preset rendering devices through wireless streaming technology and assign a corresponding viewing angle range to each rendering device. It should be noted that the viewing angle ranges are allocated based on the three-dimensional Euler coordinate system and characterize the scene each rendering device renders within its range.
  • It is understandable that the AR glasses themselves have a fixed field of view.
  • As the AR glasses move in three-dimensional space, when the field of view intersects a rendering device's viewing angle range, that rendering device renders the scene image within the range. Therefore, after the three-dimensional Euler coordinate system is established, different rendering devices are preset to render scenes in different angle ranges; that is, each rendering device is assigned a different viewing angle range. A minimal sketch of such an allocation follows.
  • It should be noted that the rendering device can be a mobile phone, a tablet, a laptop, etc., which is not limited in this application.
  • Likewise, this application does not limit the number of devices selected.
  • When the scene to be rendered is simple, for example a single picture, and the AR glasses themselves can complete the rendering task, there is no need to borrow a rendering device. For a more complex scene, for example a scene in a video, a rendering device is needed; if the computing power of one rendering device is sufficient, then when a viewing angle range is assigned to that device, it is responsible for rendering scenes at all angles of the three-dimensional space.
  • When rendering complex scenes such as large games, multiple rendering devices may be required for auxiliary rendering; when viewing angle ranges are assigned, different rendering devices are responsible for different ranges.
  • It is worth noting that when multiple devices assist with a complex scene, the viewing angle ranges are allocated according to each device's computing power: a device with stronger computing power is allocated a wider viewing angle range, while a device with weaker computing power is allocated a narrower one.
  • S11: Obtain the glasses' own current pose data. On the basis of the three-dimensional Euler coordinate system established in step S10, the current pose data of the AR glasses is obtained in real time.
  • The pose data includes the AR glasses' current position information, that is, the current coordinate data,
  • and attitude information, that is, the angle at which the AR glasses are currently oriented. A compact sketch of such a pose record follows.
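  • As a hedged illustration only, the 6DOF pose described above might be carried in a record like the following; the name `Pose6DOF` and the yaw/pitch/roll convention are assumptions of this sketch, not the patent's wording:

```python
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    # Position of the glasses in the Euler coordinate system whose origin
    # was fixed at the glasses' startup location.
    x: float
    y: float
    z: float
    # Attitude: the angles at which the glasses are currently oriented, in degrees.
    yaw: float    # rotation in the horizontal plane
    pitch: float  # rotation in the vertical plane
    roll: float   # tilt around the viewing axis
```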
  • S12: Determine whether a viewing angle range meets the preset condition based on the current pose data; if so, proceed to step S13.
  • S13: Send a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that it renders the current picture to obtain the target display picture.
  • S14: After each target display picture is obtained, fuse and display the target display pictures.
  • From the AR glasses' current pose data and the glasses' own field of view, it can be determined whether the glasses currently intersect the viewing angle ranges the rendering devices are responsible for; the preset condition is, specifically, that the field of view of the AR glasses intersects the viewing angle range of a rendering device.
  • As the AR glasses move and rotate, the field of view moves with them; the rendering device whose range the field of view intersects becomes the first target rendering device, which then renders the corresponding scene to obtain the target display picture.
  • There may be one or more first target rendering devices. That is to say, during the movement and rotation of the AR glasses, the field of view may intersect multiple viewing angle ranges; in this case, there are multiple corresponding first target rendering devices.
  • When rendering, a first target rendering device first determines the percentage of its viewing angle range occupied by the intersection with the AR glasses' field of view, determines from this percentage the proportion of the current picture it should render, and then renders according to this proportion. A sketch of this intersection test follows.
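  • Building on the `ViewRange` and `Pose6DOF` sketches above, one plausible way to compute the intersection percentage is to model the field of view as a rectangle of angles centered on the current yaw/pitch. The function below is an assumption-laden illustration, not the patent's algorithm:

```python
def overlap_1d(a_min: float, a_max: float, b_min: float, b_max: float) -> float:
    """Length of the overlap of two closed intervals, 0.0 if they are disjoint."""
    return max(0.0, min(a_max, b_max) - max(a_min, b_min))

def intersection_percentage(pose: Pose6DOF, fov_h: float, fov_v: float,
                            rng: ViewRange) -> float:
    """Fraction of the device's viewing angle range covered by the glasses' FOV."""
    h = overlap_1d(pose.yaw - fov_h / 2, pose.yaw + fov_h / 2, rng.h_min, rng.h_max)
    v = overlap_1d(pose.pitch - fov_v / 2, pose.pitch + fov_v / 2, rng.v_min, rng.v_max)
    range_area = (rng.h_max - rng.h_min) * (rng.v_max - rng.v_min)
    return (h * v) / range_area if range_area else 0.0
```

  • A non-zero result marks the device as a first target rendering device; the value itself would drive the rendering proportion described above.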
  • After the target display pictures rendered by each first target rendering device are obtained, it is determined whether the clarity of each target display picture reaches a preset value and whether the number of target display pictures reaches the target number. That is to say, when there are multiple first target rendering devices, it is checked whether the clarity of each rendered picture meets the requirement and whether as many target display pictures as there are first target rendering devices have been obtained. When the number and clarity of the target display pictures meet the display conditions, the target display pictures are fused and displayed; a sketch of this gate appears below.
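  • A minimal sketch of that pre-fusion gate, assuming each received picture carries a measured clarity score; the dictionary shape and threshold parameters are hypothetical:

```python
def meets_display_conditions(pictures: list[dict], target_count: int,
                             min_clarity: float) -> bool:
    """Both conditions named above must hold before fusion: no picture is
    missing, and every received picture is at least as clear as the preset value."""
    return (len(pictures) == target_count and
            all(p["clarity"] >= min_clarity for p in pictures))
```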
  • When the AR glasses' field of view intersects none of the rendering devices' viewing angle ranges, the wireless streaming connection between the AR glasses and each rendering device can be disconnected, causing the rendering devices to enter sleep mode.
  • During a rendering task, if one or more devices run low on battery, or a fault terminates the rendering task, the target trigger command of the AR glasses is triggered.
  • The AR glasses then reallocate viewing angle ranges, based on the three-dimensional Euler coordinate system, to the rendering devices currently in a normal state; a sketch follows.
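  • Reusing `split_evenly` from the earlier sketch, the reallocation on a target trigger command could look like this; again a hypothetical illustration, since the patent does not prescribe an even re-split:

```python
def on_target_trigger(devices: list[str], failed: set[str]) -> dict[str, ViewRange]:
    """Redistribute the whole angular space among the devices still in a normal
    state, i.e. the 'second target rendering devices' that did not trigger."""
    healthy = [d for d in devices if d not in failed]
    if not healthy:
        raise RuntimeError("no rendering device left; fall back to on-glasses rendering")
    return split_evenly(healthy)
```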
  • The method for controlling screen display of AR glasses thus includes: allocating corresponding viewing angle ranges to preset rendering devices; obtaining the glasses' own current pose data; judging, according to the acquired pose data, whether a viewing angle range meets the preset condition; if so, sending the rendering command to the first target rendering device whose range meets the condition, so that it renders the current picture to obtain the target display picture; and, after each target display picture is obtained, fusing and displaying the target display pictures.
  • When the viewing angle range corresponding to a preset rendering device meets the preset condition, the rendering device completes the picture rendering work of the AR glasses, and the target display pictures rendered by the first target rendering devices, once obtained, are fused and displayed, avoiding picture freezes or rendering failure due to insufficient computing power when the AR glasses render by themselves. This improves picture rendering capability and enhances the user experience.
  • In a specific embodiment, when the AR glasses' field of view intersects a rendering device's viewing angle range, the rendering device corresponding to the intersected range is taken as the first target rendering device.
  • The first target rendering device then renders the current picture to obtain the target display picture. For ease of understanding, an example follows.
  • Figure 2 is a schematic diagram of a viewing angle range provided by an embodiment of the present application.
  • As shown in Figure 2, the viewing angle range View A assigned by the AR glasses to rendering device A spans -30° to 30° in the horizontal direction and -30° to 30° in the vertical direction, and the field of view of the AR glasses is FOV.
  • When the user moves the AR glasses through three-dimensional space and the glasses' field of view intersects View A, the AR glasses start a wireless streaming connection with rendering device A.
  • Rendering device A then renders the target display picture and transmits it to the AR glasses for display. The worked example below runs the earlier sketch on exactly these numbers.
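  • Running the hedged `intersection_percentage` sketch on the Figure 2 numbers, with a hypothetical 60° x 40° FOV assumed purely for illustration:

```python
# View A from Figure 2: -30°..30° horizontally and vertically.
view_a = ViewRange(-30.0, 30.0, -30.0, 30.0)

# Glasses looking straight ahead: the FOV covers all 60° of View A horizontally
# and 40° of its 60° vertically, i.e. 2400/3600 of the range.
pose = Pose6DOF(x=0, y=0, z=0, yaw=0.0, pitch=0.0, roll=0.0)
print(f"{intersection_percentage(pose, 60.0, 40.0, view_a):.0%}")  # -> 67%

# Turning the head 45° to the right leaves a 15°-wide horizontal sliver,
# so rendering device A is still a first target rendering device.
turned = Pose6DOF(x=0, y=0, z=0, yaw=45.0, pitch=0.0, roll=0.0)
print(intersection_percentage(turned, 60.0, 40.0, view_a) > 0)  # -> True
```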
  • The control method for screen display of AR glasses provided by the embodiment of the present application allocates to each rendering device a corresponding viewing angle range in three-dimensional space.
  • When the AR glasses' field of view intersects a rendering device's range, the corresponding rendering device renders the picture to obtain the target display picture. The rendering devices thus assist in completing the rendering tasks of the AR glasses, preventing the glasses' insufficient computing power from making complex scenes unrenderable, and thereby improving the user experience.
  • On the basis of the above, to save resources, when the AR glasses move to a position where the field of view no longer intersects the viewing angle range, a sleep command is sent to the first target rendering device,
  • so that the first target rendering device disconnects the wireless streaming connection with the AR glasses and enters a sleep mode to save power.
  • Of course, the AR glasses continue to obtain their own current pose data and determine in real time whether the field of view intersects any rendering device's viewing angle range.
  • In this way, if it is determined that the field of view does not intersect the viewing angle range, a sleep command is sent to the first target rendering device so that it enters sleep mode, thereby saving power, ensuring how long the rendering device can render images after a full charge, and further enhancing the user experience.
  • Figure 3 is a schematic diagram of AR glasses screen rendering provided by an embodiment of the present application.
  • In Figure 3, each rendering device renders one third of the target display picture.
  • Correspondingly, rendering device A, rendering device B, and rendering device C evenly divide all angles of the three-dimensional space; that is, the sum of the viewing angle ranges of the rendering devices is the entire three-dimensional space.
  • When the field of view intersects a device's range, the rendering device renders the corresponding picture according to the intersection ratio.
  • In Figure 3, rendering device A is responsible for rendering the leftmost portion of the target display picture, and the corresponding viewing angle range is View A.
  • When the field of view FOV of the AR glasses intersects View A and the intersecting portion occupies 50% of View A, rendering device A renders half of the leftmost portion of the target display picture.
  • In the control method for AR glasses screen display provided by the embodiment of the present application, when the current picture is rendered, the percentage of the viewing angle range occupied by the intersection of the field of view and that range is determined first, the rendering proportion is determined from the percentage, and the current picture is finally rendered according to that proportion.
  • The picture can thus be rendered according to the intersection between the AR glasses and the rendering device's viewing angle range, which prevents portions that cannot be seen within the glasses' field of view from being rendered and improves the reliability of the AR glasses' picture rendering.
  • After the first target rendering device renders the current picture to obtain the target display picture, the target display pictures must be fused for display to the user.
  • Before fusion, it is necessary to determine whether each target display picture meets the display conditions: the clarity of each target display picture reaches the preset value and the number of target display pictures reaches a threshold. If the display conditions are met, the target display pictures can be fused and displayed.
  • If the clarity of a target display picture does not reach the preset value, the computing power of the current rendering device may be insufficient.
  • In that case, the area of the picture each rendering device is responsible for rendering must be reallocated; that is,
  • the viewing angle ranges corresponding to the rendering devices are redistributed.
  • After redistribution, the AR glasses' own current pose data must be reacquired to determine the intersection between the glasses' current orientation in three-dimensional space and the reassigned viewing angle ranges, and the picture is then re-rendered.
  • It may also be that the number of target display pictures does not reach the target number.
  • For example, if three rendering devices perform rendering tasks but the AR glasses obtain only two target display pictures, the display conditions are obviously not met.
  • The rendering device whose target display picture was not obtained may be faulty or low on battery.
  • The viewing angle ranges then need to be reassigned; understandably, they are reassigned only to the rendering devices that have not failed.
  • In addition, the information of the device that did not render successfully must be transmitted to the management end so that maintenance personnel can troubleshoot in time.
  • It should be noted that the display conditions may include, but are not limited to, the clarity of each target display picture reaching a preset value and the number of target display pictures reaching a target number; this application does not limit the display conditions.
  • The control method for AR glasses screen display therefore determines, before fusing and displaying the target display pictures, whether each picture meets the display conditions. If so, the pictures are fused and displayed; if not, viewing angle ranges are reassigned to the rendering devices, and the glasses' own current pose data is reacquired to determine the intersection between the current field of view and the reassigned ranges. This ensures the accuracy of the AR glasses' target display and further improves the user experience.
  • In the above embodiment, when the target display pictures do not meet the display conditions, the rendering devices' viewing angle ranges must be reallocated.
  • In addition, when the AR glasses receive a target trigger instruction, the viewing angle ranges must also be reallocated.
  • the target triggering instruction is an instruction generated when any rendering device disconnects and/or terminates the rendering task and/or the battery is low, and the second target rendering device is a device that does not trigger the target triggering instruction.
  • The conditions for triggering the target trigger instruction may include, but are not limited to, those mentioned above; any factor that prevents completion of the current rendering task can be such a condition, which is not limited in this application.
  • When a rendering device is low on power or malfunctions, it may be unable to complete the current rendering task. At this time, the viewing angle ranges must be reassigned to the second target rendering devices, that is, to the rendering devices that have not triggered the target trigger instruction. After the reallocation, the current pose data of the AR glasses must be reacquired in order to re-determine the intersection of the field of view with the reassigned viewing angle ranges.
  • In the control method provided by the embodiment of the present application, when any rendering device disconnects and/or terminates its rendering task and/or runs low on battery, the target trigger instruction is triggered; after it is received, corresponding viewing angle ranges are reassigned to the second target rendering devices, avoiding rendering failure due to the failure of any single rendering device and further improving the rendering success rate.
  • When allocating ranges, the performance parameters and current application scenario of each rendering device are obtained, where the performance parameters at least include memory and computing power, and a target number of rendering devices is selected according to the current application scenario. It is understandable that different rendering scenarios require different numbers of rendering devices: relatively simple scenarios need fewer devices, while large-scale game scenarios need more. A corresponding number of rendering devices must therefore be selected according to the rendering scene.
  • After sorting the selected devices by performance, viewing angle ranges are assigned from large to small in order of performance from high to low: a higher-performance rendering device can be allocated a relatively large range, and conversely, a lower-performance device is allocated a smaller one. A sketch of this policy follows.
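  • A hedged sketch of that allocation policy, reusing `ViewRange` from the earlier sketch and collapsing memory plus computing power into a single hypothetical score per device:

```python
def allocate_by_performance(scores: dict[str, float],
                            h_span: tuple[float, float] = (-180.0, 180.0)) -> dict[str, ViewRange]:
    """Assign horizontal sectors whose widths are proportional to each device's
    performance score, walking from the highest-scoring device downward."""
    total = sum(scores.values())
    ranges, cursor = {}, h_span[0]
    for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        width = (h_span[1] - h_span[0]) * score / total
        ranges[name] = ViewRange(cursor, cursor + width)
        cursor += width
    return ranges

# Example: a laptop outscoring a tablet outscoring a phone gets the widest sector.
# allocate_by_performance({"laptop": 4.0, "tablet": 2.0, "phone": 1.0})
```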
  • In the control method for AR glasses screen display provided by the embodiments of the present application, when corresponding viewing angle ranges are allocated to the preset rendering devices, the allocation is made according to each device's performance and the current application scenario,
  • so that high-performance devices render more of the picture, increasing the rate at which pictures are rendered.
  • In the above embodiments, the method for controlling the screen display of AR glasses has been described in detail.
  • the present application also provides a corresponding embodiment of a control device for the screen display of AR glasses. It should be noted that this application describes the embodiments of the device part from two perspectives, one is based on the functional module perspective, and the other is based on the hardware structure perspective.
  • FIG 4 is a structural diagram of a control device for AR glasses screen display provided by an embodiment of the present application. As shown in Figure 4, the device includes:
  • an allocation module 10, used to allocate corresponding viewing angle ranges to preset rendering devices;
  • an acquisition module 11, used to obtain the glasses' own current pose data;
  • a judgment module 12, used to judge, based on the current pose data, whether a viewing angle range meets the preset condition, and if so, to call the sending module 13;
  • the sending module 13 is used to send rendering instructions to the first target rendering device whose viewing angle range meets the preset conditions, so that the first target rendering device renders the current picture to obtain the target display picture;
  • the fusion module 14 is used to fuse and display each target display screen after acquiring each target display screen. Since the embodiments of the device part correspond to the embodiments of the method part, please refer to the description of the embodiments of the method part for the embodiments of the device part, and will not be described again here.
  • The control device for AR glasses screen display provided by the embodiment of the present application allocates corresponding viewing angle ranges to preset rendering devices, obtains the glasses' own current pose data, judges according to the acquired pose data whether a viewing angle range meets the preset condition, and if so sends the rendering command to the first target rendering device whose range meets the condition, so that it renders the current picture to obtain the target display picture; after each target display picture is obtained, the pictures are fused and displayed.
  • Thus, when the viewing angle range corresponding to a preset rendering device meets the preset condition, the rendering device completes the picture rendering work of the AR glasses, and the target display pictures rendered by the first target rendering devices, once obtained, are fused and displayed, avoiding picture freezes or rendering failure due to insufficient computing power when the AR glasses render by themselves. This improves the picture rendering capability and enhances the user experience.
  • FIG. 5 is a structural diagram of AR glasses provided by another embodiment of the present application. As shown in Figure 5, the AR glasses include: a memory 20 for storing computer programs;
  • the processor 21 is configured to implement the steps of the control method for AR glasses screen display as mentioned in the above embodiment when executing a computer program.
  • The processor 21 may include one or more processing cores, such as a 4-core or 8-core processor.
  • The processor 21 can be implemented in at least one hardware form among a digital signal processor (DSP), a field-programmable gate array (FPGA), and a programmable logic array (PLA).
  • The processor 21 may also include a main processor and a coprocessor:
  • the main processor is the processor used to process data in the awake state, also called the central processing unit (CPU); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 21 may be integrated with a graphics processor (Graphics Processing Unit, GPU for short), and the GPU is responsible for rendering and drawing content to be displayed on the display screen.
  • the processor 21 may also include an artificial intelligence (Artificial Intelligence, AI for short) processor, which is used to process computing operations related to machine learning.
  • The memory 20 may include one or more computer-readable storage media, which may be non-transitory.
  • The memory 20 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices.
  • the memory 20 is at least used to store the following computer program 201. After the computer program is loaded and executed by the processor 21, the relevant steps of the control method for AR glasses screen display disclosed in any of the foregoing embodiments can be implemented.
  • the resources stored in the memory 20 may also include the operating system 202, data 203, etc., and the storage method may be short-term storage or permanent storage.
  • the operating system 202 may include Windows, Unix, Linux, etc.
  • Data 203 may include, but is not limited to, relevant data involved in the control method of AR glasses screen display.
  • AR glasses may also include a display screen 22, an input and output interface 23, a communication interface 24, a power supply 25 and a communication bus 26.
  • The structure shown in Figure 5 does not constitute a limitation on the AR glasses, which may include more or fewer components than shown in the figure.
  • The AR glasses provided by the embodiments of the present application include a memory and a processor.
  • When the processor executes the program stored in the memory, it can implement the control method for AR glasses screen display described above.
  • In these glasses, when the viewing angle range corresponding to a preset rendering device meets the preset condition, the rendering device completes the picture rendering work of the AR glasses, and the target display pictures rendered by the first target rendering devices, once obtained, are fused and displayed, avoiding picture freezes or rendering failure due to insufficient computing power when the AR glasses render by themselves. This improves picture rendering capability and enhances the user experience.
  • Finally, this application also provides a corresponding embodiment of a computer-readable storage medium.
  • A computer program is stored on the computer-readable storage medium;
  • when the computer program is executed by a processor, the steps recorded in the above method embodiments are implemented.
  • If the methods in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, can be embodied in the form of a software product; the computer software product is stored in a storage medium and performs all or part of the steps of the methods described in the various embodiments of this application.
  • The aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, read-only memory (ROM), random access memory (RAM), a magnetic disk, or an optical disk.

Abstract

This application relates to the technical field of smart wearable devices and discloses a control method for AR glasses screen display and components thereof, including: allocating corresponding viewing angle ranges to preset rendering devices; obtaining the glasses' own current pose data; judging, according to the current pose data, whether a viewing angle range meets a preset condition; if so, sending a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that it renders the current picture to obtain a target display picture; and, after each target display picture is obtained, fusing and displaying the target display pictures. The rendering devices thus complete the picture rendering work of the AR glasses, and the target display pictures rendered by the first target rendering devices, once obtained, are fused and displayed, avoiding the picture lag or rendering failure caused by insufficient computing power when the AR glasses render by themselves, improving picture rendering capability while enhancing the user experience.

Description

Control method for AR glasses screen display and components thereof
This application claims priority to the Chinese patent application filed with the China Patent Office on September 6, 2022, with application number 202211083940.2 and invention title "Control method for AR glasses screen display and components thereof", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of smart wearable devices, and in particular to a control method for AR glasses screen display and components thereof.
Background
Augmented Reality (AR) technology cleverly integrates virtual information with the real world. AR technology is widely used in many aspects of daily life, such as transportation, medical care, home, and education; for example, in interior design such as house renovation, AR technology can present the final effect through AR equipment before the renovation begins.
In recent years, with the rapid development of AR technology, AR glasses have evolved from bulky all-in-one machines into compact devices with the size and appearance of ordinary prescription glasses. As AR glasses become lighter and smaller, the memory size and CPU computing power available in them are correspondingly limited, causing some complex scenes to lag during rendering and presentation, or even fail to display, giving users a poor experience.
It can be seen that solving the problem that AR glasses cannot render and present complex scenes, and thereby improving the user experience, is an urgent task for those skilled in the art.
Summary of the Invention
The purpose of this application is to provide a control method for AR glasses screen display and components thereof, to avoid the inability to render complex scenes when the AR glasses' own computing power is insufficient, and to improve the user experience.
To solve the above technical problem, this application provides a control method for AR glasses screen display, including:
allocating corresponding viewing angle ranges to preset rendering devices;
obtaining the glasses' own current pose data;
judging, according to the current pose data, whether a viewing angle range meets a preset condition;
if so, sending a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that the first target rendering device renders the current picture to obtain a target display picture;
after each of the target display pictures is obtained, fusing and displaying the target display pictures.
Preferably, judging, according to the current pose data, whether the viewing angle range meets the preset condition includes:
judging, according to the current pose data, whether the viewing angle range intersects the glasses' own field of view.
Preferably, the control method for AR glasses screen display further includes:
if it is determined that the field of view does not intersect the viewing angle range, sending a sleep instruction to the first target rendering device and returning to the step of obtaining the glasses' own current pose data.
Preferably, rendering the current picture includes:
determining the percentage of the viewing angle range occupied by the intersection of the field of view and the viewing angle range;
determining, according to the percentage, the proportion of the current picture to render;
rendering the current picture according to the proportion.
Preferably, before fusing and displaying the target display pictures, the method further includes:
determining whether each of the target display pictures meets display conditions, where the display conditions at least include the clarity reaching a preset value and the number of obtained target display pictures reaching a target number;
if the display conditions are met, entering the step of fusing and displaying the target display pictures;
if the display conditions are not met, reallocating viewing angle ranges to the rendering devices and entering the step of obtaining the glasses' own current pose data.
Preferably, the control method for AR glasses screen display further includes:
upon receiving a target trigger instruction, reallocating corresponding viewing angle ranges to second target rendering devices, where the target trigger instruction is an instruction generated when any of the rendering devices disconnects and/or terminates its rendering task and/or runs low on battery, and a second target rendering device is a device that has not triggered the target trigger instruction.
Preferably, allocating corresponding viewing angle ranges to the preset rendering devices includes:
obtaining the performance parameters and current application scenario of each of the rendering devices, where the performance parameters at least include memory and computing power;
selecting a target number of rendering devices according to the current application scenario;
after sorting the target number of rendering devices by performance, assigning viewing angle ranges from large to small to the corresponding rendering devices in order of performance from high to low.
To solve the above technical problem, this application also provides a control device for AR glasses screen display, including:
an allocation module, used to allocate corresponding viewing angle ranges to preset rendering devices;
an acquisition module, used to obtain the glasses' own current pose data;
a judgment module, used to judge, according to the current pose data, whether a viewing angle range meets a preset condition, and if so, to call a sending module;
the sending module, used to send a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that the first target rendering device renders the current picture to obtain a target display picture;
a fusion module, used to fuse and display the target display pictures after each of the target display pictures is obtained.
To solve the above technical problem, this application also provides AR glasses, including a memory, used to store a computer program;
and a processor, used to implement the steps of the control method for AR glasses screen display when executing the computer program.
To solve the above technical problem, this application also provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the control method for AR glasses screen display are implemented.
The control method for AR glasses screen display provided by this application includes: allocating corresponding viewing angle ranges to preset rendering devices; obtaining the glasses' own current pose data; judging, according to the obtained current pose data, whether a viewing angle range meets a preset condition; if so, sending a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that the first target rendering device renders the current picture to obtain a target display picture; and, after each target display picture is obtained, fusing and displaying the target display pictures. It can be seen that in the technical solution provided by this application, when the viewing angle range corresponding to a preset rendering device meets the preset condition, the rendering device completes the picture rendering work of the AR glasses, and the target display pictures rendered by the first target rendering devices, once obtained, are fused and displayed. This avoids the picture lag or rendering failure caused by insufficient computing power when the AR glasses render by themselves, improving picture rendering capability while enhancing the user experience.
In addition, this application also provides a control device for AR glasses screen display, AR glasses, and a medium, corresponding to the above control method for AR glasses screen display and having the same effects.
Brief Description of the Drawings
To explain the embodiments of this application more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Figure 1 is a flow chart of a control method for AR glasses screen display provided by an embodiment of this application;
Figure 2 is a schematic diagram of a viewing angle range provided by an embodiment of this application;
Figure 3 is a schematic diagram of AR glasses picture rendering provided by an embodiment of this application;
Figure 4 is a structural diagram of a control device for AR glasses screen display provided by an embodiment of this application;
Figure 5 is a structural diagram of AR glasses provided by another embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application will be described clearly and completely below in conjunction with the drawings in the embodiments. Obviously, the described embodiments are only some of the embodiments of this application, not all of them. Based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of this application.
It should be noted that all directional indications in the embodiments of this application (such as up, down, left, right, front, back, ...) are only used to explain the relative positional relationships, movements, and so on between the components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indication changes accordingly.
In this application, unless otherwise clearly specified and limited, terms such as "connection" and "fixing" should be understood in a broad sense. For example, "fixing" can be a fixed connection, a detachable connection, or an integral body; it can be a mechanical connection or an electrical connection; it can be a direct connection or an indirect connection through an intermediate medium; and it can be an internal connection between two elements or an interactive relationship between two elements, unless otherwise clearly limited. For those of ordinary skill in the art, the specific meanings of the above terms in this application can be understood according to the specific situation.
In addition, descriptions involving "first", "second", and the like in this application are for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly specifying the number of the indicated technical features. Thus, features limited by "first" and "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the various embodiments can be combined with each other, but this must be on the basis that those of ordinary skill in the art can realize them; when a combination of technical solutions is contradictory or cannot be realized, it should be considered that such a combination does not exist and is not within the scope of protection claimed by this application.
The core of this application is to provide a control method for AR glasses screen display and components thereof: viewing angle ranges are allocated to preset rendering devices; when a viewing angle range meets the preset condition, a rendering instruction is sent to the corresponding rendering device, which completes the rendering work for the AR glasses; and after the target display pictures rendered by the rendering devices are obtained, they are fused and displayed, overcoming the problem that AR glasses cannot render complex scenes due to insufficient computing power.
To enable those skilled in the art to better understand the solution of this application, this application is further described in detail below in conjunction with the drawings and specific embodiments.
With the continuous development of science and technology and people's rising demand for quality of life, AR technology is widely used in many aspects of daily life, such as transportation, medical care, home, and education; for example, in interior design such as house renovation, AR technology can present the final effect through AR equipment before the renovation begins.
In recent years, with the rapid development of AR technology, AR glasses have evolved from bulky all-in-one machines into compact devices with the size and appearance of ordinary prescription glasses. As AR glasses become lighter and smaller, the memory size and CPU computing power available in them are correspondingly limited, causing some complex scenes to lag during rendering and presentation, or even fail to display.
To solve the problem that the AR glasses' own insufficient computing power prevents complex scenes from being rendered, improve the picture rendering capability of AR glasses, and enhance the user experience, an embodiment of this application provides a control method for AR glasses screen display: preset rendering devices assist in completing the rendering tasks of the AR glasses, and the AR glasses fuse and display the target display pictures rendered by the rendering devices, thereby avoiding the problem that the AR glasses cannot render complex scenes due to insufficient computing power.
Figure 1 is a flow chart of a control method for AR glasses screen display provided by an embodiment of this application. As shown in Figure 1, the method includes:
S10: Allocate corresponding viewing angle ranges to preset rendering devices.
In a specific embodiment, after the AR glasses start, the 6DOF positioning and tracking system of the AR glasses establishes a three-dimensional Euler coordinate system with the AR glasses as the origin. In addition, the glasses connect to the preset rendering devices through wireless streaming technology and allocate a corresponding viewing angle range to each rendering device. It should be noted that the viewing angle ranges are allocated based on the three-dimensional Euler coordinate system and characterize the scene each rendering device renders within its range.
It is understandable that the AR glasses themselves have a fixed field of view; as the AR glasses move in three-dimensional space, when the field of view intersects a rendering device's viewing angle range, that rendering device renders the scene image within the range. Therefore, after the three-dimensional Euler coordinate system is established, different rendering devices are preset to render scenes in different angle ranges; that is, each rendering device is allocated a different viewing angle range.
It should be noted that a rendering device can be a mobile phone, a tablet, a laptop, or another device; this application does not limit the type, nor does it limit the number of devices selected. Of course, it is understandable that when the scene to be rendered is simple, for example a single picture, and the AR glasses can complete the rendering task themselves, there is no need to borrow a rendering device. When rendering a more complex scene, for example a scene in a video, a rendering device is needed; if the computing power of one rendering device is sufficient, that device is allocated a viewing angle range covering all angles of the three-dimensional space. When rendering complex scenes such as large games, multiple rendering devices may be required for auxiliary rendering, and when the viewing angle ranges are allocated, different rendering devices are responsible for different ranges.
It is worth noting that when multiple devices assist in rendering a complex scene, the viewing angle ranges are allocated according to each device's computing power: a device with stronger computing power is allocated a wider viewing angle range, while a device with weaker computing power is allocated a narrower one.
S11: Obtain the glasses' own current pose data.
On the basis of the three-dimensional Euler coordinate system established in step S10, the current pose data of the AR glasses is obtained in real time based on that coordinate system. It is understandable that the pose data includes the AR glasses' current position information, i.e. the current coordinate data, and attitude information, i.e. the angle at which the AR glasses are currently oriented.
S12: Judge, according to the current pose data, whether a viewing angle range meets the preset condition; if so, proceed to step S13.
S13: Send a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that the first target rendering device renders the current picture to obtain a target display picture.
S14: After each target display picture is obtained, fuse and display the target display pictures.
According to the AR glasses' own current pose data obtained in step S11, it is judged whether a viewing angle range meets the preset condition; if so, a rendering instruction is sent to the first target rendering device whose viewing angle range meets the preset condition, and that device renders the current picture to obtain a target display picture. It is understandable that, from the AR glasses' current pose data and the glasses' own field of view, it can be determined whether the glasses currently intersect the viewing angle ranges the different rendering devices are responsible for. The preset condition is therefore, specifically, that the AR glasses' own field of view intersects a rendering device's viewing angle range.
As the AR glasses move and rotate in three-dimensional space, their field of view moves with them. When the field of view intersects a viewing angle range pre-allocated to a rendering device, that rendering device is the first target rendering device, and it renders the corresponding scene to obtain the target display picture.
It should be noted that there may be one or more first target rendering devices. That is, during the movement and rotation of the AR glasses, the field of view may intersect multiple viewing angle ranges; in that case there are multiple corresponding first target rendering devices. When rendering, a first target rendering device first determines what percentage of its viewing angle range is occupied by the intersection with the AR glasses' field of view, determines from this percentage the proportion of the current picture it should render, and then renders according to that proportion.
After the target display pictures rendered by the first target rendering devices are obtained, it is determined whether the clarity of each target display picture reaches a preset value and whether the number of target display pictures reaches the target number. That is, when there are multiple first target rendering devices, it is checked whether the clarity of each rendered target display picture meets the requirement and whether as many target display pictures as there are first target rendering devices have been obtained. When the number and clarity of the target display pictures meet the display conditions, the target display pictures are fused and displayed.
When the AR glasses' field of view intersects none of the rendering devices' viewing angle ranges, the wireless streaming connections between the AR glasses and the rendering devices can be disconnected so that the rendering devices enter sleep mode.
During a rendering task, if one or more devices run low on battery or fail and the rendering task is terminated, the AR glasses' target trigger instruction is triggered; the AR glasses then reallocate viewing angle ranges, based on the three-dimensional Euler coordinate system, to the rendering devices currently in a normal state.
The control method for AR glasses screen display provided by this embodiment of the application includes: allocating corresponding viewing angle ranges to preset rendering devices; obtaining the glasses' own current pose data; judging, according to the obtained current pose data, whether a viewing angle range meets a preset condition; if so, sending a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that it renders the current picture to obtain a target display picture; and, after each target display picture is obtained, fusing and displaying the target display pictures. It can be seen that in this solution, when the viewing angle range corresponding to a preset rendering device meets the preset condition, the rendering device completes the picture rendering work of the AR glasses, and the target display pictures rendered by the first target rendering devices, once obtained, are fused and displayed, avoiding the picture lag or rendering failure caused by insufficient computing power when the AR glasses render by themselves, improving picture rendering capability while enhancing the user experience.
In a specific embodiment, when the AR glasses' own field of view intersects a rendering device's viewing angle range, the rendering device corresponding to the intersected range is taken as the first target rendering device; the first target rendering device then renders the current picture to obtain the target display picture. For ease of understanding, an example follows.
Figure 2 is a schematic diagram of a viewing angle range provided by an embodiment of this application. As shown in Figure 2, in the three-dimensional Euler coordinate system, the viewing angle range View A that the AR glasses allocate to rendering device A spans -30° to 30° horizontally and -30° to 30° vertically, and the AR glasses' field of view is FOV. When the user wears the AR glasses and moves them through three-dimensional space, if the glasses' field of view intersects rendering device A's View A, the AR glasses start a wireless streaming connection with rendering device A; rendering device A renders the target display picture and transmits it to the AR glasses for display.
The control method for AR glasses screen display provided by this embodiment allocates to each rendering device a corresponding viewing angle range in three-dimensional space; when the AR glasses' field of view intersects a rendering device's viewing angle range, the corresponding rendering device renders the picture to obtain the target display picture. It can be seen that the rendering devices assist in completing the AR glasses' rendering tasks, preventing the glasses' insufficient computing power from making complex scenes unrenderable and thereby improving the user experience.
On the basis of the above embodiment, to save resources and guarantee the usage time of a rendering device per full charge, when the AR glasses move through three-dimensional space to a position where the field of view no longer intersects the viewing angle range, a sleep instruction is sent to the first target rendering device, so that it disconnects the wireless streaming connection with the AR glasses and enters sleep mode to save power. Of course, the AR glasses continue to obtain their own current pose data and determine in real time whether the field of view intersects any rendering device's viewing angle range.
In fact, in some scenarios certain angles of the three-dimensional space need no picture rendering while the user wears the AR glasses. That is, if some angles of the space are allocated to no device, then when the glasses' field of view enters such an angle, i.e. intersects no rendering device's viewing angle range, the rendering devices enter the sleep state to save power.
In the control method for AR glasses screen display provided by this embodiment, if it is determined that the field of view does not intersect the viewing angle range, a sleep instruction is sent to the first target rendering device so that it enters sleep mode, thereby saving power and guaranteeing how long a rendering device can render pictures after a full charge, further improving the user experience.
It is understandable that when viewing angle ranges are allocated to the rendering devices, each range is related to the size of the picture to be rendered. Figure 3 is a schematic diagram of AR glasses picture rendering provided by an embodiment of this application. For example, in Figure 3 each rendering device renders one third of the target display picture; correspondingly, in three-dimensional space, rendering device A, rendering device B, and rendering device C evenly divide all angles of the space, i.e. the sum of their viewing angle ranges is the entire three-dimensional space.
In this case, when the AR glasses intersect the viewing angle range corresponding to any rendering device, the portion of that range occupied by the intersection with the glasses' field of view is determined first, the proportion of the current picture to render is then determined from that percentage, and the rendering device renders the corresponding picture according to the proportion.
For example, in Figure 3, rendering device A is responsible for rendering the leftmost portion of the target display picture, corresponding to viewing angle range View A. When the AR glasses' field of view FOV intersects View A and the intersection occupies 50% of View A, rendering device A renders half of the leftmost portion of the target display picture.
In the control method for AR glasses screen display provided by this embodiment, when the current picture is rendered, the percentage of the viewing angle range occupied by the intersection of the field of view and that range is determined first, the rendering proportion is determined from the percentage, and the current picture is finally rendered according to the proportion. The picture can thus be rendered according to the intersection between the AR glasses and the rendering device's viewing angle range, preventing portions that the glasses' field of view cannot see from being rendered and improving the reliability of the AR glasses' picture rendering.
On the basis of the above embodiment, after the first target rendering device renders the current picture to obtain the target display picture, the target display pictures must be fused for presentation to the user. To improve rendering accuracy, before fusion it is necessary to determine whether each target display picture meets the display conditions, where the display conditions are that the clarity of each target display picture reaches the preset value and the number of target display pictures reaches the threshold. If the display conditions are met, the target display pictures can be fused and displayed.
When the display conditions are not met, the clarity of some target display picture may be below the preset value, indicating that the computing power of a current rendering device may be insufficient. The area of the picture each rendering device is responsible for rendering must then be reallocated, i.e. the viewing angle ranges corresponding to the rendering devices are redistributed. After redistribution, the AR glasses' own current pose data must be reacquired to determine how the glasses' current orientation in three-dimensional space makes the field of view intersect the reallocated viewing angle ranges, and the picture is then re-rendered.
Of course, it may also be that the number of target display pictures has not reached the target number. For example, if three rendering devices are performing rendering tasks but the AR glasses obtain only two target display pictures, the display conditions are obviously not met: the rendering device whose target display picture was not obtained may be faulty or low on battery and did not complete its rendering task. In that case, viewing angle ranges must be reallocated to the rendering devices; understandably, when the ranges are reallocated, they are allocated only to the rendering devices that have not failed. In addition, the information of the device that did not render successfully must be transmitted to the management end so that maintenance personnel can troubleshoot in time.
It should be noted that the display conditions may include, but are not limited to, the clarity of each target display picture reaching a preset value and the number of target display pictures reaching a target number; this application does not limit the display conditions.
In the control method for AR glasses screen display provided by this embodiment, before the target display pictures are fused and displayed, it is judged whether each target display picture meets the display conditions. If so, the target display pictures are fused and displayed; if not, viewing angle ranges are reallocated to the rendering devices and the glasses' own current pose data is reacquired to determine the intersection between the current field of view and the reallocated ranges. This guarantees the accuracy of the AR glasses' target display pictures and further improves the user experience.
In the above embodiment, when the target display pictures do not meet the display conditions, the rendering devices' viewing angle ranges must be reallocated; in addition, when the AR glasses receive a target trigger instruction, the viewing angle ranges must also be reallocated.
Here, the target trigger instruction is an instruction generated when any rendering device disconnects and/or terminates its rendering task and/or runs low on battery, and a second target rendering device is a device that has not triggered the target trigger instruction. It should be noted that the conditions triggering the target trigger instruction may include, but are not limited to, those mentioned above; any factor that prevents the current rendering task from being completed can be such a condition, and this application does not limit them.
It is understandable that when a rendering device is low on battery or fails, it may be unable to complete the current rendering task; viewing angle ranges must then be reallocated to the second target rendering devices, i.e. to the rendering devices that have not triggered the target trigger instruction. After the reallocation, the AR glasses' own current pose data must be reacquired to re-determine the intersection between the field of view and the reallocated viewing angle ranges.
In the control method for AR glasses screen display provided by this embodiment, when any rendering device disconnects and/or terminates its rendering task and/or runs low on battery, the target trigger instruction is triggered; after it is received, corresponding viewing angle ranges are reallocated to the second target rendering devices. This avoids rendering failures caused by the failure of any single rendering device and further improves the rendering success rate.
In a specific embodiment, to increase the rendering rate, when corresponding viewing angle ranges are allocated to the preset rendering devices, the performance parameters and current application scenario of each rendering device are obtained, where the performance parameters at least include memory and computing power, and a target number of rendering devices is selected according to the current application scenario. It is understandable that different rendering scenarios require different numbers of rendering devices: a relatively simple rendering scenario needs fewer devices, while the rendering scenario of a large game needs more. A corresponding number of rendering devices must therefore be selected according to the rendering scene.
After the number of rendering devices is determined, the target number of rendering devices is sorted by performance, and viewing angle ranges from large to small are assigned to the corresponding devices in order of performance from high to low. That is, a high-performance rendering device can be allocated a relatively large viewing angle range, while a lower-performance device is allocated a smaller one.
In the control method for AR glasses screen display provided by this embodiment, when corresponding viewing angle ranges are allocated to the preset rendering devices, the allocation is made according to each device's performance and the current application scenario, so that high-performance devices render more of the picture, increasing the rate at which the picture is rendered.
In the above embodiments, the control method for AR glasses screen display has been described in detail; this application also provides corresponding embodiments of a control device for AR glasses screen display. It should be noted that this application describes the device embodiments from two perspectives: one based on functional modules and the other based on hardware structure.
Figure 4 is a structural diagram of a control device for AR glasses screen display provided by an embodiment of this application. As shown in Figure 4, the device includes:
an allocation module 10, used to allocate corresponding viewing angle ranges to preset rendering devices;
an acquisition module 11, used to obtain the glasses' own current pose data;
a judgment module 12, used to judge, according to the current pose data, whether a viewing angle range meets the preset condition, and if so, to call a sending module 13;
the sending module 13, used to send a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that the first target rendering device renders the current picture to obtain a target display picture;
a fusion module 14, used to fuse and display the target display pictures after each target display picture is obtained. Since the embodiments of the device part correspond to the embodiments of the method part, for the device embodiments please refer to the description of the method embodiments, which is not repeated here.
The control device for AR glasses screen display provided by this embodiment allocates corresponding viewing angle ranges to preset rendering devices, obtains the glasses' own current pose data, judges according to the obtained pose data whether a viewing angle range meets the preset condition, and if so sends a rendering instruction to the first target rendering device whose range meets the condition, so that it renders the current picture to obtain a target display picture; after each target display picture is obtained, the pictures are fused and displayed. It can be seen that in this solution, when the viewing angle range corresponding to a preset rendering device meets the preset condition, the rendering device completes the picture rendering work of the AR glasses, and the target display pictures rendered by the first target rendering devices, once obtained, are fused and displayed, avoiding the picture lag or rendering failure caused by insufficient computing power when the AR glasses render by themselves, improving picture rendering capability while enhancing the user experience.
Figure 5 is a structural diagram of AR glasses provided by another embodiment of this application. As shown in Figure 5, the AR glasses include: a memory 20, used to store a computer program;
and a processor 21, used to implement the steps of the control method for AR glasses screen display mentioned in the above embodiments when executing the computer program.
The processor 21 may include one or more processing cores, such as a 4-core or 8-core processor. The processor 21 may be implemented in at least one hardware form among a digital signal processor (DSP), a field-programmable gate array (FPGA), and a programmable logic array (PLA). The processor 21 may also include a main processor and a coprocessor: the main processor is the processor that processes data in the awake state, also called the central processing unit (CPU); the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 21 may integrate a graphics processing unit (GPU), which is responsible for rendering and drawing the content the display screen needs to show. In some embodiments, the processor 21 may also include an artificial intelligence (AI) processor for computing operations related to machine learning.
The memory 20 may include one or more computer-readable storage media, which may be non-transitory. The memory 20 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In this embodiment, the memory 20 is at least used to store the following computer program 201: after being loaded and executed by the processor 21, it can implement the relevant steps of the control method for AR glasses screen display disclosed in any of the foregoing embodiments. In addition, the resources stored in the memory 20 may also include an operating system 202, data 203, and so on, stored transiently or permanently. The operating system 202 may include Windows, Unix, Linux, etc. The data 203 may include, but is not limited to, the data involved in the control method for AR glasses screen display.
In some embodiments, the AR glasses may also include a display screen 22, an input/output interface 23, a communication interface 24, a power supply 25, and a communication bus 26.
Those skilled in the art can understand that the structure shown in Figure 5 does not constitute a limitation on the AR glasses, which may include more or fewer components than shown.
The AR glasses provided by this embodiment of the application include a memory and a processor; when the processor executes the program stored in the memory, it can implement the control method for AR glasses screen display.
In the AR glasses provided by this embodiment, when the viewing angle range corresponding to a preset rendering device meets the preset condition, the rendering device completes the picture rendering work of the AR glasses, and the target display pictures rendered by the first target rendering devices, once obtained, are fused and displayed, avoiding the picture lag or rendering failure caused by insufficient computing power when the AR glasses render by themselves, improving picture rendering capability while enhancing the user experience.
Finally, this application also provides a corresponding embodiment of a computer-readable storage medium. A computer program is stored on the computer-readable storage medium; when the computer program is executed by a processor, the steps recorded in the above method embodiments are implemented.
It is understandable that if the methods in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of this application, in essence or in the part contributing to the prior art, or in whole or in part, can be embodied in the form of a software product; the computer software product is stored in a storage medium and performs all or part of the steps of the methods described in the various embodiments of this application. The aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, read-only memory (ROM), random access memory (RAM), a magnetic disk, or an optical disk.
The control method for AR glasses screen display and the components thereof provided by this application have been introduced in detail above. The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts of the embodiments can be referred to each other. For the device disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, the description is relatively simple; for the relevant points, refer to the description of the method part. It should be pointed out that those of ordinary skill in the art can make several improvements and modifications to this application without departing from its principles, and these improvements and modifications also fall within the scope of protection of the claims of this application.
It should also be noted that in this specification, relational terms such as first and second are used only to distinguish one entity or operation from another and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", and any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the statement "includes a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The above serial numbers of the embodiments of this application are for description only and do not represent the superiority or inferiority of the embodiments.
The embodiments in this specification are described in a parallel or progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts of the embodiments can be referred to each other. For the device disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, the description is relatively simple; for the relevant points, refer to the description of the method part.

Claims (10)

  1. A control method for AR glasses screen display, characterized by including:
    allocating corresponding viewing angle ranges to preset rendering devices;
    obtaining the glasses' own current pose data;
    judging, according to the current pose data, whether a viewing angle range meets a preset condition;
    if so, sending a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that the first target rendering device renders the current picture to obtain a target display picture;
    after each of the target display pictures is obtained, fusing and displaying the target display pictures.
  2. The control method for AR glasses screen display according to claim 1, characterized in that judging, according to the current pose data, whether the viewing angle range meets the preset condition includes:
    judging, according to the current pose data, whether the viewing angle range intersects the glasses' own field of view.
  3. The control method for AR glasses screen display according to claim 2, characterized by further including:
    if it is determined that the field of view does not intersect the viewing angle range, sending a sleep instruction to the first target rendering device and returning to the step of obtaining the glasses' own current pose data.
  4. The control method for AR glasses screen display according to claim 2, characterized in that rendering the current picture includes:
    determining the percentage of the viewing angle range occupied by the intersection of the field of view and the viewing angle range;
    determining, according to the percentage, the proportion of the current picture to render;
    rendering the current picture according to the proportion.
  5. The control method for AR glasses screen display according to claim 2, characterized by further including, before fusing and displaying the target display pictures:
    determining whether each of the target display pictures meets display conditions, where the display conditions at least include the clarity reaching a preset value and the number of obtained target display pictures reaching a target number;
    if the display conditions are met, entering the step of fusing and displaying the target display pictures;
    if the display conditions are not met, reallocating viewing angle ranges to the rendering devices and entering the step of obtaining the glasses' own current pose data.
  6. The control method for AR glasses screen display according to any one of claims 1 to 5, characterized by further including:
    upon receiving a target trigger instruction, reallocating corresponding viewing angle ranges to second target rendering devices, where the target trigger instruction is an instruction generated when any of the rendering devices disconnects and/or terminates its rendering task and/or runs low on battery, and a second target rendering device is a device that has not triggered the target trigger instruction.
  7. The control method for AR glasses screen display according to claim 1, characterized in that allocating corresponding viewing angle ranges to the preset rendering devices includes:
    obtaining the performance parameters and current application scenario of each of the rendering devices, where the performance parameters at least include memory and computing power;
    selecting a target number of rendering devices according to the current application scenario;
    after sorting the target number of rendering devices by performance, assigning viewing angle ranges from large to small to the corresponding rendering devices in order of performance from high to low.
  8. A control device for AR glasses screen display, characterized by including:
    an allocation module, used to allocate corresponding viewing angle ranges to preset rendering devices;
    an acquisition module, used to obtain the glasses' own current pose data;
    a judgment module, used to judge, according to the current pose data, whether a viewing angle range meets a preset condition, and if so, to call a sending module;
    the sending module, used to send a rendering instruction to the first target rendering device whose viewing angle range meets the preset condition, so that the first target rendering device renders the current picture to obtain a target display picture;
    a fusion module, used to fuse and display the target display pictures after each of the target display pictures is obtained.
  9. AR glasses, characterized by including a memory, used to store a computer program;
    and a processor, used to implement the steps of the control method for AR glasses screen display according to any one of claims 1 to 7 when executing the computer program.
  10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the control method for AR glasses screen display according to any one of claims 1 to 7 are implemented.
PCT/CN2023/116257 2022-09-06 2023-08-31 Control method for AR glasses screen display and components thereof WO2024051579A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211083940.2 2022-09-06
CN202211083940.2A CN115423989A (zh) 2022-09-06 2022-09-06 Control method for AR glasses screen display and components thereof

Publications (1)

Publication Number Publication Date
WO2024051579A1 true WO2024051579A1 (zh) 2024-03-14

Family

ID=84203234

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/116257 WO2024051579A1 (zh) 2022-09-06 2023-08-31 Control method for AR glasses screen display and components thereof

Country Status (2)

Country Link
CN (1) CN115423989A (zh)
WO (1) WO2024051579A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115423989A (zh) * 2022-09-06 2022-12-02 歌尔科技有限公司 一种ar眼镜画面显示的控制方法及其组件
CN115981588B (zh) * 2023-03-16 2023-09-26 中国邮电器材集团有限公司 一种多终端数据显示方法、设备和系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108648257A (zh) * 2018-04-09 2018-10-12 腾讯科技(深圳)有限公司 Panoramic picture acquisition method and apparatus, storage medium, and electronic apparatus
CN112057851A (zh) * 2020-09-02 2020-12-11 北京蔚领时代科技有限公司 Real-time single-frame picture rendering method based on multiple graphics cards
CN113873264A (zh) * 2021-10-25 2021-12-31 北京字节跳动网络技术有限公司 Method, apparatus, electronic device and storage medium for displaying image
US20220067878A1 (en) * 2020-09-02 2022-03-03 Yutou Technology (Hangzhou) Co., Ltd. Method and device for presenting ar information based on video communication technology
CN115423989A (zh) * 2022-09-06 2022-12-02 歌尔科技有限公司 Control method for AR glasses screen display and components thereof


Also Published As

Publication number Publication date
CN115423989A (zh) 2022-12-02

Similar Documents

Publication Publication Date Title
WO2024051579A1 (zh) 一种ar眼镜画面显示的控制方法及其组件
US20230016490A1 (en) Systems and methods for virtual and augmented reality
US10127722B2 (en) Mobile capture visualization incorporating three-dimensional and two-dimensional imagery
EP3332565B1 (en) Mixed reality social interaction
US8924985B2 (en) Network based real-time virtual reality input/output system and method for heterogeneous environment
US10504203B2 (en) Virtual graphics device driver
TWI678099B (zh) 視頻處理方法、裝置和儲存介質
WO2018126957A1 (zh) 显示虚拟现实画面的方法和虚拟现实设备
US10445260B2 (en) Direct access to hardware queues of a storage device by software threads
US10235733B2 (en) Device and method for performing scheduling for virtualized graphics processing units
CN106598514B (zh) 一种终端设备中切换虚拟现实模式的方法及系统
CN109213607B (zh) 一种多线程渲染的方法和装置
US11205286B2 (en) Techniques for optimizing creation of digital diagrams
CN111737019A (zh) 一种显存资源的调度方法、装置及计算机存储介质
WO2023173516A1 (zh) 数据交互的方法、装置、存储介质及电子设备
CN108093245B (zh) 一种多屏融合方法、系统、装置和计算机可读存储介质
CN112965773A (zh) 用于信息显示的方法、装置、设备和存储介质
US20170109113A1 (en) Remote Image Projection Method, Sever And Client Device
US10416759B2 (en) Eye tracking laser pointer
CN108499102B (zh) 信息界面展示方法及装置、存储介质、电子设备
US20190163434A1 (en) Technologies for networked virtual content in a mobile computing environment
US20190108037A1 (en) Pro-Active GPU Hardware Bootup
CN109727315A (zh) 一对多集群渲染方法、装置、设备及存储介质
US11961184B2 (en) System and method for scene reconstruction with plane and surface reconstruction
CN117555419A (zh) 控制方法、装置、头戴显示设备及介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23862273

Country of ref document: EP

Kind code of ref document: A1