CN114302054A - A photographing method of an AR device and an AR device thereof - Google Patents


Info

Publication number
CN114302054A
Authority
CN
China
Prior art keywords
point, eyeball, current, eye, line segment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111444241.1A
Other languages
Chinese (zh)
Other versions
CN114302054B (en)
Inventor
张猛 (Zhang Meng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co Ltd
Original Assignee
Goertek Optical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Optical Technology Co Ltd
Priority to CN202111444241.1A
Priority to PCT/CN2021/138585 (published as WO2023097791A1)
Publication of CN114302054A
Application granted
Publication of CN114302054B
Legal status: Active
Anticipated expiration


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)

Abstract

The present application relates to the field of augmented reality and, more specifically, to a photographing method for an AR device and the AR device itself. The AR device includes an eye-tracking camera and a photographing camera, and the photographing method includes: the eye-tracking camera takes a photograph of the user's eyes; the AR device obtains the current gaze-focus region of the user's eyes from the eye photograph; and the photographing camera is controlled to capture the current picture according to the current gaze-focus region, the current picture corresponding to that region. By omitting the picture-preview step, the present application saves the associated computing, power, and memory overhead and improves the user experience.

Description

A Photographing Method for an AR Device and the AR Device Thereof

Technical Field

The present application relates to the technical field of augmented reality and, more specifically, to a photographing method for an AR device and the AR device itself.

Background

In recent years, with the rise of augmented reality (AR) devices, applications of virtual and augmented reality have gradually entered everyday life. To satisfy users' varied experience requirements, AR devices are usually equipped with cameras for taking photographs.

Most current AR hardware and software systems are carried over from mobile-phone platforms. During photographing, the picture captured by the camera is sent to the AR device's display system for the user to preview, and the user triggers the shot after previewing. To support this preview, the camera must transmit data to the CPU, and each transfer occupies a block of CPU memory the size of one image; the whole process consumes memory and generates considerable power draw.

Summary of the Invention

In view of the above technical problems, the present invention aims to provide a new, resource-saving photographing method that controls the photographing camera to capture the current picture according to the user's current gaze-focus region.

A first aspect of the present invention provides a photographing method for an AR device, the AR device including an eye-tracking camera and a photographing camera, the photographing method including:

taking, by the eye-tracking camera, a photograph of the user's eyes;

obtaining, by the AR device, the current gaze-focus region of the user's eyes from the eye photograph; and

controlling the photographing camera to capture the current picture according to the current gaze-focus region, wherein the current picture corresponds to the current gaze-focus region.
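The three claimed steps can be sketched as a minimal pipeline. This is a hypothetical illustration only: the `FocusRegion` type and the `eye_tracker`, `ar_device`, and `photo_camera` objects are stand-ins for whatever interfaces a concrete AR device exposes, not the patent's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class FocusRegion:
    """A gaze-focus region, represented here (illustratively) as a
    normalized rectangle in the photographing camera's frame."""
    cx: float      # region center x, normalized [0, 1]
    cy: float      # region center y, normalized [0, 1]
    width: float   # region width, normalized
    height: float  # region height, normalized

def take_photo(eye_tracker, ar_device, photo_camera):
    """Claimed method: eye photo -> current gaze-focus region -> capture."""
    eye_photo = eye_tracker.capture()                    # step 1: eye-tracking camera shoots the eyes
    region = ar_device.estimate_focus_region(eye_photo)  # step 2: derive the current gaze-focus region
    return photo_camera.capture(region)                  # step 3: capture the picture for that region
```

Note that no preview frame ever travels to a display system in this flow; the focus region alone drives the capture, which is the resource saving the application claims.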

Specifically, controlling the photographing camera to capture the current picture according to the current gaze-focus region includes:

determining the photographing range and photographing angle of the photographing camera according to the current gaze-focus region; and

controlling the photographing camera to shoot according to the photographing range and photographing angle to obtain the current picture.

More specifically, obtaining, by the AR device, the current gaze-focus region of the user's eyes from the eye photograph includes:

determining a mapping function, the mapping function describing the relationship between eyeball rotation data and gaze angle;

obtaining the current rotation data; and

obtaining the current gaze-focus region from the interpupillary distance, the current rotation data, and the mapping function.

Further, determining the mapping function includes:

obtaining the field of view;

selecting the four corners and the center of the field of view as a first point, a second point, a third point, a fourth point, and a center point, respectively;

marking, as the origin, the position in the eye-tracking camera image that corresponds to the eyeball gazing at the center point;

recording, for each of the user's eyeballs, the number of pixels by which the gaze is offset from the origin in the eye-tracking camera image when the eyeball rotates to gaze at the first point, the second point, the third point, and the fourth point; and

calculating the correspondence between the offset pixel count and the eyeball rotation angle.
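Under assumed example numbers, this correspondence can be fitted by least squares. The sketch below assumes a 40° x 30° display field of view (so gazing from the center to a corner rotates the eye by roughly (±20°, ±15°)) and a corner ordering of top-left, top-right, bottom-left, bottom-right; the patent leaves the mapping function f either linear or nonlinear, and this shows only the linear case.

```python
import numpy as np

# Assumed example display field of view (degrees); not specified by the patent.
FOV_H, FOV_V = 40.0, 30.0

def calibrate(corner_offsets_px):
    """Fit a linear pixel-offset -> rotation-angle map from the four corner
    calibration measurements.

    corner_offsets_px: four (dx, dy) pixel offsets from the origin in the
    eye-tracking image, recorded while gazing at the first..fourth points
    (assumed corner order: top-left, top-right, bottom-left, bottom-right).
    Returns a 2x2 matrix M such that (d_theta_x, d_theta_y) = M @ (dx, dy).
    """
    # Known eye-rotation angles (degrees) from the center to each corner.
    angles = np.array([
        [-FOV_H / 2, -FOV_V / 2],
        [ FOV_H / 2, -FOV_V / 2],
        [-FOV_H / 2,  FOV_V / 2],
        [ FOV_H / 2,  FOV_V / 2],
    ])
    offsets = np.asarray(corner_offsets_px, dtype=float)
    # Least-squares solve of offsets @ X ~= angles; M is its transpose.
    X, *_ = np.linalg.lstsq(offsets, angles, rcond=None)
    return X.T

def pixels_to_angle(M, dx, dy):
    """Map a measured pixel offset to an eye-rotation angle (degrees)."""
    return M @ np.array([dx, dy])
```

For instance, if the tracker reports 5 pixels of offset per degree of rotation on both axes, the fitted map recovers (20°, 15°) for a gaze at the top-right-style corner offset (100, 75).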

Further preferably, calculating the correspondence between the offset pixel count and the eyeball rotation angle includes:

obtaining the relevant line segments and included angles from the first point, the second point, the third point, the fourth point, and the origin;

denoting points other than the first point, the second point, the third point, the fourth point, and the origin as other gaze points; and

calculating, based on the relevant line segments and included angles, the correspondence between the pixel offset in the eye-tracking camera image and the eyeball rotation angle when each eyeball looks at the other gaze points.

Still further, calculating, based on the relevant line segments and included angles, the correspondence between the pixel offset in the eye-tracking camera image and the eyeball rotation angle when each eyeball looks at the other gaze points includes:

denoting the line segment from each of the user's eyeballs to the origin as a first line segment;

denoting the line segments from each eyeball to the first point, the second point, the third point, and the fourth point as a second, a third, a fourth, and a fifth line segment, respectively;

obtaining the included angles between the first line segment and each of the second, third, fourth, and fifth line segments; and

calculating, from those included angles and the recorded pixel offsets from the origin for the first, second, third, and fourth points, the correspondence between the pixel offset in the eye-tracking camera image and the eyeball rotation angle when each eyeball looks at the other gaze points.

A second aspect of the present invention provides an AR device including an eye-tracking camera and a photographing camera. The eye-tracking camera is configured to take a photograph of the user's eyes; the AR device is configured to obtain the current gaze-focus region of the user's eyes from the eye photograph and to control the photographing camera to capture the current picture according to that region.

Preferably, the AR device further includes a control module and a display module.

A third aspect of the present invention provides a computer-readable storage medium storing a computer program that, when executed by a processor, implements the photographing method of the AR device in the embodiments of the present invention.

A fourth aspect of the present invention provides a computer program product including a computer program that, when executed by a processor, implements the photographing method of the AR device in the embodiments of the present invention.

The beneficial effects of the present application are as follows. The present application controls the photographing camera to capture the current picture according to the user's current gaze-focus region, a new, resource-saving way of taking photographs. The eye-tracking camera photographs the user's eyes; the AR device obtains the current gaze-focus region of the user's eyes from the eye photograph and controls the photographing camera to capture the current picture, which corresponds to that region. The picture-preview step is omitted, saving the associated computing, power, and memory overhead and improving the user experience.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and do not limit the invention.

Brief Description of the Drawings

The accompanying drawings, which form a part of the specification, illustrate embodiments of the present application and, together with the description, serve to explain its principles.

The present application may be understood more clearly from the following detailed description with reference to the accompanying drawings, in which:

Fig. 1 is a schematic diagram of the method steps of an exemplary embodiment of the present application;

Fig. 2 is a schematic diagram of calibrating the field of view of the display module in an exemplary embodiment of the present application;

Fig. 3 is a schematic diagram of the human-eye gaze-focus region in an exemplary embodiment of the present application;

Fig. 4 is a schematic diagram of the operations saved by the photographing method in an exemplary embodiment of the present application;

Fig. 5 is a schematic structural diagram of an apparatus in an exemplary embodiment of the present application;

Fig. 6 is a schematic structural diagram of an AR device provided by an exemplary embodiment of the present application;

Fig. 7 is a schematic diagram of a storage medium provided by an exemplary embodiment of the present application.

Detailed Description

Embodiments of the present application are described below with reference to the accompanying drawings. It should be understood, however, that these descriptions are exemplary only and are not intended to limit the scope of the application. In the following description, descriptions of well-known structures and techniques are omitted to avoid unnecessarily obscuring the concepts of the present application. It will be apparent to those skilled in the art that the application may be practiced without one or more of these details; in other instances, some features well known in the art are not described in order to avoid confusion with the present application.

It should be noted that the terminology used herein is for describing specific embodiments only and is not intended to limit the exemplary embodiments of the present application. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should further be understood that the terms "comprising" and/or "including", when used in this specification, specify the presence of the stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or combinations thereof.

Exemplary embodiments of the present application will now be described in more detail with reference to the accompanying drawings. These embodiments may, however, be embodied in many different forms and should not be construed as limited to those set forth herein. The drawings are not to scale: certain details may be exaggerated for clarity, and others may be omitted. The shapes of the regions and layers shown, and their relative sizes and positions, are exemplary only; in practice they may deviate because of manufacturing tolerances or technical limitations, and those skilled in the art may design regions/layers with different shapes, sizes, and relative positions as required.

Several embodiments according to exemplary implementations of the present application are described below with reference to Figs. 1-7. Note that the following application scenarios are shown only to facilitate understanding of the spirit and principles of the application; the embodiments of the present application are not limited in this respect and may be applied in any applicable scenario.

Current AR devices usually offer a photographing function with which the user can preview the picture and then take a photograph. Specifically, the photographing camera on the AR device captures the external scene and sends it to the device's display system, which presents the scene to the user as a virtual image; the user then triggers the photographing operation as needed.

An embodiment of the present application provides a photographing method for an AR device, where the AR device may be AR glasses or an AR headset worn on the user's head during use. The AR device includes two kinds of cameras: eye-tracking cameras and photographing cameras. There may be one or more eye-tracking cameras, used to photograph the user's eyes and usually placed on the side close to the eyes; there may likewise be one or more photographing cameras, used to photograph the outside world and usually placed on the side away from the eyes. In some embodiments, the eye-tracking cameras and photographing cameras are located on opposite sides of the AR device.

Some embodiments of the present application provide a photographing method for an AR device. As shown in Fig. 1, the method includes the following steps.

First, the eye-tracking camera on the AR device photographs the user's eyes. For example, a left eye-tracking camera captures the left eye and a right eye-tracking camera captures the right eye; alternatively, a single eye-tracking camera captures both eyes at once.

Next, the AR device obtains the user's current gaze-focus region from the eye photograph. The current gaze-focus region is the part of the user's visual field on which the eyes are currently concentrated; information outside this region is hard for the user to perceive. The region lies at a predetermined distance from the user's eyes, on the side of the AR device facing away from them.

Finally, the photographing camera is controlled to capture the current picture according to the current gaze-focus region, the current picture corresponding to that region.

Photographs taken with an AR device are usually of scenes several meters or even tens of meters away, while the straight-line distance between the photographing camera on the AR device and the user's eyes is generally only about 1 cm. The photographing camera and the user's eyes can therefore be treated as coincident; that is, the camera's field of view coincides with the eyes' field of view.

The current field of view of the photographing camera can be determined from the current gaze-focus region, and the camera is controlled to take the photograph within that field of view.
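One way to realize this focus-region-to-field-of-view step is a region-of-interest crop on the photographing camera's sensor. The sketch below is illustrative only: the camera FOV and resolution figures are assumed example values, and the linear angle-to-pixel model is a simplification that ignores lens distortion.

```python
def focus_to_roi(cx_deg, cy_deg, half_w_deg, half_h_deg,
                 cam_fov_h=69.0, cam_fov_v=51.0, img_w=4000, img_h=3000):
    """Map a gaze-focus region, given as angles in the (assumed coincident)
    camera field of view, to a pixel region of interest.

    cx_deg, cy_deg: focus-region center as angles from the optical axis.
    half_w_deg, half_h_deg: half-extent of the region in degrees.
    cam_fov_h/cam_fov_v and img_w/img_h are assumed example camera
    parameters.  Returns (left, top, right, bottom) in pixels, clamped
    to the image borders.
    """
    px_per_deg_x = img_w / cam_fov_h
    px_per_deg_y = img_h / cam_fov_v
    cx_px = img_w / 2 + cx_deg * px_per_deg_x   # angle -> pixel, linear model
    cy_px = img_h / 2 + cy_deg * px_per_deg_y
    left   = max(0, int(cx_px - half_w_deg * px_per_deg_x))
    top    = max(0, int(cy_px - half_h_deg * px_per_deg_y))
    right  = min(img_w, int(cx_px + half_w_deg * px_per_deg_x))
    bottom = min(img_h, int(cy_px + half_h_deg * px_per_deg_y))
    return left, top, right, bottom
```

A focus region on the optical axis maps to a window around the image center; a region near the edge of the field of view is clamped to the sensor border.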

In the photographing method provided by this embodiment, the current field of view of the photographing camera is determined from the user's current gaze-focus region obtained by the eye-tracking camera, and the photograph is taken accordingly. In this scheme, the photographing camera does not need to send a preview to the AR device's display system, nor does the display system need to project a preview into the user's eyes. Skipping the preview step both saves memory and eliminates the power consumed by transmitting preview frames.

In some embodiments of the present application, when the AR device obtains the user's current gaze-focus region from the eye photograph, it may first determine a mapping function, then obtain the current rotation data, and finally obtain the current gaze-focus region from the current rotation data, the interpupillary distance, and the mapping function. The mapping function describes the relationship between the eyeball's rotation data (which may include rotation distance and/or rotation angle) and the gaze angle.

In a specific embodiment, obtaining the current gaze-focus region of the user's eyes from the eye photograph includes: determining a mapping function relating eyeball rotation data to gaze angle; obtaining the current rotation data; and obtaining the current gaze-focus region from the interpupillary distance, the current rotation data, and the mapping function. Determining the mapping function further includes: obtaining the field of view; selecting the four corners and the center of the field of view as the first, second, third, fourth, and center points, respectively; marking as the origin the position in the eye-tracking camera image that corresponds to gazing at the center point; recording, for each eyeball, the pixel offset from the origin in the eye-tracking camera image when gazing at the first, second, third, and fourth points; and calculating the correspondence between the offset pixel count and the eyeball rotation angle.

In one possible implementation, referring to Fig. 2, the first, second, third, and fourth points correspond to points B, C, D, and E in Fig. 2, and the center point to point A. Each eyeball is calibrated separately. The position in the eye-tracking camera image corresponding to gazing at the center point is marked as the origin; the user's left eye is denoted P_L and the right eye P_R. How far P_L and P_R rotate corresponds to the number of pixels by which the gaze point is offset from the origin, which the eye-tracking camera can record, and from which the angle between the extension of the user's line of sight and the center of the field of view can be computed (this may be the gaze angle or half of it). During calibration the user should look into the distance so that the rendered point lies far away; the extended gaze lines of the two eyes are then approximately parallel.

Specifically, calculating the correspondence between the offset pixel count and the eyeball rotation angle includes: obtaining the relevant line segments and included angles from the first, second, third, and fourth points and the origin; denoting the points other than these four points and the origin as other gaze points; and calculating, based on the relevant line segments and included angles, the correspondence between the pixel offset in the eye-tracking camera image and the eyeball rotation angle when each eyeball looks at the other gaze points.

Further, this calculation includes: denoting the line segment from each eyeball to the origin as the first line segment; denoting the line segments from each eyeball to the first, second, third, and fourth points as the second, third, fourth, and fifth line segments, respectively; obtaining the included angles between the first line segment and each of the second through fifth line segments; and, from those angles and the recorded pixel offsets from the origin for the four calibration points, calculating the correspondence between the pixel offset in the eye-tracking camera image and the eyeball rotation angle when each eyeball looks at the other gaze points.

In a specific implementation, (Δx, Δy) denotes the pixel offset from the origin in the eye-tracking camera image when an eyeball gazes at the first, second, third, or fourth point. The correspondence between the pixel offset and the eyeball rotation angle for other gaze points forms a function f, i.e. Δθ = f(Δx, Δy). Different eye-tracking cameras yield different f functions, which may be linear or nonlinear. With these correspondences, the pixel offset corresponding to any other gaze point can be obtained, so that when the user's eyeball rotates and the gaze point shifts from one point to another, the specific pixel offset is known and the captured image accurately reflects where the user is looking.

In one possible implementation, the position of the binocular fixation point relative to the two eyes, and the binocular gaze-focus region, can also be computed from trigonometric functions and the interpupillary distance. In one possible concrete implementation, referring to Fig. 3, θ_L denotes the included angle between the first line segment and the line segment from the left eyeball to the fixation point, and θ_R the included angle between the first line segment and the line segment from the right eyeball to the fixation point.
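Under the geometry of Fig. 3, the fixation point can be triangulated with elementary trigonometry: with the eyes separated by the interpupillary distance L and convergent gaze, the forward distance z satisfies z·tan θ_L + z·tan θ_R = L. The sketch below assumes both angles are measured inward (toward the nose) from each eye's straight-ahead direction; this sign convention is mine, not spelled out in the patent.

```python
import math

def fixation_point(theta_l_deg, theta_r_deg, ipd_mm):
    """Triangulate the gaze fixation point from the two per-eye gaze angles.

    theta_l_deg / theta_r_deg: angle (degrees) between each eye's gaze line
    and its straight-ahead direction, measured inward (assumption).
    ipd_mm: interpupillary distance L, in mm.
    Returns (x, z): lateral offset from the midpoint between the eyes and
    forward distance, both in mm.  Requires the gaze lines to converge.
    """
    tl = math.tan(math.radians(theta_l_deg))
    tr = math.tan(math.radians(theta_r_deg))
    if tl + tr <= 0:
        raise ValueError("gaze lines do not converge in front of the user")
    z = ipd_mm / (tl + tr)      # forward distance: z*tan(theta_L) + z*tan(theta_R) = L
    x = -ipd_mm / 2 + z * tl    # lateral position relative to the midpoint
    return x, z
```

For a symmetric gaze (θ_L = θ_R) the point lands on the midline between the eyes, and the smaller the angles, the farther away it lies, which matches the calibration advice above to look into the distance so the gaze lines become nearly parallel.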

In another possible implementation, the eye-tracking camera can be calibrated, based on the field of view of the display module, for the pixel offset in the eye-tracking camera image caused by the rotation of each of the user's eyeballs. Before this, the user's interpupillary distance is given; as shown in Fig. 3, the interpupillary distance is L.

It should be noted that an ordinary camera must transmit data to the CPU at a particular frame rate, commonly 30 Hz/60 Hz on current mobile-phone platforms, and the transfer occupies a block of CPU memory the size of one image, so the whole process incurs considerable resource (power and memory) overhead. As shown in Fig. 4, the photographing method of the present application saves these operations. The user sees the outside world directly through the transparent display, so system-resource consumption is reduced; as shown in the lower part of Fig. 4, the eye-tracking camera (ET camera) captures the user's gaze point, focuses precisely, and completes the shot.

The present application controls the photographing camera to capture the current picture according to the user's current gaze-focus region, a new, resource-saving way of taking photographs. The eye-tracking camera photographs the user's eyes; the AR device obtains the current gaze-focus region from the eye photograph and controls the photographing camera to capture the current picture, which corresponds to that region. Both the picture-preview step and the user's manual selection of a focus area are omitted, saving computing, power, and memory overhead, improving the user experience, and enhancing the user's sense of immersion while taking photographs.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention.

In some exemplary embodiments, an AR device is also provided. As shown in FIG. 5, it includes an eye-tracking camera 501 and a photographing camera 502. The eye-tracking camera is used to take a photograph of the user's eyes; the AR device obtains the current gaze focus area of the user's eyes from the eye photograph and controls the photographing camera to capture the current picture according to the current gaze focus area, the current picture corresponding to the current gaze focus area.

Preferably, the AR device further includes a control module 503 and a display module 504; the eye-tracking camera 501, the photographing camera 502, the control module 503, and the display module 504 work together.

It should also be emphasized that the augmented reality system provided in the embodiments of the present application can acquire and process the relevant data based on artificial intelligence technology. Artificial intelligence (AI) is a theory, method, technology, and application system that uses digital computers, or machines controlled by digital computers, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain the best results. Basic AI technologies generally include sensors, dedicated AI chips, cloud computing, distributed storage, big-data processing, operation/interaction systems, and mechatronics. AI software technology mainly covers computer vision, robotics, biometrics, speech processing, natural language processing, and machine learning/deep learning.

Please refer to FIG. 6, which shows a schematic diagram of an AR device provided by some embodiments of the present application. As shown in FIG. 6, the AR device 2 includes a processor 200, a memory 201, a bus 202, and a communication interface 203; the processor 200, the communication interface 203, and the memory 201 are connected through the bus 202. The memory 201 stores a computer program that can run on the processor 200, and when the processor 200 runs the computer program, it executes the photographing method of the AR device provided by any of the foregoing embodiments of the present application.

The memory 201 may include high-speed random-access memory (RAM) and may also include non-volatile memory, for example at least one disk memory. The communication connection between this system's network element and at least one other network element is realized through at least one communication interface 203 (wired or wireless), which may use the Internet, a wide area network, a local area network, a metropolitan area network, and the like.

The bus 202 may be an ISA bus, a PCI bus, an EISA bus, or the like, and may be divided into an address bus, a data bus, a control bus, and so on. The memory 201 is used to store a program; the processor 200 executes the program after receiving an execution instruction, and the photographing method of the AR device disclosed in any of the foregoing embodiments of the present application may be applied to, or implemented by, the processor 200.

The processor 200 may be an integrated circuit chip with signal-processing capability. During implementation, each step of the above method may be completed by an integrated logic circuit of hardware in the processor 200 or by instructions in the form of software. The processor 200 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logic block diagrams disclosed in the embodiments of this application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in the embodiments of the present application may be embodied directly as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software modules may be located in random-access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, registers, or other storage media mature in the art. The storage medium is located in the memory 201; the processor 200 reads the information in the memory 201 and completes the steps of the above method in combination with its hardware.

The AR device provided by the embodiments of the present application arises from the same inventive concept as the photographing method of the AR device provided by the embodiments of the present application, and has the same beneficial effects as the method it adopts, runs, or implements.

Embodiments of the present application further provide a computer-readable storage medium corresponding to the photographing method of the AR device provided by the foregoing embodiments. Referring to FIG. 7, the computer-readable storage medium shown there is an optical disc 30 on which a computer program (i.e., a program product) is stored; when run by a processor, the computer program executes the photographing method of the AR device provided by any of the foregoing embodiments.

In addition, examples of the computer-readable storage medium may also include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory, and other optical or magnetic storage media, which will not be enumerated one by one here.

The computer-readable storage medium provided by the above embodiments of the present application arises from the same inventive concept as the photographing method of the AR device provided by the embodiments of the present application, and has the same beneficial effects as the application program it stores adopts, runs, or implements.

Embodiments of the present application further provide a computer program product, including a computer program which, when executed by a processor, implements the steps of the photographing method of the AR device provided by any of the foregoing embodiments.

It should be noted that the algorithms and displays provided herein are not inherently related to any particular computer, virtual apparatus, or other device; various general-purpose devices may also be used with the teachings herein. The structure required to construct such a device is apparent from the above description. Furthermore, this application is not directed to any particular programming language. It should be understood that the content of the application described herein can be implemented in a variety of programming languages, and the description above of a specific language is intended to disclose the best mode of the application. Numerous specific details are set forth in the description provided herein. It will be understood, however, that embodiments of the present application may be practiced without these specific details. In some instances, well-known methods, structures, and techniques have not been shown in detail so as not to obscure an understanding of this description.

Similarly, it should be understood that, in order to streamline the application and aid in understanding one or more of the various inventive aspects, in the above description of exemplary embodiments of the present application various features of the application are sometimes grouped together into a single embodiment, figure, or description thereof. This manner of disclosure, however, should not be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate embodiment of this application.

Those skilled in the art will understand that the modules in the device of an embodiment can be adaptively changed and arranged in one or more devices different from that embodiment. The modules, units, or components in the embodiments may be combined into one module, unit, or component, and may further be divided into multiple sub-modules, sub-units, or sub-components. All features disclosed in this specification, and all processes or units of any method or apparatus so disclosed, may be combined in any combination, except where at least some of such features and/or processes or units are mutually exclusive. Unless expressly stated otherwise, each feature disclosed in this specification may be replaced by an alternative feature serving the same, an equivalent, or a similar purpose.

The various component embodiments of the present application may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art should understand that, in practice, a microprocessor or a digital signal processor (DSP) may be used to implement some or all of the functions of some or all of the components of the device according to the embodiments of the present application. The present application may also be implemented as a device or apparatus program for performing part or all of the methods described herein. A program implementing the present application may be stored on a computer-readable medium or may take the form of one or more signals; such signals may be downloaded from an Internet site, provided on a carrier signal, or provided in any other form.

The above is only a preferred specific embodiment of the present application, but the protection scope of the present application is not limited thereto; any change or substitution readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall be covered by the protection scope of this application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A photographing method of an AR device, wherein the AR device comprises an eye-tracking camera and a photographing camera, and the photographing method comprises the following steps:
the eye-tracking camera captures an eye photograph of a user;
the AR device obtains a current gaze focus area of the user's eyes according to the eye photograph;
and the photographing camera is controlled to capture a current picture according to the current gaze focus area, wherein the current picture corresponds to the current gaze focus area.
2. The photographing method according to claim 1, wherein controlling the photographing camera to capture a current picture according to the current gaze focus area comprises:
determining a photographing range and a photographing angle of the photographing camera according to the current gaze focus area;
and controlling the photographing camera to shoot according to the photographing range and the photographing angle, so as to obtain the current picture.
3. The photographing method according to claim 1 or 2, wherein the AR device obtaining the current gaze focus area of the user's eyes according to the eye photograph comprises:
determining a mapping function describing the relationship between eyeball rotation data and gaze angle;
acquiring current rotation data;
and acquiring the current gaze focus area according to the distance between the two eyes, the current rotation data, and the mapping function.
4. The photographing method according to claim 3, wherein determining the mapping function comprises:
acquiring a field of view;
selecting the four corners and the center of the field of view as a first point, a second point, a third point, a fourth point, and a center point, respectively;
marking the position in the eye-tracking camera image corresponding to the eyeball fixating the center point as the origin;
recording, for each eyeball, the number of pixels by which it is offset from the origin in the eye-tracking camera image when the user gazes at each of the first point, the second point, the third point, and the fourth point;
and calculating the correspondence between the number of offset pixels and the eyeball rotation angle.
5. The photographing method according to claim 4, wherein calculating the correspondence between the number of offset pixels and the eyeball rotation angle comprises:
obtaining the relevant line segments and included angles from the first point, the second point, the third point, the fourth point, and the origin;
recording points other than the first point, the second point, the third point, the fourth point, and the origin as other gaze points;
and calculating, based on the relevant line segments and included angles, the correspondence between the number of pixels offset in the eye-tracking camera image when each eyeball gazes at the other gaze points and the eyeball rotation angle.
6. The photographing method according to claim 5, wherein calculating, based on the relevant line segments and included angles, the correspondence between the number of pixels offset in the eye-tracking camera image when each eyeball gazes at the other gaze points and the eyeball rotation angle comprises:
acquiring the line segment from each eyeball of the user to the origin and recording it as a first line segment;
recording the line segments from each eyeball of the user to the first point, the second point, the third point, and the fourth point as a second line segment, a third line segment, a fourth line segment, and a fifth line segment, respectively;
acquiring the included angles between the first line segment and each of the second line segment, the third line segment, the fourth line segment, and the fifth line segment;
and calculating the correspondence between the number of pixels offset in the eye-tracking camera image when each eyeball gazes at the other gaze points and the eyeball rotation angle, according to the included angles and the number of pixels by which each eyeball is offset from the origin in the eye-tracking camera image when the user gazes at each point.
7. An AR device, comprising an eye-tracking camera and a photographing camera; the eye-tracking camera is used to capture an eye photograph of a user;
the AR device is used to obtain a current gaze focus area of the user's eyes according to the eye photograph and to control the photographing camera to capture a current picture according to the current gaze focus area, wherein the current picture corresponds to the current gaze focus area.
8. The AR apparatus of claim 7, further comprising a control module and a display module.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program realizes the steps of the method of any one of claims 1 to 6 when executed by a processor.
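Claims 4 to 6 can be read as a five-point calibration: the pupil's pixel offset in the eye-tracking image is recorded while the user fixates the center and the four corners of the field of view, and offsets for other gaze points are then converted into rotation angles. The one-axis sketch below illustrates that idea with piecewise-linear interpolation; the interpolation scheme and all sample values are assumptions for illustration, whereas the patent derives the relation from line segments and included angles.

```python
# Sketch of the claim 4-6 calibration idea on one axis: known pixel
# offsets at calibrated gaze angles yield a piecewise-linear mapping
# from any observed pupil offset to an eyeball rotation angle.
# The sample values below are invented for illustration.
from bisect import bisect_left

# (pixel offset from the origin in the ET-camera image,
#  eyeball rotation angle in degrees), measured while the user
#  fixates the center and two opposite edge calibration points.
calibration = [(-80, -20.0), (0, 0.0), (80, 20.0)]

def offset_to_angle(offset_px: float) -> float:
    """Interpolate the rotation angle for an arbitrary gaze-point offset."""
    xs = [p for p, _ in calibration]
    if offset_px <= xs[0]:          # clamp outside the calibrated range
        return calibration[0][1]
    if offset_px >= xs[-1]:
        return calibration[-1][1]
    i = bisect_left(xs, offset_px)  # index of the right-hand calibration sample
    (x0, a0), (x1, a1) = calibration[i - 1], calibration[i]
    return a0 + (a1 - a0) * (offset_px - x0) / (x1 - x0)
```

In a full implementation this mapping would be built per eye and per axis from the five calibration points, and the two eyes' angles combined with the interocular distance (claim 3) to locate the gaze focus area.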
CN202111444241.1A 2021-11-30 2021-11-30 Photographing method of AR equipment and AR equipment Active CN114302054B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111444241.1A CN114302054B (en) 2021-11-30 2021-11-30 Photographing method of AR equipment and AR equipment
PCT/CN2021/138585 WO2023097791A1 (en) 2021-11-30 2021-12-16 Photographing method of ar device and ar device


Publications (2)

Publication Number Publication Date
CN114302054A true CN114302054A (en) 2022-04-08
CN114302054B CN114302054B (en) 2023-06-20


Families Citing this family (1)

CN119211511B (priority 2024-11-21, published 2025-05-27) 雷鸟创新技术(深圳)有限公司 — Shooting control method, device, extended reality device and computer readable storage medium




Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
TA01 | Transfer of patent application right
    Effective date of registration: 20221116
    Address after: No. 500 Songling Road, Laoshan District, Qingdao City, Shandong Province, 266100
    Applicant after: GOERTEK TECHNOLOGY Co.,Ltd.
    Address before: 261000 plant 1, phase III, goer Photoelectric Industrial Park, No. 3999, Huixian Road, Yongchun community, Qingchi street, high tech Zone, Weifang City, Shandong Province
    Applicant before: GoerTek Optical Technology Co.,Ltd.
GR01 | Patent grant