CN114302054A - AR device photographing method and AR device


Info

Publication number
CN114302054A
Authority
CN
China
Prior art keywords
point
eyeball
current
line segment
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111444241.1A
Other languages
Chinese (zh)
Other versions
CN114302054B (en)
Inventor
张猛 (Zhang Meng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co Ltd
Original Assignee
Goertek Optical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Optical Technology Co Ltd
Priority to CN202111444241.1A
Priority to PCT/CN2021/138585 (WO2023097791A1)
Publication of CN114302054A
Application granted
Publication of CN114302054B
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)

Abstract

The present application relates to the field of augmented reality technologies, and more particularly, to a photographing method for an AR device and an AR device. The AR device includes an eye tracking camera and a photographing camera, and the photographing method includes the following steps: the eye tracking camera captures a photograph of the user's eyes; the AR device obtains the current gaze focus area of the user's eyes from the eye photograph; and the photographing camera is controlled to take a current picture according to the current gaze focus area, where the current picture corresponds to the current gaze focus area. The method omits the picture preview process, saving the associated computing resources, power consumption, and memory, and improving the user experience.

Description

AR device photographing method and AR device
Technical Field
The present application relates to the field of augmented reality technologies, and more particularly, to a photographing method for an AR device and an AR device.
Background
In recent years, with the rise of Augmented Reality (AR) devices, applications of virtual reality and augmented reality have gradually entered people's lives. To meet users' experience requirements, AR devices are generally provided with a camera for taking pictures.
Most current AR device software and hardware systems are carried over from mobile phone platforms. During photographing, the picture acquired by the camera is sent to the display system of the AR device so that the user can preview it before triggering the photographing action. To implement the picture preview, the camera must transmit data to the CPU; this transmission occupies a memory buffer the size of an image in the CPU, consuming memory and generating considerable power consumption throughout the transmission process.
Disclosure of Invention
Based on the above technical problems, the present invention aims to provide a new resource-saving photographing method that controls the photographing camera to take a picture according to the user's current gaze focus area.
A first aspect of the present invention provides a photographing method of an AR device, where the AR device includes an eye tracking camera and a photographing camera, and the photographing method includes the following steps:
the eye tracking camera captures a photograph of the user's eyes;
the AR device obtains the current gaze focus area of the user's eyes from the eye photograph;
and the photographing camera is controlled to take a current picture according to the current gaze focus area, where the current picture corresponds to the current gaze focus area.
Specifically, controlling the photographing camera to take the current picture according to the current gaze focus area includes:
determining the photographing range and photographing angle of the photographing camera according to the current gaze focus area;
and controlling the photographing camera to shoot according to the photographing range and photographing angle to obtain the current picture.
More specifically, the AR device obtaining the current gaze focus area of the user's eyes from the eye photograph includes:
determining a mapping function corresponding to the relationship between eyeball rotation data and sight angle;
acquiring current rotation data;
and acquiring the current gaze focus area according to the interocular distance, the current rotation data, and the mapping function.
Further, determining the mapping function includes:
acquiring the field angle;
selecting the four corners and the center of the field angle as a first point, a second point, a third point, a fourth point, and a center point, respectively;
marking the position in the eye tracking camera image when the eyeball is focused on the center point as the origin;
recording, for each eyeball, the number of pixels offset from the origin in the eye tracking camera image when the eyeball rotates as the user gazes at the first, second, third, and fourth points, respectively;
and calculating the correspondence between the offset pixel count and the eyeball rotation angle.
Further preferably, calculating the correspondence between the offset pixel count and the eyeball rotation angle includes:
obtaining relevant line segments and included angles through the first point, the second point, the third point, the fourth point, and the origin;
recording points other than the first point, the second point, the third point, the fourth point, and the origin as other gaze points;
and calculating, based on the relevant line segments and included angles, the correspondence between the number of pixels offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at the other gaze points.
Still further, the calculating, based on the relevant line segments and included angles, the correspondence between the number of pixels offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at the other gaze points includes:
recording the line segment from each eyeball of the user to the origin as a first line segment;
recording the line segments from each eyeball of the user to the first point, the second point, the third point, and the fourth point as a second line segment, a third line segment, a fourth line segment, and a fifth line segment, respectively;
acquiring the included angles between the second, third, fourth, and fifth line segments and the first line segment, respectively;
and calculating the correspondence between the number of pixels offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at the other gaze points, according to the included angles and the number of pixels each eyeball is offset from the origin in the eye tracking camera image when the user gazes at each of the points.
A second aspect of the present invention provides an AR device including an eye tracking camera and a photographing camera. The eye tracking camera is used to capture a photograph of the user's eyes; the AR device is used to obtain the current gaze focus area of the user's eyes from the eye photograph and to control the photographing camera to take a current picture according to the current gaze focus area.
Preferably, the AR device further includes a control module and a display module.
A third aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the photographing method of the AR device in the embodiments of the present invention.
A fourth aspect of the present invention provides a computer program product including a computer program which, when executed by a processor, implements the photographing method of the AR device in the embodiments of the present invention.
The beneficial effects of the present application are as follows: the photographing camera is controlled to take the current picture according to the user's current gaze focus area, which is a new resource-saving photographing mode. The eye tracking camera captures a photograph of the user's eyes, the AR device obtains the current gaze focus area of the user's eyes from the eye photograph, and the photographing camera is controlled to take the current picture according to the current gaze focus area, where the current picture corresponds to the current gaze focus area. The picture preview process is omitted, saving the associated computing resources, power consumption, and memory, and improving the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description, serve to explain the principles of the application.
The present application may be more clearly understood from the following detailed description with reference to the accompanying drawings, in which:
FIG. 1 is a schematic diagram of the method steps of an exemplary embodiment of the present application;
FIG. 2 is a schematic view illustrating calibration of the field angle of the display module according to an exemplary embodiment of the present application;
FIG. 3 is a schematic view of the gaze focus area of human eyes in an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram illustrating the operation process saved by the photographing method in an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of an apparatus in an exemplary embodiment of the present application;
FIG. 6 is a schematic structural diagram of an AR device according to an exemplary embodiment of the present application;
FIG. 7 is a schematic diagram of a storage medium provided by an exemplary embodiment of the present application.
Detailed Description
Hereinafter, embodiments of the present application will be described with reference to the accompanying drawings. It should be understood that the description is intended to be exemplary only and is not intended to limit the scope of the present application. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as not to unnecessarily obscure the concepts of the present application. It will be apparent to one skilled in the art that the present application may be practiced without one or more of these details. In other instances, well-known features have not been described in detail in order to avoid obscuring the present application.
It should be noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments in accordance with the application. As used herein, the singular is intended to include the plural unless the context clearly dictates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Exemplary embodiments according to the present application will now be described in more detail with reference to the accompanying drawings. These exemplary embodiments may, however, be embodied in many different forms and should not be construed as limited to only the embodiments set forth herein. The figures are not drawn to scale, wherein certain details may be exaggerated and omitted for clarity. The shapes of various regions, layers, and relative sizes and positional relationships therebetween shown in the drawings are merely exemplary, and deviations may occur in practice due to manufacturing tolerances or technical limitations, and a person skilled in the art may additionally design regions/layers having different shapes, sizes, relative positions, as actually required.
Several examples are given below in conjunction with the description of figures 1-7 to describe exemplary embodiments according to the present application. It should be noted that the following application scenarios are merely illustrated for the convenience of understanding the spirit and principles of the present application, and the embodiments of the present application are not limited in this respect. Rather, embodiments of the present application may be applied to any scenario where applicable.
A conventional AR device usually has a photographing function: the user can preview a picture and take a photograph as needed. Specifically, a camera on the AR device captures an external picture and sends it to the display system of the AR device; the display system presents the external picture to the user as a virtual image, and the user then triggers the photographing operation as needed.
The embodiments of the present application provide a photographing method for an AR device, where the AR device may be AR glasses or an AR headset and is worn on the user's head during use. The AR device includes two types of cameras: one or more eye tracking cameras, which capture photographs of the user's eyes and are usually arranged on the side close to the user's eyes; and one or more photographing cameras, which take external photographs and are usually arranged on the side away from the user's eyes. In some embodiments, the eye tracking camera and the photographing camera are located on opposite sides of the AR device.
Some embodiments of the present application provide a photographing method for an AR device. As shown in Fig. 1, the photographing method includes:
First, an eye tracking camera on the AR device captures a photograph of the user's eyes. For example, a left eye tracking camera takes a left-eye photograph and a right eye tracking camera takes a right-eye photograph; alternatively, the same eye tracking camera takes the left-eye and right-eye photographs at the same time.
Then, the AR device obtains the user's current gaze focus area from the eye photograph. The current gaze focus area is the area within the field of view of the user's eyes on which the eyes are currently mainly focused; information outside this area is largely imperceptible to the user. The area is a certain distance from the user's eyes and is located on the side of the AR device away from the user's eyes.
Finally, the photographing camera is controlled to take a current picture according to the current gaze focus area, where the current picture corresponds to the current gaze focus area.
Generally, the pictures taken by the AR device are of external scenes several meters or even tens of meters away, while the straight-line distance between the camera on the AR device and the user's eyes is generally only about 1 cm. The camera and the user's eyes can therefore be regarded as coincident; that is, the field of view of the camera coincides with the field of view of the eyes.
The current field of view of the photographing camera is determined according to the current gaze focus area, and the photographing camera is controlled to take a picture within the current field of view.
According to the photographing method provided by the embodiments of the present application, the current field of view of the photographing camera is determined according to the user's current gaze focus area obtained by the eye tracking camera, and the picture is taken accordingly. In this scheme, the photographing camera does not need to send a preview picture to the display system of the AR device, and the display system does not need to project a preview picture into the user's eyes; the preview process is thus omitted, memory space is saved, and the power consumption caused by transmitting preview pictures is eliminated.
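For concreteness, the overall control flow described above can be sketched as follows. This is a minimal sketch, not the claimed implementation: the objects and functions (et_camera, photo_camera, gaze_focus_area, and so on) are hypothetical placeholders rather than the API of any real AR SDK.

```python
# A minimal sketch of the gaze-driven photographing flow; all device
# interfaces here are hypothetical assumptions, not from the patent.

def take_photo(et_camera, photo_camera, mapping_fn, interocular_dist):
    # Step 1: the eye tracking camera captures a photograph of the user's eyes.
    eye_photo = et_camera.capture()

    # Step 2: extract per-eye rotation data (pixel offsets from the calibrated
    # origin) and derive the current gaze focus area; gaze_focus_area would be
    # built from the f-function and triangulation sketched further below.
    rotation_data = et_camera.extract_rotation(eye_photo)
    focus_area = gaze_focus_area(rotation_data, mapping_fn, interocular_dist)

    # Step 3: set the photographing range and angle from the focus area and
    # shoot directly -- no preview frame is ever sent to the display system.
    photo_camera.set_field_of_view(center=focus_area.center, size=focus_area.size)
    return photo_camera.shoot()
```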
In some embodiments of the present application, when the AR device obtains the user's current gaze focus area from the eye photograph, it may first determine a mapping function, then acquire current rotation data, and finally obtain the current gaze focus area from the current rotation data, the interocular distance (i.e., the user's pupillary distance), and the mapping function, where the mapping function corresponds to the relationship between eyeball rotation data (which may include a rotation distance and/or a rotation angle) and the sight angle.
In a specific embodiment, the AR device obtaining the current gaze focus area of the user's eyes from the eye photograph includes: determining a mapping function corresponding to the relationship between eyeball rotation data and sight angle; acquiring current rotation data; and acquiring the current gaze focus area according to the interocular distance, the current rotation data, and the mapping function. Further, determining the mapping function includes: acquiring the field angle; selecting the four corners and the center of the field angle as a first point, a second point, a third point, a fourth point, and a center point, respectively; marking the position in the eye tracking camera image when the eyeball is focused on the center point as the origin; recording, for each eyeball, the number of pixels offset from the origin in the eye tracking camera image when the eyeball rotates as the user gazes at the first, second, third, and fourth points; and calculating the correspondence between the offset pixel count and the eyeball rotation angle.
In one possible specific embodiment, referring to Fig. 2, the first, second, third, and fourth points correspond to points B, C, D, and E in Fig. 2, respectively, and the center point corresponds to point A. Both eyeballs are calibrated separately: the position in the eye tracking camera image when the eyeball is focused on the center point is marked as the origin, the user's left eye is denoted P_L, and the right eye P_R. For P_L and P_R, the number of pixels by which the gaze position shifts from the origin in the eye tracking camera image as the eyeball rotates can be marked out by the eye tracking camera, from which the included angle between the extension line of the user's gaze and the center of the user's field of view (which may be the sight angle or half of the sight angle) is calculated. Moreover, during calibration the eyes should look into the distance, so that the rendered calibration points appear far away and the gaze extension lines of the two eyes are approximately parallel.
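As a rough illustration, the calibration pass described above might be organized as follows; the rendering and camera interfaces (render_marker, pupil_position) are assumptions made for the sketch, not interfaces defined by the application.

```python
# A sketch of the five-point calibration: for each eye, the pupil position
# when fixating the center point A is taken as the origin, and the pixel
# offsets when fixating the corner points B, C, D, E (the first to fourth
# points) are recorded. All interfaces here are hypothetical.

CORNER_POINTS = ["first", "second", "third", "fourth"]   # B, C, D, E in Fig. 2

def calibrate(et_camera, render_marker):
    render_marker("center")                               # point A, rendered far away
    origin = {eye: et_camera.pupil_position(eye) for eye in ("left", "right")}

    offsets = {"left": {}, "right": {}}
    for point in CORNER_POINTS:
        render_marker(point)                              # corner marker, far away
        for eye in ("left", "right"):
            px, py = et_camera.pupil_position(eye)
            ox, oy = origin[eye]
            offsets[eye][point] = (px - ox, py - oy)      # pixels offset from origin
    return origin, offsets
```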
Specifically, calculating the correspondence between the offset pixel count and the eyeball rotation angle includes: obtaining relevant line segments and included angles through the first point, the second point, the third point, the fourth point, and the origin; recording points other than the first point, the second point, the third point, the fourth point, and the origin as other gaze points; and calculating, based on the relevant line segments and included angles, the correspondence between the number of pixels offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at the other gaze points.
Further, the calculating, based on the relevant line segments and included angles, the correspondence between the number of pixels offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at the other gaze points includes: recording the line segment from each eyeball of the user to the origin as a first line segment; recording the line segments from each eyeball of the user to the first point, the second point, the third point, and the fourth point as a second, third, fourth, and fifth line segment, respectively; acquiring the included angles between the second, third, fourth, and fifth line segments and the first line segment, respectively; and calculating the correspondence between the number of pixels offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at the other gaze points, according to the included angles and the number of pixels each eyeball is offset from the origin in the eye tracking camera image when the user gazes at each of the points.
In one embodiment, (Δx, Δy) denotes the number of pixels by which the gaze position is offset from the origin in the eye tracking camera image when each eyeball rotates to gaze at the first, second, third, or fourth point. The correspondence between the pixel offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at other gaze points forms an f-function, namely Δθ = f(Δx, Δy). Different eye tracking cameras produce different f-functions, which may be linear or nonlinear. The correspondences for the other gaze points can be obtained from their different pixel offsets from the origin in the eye tracking camera image, which in turn yields the specific pixel offset when the user rotates the eyeballs to shift the gaze point from one point to another, so that the captured picture accurately matches where the user's eyes are looking.
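For the linear case, such an f-function can be fitted from the calibration samples by least squares. The sketch below is illustrative only: the field-of-view and pixel numbers are invented, and a camera with strong distortion would need a nonlinear model, as noted above.

```python
import numpy as np

def fit_linear_f(pixel_offsets, sight_angles):
    """Fit delta_theta = f(dx, dy) as a 2x2 linear map by least squares.

    pixel_offsets: (N, 2) pixel offsets (dx, dy) from the origin;
    sight_angles:  (N, 2) known sight angles (theta_x, theta_y) in degrees.
    """
    X = np.asarray(pixel_offsets, dtype=float)
    Y = np.asarray(sight_angles, dtype=float)
    M, *_ = np.linalg.lstsq(X, Y, rcond=None)   # origin maps to zero angle
    def f(dx, dy):
        tx, ty = np.array([dx, dy], dtype=float) @ M
        return float(tx), float(ty)
    return f

# Illustrative: a 40 x 30 degree field of view whose corners show up at
# +/-120 and +/-90 pixels of pupil offset in the eye tracking camera image.
corners_px  = [(120, 90), (-120, 90), (-120, -90), (120, -90)]
corners_deg = [(20, 15), (-20, 15), (-20, -15), (20, -15)]
f = fit_linear_f(corners_px, corners_deg)
print(f(60, 45))   # (10.0, 7.5) under this linear model
```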
In a possible implementation, the position of the binocular gaze point relative to the two eyes can be calculated from trigonometric functions and the interocular distance, and the gaze focus area of the two eyes can be calculated accordingly. In one possible embodiment, referring to Fig. 3, θ_L denotes the included angle between the line segment from the left eyeball to the gaze point and the first line segment, and θ_R denotes the included angle between the line segment from the right eyeball to the gaze point and the first line segment.
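The gaze point itself then follows from intersecting the two gaze rays. The sketch below assumes a simplified planar geometry, with the eyes on the x-axis separated by L, the first line segments pointing straight ahead, and inward rotation counted positive; this coordinate convention is an illustrative assumption, not something prescribed by Fig. 3.

```python
import math

# Triangulating the binocular gaze point from theta_L, theta_R, and the
# interocular distance L (illustrative planar geometry; left eye at (0, 0),
# right eye at (L, 0), both first line segments pointing along +y).

def gaze_point(theta_l_deg, theta_r_deg, L):
    tl = math.tan(math.radians(theta_l_deg))
    tr = math.tan(math.radians(theta_r_deg))
    # Left gaze ray:  x = y * tl        Right gaze ray: x = L - y * tr
    y = L / (tl + tr)                   # depth of the intersection point
    x = y * tl                          # lateral position of the gaze point
    return x, y

# Eyes 6.4 cm apart, each rotated 2 degrees inward: the gaze point sits
# about 91.6 cm ahead, midway between the eyes.
print(gaze_point(2.0, 2.0, 6.4))        # ~ (3.2, 91.6)
```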
In another possible embodiment, the eye tracking camera may also calibrate the number of pixels offset in the eye tracking camera image caused by the rotation of each of the user's eyeballs based on the field angle of the display module; before this, the user's interocular distance is given. As shown in Fig. 3, the interocular distance of the user is L.
It should be noted that an existing camera must transmit data to the CPU at a specific frame rate, generally 30 Hz or 60 Hz on current mobile phone platforms; meanwhile, the data transmission occupies a memory buffer the size of an image in the CPU, which generates significant resource overhead (power consumption and memory) throughout the transmission process. As shown in Fig. 4, the photographing method of the AR device according to the present application saves this operation process. The user sees the outside world directly through the screen, so the consumption of system resources is reduced; as shown in the lower part of Fig. 4, the user's gaze point is captured by the eye tracking camera (i.e., the ET Camera) for accurate focusing to complete the photographing.
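A back-of-the-envelope calculation makes the saved overhead concrete; the 1080p, 3-bytes-per-pixel, 30 Hz figures below are illustrative assumptions, not measurements from the application.

```python
# Memory traffic of a continuously streamed preview (illustrative numbers):
# a 1920x1080 RGB frame is ~6.2 MB, so a 30 Hz preview stream moves ~187 MB/s
# through the CPU -- all of which is avoided when the photo is gaze-triggered.

width, height, bytes_per_pixel, fps = 1920, 1080, 3, 30
frame_bytes = width * height * bytes_per_pixel            # 6,220,800 bytes
print(f"per frame:  {frame_bytes / 1e6:.1f} MB")          # ~6.2 MB
print(f"per second: {frame_bytes * fps / 1e6:.0f} MB/s")  # ~187 MB/s
```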
In summary, the photographing camera is controlled to take the current picture according to the user's current gaze focus area, which is a new resource-saving photographing mode. The eye tracking camera captures a photograph of the user's eyes, the AR device obtains the current gaze focus area of the user's eyes from the eye photograph, and the photographing camera is controlled to take the current picture according to the current gaze focus area, where the current picture corresponds to the current gaze focus area. The picture preview process and the process of the user manually selecting a focusing area are omitted, saving computing resources, power consumption, and memory, improving the user experience, and enhancing the user's sense of immersion when photographing.
In some exemplary embodiments, an AR device is also provided. As shown in Fig. 5, the AR device comprises an eye tracking camera 501 and a photographing camera 502; the eye tracking camera is used to capture a photograph of the user's eyes, and the AR device is used to obtain the current gaze focus area of the user's eyes from the eye photograph and control the photographing camera to take a current picture according to the current gaze focus area, where the current picture corresponds to the current gaze focus area.
Preferably, the AR device further comprises a control module 503 and a display module 504, and the eye tracking camera 501, the photographing camera 502, the control module 503, and the display module 504 work cooperatively.
It should be further noted that the augmented reality system provided in the embodiments of the present application may acquire and process relevant data based on artificial intelligence techniques. Artificial Intelligence (AI) is the theory, method, technique, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain the best results. Artificial intelligence infrastructure generally includes technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technologies, operation/interaction systems, and mechatronics. Artificial intelligence software technologies mainly include computer vision, robotics, biometric recognition, speech processing, natural language processing, and machine learning/deep learning.
Reference is now made to Fig. 6, which is a schematic diagram illustrating an AR device according to some embodiments of the present application. As shown in Fig. 6, the AR device 2 includes a processor 200, a memory 201, a bus 202, and a communication interface 203, where the processor 200, the communication interface 203, and the memory 201 are connected through the bus 202. The memory 201 stores a computer program that can be executed on the processor 200, and when executing the computer program, the processor 200 performs the photographing method of the AR device provided in any of the foregoing embodiments.
The memory 201 may include a high-speed random access memory (RAM) and may further include a non-volatile memory, such as at least one disk memory. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 203 (which may be wired or wireless), and may use the Internet, a wide area network, a local area network, a metropolitan area network, and the like.
Bus 202 can be an ISA bus, PCI bus, EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. The memory 201 is used for storing a program, and the processor 200 executes the program after receiving the execution instruction, and the photographing method of the AR device disclosed by any of the foregoing embodiments of the present application may be applied to the processor 200, or implemented by the processor 200.
The processor 200 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 200. The processor 200 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EPROM, or a register. The storage medium is located in the memory 201, and the processor 200 reads the information in the memory 201 and completes the steps of the method in combination with its hardware.
The AR device provided by the embodiments of the present application is based on the same inventive concept as the photographing method of the AR device provided by the embodiments of the present application, and has the same beneficial effects as the method it adopts, runs, or implements.
Referring to Fig. 7, the computer-readable storage medium shown in Fig. 7 is an optical disc 30 on which a computer program (i.e., a program product) is stored; when executed by a processor, the computer program performs the photographing method of the AR device provided in any of the foregoing embodiments.
In addition, examples of the computer-readable storage medium may also include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory, and other optical and magnetic storage media, which are not described in detail herein.
The computer-readable storage medium provided by the above embodiments of the present application is based on the same inventive concept as the photographing method of the AR device provided by the embodiments of the present application, and has the same beneficial effects as the method adopted, run, or implemented by the application program it stores.
Embodiments of the present application further provide a computer program product, which includes a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the photographing method of the AR device provided in any of the foregoing embodiments.
It should be noted that: the algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose devices may be used with the teachings herein. The required structure for constructing such a device will be apparent from the description above. In addition, this application is not directed to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the present application as described herein, and any descriptions of specific languages are provided above to disclose the best modes of the present application. In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the application and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification, and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except that at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification may be replaced by an alternative feature serving the same, equivalent or similar purpose, unless expressly stated otherwise.
The various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in the creation apparatus of a virtual machine according to embodiments of the present application. The present application may also be embodied as an apparatus or device program for carrying out a portion or all of the methods described herein. A program implementing the application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
The above description is only for the preferred embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A photographing method of an AR device, wherein the AR device comprises an eye tracking camera and a photographing camera, and the photographing method comprises the following steps:
the eye tracking camera capturing a photograph of a user's eyes;
the AR device obtaining a current gaze focus area of the user's eyes from the eye photograph;
and controlling the photographing camera to take a current picture according to the current gaze focus area, wherein the current picture corresponds to the current gaze focus area.
2. The photographing method according to claim 1, wherein the controlling the photographing camera to take a current picture according to the current gaze focus area comprises:
determining a photographing range and a photographing angle of the photographing camera according to the current gaze focus area;
and controlling the photographing camera to shoot according to the photographing range and the photographing angle to obtain the current picture.
3. The photographing method according to claim 1 or 2, wherein the AR device obtaining a current gaze focus area of the user's eyes from the eye photograph comprises:
determining a mapping function corresponding to the relationship between eyeball rotation data and sight angle;
acquiring current rotation data;
and acquiring the current gaze focus area according to the interocular distance, the current rotation data, and the mapping function.
4. The photographing method according to claim 3, wherein the determining a mapping function comprises:
acquiring a field angle;
selecting the four corners and the center of the field angle as a first point, a second point, a third point, a fourth point, and a center point, respectively;
marking the position in the eye tracking camera image when the eyeball is focused on the center point as the origin;
recording, for each eyeball, the number of pixels offset from the origin in the eye tracking camera image when the eyeball rotates as the user gazes at the first point, the second point, the third point, and the fourth point, respectively;
and calculating the correspondence between the offset pixel count and the eyeball rotation angle.
5. The photographing method according to claim 4, wherein calculating the correspondence between the offset pixel count and the eyeball rotation angle comprises:
obtaining relevant line segments and included angles through the first point, the second point, the third point, the fourth point, and the origin;
recording points other than the first point, the second point, the third point, the fourth point, and the origin as other gaze points;
and calculating, based on the relevant line segments and included angles, the correspondence between the number of pixels offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at the other gaze points.
6. The photographing method according to claim 5, wherein the calculating, based on the relevant line segments and included angles, the correspondence between the number of pixels offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at the other gaze points comprises:
recording the line segment from each eyeball of the user to the origin as a first line segment;
recording the line segments from each eyeball of the user to the first point, the second point, the third point, and the fourth point as a second line segment, a third line segment, a fourth line segment, and a fifth line segment, respectively;
acquiring the included angles between the second, third, fourth, and fifth line segments and the first line segment, respectively;
and calculating the correspondence between the number of pixels offset in the eye tracking camera image and the eyeball rotation angle when each eyeball gazes at the other gaze points, according to the included angles and the number of pixels each eyeball is offset from the origin in the eye tracking camera image when the user gazes at each of the points.
7. An AR device comprising an eye tracking camera and a photographing camera, wherein the eye tracking camera is configured to capture a photograph of a user's eyes;
and the AR device is configured to obtain a current gaze focus area of the user's eyes from the eye photograph and to control the photographing camera to take a current picture according to the current gaze focus area, wherein the current picture corresponds to the current gaze focus area.
8. The AR device of claim 7, further comprising a control module and a display module.
9. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, carries out the steps of the method of any one of claims 1 to 6.
10. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 6.
CN202111444241.1A 2021-11-30 2021-11-30 Photographing method of AR equipment and AR equipment Active CN114302054B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111444241.1A CN114302054B (en) 2021-11-30 2021-11-30 Photographing method of AR equipment and AR equipment
PCT/CN2021/138585 WO2023097791A1 (en) 2021-11-30 2021-12-16 Photographing method of ar device and ar device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111444241.1A CN114302054B (en) 2021-11-30 2021-11-30 Photographing method of AR equipment and AR equipment

Publications (2)

Publication Number Publication Date
CN114302054A (en) 2022-04-08
CN114302054B (en) 2023-06-20

Family

ID=80966029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111444241.1A Active CN114302054B (en) 2021-11-30 2021-11-30 Photographing method of AR equipment and AR equipment

Country Status (2)

Country Link
CN (1) CN114302054B (en)
WO (1) WO2023097791A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108881724A (en) * 2018-07-17 2018-11-23 北京七鑫易维信息技术有限公司 A kind of image acquiring method, device, equipment and storage medium
CN110177210A (en) * 2019-06-17 2019-08-27 Oppo广东移动通信有限公司 Photographic method and relevant apparatus
CN110225252A (en) * 2019-06-11 2019-09-10 Oppo广东移动通信有限公司 Camera control method and Related product
CN111225157A (en) * 2020-03-03 2020-06-02 Oppo广东移动通信有限公司 Focus tracking method and related equipment
CN111880654A (en) * 2020-07-27 2020-11-03 歌尔光学科技有限公司 Image display method and device, wearable device and storage medium
CN112738388A (en) * 2019-10-28 2021-04-30 七鑫易维(深圳)科技有限公司 Photographing processing method and system, electronic device and storage medium
CN113395438A (en) * 2020-03-12 2021-09-14 Oppo广东移动通信有限公司 Image correction method and related device for eyeball tracking technology
CN113420678A (en) * 2021-06-25 2021-09-21 阿波罗智联(北京)科技有限公司 Gaze tracking method, device, apparatus, storage medium, and computer program product

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000201289A (en) * 1999-01-07 2000-07-18 Sony Corp Image input-output device and image acquiring method
KR102345652B1 (en) * 2015-06-26 2021-12-30 삼성전자주식회사 View finder apparatus and method for the same
WO2018107566A1 (en) * 2016-12-16 2018-06-21 华为技术有限公司 Processing method and mobile device
KR20210137831A (en) * 2020-05-11 2021-11-18 삼성전자주식회사 Electronic apparatus and operaintg method thereof


Also Published As

Publication number Publication date
WO2023097791A1 (en) 2023-06-08
CN114302054B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
US11632537B2 (en) Method and apparatus for obtaining binocular panoramic image, and storage medium
US10859840B2 (en) Graphics rendering method and apparatus of virtual reality
US9978118B1 (en) No miss cache structure for real-time image transformations with data compression
US20170177082A1 (en) Stabilization Plane Determination Based On Gaze Location
US10242654B2 (en) No miss cache structure for real-time image transformations
KR102096730B1 (en) Image display method, method for manufacturing irregular screen having curved surface, and head-mounted display device
CN110263657A (en) A kind of human eye method for tracing, device, system, equipment and storage medium
US9652887B2 (en) Object oriented image processing and rendering in a multi-dimensional space
JP2016533686A (en) Method and apparatus for generating an omnifocal image
CN109002796A (en) A kind of image-pickup method, device and system and electronic equipment
CN114742703A (en) Method, device and equipment for generating binocular stereoscopic panoramic image and storage medium
CN111163303A (en) Image display method, device, terminal and storage medium
CN111476151A (en) Eyeball detection method, device, equipment and storage medium
KR101818839B1 (en) Apparatus and method of stereo scopic 3d contents creation and display
US10417738B2 (en) System and method for displaying graphical effects based on determined facial positions
CN111597963B (en) Light supplementing method, system and medium for face in image and electronic equipment
CN116524022B (en) Offset data calculation method, image fusion device and electronic equipment
CN109788199B (en) Focusing method suitable for terminal with double cameras
CN114302054A (en) AR device photographing method and AR device
CN116977804A (en) Image fusion method, electronic device, storage medium and computer program product
Narducci et al. Enabling consistent hand-based interaction in mixed reality by occlusions handling
CN115278203A (en) Calibration method and calibration device for virtual reality equipment and calibration robot
EP4097687A1 (en) A method for generating a 3d model
Wetzstein Augmented and virtual reality
CN109741465A (en) Image processing method and device, display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221116

Address after: No. 500 Songling Road, Laoshan District, Qingdao City, Shandong Province, 266100

Applicant after: GOERTEK TECHNOLOGY Co.,Ltd.

Address before: 261000 plant 1, phase III, goer Photoelectric Industrial Park, No. 3999, Huixian Road, Yongchun community, Qingchi street, high tech Zone, Weifang City, Shandong Province

Applicant before: GoerTek Optical Technology Co.,Ltd.

GR01 Patent grant