CN110728622B - Fisheye image processing method, device, electronic equipment and computer readable medium - Google Patents


Info

Publication number
CN110728622B
Authority
CN
China
Prior art keywords
cylindrical projection
mapping relation
determining
coordinates
viewing angle
Prior art date
Legal status
Active
Application number
CN201911007209.XA
Other languages
Chinese (zh)
Other versions
CN110728622A (en)
Inventor
田池
Current Assignee
Zhuhai Vyagoo Technology Co ltd
Original Assignee
Zhuhai Vyagoo Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Vyagoo Technology Co ltd
Priority to CN201911007209.XA
Publication of CN110728622A
Application granted
Publication of CN110728622B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/04 Context-preserving transformations, e.g. by using an importance map
    • G06T3/047 Fisheye or wide-angle transformations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/12 Panospheric to cylindrical image transformations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a fisheye image processing method and device, electronic equipment and a computer storage medium. The method comprises the following steps: acquiring a current viewing angle for a fisheye image to be processed; when the current viewing angle changes, determining, through a fisheye camera imaging model, a first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding part of the cylindrical projection coordinates; determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation; and determining and displaying a target cylindrical projection map corresponding to the fisheye image to be processed based on a second mapping relation and all the cylindrical projection coordinates. With the scheme provided by the invention, when the current viewing angle changes, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points and the corresponding part of the cylindrical projection coordinates can be determined without running the model pixel by pixel, which reduces the amount of computation of the model.

Description

Fisheye image processing method, device, electronic equipment and computer readable medium
Technical Field
The invention relates to the technical field of image processing, in particular to a fisheye image processing method, a fisheye image processing device, electronic equipment and a computer readable medium.
Background
In the prior art, the cylindrical projection image corresponding to a fisheye image is determined based on the imaging model of the fisheye camera. Specifically, the mapping relation between the cylindrical projection coordinates and the fisheye image coordinates is calculated pixel by pixel for every pixel of the fisheye image, and the cylindrical projection image corresponding to the fisheye image is then determined based on this mapping relation. Because the mapping relation is calculated pixel by pixel, the amount of computation is very large, and it is difficult for this method to meet the real-time requirements of low-power-consumption devices in particular.
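For illustration only, the following is a minimal sketch (Python with NumPy; not part of the patent) of the prior-art approach described above: the imaging model is evaluated once for every output pixel, so the cost grows with the full resolution of the cylindrical projection. The equidistant model r = f * theta and the cylindrical parameterisation are assumptions made for the sketch.

```python
import numpy as np

def naive_cylindrical_map(out_w, out_h, fov, focal, cx, cy):
    """Prior-art style mapping: evaluate the fisheye imaging model for every
    pixel of the cylindrical projection (out_w * out_h evaluations)."""
    map_x = np.empty((out_h, out_w), dtype=np.float32)
    map_y = np.empty((out_h, out_w), dtype=np.float32)
    for v in range(out_h):
        for u in range(out_w):
            phi = (u / out_w - 0.5) * fov            # horizontal angle on the cylinder
            h = (v / out_h - 0.5) * fov              # vertical coordinate (assumed)
            x, y, z = np.sin(phi), h, np.cos(phi)    # ray direction in camera space
            theta = np.arccos(z / np.sqrt(x * x + y * y + z * z))
            r = focal * theta                        # equidistant fisheye model (assumed)
            norm = np.hypot(x, y) + 1e-12
            map_x[v, u] = cx + r * x / norm          # fisheye image x coordinate
            map_y[v, u] = cy + r * y / norm          # fisheye image y coordinate
    return map_x, map_y
```

Every frame and every viewing-angle change repeats all out_w * out_h trigonometric evaluations, which is the computational burden the scheme below avoids.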
Disclosure of Invention
The present invention aims to solve at least one of the above technical drawbacks and improve the data processing efficiency. The technical scheme adopted by the invention is as follows:
in a first aspect, the present invention provides a fisheye image processing method, the method comprising:
acquiring a current viewing angle for a fish-eye image to be processed;
when the current viewing angle changes relative to the initial viewing angle, determining, through a fisheye camera imaging model, a first mapping relation between fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding part of the cylindrical projection coordinates;
determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation;
and determining and displaying a target cylindrical projection diagram corresponding to the fish-eye image to be processed based on the second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing angle and the screen vertex coordinates.
In an embodiment of the first aspect of the present invention, determining that the current viewing angle changes relative to the initial viewing angle includes:
acquiring a sliding operation of a user on a screen corresponding to an initial viewing angle;
based on the sliding operation, it is determined that the current viewing angle changes relative to the initial viewing angle.
In an embodiment of the first aspect of the present invention, the part of the pixel points is a set number of pixel points.
In an embodiment of the first aspect of the present invention, determining all cylindrical projection coordinates corresponding to a fisheye image to be processed based on a first mapping relationship includes:
Based on the first mapping relation, determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed;
and determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation and the third mapping relation.
In an embodiment of the first aspect of the present invention, determining a target cylindrical projection map corresponding to a fisheye image to be processed based on the second mapping relationship and all cylindrical projection coordinates includes:
and rendering all cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain a target cylindrical projection graph.
In an embodiment of the first aspect of the present invention, the second mapping relationship is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of an initial fisheye image;
a second mapping relationship is determined based on the initial cylindrical projection coordinates and the screen vertex coordinates.
In an embodiment of the first aspect of the present invention, the initial viewing angle is a set angle.
In a second aspect, the present invention provides a fisheye image processing device comprising:
The view angle acquisition module is used for acquiring the current viewing angle of the fish-eye image to be processed;
the first mapping relation determining module is used for determining a first mapping relation between fish-eye image coordinates corresponding to partial pixel points in the fish-eye image to be processed and corresponding partial cylindrical projection coordinates through a fish-eye camera imaging model when the current viewing angle changes relative to the initial viewing angle;
the cylindrical projection coordinate determining module is used for determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation;
and the target cylindrical projection diagram determining module is used for determining and displaying a target cylindrical projection diagram corresponding to the fish-eye image to be processed based on a second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing angle and the screen vertex coordinates.
In an embodiment of the second aspect of the present invention, when determining that the current viewing angle changes relative to the initial viewing angle, the first mapping relationship determining module is specifically configured to:
acquiring a sliding operation of a user on a screen corresponding to an initial viewing angle;
Based on the sliding operation, it is determined that the current viewing angle changes relative to the initial viewing angle.
In an embodiment of the second aspect of the present invention, the part of the pixel points is a set number of pixel points.
Optionally, the cylindrical projection coordinate determining module is specifically configured to, when determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relationship:
based on the first mapping relation, determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed;
and determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation and the third mapping relation.
In an embodiment of the second aspect of the present invention, when determining the target cylindrical projection map corresponding to the fisheye image to be processed based on the second mapping relationship and all the cylindrical projection coordinates, the target cylindrical projection map determining module is specifically configured to:
and rendering all cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain a target cylindrical projection graph.
In an embodiment of the second aspect of the present invention, the second mapping relation is determined by:
Acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of an initial fisheye image;
a second mapping relationship is determined based on the initial cylindrical projection coordinates and the screen vertex coordinates.
In an embodiment of the second aspect of the present invention, the initial viewing angle is a set angle.
In a third aspect, the present invention provides an electronic device comprising:
a processor and a memory;
a memory for storing computer operating instructions;
a processor for performing the method as shown in any of the embodiments of the first aspect of the invention by invoking computer operating instructions.
In a fourth aspect, the invention provides a computer readable medium having stored thereon at least one instruction, at least one program, code set or instruction set, the at least one instruction, at least one program, code set or instruction set being loaded and executed by a processor to implement a method as shown in any embodiment of the first aspect of the invention.
The technical scheme provided by the embodiment of the invention has the beneficial effects that:
According to the fisheye image processing method, device, electronic equipment and computer readable medium provided by the embodiments of the invention, when the current viewing angle changes relative to the initial viewing angle, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding part of the cylindrical projection coordinates can be determined through the fisheye camera imaging model, and all cylindrical projection coordinates corresponding to the fisheye image to be processed can then be determined based on the first mapping relation. The first mapping relation therefore does not need to be calculated pixel by pixel through the model, which reduces the amount of computation of the model, improves the calculation efficiency, and makes it possible to meet the requirements of low-power-consumption devices.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings that are required to be used in the description of the embodiments of the present invention will be briefly described below.
Fig. 1 is a schematic flow chart of a fisheye image processing method according to an embodiment of the invention;
fig. 2 is a schematic structural diagram of a fisheye image processing device according to an embodiment of the invention;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. Although certain embodiments of the invention are shown in the drawings, it should be understood that the invention may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the invention. It should be understood that the drawings and embodiments of the invention are for illustration purposes only and are not intended to limit the scope of the present invention.
It should be understood that the various steps recited in the method embodiments of the present invention may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the invention is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like herein are merely used for distinguishing between devices, modules, or units and not necessarily for defining the order in which such devices, modules, or units perform their functions or are interdependent.
It should be noted that references to "a" and "a plurality" in this disclosure are illustrative rather than limiting, and those skilled in the art will appreciate that they should be construed as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the devices in the embodiments of the present invention are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Field angle (angle of view): in an optical instrument, the included angle formed, with the lens of the instrument as the vertex, by the two edges of the maximum range of the lens through which the image of the measured object can pass.
OpenGL (Open Graphics Library): a cross-language, cross-platform application programming interface for rendering 2D and 3D vector graphics. The interface consists of nearly 350 different function calls used to draw scenes ranging from simple graphics primitives to complex three-dimensional scenes.
The following describes the technical scheme of the present invention and how the technical scheme of the present invention solves the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
In view of the above technical problems, a fisheye image processing method provided in an embodiment of the present invention, as shown in fig. 1, may include:
step S110, a current viewing angle for the fish-eye image to be processed is acquired.
Specifically, the fisheye image to be processed refers to an image for which the corresponding target cylindrical projection map needs to be determined, and it is an image shot by a fisheye camera. The current viewing angle refers to the included angle formed, with the lens of the fisheye camera corresponding to the fisheye image to be processed as the vertex, by the two edges of the maximum range of the lens through which the image of the measured object can pass.
Step S120, when the current viewing angle is changed relative to the initial viewing angle, determining a first mapping relationship between the fisheye image coordinates corresponding to the partial pixel points in the fisheye image to be processed and the corresponding partial cylindrical projection coordinates through the fisheye camera imaging model.
Specifically, a change of the current viewing angle means that the field angle becomes larger or smaller relative to the initial viewing angle. The initial viewing angle may be a preset field angle, and the fisheye image corresponding to the initial viewing angle is shot before the fisheye image corresponding to the current viewing angle.
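The patent does not fix a particular imaging model, so the sketch below (Python/NumPy, illustrative only) assumes an equidistant model r = f * theta and a simple cylindrical parameterisation; it evaluates the model only at a coarse grid of sample points, i.e. the set number of pixel points that define the first mapping relation.

```python
import numpy as np

def sparse_first_mapping(grid_w, grid_h, fov, yaw, focal, cx, cy):
    """First mapping relation, sampled: evaluate the fisheye imaging model only
    at grid_w * grid_h sample points instead of at every output pixel.
    The equidistant model and the yaw offset derived from the current viewing
    angle are illustrative assumptions."""
    u = np.linspace(-0.5, 0.5, grid_w, dtype=np.float32)
    v = np.linspace(-0.5, 0.5, grid_h, dtype=np.float32)
    uu, vv = np.meshgrid(u, v)               # shape (grid_h, grid_w)
    phi = uu * fov + yaw                     # horizontal angle shifted by the view change
    x, y, z = np.sin(phi), vv * fov, np.cos(phi)
    theta = np.arccos(z / np.sqrt(x * x + y * y + z * z))
    r = focal * theta                        # equidistant fisheye model (assumed)
    norm = np.hypot(x, y) + 1e-12
    grid_map_x = cx + r * x / norm           # sparse fisheye-image x coordinates
    grid_map_y = cy + r * y / norm           # sparse fisheye-image y coordinates
    return grid_map_x, grid_map_y
```

For a 1280x720 output, a 33x17 grid needs 561 model evaluations instead of 921,600, which is where the reduction in computation comes from.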
Step S130, determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation.
Specifically, the fisheye image coordinates of all pixel points in the same fisheye image and their corresponding cylindrical projection coordinates share the same mapping relation. Therefore, based on the first mapping relation, the mapping between the fisheye image coordinates of the pixel points other than the sampled part and their cylindrical projection coordinates does not need to be calculated pixel by pixel; all cylindrical projection coordinates corresponding to the fisheye image to be processed can be determined from the first mapping relation, which reduces the amount of computation.
And step S140, determining and displaying a target cylindrical projection diagram corresponding to the fish-eye image to be processed based on the second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing angle and the screen vertex coordinates.
Specifically, determining the target cylindrical projection map corresponding to the fisheye image from its cylindrical projection coordinates can be implemented in a manner known in the prior art, which is not described here in detail. Compared with the prior-art method, the scheme improves data processing efficiency and keeps the algorithm real-time when the current viewing angle changes.
According to the scheme in the embodiment of the invention, when the current viewing angle changes relative to the initial viewing angle, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding part of the cylindrical projection coordinates can be determined through the fisheye camera imaging model, and all cylindrical projection coordinates corresponding to the fisheye image to be processed can then be determined based on the first mapping relation. The first mapping relation does not need to be calculated pixel by pixel through the model, which reduces the amount of computation of the model, improves the calculation efficiency, and makes it possible to meet the requirements of low-power-consumption devices.
In an embodiment of the present invention, determining that the current viewing angle changes relative to the initial viewing angle includes:
acquiring a sliding operation of a user on a screen corresponding to an initial viewing angle;
based on the sliding operation, it is determined that the current viewing angle changes relative to the initial viewing angle.
Specifically, the sliding operation refers to an operation performed by the user on the screen corresponding to the initial viewing angle. For example, if the initial viewing angle is A and the angle after the sliding operation is received is B, with A and B being different, then the sliding operation changes the viewing angle; the current viewing angle of the fisheye image to be processed can therefore be determined based on the sliding operation. It can be understood that the fisheye image corresponding to the sliding operation may be used as the fisheye image to be processed.
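As a minimal, hedged sketch of how such a sliding operation might be turned into an updated viewing angle (the sensitivity value and the wrap-around behaviour are assumptions, not taken from the patent):

```python
def update_viewing_angle(current_yaw_deg, slide_dx_px, screen_w_px,
                         degrees_per_screen=180.0):
    """Map a horizontal slide (in pixels) to a new viewing angle.
    degrees_per_screen is an assumed sensitivity: a full-width swipe
    pans the view by that many degrees."""
    delta = slide_dx_px / screen_w_px * degrees_per_screen
    changed = slide_dx_px != 0          # any slide means the viewing angle changed
    return (current_yaw_deg + delta) % 360.0, changed
```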
In the embodiment of the invention, part of the pixel points are the pixel points with the set number.
Specifically, the part of the pixel points may be a set number of pixel points. The set number can be configured based on actual requirements, so that the calculation requirements of fisheye images of different sizes can be met.
In the embodiment of the present invention, in step S130, determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relationship may include:
Based on the first mapping relation, determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed;
and determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation and the third mapping relation.
Specifically, the first mapping relation is the mapping relation, established through the fisheye camera imaging model, between the fisheye image coordinates corresponding to the part of the pixel points in the fisheye image to be processed and the corresponding part of the cylindrical projection coordinates. For the cylindrical projection coordinates corresponding to the other pixel points in the fisheye image to be processed, the mapping relation between their fisheye image coordinates and cylindrical projection coordinates can be determined directly based on the first mapping relation, in ways other than through the model, so that the model does not need to be evaluated pixel by pixel and its amount of computation is reduced.
In an embodiment of the present invention, one implementation of determining, based on the first mapping relation, the third mapping relation between the fisheye image coordinates and cylindrical projection coordinates corresponding to the pixel points other than the set number of pixel points in the fisheye image to be processed is to calculate the third mapping relation for the other coordinate points (the fisheye image coordinates and cylindrical projection coordinates corresponding to the other pixel points) by interpolation, which improves the calculation speed.
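The patent leaves the interpolation scheme open; the sketch below (Python with OpenCV, illustrative only) uses bilinear interpolation via cv2.resize to expand the sparse grid from the sketch above to the full output resolution, giving the coordinates of all remaining pixel points without re-running the imaging model.

```python
import cv2
import numpy as np

def interpolate_full_mapping(grid_map_x, grid_map_y, out_w, out_h):
    """Third mapping relation by interpolation: only the grid points were
    computed through the imaging model; every other pixel point gets its
    fisheye image coordinates by bilinear interpolation of the grid."""
    full_x = cv2.resize(grid_map_x.astype(np.float32), (out_w, out_h),
                        interpolation=cv2.INTER_LINEAR)
    full_y = cv2.resize(grid_map_y.astype(np.float32), (out_w, out_h),
                        interpolation=cv2.INTER_LINEAR)
    return full_x, full_y    # first + third mapping: coordinates for all pixel points
```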
In an embodiment of the present invention, determining a target cylindrical projection map corresponding to a fisheye image to be processed based on a second mapping relationship and all cylindrical projection coordinates includes:
and rendering all cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain a target cylindrical projection graph.
Specifically, after all cylindrical projection coordinates of the fisheye image to be processed are determined, the cylindrical projection coordinates can be projected into a screen vertex coordinate system through OpenGL based on a second mapping relation established in advance, so as to obtain a target cylindrical projection diagram corresponding to the fisheye image to be processed.
Rendering all cylindrical projection coordinates based on OpenGL means projecting all cylindrical projection coordinates into the screen vertex coordinate system through OpenGL; the specific rendering process is the standard OpenGL rendering process in the prior art and is not described here.
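The patent performs this step with OpenGL texture mapping; purely as a CPU-side stand-in for that lookup (an assumption for illustration, not the patent's implementation), the same sampling can be expressed with cv2.remap, using the full coordinate maps produced by the sketches above:

```python
import cv2

def render_cylindrical(fisheye_img, full_x, full_y):
    """CPU equivalent of the OpenGL texture lookup: sample the fisheye image
    at the mapped coordinates to obtain the target cylindrical projection."""
    return cv2.remap(fisheye_img, full_x, full_y, cv2.INTER_LINEAR)

# Illustrative usage (file name and parameter values are hypothetical):
# img = cv2.imread("fisheye.jpg")
# gx, gy = sparse_first_mapping(33, 17, fov=1.57, yaw=0.0, focal=700.0, cx=960, cy=960)
# fx, fy = interpolate_full_mapping(gx, gy, out_w=1280, out_h=720)
# out = render_cylindrical(img, fx, fy)
```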
In an embodiment of the present invention, the second mapping relationship is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of an initial fisheye image;
a second mapping relationship is determined based on the initial cylindrical projection coordinates and the screen vertex coordinates.
Specifically, the second mapping relation is determined based on the initial cylindrical projection coordinates corresponding to the initial viewing angle and the screen vertex coordinates. The second mapping relation is the same for different fisheye images, i.e., it is fixed. The screen vertex coordinates refer to the coordinates corresponding to each position on the screen.
It should be noted that the cylindrical projection coordinates, fisheye image coordinates and screen vertex coordinates mentioned above are the coordinates of the pixel points in different coordinate systems, and conversion between them can be performed with coordinate system conversion formulas, which are not described here.
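As one hedged example of what such a fixed second mapping could look like (the normalization to OpenGL-style coordinates in [-1, 1] is an assumption, not stated in the patent), cylindrical projection coordinates in pixels can be converted to screen vertex coordinates as follows:

```python
def to_screen_vertex_coords(u, v, cyl_w, cyl_h):
    """Illustrative fixed second mapping: cylindrical projection coordinates
    (u, v) in pixels -> screen vertex coordinates in [-1, 1] x [-1, 1]
    (OpenGL normalized device coordinates). Because it depends only on the
    output size, the same mapping applies to every fisheye image."""
    x_ndc = 2.0 * u / (cyl_w - 1) - 1.0
    y_ndc = 1.0 - 2.0 * v / (cyl_h - 1)   # flip: image rows grow downward
    return x_ndc, y_ndc
```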
In an embodiment of the present invention, the initial viewing angle is a set angle.
Specifically, the initial viewing angle may be configured based on actual requirements, for example as a set angle, which may be an angle that is relatively suitable for viewing the photographed object. In an embodiment of the present invention, the set angle may be 90 degrees.
The following describes the scheme of the present invention in detail by way of a specific example:
step 1: a current viewing angle for the fish-eye image to be processed is acquired.
Step 2: when the current viewing angle changes relative to the initial viewing angle, determining a first mapping relation between fisheye image coordinates corresponding to part of pixel points in the fisheye image to be processed and corresponding part of cylindrical projection coordinates through a fisheye camera imaging model.
In this example, whether the current viewing angle has changed may be determined based on the sliding operation of the user on the screen; that is, if there is a sliding operation, it can be determined that the current viewing angle has changed relative to the initial viewing angle, and if there is no sliding operation, it can be determined that the current viewing angle has not changed relative to the initial viewing angle.
The specific implementation process of the method comprises the following steps:
acquiring a sliding operation of a user on a screen corresponding to an initial viewing angle;
based on the sliding operation, it is determined that the current viewing angle changes relative to the initial viewing angle.
When the current viewing angle changes relative to the initial viewing angle, determining a first mapping relation between fisheye image coordinates corresponding to part of pixel points in the fisheye image to be processed and corresponding part of cylindrical projection coordinates through a fisheye camera imaging model.
The part of the pixel points may be a set number of pixel points. The set number can be configured based on actual requirements, so that the calculation requirements of fisheye images of different sizes can be met.
The initial viewing angle may be configured based on actual requirements, for example as a set angle, which may be an angle that is relatively suitable for viewing the photographed object. In an embodiment of the present invention, the set angle may be 90 degrees.
Step 3: and determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation.
One implementation is: based on the first mapping relation, determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed;
and determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation and the third mapping relation.
Specifically, the first mapping relation is the mapping relation, established through the fisheye camera imaging model, between the fisheye image coordinates corresponding to the part of the pixel points in the fisheye image to be processed and the corresponding part of the cylindrical projection coordinates. For the cylindrical projection coordinates corresponding to the other pixel points in the fisheye image to be processed, the mapping relation between their fisheye image coordinates and cylindrical projection coordinates can be determined directly based on the first mapping relation, in ways other than through the model, so that the model does not need to be evaluated pixel by pixel and its amount of computation is reduced.
In an embodiment of the present invention, one implementation of determining, based on the first mapping relation, the third mapping relation between the fisheye image coordinates and cylindrical projection coordinates corresponding to the pixel points other than the set number of pixel points in the fisheye image to be processed is to calculate the third mapping relation for the other coordinate points (the fisheye image coordinates and cylindrical projection coordinates corresponding to the other pixel points) by interpolation, which improves the calculation speed.
Step 4: and determining and displaying a target cylindrical projection diagram corresponding to the fish-eye image to be processed based on the second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing angle and the screen vertex coordinates.
Specifically, after all cylindrical projection coordinates of the fisheye image to be processed are determined, the cylindrical projection coordinates can be projected into a screen vertex coordinate system through OpenGL based on a second mapping relation established in advance, so as to obtain a target cylindrical projection diagram corresponding to the fisheye image to be processed.
Rendering all cylindrical projection coordinates based on OpenGL means projecting all cylindrical projection coordinates into the screen vertex coordinate system through OpenGL; the specific rendering process is the standard OpenGL rendering process in the prior art and is not described here.
In the above example, the second mapping relationship is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of an initial fisheye image;
a second mapping relationship is determined based on the initial cylindrical projection coordinates and the screen vertex coordinates.
Specifically, the second mapping relation is determined based on the initial cylindrical projection coordinates corresponding to the initial viewing angle and the screen vertex coordinates. The second mapping relation is the same for different fisheye images, i.e., it is fixed. The screen vertex coordinates refer to the coordinates corresponding to each position on the screen.
It should be noted that the cylindrical projection coordinates, fisheye image coordinates and screen vertex coordinates mentioned above are the coordinates of the pixel points in different coordinate systems, and conversion between them can be performed with coordinate system conversion formulas, which are not described here.
By the above method, when the current viewing angle changes relative to the initial viewing angle, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding part of the cylindrical projection coordinates can be determined through the fisheye camera imaging model, and all cylindrical projection coordinates corresponding to the fisheye image to be processed can then be determined based on the first mapping relation. The first mapping relation does not need to be calculated pixel by pixel through the model, which reduces the amount of computation of the model, improves the calculation efficiency, and makes it possible to meet the requirements of low-power-consumption devices.
Based on the same principle as the fisheye image processing method shown in fig. 1, there is also provided a fisheye image processing device 20 in an embodiment of the present invention, as shown in fig. 2, the device 20 may include: a field angle acquisition module 210, a first mapping relationship determination module 220, a cylindrical projection coordinate determination module 230, and a target cylindrical projection map determination module 240, wherein,
a view angle acquisition module 210, configured to acquire a current viewing angle for a fish-eye image to be processed;
the first mapping relationship determining module 220 is configured to determine, when the current viewing angle changes relative to the initial viewing angle, a first mapping relationship between a fisheye image coordinate corresponding to a portion of the pixel points in the fisheye image to be processed and a corresponding portion of the cylindrical projection coordinates through the fisheye camera imaging model;
the cylindrical projection coordinate determining module 230 is configured to determine all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relationship;
the target cylindrical projection map determining module 240 is configured to determine and display a target cylindrical projection map corresponding to the fisheye image to be processed based on the second mapping relationship and all the cylindrical projection coordinates, where the second mapping relationship is determined based on the cylindrical projection coordinates corresponding to the initial viewing angle and the screen vertex coordinates.
According to the fisheye image processing device provided by the embodiment of the invention, when the current viewing angle changes relative to the initial viewing angle, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding part of the cylindrical projection coordinates can be determined through the fisheye camera imaging model, and all cylindrical projection coordinates corresponding to the fisheye image to be processed can then be determined based on the first mapping relation. The first mapping relation does not need to be calculated pixel by pixel through the model, which reduces the amount of computation of the model, improves the calculation efficiency, and makes it possible to meet the requirements of low-power-consumption devices.
Optionally, the first mapping relation determining module is specifically configured to, when determining that the current viewing angle changes relative to the initial viewing angle:
acquiring a sliding operation of a user on a screen corresponding to an initial viewing angle;
based on the sliding operation, it is determined that the current viewing angle changes relative to the initial viewing angle.
Optionally, the partial pixels are a set number of pixels.
Optionally, the cylindrical projection coordinate determining module is specifically configured to, when determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relationship:
Based on the first mapping relation, determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed;
and determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation and the third mapping relation.
Optionally, the target cylindrical projection map determining module is specifically configured to, when determining the target cylindrical projection map corresponding to the fisheye image to be processed based on the second mapping relationship and all the cylindrical projection coordinates:
and rendering all cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain a target cylindrical projection graph.
Optionally, the second mapping is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of an initial fisheye image;
a second mapping relationship is determined based on the initial cylindrical projection coordinates and the screen vertex coordinates.
Optionally, the initial viewing angle is a set angle.
The device of the embodiment of the present invention may execute the fisheye image processing method shown in fig. 1, and its implementation principle is similar. The actions executed by each module in the fisheye image processing device of the embodiments of the present invention correspond to the steps in the fisheye image processing method of the embodiments of the present invention; for a detailed functional description of each module of the fisheye image processing device, reference may be made to the description of the corresponding fisheye image processing method above, which is not repeated here.
Based on the same principle as the method in the embodiment of the invention, reference is now made to fig. 3, which shows a schematic structural diagram of an electronic device 600 (e.g., a terminal device or a server) suitable for implementing the embodiment of the invention. The terminal device in the embodiment of the present invention may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), car terminals (e.g., car navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 3 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the invention.
An electronic device includes: a memory and a processor, where the processor may be referred to as a processing device 601 hereinafter, the memory may include at least one of a Read Only Memory (ROM) 602, a Random Access Memory (RAM) 603, and a storage device 608 hereinafter, as shown in detail below:
as shown in fig. 3, the electronic device 600 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the electronic apparatus 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
In general, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 3 shows an electronic device 600 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present invention, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present invention include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via communication means 609, or from storage means 608, or from ROM 602. The above-described functions defined in the method of the embodiment of the present invention are performed when the computer program is executed by the processing means 601.
The computer readable medium of the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire a current viewing angle for a fisheye image to be processed; when the current viewing angle changes relative to the initial viewing angle, determine, through a fisheye camera imaging model, a first mapping relation between fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding part of the cylindrical projection coordinates; determine all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation; and determine and display a target cylindrical projection map corresponding to the fisheye image to be processed based on a second mapping relation and all the cylindrical projection coordinates.
Computer program code for carrying out operations of the present invention may be written in one or more programming languages, including, but not limited to, object oriented programming languages such as Java, Smalltalk and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules or units involved in the embodiments of the present invention may be implemented in software or in hardware. In some cases, the name of a module or unit does not constitute a limitation of the unit itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of the present invention, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The above description is only illustrative of the preferred embodiments of the present invention and of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the invention is not limited to technical solutions formed by the specific combinations of the technical features described above, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example technical solutions formed by interchanging the above features with (but not limited to) features having similar functions disclosed in the present invention.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the invention. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (8)

1. A fish-eye image processing method, characterized by comprising:
acquiring a current viewing angle for a fish-eye image to be processed;
when the current viewing angle changes relative to the initial viewing angle, determining a first mapping relation between fisheye image coordinates corresponding to partial pixel points in the fisheye image to be processed and corresponding partial cylindrical projection coordinates through a fisheye camera imaging model, wherein the partial pixel points are a set number of pixel points;
determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation;
determining and displaying a target cylindrical projection diagram corresponding to the fish-eye image to be processed based on a second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on cylindrical projection coordinates corresponding to the initial viewing angle and screen vertex coordinates;
The determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation comprises the following steps:
based on the first mapping relation, determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed;
and determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation and the third mapping relation.
2. The method of claim 1, wherein determining that the current viewing angle of view changes relative to an initial viewing angle of view comprises:
acquiring a sliding operation of a user on a screen corresponding to an initial viewing angle;
based on the sliding operation, it is determined that the current viewing angle of view changes relative to an initial viewing angle of view.
3. The method according to claim 1, wherein determining the target cylindrical projection map corresponding to the fish-eye image to be processed based on the second mapping relationship and the all cylindrical projection coordinates includes:
and rendering all cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain the target cylindrical projection graph.
4. A method according to any one of claims 1 to 3, wherein the second mapping is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of the initial fisheye image;
and determining the second mapping relation based on the initial cylindrical projection coordinates and the screen vertex coordinates.
5. A method according to any one of claims 1 to 3, wherein the initial viewing angle is a set angle.
6. A fisheye image processing device, comprising:
the view angle acquisition module is used for acquiring the current viewing angle of the fish-eye image to be processed;
the first mapping relation determining module is used for determining a first mapping relation between fisheye image coordinates corresponding to partial pixel points in the fisheye image to be processed and corresponding partial cylindrical projection coordinates through a fisheye camera imaging model when the current viewing angle changes relative to the initial viewing angle, wherein the partial pixel points are a set number of pixel points;
the cylindrical projection coordinate determining module is used for determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation;
The target cylindrical projection diagram determining module is used for determining and displaying a target cylindrical projection diagram corresponding to the fish-eye image to be processed based on a second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on cylindrical projection coordinates corresponding to the initial viewing angle and screen vertex coordinates;
the cylindrical projection coordinate determining module is specifically configured to, when determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relationship:
based on the first mapping relation, determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed;
and determining all cylindrical projection coordinates corresponding to the fish-eye image to be processed based on the first mapping relation and the third mapping relation.
7. An electronic device, comprising:
a processor and a memory;
the memory is used for storing computer operation instructions;
the processor is configured to perform the method of any one of claims 1 to 5 by invoking the computer operation instructions.
8. A computer readable medium having stored thereon at least one instruction, at least one program, a code set or an instruction set, which is loaded and executed by a processor to implement the method of any one of claims 1 to 5.
CN201911007209.XA 2019-10-22 2019-10-22 Fisheye image processing method, device, electronic equipment and computer readable medium Active CN110728622B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911007209.XA CN110728622B (en) 2019-10-22 2019-10-22 Fisheye image processing method, device, electronic equipment and computer readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911007209.XA CN110728622B (en) 2019-10-22 2019-10-22 Fisheye image processing method, device, electronic equipment and computer readable medium

Publications (2)

Publication Number Publication Date
CN110728622A CN110728622A (en) 2020-01-24
CN110728622B true CN110728622B (en) 2023-04-25

Family

ID=69221689

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911007209.XA Active CN110728622B (en) 2019-10-22 2019-10-22 Fisheye image processing method, device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN110728622B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112200064B (en) * 2020-09-30 2021-07-30 腾讯科技(深圳)有限公司 Image processing method and device, electronic equipment and storage medium
CN112184543B (en) * 2020-09-30 2024-04-16 湖北安泰泽善科技有限公司 Data display method and device for fisheye camera
CN112565730B (en) * 2020-12-03 2023-07-25 阿波罗智联(北京)科技有限公司 Road side sensing method and device, electronic equipment, storage medium and road side equipment
CN117652139A (en) * 2021-11-25 2024-03-05 英特尔公司 Method and apparatus for tile-based stitching and encoding of images
CN115937010B (en) * 2022-08-17 2023-10-27 北京字跳网络技术有限公司 Image processing method, device, equipment and medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106981050A (en) * 2016-01-18 2017-07-25 深圳岚锋创视网络科技有限公司 The method and apparatus of the image flame detection shot to fish eye lens
JP6330987B2 (en) * 2016-06-17 2018-05-30 日本電気株式会社 Image processing apparatus, image processing method, and storage medium
CN107392851A (en) * 2017-07-04 2017-11-24 上海小蚁科技有限公司 Method and apparatus for generating panoramic picture

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107749050A (en) * 2017-09-30 2018-03-02 珠海市杰理科技股份有限公司 Fish eye images antidote, device and computer equipment
CN107820012A (en) * 2017-11-21 2018-03-20 暴风集团股份有限公司 A kind of fish eye images processing method, device, server and system
CN108846796A (en) * 2018-06-22 2018-11-20 北京航空航天大学青岛研究院 Image split-joint method and electronic equipment
CN109308686A (en) * 2018-08-16 2019-02-05 北京市商汤科技开发有限公司 A kind of fish eye images processing method and processing device, equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Lü Yaowen et al. A real-time correction method for fisheye video image distortion. Journal of Jilin University (Science Edition), 2017, (01), full text. *
Zhou Xiaokang; Rao Peng; Zhu Qiuyu; Chen Xin. Research on fisheye image distortion correction technology. Industrial Control Computer, 2017, (10), full text. *
Li Jian; Zeng Dan; Zhang Zhijiang; Zhu Qinyi. A cylindrical-projection panoramic driving recorder based on binocular fisheye cameras. Electronic Measurement Technology, 2017, (10), full text. *

Also Published As

Publication number Publication date
CN110728622A (en) 2020-01-24

Similar Documents

Publication Publication Date Title
CN110728622B (en) Fisheye image processing method, device, electronic equipment and computer readable medium
CN115908679A (en) Texture mapping method, device, equipment and storage medium
CN112801907A (en) Depth image processing method, device, equipment and storage medium
CN114742934A (en) Image rendering method and device, readable medium and electronic equipment
CN111915532B (en) Image tracking method and device, electronic equipment and computer readable medium
CN111833459B (en) Image processing method and device, electronic equipment and storage medium
CN111258582B (en) Window rendering method and device, computer equipment and storage medium
WO2023193613A1 (en) Highlight shading method and apparatus, and medium and electronic device
CN111862342A (en) Texture processing method and device for augmented reality, electronic equipment and storage medium
CN110717467A (en) Head pose estimation method, device, equipment and storage medium
CN115086541B (en) Shooting position determining method, device, equipment and medium
CN115272060A (en) Transition special effect diagram generation method, device, equipment and storage medium
CN112132909B (en) Parameter acquisition method and device, media data processing method and storage medium
CN114419298A (en) Virtual object generation method, device, equipment and storage medium
CN114202617A (en) Video image processing method and device, electronic equipment and storage medium
CN114723600A (en) Method, device, equipment, storage medium and program product for generating cosmetic special effect
CN113066166A (en) Image processing method and device and electronic equipment
CN113837918A (en) Method and device for realizing rendering isolation by multiple processes
CN115937010B (en) Image processing method, device, equipment and medium
CN110570502A (en) method, apparatus, electronic device and computer-readable storage medium for displaying image frame
CN113592734B (en) Image processing method and device and electronic equipment
CN111489428B (en) Image generation method, device, electronic equipment and computer readable storage medium
CN112668474B (en) Plane generation method and device, storage medium and electronic equipment
CN112214187B (en) Water ripple image implementation method and device
CN112395826B (en) Text special effect processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant