CN110728622A - Fisheye image processing method and device, electronic equipment and computer readable medium - Google Patents

Fisheye image processing method and device, electronic equipment and computer readable medium

Info

Publication number
CN110728622A
Authority
CN
China
Prior art keywords
cylindrical projection
fisheye image
mapping relation
determining
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911007209.XA
Other languages
Chinese (zh)
Other versions
CN110728622B (en)
Inventor
田池
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Fruit Technology Co Ltd
Original Assignee
Zhuhai Fruit Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Fruit Technology Co Ltd filed Critical Zhuhai Fruit Technology Co Ltd
Priority to CN201911007209.XA
Publication of CN110728622A
Application granted
Publication of CN110728622B
Legal status: Active
Anticipated expiration

Classifications

    • G06T3/047
    • G06T3/12
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras

Abstract

The invention provides a fisheye image processing method, a fisheye image processing device, electronic equipment and a computer storage medium, wherein the method comprises the following steps: acquiring a current viewing field angle for the fisheye image to be processed; when the current viewing field angle changes, determining, through a fisheye camera imaging model, a first mapping relation between fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding cylindrical projection coordinates; determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation; and determining and displaying a target cylindrical projection image corresponding to the fisheye image to be processed based on the second mapping relation and all the cylindrical projection coordinates. With the scheme of the invention, when the current viewing field angle changes, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points and the corresponding cylindrical projection coordinates can be determined without calculating it pixel by pixel through the model, which reduces the amount of model calculation.

Description

Fisheye image processing method and device, electronic equipment and computer readable medium
Technical Field
The invention relates to the technical field of image processing, in particular to a fisheye image processing method and device, electronic equipment and a computer readable medium.
Background
In the prior art, the cylindrical projection image corresponding to a fisheye image is determined based on a fisheye camera imaging model. The specific determination method is to calculate, pixel by pixel, the mapping relationship between the cylindrical projection coordinates and the fisheye image coordinates of each pixel in the fisheye image, and then to determine the cylindrical projection image corresponding to the fisheye image based on that mapping relationship. However, because this method calculates the mapping relationship between cylindrical projection coordinates and fisheye image coordinates pixel by pixel, the amount of calculation is very large; in particular, for a low-power-consumption device, it is difficult to meet the device's real-time requirements.
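For reference, the pixel-by-pixel computation described above can be sketched in Python as follows; here model(u, v) stands for one full evaluation of the fisheye camera imaging model and is a hypothetical placeholder, not a function defined in this document:

    def build_map_per_pixel(out_w, out_h, model):
        """Prior-art style: one imaging-model evaluation per output pixel."""
        mapping = {}
        for v in range(out_h):
            for u in range(out_w):
                # out_w * out_h full model evaluations in total.
                mapping[(u, v)] = model(u, v)
        return mapping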
Disclosure of Invention
The present invention is directed to solving at least one of the above-mentioned drawbacks and improving data processing efficiency. The technical scheme adopted by the invention is as follows:
in a first aspect, the present invention provides a method for processing a fisheye image, including:
acquiring a current viewing angle for the fisheye image to be processed;
when the current viewing angle changes relative to the initial viewing angle, determining a first mapping relation between fisheye image coordinates corresponding to part of pixel points in the fisheye image to be processed and corresponding part of cylindrical projection coordinates through a fisheye camera imaging model;
determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation;
and determining and displaying a target cylindrical projection image corresponding to the fisheye image to be processed based on a second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing field angle and the screen vertex coordinates.
In an embodiment of the first aspect of the present invention, determining that the current viewing angle changes from the initial viewing angle comprises:
acquiring sliding operation of a user for a screen corresponding to an initial viewing field angle;
based on the sliding operation, it is determined that the current viewing angle changes from the initial viewing angle.
In an embodiment of the first aspect of the present invention, some of the pixels are a set number of pixels.
In an embodiment of the first aspect of the present invention, determining all cylindrical projection coordinates corresponding to a fisheye image to be processed based on the first mapping relationship includes:
determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed based on the first mapping relation;
and determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation and the third mapping relation.
In an embodiment of the first aspect of the present invention, determining a target cylindrical projection image corresponding to a fisheye image to be processed based on the second mapping relationship and all cylindrical projection coordinates includes:
and rendering all the cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain a target cylindrical projection image.
In an embodiment of the first aspect of the present invention, the second mapping relationship is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of the initial fisheye image;
and determining a second mapping relation based on the initial cylindrical projection coordinates and the screen vertex coordinates.
In an embodiment of the first aspect of the present invention, the initial viewing angle is a set angle.
In a second aspect, the present invention provides a fisheye image processing apparatus comprising:
the viewing angle acquisition module is used for acquiring the current viewing angle of the fisheye image to be processed;
the first mapping relation determining module is used for determining a first mapping relation between fisheye image coordinates corresponding to part of pixel points in the fisheye image to be processed and corresponding part of cylindrical projection coordinates through a fisheye camera imaging model when the current viewing angle changes relative to the initial viewing angle;
the cylindrical projection coordinate determination module is used for determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation;
and the target cylindrical projection image determining module is used for determining and displaying a target cylindrical projection image corresponding to the fisheye image to be processed based on a second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing field angle and the screen vertex coordinates.
In an embodiment of the second aspect of the present invention, when determining that the current viewing angle changes from the initial viewing angle, the first mapping relationship determining module is specifically configured to:
acquiring sliding operation of a user for a screen corresponding to an initial viewing field angle;
based on the sliding operation, it is determined that the current viewing angle changes from the initial viewing angle.
In an embodiment of the second aspect of the present invention, some of the pixels are a set number of pixels.
Optionally, when determining all the cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relationship, the cylindrical projection coordinate determination module is specifically configured to:
determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed based on the first mapping relation;
and determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation and the third mapping relation.
In an embodiment of the second aspect of the present invention, when the target cylindrical projection image determining module determines the target cylindrical projection image corresponding to the fisheye image to be processed based on the second mapping relationship and all the cylindrical projection coordinates, the target cylindrical projection image determining module is specifically configured to:
and rendering all the cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain a target cylindrical projection image.
In an embodiment of the second aspect of the present invention, the second mapping relationship is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of the initial fisheye image;
and determining a second mapping relation based on the initial cylindrical projection coordinates and the screen vertex coordinates.
In an embodiment of the second aspect of the present invention, the initial viewing angle is a set angle.
In a third aspect, the present invention provides an electronic device, comprising:
a processor and a memory;
a memory for storing computer operating instructions;
a processor for performing the method as shown in any of the embodiments of the first aspect of the present invention by invoking computer operational instructions.
In a fourth aspect, the present invention provides a computer readable medium having stored thereon at least one instruction, at least one program, set of codes or set of instructions, which is loaded and executed by a processor to implement a method as set forth in any one of the embodiments of the first aspect of the invention.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
according to the fisheye image processing method and device, the electronic equipment and the computer readable medium, when the current viewing angle changes relative to the initial viewing angle, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding cylindrical projection coordinates can be determined through the fisheye camera imaging model, and all cylindrical projection coordinates corresponding to the fisheye image to be processed can then be determined based on the first mapping relation. The first mapping relation therefore does not need to be calculated pixel by pixel through the model, which reduces the amount of model calculation, improves calculation efficiency and can meet the requirements of low-power-consumption equipment.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings used in the description of the embodiments of the present invention will be briefly described below.
Fig. 1 is a schematic flowchart of a method for processing a fisheye image according to an embodiment of the invention;
fig. 2 is a schematic structural diagram of a fisheye image processing apparatus according to an embodiment of the invention;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present invention. It should be understood that the drawings and the embodiments of the present invention are illustrative only and are not intended to limit the scope of the present invention.
It should be understood that the various steps recited in the method embodiments of the present invention may be performed in a different order and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the invention is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second" and the like in the present invention are used only to distinguish different devices, modules or units, and are not intended to limit the order of, or the interdependence between, the functions performed by these devices, modules or units.
It should be noted that the modifiers "a", "an" and "the" in the present invention are illustrative rather than restrictive, and those skilled in the art should understand them as meaning "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present invention are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The field angle: in an optical instrument, the angle formed, with the lens of the instrument as the vertex, by the two edges of the largest range through which the object image of the measured target can pass is called the field angle.
OpenGL (Open Graphics Library): a cross-language, cross-platform application programming interface for rendering 2D and 3D vector graphics. The interface consists of nearly 350 different function calls that can be used to draw anything from simple graphics primitives to complex three-dimensional scenes.
The following describes the technical solution of the present invention and how to solve the above technical problems with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
In view of the above technical problem, an embodiment of the present invention provides a method for processing a fisheye image, as shown in fig. 1, the method may include:
step S110, a current viewing angle for the fisheye image to be processed is acquired.
Specifically, the fisheye image to be processed refers to an image for which a corresponding target cylindrical projection image needs to be determined, and it is an image shot by a fisheye camera. The current viewing field angle refers to the included angle formed, with the lens of the fisheye camera corresponding to the fisheye image to be processed as the vertex, by the two edges of the largest range through which the object image of the measured target can pass.
Step S120, when the current viewing angle changes relative to the initial viewing angle, determining a first mapping relation between fisheye image coordinates corresponding to part of pixel points in the fisheye image to be processed and corresponding part of cylindrical projection coordinates through a fisheye camera imaging model.
Specifically, a change of the current viewing field angle means that it increases or decreases relative to the initial viewing field angle. The initial viewing field angle may be a preset field angle, and the fisheye image corresponding to the initial viewing field angle is captured before the fisheye image corresponding to the current viewing field angle.
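Step S120 relies on the fisheye camera imaging model. As an illustration of how such a model can relate a cylindrical projection coordinate to a fisheye image coordinate, the following Python sketch assumes an equidistant fisheye model (r = f * theta); the patent does not specify a particular model, and all parameter names (hfov, fish_radius and so on) are illustrative assumptions rather than quantities defined in the embodiments.

    import numpy as np

    def cylinder_to_fisheye(u, v, out_w, out_h, hfov, vfov,
                            fish_cx, fish_cy, fish_radius, fish_fov):
        """Map a cylindrical-projection pixel (u, v) to fisheye image coordinates (x, y).

        Assumes an equidistant fisheye model (r = f * theta). hfov/vfov are the
        horizontal/vertical field angles of the cylindrical view in radians;
        fish_cx, fish_cy, fish_radius describe the fisheye image circle in pixels;
        fish_fov is the full field angle covered by the fisheye lens in radians.
        """
        # Azimuth and height of the point on a unit-radius cylinder.
        phi = (u / out_w - 0.5) * hfov
        h = (v / out_h - 0.5) * 2.0 * np.tan(vfov / 2.0)

        # 3D ray direction; the camera looks along +Z.
        dx, dy, dz = np.sin(phi), h, np.cos(phi)
        norm = np.sqrt(dx * dx + dy * dy + dz * dz)

        # Angle between the ray and the optical axis, and its direction in the image plane.
        theta = np.arccos(dz / norm)
        psi = np.arctan2(dy, dx)

        # Equidistant projection: radial distance grows linearly with theta.
        f_fish = fish_radius / (fish_fov / 2.0)
        r = f_fish * theta
        return fish_cx + r * np.cos(psi), fish_cy + r * np.sin(psi)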
And step S130, determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation.
Specifically, because the fisheye image coordinates of all pixel points in the same fisheye image and their corresponding cylindrical projection coordinates share the same mapping relationship, the mapping between the fisheye image coordinates of the pixel points other than the selected part and their cylindrical projection coordinates does not need to be calculated pixel by pixel; all cylindrical projection coordinates corresponding to the fisheye image to be processed can be determined based on the first mapping relation, which reduces the amount of calculation.
And step S140, determining and displaying a target cylindrical projection image corresponding to the fisheye image to be processed based on a second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing field angle and the screen vertex coordinates.
Specifically, determining the target cylindrical projection image corresponding to the fisheye image from the cylindrical projection coordinates of the fisheye image may be implemented in a prior-art manner, which is not described here again. Compared with the prior-art method, obtaining the target cylindrical projection image with this scheme improves data processing efficiency, and the real-time performance of the algorithm can be ensured when the current viewing field angle changes.
According to the scheme in the embodiment of the invention, when the current viewing angle changes relative to the initial viewing angle, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding cylindrical projection coordinates can be determined through the fisheye camera imaging model, and all cylindrical projection coordinates corresponding to the fisheye image to be processed can be determined based on the first mapping relation. The first mapping relation does not need to be calculated pixel by pixel through the model, which reduces the amount of model calculation, improves calculation efficiency and can meet the requirements of low-power-consumption equipment.
In an embodiment of the present invention, determining that a current viewing angle changes from an initial viewing angle includes:
acquiring sliding operation of a user for a screen corresponding to an initial viewing field angle;
based on the sliding operation, it is determined that the current viewing angle changes from the initial viewing angle.
Specifically, the sliding operation refers to an operation performed by the user on the screen corresponding to the initial viewing angle. For example, if the initial viewing angle is A and the viewing angle after the sliding operation is received is B, with A and B being different angles, the sliding operation changes the viewing angle; therefore, the current viewing angle of the fisheye image to be processed can be determined based on the sliding operation. It can be understood that the fisheye image corresponding to the sliding operation can be used as the fisheye image to be processed.
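The embodiments do not specify how the sliding distance is converted into the new viewing field angle. The following sketch simply assumes that the angle changes in proportion to the horizontal swipe distance and is clamped to a plausible range; all constants and names are illustrative.

    def update_view_angle(initial_angle_deg, swipe_dx_px, screen_w_px,
                          degrees_per_screen=90.0, min_angle=30.0, max_angle=120.0):
        """Derive the current viewing field angle from a horizontal swipe.

        A full-screen swipe changes the angle by degrees_per_screen and the result
        is clamped to [min_angle, max_angle]; all constants are illustrative.
        """
        delta = swipe_dx_px / screen_w_px * degrees_per_screen
        return max(min_angle, min(max_angle, initial_angle_deg + delta))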
In the embodiment of the invention, part of the pixel points are the pixel points with the set number.
Specifically, part of the pixel points may be a set number of pixel points, and the set number can be configured based on actual requirements, so that the calculation needs of fisheye images of different sizes can be met.
In the embodiment of the present invention, in step S130, determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relationship may include:
determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed based on the first mapping relation;
and determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation and the third mapping relation.
Specifically, the first mapping relation is the mapping, established through the fisheye camera imaging model, between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding cylindrical projection coordinates. For the cylindrical projection coordinates corresponding to the other pixel points in the fisheye image to be processed, the mapping between their fisheye image coordinates and cylindrical projection coordinates can be determined directly from the first mapping relation by other means, so the model does not need to calculate pixel by pixel and the amount of model calculation is reduced.
In the embodiment of the present invention, one implementation of determining, based on the first mapping relation, the third mapping relation between the fisheye image coordinates and cylindrical projection coordinates corresponding to the pixel points other than the set number of pixel points in the fisheye image to be processed is as follows: the third mapping relation for the other coordinate points (the fisheye image coordinates and cylindrical projection coordinates corresponding to the other pixel points) is calculated by interpolation, which improves the calculation speed.
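A minimal sketch of this idea is given below: the (expensive) imaging model is evaluated only on a coarse grid of output pixels, corresponding to the first mapping relation for the set number of pixel points, and bilinear interpolation then supplies the third mapping relation for all remaining pixels. The grid step and function names are assumptions for illustration; the embodiments do not prescribe a particular interpolation scheme.

    import numpy as np

    def build_full_map(out_w, out_h, model_fn, grid_step=16):
        """Build the complete cylindrical-to-fisheye coordinate map.

        model_fn(u, v) is the (expensive) fisheye camera imaging model, e.g. the
        cylinder_to_fisheye sketch above. It is evaluated only on a coarse grid
        (the "set number of pixel points", i.e. the first mapping relation), and
        bilinear interpolation fills in the remaining pixels (the third mapping
        relation). The grid step of 16 is illustrative.
        """
        # Coarse grid of output coordinates, always including the last row/column.
        us = np.arange(0, out_w, grid_step, dtype=np.float64)
        vs = np.arange(0, out_h, grid_step, dtype=np.float64)
        if us[-1] != out_w - 1:
            us = np.append(us, out_w - 1)
        if vs[-1] != out_h - 1:
            vs = np.append(vs, out_h - 1)

        # First mapping relation: the model is evaluated only at the grid points.
        gu, gv = np.meshgrid(us, vs)            # shapes (len(vs), len(us))
        gx, gy = model_fn(gu, gv)               # fisheye coordinates at the grid points

        # Third mapping relation: bilinear interpolation to every output pixel.
        full_u = np.arange(out_w, dtype=np.float64)
        full_v = np.arange(out_h, dtype=np.float64)

        def interpolate(grid_vals):
            # Interpolate along u for each grid row, then along v for each column.
            rows = np.stack([np.interp(full_u, us, row) for row in grid_vals])
            return np.stack([np.interp(full_v, vs, rows[:, c]) for c in range(out_w)],
                            axis=1)

        return interpolate(gx), interpolate(gy)  # each of shape (out_h, out_w)

Interpolation is reasonable here because the cylindrical-to-fisheye mapping varies smoothly across the image, so values between grid points are well approximated linearly.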
In the embodiment of the present invention, determining the target cylindrical projection image corresponding to the fisheye image to be processed based on the second mapping relationship and the all cylindrical projection coordinates includes:
and rendering all the cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain a target cylindrical projection image.
Specifically, after all the cylindrical projection coordinates of the fisheye image to be processed are determined, they can be projected into the screen vertex coordinate system through OpenGL based on the pre-established second mapping relation, so as to obtain the target cylindrical projection image corresponding to the fisheye image to be processed.
Rendering all the cylindrical projection coordinates through OpenGL means projecting them into the screen vertex coordinate system through OpenGL; the specific rendering manner is the existing OpenGL rendering manner and is not described here again.
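For clarity of the data flow, the following NumPy sketch performs on the CPU the sampling that the OpenGL texture lookup performs on the GPU: given the full cylindrical-to-fisheye coordinate map, each output pixel is filled from the corresponding fisheye pixel. This is only an illustrative stand-in for the OpenGL rendering described above, not the patent's implementation.

    import numpy as np

    def remap_nearest(fisheye_img, map_x, map_y):
        """Fill the target cylindrical projection image from the full coordinate map.

        map_x/map_y give, for every output pixel, the fisheye image coordinate to
        sample; nearest-neighbour sampling keeps the sketch short. In the scheme
        above this sampling is performed by an OpenGL texture lookup on the GPU.
        """
        src_h, src_w = fisheye_img.shape[:2]
        xs = np.clip(np.rint(map_x).astype(np.int64), 0, src_w - 1)
        ys = np.clip(np.rint(map_y).astype(np.int64), 0, src_h - 1)
        return fisheye_img[ys, xs]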
In an embodiment of the present invention, the second mapping relationship is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of the initial fisheye image;
and determining a second mapping relation based on the initial cylindrical projection coordinates and the screen vertex coordinates.
Specifically, the second mapping relation is determined based on the initial cylindrical projection coordinates corresponding to the initial viewing field angle and the screen vertex coordinates, and the second mapping relations corresponding to different fisheye images are the same; that is, the second mapping relation is fixed. The screen vertex coordinates refer to the coordinates corresponding to each position on the screen.
It should be noted that the above-mentioned cylindrical projection coordinate, fisheye image coordinate, and screen vertex coordinate are all coordinates corresponding to the pixel points in different coordinate systems, and may be determined by a coordinate system conversion formula, which is not described herein again.
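As one example of such a conversion, the sketch below maps a pixel coordinate to OpenGL-style normalized screen vertex coordinates in [-1, 1] with a flipped y axis; the actual conversion formula used in the embodiments is not given, so this convention is an assumption.

    def pixel_to_screen_vertex(u, v, width, height):
        """Convert a pixel coordinate to OpenGL-style screen vertex coordinates.

        Maps the image to normalized device coordinates in [-1, 1] with the y axis
        flipped; this particular convention is an assumption, since the embodiments
        do not give the conversion formula.
        """
        x_ndc = 2.0 * u / (width - 1) - 1.0
        y_ndc = 1.0 - 2.0 * v / (height - 1)
        return x_ndc, y_ndc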
In the embodiment of the invention, the initial viewing angle is a set angle.
Specifically, the initial viewing angle may be configured based on actual requirements, for example set to an angle that gives a relatively good view of the photographed object. In an embodiment of the present invention, the set angle may be 90 degrees.
The following describes the scheme of the present invention with a specific example:
step 1: and acquiring the current viewing angle of the fisheye image to be processed.
Step 2: when the current viewing angle changes relative to the initial viewing angle, a first mapping relation between fisheye image coordinates corresponding to part of pixel points in the fisheye image to be processed and corresponding part of cylindrical projection coordinates is determined through a fisheye camera imaging model.
In this example, whether the current viewing angle has changed may be determined based on the user's sliding operation on the screen: if there is a sliding operation, it may be determined that the current viewing angle has changed relative to the initial viewing angle; if there is no sliding operation, it may be determined that the current viewing angle has not changed relative to the initial viewing angle.
The specific implementation process can be as follows:
acquiring sliding operation of a user for a screen corresponding to an initial viewing field angle;
based on the sliding operation, it is determined that the current viewing angle changes from the initial viewing angle.
When the current viewing angle changes relative to the initial viewing angle, a first mapping relation between fisheye image coordinates corresponding to part of pixel points in the fisheye image to be processed and corresponding part of cylindrical projection coordinates is determined through a fisheye camera imaging model.
The number of the selected pixel points can be configured based on actual requirements, so that the calculation needs of fisheye images of different sizes can be met.
The initial viewing angle may be configured based on actual requirements, for example set to an angle that gives a relatively good view of the photographed object. In an embodiment of the present invention, the set angle may be 90 degrees.
And step 3: and determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation.
One way that can be achieved is: determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed based on the first mapping relation;
and determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation and the third mapping relation.
Specifically, the first mapping relation is the mapping, established through the fisheye camera imaging model, between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding cylindrical projection coordinates. For the cylindrical projection coordinates corresponding to the other pixel points in the fisheye image to be processed, the mapping between their fisheye image coordinates and cylindrical projection coordinates can be determined directly from the first mapping relation by other means, so the model does not need to calculate pixel by pixel and the amount of model calculation is reduced.
In the embodiment of the present invention, one implementation of determining, based on the first mapping relation, the third mapping relation between the fisheye image coordinates and cylindrical projection coordinates corresponding to the pixel points other than the set number of pixel points in the fisheye image to be processed is as follows: the third mapping relation for the other coordinate points (the fisheye image coordinates and cylindrical projection coordinates corresponding to the other pixel points) is calculated by interpolation, which improves the calculation speed.
And 4, step 4: and determining and displaying a target cylindrical projection image corresponding to the fisheye image to be processed based on a second mapping relation and all cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing field angle and the screen vertex coordinates.
Specifically, after all the cylindrical projection coordinates of the fisheye image to be processed are determined, they can be projected into the screen vertex coordinate system through OpenGL based on the pre-established second mapping relation, so as to obtain the target cylindrical projection image corresponding to the fisheye image to be processed.
Rendering all the cylindrical projection coordinates through OpenGL means projecting them into the screen vertex coordinate system through OpenGL; the specific rendering manner is the existing OpenGL rendering manner and is not described here again.
In the above example, the second mapping relationship is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of the initial fisheye image;
and determining a second mapping relation based on the initial cylindrical projection coordinates and the screen vertex coordinates.
Specifically, the second mapping relation is determined based on the initial cylindrical projection coordinates corresponding to the initial viewing field angle and the screen vertex coordinates, and the second mapping relations corresponding to different fisheye images are the same; that is, the second mapping relation is fixed. The screen vertex coordinates refer to the coordinates corresponding to each position on the screen.
It should be noted that the above-mentioned cylindrical projection coordinate, fisheye image coordinate, and screen vertex coordinate are all coordinates corresponding to the pixel points in different coordinate systems, and may be determined by a coordinate system conversion formula, which is not described herein again.
By the above method, when the current viewing angle changes relative to the initial viewing angle, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding cylindrical projection coordinates can be determined through the fisheye camera imaging model, and all cylindrical projection coordinates corresponding to the fisheye image to be processed can be determined based on the first mapping relation. The first mapping relation does not need to be calculated pixel by pixel through the model, which reduces the amount of model calculation, improves calculation efficiency and can meet the requirements of low-power-consumption equipment.
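To tie the steps of this example together, the following sketch recomputes the sparse mapping and the interpolated full map only when the viewing field angle changes, and otherwise reuses a cached map. It reuses the cylinder_to_fisheye, build_full_map and remap_nearest helpers sketched earlier; the output size, the 180-degree fisheye field angle, the grid step and the caching policy are illustrative assumptions, not details taken from the patent.

    import numpy as np

    def process_fisheye_frame(fisheye_img, current_fov_deg, initial_fov_deg,
                              cached_maps=None, grid_step=16):
        """End-to-end sketch of steps 1-4, reusing the helpers sketched earlier
        (cylinder_to_fisheye, build_full_map, remap_nearest)."""
        src_h, src_w = fisheye_img.shape[:2]
        out_w, out_h = src_w, src_h              # output size chosen for illustration

        # Steps 2-3: only rebuild the sparse mapping and the interpolated full map
        # when the viewing field angle has changed (or nothing is cached yet).
        if cached_maps is None or current_fov_deg != initial_fov_deg:
            hfov = np.radians(current_fov_deg)
            model = lambda u, v: cylinder_to_fisheye(
                u, v, out_w, out_h, hfov, hfov * out_h / out_w,
                fish_cx=src_w / 2.0, fish_cy=src_h / 2.0,
                fish_radius=min(src_w, src_h) / 2.0, fish_fov=np.radians(180.0))
            cached_maps = build_full_map(out_w, out_h, model, grid_step)

        # Step 4: sample the fisheye image to obtain the target cylindrical image
        # (done by OpenGL rendering in the embodiments, by NumPy remapping here).
        map_x, map_y = cached_maps
        return remap_nearest(fisheye_img, map_x, map_y), cached_maps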
Based on the same principle as the fisheye image processing method shown in fig. 1, an embodiment of the present invention further provides a fisheye image processing apparatus 20. As shown in fig. 2, the apparatus 20 may include: a field angle acquisition module 210, a first mapping relation determination module 220, a cylindrical projection coordinate determination module 230 and a target cylindrical projection image determination module 240, wherein,
the field angle acquisition module 210 is configured to acquire a current viewing field angle for the fisheye image to be processed;
the first mapping relationship determining module 220 is configured to determine, through a fisheye camera imaging model, a first mapping relationship between fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and corresponding part of the cylindrical projection coordinates when a current viewing angle changes relative to an initial viewing angle;
a cylindrical projection coordinate determining module 230, configured to determine, based on the first mapping relationship, all cylindrical projection coordinates corresponding to the fisheye image to be processed;
and the target cylindrical projection image determining module 240 is configured to determine and display a target cylindrical projection image corresponding to the fisheye image to be processed based on a second mapping relationship and all the cylindrical projection coordinates, where the second mapping relationship is determined based on the cylindrical projection coordinates corresponding to the initial viewing field angle and the screen vertex coordinates.
When the current viewing angle changes relative to the initial viewing angle, the fisheye image processing device provided by the embodiment of the invention can determine, through the fisheye camera imaging model, the first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding cylindrical projection coordinates, and can determine all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation, without calculating the first mapping relation pixel by pixel through the model. This reduces the amount of model calculation, improves calculation efficiency and can meet the requirements of low-power-consumption equipment.
Optionally, when determining that the current viewing angle changes with respect to the initial viewing angle, the first mapping relationship determining module is specifically configured to:
acquiring sliding operation of a user for a screen corresponding to an initial viewing field angle;
based on the sliding operation, it is determined that the current viewing angle changes from the initial viewing angle.
Optionally, part of the pixels are pixels with a set number.
Optionally, when determining all the cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relationship, the cylindrical projection coordinate determination module is specifically configured to:
determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed based on the first mapping relation;
and determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation and the third mapping relation.
Optionally, when the target cylindrical projection image determining module determines the target cylindrical projection image corresponding to the fisheye image to be processed based on the second mapping relationship and all the cylindrical projection coordinates, the target cylindrical projection image determining module is specifically configured to:
and rendering all the cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain a target cylindrical projection image.
Optionally, the second mapping relationship is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of the initial fisheye image;
and determining a second mapping relation based on the initial cylindrical projection coordinates and the screen vertex coordinates.
Optionally, the initial viewing angle is a set angle.
The apparatus of the embodiment of the present invention may execute the fisheye image processing method shown in fig. 1, and its implementation principle is similar. The actions executed by the modules of the fisheye image processing apparatus in the embodiments of the present invention correspond to the steps of the fisheye image processing method in the embodiments of the present invention; for a detailed functional description of the modules of the fisheye image processing apparatus, reference may be made to the description of the corresponding fisheye image processing method shown above, and details are not repeated here.
Based on the same principle as the method in the embodiment of the present invention, reference is made to fig. 3, which shows a schematic structural diagram of an electronic device 600 suitable for implementing the embodiment of the present invention. The terminal device in the embodiments of the present invention may include, but is not limited to, mobile terminals such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player) and a vehicle terminal (e.g., a car navigation terminal), and fixed terminals such as a digital TV and a desktop computer. The electronic device shown in fig. 3 is only an example and should not impose any limitation on the functions and the scope of use of the embodiments of the present invention.
The electronic device includes: a memory and a processor, wherein the processor may be referred to as the processing device 601 hereinafter, and the memory may include at least one of a Read Only Memory (ROM)602, a Random Access Memory (RAM)603 and a storage device 608 hereinafter, which are specifically shown as follows:
as shown in fig. 3, electronic device 600 may include a processing means (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic apparatus 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 3 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the invention includes a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing means 601, performs the above-described functions defined in the method of an embodiment of the invention.
It should be noted that the computer readable medium of the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet) and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire a current viewing field angle for a fisheye image to be processed; when the current viewing field angle changes relative to the initial viewing field angle, determine, through a fisheye camera imaging model, a first mapping relation between the fisheye image coordinates corresponding to part of the pixel points in the fisheye image to be processed and the corresponding cylindrical projection coordinates; determine all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation; and determine and display a target cylindrical projection image corresponding to the fisheye image to be processed based on a second mapping relation and all the cylindrical projection coordinates.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules or units described in the embodiments of the present invention may be implemented by software or by hardware. The name of a module or unit does not, in some cases, constitute a limitation of the unit itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of the present invention, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only a description of the preferred embodiments of the invention and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the present invention is not limited to technical solutions formed by the specific combination of the above-mentioned features, and also covers other technical solutions formed by any combination of the above-mentioned features or their equivalents without departing from the inventive concept, for example technical solutions formed by replacing the above features with (but not limited to) features having similar functions disclosed in the present invention.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the invention. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (10)

1. A fisheye image processing method is characterized by comprising the following steps:
acquiring a current viewing angle for the fisheye image to be processed;
when the current viewing angle changes relative to the initial viewing angle, determining a first mapping relation between fisheye image coordinates corresponding to part of pixel points in the fisheye image to be processed and corresponding part of cylindrical projection coordinates through a fisheye camera imaging model;
determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation;
and determining and displaying a target cylindrical projection image corresponding to the fisheye image to be processed based on a second mapping relation and all the cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing angle and the screen vertex coordinates.
2. The method of claim 1, wherein determining that the current viewing angle of view has changed relative to an initial viewing angle of view comprises:
acquiring sliding operation of a user for a screen corresponding to an initial viewing field angle;
determining that the current viewing angle changes from an initial viewing angle based on the sliding operation.
3. The method of claim 1, wherein the portion of the pixels are a set number of pixels.
4. The method according to claim 3, wherein the determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation comprises:
determining a third mapping relation between fisheye image coordinates and cylindrical projection coordinates corresponding to other pixel points except the set number of pixel points in the fisheye image to be processed based on the first mapping relation;
and determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation and the third mapping relation.
5. The method according to any one of claims 1 to 4, wherein the determining a target cylindrical projection image corresponding to the fisheye image to be processed based on the second mapping relationship and the all cylindrical projection coordinates comprises:
and rendering all the cylindrical projection coordinates through an open graphics library OpenGL based on the second mapping relation to obtain the target cylindrical projection image.
6. The method of any of claims 1 to 4, wherein the second mapping relationship is determined by:
acquiring an initial fisheye image corresponding to an initial viewing angle;
determining initial cylindrical projection coordinates and screen vertex coordinates of the initial fisheye image;
determining the second mapping relationship based on the initial cylindrical projection coordinates and the screen vertex coordinates.
7. The method of any of claims 1-4, wherein the initial viewing field angle is a set angle.
8. A fisheye image processing apparatus comprising:
the viewing angle acquisition module is used for acquiring the current viewing angle of the fisheye image to be processed;
the first mapping relation determining module is used for determining a first mapping relation between fisheye image coordinates corresponding to part of pixel points in the fisheye image to be processed and corresponding part of cylindrical projection coordinates through a fisheye camera imaging model when the current viewing angle changes relative to the initial viewing angle;
the cylindrical projection coordinate determination module is used for determining all cylindrical projection coordinates corresponding to the fisheye image to be processed based on the first mapping relation;
and the target cylindrical projection image determining module is used for determining and displaying a target cylindrical projection image corresponding to the fisheye image to be processed based on a second mapping relation and the all cylindrical projection coordinates, wherein the second mapping relation is determined based on the cylindrical projection coordinates corresponding to the initial viewing field angle and the screen vertex coordinates.
9. An electronic device, comprising:
a processor and a memory;
the memory is configured to store computer operation instructions;
the processor is configured to perform the method of any one of claims 1 to 7 by calling the computer operation instructions.
10. A computer readable medium storing at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method of any one of claims 1 to 7.
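
The sketches below illustrate the mappings recited in claims 1 to 6; they are rough reconstructions under stated assumptions, not the patented implementation. The claims do not spell out the fisheye camera imaging model behind the first mapping relation, so this first sketch assumes a common equidistant model (projection radius r = f·θ); the focal length f and principal point (cx, cy) are hypothetical intrinsics. It maps a cylindrical projection coordinate (azimuth phi, cylinder height h) to a fisheye image coordinate, which is the direction of the first mapping relation in claim 1.

```python
import numpy as np

def cylinder_to_fisheye(phi, h, f, cx, cy):
    """Map cylindrical projection coordinates (azimuth phi in radians,
    normalized cylinder height h) to fisheye image coordinates under an
    assumed equidistant fisheye model r = f * theta."""
    # Ray direction for a point on a unit cylinder whose axis is the y axis,
    # with the camera's optical axis pointing along +z.
    dx, dy, dz = np.sin(phi), h, np.cos(phi)
    theta = np.arctan2(np.hypot(dx, dy), dz)  # angle between ray and optical axis
    r = f * theta                             # equidistant projection radius
    psi = np.arctan2(dy, dx)                  # direction around the principal point
    return cx + r * np.cos(psi), cy + r * np.sin(psi)

# Example: the cylindrical coordinate straight ahead maps to the principal point.
print(cylinder_to_fisheye(phi=0.0, h=0.0, f=300.0, cx=640.0, cy=640.0))  # (640.0, 640.0)
```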
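
Claim 2 only states that a sliding operation on the screen changes the current viewing angle; how the slide distance translates into an angle is not specified. The sketch below assumes a simple linear sensitivity, and every name and constant in it is hypothetical.

```python
DEGREES_PER_PIXEL = 0.1  # hypothetical sensitivity: view rotation per pixel of slide

def apply_slide(view, dx_pixels, dy_pixels):
    """Update the current viewing angle from a slide gesture and report whether
    it now differs from the initial viewing angle (which triggers remapping)."""
    view["yaw"] += dx_pixels * DEGREES_PER_PIXEL
    view["pitch"] += dy_pixels * DEGREES_PER_PIXEL
    return (view["yaw"], view["pitch"]) != (view["initial_yaw"], view["initial_pitch"])

view = {"yaw": 0.0, "pitch": 0.0, "initial_yaw": 0.0, "initial_pitch": 0.0}
print(apply_slide(view, dx_pixels=40, dy_pixels=0))  # True: viewing angle has changed
```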
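
A minimal sketch of the sparse evaluation in claims 3 and 4, reusing cylinder_to_fisheye from the first sketch: the camera model is evaluated only on a coarse grid of pixel points (the set number of pixel points, giving the first mapping relation), and the coordinates of the remaining pixel points are obtained by interpolating those grid values (the third mapping relation). The grid size, field of view, and intrinsics are illustrative, and OpenCV's remap stands in here for the final per-pixel lookup.

```python
import numpy as np
import cv2  # OpenCV, used for interpolation and the final per-pixel lookup

def build_full_maps(out_w, out_h, fov, f, cx, cy, grid=16):
    """Evaluate the fisheye model on a coarse grid only, then interpolate the
    resulting coordinate maps to every output pixel."""
    phi = np.linspace(-fov / 2.0, fov / 2.0, grid)   # azimuth across the view
    h = np.linspace(-0.5, 0.5, grid)                 # height on the cylinder
    ph, hh = np.meshgrid(phi, h)
    # First mapping relation: the model is evaluated on grid*grid points only.
    gx, gy = cylinder_to_fisheye(ph, hh, f, cx, cy)
    # Third mapping relation: interpolate the sparse maps to full resolution.
    map_x = cv2.resize(gx.astype(np.float32), (out_w, out_h), interpolation=cv2.INTER_LINEAR)
    map_y = cv2.resize(gy.astype(np.float32), (out_w, out_h), interpolation=cv2.INTER_LINEAR)
    return map_x, map_y

# fisheye = cv2.imread("fisheye.jpg")  # hypothetical input image
# mx, my = build_full_maps(1280, 720, fov=np.radians(120), f=300.0, cx=640.0, cy=640.0)
# cylindrical = cv2.remap(fisheye, mx, my, interpolation=cv2.INTER_LINEAR)
```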
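
For claims 5 and 6, the second mapping relation ties the cylindrical projection coordinates at the initial viewing angle to screen vertex coordinates, and the result is rendered through OpenGL. The concrete form of that mapping is not given in the claims, so the sketch below assumes a linear normalization into OpenGL normalized device coordinates and only builds the vertex array such a renderer would consume; the coordinate ranges are hypothetical and the OpenGL buffer and shader setup is omitted.

```python
import numpy as np

def second_mapping(phi_range, h_range):
    """Affine map from cylindrical projection coordinates at the initial viewing
    angle to screen vertex coordinates in OpenGL NDC (-1..1). It is computed once
    and reused when the viewing angle later changes."""
    (phi_min, phi_max), (h_min, h_max) = phi_range, h_range

    def to_screen(phi, h):
        sx = 2.0 * (phi - phi_min) / (phi_max - phi_min) - 1.0
        sy = 2.0 * (h - h_min) / (h_max - h_min) - 1.0
        return sx, sy

    return to_screen

# Screen vertex positions for a grid of cylindrical projection coordinates.
# Paired with fisheye texture coordinates from the first/third mapping
# relations, these arrays are what an OpenGL draw call would consume.
to_screen = second_mapping(phi_range=(-0.8, 0.8), h_range=(-0.5, 0.5))
phi, h = np.meshgrid(np.linspace(-0.8, 0.8, 16), np.linspace(-0.5, 0.5, 16))
sx, sy = to_screen(phi, h)
vertices = np.stack([sx, sy], axis=-1).astype(np.float32)  # shape (16, 16, 2)
print(vertices.shape)
```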
CN201911007209.XA 2019-10-22 2019-10-22 Fisheye image processing method, device, electronic equipment and computer readable medium Active CN110728622B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911007209.XA CN110728622B (en) 2019-10-22 2019-10-22 Fisheye image processing method, device, electronic equipment and computer readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911007209.XA CN110728622B (en) 2019-10-22 2019-10-22 Fisheye image processing method, device, electronic equipment and computer readable medium

Publications (2)

Publication Number Publication Date
CN110728622A true CN110728622A (en) 2020-01-24
CN110728622B CN110728622B (en) 2023-04-25

Family

ID=69221689

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911007209.XA Active CN110728622B (en) 2019-10-22 2019-10-22 Fisheye image processing method, device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN110728622B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180150944A1 (en) * 2016-01-18 2018-05-31 Shenzhen Arashi Vision Company Limited Method and Device For Rectifying Image Photographed by Fish-Eye Lens
US20190012766A1 (en) * 2016-06-17 2019-01-10 Nec Corporation Image processing device, image processing method, and storage medium
US20190014260A1 (en) * 2017-07-04 2019-01-10 Shanghai Xiaoyi Technology Co., Ltd. Method and device for generating a panoramic image
CN107749050A (en) * 2017-09-30 2018-03-02 珠海市杰理科技股份有限公司 Fish eye images antidote, device and computer equipment
CN107820012A (en) * 2017-11-21 2018-03-20 暴风集团股份有限公司 A kind of fish eye images processing method, device, server and system
CN108846796A (en) * 2018-06-22 2018-11-20 北京航空航天大学青岛研究院 Image split-joint method and electronic equipment
CN109308686A (en) * 2018-08-16 2019-02-05 北京市商汤科技开发有限公司 A kind of fish eye images processing method and processing device, equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
吕耀文 et al.: "Real-time correction method for fisheye video image distortion" *
周小康; 饶鹏; 朱秋煜; 陈忻: "Research on fisheye image distortion correction technology" *
李剑; 曾丹; 张之江; 朱沁怡: "Cylindrical-projection panoramic driving recorder based on binocular fisheye cameras" *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112184543A (en) * 2020-09-30 2021-01-05 湖北安泰泽善科技有限公司 Data display method and device for fisheye camera
CN112200064A (en) * 2020-09-30 2021-01-08 腾讯科技(深圳)有限公司 Image processing method and device, electronic equipment and storage medium
CN112184543B (en) * 2020-09-30 2024-04-16 湖北安泰泽善科技有限公司 Data display method and device for fisheye camera
CN112565730A (en) * 2020-12-03 2021-03-26 北京百度网讯科技有限公司 Roadside sensing method and device, electronic equipment, storage medium and roadside equipment
CN112565730B (en) * 2020-12-03 2023-07-25 阿波罗智联(北京)科技有限公司 Road side sensing method and device, electronic equipment, storage medium and road side equipment
WO2023092373A1 (en) * 2021-11-25 2023-06-01 Intel Corporation Methods and apparatus for tile-based stitching and encoding of images
WO2024036764A1 (en) * 2022-08-17 2024-02-22 北京字跳网络技术有限公司 Image processing method and apparatus, device, and medium

Also Published As

Publication number Publication date
CN110728622B (en) 2023-04-25

Similar Documents

Publication Publication Date Title
CN110728622B (en) Fisheye image processing method, device, electronic equipment and computer readable medium
CN112801907B (en) Depth image processing method, device, equipment and storage medium
CN115761090A (en) Special effect rendering method, device, equipment, computer readable storage medium and product
CN115908679A (en) Texture mapping method, device, equipment and storage medium
CN114331823A (en) Image processing method, image processing device, electronic equipment and storage medium
CN111833459B (en) Image processing method and device, electronic equipment and storage medium
WO2024032752A1 (en) Method and apparatus for generating transition special effect image, device, and storage medium
WO2023193639A1 (en) Image rendering method and apparatus, readable medium and electronic device
WO2023193613A1 (en) Highlight shading method and apparatus, and medium and electronic device
CN113274735B (en) Model processing method and device, electronic equipment and computer readable storage medium
CN111915532B (en) Image tracking method and device, electronic equipment and computer readable medium
CN115086541B (en) Shooting position determining method, device, equipment and medium
CN114202617A (en) Video image processing method and device, electronic equipment and storage medium
CN115937290A (en) Image depth estimation method and device, electronic equipment and storage medium
CN112492230B (en) Video processing method and device, readable medium and electronic equipment
CN115358919A (en) Image processing method, device, equipment and storage medium
CN115063335A (en) Generation method, device and equipment of special effect graph and storage medium
CN114419298A (en) Virtual object generation method, device, equipment and storage medium
CN114598824A (en) Method, device and equipment for generating special effect video and storage medium
CN114693860A (en) Highlight rendering method, highlight rendering device, highlight rendering medium and electronic equipment
CN114723600A (en) Method, device, equipment, storage medium and program product for generating cosmetic special effect
CN113891057A (en) Video processing method and device, electronic equipment and storage medium
CN111862342A (en) Texture processing method and device for augmented reality, electronic equipment and storage medium
CN112037280A (en) Object distance measuring method and device
CN110717467A (en) Head pose estimation method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant