CN112991216B - Fisheye image correction method and system, intelligent shelf and image processing method - Google Patents


Publication number
CN112991216B
Authority
CN
China
Prior art keywords
image
target
pixel point
fisheye
point position
Prior art date
Legal status
Active
Application number
CN202110292196.6A
Other languages
Chinese (zh)
Other versions
CN112991216A (en)
Inventor
杨硕
赵雄心
Current Assignee
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202110292196.6A
Publication of CN112991216A
Application granted
Publication of CN112991216B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

In the fisheye image correction method and system, the intelligent shelf, and the image processing method described here, a calibration plane marked with a plurality of straight lines is photographed through the fisheye lens to obtain the mapping relationship between the fisheye image and a rectangular corrected image, that is, the first mapping relationship between the straight lines in the rectangular corrected image and the corresponding arc lines in the fisheye image. The fisheye image is corrected through the first mapping relationship by mapping it onto an imaging plane, yielding a first corrected image; scale compensation is then performed on the first corrected image based on the second mapping relationship to obtain a target corrected image, so that objects at the edge of the first corrected image are enlarged and objects at the center are reduced. The size proportions of edge and center objects in the target corrected image are thus better balanced, which improves the visual browsing experience.

Description

Fisheye image correction method and system, intelligent shelf and image processing method
Technical Field
The specification relates to the field of unmanned retail, in particular to a fisheye image correction method and system, an intelligent shelf and an image processing method.
Background
Current intelligent unmanned retail containers mainly adopt a visual identification scheme: commodities are identified from images acquired by a camera, the state of the equipment can be monitored at any time, and static or dynamic commodity identification is completed, making the scheme widely and generally applicable. The visual scheme mainly uses a fisheye lens as the visual collector to detect and identify the commodities in the container. Compared with an ordinary lens, a fisheye lens has a larger viewing angle and can collect visual information for a whole shelf layer of commodities even when the layer height of the container is limited. However, the edges of the fisheye image collected by a fisheye lens are severely distorted, so a user cannot directly browse the fisheye view. Existing distortion correction methods stretch the edges of the fisheye view, so edge image information may exceed the field of view of the image; an ideal browsing effect cannot be achieved, and commodity identification is also made more difficult. In addition, a fisheye image has small distortion in the middle and large distortion at the edges, so objects of the same size appear larger or smaller across the image field of view, which existing distortion correction methods cannot improve.
Therefore, it is desirable to provide a fisheye image correction method, a fisheye image correction system, an intelligent shelf, and an image processing method with better visual effects.
Disclosure of Invention
The present specification provides a fisheye image correction method and system with a better visual effect, an intelligent shelf, and an image processing method.
In a first aspect, the present specification provides a method for correcting a fisheye image, including: acquiring a first mapping relationship, where the first mapping relationship includes a mapping relationship between a first pixel point position in an M×N imaging plane and a second pixel point position in a fisheye image shot by a fisheye lens, where M is the number of rows of pixel points of the imaging plane, N is the number of columns of pixel points of the imaging plane, and M and N are integers greater than 1; acquiring a target fisheye image, where the fisheye image includes the target fisheye image; based on the first mapping relationship, projecting pixel points in the target fisheye image onto the imaging plane to obtain a first corrected image; and performing scale compensation on the first corrected image based on a preset second mapping relationship, projecting pixel points in the first corrected image onto the imaging plane to obtain a target corrected image, so that objects at the edge of the first corrected image are enlarged and objects at the center of the first corrected image are reduced, where the second mapping relationship includes a mapping relationship between a third pixel point position in the target corrected image and a fourth pixel point position in the first corrected image, and the distance from the third pixel point position to the center of the target corrected image is smaller than the distance from the corresponding fourth pixel point position to the center of the first corrected image.
In some embodiments, the acquiring the first mapping relationship includes: acquiring a fisheye image of a calibration plane shot by the fisheye lens, where the calibration plane includes L straight lines arranged in parallel, and L is a positive integer; fitting L elliptic curves based on the L arc lines corresponding to the L straight lines in the fisheye image, and determining the semi-major axes of the L elliptic curves, where the center points of the L elliptic curves coincide with the center of the fisheye image and the semi-major axes are consistent; for each of the N columns of pixels of the imaging plane: determining a target elliptic curve corresponding to the i-th column of pixels based on the i-th column of pixels and the semi-major axis, and determining the second pixel point position in the fisheye image corresponding to each first pixel point position in the i-th column of pixels based on the mapping relationship between the i-th column of pixels and the target elliptic curve; and establishing the first mapping relationship.
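The claim above is geometric and gives no formulas. Purely as an illustrative sketch, the Python snippet below builds a first-mapping lookup table under one plausible reading of the claim: all fitted ellipses share the fisheye-image center and a common semi-major axis `a`, every column of the imaging plane gets a semi-minor axis interpolated from the L fitted ellipses, and the column's pixels are distributed along the resulting elliptic arc. The function name, the interpolation of semi-minor axes, and the arc parameterization are all assumptions, not the patent's actual construction.

```python
import numpy as np

def build_first_mapping(M, N, a, line_cols, line_b):
    """Build a first-mapping table (imaging plane -> fisheye image).

    a         : common semi-major axis shared by all fitted ellipses
    line_cols : imaging-plane column indices of the L calibrated lines
    line_b    : fitted semi-minor axes of the corresponding L ellipses

    Returns an (M, N, 2) array of (x, y) sample positions in the fisheye
    image, assuming the fisheye circle is centred at (a, a).
    """
    cx = cy = a
    # A semi-minor axis for every column, interpolated from the L fits.
    b = np.interp(np.arange(N), line_cols, line_b)
    mapping = np.zeros((M, N, 2), dtype=np.float32)
    for i in range(N):
        # Arcs bow away from the centre column (left columns bow left).
        sign = -1.0 if i < (N - 1) / 2 else 1.0
        for r in range(M):
            # Vertical position along the arc, in [-a, a].
            y = a * (1.0 - 2.0 * r / (M - 1))
            # Point on the target ellipse x^2/b^2 + y^2/a^2 = 1.
            x = sign * b[i] * np.sqrt(max(0.0, 1.0 - (y / a) ** 2))
            mapping[r, i] = (cx + x, cy + y)
    return mapping
```

The center column gets a semi-minor axis of zero and therefore maps to the straight vertical diameter of the fisheye circle, while columns farther out map to increasingly bowed arcs.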
In some embodiments, the projecting the pixel points in the target fisheye image into the imaging plane based on the first mapping relationship includes: for each of the first pixel point locations in the imaging plane: determining a corresponding second pixel point position in the target fisheye image based on the first mapping relation and the current first pixel point position; determining a target pixel value corresponding to the position of the second pixel point based on the target fisheye image; and assigning the target pixel value to the current first pixel point position.
In some embodiments, the determining the target pixel value corresponding to the second pixel point position includes: selecting, as the target pixel value, the pixel value of the pixel point position closest to the second pixel point position in the target fisheye image; or calculating the target pixel value corresponding to the second pixel point position through an interpolation algorithm based on a plurality of pixel point positions near the second pixel point position.
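The two candidate strategies in the embodiment above can be illustrated as follows. Nearest neighbour simply rounds the fractional sample position; bilinear interpolation, one common interpolation algorithm (the patent does not name a specific one), blends the four surrounding pixels. This is a generic sketch, not code from the patent.

```python
import numpy as np

def sample_bilinear(img, x, y):
    """Sample a single-channel image at fractional position (x, y).

    Nearest-neighbour sampling would simply take img[round(y), round(x)];
    bilinear interpolation instead blends the four surrounding pixels,
    weighted by how close (x, y) lies to each of them.
    """
    h, w = img.shape
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    x0, y0 = max(x0, 0), max(y0, 0)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bottom = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bottom
```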
In some embodiments, the performing, based on a preset second mapping relationship, scale compensation on the first corrected image, and projecting a pixel point in the first corrected image into the imaging plane to obtain a target corrected image includes: for each of the third pixel point locations in the target rectified image: determining a corresponding fourth pixel point position in the first corrected image based on the second mapping relation and the current third pixel point position; determining a target pixel value corresponding to the fourth pixel point position based on the first corrected image; and assigning the target pixel value to the current third pixel point position.
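Both projection steps, fisheye image to first corrected image and first corrected image to target corrected image, follow the same inverse-mapping pattern: iterate over the output pixel positions, look up the corresponding source position, and copy a pixel value. A minimal sketch of that shared loop (the function name and the nearest-neighbour choice are assumptions):

```python
import numpy as np

def remap(src, mapping):
    """Inverse-mapping warp shared by both correction stages.

    `mapping` is an (M, N, 2) array: mapping[r, c] holds the (x, y)
    position in `src` whose value should appear at output pixel (r, c).
    With the first mapping, `src` is the target fisheye image; with the
    second mapping, `src` is the first corrected image.
    """
    M, N = mapping.shape[:2]
    dst = np.zeros((M, N) + src.shape[2:], dtype=src.dtype)
    for r in range(M):
        for c in range(N):
            x, y = mapping[r, c]
            xi, yi = int(round(float(x))), int(round(float(y)))
            # Nearest-neighbour assignment; an interpolation algorithm
            # could be used here instead, as the embodiments note.
            if 0 <= yi < src.shape[0] and 0 <= xi < src.shape[1]:
                dst[r, c] = src[yi, xi]
    return dst
```

Because the mapping is evaluated per output pixel, every position of the output image receives a value and no holes appear, which is why inverse mapping is the usual choice for image warping.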
In some embodiments, the determining the target pixel value corresponding to the fourth pixel point position includes: selecting, as the target pixel value, the pixel value of the pixel point position closest to the fourth pixel point position in the first corrected image; or calculating the target pixel value corresponding to the fourth pixel point position through an interpolation algorithm based on a plurality of pixel point positions near the fourth pixel point position.
In some embodiments, the second mapping relationship includes at least one of: a third mapping relationship, including a mapping relationship between the position of the third pixel point position in the row direction of the target corrected image and the position of the fourth pixel point position in the row direction of the first corrected image; and a fourth mapping relationship, including a mapping relationship between the position of the third pixel point position in the column direction of the target corrected image and the position of the fourth pixel point position in the column direction of the first corrected image.
In some embodiments, the performing the scale compensation on the first corrected image based on the second mapping relationship includes: performing the scale compensation on the first rectified image based on at least one of the row direction and the column direction of the first rectified image.
In some embodiments, the second mapping comprises a non-linear mapping.
In some embodiments, in the second mapping relationship, a rate of change of the third pixel point position in the target rectified image with the fourth pixel point position in the first rectified image is positively correlated with a distance of the fourth pixel point position from a center of the first rectified image.
In some embodiments, the rate of change of the third pixel point position with the fourth pixel point position is greater than 1 at the edges of the first rectified image and less than 1 at the center of the first rectified image.
In some embodiments, the second mapping comprises a one-dimensional quadratic mapping.
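The embodiments above state the properties of the second mapping (quadratic, rate of change below 1 at the center and above 1 at the edge, target positions closer to the center than their sources) but give no coefficients. One hypothetical quadratic that satisfies all three properties is t = a*s^2 + (1 - a)*s on normalized center distances s, t in [0, 1] with 0 < a < 1: its derivative is (1 - a) at the center and (1 + a) at the edge, and t <= s throughout. The sketch below inverts this assumed model to build a per-axis lookup table; the parameter `a` and the formula are illustrative assumptions, not the patent's.

```python
import numpy as np

def quadratic_compensation_map(n, a=0.3):
    """Per-axis lookup table for the scale-compensation (second) mapping.

    Hypothetical forward model: t = a*s**2 + (1 - a)*s, where s and t are
    normalized distances from the image centre in [0, 1] and 0 < a < 1.
    Its derivative dt/ds is (1 - a) < 1 at the centre and (1 + a) > 1 at
    the edge, so edge content is enlarged and central content reduced,
    and t <= s throughout, matching the claimed properties.

    Returns, for each target coordinate 0..n-1, the source coordinate in
    the first corrected image to sample (the inverse of the forward model).
    """
    c = (n - 1) / 2.0
    idx = np.arange(n, dtype=np.float64)
    t = np.abs(idx - c) / c                      # normalized target distance
    # Invert t = a*s^2 + (1 - a)*s for s in [0, 1] (positive root).
    s = (-(1 - a) + np.sqrt((1 - a) ** 2 + 4 * a * t)) / (2 * a)
    return c + np.sign(idx - c) * s * c          # back to pixel coordinates
```

Applying such a table independently along rows and columns corresponds to the third and fourth mapping relationships described above; `a = 0.3` is an arbitrary illustrative compensation strength.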
In a second aspect, the present description provides a fisheye image rectification system, comprising at least one storage medium storing at least one instruction set for fisheye image rectification, and at least one processor; the at least one processor is communicatively connected to the at least one storage medium, wherein when the fisheye image rectification system is operating, the at least one processor reads the at least one instruction set and implements the fisheye image rectification method according to the first aspect of the present description.
In a third aspect, the present specification further provides an intelligent shelf, including at least one carrying device, an image processing device, and a display device, where each carrying device of the at least one carrying device includes a tray and a fisheye lens, and the tray is used for carrying an article; the fisheye lens is positioned above the tray and used for shooting a fisheye image of the object; the image processing device is in communication connection with the fisheye lens during operation, receives the fisheye image, and corrects the target fisheye image based on the fisheye image correction method of the first aspect of the specification to obtain the target corrected image, wherein the target fisheye image comprises a fisheye image corresponding to a target bearing device selected by a target user, and the at least one bearing device comprises the target bearing device; and the display device is in communication connection with the image processing device during operation and displays the target correction image.
In some embodiments, the fisheye lens is mounted at a preset position and a preset angle of the tray.
In a fourth aspect, the present specification further provides an image processing method for the intelligent shelf according to the third aspect of the present specification, including the following steps executed by the image processing apparatus: acquiring the target fisheye image; correcting the target fisheye image based on the fisheye image correction method to obtain the target corrected image; and sending the target corrected image to the display device.
In some embodiments, the obtaining the target fisheye image comprises: receiving an operation instruction of the target user on the display device, which is sent by the display device, and determining the target bearing device corresponding to the operation instruction; and acquiring the target fisheye image corresponding to the target bearing device.
According to the above technical solutions, in the fisheye image correction method and system, the intelligent shelf, and the image processing method provided in this specification, a calibration plane marked with a plurality of straight lines is shot through the fisheye lens to acquire the mapping relationship between the fisheye image and the M×N rectangular corrected image, that is, the first mapping relationship between the straight lines in the rectangular corrected image and the corresponding arc lines in the fisheye image. The fisheye image is corrected through the first mapping relationship and mapped onto the M×N imaging plane to obtain the first corrected image, and scale compensation is performed on the first corrected image based on the second mapping relationship to obtain the target corrected image, so that objects at the edge of the first corrected image are enlarged and objects at the center are reduced. The size proportions of edge and center objects in the target corrected image are thus better balanced, which improves the visual browsing experience.
Other functions of the fisheye image correction method and system, the intelligent shelf, and the image processing method provided in this specification are partially listed in the following description. The following descriptions and examples will make them readily apparent to those of ordinary skill in the art. The inventive aspects of the fisheye image correction method and system, the intelligent shelf, and the image processing method provided herein may be fully explained by the practice or use of the methods, devices, and combinations described in the detailed examples below.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present disclosure; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 shows a schematic structural diagram of an intelligent shelf provided according to an embodiment of the present description;
FIG. 2 illustrates a hardware schematic diagram of a computing device provided in accordance with embodiments of the present description;
fig. 3 illustrates a schematic diagram of a target fisheye image provided in accordance with an embodiment of the present disclosure;
FIG. 4 illustrates a schematic diagram of a calibration plane provided in accordance with embodiments herein;
fig. 5 is a schematic diagram illustrating a fisheye image of a calibration plane provided according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating a first mapping relationship provided in accordance with an embodiment of the present disclosure;
fig. 7 is a flowchart illustrating a method for correcting a fisheye image according to an embodiment of the present disclosure;
fig. 8 is a flowchart illustrating a method for obtaining a first mapping relationship according to an embodiment of the present specification;
fig. 9 is a schematic diagram illustrating a target corrected image of a fisheye image provided in accordance with an embodiment of the present disclosure; and
fig. 10 shows a flowchart of an image processing method provided according to an embodiment of the present specification.
Detailed Description
The following description is presented to enable any person skilled in the art to make and use the present description, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present description. Thus, the present description is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. For example, as used herein, the singular forms "a", "an" and "the" may include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises," "comprising," "includes," and/or "including," when used in this specification, are intended to specify the presence of stated integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
These and other features of the present specification, as well as the operation and function of the elements of the structure related thereto, and the combination of parts and economies of manufacture, may be particularly improved upon in view of the following description. Reference is made to the accompanying drawings, all of which form a part of this specification. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the specification. It should also be understood that the drawings are not drawn to scale.
The flow diagrams used in this specification illustrate the operation of system implementations according to some embodiments of the specification. It should be clearly understood that the operations of the flow diagrams may be performed out of order. Rather, the operations may be performed in reverse order or simultaneously. In addition, one or more other operations may be added to the flowchart. One or more operations may be removed from the flowchart.
The intelligent retail is to use the internet and the internet of things technology, sense consumption habits, predict consumption trends, guide production and manufacture, and provide diversified and personalized products and services for consumers. Intelligent unmanned visual containers are the most typical application of intelligent retail. The intelligent unmanned visual container is an intelligent container which acquires images by means of a camera, completes automatic identification of commodities by means of technologies such as vision and the like, and performs automatic transaction settlement on the commodities. A customer opens a door through face recognition or code scanning, takes out commodities from the intelligent unmanned visual container, closes the door for automatic settlement, completes the whole transaction process, realizes intelligent transaction payment in the real sense, and achieves good user experience of taking the commodities before paying.
The visual container may typically carry a display device. A customer can browse the images shot by the camera through the display device to select commodities. When replenishing goods, a replenishment worker can browse the images shot by the camera through the display device to check whether the commodity placement meets the standard. However, a fisheye image has large distortion, which affects the visual browsing effect. When watching the fisheye image, the replenishment worker only sees distorted commodities in the image and can hardly identify their row and column positions in the visual container quickly, so replenishment is inefficient, the error probability is high, and the replenishment experience is poor. The edges of a fisheye image corrected by the traditional fisheye distortion correction method are stretched severely, and the size ratio of edge articles to middle articles is unbalanced; such an image cannot provide a good visual browsing effect or a friendly user experience, cannot guarantee that replenished commodities meet the placement specification, and cannot guarantee the commodity identification accuracy.
The fisheye image correction method, the fisheye image correction system, the intelligent shelf and the image processing method can correct the fisheye image, and perform scale compensation on the corrected fisheye image, so that the size of an object at the edge is increased, and the size of an object at the center is reduced, so that the size ratio of the object at the edge to the object at the center in the corrected fisheye image is more coordinated, the corrected fisheye image is closer to a real visual browsing effect, and the problem of serious edge distortion of the fisheye image is solved.
The fisheye image correction method and system, the intelligent shelf, and the image processing method can provide a friendlier visual browsing experience for customers and replenishment workers, and can better guide replenishment workers in restocking commodities, helping them improve replenishment standardization, increase replenishment efficiency, and effectively reduce the replenishment error probability. At the same time, the commodity identification success rate can be effectively improved, alleviating problems such as identification difficulty caused by excessive fisheye image distortion.
Fig. 1 shows a schematic structural diagram of an intelligent shelf 001 provided according to an embodiment of the present disclosure. The smart shelf 001 may be used to display and store items. The items may be sporadic objects that may exist individually. Such as a bottle of beverage, a package of snacks, etc. As shown in fig. 1, the smart shelf 001 may include at least one carrier device 400, an image processing device 200, and a display device 800. In some embodiments, smart shelf 001 may also include rack 600.
The rack 600 may be a support base for the smart shelf 001.
At least one carrier 400 may be mounted on the rack 600 for carrying the articles. Fig. 1 shows 5 carriers 400. It should be noted that fig. 1 is only an exemplary illustration, and the number of the carrying devices 400 on the smart shelf 001 may be any number. Each carrier 400 may include a tray 460 and a fisheye lens 480.
The tray 460 may be mounted on the rack 600. The tray 460 may be used to carry items. The items may be displayed on the tray 460 according to a predetermined display rule. For example, the tray 460 may be divided into a plurality of rows, each row displaying the same item, and different rows may display different kinds of items, or the same item.
A fisheye lens 480 may be positioned above the tray 460 for taking a fisheye image of the items currently on the tray 460 of the carrier 400, so as to monitor changes in those items. The image processing device in the smart shelf 001 may recognize, from the fisheye image of the fisheye lens 480, the target item removed from the tray 460 by the user at the current time, such as its name, price, and origin. The fisheye lens 480 may be mounted at a preset position and a preset angle relative to the tray 460; in other words, the fisheye lens 480 and the tray 460 are calibrated according to the preset position and the preset angle.
The image processing apparatus 200 may store data or instructions to perform the image processing method described herein, and may execute or be used to execute the data and/or instructions. The image processing apparatus 200 may include a hardware device having a data information processing function and a program necessary for driving the hardware device to operate. Of course, the image processing apparatus 200 may be only a hardware device having a data processing capability, or only a program running in a hardware device. The image processing apparatus 200 may be in communication connection with the fisheye lens 480 in each carrying apparatus 400 during operation, receive the fisheye image, and correct the target fisheye image selected by the target user based on the image processing method described in this specification to obtain a target corrected image. In some embodiments, the image processing device 200 may also identify the target corrected image, determine the type of target item removed from the tray 460 or the type of target item placed on the tray 460 by the target user at the current time. The target user may be a customer of the smart shelf 001, a restocker, and the like. The communication connection may be a wireless communication connection, such as a network connection, a bluetooth connection, an NFC connection, etc., or may be an electrical connection or a wired communication connection based on an electrical connection.
In some embodiments, the image processing apparatus 200 may include a mobile device, a tablet computer, a notebook computer, an in-built device of a motor vehicle, or the like, or any combination thereof. In some embodiments, the mobile device may include a smart home device, a smart mobile device, or the like, or any combination thereof. In some embodiments, the smart home device may include a smart television, a desktop computer, etc., or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a personal digital assistant, a gaming device, a navigation device, etc., or any combination thereof. In some embodiments, the built-in devices in the motor vehicle may include an on-board computer, an on-board television, and the like. In some embodiments, the image processing apparatus 200 may be a device with positioning technology for positioning the position of the image processing apparatus 200.
The display device 800 may be in communication with the image processing device 200 for displaying the target rectified image. The display device 800 may be used for human-machine interaction with the target user. In some embodiments, the human-machine interaction functions include, but are not limited to: web browsing, word processing, status prompting, operational input, etc. In some embodiments, display device 800 may include a display screen. The display screen may be a touch-screen Liquid Crystal Display (LCD). The display screen has a Graphical User Interface (GUI) that enables the target user to interact with the image processing device 200 by touching the GUI and/or by gestures. In some embodiments, the display device 800 may also include a voice playing device, such as a speaker. The speaker may be any form of device that can deliver an audio signal. The target user can receive voice information through the voice playing device, so as to perform human-computer interaction with the image processing device 200. In some embodiments, the display device 800 may also include a voice capturing device, such as a microphone, through which the target user may input voice instructions to the image processing apparatus 200. In some embodiments, the display device 800 may include one or more of the display screen, the voice playing device, and the voice capturing device. In some embodiments, executable instructions for performing the above human-machine interaction functions are stored in one or more processor-executable computer program products or readable storage media. For convenience of illustration, the following description takes the display screen as an example of the display device 800.
The present description also provides a correction system for fisheye images. The fisheye image rectification system may store data or instructions for performing the fisheye image rectification method described herein, and may execute or be used to execute the data and/or instructions. The fisheye image correcting system can comprise a hardware device with a data information processing function and necessary programs for driving the hardware device to work. Of course, the fisheye image rectification system may be only a hardware device with data processing capability, or only a program running in the hardware device. In some embodiments, the fisheye image rectification system may be the image processing apparatus 200. In some embodiments, the fisheye image rectification system may be any other device or program meeting the above requirements.
FIG. 2 illustrates a hardware schematic diagram of a computing device 300 provided in accordance with embodiments of the present description. In some embodiments, the data or instructions for the image processing apparatus 200 to perform the image processing method may be implemented on the computing device 300. That is, a part of the hardware configuration of the image processing apparatus 200 may be the hardware configuration shown in the computing device 300. In some embodiments, the data or instructions of the fisheye image rectification system to perform the fisheye image rectification method may be implemented on computing device 300. That is, a part of the hardware structure of the fisheye image rectification system may be the hardware structure shown in the computing device 300. The image processing method is described elsewhere in this specification. The correction of fisheye images is described elsewhere in this specification.
As shown in fig. 2, computing device 300 may include at least one storage medium 330 and at least one processor 320. In some embodiments, computing device 300 may also include a communication port 350 and an internal communication bus 310. In some embodiments, computing device 300 may also include I/O components 360.
The internal communication bus 310 may connect various system components to enable data communication between the components of the image processing apparatus 200, including the storage medium 330, the processor 320, the communication port 350, and the I/O component 360. For example, the processor 320 may send data through the internal communication bus 310 to the storage medium 330 or to other hardware such as the I/O component 360. In some embodiments, the internal communication bus 310 may be an Industry Standard Architecture (ISA) bus, an Extended ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, a Peripheral Component Interconnect (PCI) bus, or the like.
The I/O components 360 may be used to input or output signals, data, or information. The I/O component 360 supports input/output between the image processing apparatus 200 and other components. In some embodiments, I/O components 360 may include input devices and output devices. Exemplary input devices may include a camera, a keyboard, a mouse, a display screen, a microphone, and the like, or any combination thereof. Exemplary output devices may include a display device, a voice playback device (e.g., speakers, etc.), a printer, a projector, etc., or any combination thereof. Exemplary display devices may include Liquid Crystal Displays (LCDs), Light Emitting Diode (LED) based displays, flat panel displays, curved displays, television equipment, Cathode Ray Tubes (CRTs), and the like, or any combination thereof.
The communication port 350 may be connected to a network for data communication of the image processing apparatus 200 with the outside. The connection may be a wired connection, a wireless connection, or a combination of both. The wired connection may include an electrical cable, an optical cable, or a telephone line, among others, or any combination thereof. The wireless connection may include bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile networks (e.g., 3G, 4G, or 5G, etc.), and the like, or any combination thereof. In some embodiments, the communication port 350 may be a standardized port, such as RS232, RS485, and the like. In some embodiments, the communication port 350 may be a specially designed port.
Storage media 330 may include data storage devices. The data storage device may be a non-transitory storage medium or a transitory storage medium. For example, the data storage devices may include one or more of a magnetic disk 332, a read-only storage medium (ROM) 334, or a random access storage medium (RAM) 336. The storage medium 330 further comprises at least one instruction set stored in the data storage device. The at least one instruction set is for the image processing and/or the rectification of the fisheye image. The instructions are computer program code that may include programs, routines, objects, components, data structures, procedures, modules, and the like that perform the image processing and fisheye image rectification methods provided herein.
The at least one processor 320 may be communicatively coupled to the at least one storage medium 330 and the communication port 350 via the internal communication bus 310. The at least one processor 320 is configured to execute the at least one instruction set. When the image processing apparatus 200 is running, the at least one processor 320 reads the at least one instruction set and executes the image processing method provided in this specification according to its instructions. When the fisheye image rectification system is running, the at least one processor 320 reads the at least one instruction set and executes the fisheye image rectification method provided in this specification according to its instructions. The processor 320 may perform all the steps involved in these methods. Processor 320 may be in the form of one or more processors. In some embodiments, processor 320 may include one or more hardware processors, such as microcontrollers, microprocessors, reduced instruction set computer (RISC) processors, application-specific integrated circuits (ASICs), application-specific instruction-set processors (ASIPs), central processing units (CPUs), graphics processing units (GPUs), physical processing units (PPUs), microcontroller units, digital signal processors (DSPs), field-programmable gate arrays (FPGAs), advanced RISC machines (ARM), programmable logic devices (PLDs), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof. For illustrative purposes only, only one processor 320 is depicted in the computing device 300 in this description.
It should be noted, however, that the computing device 300 illustrated in this specification may also include multiple processors; thus, the operations and/or method steps disclosed in this specification may be performed by one processor, as described herein, or by a combination of multiple processors. For example, if the processor 320 of the computing device 300 performs steps A and B in this description, it should be understood that steps A and B may also be performed jointly or separately by two different processors 320 (e.g., a first processor performing step A and a second processor performing step B, or the first and second processors jointly performing steps A and B).
Fig. 3 shows a schematic diagram of a target fisheye image 002 provided according to an embodiment of the present disclosure. The target fisheye image 002 shown in fig. 3 may be a fisheye image of the tray 460. As previously described, the items may be displayed on the tray 460 according to a predetermined display rule. For example, the tray 460 may be divided into a plurality of columns, each column displaying the same item; different columns may display different kinds of items, or the same item. For ease of description, the display area on the tray 460 will be described as including multiple columns. As shown in fig. 3, the tray 460 in the target fisheye image 002 includes 7 columns, each column displaying a different type of item. As shown in fig. 3, the distortion at the edge of the target fisheye image 002 is serious while the distortion at the center is small, and the farther from the center of the target fisheye image 002, the more serious the distortion. Objects in the target fisheye image 002 that should lie on the same straight line are distorted into curves, which is not conducive to the target user's observation and affects the visual browsing effect. Therefore, we need to perform fisheye image distortion correction on the target fisheye image 002, establishing a mapping relationship between the fisheye image and the corrected image to obtain a view that is friendly to the human eye. The corrected image may be a rectangular image, and the purpose of the fisheye image distortion correction is to project the content of the target fisheye image 002 into the rectangular corrected image, so that the corrected image conforms to what human eyes visually observe, that is, to the real layout of the articles placed on the tray 460.
At present, fisheye image distortion correction mainly adopts two approaches: correction methods based on a transformation model, and fisheye lens distortion correction methods based on calibration. Transformation-model-based methods mainly fit a polynomial to optimize an objective function, estimate the parameters of the correction model, and derive the corrected image; such methods were studied early, are computationally complex, and have poor real-time performance. Calibration-based correction algorithms mainly calibrate the internal and external parameters of the fisheye lens by means of external equipment, and realize fisheye image distortion correction through coordinate conversion between real-world coordinates and fisheye imaging plane coordinates.
The fisheye image correction method provided by this specification is based on a calibration correction algorithm: a calibration plane marked with a plurality of parallel straight lines is photographed through the fisheye lens 480 to obtain a fisheye image of the calibration plane, thereby obtaining the mapping relation between the parallel straight lines and the arc lines in the fisheye image and realizing the fisheye image correction.
Fig. 4 shows a schematic diagram of a calibration plane 003 provided in accordance with embodiments herein. As shown in fig. 4, the calibration plane 003 includes L straight lines arranged in parallel, wherein L is a positive integer. L may be any positive integer; for example, L may be 1, 2, 3, 4, or even 5, 6, 7, etc. The calibration plane 003 can be located at any position relative to the fisheye lens 480. For example, the calibration plane 003 may be disposed perpendicular to the optical central axis of the fisheye lens 480, or may not be perpendicular to it. For another example, the center of the calibration plane 003 may or may not coincide with the center of the fisheye lens 480. For convenience of illustration, we will describe an example in which the calibration plane 003 is disposed perpendicular to the optical central axis of the fisheye lens 480 and the center of the calibration plane 003 coincides with the center of the fisheye lens 480. We define the L straight lines from left to right as the 1st straight line, ..., the ith straight line, ..., the Lth straight line. In fig. 4, 3 straight lines are shown, i.e., L = 3.
Fig. 5 shows a schematic diagram of a fisheye image 004 of the calibration plane 003 provided according to an embodiment of the present description. As described above, the fisheye image captured by the fisheye lens 480 is distorted. As shown in fig. 5, the L straight lines correspond to L arc lines in the fisheye image 004. Fig. 5 shows the mapping relationship between the positions of the L straight lines in the M × N imaging plane 006 and the L arc lines in the fisheye image 004. Here M is the number of rows of pixels of the imaging plane 006, N is the number of columns of pixels of the imaging plane 006, and M and N are integers greater than 1. For example, M × N may be 640 × 480, 1024 × 768, 1600 × 1200, 2048 × 1536, and so on. M × N may be decided based on the usage scenario of the fisheye lens 480, and can be set or changed based on usage requirements. Specifically, M × N may be set or changed based on the size of the effective region in the fisheye image 004: the imaging plane may just cover the effective area in the fisheye image 004, or may be larger than the effective area. For convenience of description, a solid line in fig. 5 represents the real position of the L straight lines in the M × N imaging plane 006, and a dotted line represents the corresponding L arc lines in the fisheye image 004. Each arc line may be approximately equivalent to an elliptic curve, so the L arc lines are approximately equivalent to L elliptic curves. The centers of the L elliptic curves coincide with the center of the fisheye image 004. For convenience of description, we define the major semi-axis of an elliptic curve as b and its minor semi-axis as a; the major semi-axis of the ith elliptic curve is b_i, and its minor semi-axis is a_i. As shown in fig. 5, the major semi-axes b_i of the L elliptic curves are substantially identical.
For convenience of illustration, we define the center of the fisheye image 004 as the origin O, the row direction of the imaging plane 006 as the X direction, and the column direction of the imaging plane 006 as the Y direction. We define the coordinate system formed by the origin O, the X axis, and the Y axis as the reference coordinate system XOY. The X direction is perpendicular to the Y direction. The X direction corresponds to the arrangement direction of the L straight lines in the calibration plane 003, and the Y direction to their extending direction. As shown in fig. 5, the coordinate value of the ith straight line on the X-axis in the imaging plane 006 is x_i. The minor semi-axis of the ith elliptic curve corresponding to the ith straight line is a_i = |x_i|.
A plurality of positioning points can be selected from each arc line, and based on the coordinates of these positioning points, the equation of the elliptic curve corresponding to the ith arc line is fitted, so as to determine its major semi-axis b_i. The fitting method may be polynomial fitting, least-squares fitting, or the like. The major semi-axes b_1, ..., b_i, ..., b_L corresponding to the L arc lines determine the major semi-axis b. As shown in fig. 5, the equation of the elliptic curve corresponding to the straight line at abscissa x_i parallel to the Y axis can be expressed as:

$$\frac{x^2}{x_i^2} + \frac{y^2}{b_i^2} = 1 \tag{1}$$
It should be noted that the greater the number L of straight lines, the higher the fitting accuracy and the greater the calculation amount. Likewise, the more positioning points selected on each arc line, the higher the fitting precision and the larger the calculated amount.
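As a concrete illustration of this fitting step, the sketch below estimates a single arc's major semi-axis b_i by linear least squares, given its minor semi-axis a_i = |x_i| and a set of positioning points. The coordinates and helper name are hypothetical, not part of the patent:

```python
import numpy as np

def fit_major_semi_axis(points, a):
    """Least-squares estimate of the major semi-axis b of the ellipse
    x^2/a^2 + y^2/b^2 = 1, given the minor semi-axis a and positioning
    points sampled on the arc.  Rearranging: y^2 = b^2 * (1 - x^2/a^2),
    so b^2 follows from a one-parameter linear least-squares fit."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    u = 1.0 - (x / a) ** 2              # regressor
    v = y ** 2                          # response
    b_sq = np.dot(u, v) / np.dot(u, u)  # closed-form least squares
    return float(np.sqrt(b_sq))

# Hypothetical positioning points on an ideal arc with a = 100, b = 240
t = np.linspace(0.1, np.pi / 2, 20)
pts = np.column_stack([100 * np.sin(t), 240 * np.cos(t)])
b_fit = fit_major_semi_axis(pts, a=100)
```

With noisy positioning points, the same closed-form expression returns the least-squares compromise, so accuracy grows with the number of points, as noted above.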
Fig. 6 is a schematic diagram illustrating a first mapping relationship provided according to an embodiment of the present disclosure. The first mapping relationship includes a mapping relationship between a first pixel point position in the M × N imaging plane 006 and a second pixel point position thereof in a fisheye image 004 captured by the fisheye lens 480. For convenience of description, we define the pixel point in the imaging plane 006 as the first pixel point, and define the pixel point in the fisheye image 004 as the second pixel point.
As shown in fig. 6, the center point of the fisheye image 004 coincides with that of the imaging plane 006. The straight line 007, parallel to the Y-axis at abscissa x_j in the imaging plane 006, corresponds to the target elliptic curve 008 shown in the fisheye image 004. The first pixel point P(x_j, y_j) on the straight line 007 corresponds to the second pixel point Q(x_s, y_s) on the target elliptic curve 008. The end point F′(x_j, b) of the straight line 007 corresponds to the second pixel point F(x_F, y_F) on the target elliptic curve 008, and the first pixel point A(x_j, 0) on the straight line 007 corresponds to the second pixel point A(x_j, 0) on the target elliptic curve 008. Given x_j and y_j, the position of the second pixel point Q(x_s, y_s) corresponding to the first pixel point P(x_j, y_j) can be calculated. According to the formula (1), with a = |x_j|:

$$x_s = a\sin\theta_2 \tag{2}$$

$$y_s = b\cos\theta_2 \tag{3}$$

According to a similar principle, the following formula exists:

$$\frac{b - y_j}{b} = \frac{\overset{\frown}{FQ}}{\overset{\frown}{FA}} \tag{4}$$

wherein $\overset{\frown}{FQ}$ represents the arc length between point F and point Q on the target elliptic curve 008, and $\overset{\frown}{FA}$ represents the arc length between point F and point A on the target elliptic curve 008. $\overset{\frown}{FQ}$ and $\overset{\frown}{FA}$ can be expressed as the following equations:

$$\overset{\frown}{FQ} = \int_{\theta_1}^{\theta_2} \sqrt{a^2\cos^2\theta + b^2\sin^2\theta}\,\mathrm{d}\theta \tag{5}$$

$$\overset{\frown}{FA} = \int_{\theta_1}^{\theta_3} \sqrt{a^2\cos^2\theta + b^2\sin^2\theta}\,\mathrm{d}\theta \tag{6}$$

wherein θ1 = ∠BOF, θ2 = ∠BOQ, θ3 = ∠BOA, and B denotes the vertex (0, b) of the target elliptic curve 008.

According to the formulas (2) to (6), the second pixel point Q(x_s, y_s) on the target elliptic curve 008 corresponding to the first pixel point P(x_j, y_j) on the straight line 007 can be determined. Based on the same method, the second pixel point position in the fisheye image 004 corresponding to each first pixel point position on the imaging plane 006 can be acquired, so as to determine the first mapping relationship.
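The per-point mapping described above can be sketched numerically. The code below assumes the arc-length-proportional reading of formulas (4) to (6), places F at the parameter angle π/4, and evaluates the elliptic arc length by the trapezoid rule; the function names and the choice of π/4 are illustrative assumptions, not values prescribed by the patent:

```python
import math

def arc_len(a, b, t0, t1, n=512):
    """Arc length of the ellipse point (a*sin t, b*cos t) for t in [t0, t1],
    via the trapezoid rule on ds = sqrt(a^2 cos^2 t + b^2 sin^2 t) dt."""
    h = (t1 - t0) / n
    s = 0.0
    for i in range(n + 1):
        t = t0 + i * h
        w = 0.5 if i in (0, n) else 1.0
        s += w * math.sqrt(a ** 2 * math.cos(t) ** 2 + b ** 2 * math.sin(t) ** 2)
    return s * h

def line_to_ellipse(xj, yj, b, t_f=math.pi / 4):
    """Map P(xj, yj) on the vertical line x = xj to Q on the ellipse with
    semi-axes a = |xj| and b, so that (b - yj) / b equals the arc-length
    ratio FQ / FA, with F at parameter t_f and A at pi/2 (bisection on t)."""
    a = abs(xj)
    total = arc_len(a, b, t_f, math.pi / 2)   # arc FA
    target = (b - yj) / b * total             # desired arc FQ
    lo, hi = t_f, math.pi / 2
    for _ in range(60):                       # bisection for the parameter of Q
        mid = 0.5 * (lo + hi)
        if arc_len(a, b, t_f, mid) < target:
            lo = mid
        else:
            hi = mid
    t2 = 0.5 * (lo + hi)
    return math.copysign(a * math.sin(t2), xj), b * math.cos(t2)

xs, ys = line_to_ellipse(xj=120.0, yj=0.0, b=300.0)    # P = A stays at A
xf, yf = line_to_ellipse(xj=120.0, yj=300.0, b=300.0)  # line end maps to F
```

The two sample calls check the fixed points of the mapping: the foot point A maps to itself, and the line's end point maps to F.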
Fig. 7 shows a flowchart of a fisheye image rectification method P100 according to an embodiment of the present disclosure. As described above, the fisheye image rectification system may perform the fisheye image rectification method P100 described in this specification. Specifically, when the fisheye image rectification system runs on the computing device 300, the processor 320 may read an instruction set stored in its local storage medium and then execute the fisheye image rectification method P100 described in this specification according to the instruction set. The method P100 may comprise:
S120: acquiring the first mapping relation.
Fig. 8 shows a flowchart for obtaining the first mapping relationship according to an embodiment of the present specification. Specifically, as shown in fig. 8, step S120 may include:
S122: acquiring a fisheye image 004 of the calibration plane 003 shot by the fisheye lens 480. The calibration plane 003 may be located at a preset position relative to the fisheye lens 480. The preset position may be that the calibration plane 003 is disposed perpendicular to the optical central axis of the fisheye lens 480, with the center of the calibration plane 003 coinciding with the center of the fisheye lens 480. As previously described, the items may be displayed on the tray 460 according to a predetermined display rule. For example, the tray 460 may be divided into a plurality of columns, each column displaying the same item; different columns may display different kinds of items, or the same item. In order to make the corrected image conform to the real layout of the articles on the tray 460 as visually observed by human eyes, we can make the directions of the imaging plane 006 coincide with the directions in which the articles are laid out on the tray 460. That is, the row direction (X direction) of the imaging plane 006 is made to coincide with the arrangement direction of the columns of the display area of the articles on the tray 460, and the column direction (Y direction) of the imaging plane 006 is made to coincide with the extending direction of those columns. As described above, the size of the imaging plane 006 can be set or changed according to the usage scenario requirements. In the smart shelf 001, the size of the imaging plane 006 may be set or changed according to the size of the tray 460 and the distance between the tray 460 and the fisheye lens 480, so that the imaging plane 006 covers all the items on the tray 460 and all of them can be included in the corrected image. As described above, the imaging plane 006 may just cover the effective area in the fisheye image 004, or may be larger than the effective area.
In smart shelf 001, the active area may be the area containing all items on tray 460. The farther the tray 460 is from the fisheye lens 480, the smaller the effective area; the closer the tray 460 is to the fisheye lens 480, the larger the active area. When the relative positions where the tray 460 and the fisheye lens 480 are mounted are determined, the effective area may be determined by a calibration method, and the size mxn and the direction of the imaging plane 006 may also be determined by the calibration method. Specifically, the fisheye image rectification system can identify the circle center and the radius in the fisheye image 004, so as to determine the effective area.
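A minimal sketch of such a calibration of the effective area is given below: the dark surround of a fisheye frame is thresholded away, and the circle's center and radius are estimated from the remaining lit pixels. The threshold value, helper name, and synthetic frame are hypothetical:

```python
import numpy as np

def effective_area(gray, thresh=10):
    """Estimate center and radius of the fisheye's circular effective area:
    threshold away the dark surround, then take the centroid of the lit
    pixels and the mean half-extent of their bounding box."""
    ys, xs = np.nonzero(gray > thresh)
    cx, cy = xs.mean(), ys.mean()
    r = 0.25 * ((xs.max() - xs.min()) + (ys.max() - ys.min()))
    return (cx, cy), r

# Synthetic frame: a lit disk of radius 80 centered at (100, 90)
h, w = 200, 220
yy, xx = np.mgrid[0:h, 0:w]
frame = ((xx - 100) ** 2 + (yy - 90) ** 2 <= 80 ** 2).astype(np.uint8) * 255
(cx, cy), r = effective_area(frame)
```

In practice a robust circle fit (e.g. a Hough transform) would replace the bounding-box estimate, but the centroid-and-extent version suffices to size the imaging plane.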
S124: fitting L elliptic curves based on the L arc lines corresponding to the L straight lines in the fisheye image 004, and determining the major semi-axes of the L elliptic curves. Step S124 is shown in fig. 5, and is not described herein again.
S126: for each of N columns of pixels of the imaging plane:
determining the target elliptic curve 008 corresponding to the ith column of pixels based on the ith column of pixels and the major semi-axis; and
determining the second pixel point position in the fisheye image corresponding to each first pixel point position in the ith column of pixels, based on the ith column of pixels and its mapping relation with the target elliptic curve 008.
S128: and establishing the first mapping relation.
Specifically, step S126 and step S128 are shown in fig. 6, and are not described herein again.
S140: the target fisheye image 002 is acquired.
The fisheye image includes the target fisheye image 002. The fisheye image rectification system may be in communication with the fisheye lens 480 to obtain the target fisheye image 002.
S160: based on the first mapping relationship, projecting the pixel points in the target fisheye image 002 to the imaging plane 006 to obtain a first corrected image.
Specifically, step S160 may be: for each of the first pixel point locations in the imaging plane 006: determining a corresponding second pixel point position in the target fisheye image 002 based on the first mapping relation and the current first pixel point position; determining a target pixel value corresponding to the second pixel point position based on the target fisheye image 002; and assigning the target pixel value to the current first pixel point position.
When the second pixel point position in the target fisheye image 002 corresponding to the current first pixel point position is calculated based on the first mapping relationship, that position is not necessarily an integer and may be fractional. In this case, when determining the target pixel value corresponding to that second pixel point position, the pixel value at the pixel point position closest to the second pixel point position in the target fisheye image 002 may be selected as the target pixel value; alternatively, the target pixel value may be calculated by an interpolation algorithm based on a plurality of pixel point positions in the vicinity of the second pixel point position. For example, in the target fisheye image 002, two pixel point positions near the calculated second pixel point position are determined, and the target pixel value corresponding to the second pixel point position is calculated by an interpolation algorithm from the pixel values at those two pixel point positions. The interpolation algorithm may be any interpolation algorithm, such as discrete smooth interpolation, spline interpolation, etc. This is not a limitation of the present specification.
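The per-pixel assignment of step S160, together with bilinear interpolation for fractional second pixel point positions, can be sketched as follows. The map used in the demonstration is a stand-in identity map shifted by half a pixel, not the first mapping relation itself:

```python
import numpy as np

def remap_bilinear(src, map_x, map_y):
    """For every target pixel, sample the source image at the (possibly
    fractional) position (map_x, map_y) by bilinear interpolation of the
    four surrounding source pixels."""
    x0 = np.clip(np.floor(map_x).astype(int), 0, src.shape[1] - 2)
    y0 = np.clip(np.floor(map_y).astype(int), 0, src.shape[0] - 2)
    fx, fy = map_x - x0, map_y - y0
    top = src[y0, x0] * (1 - fx) + src[y0, x0 + 1] * fx
    bot = src[y0 + 1, x0] * (1 - fx) + src[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy

# Demonstration map: identity shifted by half a pixel in x, so each output
# pixel becomes the average of two horizontal neighbours of the source.
src = np.arange(16, dtype=float).reshape(4, 4)
gx, gy = np.meshgrid(np.arange(4, dtype=float), np.arange(4, dtype=float))
out = remap_bilinear(src, gx + 0.5, gy)
```

Nearest-neighbour sampling is the cheaper alternative mentioned above: round `map_x`/`map_y` to integers and index directly.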
The method P100 may further include:
S180: performing scale compensation on the first corrected image based on a preset second mapping relation, and projecting the pixel points of the first corrected image to the imaging plane to obtain the target corrected image, so that objects at the edge of the first corrected image are enlarged and objects at the center of the first corrected image are reduced.
As described above and shown in fig. 3, the edge of the target fisheye image 002 is severely stretched: the size of an article at the edge is reduced while the size of an article at the center is enlarged, so objects of the same size appear larger near the center and smaller toward the edge. The size ratio between articles at the edge and articles in the middle is unbalanced, which prevents a good visual browsing effect, is unfriendly to the user's perception, and makes it impossible to guarantee that restocked products meet the placement specification or that product identification is accurate. Therefore, in order to improve the user's visual browsing effect, the first corrected image needs to be subjected to scale compensation, so that objects at the edge of the first corrected image are enlarged and objects at the center are reduced; the ratio of the size of an object at the edge of the resulting target corrected image to the size of an object at its center is thereby closer to the ratio visually observed by human eyes.
Specifically, the fisheye image rectification system may perform the scale compensation on the first corrected image based on the second mapping relationship. The second mapping relation comprises a mapping relation between pixel point positions in the target corrected image and the corresponding pixel point positions in the first corrected image. For convenience of illustration, a pixel point in the target corrected image is defined as a third pixel point, and a pixel point in the first corrected image is defined as a fourth pixel point. The first corrected image and the target corrected image have the same size.
As described above, the scale compensation of the first corrected image based on the second mapping relationship may cause the size of the object at the center of the target corrected image to be reduced and the size of the object at the edge to be enlarged. Therefore, the distance from a third pixel point position to the center of the target corrected image is smaller than the distance from its corresponding fourth pixel point position to the center of the first corrected image. That is, performing the scale compensation on the first corrected image based on the second mapping relationship gathers the pixel points of the first corrected image toward the center position.
In step S180, the performing the scale compensation on the first corrected image based on the second mapping relationship may include performing the scale compensation on the first corrected image based on at least one of the row direction and the column direction of the first corrected image. In some embodiments, the scale compensating the first rectified image may be the scale compensating the first rectified image in the row direction (X direction) of the first rectified image. In some embodiments, the performing the scale compensation on the first rectified image may be performing the scale compensation on the first rectified image in the column direction (Y direction) of the first rectified image. In some embodiments, the performing the scale compensation on the first rectified image may be performing the scale compensation on the first rectified image in the row direction (X direction) and the column direction (Y direction) of the first rectified image. Thus, the second mapping relationship may include at least one of a third mapping relationship and a fourth mapping relationship. The third mapping relationship may include a mapping relationship between a position of the third pixel point position in the row direction (X direction) in the target corrected image and a position thereof in the row direction (X direction) in the first corrected image. The fourth mapping relationship may include a mapping relationship between a position of the third pixel point position in the column direction (Y direction) in the target corrected image and a position thereof in the column direction (Y direction) in the first corrected image. The third mapping relationship and the fourth mapping relationship may be the same or different. For convenience of illustration, we will describe the third mapping relationship as being the same as the fourth mapping relationship.
In some embodiments, the scale compensation of the first rectified image may be performed by performing the scale compensation on the first rectified image in the row direction (X direction) of the first rectified image and then performing the scale compensation on the first rectified image in the column direction (Y direction). In some embodiments, the scale compensation of the first corrected image may be performed by performing the scale compensation on the first corrected image in the column direction (Y direction) of the first corrected image and then performing the scale compensation on the first corrected image in the row direction (X direction). In some embodiments, the performing the scale compensation on the first rectified image may be performing the scale compensation on the first rectified image in the row direction (X direction) and the column direction (Y direction) of the first rectified image at the same time. The method of performing the scale compensation on the first rectified image in the row direction (X direction) of the first rectified image coincides with the method of performing the scale compensation on the first rectified image in the column direction (Y direction) of the first rectified image. For convenience of illustration, the following description will be given by taking the scale compensation of the first rectified image in the row direction (X direction) of the first rectified image based on the third mapping relationship as an example.
As shown in fig. 3, the farther the distance from the center of the target fisheye image 002, the larger the distortion, that is, the more the size of the object is reduced. Thus, the second mapping comprises a non-linear mapping. Namely, the mapping relationship between the third pixel point position of the target corrected image and the fourth pixel point position of the first corrected image satisfies a non-linear relationship. Since the farther the distance from the center of the target fisheye image 002 is, the larger the distortion is, in the second mapping relationship, the rate of change of the third pixel point position in the target corrected image with the fourth pixel point position in the first corrected image is positively correlated with the distance of the fourth pixel point position from the center of the first corrected image. That is, the farther the distance from the center of the first corrected image, the faster the rate of change of the third pixel point position in the target corrected image with the fourth pixel point position in the first corrected image; the closer the distance from the center of the first corrected image, the slower the rate of change of the third pixel point position in the target corrected image with the fourth pixel point position in the first corrected image. I.e. closer to the center of the first rectified image, the smaller its scale compensation, i.e. closer to the edges of the first rectified image, the larger its scale compensation.
The second mapping relationship may be any non-linear curve that satisfies the above relationship, such as a parabolic curve, a conic curve, an elliptic curve, and so on. For convenience of illustration, we will describe the case in which the second mapping relationship is a quadratic equation in one variable (i.e., a parabolic curve).
For convenience of illustration, in the reference coordinate system XOY, the rate of change of the third pixel point position in the target corrected image with the fourth pixel point position in the first corrected image at the edge of the first corrected image (i.e., at x = x_1 = N/2) is defined as k_1, and the rate of change at the center of the first corrected image (i.e., at x = x_2 = 0) is defined as k_2, wherein k_1 > k_2. k_1 and k_2 may be any parameters that meet the above requirements.
When the second mapping relation satisfies a quadratic equation in one variable, the mapping relationship between the position of the third pixel point in the row direction (X direction) in the target corrected image and its position in the row direction (X direction) in the first corrected image can be expressed as the following formula:

$$x' = cx^2 + dx \tag{7}$$

where x is the position of the fourth pixel point position in the first corrected image in the row direction (X direction), and x′ is the position of the third pixel point position in the target corrected image in the row direction (X direction).
The rate of change of the third pixel point location with the fourth pixel point location may be expressed as:
x ″ ═ 2cx + d equation (8)
When in use
Figure BDA0002982963160000242
When the temperature of the water is higher than the set temperature,
Figure BDA0002982963160000243
when x is equal to x2When 0, x ″)2=d=k2. In this way, it can be seen that,
Figure BDA0002982963160000244
d=k2. Thus, equation (7) can be expressed as:
Figure BDA0002982963160000245
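The quadratic compensation of formula (7) can be sketched numerically. This is a minimal illustration, not the patented implementation: it assumes a coordinate system centered on the image (edge at x = N/2) and hypothetical values for k1 and k2.

```python
def scale_compensation(x, n_cols, k1, k2):
    """Row-direction scale compensation x' = c*x**2 + d*x (formula (7)),
    with d = k2 so the slope at the center (x = 0) is k2, and c chosen so
    the slope reaches k1 at the assumed edge x = n_cols / 2, i.e.
    c = (k1 - k2) / n_cols."""
    c = (k1 - k2) / n_cols
    d = k2
    return c * x * x + d * x

# Hypothetical parameters (illustrative only): N = 100 columns, k1 = 1.5, k2 = 0.5.
N, k1, k2 = 100, 1.5, 0.5
eps = 1e-6

# Numerical slope near the center approaches k2 (slow change, small compensation) ...
slope_center = (scale_compensation(eps, N, k1, k2) - scale_compensation(0.0, N, k1, k2)) / eps
# ... while the slope at the edge approaches k1 (fast change, large compensation).
slope_edge = (scale_compensation(N / 2, N, k1, k2) - scale_compensation(N / 2 - eps, N, k1, k2)) / eps
```

With these particular values the edge coordinate maps to itself, while interior points are pulled toward the center, matching the claimed distance relationship.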
in some embodiments, the rate of change of the third pixel point position with the fourth pixel point position may be greater than 1, e.g., k, at an edge of the target-corrected image1May be 1.5, 1.4, 1.3, 1.2, may be even larger 1.6, 1.7, 1.8, etc., and may be even smaller, e.g., 1.1, 1, 0.9, etc. A rate of change k of the third pixel point position with the fourth pixel point position at a center of the target-corrected image2Can be less than 1, e.g., k2May be 0.3, 0.4, 0.5, and may even be 0.6, 0.7, 0.8, etc.
In some embodiments, the scale compensation may be symmetric with respect to a center of the first rectified image when the scale compensation is performed on the first rectified image. At this time, d is 0.
In some embodiments, step S180 may be: for each third pixel point position in the target corrected image, determining a corresponding fourth pixel point position in the first corrected image based on the second mapping relation and the current third pixel point position; determining a target pixel value corresponding to the fourth pixel point position based on the first corrected image; and assigning the target pixel value to the current third pixel point position.
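The per-pixel procedure of step S180 is a standard backward-warping loop. A minimal sketch, assuming a grayscale image stored as a list of rows, with an arbitrary `inverse_map` callable standing in for the second mapping relation:

```python
def backward_warp(first_image, inverse_map):
    """For each third pixel point position (r, c) in the target corrected
    image, find the corresponding fourth pixel point position in the first
    corrected image via `inverse_map(row, col) -> (row, col)`, then copy
    that pixel value into the target image (nearest-neighbor rounding)."""
    rows, cols = len(first_image), len(first_image[0])
    target = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            src_r, src_c = inverse_map(r, c)
            # The computed fourth pixel point position need not be an
            # integer, so round and clamp to valid indices.
            src_r = min(max(int(round(src_r)), 0), rows - 1)
            src_c = min(max(int(round(src_c)), 0), cols - 1)
            target[r][c] = first_image[src_r][src_c]
    return target

img = [[1, 2], [3, 4]]
# Identity mapping leaves the image unchanged (sanity check).
same = backward_warp(img, lambda r, c: (r, c))
# A point reflection about the image center reverses both axes.
flipped = backward_warp(img, lambda r, c: (1 - r, 1 - c))
```

Iterating over target positions and pulling from source positions, as here, guarantees every output pixel is assigned exactly once, which is why the method maps target positions back to first-image positions rather than the reverse.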
Since the fourth pixel point position corresponding to the current third pixel point position is calculated from the second mapping relationship, it is not necessarily an integer and may be fractional. In that case, when determining the target pixel value corresponding to the fourth pixel point position, the pixel value at the pixel point position closest to the fourth pixel point position in the first corrected image may be selected as the target pixel value; alternatively, the target pixel value may be calculated by an interpolation algorithm based on a plurality of pixel point positions near the fourth pixel point position. For example, in the first corrected image, two pixel point positions near the calculated fourth pixel point position are determined, and the target pixel value is calculated by interpolating between the pixel values at those two positions. The interpolation algorithm may be any interpolation algorithm, such as discrete smooth interpolation, spline interpolation, etc. This specification does not limit it.
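The two sampling strategies above can be sketched for the one-dimensional case (a single row of pixels). `nearest_value` and `linear_interp_value` are illustrative names, and linear interpolation here stands in for whichever interpolation algorithm is chosen:

```python
def nearest_value(row, x):
    """Nearest-neighbor sampling: take the pixel value at the integer
    position closest to the fractional coordinate x."""
    i = min(max(int(round(x)), 0), len(row) - 1)
    return row[i]

def linear_interp_value(row, x):
    """Linear interpolation between the two pixel point positions
    surrounding the fractional coordinate x."""
    x = min(max(x, 0.0), len(row) - 1.0)
    i0 = int(x)
    i1 = min(i0 + 1, len(row) - 1)
    t = x - i0  # fractional part: weight of the right-hand neighbor
    return (1.0 - t) * row[i0] + t * row[i1]

row = [10, 20, 30, 40]
```

For instance, sampling at x = 1.5 returns 20 or 30 under nearest-neighbor (depending on rounding) but the smoother value 25.0 under linear interpolation.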
It should be noted that, in some embodiments, the second mapping relationship may also be a linear mapping; that is, the third pixel point position is in a linear relationship with the fourth pixel point position.
Fig. 9 shows a schematic diagram of a target corrected image 011 of a fisheye image provided according to an embodiment of the present specification. As shown in fig. 3 and fig. 9, after the target fisheye image 002 is corrected by the method P100, the ratio of the size of an object at the edge of the target corrected image 011 to the size of an object at the center is more consistent and closer to the ratio visually observed by human eyes.
In summary, the fisheye image correction method P100 not only corrects the target fisheye image 002 but also performs scale compensation on the corrected first corrected image, enlarging objects at the edge of the first corrected image and reducing objects at its center, so that after scale compensation the ratio of edge-object size to center-object size in the target corrected image 011 better matches the ratio visually observed by human eyes. This improves the visual browsing effect and the visual experience of the target user, and assists the replenishment worker in quickly checking the goods placement state on the tray 460 and verifying that the displayed goods match the system inventory, thereby improving replenishment efficiency. The fisheye image correction method P100 can also alleviate the difficulty of article identification caused by distortion, improving the accuracy of article identification and the user experience.
The present specification also provides a method P200 of image processing. Fig. 10 shows a flowchart of an image processing method P200 provided according to an embodiment of the present specification. As described above, the image processing apparatus 200 can execute the image processing method P200 described in this specification. Specifically, when the image processing apparatus 200 runs on the computing device 300, the processor 320 may read an instruction set stored in its local storage medium and then execute the image processing method P200 described in this specification according to the specification of the instruction set. The method P200 may comprise:
s220: the image processing apparatus 200 acquires the target fisheye image 002.
As mentioned above, the smart shelf 001 may include at least one bearing device 400, and each bearing device 400 includes a fisheye lens 480. The image processing apparatus 200 may be communicatively connected to each fisheye lens 480. The image processing apparatus 200 may perform the image processing on the fisheye image captured by every fisheye lens 480, or only on a selected fisheye image. That is, the target fisheye image 002 may include the fisheye image corresponding to each bearing device 400, or may include the fisheye image corresponding to a target bearing device 400 selected by a target user. The at least one bearing device 400 may include the target bearing device 400.
Specifically, step S220 may include: receiving an operation instruction of the target user on the display device 800 sent by the display device 800, and determining the target bearing device 400 corresponding to the operation instruction; and acquiring the target fisheye image 002 corresponding to the target bearing device 400. As previously mentioned, a human-machine interface may be included on the display device 800, through which the target user may interact with the image processing apparatus 200. For example, the target user (e.g., a replenishment worker) may select the target bearing device 400 via the display device 800; the display device 800 may send the selection to the image processing apparatus 200; and the image processing apparatus 200 may use the fisheye image captured by the fisheye lens 480 corresponding to the target bearing device 400 as the target fisheye image 002 to be corrected.
The method P200 may further include:
S260: the image processing apparatus 200 corrects the target fisheye image 002 based on the fisheye image correction method P100 to obtain the target corrected image 011.
The fisheye image rectification method P100 is as described above, and will not be described herein again.
S280: the image processing apparatus 200 sends the target corrected image 011 to the display device 800. The display device 800 receives and displays the target corrected image 011. The target user can browse images of the articles on the tray 460 of the target bearing device 400 through the display device 800, obtaining a friendlier visual experience; this also gives the replenishment worker better guidance when replenishing goods, which helps improve replenishment accuracy and efficiency.
In some embodiments, before step S260, the method P200 may further include:
s240: determining that the target fisheye image 002 needs to be rotated, and rotating the target fisheye image 002 to enable the target fisheye image 002 to be at a preset angle.
In the smart shelf 001, the preset angle may be an orientation of the tray 460 that better matches human visual observation. For example, the preset angle may be the orientation at which the cabinet door faces downward in the target corrected image 011.
Specifically, in step S240, the image processing apparatus 200 may recognize the position of the cabinet door in the target fisheye image 002 and determine whether the position of the target fisheye image 002 is at the preset angle. When the image processing apparatus 200 recognizes that the position of the target fisheye image 002 is at the preset angle, the step S260 is directly performed without rotating the target fisheye image 002. When the image processing apparatus 200 recognizes that the position of the target fisheye image 002 is not at the preset angle, the target fisheye image 002 needs to be rotated. Specifically, the image processing apparatus 200 may rotate the target fisheye image 002 around the center of the target fisheye image 002 until the target fisheye image 002 is at the preset angle. In some embodiments, the rotation model may be preset in the image processing apparatus 200. The rotation model may be derived by machine learning based on historical samples.
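The rotation of step S240 can be sketched as repeated 90-degree rotations until a recognizer reports the preset orientation. This is a toy illustration: the predicate below stands in for the cabinet-door recognition (or a learned rotation model), which the specification does not detail.

```python
def rotate90(image):
    """Rotate a 2-D list of pixel rows 90 degrees clockwise about its center."""
    return [list(col) for col in zip(*image[::-1])]

def rotate_to_preset(image, is_door_at_bottom):
    """Rotate the image in 90-degree steps until the predicate
    `is_door_at_bottom(image)` (a stand-in for recognizing the cabinet
    door position) reports the preset orientation."""
    for _ in range(4):
        if is_door_at_bottom(image):
            return image
        image = rotate90(image)
    raise ValueError("preset orientation not found")

# Toy example: the 'door' is the row of 9s; the preset angle places it at the bottom.
img = [[9, 9], [0, 0]]  # door currently at the top
rotated = rotate_to_preset(img, lambda im: im[-1] == [9, 9])
```

Here the door starts at the top, so two 90-degree steps (a 180-degree rotation) bring it to the bottom.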
Alternatively, step S240 may be executed after step S260.
To sum up, the fisheye image correction method P100 and system, the intelligent shelf 001, and the image processing method P200 provided in this specification correct the target fisheye image 002 through the first mapping relationship, mapping it onto an M×N imaging plane to obtain the first corrected image, and then perform scale compensation on the first corrected image based on the second mapping relationship to obtain the target corrected image 011. Objects at the edge of the first corrected image are enlarged and objects at the center are reduced, so that the size ratio of edge objects to center objects in the target corrected image is better coordinated and the visual browsing experience is improved. Meanwhile, the method and system can also alleviate the difficulty of article identification caused by distortion, improving the accuracy of article identification and the user experience.
Another aspect of the present description provides a non-transitory storage medium storing at least one set of executable instructions for fisheye image rectification. When executed by a processor, the executable instructions direct the processor to implement the steps of the fisheye image rectification method P100 described herein. In some possible implementations, various aspects of the description may also be implemented in the form of a program product including program code. When the program product is run on the computing device 300, the program code is configured to cause the computing device 300 to perform the steps of the fisheye image rectification method P100 described in this specification. A program product for implementing the above-described method may employ a portable compact disc read only memory (CD-ROM) including program code and may be run on the computing device 300. However, the program product of the present specification is not so limited, and in this specification, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system (e.g., the processor 320). The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. 
The computer readable storage medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. Program code for carrying out operations for this specification may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on computing device 300, partly on computing device 300, as a stand-alone software package, partly on computing device 300 and partly on a remote computing device, or entirely on the remote computing device.
Another aspect of the present description provides a non-transitory storage medium storing at least one set of executable instructions for image processing. When executed by a processor, the executable instructions direct the processor to perform the steps of the image processing method P200 described herein. In some possible implementations, various aspects of the description may also be implemented in the form of a program product including program code. The program code is for causing the computing device 300 to perform the steps of the image processing method P200 described in this specification when the program product is run on the computing device 300. A program product for implementing the above-described method may employ a portable compact disc read only memory (CD-ROM) including program code and may be run on the computing device 300. However, the program product of the present specification is not so limited, and in this specification, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system (e.g., the processor 320). The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. 
The computer readable storage medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. Program code for carrying out operations for this specification may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on computing device 300, partly on computing device 300, as a stand-alone software package, partly on computing device 300 and partly on a remote computing device, or entirely on the remote computing device.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In conclusion, upon reading the present detailed disclosure, those skilled in the art will appreciate that the foregoing detailed disclosure can be presented by way of example only, and not limitation. Those skilled in the art will appreciate that the present specification contemplates various reasonable variations, enhancements and modifications to the embodiments, even though not explicitly described herein. Such alterations, improvements, and modifications are intended to be suggested by this specification, and are within the spirit and scope of the exemplary embodiments of this specification.
Furthermore, certain terminology has been used in this specification to describe embodiments of the specification. For example, "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the specification.
It should be appreciated that in the foregoing description of embodiments of the specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various embodiments. This is not to be taken, however, as meaning that all of those features are required in combination, and it is fully possible for one skilled in the art, on reading this specification, to extract some of the features as separate embodiments. That is, embodiments in this specification may also be understood as an integration of multiple sub-embodiments, and each sub-embodiment remains valid with fewer than all features of a single foregoing disclosed embodiment.
Each patent, patent application, publication of a patent application, and other material, such as articles, books, specifications, publications, documents, and the like, cited herein is hereby incorporated by reference, except for any prosecution history associated therewith, any of the same that is inconsistent with or in conflict with this document, and any of the same that may have a limiting effect on the broadest scope of the claims now or later associated with this document. For example, if there is any inconsistency or conflict between the description, definition, and/or use of a term associated with any of the incorporated material and the description, definition, and/or use of that term associated with this document, the term in this document controls.
Finally, it should be understood that the embodiments of the application disclosed herein are illustrative of the principles of the embodiments of the present specification. Other modified embodiments are also within the scope of this description. Accordingly, the disclosed embodiments are to be considered in all respects as illustrative and not restrictive. Those skilled in the art may implement the applications in this specification in alternative configurations according to the embodiments in this specification. Therefore, the embodiments of the present description are not limited to the embodiments described precisely in the application.

Claims (17)

1. A method of correcting a fisheye image, comprising:
acquiring a first mapping relationship, wherein the first mapping relationship comprises a mapping relationship between a first pixel point position in an M×N imaging plane and a second pixel point position in a fisheye image shot by a fisheye lens, wherein M is the number of rows of pixel points of the imaging plane, N is the number of columns of pixel points of the imaging plane, and M and N are integers greater than 1;
acquiring a target fisheye image, wherein the fisheye image comprises the target fisheye image;
based on the first mapping relation, projecting pixel points in the target fisheye image to the imaging plane to obtain a first corrected image; and
performing scale compensation on the first corrected image based on a preset second mapping relationship, and projecting pixel points in the first corrected image to the imaging plane to obtain a target corrected image, so that an object at the edge of the first corrected image is enlarged and an object at the center of the first corrected image is reduced, wherein the second mapping relationship comprises a mapping relationship between a third pixel point position of the target corrected image and a fourth pixel point position of the first corrected image, and the distance from the third pixel point position of the target corrected image to the center of the target corrected image is smaller than the distance from the corresponding fourth pixel point position of the first corrected image to the center of the first corrected image.
2. The fisheye image rectification method of claim 1, wherein the obtaining the first mapping comprises:
acquiring a fisheye image of a calibration plane shot by the fisheye lens, wherein the calibration plane comprises L straight lines arranged in parallel, and L is a positive integer;
fitting L elliptic curves based on L arc lines corresponding to the L straight lines in the fisheye image, and determining the semi-major axes of the L elliptic curves, wherein the central points of the L elliptic curves coincide with the center of the fisheye image, and the semi-major axes are consistent;
for each of N columns of pixels of the imaging plane:
determining a target elliptic curve corresponding to the ith column of pixels based on the ith column of pixels and the semi-major axis; and
determining the second pixel point position in the fisheye image corresponding to each first pixel point position in the ith column of pixels, based on the ith column of pixels and the mapping relationship with the target elliptic curve; and
and establishing the first mapping relation.
3. The fisheye image rectification method of claim 2, wherein the projecting pixel points in the target fisheye image into the imaging plane based on the first mapping relation comprises:
for each of the first pixel point locations in the imaging plane:
determining a corresponding second pixel point position in the target fisheye image based on the first mapping relation and the current first pixel point position;
determining a target pixel value corresponding to the position of the second pixel point based on the target fisheye image; and
assigning the target pixel value to the current first pixel point position.
4. The fisheye image rectification method of claim 3, wherein the determining the target pixel value corresponding to the second pixel point position comprises:
selecting a pixel value corresponding to a pixel point position closest to the second pixel point position in the target fisheye image as the target pixel value; or
calculating the target pixel value corresponding to the second pixel point position through an interpolation algorithm based on a plurality of pixel point positions near the second pixel point position.
5. The fisheye image rectification method of claim 1, wherein the performing scale compensation on the first rectification image based on the preset second mapping relationship and projecting pixel points in the first rectification image to the imaging plane to obtain a target rectification image comprises:
for each of the third pixel point locations in the target rectified image:
determining a corresponding fourth pixel point position in the first corrected image based on the second mapping relation and the current third pixel point position;
determining a target pixel value corresponding to the fourth pixel point position based on the first corrected image; and
assigning the target pixel value to the current third pixel point position.
6. The fisheye image rectification method of claim 5, wherein the determining the target pixel value corresponding to the fourth pixel point position comprises:
selecting a pixel value corresponding to a pixel point position closest to the fourth pixel point position in the first corrected image as the target pixel value; or
calculating the target pixel value corresponding to the fourth pixel point position through an interpolation algorithm based on a plurality of pixel point positions near the fourth pixel point position.
7. The fisheye image rectification method of claim 1, wherein the second mapping relationship comprises at least one of:
a third mapping relationship, comprising a mapping relationship between the position in the row direction of the third pixel point position in the target corrected image and the position in the row direction of the fourth pixel point position in the first corrected image; and
a fourth mapping relationship, comprising a mapping relationship between the position in the column direction of the third pixel point position in the target corrected image and the position in the column direction of the fourth pixel point position in the first corrected image.
8. The fisheye image rectification method of claim 7, wherein the performing scale compensation on the first rectified image based on the second mapping relation comprises:
performing the scale compensation on the first rectified image based on at least one of the row direction and the column direction of the first rectified image.
9. The fisheye image rectification method of claim 1, wherein the second mapping comprises a non-linear mapping.
10. The fisheye image rectification method of claim 9, wherein in the second mapping relationship, a rate of change of the third pixel point position in the target rectification image with the fourth pixel point position in the first rectification image positively correlates with a distance of the fourth pixel point position from a center of the first rectification image.
11. The fisheye image rectification method of claim 10, wherein at an edge of the first rectified image, a rate of change of the third pixel point position with the fourth pixel point position is greater than 1, and at a center of the first rectified image, a rate of change of the third pixel point position with the fourth pixel point position is less than 1.
12. The fisheye image rectification method of claim 9, wherein the second mapping comprises a one-dimensional quadratic mapping.
13. A system for rectification of fisheye images, comprising:
at least one storage medium storing at least one instruction set for fisheye image rectification; and
at least one processor communicatively coupled to the at least one storage medium,
wherein when the fisheye image rectification system is in operation, the at least one processor reads the at least one instruction set and implements the fisheye image rectification method of any of claims 1-12.
14. A smart shelf, comprising:
at least one carrier, each of the at least one carrier comprising:
a tray for carrying an article; and
the fisheye lens is positioned above the tray and used for shooting a fisheye image of the object;
an image processing device, which is in communication connection with the fisheye lens during operation, receives the fisheye image, and corrects the target fisheye image based on the fisheye image correction method of any one of claims 1 to 12 to obtain the target corrected image, where the target fisheye image includes a fisheye image corresponding to a target bearing device selected by a target user, and the at least one bearing device includes the target bearing device; and
and the display device is in communication connection with the image processing device during operation and displays the target correction image.
15. The smart shelf of claim 14, wherein the fisheye lens is mounted at a preset position and a preset angle of the tray.
16. An image processing method for the smart shelf of claim 15, comprising performing, by the image processing apparatus:
acquiring the target fisheye image;
correcting the target fisheye image based on the fisheye image correction method to obtain the target corrected image; and
and sending the target correction image to the display device.
17. The image processing method of claim 16, wherein the acquiring the target fisheye image comprises:
receiving an operation instruction of the target user on the display device, which is sent by the display device, and determining the target bearing device corresponding to the operation instruction; and
and acquiring the target fisheye image corresponding to the target bearing device.
CN202110292196.6A 2021-03-18 2021-03-18 Fisheye image correction method and system, intelligent shelf and image processing method Active CN112991216B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110292196.6A CN112991216B (en) 2021-03-18 2021-03-18 Fisheye image correction method and system, intelligent shelf and image processing method


Publications (2)

Publication Number Publication Date
CN112991216A CN112991216A (en) 2021-06-18
CN112991216B true CN112991216B (en) 2022-05-13

Family

ID=76334390

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110292196.6A Active CN112991216B (en) 2021-03-18 2021-03-18 Fisheye image correction method and system, intelligent shelf and image processing method

Country Status (1)

Country Link
CN (1) CN112991216B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB202117102D0 (en) * 2021-11-26 2022-01-12 Ocado Innovation Ltd Calibrating a camera for mapping image pixels to grid points in a storage system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6036601B2 (en) * 2013-08-09 2016-11-30 株式会社デンソー Image processing apparatus and image processing method
CN106981050A (en) * 2016-01-18 2017-07-25 深圳岚锋创视网络科技有限公司 The method and apparatus of the image flame detection shot to fish eye lens
CN106600548B (en) * 2016-10-20 2020-01-07 广州视源电子科技股份有限公司 Fisheye camera image processing method and system
CN106780374B (en) * 2016-12-01 2020-04-24 哈尔滨工业大学 Fisheye image distortion correction method based on fisheye imaging model
CN107749050B (en) * 2017-09-30 2020-05-15 珠海市杰理科技股份有限公司 Fisheye image correction method and device and computer equipment
CN108462838B (en) * 2018-03-16 2020-10-02 影石创新科技股份有限公司 Panoramic video anti-shake method and device and portable terminal
CN108830810A (en) * 2018-06-07 2018-11-16 辽宁工业大学 A kind of fisheye image distortion correction method based on rectangular projection
CN110189269B (en) * 2019-05-23 2023-06-09 Oppo广东移动通信有限公司 Correction method, device, terminal and storage medium for 3D distortion of wide-angle lens

Also Published As

Publication number Publication date
CN112991216A (en) 2021-06-18

Similar Documents

Publication Publication Date Title
US11231845B2 (en) Display adaptation method and apparatus for application, and storage medium
US10713812B2 (en) Method and apparatus for determining facial pose angle, and computer storage medium
JP6153727B2 (en) Apparatus and method for scaling application program layout on video display apparatus
US10969949B2 (en) Information display device, information display method and information display program
US8773591B1 (en) Method and apparatus for interacting with television screen
GB2528948A (en) Activation target deformation using accelerometer or gyroscope information
US9448638B2 (en) Mobile devices for transmitting and receiving data using gesture
US9117395B2 (en) Method and apparatus for defining overlay region of user interface control
CN111353458B (en) Text box labeling method, device and storage medium
US10516820B2 (en) Electronic device for controlling focus of lens and method for controlling the same
CN105760070B (en) Method and apparatus for simultaneously displaying more items
CN108364209A (en) Methods of exhibiting, device, medium and the electronic equipment of merchandise news
US10817054B2 (en) Eye watch point tracking via binocular and stereo images
CN112991216B (en) Fisheye image correction method and system, intelligent shelf and image processing method
CN107770604A (en) The method of electronic installation and operation electronic installation
US20160188282A1 (en) Image Receiving Apparatus and Method Thereof for Determining an Orientation of a Screen of an Electronic Apparatus
US20220293066A1 (en) Rotational image viewer
KR102237520B1 (en) Method of providing virtual exhibition space for efficient data management
CN111914693A (en) Face posture adjusting method, system, device, equipment and medium
CN110858814B (en) Control method and device for intelligent household equipment
US20130162562A1 (en) Information processing device and non-transitory recording medium storing program
US11899918B2 (en) Method, apparatus, electronic device and storage medium for invoking touch screen magnifier
WO2024051639A1 (en) Image processing method, apparatus and device, and storage medium and product
CN113128463A (en) Image recognition method and system
US20110235917A1 (en) Digital image analyzing method and related computer program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant