CN112567727A - Image processing apparatus and image processing system

Info

Publication number
CN112567727A
CN112567727A (application CN201980053880.XA)
Authority
CN
China
Prior art keywords
image
area
data
image processing
information
Prior art date
Legal status (assumed; not a legal conclusion)
Granted
Application number
CN201980053880.XA
Other languages
Chinese (zh)
Other versions
CN112567727B (en)
Inventor
松原义明 (Matsubara Yoshiaki)
Current Assignee (listed assignees may be inaccurate)
Sony Semiconductor Solutions Corp
Original Assignee
Sony Semiconductor Solutions Corp
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corp
Publication of CN112567727A
Application granted
Publication of CN112567727B
Legal status: Active


Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 — Camera processing pipelines; Components thereof
    • H04N23/45 — Generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N23/60 — Control of cameras or camera modules
    • H04N23/65 — Control of camera operation in relation to power supply
    • H04N23/66 — Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/665 — Internal camera communication with the image sensor, e.g. synchronising or multiplexing SSIS control signals
    • H04N23/70 — Circuitry for compensating brightness variation in the scene
    • H04N23/73 — Compensating brightness variation by influencing the exposure time
    • H04N23/76 — Compensating brightness variation by influencing the image signals
    • H04N23/95 — Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 — Using two or more images to influence resolution, frame rate or aspect ratio
    • H04N25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/443 — Extracting pixel data by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
    • H04N5/222 — Studio circuitry; Studio devices; Studio equipment
    • H04N5/265 — Mixing
    • G — PHYSICS
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

An image processing apparatus (200) is provided, comprising: a communication unit (202) capable of communicating with each of a plurality of image sensors, each of which transmits, in different packets, area image data indicating an image of each line corresponding to an area set with respect to a captured image, and additional data including, for each area, area information corresponding to that area; and a processing unit (204) that processes the area image data acquired from each of the plurality of image sensors in association with each area, based on the area information included in the additional data acquired from each of the plurality of image sensors, the area information including a part or all of: information for identifying the area, information indicating the position of the area, and information indicating the size of the area.

Description

Image processing apparatus and image processing system
Technical Field
The present disclosure relates to an image processing apparatus and an image processing system.
Background
Technologies related to a compound-eye imaging apparatus including a plurality of imaging units are being developed. An example of such a technique is the technique described in the following patent document 1.
Reference list
Patent document
Patent document 1: JP 2007-110499A.
Disclosure of Invention
Technical problem
For example, in the case of the technique described in patent document 1, when a specific subject is detected in an image captured by one of the imaging units constituting the imaging apparatus, imaging is performed by the other imaging units constituting the imaging apparatus. However, while the technique described in patent document 1 enables a plurality of images to be obtained by imaging, processing the plurality of obtained images in association with one another is not particularly considered.
The present disclosure proposes a novel and improved image processing apparatus and image processing system capable of processing images respectively obtained from a plurality of image sensors in association.
Solution to the problem
The present disclosure provides an image processing apparatus including: a communication unit capable of communicating with each of a plurality of image sensors configured to transmit additional data and area image data in different groups, respectively, the additional data including, for each area, area information corresponding to an area set with respect to a captured image, the area image data indicating an image of each line corresponding to the area; and a processing unit configured to process the area image data acquired from each of the plurality of image sensors in association with each area based on area information included in the additional data acquired from each of the plurality of image sensors, wherein the area information includes a part or all of identification information of the area, information indicating a position of the area, and information indicating a size of the area.
In addition, the present disclosure provides an image processing system including: a plurality of image sensors configured to transmit additional data including, for each area, area information corresponding to an area set with respect to a captured image and area image data indicating an image of each line corresponding to the area, in different packets, respectively; and an image processing apparatus, wherein the image processing apparatus includes: a communication unit capable of communicating with each of the plurality of image sensors; and a processing unit configured to process the area image data acquired from each of the plurality of image sensors in association with each area based on area information included in the additional data acquired from each of the plurality of image sensors, and the area information includes a part or all of identification information of the area, information indicating a position of the area, and information indicating a size of the area.
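The area information described above (identification information, position, size) and its association with per-line area image data can be sketched in Python. All class names and field layouts here are illustrative assumptions for exposition, not the actual data format defined by the present disclosure:

```python
from dataclasses import dataclass


@dataclass
class AreaInfo:
    """Area information carried in the additional data (illustrative)."""
    area_id: int  # identification information of the area
    x: int        # position of the area (upper-left corner, assumed)
    y: int
    width: int    # size of the area
    height: int


@dataclass
class AreaImage:
    """Area image data accumulated line by line for one area."""
    info: AreaInfo
    rows: list


def group_by_area(additional_data, area_rows):
    """Associate each received line of area image data with its area,
    using the area information carried in the additional data."""
    by_id = {info.area_id: AreaImage(info, []) for info in additional_data}
    for area_id, row in area_rows:
        by_id[area_id].rows.append(row)
    return by_id
```

A processing unit receiving additional data first, then per-line area image data, could use such a grouping step before associating areas across sensors.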
Advantageous effects of the invention
According to the present disclosure, images respectively obtained from a plurality of image sensors can be processed in association.
It should be noted that the above-described advantageous effects are not necessarily restrictive, and any of the advantageous effects described in the present specification or other advantageous effects that can be understood from the present specification may be produced in addition to or instead of the above-described advantageous effects.
Drawings
Fig. 1 is an explanatory diagram showing an example of the configuration of an information processing system according to the present embodiment.
Fig. 2 is an explanatory diagram showing a format of a packet defined in the MIPI CSI-2 standard.
Fig. 3 is an explanatory diagram showing a format of a packet defined in the MIPI CSI-2 standard.
Fig. 4 is an explanatory diagram showing an example of signal waveforms related to transmission of packets in the MIPI CSI-2 standard.
Fig. 5 is an explanatory diagram showing an example of a region to be set with respect to an image.
Fig. 6 is an explanatory diagram showing an example of data to be transmitted by the first transmission system relating to the transmission method according to the present embodiment.
Fig. 7 is an explanatory diagram for explaining an example of embedded data to be transmitted by the first transmission system according to the present embodiment.
Fig. 8 is an explanatory diagram for explaining an example of area information included in the embedded data shown in fig. 7.
Fig. 9 is an explanatory diagram showing another example of a region to be set with respect to an image.
Fig. 10 is an explanatory diagram showing an example of data to be transmitted by the second transmission system relating to the transmission method according to the present embodiment.
Fig. 11 is a block diagram showing an example of the configuration of an image sensor according to the present embodiment.
Fig. 12 is a block diagram showing an example of the configuration of an image processing apparatus according to the present embodiment.
Fig. 13 is a block diagram showing an example of a functional configuration of a communication circuit included in the image processing apparatus according to the present embodiment.
Fig. 14 is a block diagram showing an example of a functional configuration of an image processing circuit included in the image processing apparatus according to the present embodiment.
Fig. 15 is an explanatory diagram for explaining an example of processing in the information processing system according to the present embodiment.
Fig. 16 is an explanatory diagram for explaining an example of processing in the information processing system according to the present embodiment.
Fig. 17 is an explanatory diagram for explaining an example of processing in the information processing system according to the present embodiment.
Detailed Description
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the present specification and the drawings, components having substantially the same functional configuration will be denoted by the same reference numerals, and overlapping description thereof will be omitted.
In addition, hereinafter, the description will be given in the order of the following description.
1. Transmission method according to the present embodiment, image processing method according to the present embodiment, and image processing system according to the present embodiment
[1] Configuration of image processing system to which transmission method according to the present embodiment can be applied
[2] Application example of image processing system according to the present embodiment
[3] Transmission method according to the present embodiment
[4] Configuration examples of image sensor and image processing apparatus constituting the image processing system according to the present embodiment
[5] Example of processing in the image processing system according to the present embodiment
[6] Example of advantageous effects produced by using the image processing system according to the present embodiment
2. Program according to the present embodiment
(Transmission method according to the present embodiment, image processing method according to the present embodiment, and image processing system according to the present embodiment)
[1] Configuration of image processing system to which transmission method according to the present embodiment can be applied
First, an example of a configuration of an image processing system to which the transmission method according to the present embodiment can be applied will be described.
Hereinafter, a case where the communication system between the devices constituting the image processing system according to the present embodiment is a communication system conforming to the MIPI (Mobile Industry Processor Interface) CSI-2 (Camera Serial Interface 2) standard will be described as an example. However, the communication system between the apparatuses constituting the image processing system according to the present embodiment is not limited to a communication system conforming to the MIPI CSI-2 standard. For example, communication between devices constituting the image processing system according to the present embodiment may conform to another standard developed by the MIPI Alliance, such as the MIPI CSI-3 standard or the MIPI DSI (Display Serial Interface) standard. In addition, it goes without saying that a communication system to which the transmission method according to the present embodiment can be applied is not limited to a communication system related to a standard developed by the MIPI Alliance.
Fig. 1 is an explanatory diagram showing an example of the configuration of an image processing system 1000 according to the present embodiment. Examples of the image processing system 1000 include a communication apparatus such as a smartphone and a mobile body such as a drone (a device that can be operated by remote control or a device that can be autonomously operated) or an automobile. An application example of the image processing system 1000 is not limited to the above example. Other application examples of the image processing system 1000 will be described later.
For example, the image processing system 1000 has image sensors 100A and 100B, an image processing apparatus 200, a memory 300, and a display device 400. Hereinafter, the term "image sensor 100" will be used when referring collectively to the image sensors 100A and 100B and when referring to one of the image sensors 100A and 100B.
The image sensor 100 has an imaging function and a transmission function, and transmits data indicating an image generated by imaging. The image processing apparatus 200 receives data transmitted from the image sensor 100 and processes the received data. In other words, in the image processing system 1000, the image sensor 100 performs the role of a transmitting device, and the image processing device 200 performs the role of a receiving device.
Although fig. 1 shows the image processing system 1000 having two image sensors 100, the number of image sensors 100 included in the image processing system according to the present embodiment is not limited to the example shown in fig. 1. For example, the image processing system according to the present embodiment may have three or more image sensors 100.
In addition, in the image processing system according to the present embodiment, a plurality of image sensors 100 can be modularized. For example, an image sensor module obtained by modularizing a plurality of image sensors 100 is provided with a plurality of image sensors 100, a processor (not shown) for the image sensor module, and a recording medium readable by the processor. For example, the recording medium constituting the image sensor module records information (e.g., data indicating an angle of view, etc.) relating to the angle of view of the image sensor 100 constituting the image sensor module. In addition, the processor constituting the image sensor module transmits information related to the angle of view to the image processing apparatus 200 via an arbitrary transmission path.
Further, although fig. 1 shows the image processing system 1000 having one image processing apparatus 200, the number of image processing apparatuses 200 included in the image processing system according to the present embodiment is not limited to the example shown in fig. 1. For example, the image processing system according to the present embodiment may have two or more image processing apparatuses 200. In an image processing system having a plurality of image processing apparatuses 200, a plurality of image sensors 100 correspond to each image processing apparatus 200. Even in an image processing system having a plurality of image sensors 100 and a plurality of image processing apparatuses 200, communication is performed between each image sensor 100 and the image processing apparatus 200 in a manner similar to the image processing system 1000 shown in fig. 1.
The image sensor 100 and the image processing apparatus 200 are electrically connected through a data bus B1. The data bus B1 is a transmission path of signals that connect the image sensor 100 and the image processing apparatus 200 to each other. For example, data indicating an image transmitted from the image sensor 100 (hereinafter, sometimes described as "image data") is transmitted from the image sensor 100 to the image processing apparatus 200 via the data bus B1.
The signal transmitted by the data bus B1 in the image processing system 1000 is transmitted according to a communication system conforming to a prescribed standard such as the MIPI CSI-2 standard.
Fig. 2 and 3 are explanatory diagrams showing formats of packets defined in the MIPI CSI-2 standard. Fig. 2 shows a format of a short packet defined in the MIPI CSI-2 standard, and fig. 3 shows a format of a long packet defined in the MIPI CSI-2 standard.
A long packet refers to data consisting of a packet header ("PH" shown in fig. 3), a payload ("Payload Data" shown in fig. 3), and a packet footer ("PF" shown in fig. 3). A short packet, shown in fig. 2, refers to data having a structure similar to that of the packet header ("PH" shown in fig. 3).
In both the short packet and the long packet, a VC (virtual channel) number ("VC" shown in figs. 2 and 3) is recorded in the header portion, and an arbitrary VC number may be assigned to each packet. Packets assigned the same VC number are handled as packets belonging to the same image data.
In addition, in both the short packet and the long packet, a DT (data type) value ("data type" shown in fig. 2 and 3) is recorded in the header portion. Therefore, in a manner similar to the VC number, packets assigned the same DT value can also be handled as packets belonging to the same image data.
The end of a long packet is detected from the number of words recorded in the "Word Count" field of its header portion. An error correction code is recorded in the "ECC" field of the header portion of both the short packet and the long packet.
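As a rough illustration of the header fields described above (virtual channel number, data type value, word count, ECC), the following Python sketch parses a 4-byte packet header using the general MIPI CSI-2 layout (a data-identifier byte holding VC and DT, a 16-bit little-endian word count, and an ECC byte). The exact bit layout should be checked against the standard; this is a reading sketch, not a conformant implementation:

```python
from dataclasses import dataclass


@dataclass
class PacketHeader:
    vc: int          # virtual channel number
    dt: int          # data type value
    word_count: int  # payload length in words (long packet)
    ecc: int         # error correction code


def parse_packet_header(header: bytes) -> PacketHeader:
    """Parse a 4-byte CSI-2 packet header: DI byte (VC + DT), WC, ECC."""
    if len(header) != 4:
        raise ValueError("CSI-2 packet header is 4 bytes")
    di = header[0]
    vc = (di >> 6) & 0x3                # upper bits: virtual channel number
    dt = di & 0x3F                      # lower 6 bits: data type value
    wc = header[1] | (header[2] << 8)   # little-endian word count
    return PacketHeader(vc=vc, dt=dt, word_count=wc, ecc=header[3])
```

Packets whose parsed `vc` (or `dt`) values match would then be handled as belonging to the same image data, as described above.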
In the MIPI CSI-2 standard, a high-speed differential signal is used during a period in which a data signal is transmitted, and a low-power signal is used during a blanking period of the data signal. In addition, a period in which the high-speed differential signal is used is referred to as a period of HPS (high-speed state), and a period in which the low-power signal is used is referred to as a period of LPS (low-power state).
Fig. 4 is an explanatory diagram showing an example of signal waveforms related to transmission of packets in the MIPI CSI-2 standard. A in fig. 4 shows an example of transmission of a packet, and B in fig. 4 shows another example of transmission of a packet. The "ST", "ET", "PH", "PF", "SP", and "PS" shown in fig. 4 are abbreviations for the following, respectively.
ST: transmission start
ET: end of transmission
pH: packet header
PF: grouped page feet
SP: short packet
PS: packet spacing
As shown in fig. 4, the amplitude of the differential signal differs between the LPS period ("LPS" shown in fig. 4) and the HPS period (the portions other than "LPS" shown in fig. 4). Therefore, from the viewpoint of improving transmission efficiency, it is desirable that LPS periods be as short as possible.
For example, the image sensor 100 and the image processing apparatus 200 are electrically connected to each other through a control bus B2 different from the data bus B1. The control bus B2 is a transmission path of other signals that connect the image sensor 100 and the image processing apparatus 200 to each other. For example, control information output from the image processing apparatus 200 is transmitted from the image processing apparatus 200 to the image sensor 100 via the control bus B2.
For example, the control information includes information for control and a processing command. Examples of the information for control include data for controlling functions in the image sensor 100, such as one or two or more of data indicating an image size, data indicating a frame rate, and data indicating an amount of output delay from an output command of a received image to an output image. In addition, the control information may include identification information indicating the image sensor 100. Examples of the identification information include any data capable of identifying the image sensor 100, such as an ID set to the image sensor 100.
The information transmitted from the image processing apparatus 200 to the image sensor 100 via the control bus B2 is not limited to the above example. For example, the image processing apparatus 200 may transmit area specifying information specifying an area in an image via the control bus B2. Examples of the region specifying information include data in any format capable of identifying a region, such as data indicating the positions of pixels included in the region (for example, coordinate data in which the positions of pixels included in the region are represented by coordinates).
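The region specifying information mentioned above can be, for example, coordinate data identifying a rectangular region. As a minimal sketch, the encoding below packs an upper-left position and a size into four 16-bit little-endian fields; this wire format is a hypothetical choice for illustration, not one defined by the present embodiment or the control bus:

```python
import struct


def encode_region_spec(x: int, y: int, width: int, height: int) -> bytes:
    """Encode region-specifying information as coordinate data
    (hypothetical format: four 16-bit little-endian fields)."""
    return struct.pack("<4H", x, y, width, height)


def decode_region_spec(payload: bytes):
    """Decode the hypothetical coordinate-data format back into
    (x, y, width, height)."""
    return struct.unpack("<4H", payload)
```

An image processing apparatus could transmit such a payload over the control bus B2, and the image sensor would decode it to identify the specified region.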
Although fig. 1 shows an example in which the image sensor 100 and the image processing apparatus 200 are electrically connected to each other through the control bus B2, the image sensor 100 and the image processing apparatus 200 may not be connected through the control bus B2. For example, the image sensor 100 and the image processing apparatus 200 can transmit and receive control information and the like by wireless communication based on an arbitrary communication system.
In addition, although the image sensor 100A and the image sensor 100B are not electrically connected to each other in fig. 1, alternatively, the image sensor 100A and the image sensor 100B may be electrically connected to each other through a transmission path capable of performing communication by an arbitrary communication system. When the image sensor 100A and the image sensor 100B are electrically connected to each other, the image sensor 100A and the image sensor 100B can directly communicate with each other. As an example, the image sensor 100A and the image sensor 100B may communicate with each other by inter-processor communication including communication via transmission paths between processors included in the image sensors 100A and 100B, respectively. As shown in fig. 1, even when the image sensor 100A and the image sensor 100B are not electrically connected to each other, the image sensor 100A and the image sensor 100B can communicate with each other via the image processing apparatus 200.
Hereinafter, each apparatus constituting the image processing system 1000 shown in fig. 1 will be described.
[1-1] memory 300
The memory 300 is a recording medium included in the image processing system 1000. Examples of the memory 300 include a volatile memory such as a RAM (random access memory) and a nonvolatile memory such as a flash memory. The memory 300 operates using power supplied from an internal power supply (not shown) constituting the image processing system 1000, such as a battery, or power supplied from an external power supply of the image processing system 1000.
For example, the memory 300 stores an image output from the image sensor 100. Recording of the image to the memory 300 is controlled by the image processing apparatus 200, for example.
[1-2] display device 400
The display device 400 is a display device included in the image processing system 1000. Examples of the display device 400 include a liquid crystal display and an organic EL display (organic electroluminescence display). The display device 400 operates using power supplied from an internal power supply (not shown) constituting the image processing system 1000, such as a battery, or power supplied from an external power supply of the image processing system 1000.
For example, various images and screens such as an image output from the image sensor 100, a screen related to an application executed in the image processing apparatus 200, and a screen related to a UI (user interface) are displayed on the display screen of the display device 400. The display of images and the like on the display screen of the display device 400 is controlled by the image processing apparatus 200, for example.
[1-3] image sensor 100
The image sensor 100 has an imaging function and a transmission function, and transmits data indicating an image generated by imaging. As described previously, the image sensor 100 performs the role of a transmitting device in the image processing system 1000.
Examples of the image sensor 100 include an imaging device such as a digital still camera, a digital video camera, or a stereo camera, and an image sensor device of any system capable of generating an image such as an infrared sensor or a distance image sensor, and the image sensor 100 has a function of transmitting the generated image. The image generated in the image sensor 100 corresponds to data indicating a sensing result in the image sensor 100. An example of the configuration of the image sensor 100 will be described later.
The image sensor 100 transmits image data corresponding to a region set with respect to an image (hereinafter, referred to as "region image data") by a transmission method according to the present embodiment, which will be described later. Control related to transmission of area image data is performed by, for example, a component serving as an image processing unit in the image sensor 100 (which will be described later). The region set with respect to the image may be referred to as an ROI (region of interest). Hereinafter, the region set with respect to the image may be referred to as "ROI".
Examples of the processing related to setting the region with respect to the image include any processing capable of recognizing a partial region in the image (or any processing capable of cutting out a partial region in the image), such as "processing for detecting an object from the image and setting a region including the detected object" or "processing for setting a region specified by an operation performed with respect to any operation means".
The processing related to setting the area with respect to the image may be performed by the image sensor 100 or by an external device such as the image processing device 200. When the image sensor 100 performs processing related to setting a region with respect to an image, the image sensor 100 recognizes the region according to the result of the processing related to setting the region with respect to the image. In addition, for example, when the external device performs processing related to setting the area with respect to the image, the image sensor 100 identifies the area based on the area specifying information acquired from the external device.
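The "processing for detecting an object from the image and setting a region including the detected object" mentioned above can be illustrated with a minimal bounding-box computation over a binary detection mask. The mask representation and helper name are assumptions for the sketch; any real detector would supply its own region candidates:

```python
def set_region_from_mask(mask):
    """Set a region (bounding box) that includes the detected object,
    given a binary detection mask as a list of rows of 0/1 values."""
    ys = [i for i, row in enumerate(mask) if any(row)]
    xs = [j for row in mask for j, v in enumerate(row) if v]
    if not ys:
        return None  # no object detected; no region is set
    x, y = min(xs), min(ys)
    return {"x": x, "y": y,
            "width": max(xs) - x + 1, "height": max(ys) - y + 1}
```

The resulting rectangle could then serve as the ROI, whether the region is set inside the image sensor 100 or supplied as region specifying information from an external device.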
By having the image sensor 100 transmit the area image data, or in other words, transmit data of a portion of an image, the amount of data associated with the transmission is reduced compared to transmitting the entire image. Therefore, by causing the image sensor 100 to transmit the area image data, for example, various advantageous effects are produced by reducing the amount of data (such as reducing the transmission time and reducing the load associated with transmission in the image processing system 1000).
Alternatively, the image sensor 100 may also transmit data indicating the entire image.
When the image sensor 100 has a function of transmitting area image data and a function of transmitting data indicating an entire image, the image sensor 100 can selectively switch between transmitting the area image data and transmitting the data indicating the entire image.
For example, the image sensor 100 transmits area image data or transmits data indicating an entire image according to a set operation mode. The setting of the operation mode is performed, for example, by an operation with respect to an arbitrary operation device.
In addition, the image sensor 100 can selectively switch between transmitting area image data and transmitting data indicating an entire image based on area specifying information acquired from an external device. For example, when the area specifying information is acquired from the external apparatus, the image sensor 100 transmits area image data of an area corresponding to the area specifying information, and when the area specifying information is not acquired from the external apparatus, the image sensor 100 transmits data indicating the entire image.
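The sender-side switching described above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: when area specifying information has been acquired from an external device, only the specified regions are cut out and transmitted; otherwise the whole image is transmitted. All function and parameter names are assumptions.

```python
# Hypothetical sketch of the switching rule: region image data when area
# specifying information is present, the entire image otherwise.

def select_transmission(frame, area_specifying_info):
    """Return (kind, payload) for the data the sensor would transmit.

    frame: 2-D list of pixel rows (the whole captured image).
    area_specifying_info: list of (x, y, width, height) tuples, or None.
    """
    if not area_specifying_info:
        return ("whole_image", frame)
    regions = []
    for (x, y, w, h) in area_specifying_info:
        # Cut out only the pixels belonging to the specified region.
        region = [row[x:x + w] for row in frame[y:y + h]]
        regions.append(region)
    return ("region_image_data", regions)
```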
[1-4] Image processing apparatus 200
The image processing apparatus 200 receives data transmitted from the image sensor 100, and processes the received data by, for example, performing processing related to the image processing method according to the present embodiment. As described previously, the image processing apparatus 200 has the role of a receiving device in the image processing system 1000. An example of a configuration relating to processing of data transmitted from the image sensor 100 (a configuration for realizing the role as a receiving device) will be described later.
For example, the image processing apparatus 200 is constituted by one or more processors each including an arithmetic circuit such as an MPU (micro processing unit), various processing circuits, and the like. The image processing apparatus 200 operates using power supplied from an internal power supply (not shown) constituting the image processing system 1000, such as a battery, or power supplied from an external power supply of the image processing system 1000.
The image processing apparatus 200 processes image data acquired from each of the plurality of image sensors 100 by performing processing related to the image processing method according to the present embodiment.
In the image processing system 1000, the image sensor 100 transmits area image data by a transmission system according to a transmission method which will be described later. The image processing apparatus 200 processes area image data acquired from each of the plurality of image sensors 100 in association with each area set for an image.
More specifically, for example, the image processing apparatus 200 combines, for each region, images indicated by region image data acquired from each of the plurality of image sensors 100.
In so doing, the image processing apparatus 200 combines the images by aligning the relative positions of the images indicated by the area image data of the objects to be combined. For example, the image processing apparatus 200 aligns the relative positions of the images indicated by the area image data based on the information about the angle of view acquired from each image sensor 100 that has transmitted the area image data (or the information about the angle of view acquired from the aforementioned image sensor module: hereinafter, a similar description will be applied). In addition, the image processing apparatus 200 can align the relative positions of the images indicated by the area image data by performing arbitrary object detection processing with respect to each image indicated by the area image data and detecting a corresponding object.
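The alignment step above can be sketched as overlaying region images onto a shared canvas after shifting each by its sensor's relative position. The per-sensor (dx, dy) offsets stand in for the alignment derived from angle-of-view information or object detection; all names here are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch: combine region images from multiple sensors by
# aligning their relative positions on a common canvas.

def combine_regions(canvas_size, regions):
    """Overlay region images onto a shared canvas.

    regions: list of (pixels, top_left, offset) where pixels is a 2-D list,
    top_left is the region's (x, y) in its sensor's image, and offset is the
    sensor's assumed (dx, dy) relative position.
    """
    w, h = canvas_size
    canvas = [[None] * w for _ in range(h)]
    for pixels, (x, y), (dx, dy) in regions:
        for r, row in enumerate(pixels):
            for c, v in enumerate(row):
                canvas[y + dy + r][x + dx + c] = v
    return canvas
```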
The process for associating the area image data acquired from each of the plurality of image sensors 100 with each area is not limited to the above-described example.
For example, the image processing apparatus 200 may combine the images indicated by the area image data with their signal levels matched. The image processing apparatus 200 realizes such level-matched combination, for example, by obtaining correction gains that correct the respective sensitivity ratios of the image sensors 100 that have transmitted the area image data, based on information related to imaging (to be described later) acquired from each of the plurality of image sensors 100. In this case, an example of the sensitivity of the image sensor 100 is the photoelectric conversion rate of the image sensor device included in the image sensor 100.
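A minimal sketch of the level matching just described, under the assumption that each sensor reports a scalar sensitivity value (e.g. a photoelectric conversion rate): a correction gain is derived from the ratio of a reference sensitivity to the sensor's sensitivity and applied to that sensor's pixel values. The function names and the scalar model are illustrative assumptions.

```python
# Sketch of matching signal levels between sensors via correction gains
# derived from sensitivity ratios.

def correction_gain(reference_sensitivity, sensor_sensitivity):
    # A sensor twice as sensitive as the reference needs half the gain
    # for its output levels to match the reference.
    return reference_sensitivity / sensor_sensitivity

def apply_gain(region, gain):
    # Scale every pixel value of a region image by the correction gain.
    return [[pixel * gain for pixel in row] for row in region]
```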
The processing in the image processing apparatus 200 is not limited to the above example.
For example, the image processing apparatus 200 may perform any processing that can be performed with respect to image data, such as RGB (red, green, and blue) processing, YC processing, and γ processing.
In addition, the image processing apparatus 200 executes various types of processing, such as processing related to controlling recording of image data to a recording medium such as the memory 300, processing related to controlling display of an image on a display screen of the display device 400, and processing for executing arbitrary application software. Examples of the processing related to controlling recording include "processing for transmitting control data including a recording command and data to be recorded in a recording medium to a recording medium such as the memory 300". In addition, examples of the processing related to controlling display include "processing for transmitting control data including a display command and data to be displayed on a display screen to a display device such as the display device 400".
Further, the image processing apparatus 200 may control functions in the image sensor 100, for example, by transmitting control information to the image sensor 100. The image processing apparatus 200 can also control data to be transmitted from the image sensor 100, for example, by transmitting area specifying information to the image sensor 100.
An example of the configuration of the image processing apparatus 200 will be described later.
For example, the image processing system 1000 is configured as shown in fig. 1. It should be noted that the configuration of the image processing system according to the present embodiment is not limited to the example shown in fig. 1.
For example, in the case where an image transmitted from the image sensor 100 is stored in a recording medium outside the image processing system, in the case where an image transmitted from the image sensor 100 is stored in a memory provided in the image processing apparatus 200, or in the case where an image transmitted from the image sensor 100 is not recorded, the image processing system according to the present embodiment need not have the memory 300.
In addition, the image processing system according to the present embodiment may be configured without the display device 400 shown in fig. 1.
Further, the image processing system according to the present embodiment may have any configuration according to functions provided in an electronic apparatus to which the image processing system according to the present embodiment is to be applied, which will be described later.
[2] Application example of image processing system according to the present embodiment
Although the image processing system has been described above as the present embodiment, the present embodiment is not limited to this mode. For example, the present embodiment can be applied to various electronic apparatuses including a communication device such as a smartphone, a mobile body such as an unmanned aerial vehicle (an apparatus that can be operated by remote control or an apparatus that can be autonomously operated), or an automobile, a computer such as a PC (personal computer), a tablet type device, and a game machine.
[3] Transmission method according to the present embodiment
Next, a transmission method according to the present embodiment will be described. Hereinafter, a case where the transmission method according to the present embodiment is applied to the image sensor 100 will be described.
(1) First transmission system
Fig. 5 is an explanatory diagram showing an example of a region to be set with respect to an image. In fig. 5, four regions including a region 1, a region 2, a region 3, and a region 4 are shown as an example of the regions. Needless to say, the area to be set with respect to the image is not limited to the example shown in fig. 5.
For example, the image sensor 100 transmits, in separate groups, "additional data including, for each region, region information corresponding to the regions set with respect to an image (such as regions 1 to 4 shown in fig. 5)" and "region image data indicating the image of each row (column) corresponding to a region". When the position of a pixel is indicated by two-dimensional plane coordinates (x, y), a row in the image refers to the set of pixels sharing the same y-coordinate.
The area information according to the present embodiment refers to data (data group) for identifying an area to be set with respect to an image from the receiving apparatus side. For example, the area information includes a part or all of identification information of the area, information indicating a position of the area, and information indicating a size of the area.
The information included in the area information is not limited to the above-described example. The area information may include any information for identifying, from the reception apparatus side, an area set with respect to the image. For example, when areas are divided by VC number, the VC number may serve as the identification information of the area included in a row. In addition, when areas are divided by VC number, the payload length may substitute for the information indicating the size of the area included in a row.
Examples of the identification information of the area include any data capable of uniquely identifying the area, such as data indicating an ID of the area, such as a number added to the area. Hereinafter, the identification information of the region may be referred to as "ROI ID".
The information indicating the position of the region is data indicating the position of the region in the image. An example of the information indicating the position of the region is "data indicating an arbitrary position of the region that can be uniquely identified by being combined with the size of the region indicated by the information indicating the size of the region", such as data indicating the upper left position of the region in the image by two-dimensional plane coordinates (x, y).
Examples of the information indicating the size of the area include data indicating the number of rows of the area (data indicating the number of pixels in the vertical direction in the area) and data indicating the number of columns of the area (data indicating the number of pixels in the horizontal direction in the area). It should be noted that the information indicating the size of the area may be data in any format capable of identifying the size of the area, such as data indicating a rectangular area (for example, data indicating the number of pixels in the horizontal direction and the number of pixels in the vertical direction in the rectangular area).
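The region information fields just described (identification, position, size) can be expressed as a simple record. The field names mirror the items of fig. 8 (ROI ID, upper-left coordinate, height, width), but the class itself and its helper method are illustrative assumptions, not part of any standard.

```python
# Sketch of the area information as a record: identification information,
# information indicating the position, and information indicating the size.
from dataclasses import dataclass

@dataclass
class RegionInfo:
    roi_id: int   # identification information of the area ("ROI ID")
    x: int        # upper-left x coordinate of the area in the image
    y: int        # upper-left y coordinate of the area in the image
    width: int    # number of pixels in the horizontal direction
    height: int   # number of pixels in the vertical direction

    def contains(self, px, py):
        # The position and size together uniquely identify the area's extent.
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)
```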
Hereinafter, an example of processing according to the first transmission system in the image sensor 100 will be described.
The image sensor 100 stores the area information in "embedded data" of one packet and causes the packet to be transmitted. In addition, the image sensor 100 stores the area image data in the payload of another packet, and causes the packet to be transmitted in rows.
The "embedded data" refers to data that can be embedded in a packet to be transmitted, and corresponds to additional data that is additionally transmitted by the image sensor 100. Hereinafter, the embedded data may also be referred to as "EBD".
Fig. 6 is an explanatory diagram showing an example of data to be transmitted by the first transmission system relating to the transmission method according to the present embodiment. Fig. 6 shows an example in which the region information corresponding respectively to region 1, region 2, region 3, and region 4 shown in fig. 5 is stored as "embedded data", and the region image data is stored in the payload of the long packet of MIPI shown in fig. 3 to be transmitted in rows.
The "FS" shown in fig. 6 is an FS (start of frame) packet in the MIPI CSI-2 standard, and the "FE" shown in fig. 6 is an FE (end of frame) packet in the MIPI CSI-2 standard (similar description applies to other drawings).
As described above, "embedded data" shown in fig. 6 is data that can be embedded in a packet to be transmitted. For example, "embedded data" may be embedded in the header, payload, or footer of the packet to be sent. In the example shown in fig. 6, the region information is stored in "embedded data" of one packet, and the "embedded data" storing the region information corresponds to the additional data.
The information included in the additional data according to the present embodiment is not limited to the above-described example. For example, the additional data according to the present embodiment may include information on imaging of the image sensor 100. Examples of the information on imaging of the image sensor 100 include a part or all of exposure information indicating an exposure value or the like in the image sensor device, gain information indicating a gain in the image sensor device, and sensitivity information indicating a photoelectric conversion rate in the image sensor device. Each of the exposure value indicated by the exposure information and the gain indicated by the gain information is set to the image sensor device, for example, by control of the image processing apparatus 200 via the control bus B2.
Fig. 7 is an explanatory diagram for explaining an example of embedded data to be transmitted by the first transmission system according to the present embodiment. The PH and the data following it shown in fig. 7 are examples of the embedded data shown in fig. 6.
In embedded data, the type of data included in the embedded data is defined by, for example, "data format code".
In the example shown in fig. 7, each of "first ROI information" and "second ROI information" … immediately following the "data format code" corresponds to an example of region information. In other words, the embedded data shown in fig. 7 is an example of additional data including region information.
For example, in the area information shown in fig. 7, the "value" includes identification information of the area, information indicating the position of the area, and information indicating the size of the area. In addition, the "value" may include information about imaging of the image sensor 100. In the region information shown in fig. 7, the boundary with other region information included in the embedded data is defined by, for example, "length".
Fig. 8 is an explanatory diagram for explaining an example of area information included in the embedded data shown in fig. 7. The "ROI ID" shown in fig. 8 corresponds to identification information of the region, and the "upper left coordinate" shown in fig. 8 corresponds to information indicating the position of the region. In addition, "height" and "width" shown in fig. 8 correspond to information indicating the size of the area.
Needless to say, the data configuration examples of the area information and data included in the embedded data are not limited to the examples shown in fig. 7 and 8.
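The embedded-data layout of fig. 7 (a data format code followed by length-delimited region-information records) can be walked with a simple parser. The concrete byte widths used here (a 1-byte format code and a 1-byte "length" per record) are assumptions made for illustration only; the actual encoding is defined by the transmission standard in use.

```python
# Hedged sketch of parsing embedded data: a format code, then records whose
# boundaries are defined by a "length" field, as in fig. 7.

def parse_embedded_data(payload):
    """payload: bytes -> (format_code, [record_bytes, ...])"""
    format_code = payload[0]          # "data format code" defines the type
    records = []
    pos = 1
    while pos < len(payload):
        length = payload[pos]         # "length" delimits each ROI record
        records.append(payload[pos + 1:pos + 1 + length])
        pos += 1 + length
    return format_code, records
```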
Referring again to fig. 6, an example of data transmitted by the first transmission system will be described. Each of "1", "2", "3", and "4" shown in fig. 6 corresponds to the area image data of area 1, area 2, area 3, and area 4, respectively, to be stored in the payload of a packet. Although fig. 6 depicts each piece of area image data as divided, the divisions are merely indicated for convenience; the data stored in the payload is not actually divided.
In the first transmission system, the area information respectively corresponding to the area 1, the area 2, the area 3, and the area 4 shown in fig. 5 is stored in "embedded data" of one packet as shown in fig. 7 to be transmitted. In addition, in the first transmission system, the area image data respectively corresponding to the area 1, the area 2, the area 3, and the area 4 shown in fig. 5 are stored in the payload of the long packet of the MIPI shown in fig. 6 to be transmitted in a row.
(2) Second transmission system
The transmission method that can be applied to the image processing system 1000 according to the present embodiment is not limited to the transmission method according to the first transmission system.
For example, the image sensor 100 may store the region information and the region image data in the payload of a packet and cause the packet to be transmitted in a row.
Fig. 9 is an explanatory diagram showing another example of the region to be set with respect to the image. In fig. 9, four regions including a region 1, a region 2, a region 3, and a region 4 are shown as an example of the regions.
Fig. 10 is an explanatory diagram showing an example of data to be transmitted by the second transmission system relating to the transmission method according to the present embodiment. Fig. 10 shows an example in which the region information and the region image data respectively corresponding to region 1, region 2, region 3, and region 4 shown in fig. 9 are stored in the payload of the long packet of MIPI shown in fig. 3 to be transmitted in rows.
The "PH" shown in fig. 10 denotes a packet header of a long packet. In this case, the packet header of the long packet according to the second transmission system may be used as data (change information) indicating whether the information included in the region information has changed from the region information included in the packet to be transmitted last. In other words, "PH" shown in fig. 10 may be considered as data indicating a data type of a long packet.
As an example, when the information included in the region information has changed from the region information included in the packet to be transmitted last, the image sensor 100 sets "PH" to "0x38". In this case, the image sensor 100 stores the area information in the payload of the long packet.
As another example, when the information included in the region information has not changed from the region information included in the packet to be transmitted last, the image sensor 100 sets "PH" to "0x39". In this case, the image sensor 100 does not store the area information in the payload of the long packet. In other words, the image sensor 100 does not cause the area information to be transmitted when the information included in the area information has not changed from the area information included in the packet to be transmitted last.
Needless to say, the data set to "PH" is not limited to the above-described example.
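The change-detection rule above can be sketched as follows: region information is placed in the payload only when it differs from the previously transmitted region information, and the packet header value signals which case applies. The builder function and packet representation are illustrative assumptions; the values 0x38 and 0x39 follow the example given in the text.

```python
# Sketch of the second transmission system's PH rule: 0x38 when region
# information is included (it changed), 0x39 when it is omitted (unchanged).

def build_long_packet(region_info, previous_region_info, region_image_data):
    if region_info != previous_region_info:
        # Changed: store the region information at the head of the payload.
        return {"PH": 0x38, "payload": [region_info, region_image_data]}
    # Unchanged: do not cause the region information to be transmitted.
    return {"PH": 0x39, "payload": [region_image_data]}
```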
"information" in fig. 10 indicates region information stored in the payload. As shown in fig. 10, the region information is stored in the header of the payload.
Each of "1", "2", "3", and "4" shown in fig. 10 corresponds to the area image data of area 1, area 2, area 3, and area 4, respectively, to be stored in the payload. Although fig. 10 depicts each piece of area image data as divided, the divisions are merely indicated for convenience; the data stored in the payload is not actually divided.
In the second transmission system, the area information and the area image data respectively corresponding to the area 1, the area 2, the area 3, and the area 4 shown in fig. 9 are stored in the payload of a long packet of MIPI as shown in fig. 10, for example, to be transmitted in a row.
Therefore, when the second transmission system is used, the image sensor 100 can transmit a region of arbitrary shape set with respect to the image.
[4] Configuration examples of image sensor and image processing apparatus constituting the image processing system according to the present embodiment
Next, an example of a configuration of the image sensor 100 capable of performing processing according to the above-described transmission method and an example of a configuration of the image processing apparatus 200 capable of performing processing according to the above-described image processing method will be described.
[4-1] Configuration of the image sensor 100
Fig. 11 is a block diagram showing an example of the configuration of the image sensor 100 according to the present embodiment. For example, the image sensor 100 includes a photoelectric conversion unit 102, a signal processing unit 104, a communication unit 106, and a control unit 108. The image sensor 100 operates using power supplied from an internal power supply (not shown) constituting the image processing system 1000, such as a battery, or power supplied from an external power supply of the image processing system 1000.
The photoelectric conversion unit 102 is constituted by a lens/imaging element 150, and the signal processing unit 104 is constituted by a signal processing circuit 152. The lens/imaging element 150 and the signal processing circuit 152 function as an image sensor device in the image sensor 100. In the image processing system 1000, all the image sensors 100 may include the same type of image sensor device, or part of the image sensors 100 may include different types of image sensor devices. Examples of the image sensor 100 including different types of image sensor devices are the image sensor 100 including an image sensor device that images a color image and the image sensor 100 including an image sensor device that images a black-and-white image.
The communication unit 106 is constituted by a communication circuit 154, and the control unit 108 is constituted by a processor 156. The operation of each of the lens/imaging element 150, the signal processing circuit 152, and the communication circuit 154 is controlled by a processor 156.
It should be noted that the functional blocks of the image sensor 100 shown in fig. 11 have been created by dividing the functions included in the image sensor 100 for convenience, and are not limited to the example shown in fig. 11. For example, the signal processing unit 104 and the control unit 108 shown in fig. 11 may also be considered as a single processing unit.
The lens/imaging element 150 is constituted by, for example, a lens of an optical system and an image sensor using a plurality of imaging elements such as CMOS (complementary metal oxide semiconductor) or CCD (charge coupled device). In the lens/imaging element 150, when light that has passed through the lens of the optical system is photoelectrically converted by the imaging element of the image sensor, an analog signal indicating a captured image is obtained.
The signal processing circuit 152 includes, for example, an AGC (automatic gain control) circuit and an ADC (analog-to-digital converter), and converts an analog signal sent from the lens/imaging element 150 into a digital signal (image data). In addition, the signal processing circuit 152 includes an amplifier, and amplifies the digital signal with a prescribed gain.
Further, the signal processing circuit 152 may perform processing related to setting an area with respect to an image, and transmit area designation information to the communication circuit 154. As will be described later, processing related to setting of a region with respect to an image in the image sensor 100 may be performed by the processor 156. In addition, as described previously, in the image processing system 1000, processing related to setting of a region with respect to an image can be performed by an external apparatus such as the image processing apparatus 200.
Further, the signal processing circuit 152 may transmit various data such as exposure information and gain information to the communication circuit 154. Sending various data, such as exposure information and gain information, to the communication circuit 154 in the image sensor 100 may be performed by the processor 156.
The communication circuit 154 is a circuit related to a data transmission function of the transmission method according to the present embodiment, and an example of the communication circuit 154 is an IC (integrated circuit) chip into which the circuit related to the transmission function is integrated. The communication circuit 154 processes the image data transmitted from the signal processing circuit 152, and transmits data corresponding to the generated image. The data corresponding to the image is image data (in other words, data indicating the entire image) or area information and area image data transmitted from the signal processing circuit 152.
The processor 156 controls the operation of each of the lens/imaging element 150, the signal processing circuit 152, and the communication circuit 154 based on, for example, a control signal transmitted from the image processing apparatus 200 via the control bus B2. Alternatively, when the image sensor 100 provided with the processor 156 and the other image sensor 100 are capable of directly communicating with each other, the processor 156 may perform processing based on a control signal transmitted from the other image sensor 100 via an arbitrary transmission path.
Examples of control of the lens/imaging element 150 by the processor 156 include control of imaging, such as control of exposure time. Examples of control of the signal processing circuit 152 by the processor 156 include control of signal processing, such as control of gain. Examples of the control of the communication circuit 154 by the processor 156 include communication control such as "switching control between transmission of area image data and transmission of data indicating an entire image", and various types of control when the area image data is transmitted (for example, transmission control of area information and transmission control of information related to imaging).
The image sensor 100 performs processing related to the above-described transmission method by, for example, the configuration shown in fig. 11. Needless to say, the configuration of the image sensor 100 is not limited to the example shown in fig. 11.
[4-2] Configuration of the image processing apparatus 200
Fig. 12 is a block diagram showing an example of the configuration of the image processing apparatus 200 according to the present embodiment. Fig. 12 shows an example of a configuration of the image processing apparatus 200 constituting the image processing system 1000 shown in fig. 1, or in other words, a configuration of communicating with each of the two image sensors 100 (i.e., the image sensors 100A and 100B).
For example, the image processing apparatus 200 includes a communication unit 202 and a processing unit 204. The image processing apparatus 200 operates using power supplied from an internal power supply (not shown) constituting the image processing system 1000, such as a battery, or power supplied from an external power supply of the image processing system 1000.
The communication unit 202 has a function of communicating with each of the plurality of image sensors 100. For example, the communication unit 202 is constituted by communication circuits 250A and 250B respectively corresponding to the image sensors 100 as communication targets. Hereinafter, one of the communication circuits 250A and 250B constituting the communication unit 202 will be referred to as "communication circuit 250".
In addition, the communication unit 202 may be capable of switching between the image sensors 100 as communication targets. Using the image processing system 1000 shown in fig. 1 as an example, switching of the image sensor 100 as a communication target in the communication unit 202 includes switching between "communication only with the image sensor 100A", "communication only with the image sensor 100B", and "communication with both the image sensor 100A and the image sensor 100B". Switching of the image sensor 100 as a communication target in the communication unit 202 is realized by, for example, the processor 252 controlling the operations of the communication circuits 250A and 250B. The processor 252 performs switching of the image sensor 100 as a communication target by threshold processing based on a detection value of a sensor capable of detecting luminance (such as an illuminance sensor), which may be a sensor external to the image processing apparatus 200 or a sensor included in the image processing apparatus 200. As an example, when the detection value is equal to or smaller than a set threshold value (or when the detection value is smaller than the threshold value), the processor 252 causes communication to be performed with both the image sensor 100A and the image sensor 100B. As another example, when the detection value is greater than the threshold value (or when the detection value is equal to or greater than the threshold value), the processor 252 causes communication to be performed with only one of the image sensor 100A and the image sensor 100B. Since the amount of data processed in the image processing apparatus 200 can be reduced by causing the processor 252 to switch the image sensor 100 as the communication target, a reduction in power consumption can be achieved.
In addition, when performing "communication only with the image sensor 100A" or "communication only with the image sensor 100B", the processor 252 may pause the operation of the image sensor 100 that does not perform communication. For example, the processor 252 performs switching of the image sensor 100 as a communication object and suspension of the operation of the image sensor 100 by threshold processing based on a detection value of a sensor capable of detecting luminance (such as an illuminance sensor). By causing the processor 252 to suspend operation of the image sensor 100, a reduction in power consumption can be achieved in the image processing system 1000.
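The threshold processing described above can be sketched as a small selection function: at or below the threshold, both sensors communicate; above it, only one does, and the idle sensor could be paused to save power. The function and sensor names are illustrative assumptions, not an API from the patent.

```python
# Illustrative sketch of switching the communication-target image sensors
# by threshold processing on a luminance detection value.

def select_active_sensors(luminance, threshold, sensors=("100A", "100B")):
    if luminance <= threshold:
        # Dark scene: communicate with both image sensors.
        return list(sensors)
    # Bright scene: communicate with only one sensor; the other
    # sensor's operation could be paused to reduce power consumption.
    return [sensors[0]]
```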
The processing unit 204 processes data received by the communication unit 202. For example, the processing unit 204 performs processing related to the image processing method according to the present embodiment, and processes area image data acquired from each of the plurality of image sensors 100 in association with each area based on the area information. Optionally, the processing unit 204 may also process data indicative of the entire image.
The processing unit 204 is constituted by a processor 252 and an image processing circuit 254. The operation of each of the communication circuits 250A and 250B and the image processing circuit 254 is controlled by the processor 252. In other words, the processing unit 204 may perform the role of the control unit in the image processing apparatus 200.
In addition, the processor 252 constituting the processing unit 204 performs a role of controlling the operation of each image sensor 100 constituting the image processing system 1000. The processor 252 controls the operation of each image sensor 100 by sending control signals to the image sensors 100 via the control bus B2.
It should be noted that the functional blocks of the image processing apparatus 200 shown in fig. 12 have been created by dividing the functions included in the image processing apparatus 200 for convenience, and are not limited to the example shown in fig. 12. For example, the processing unit 204 shown in fig. 12 may be divided into a control unit constituted by the processor 252 and an image processing unit constituted by the image processing circuit 254.
For example, the communication circuit 250A is a communication circuit that communicates with the image sensor 100A. The communication circuit 250A receives data (for example, packets shown in fig. 6 or fig. 10) that has been transmitted by the transmission method according to the present embodiment from the image sensor 100A. The communication circuit 250A may have a function of transmitting data to the image sensor 100A via, for example, an arbitrary transmission path between the communication circuit 250A and the image sensor 100A.
For example, the communication circuit 250B is a communication circuit that communicates with the image sensor 100B. The communication circuit 250B receives data (e.g., packets shown in fig. 6 or fig. 10) that has been transmitted by the transmission method according to the present embodiment from the image sensor 100B. The communication circuit 250B may have a function of transmitting data to the image sensor 100B via, for example, an arbitrary transmission path between the communication circuit 250B and the image sensor 100B.
The communication circuits 250A and 250B transmit, to the processor 252, data included in the embedded data of the received data (such as area information and information on imaging of the image sensor 100). Fig. 12 shows "an example of transmitting the area information from each of the communication circuits 250A and 250B to the processor 252". Alternatively, the communication circuits 250A and 250B may transmit the embedded data itself in the received data to the processor 252. When the embedded data is transmitted to the processor 252, the data included in the embedded data, such as area information and information regarding imaging of the image sensor 100, is retrieved from the embedded data by the processor 252. In addition, the communication circuits 250A and 250B transmit data other than the embedded data, included in the payload of the received data, to the image processing circuit 254.
The communication circuits 250A and 250B separate header data corresponding to the header portion and payload data corresponding to the payload portion from the received data. The communication circuits 250A and 250B separate header data from the received data according to a rule defined in advance by a standard or the like, for example. In addition, the communication circuits 250A and 250B may separate payload data from the received data according to a rule defined in advance by a standard or the like, for example, or separate payload data from the received data based on the content indicated by the header data. Further, the communication circuits 250A and 250B transmit data included in the embedded data (or embedded data) in the separated data to the processor 252, and transmit data other than the embedded data in the payload data to the image processing circuit 254.
Fig. 13 is a block diagram showing an example of the functional configuration of the communication circuit 250 included in the image processing apparatus 200 according to the present embodiment. For example, the communication circuit 250 includes a header separation unit 260, a header interpretation unit 262, and a payload separation unit 264.
The header separation unit 260 separates header data corresponding to the header portion and payload data corresponding to the payload portion from the received data. The header separation unit 260 separates header data from the received data according to a rule predefined by a standard or the like, for example. In addition, the header separation unit 260 may separate payload data from the received data according to a rule defined in advance by a standard or the like, for example, or separate payload data from the received data based on the result of the processing by the header interpretation unit 262.
The header interpreting unit 262 interprets the content indicated by the header data.
As an example, the header interpretation unit 262 interprets whether or not the payload data is "embedded data". The header interpretation unit 262 interprets whether or not the payload data is "embedded data" based on, for example, the DT value recorded in the header portion. As another example, header interpretation unit 262 may identify a location of the payload data and send the identified location to header separation unit 260.
The payload separating unit 264 processes the payload data based on the interpretation result of the header interpreting unit 262.
As an example, when the header interpretation unit 262 interprets that the payload data is "embedded data", the payload separation unit 264 separates data included in the embedded data, such as region information and information on imaging of the image sensor 100, from the payload data. In addition, the payload separation unit 264 transmits the data included in the embedded data, such as area information and information on imaging of the image sensor 100, to the processing unit 204 (more specifically, the processor 252 constituting the processing unit 204). Fig. 13 shows an "example of sending the region information from the payload separating unit 264 to the processor 252".
As another example, when the header interpretation unit 262 interprets that the payload data is not "embedded data", the payload separation unit 264 separates image data (data indicating an entire image or area image data) from the payload data. The payload separating unit 264 separates the region image data from the payload data based on, for example, the region information retrieved from the embedded data. In addition, the payload separation unit 264 transmits the image data to the processing unit 204 (more specifically, the image processing circuit 254 constituting the processing unit 204).
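The routing performed by the header interpretation unit 262 and the payload separation unit 264 can be sketched as follows. This is a minimal illustration only: the fixed 4-byte header length and the DT value used here are hypothetical placeholders, not the actual MIPI encoding.

```python
DT_EMBEDDED = 0x12  # hypothetical data-type (DT) code marking embedded data


def separate_packet(packet: bytes):
    """Header separation unit (260): split header data from payload data.

    A fixed 4-byte header is assumed here purely for illustration.
    """
    return packet[:4], packet[4:]


def dispatch(packet: bytes):
    """Route payload data the way the payload separation unit (264) does."""
    header, payload = separate_packet(packet)
    dt = header[0]  # header interpretation unit (262): read the DT value
    if dt == DT_EMBEDDED:
        # embedded data (area information, imaging information) goes to
        # the processor 252
        return ("processor", payload)
    # everything else (area image data or whole-image data) goes to the
    # image processing circuit 254
    return ("image_circuit", payload)
```

In this sketch, the caller plays the role of the communication circuit 250 that forwards each payload to its destination.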
Since the communication circuit 250 has a functional configuration such as that shown in fig. 13, the communication circuit 250 receives data that has been transmitted from the image sensor 100 by the transmission method according to the present embodiment, and transmits the received data to the processing unit 204. It should be noted that the functional blocks of the communication circuit 250 shown in fig. 13 have been created by dividing the functions included in the communication circuit 250 for convenience, and are not limited to the example shown in fig. 13. Additionally, as described above, the communication circuit 250 may be configured to send embedded data in the received data to the processor 252.
Referring again to fig. 12, an example of the configuration of the image processing apparatus 200 will be described. The processor 252 controls the operation of each of the communication circuits 250A and 250B and the image processing circuit 254. In addition, the processor 252 may perform various types of processing, such as processing for executing arbitrary application software.
Examples of the control of the communication circuits 250A and 250B by the processor 252 include on/off control of a communication function. For example, by controlling on/off of the communication function of each of the communication circuits 250A and 250B as described above, switching between the image sensors 100 as communication targets is realized.
The control of the image processing circuit 254 by the processor 252 includes control of processing performed by the image processing circuit 254 in relation to the image processing method according to the present embodiment. The processor 252 performs control of the image processing circuit 254 using, for example, data included in the embedded data, such as area information transmitted from the communication circuits 250A and 250B and information on imaging of the image sensor 100. In addition, when embedded data is transmitted from the communication circuits 250A and 250B, the processor 252 performs control of the image processing circuit 254 by, for example, retrieving region information and the like from the embedded data.
As an example, the processor 252 sends correction control information indicating a correction value for aligning the relative position of the images indicated by the area image data to the image processing circuit 254. The correction value for aligning the relative position of the images indicated by the area image data is set based on, for example, area information included in the embedded data transmitted from each of the communication circuits 250A and 250B and information on the angle of view acquired from each image sensor 100. Alternatively, the correction value for aligning the relative position of the images indicated by the area image data may be set based on, for example, the area information included in the embedded data transmitted from each of the communication circuits 250A and 250B and the result of performing arbitrary object detection processing with respect to each image indicated by the area image data.
As another example, the processor 252 transmits correction control information indicating a correction gain for correcting the sensitivity ratio of each image sensor 100 that has transmitted the area image data to the image processing circuit 254.
The correction gain is set by calculating a correction gain "G12" satisfying the following mathematical expression 1 based on, for example, information about the angle of view acquired from each image sensor 100. It should be noted that calculating the correction gain based on the following mathematical expression 1 is a calculation example in the case where the image processing apparatus 200 controls the image sensor 100 so that the respective exposure times are the same. In other words, the calculation method of the correction gain according to the present embodiment is not limited to use of the following mathematical expression 1.
G2·G12 = A1·G1/A2 … (mathematical expression 1)
In this case, "G1" in the above mathematical expression 1 represents a gain in the image sensor device included in the image sensor 100A, and "G2" represents a gain in the image sensor device included in the image sensor 100B. In addition, "A1" represents the photoelectric conversion rate in the image sensor device included in the image sensor 100A, and "A2" represents the photoelectric conversion rate in the image sensor device included in the image sensor 100B. In other words, the correction gain G12 for correcting the signal level of the image indicated by the area image data acquired from the image sensor 100B is calculated using the above mathematical expression 1.
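Solving mathematical expression 1 for the correction gain gives G12 = (A1·G1)/(A2·G2), which can be read off numerically as below. The sketch holds only under the stated assumption that the exposure times of the sensors are controlled to be the same.

```python
def correction_gain(g1: float, g2: float, a1: float, a2: float) -> float:
    """Correction gain G12 for sensor 100B, from mathematical expression 1.

    Rearranging G2 * G12 = A1 * G1 / A2 gives G12 = (A1 * G1) / (A2 * G2).
    Valid only under the assumption of equal exposure times.
    """
    return (a1 * g1) / (a2 * g2)
```

For example, with equal photoelectric conversion rates and the master sensor running at twice the slave's gain, the slave's area image data is boosted by a factor of two.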
The image processing circuit 254 processes data transmitted from each of the communication circuits 250A and 250B. For example, the image processing circuit 254 performs, within the processing unit 204, the processing related to the image processing method according to the present embodiment: the area image data acquired from each of the image sensors 100A and 100B is processed in association with each area based on the area information.
For example, the image processing circuit 254 matches the signal levels of the images indicated by the area image data using the correction control information transmitted from the processor 252. After matching the signal levels, the image processing circuit 254 aligns the relative positions of the images indicated by the area image data using the correction control information sent from the processor 252. In addition, the image processing circuit 254 combines the images indicated by the area image data for each area. It should be noted that the image processing circuit 254 is capable of combining the images indicated by the area image data without matching their signal levels, and also without aligning their relative positions.
Optionally, the image processing circuit 254 may also process data indicative of the entire image acquired from each of the image sensors 100A and 100B.
In addition, the processing in the image processing circuit 254 is not limited to the above example. In addition, the image processing circuit 254 may perform one or both of processing related to controlling recording of image data to a recording medium such as the memory 300 and processing related to controlling display of an image on the display screen of the display device 400.
Fig. 14 is a block diagram showing an example of the functional configuration of the image processing circuit 254 included in the image processing apparatus 200 according to the present embodiment. For example, the image processing circuit 254 has first image processing units 270A and 270B, a relative sensitivity difference correction processing unit 272, a relative position correction processing unit 274, a combination processing unit 276, and a second image processing unit 278. Part or all of the processing in each unit may be performed by hardware, or may be performed by executing software (computer program) on hardware.
Hereinafter, an example of the functional configuration of the image processing circuit 254 will be described using a case where the image processing circuit 254 processes the area image data as an example.
The first image processing unit 270A performs prescribed image processing with respect to data transmitted from the communication circuit 250A. The first image processing unit 270B performs prescribed image processing with respect to data transmitted from the communication circuit 250B. Examples of the prescribed image processing performed by each of the first image processing units 270A and 270B include various types of processing related to RAW development and the like.
The relative sensitivity difference correction processing unit 272 matches the signal level of the image indicated by the area image data transmitted from the first image processing unit 270B with the signal level of the image indicated by the area image data that has been processed by the first image processing unit 270A. The relative sensitivity difference correction processing unit 272 corrects the gain of the area image data transmitted from the first image processing unit 270B using, for example, a correction gain indicated by correction control information transmitted from the processor 252.
Although fig. 14 shows an example of correcting the gain of the area image data transmitted from the first image processing unit 270B, the image processing circuit 254 may have a functional configuration for correcting the gain of the area image data transmitted from the first image processing unit 270A.
The relative position correction processing unit 274 aligns the relative position of the image indicated by the area image data transmitted from the relative sensitivity difference correction processing unit 272 with the image indicated by the area image data that has been processed by the first image processing unit 270A. The relative position correction processing unit 274 corrects the relative position of the image indicated by the area image data transmitted from the relative sensitivity difference correction processing unit 272 using, for example, the correction value for aligning the relative position indicated by the correction control information transmitted from the processor 252.
The combination processing unit 276 combines the image indicated by the region image data processed by the first image processing unit 270A and the image indicated by the region image data transmitted from the relative position correction processing unit 274 for each region. The combination processing unit 276 combines the images indicated by the region image data by any processing capable of combining the images, such as alpha blending.
The second image processing unit 278 performs prescribed image processing on the combined image sent from the combination processing unit 276. Examples of the prescribed image processing performed by the second image processing unit 278 include any processing that can be performed with respect to image data, such as γ processing.
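The chain of units 272, 274, and 276 described above can be illustrated with a toy one-dimensional pipeline. The gain, shift, and alpha values are illustrative, and the 1-D rotation stands in for a real two-dimensional relative-position correction.

```python
def correct_sensitivity(pixels, gain):
    """Relative sensitivity difference correction (unit 272)."""
    return [p * gain for p in pixels]


def correct_position(pixels, shift):
    """Relative position correction (unit 274), reduced to a 1-D rotation."""
    shift %= len(pixels)
    return pixels[shift:] + pixels[:shift]


def alpha_blend(a, b, alpha=0.5):
    """Combination processing (unit 276) via alpha blending."""
    return [alpha * x + (1 - alpha) * y for x, y in zip(a, b)]


def process_region(region_a, region_b, gain, shift, alpha=0.5):
    """Combine one region's images from sensors 100A and 100B."""
    b = correct_sensitivity(region_b, gain)  # match signal levels first
    b = correct_position(b, shift)           # then align relative position
    return alpha_blend(region_a, b, alpha)   # finally combine per region
```

As in Fig. 14, only the data from one branch (here, `region_b`) passes through the correction units before combination.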
Since the image processing circuit 254 has a functional configuration such as that shown in fig. 14, the image processing circuit 254 performs processing related to the image processing method according to the present embodiment. It should be noted that the functional blocks of the image processing circuit 254 shown in fig. 14 have been created by dividing the functions included in the image processing circuit 254 for convenience, and are not limited to the example shown in fig. 14.
The image processing apparatus 200 performs processing related to the above-described image processing method by, for example, the configuration shown in fig. 12 to 14. Needless to say, the configuration of the image processing apparatus 200 is not limited to the examples shown in fig. 12 to 14.
[5] Example of processing in the image processing system according to the present embodiment
Next, an example of processing in the image processing system 1000 will be described.
[5-1] initialization-related processing
Fig. 15 is an explanatory diagram for explaining an example of processing in the image processing system 1000 according to the present embodiment, and shows processing relating to initialization. The processing shown in fig. 15 is executed, for example, when the image processing system 1000 is activated or when a user of the image processing system 1000 or the like performs a prescribed operation.
The image processing apparatus 200 transmits, for example, via the control bus B2, a setting request for causing each of the image sensors 100A and 100B to set driving parameters, and transmits an acquisition request for causing each of the image sensors 100A and 100B to transmit information about an angle of view (S100). For example, the setting request of the driving parameters includes various setting values such as an exposure value, an exposure time, and a gain, and a setting command. For example, the acquisition request includes a transmission command of information related to the angle of view.
Each of the image sensors 100A and 100B, which has received the setting request and the acquisition request transmitted in step S100, sets the driving parameters based on the setting request, and transmits the information related to the angle of view based on the acquisition request (S102, S104).
The image processing apparatus 200, which has transmitted the setting request in step S100, calculates a correction gain based on various setting values included in the setting request, and configures settings for performing correction according to the calculated correction gain (S106).
The image processing apparatus 200 having received the information on the angle of view transmitted in steps S102 and S104 obtains a correction value for aligning the relative position based on the information on the angle of view (S108), and configures a setting for performing correction according to the correction value (S110).
In the image processing system 1000, for example, the processing shown in fig. 15 is executed as the processing related to initialization. Needless to say, the example of the processing related to initialization is not limited to the example shown in fig. 15.
[5-2] Processing during operation
Fig. 16 is an explanatory diagram for explaining an example of processing in the image processing system 1000 according to the present embodiment, and shows an example of processing during operation. Fig. 16 shows an example of processing of the image sensor 100B performed with reference to imaging of the image sensor 100A. In other words, the image processing system 1000 can perform cooperative imaging by having one image sensor 100 function as a master image sensor and having another image sensor 100 function as a slave image sensor.
The image sensor 100A starts imaging upon acquiring a frame start trigger (hereinafter, sometimes referred to as "V start trigger") (S200).
The image sensor 100A sets a clipping position to be clipped from the captured image (S202), and transmits information indicating the set clipping position to the image sensor 100B and the image processing apparatus 200 (S204). The setting of the cut-out position in the image sensor 100A corresponds to setting the area with respect to the captured image. In other words, for example, information indicating a cut position corresponds to the area information.
The image sensor 100A transmits information indicating the cut-out position to the image processing apparatus 200 via the data bus B1, for example. In addition, the image sensor 100A transmits the information indicating the cut-out position (the area information; the same applies hereinafter) to the image sensor 100B via the image processing apparatus 200, for example. In a configuration in which the image sensor 100A and the image sensor 100B are capable of communicating with each other by inter-processor communication or the like, the image sensor 100A may transmit the information indicating the cut-out position to the image sensor 100B by direct communication.
The image sensor 100B, which has received the information indicating the clipping position transmitted from the image sensor 100A in step S204, sets the clipping position to be clipped from the captured image based on the information indicating the clipping position (S206).
The image processing apparatus 200 having received the information indicating the clipping position transmitted from the image sensor 100A in step S204 identifies the number of pixels included in the setting area and the two-dimensional plane coordinates of the pixels based on the information indicating the clipping position (S208), and sets the coordinates to be used in the processing and the size of the area (S210).
The image sensor 100A, which has transmitted the information indicating the cut-out position in step S204, transmits information on imaging in the image sensor 100A to the image sensor 100B and the image processing apparatus 200 (S212). As described previously, the information on imaging includes exposure information and gain information. The image sensor 100A transmits information on imaging to the image sensor 100B and the image processing apparatus 200, for example, in a similar manner to the transmission of the information indicating the cut-out position in step S204.
The image sensor 100B, which has received the information on imaging transmitted from the image sensor 100A in step S212, performs gain control and exposure control based on the received information on imaging (S214). In addition, the image sensor 100B transmits information on imaging in the image sensor 100B to the image processing apparatus 200.
The image processing apparatus 200, which has received the information on imaging transmitted from the image sensor 100A and the information on imaging transmitted from the image sensor 100B in step S212, calculates, for example, a correction gain, and configures settings for performing correction according to the calculated correction gain (S216). In addition, the image processing apparatus 200 starts processing with respect to the image data transmitted from each of the image sensors 100A and 100B (S218).
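The master/slave exchange of steps S202 to S214 can be condensed into a sketch like the following, with dictionaries standing in for the sensors' internal state. All field names (`cut_position`, `gain`, `exposure`) are illustrative.

```python
def run_frame(master, slave):
    """One frame of the Fig. 16 master/slave exchange (dict-based stand-in)."""
    # S202/S204: the master sets a cut-out position and shares it; this
    # corresponds to setting the area with respect to the captured image
    slave["cut_position"] = master["cut_position"]      # S206
    # S212/S214: the master's information on imaging drives the slave's
    # gain control and exposure control
    slave["gain"] = master["gain"]
    slave["exposure"] = master["exposure"]
    return slave
```

In the real system these values travel via the image processing apparatus 200 (or by direct inter-sensor communication), which also uses them in step S216 to configure its own correction settings.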
Fig. 17 is an explanatory diagram for explaining an example of processing in the image processing system 1000 according to the present embodiment, and shows a timing chart corresponding to the processing shown in fig. 16. In other words, fig. 17 shows an example in which "the image sensor 100A functions as a master image sensor, and the image sensor 100B functions as a slave image sensor" in a manner similar to fig. 16.
As shown in fig. 17, in the image processing system 1000, the image sensor 100A and the image sensor 100B cooperate with each other to perform imaging according to the setting notification in the image sensor 100A. In addition, in the image processing system 1000, the image processing apparatus 200 processes the area image data acquired from each of the image sensors 100A and 100B in association with each area based on the setting notification in the image sensor 100A. Accordingly, the cooperative operation of the image sensor 100A, the image sensor 100B, and the image processing apparatus 200 is realized in the image processing system 1000.
In the image processing system 1000, for example, the processes shown in fig. 16 and 17 are executed as the processes during operation. Needless to say, the example of processing during operation is not limited to the examples shown in fig. 16 and 17.
[6] Example of advantageous effects produced by using the image processing system according to the present embodiment
For example, using the image processing system according to the present embodiment produces the advantageous effects described below. Needless to say, the advantageous effects produced by using the image processing system according to the present embodiment are not limited to the examples described below.
In the image processing system according to the present embodiment, imaging of a region set with respect to a captured image can be performed by cooperation among the plurality of image sensors 100.
In the image processing system according to the present embodiment, since the plurality of image sensors and the image processing apparatus operate in cooperation with each other, for example, the apparatus can operate while sharing various information such as information on exposure time, driving frequency, gain value, relative view angle difference between image sensor devices, and object distance.
Since the image processing apparatus can combine the images indicated by the area image data by matching the signal levels of the images, the image processing system according to the present embodiment can enhance the sensitivity of the images to be processed in association.
The image processing apparatus can switch between the image sensors as communication targets, and suspend the operation of some of the image sensors in conjunction with such switching. Therefore, the image processing system according to the present embodiment can achieve a reduction in power consumption.
(procedure according to the present embodiment)
By causing a processor or an image processing circuit in a computer to execute a program that causes the computer to function as the image processing apparatus according to the present embodiment (for example, a program that causes the computer to execute processing related to the image processing method according to the present embodiment), images respectively obtained from a plurality of image sensors can be processed in association.
In addition, by causing a processor or an image processing circuit in a computer to execute a program that causes the computer to function as the image processing apparatus according to the present embodiment, advantageous effects produced by using the image processing method according to the present embodiment can be produced.
Although the preferred embodiments of the present disclosure have been described in detail with reference to the accompanying drawings, the technical scope of the present disclosure is not limited thereto. It will be apparent to those skilled in the art to which the present disclosure pertains that various modifications and variations can be implemented without departing from the scope of the technical idea set forth in the appended claims, and therefore, it is to be understood that such modifications and variations are naturally encompassed within the technical scope of the present disclosure.
For example, although a program (computer program) for causing a computer to function as the image processing apparatus according to the present embodiment is provided in the description given above, the present embodiment may also provide a recording medium storing the above-described program.
The above-described configuration represents an example of the present embodiment, and naturally falls within the technical scope of the present disclosure.
Furthermore, the benefits described in this specification are merely illustrative or exemplary and not restrictive. In other words, in addition to or in place of the above-described benefits, other benefits may be generated by techniques in accordance with the present disclosure that will be apparent to those skilled in the art from the description of the present specification.
The following configuration is also included in the technical scope of the present disclosure.
(1)
An image processing apparatus comprising:
a communication unit capable of communicating with each of a plurality of image sensors configured to transmit additional data and area image data in different packets, respectively, the additional data including, for each area, area information corresponding to an area set with respect to a captured image, the area image data indicating an image of each line corresponding to the area; and
a processing unit configured to process area image data acquired from each of the plurality of image sensors in association with each area based on area information included in the additional data acquired from each of the plurality of image sensors, wherein,
the area information includes a part or all of identification information of the area, information indicating a position of the area, and information indicating a size of the area.
(2)
The image processing apparatus according to (1), wherein the processing unit is configured to combine, for each region, an image indicated by region image data acquired from each of the plurality of image sensors.
(3)
The image processing apparatus according to (2), wherein the processing unit is configured to combine images indicated by the area image data of the objects to be combined by aligning relative positions of the images.
(4)
The image processing apparatus according to (2) or (3), wherein,
the additional data comprises information about the imaging in the image sensor, an
The processing unit is configured to combine images indicated by the area image data of the objects to be combined by matching signal levels of the images based on information on imaging that has been acquired from each of the plurality of image sensors.
(5)
The image processing apparatus according to any one of (1) to (4), wherein the communication unit is configured to be switchable between the image sensors as communication subjects.
(6)
The image processing apparatus according to any one of (1) to (5), wherein the packet is a long packet of MIPI (Mobile Industry Processor Interface Alliance).
(7)
An image processing system comprising:
a plurality of image sensors configured to transmit additional data including, for each area, area information corresponding to an area set with respect to a captured image and area image data indicating an image of each line corresponding to the area, in different packets, respectively; and
an image processing apparatus, wherein,
the image processing apparatus includes:
a communication unit capable of communicating with each of the plurality of image sensors; and
a processing unit configured to process area image data acquired from each of the plurality of image sensors in association with each area based on area information included in the additional data acquired from each of the plurality of image sensors, and
the area information includes a part or all of identification information of the area, information indicating a position of the area, and information indicating a size of the area.
List of reference marks
100. 100A, 100B image sensor
102 photoelectric conversion unit
104 signal processing unit
106. 202 communication unit
150 lens/imaging element
152 signal processing circuit
154. 250, 250A, 250B communication circuit
156. 252 processor
200 image processing apparatus
204 processing unit
254 image processing circuit
260 header separation unit
262 header interpretation unit
264 payload separation unit
270A, 270B first image processing unit
272 relative sensitivity difference correction processing unit
274 relative position correction processing unit
276 combined processing unit
278 second image processing unit
300 memory
400 display device
1000 image processing system
B1 data bus
B2 control bus

Claims (7)

1. An image processing apparatus comprising:
a communication unit capable of communicating with each of a plurality of image sensors configured to transmit additional data and area image data in different packets, respectively, the additional data including, for each area, area information corresponding to an area set with respect to a captured image, the area image data indicating an image of each line corresponding to the area; and
a processing unit configured to process the area image data acquired from each of the plurality of image sensors in association with each area based on the area information included in the additional data acquired from each of the plurality of image sensors, wherein,
the area information includes a part or all of identification information of the area, information indicating a position of the area, and information indicating a size of the area.
2. The image processing apparatus according to claim 1, wherein the processing unit is configured to combine, for each area, the images indicated by the area image data acquired from each of the plurality of image sensors.
3. The image processing apparatus according to claim 2, wherein the processing unit is configured to combine the images indicated by the area image data to be combined by aligning the relative positions of the images.
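A minimal sketch of the alignment in the claim above: two region images of the same area are combined after shifting one by the relative offset between the sensors. The known-offset assumption and the plain averaging are illustrative choices, not the claimed correction method:

```python
def align_and_average(img_a, img_b, dx, dy):
    """Average two same-size region images after shifting img_b by (dx, dy).

    img_a, img_b: 2-D lists of pixel values from two sensors.
    dx, dy: relative offset (assumed known, e.g. from calibration), meaning
    row r, col c of img_b corresponds to row r+dy, col c+dx of img_a.
    Returns the averaged overlap region.
    """
    h = len(img_a) - abs(dy)
    w = len(img_a[0]) - abs(dx)
    out = []
    for r in range(h):
        row = []
        for c in range(w):
            a = img_a[r + max(dy, 0)][c + max(dx, 0)]
            b = img_b[r + max(-dy, 0)][c + max(-dx, 0)]
            row.append((a + b) / 2)
        out.append(row)
    return out
```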
4. The image processing apparatus according to claim 2,
the additional data includes information on imaging in the image sensors, and the processing unit is configured to combine the images indicated by the area image data to be combined by matching the signal levels of the images based on the information on the imaging acquired from each of the plurality of image sensors.
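One way the signal-level matching in the claim above could be sketched, assuming the information on imaging consists of exposure time and analog gain and that the sensor response is linear (both are assumptions for illustration, not the claimed method):

```python
def match_signal_levels(img, exposure, gain, ref_exposure, ref_gain):
    """Scale pixel values so an image captured with a different exposure
    and gain matches a reference sensor's signal level.

    A linear response model is assumed: pixel value is proportional to
    exposure * gain, so the correction is a single multiplicative factor.
    """
    scale = (ref_exposure * ref_gain) / (exposure * gain)
    return [[p * scale for p in row] for row in img]
```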
5. The image processing apparatus according to claim 1, wherein the communication unit is configured to switch among the plurality of image sensors as communication targets.
6. The image processing apparatus according to claim 1, wherein each packet is a MIPI (Mobile Industry Processor Interface) long packet.
7. An image processing system comprising:
a plurality of image sensors configured to transmit additional data and area image data in different packets, respectively, the additional data including, for each area, area information corresponding to an area set with respect to a captured image, the area image data indicating an image of each line corresponding to the area; and
an image processing apparatus, wherein,
the image processing apparatus includes:
a communication unit capable of communicating with each of the plurality of image sensors; and
a processing unit configured to process the area image data acquired from each of the plurality of image sensors in association with each area based on the area information included in the additional data acquired from each of the plurality of image sensors, and
the area information includes a part or all of identification information of the area, information indicating a position of the area, and information indicating a size of the area.
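The per-area association performed by the processing unit in the claims above can be sketched as a grouping of region payloads by area ID across sensors. The dict-based representation is an assumption made for illustration:

```python
def group_by_area(sensor_payloads):
    """Associate area image data from multiple sensors per area.

    sensor_payloads: list with one entry per image sensor; each entry is a
    dict mapping area_id -> that sensor's region image data (opaque here).
    Returns a dict mapping area_id -> list of region data, one element per
    sensor that transmitted that area, ready for combining.
    """
    grouped = {}
    for payload in sensor_payloads:
        for area_id, data in payload.items():
            grouped.setdefault(area_id, []).append(data)
    return grouped
```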
CN201980053880.XA 2018-08-20 2019-08-09 Image processing apparatus and image processing system Active CN112567727B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018153930 2018-08-20
JP2018-153930 2018-08-20
PCT/JP2019/031780 WO2020039992A1 (en) 2018-08-20 2019-08-09 Image processing device, and image processing system

Publications (2)

Publication Number Publication Date
CN112567727A true CN112567727A (en) 2021-03-26
CN112567727B CN112567727B (en) 2023-04-07

Family

ID=69593151

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980053880.XA Active CN112567727B (en) 2018-08-20 2019-08-09 Image processing apparatus and image processing system

Country Status (6)

Country Link
US (2) US11647284B2 (en)
EP (1) EP3843376A4 (en)
JP (1) JP7357620B2 (en)
KR (1) KR102709488B1 (en)
CN (1) CN112567727B (en)
WO (1) WO2020039992A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3993389A4 (en) * 2019-06-28 2022-08-17 Sony Semiconductor Solutions Corporation Transmission device, reception device, and transport system
JP2022131079A (en) * 2021-02-26 2022-09-07 ソニーセミコンダクタソリューションズ株式会社 Image processing device, image processing method, and image processing system
WO2023153743A1 (en) * 2022-02-09 2023-08-17 삼성전자주식회사 Electronic device and operating method therefor
KR20230125990A (en) 2022-02-22 2023-08-29 고려대학교 산학협력단 Pixel converter for converting the mipi dsi packet into arbitrary pixel sizes and method thereof

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101616260A (en) * 2008-06-27 2009-12-30 索尼株式会社 Signal processing apparatus, signal processing method, program and recording medium
CN102236890A (en) * 2010-05-03 2011-11-09 微软公司 Generating a combined image from multiple images
CN102273212A (en) * 2009-11-17 2011-12-07 索尼公司 Image reception device
CN102804791A (en) * 2010-01-22 2012-11-28 索尼公司 Reception device, transmission device, communication system, method for controlling reception device, and program
CN102892008A (en) * 2011-07-20 2013-01-23 美国博通公司 Dual image capture processing
CN104284064A (en) * 2013-07-05 2015-01-14 三星电子株式会社 Method and apparatus for previewing a dual-shot image
US20150172539A1 (en) * 2013-12-17 2015-06-18 Amazon Technologies, Inc. Distributing processing for imaging processing
US20150353011A1 (en) * 2014-06-10 2015-12-10 Lg Electronics Inc. Apparatus for providing around view and vehicle including the same
JP2016054479A (en) * 2014-09-02 2016-04-14 キヤノン株式会社 Imaging device, control method and program thereof, and imaging element
CN107925738A (en) * 2015-08-12 2018-04-17 三星电子株式会社 For providing method, electronic equipment and the storage medium of image
CN107950017A (en) * 2016-06-15 2018-04-20 索尼公司 Image processing equipment, image processing method and picture pick-up device
JP2018129587A (en) * 2017-02-06 2018-08-16 セコム株式会社 Data distribution system and data distribution method

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3974964B2 (en) * 1996-11-08 2007-09-12 オリンパス株式会社 Image processing device
US6249524B1 (en) * 1997-03-19 2001-06-19 Hitachi, Ltd. Cell buffer memory for a large capacity and high throughput ATM switch
US6249616B1 (en) * 1997-05-30 2001-06-19 Enroute, Inc Combining digital images based on three-dimensional relationships between source image data sets
JP2003219271A (en) 2002-01-24 2003-07-31 Nippon Hoso Kyokai <Nhk> System for synthesizing multipoint virtual studio
JP2006115006A (en) 2004-10-12 2006-04-27 Nippon Telegr & Teleph Corp <Ntt> Individual video image photographing and distributing apparatus, and individual video image photographing and distributing method and program
JP2007110499A (en) 2005-10-14 2007-04-26 Fujifilm Corp Compound eye photographing apparatus
US8339475B2 (en) * 2008-12-19 2012-12-25 Qualcomm Incorporated High dynamic range image combining
US8896668B2 (en) * 2010-04-05 2014-11-25 Qualcomm Incorporated Combining data from multiple image sensors
JP5367640B2 (en) * 2010-05-31 2013-12-11 パナソニック株式会社 Imaging apparatus and imaging method
US9237252B2 (en) * 2010-10-01 2016-01-12 Contex A/S Signal intensity matching of image sensors
JP5932376B2 (en) 2012-02-08 2016-06-08 富士機械製造株式会社 Image transfer method and image transfer apparatus
EP2629506A1 (en) * 2012-02-15 2013-08-21 Harman Becker Automotive Systems GmbH Two-step brightness adjustment in around-view systems
CN104509097B (en) 2012-05-30 2019-06-14 株式会社日立制作所 Surveillance camera control device and image monitoring system
US9906749B2 (en) 2014-09-02 2018-02-27 Canon Kabushiki Kaisha Image capturing apparatus capable of generating and adding pixel region information to image data, method of controlling the same, and image sensor
CN108353195A (en) 2015-11-17 2018-07-31 索尼公司 Sending device, sending method, receiving device, method of reseptance and transmitting/receiving system
JP6722044B2 (en) 2016-05-27 2020-07-15 ソニーセミコンダクタソリューションズ株式会社 Processing device, image sensor, and system
US10009551B1 (en) * 2017-03-29 2018-06-26 Amazon Technologies, Inc. Image processing for merging images of a scene captured with differing camera parameters
US11039092B2 (en) * 2017-11-15 2021-06-15 Nvidia Corporation Sparse scanout for image sensors

Also Published As

Publication number Publication date
US11647284B2 (en) 2023-05-09
CN112567727B (en) 2023-04-07
US20210281749A1 (en) 2021-09-09
JPWO2020039992A1 (en) 2021-08-10
US20230388628A1 (en) 2023-11-30
EP3843376A4 (en) 2021-09-15
WO2020039992A1 (en) 2020-02-27
KR102709488B1 (en) 2024-09-26
KR20210046654A (en) 2021-04-28
EP3843376A1 (en) 2021-06-30
JP7357620B2 (en) 2023-10-06
US12058438B2 (en) 2024-08-06

Similar Documents

Publication Publication Date Title
CN112567727B (en) Image processing apparatus and image processing system
US11074023B2 (en) Transmission device
US20170094240A1 (en) Image processing device, imaging device, image processing method, and program
JP7277373B2 (en) transmitter
KR20200016229A (en) Video transmitter and video receiver
US11871008B2 (en) Transmitting apparatus, receiving apparatus, and transmission system
US20140055566A1 (en) Gesture recognition system and method
TW201838396A (en) Video transmission apparatus and video reception apparatus
US20230362307A1 (en) Transmitting apparatus, receiving apparatus, and transmission system
CN113170044B (en) Receiving apparatus and transmitting apparatus
JP7450704B2 (en) Transmitting device, receiving device and transmission system
JP7152475B2 (en) Transmitting device, receiving device, and communication system
CN113170029B (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant