WO2020027646A2 - Apparatus and method for imaging - Google Patents

Apparatus and method for imaging

Info

Publication number
WO2020027646A2
Authority
WO
WIPO (PCT)
Prior art keywords
images
lighting
processor
pixel
pixel value
Prior art date
Application number
PCT/MY2019/000028
Other languages
French (fr)
Other versions
WO2020027646A3 (en)
Inventor
Khurram Hamid KHOKHAR
Original Assignee
Kulim Technology Ideas Sdn Bhd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kulim Technology Ideas Sdn Bhd filed Critical Kulim Technology Ideas Sdn Bhd
Publication of WO2020027646A2 publication Critical patent/WO2020027646A2/en
Publication of WO2020027646A3 publication Critical patent/WO2020027646A3/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Definitions

  • Various embodiments relate to an apparatus for imaging and a method for imaging.
  • Photographing products normally requires a professional photographer at least a few hours to adjust the lights to strike a balance between the lighting of the product and the background colour.
  • If the product is reflective or multicoloured, and/or some parts of the product have a colour matching or similar to that of the background, the photographer has to make compromises in the lighting and the quality of the product shots.
  • an apparatus for imaging may include a lighting arrangement configured to provide lighting, and a processor, wherein the processor is configured to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, and to control the lighting arrangement to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor is further configured, for a respective pixel of pixels belonging to the object, to determine a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • an apparatus for imaging may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor to: control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images, and determine, for a respective pixel of pixels belonging to the object, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • an apparatus for imaging may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor, for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, to determine, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • a method for imaging may include generating a plurality of images, each of the plurality of images depicting an object of interest, varying at least one parameter of lighting during generation of the plurality of images, and determining, for a respective pixel of pixels belonging to the object, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • a method for imaging may include, for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, determining, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • a computer program or a computer program product may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
  • FIG. 1 A shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
  • FIG. 1B shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
  • FIG. 1C shows a flow chart illustrating a method for imaging, according to various embodiments.
  • FIG. 1D shows a method for imaging, according to various embodiments.
  • FIGS. 2A to 2E show schematic perspective views of a pixel machine of various embodiments from different angles.
  • FIG. 3 shows a schematic back view of a main controller, according to various embodiments.
  • FIG. 4 shows a flow chart illustrating a method for imaging, according to various embodiments.
  • The term "A and/or B" may include A, or B, or both A and B.
  • Various embodiments may provide an imaging apparatus, for example, a product photography apparatus, e.g., automatic photography equipment for products.
  • The apparatus may minimise the effort and labour associated with photography of products, for example, for the fast-expanding online shopping business.
  • Various embodiments may also provide the corresponding methods for imaging.
  • One or more of the following may be achieved: (1) detection of pixels belonging to object(s) for automatic background elimination (e.g., background cut); (2) detection of saturated pixel(s) to automatically eliminate reflections; (3) detection of, and elimination or maintenance of product shadows; (4) automatic centering of the object(s); (5) elimination of background and shadows of rotating object(s); (6) providing uniform exposure and colour for all the pixels.
  • FIG. 1A shows a schematic block diagram of an apparatus 100 for imaging, according to various embodiments.
  • the apparatus 100 includes a lighting arrangement 102 configured to provide lighting, and a processor 104, wherein the processor 104 is configured to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, and to control the lighting arrangement 102 to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor 104 is further configured, for a respective pixel of pixels belonging to the object, to determine a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • an apparatus 100 for imaging may be provided, having a lighting arrangement 102 and a processor 104.
  • the apparatus (or arrangement) 100 may be employed to image or take images of an object of interest that is the subject to be imaged.
  • the object may be positioned against a background.
  • background may include a reference to the background relative to the object (i.e., in the presence of the object in the foreground).
  • the processor 104 may communicate with the lighting arrangement 102, for example, via a channel represented by the line 106.
  • the processor 104 and the lighting arrangement 102 may be electrically coupled to one another, e.g., via a physical cable 106.
  • the processor 104 may send one or more control signals to the lighting arrangement 102.
  • the lighting arrangement 102 may provide lighting to illuminate the object and/or the background. This may mean that the object and/or the background may be illuminated simultaneously or separately by the lighting.
  • the lighting arrangement 102 may provide lighting to illuminate the object and/or the background from the front side of the object.
  • the lighting arrangement 102 may partially or entirely surround the object.
  • the lighting arrangement 102 may be arranged in the form of a closed box environment, and the object may be placed within said environment for imaging purposes.
  • the various methods and techniques may be used in an open studio, on a stage, or in a film set, or any other suitable environments or settings.
  • the lighting may illuminate the object and/or the background only from the front side of the object, or provide lighting from different directions towards the object and/or the background.
  • the processor 104 may control an (optical) imaging device (e.g., a (digital) camera capable of taking photographs and/or videos, or a (digital) video recorder) to generate a plurality of images (or frames).
  • the plurality of images may mean 10, 20, 50, 100 or more images.
  • the imaging device may be separately provided or integrally provided with the apparatus 100.
  • the processor 104 may control the imaging device to take a number of images showing the object of interest in the images. It should be appreciated that each of or all of the plurality of images may depict the object (e.g., against a background).
  • the plurality of images may be taken or obtained (directly) as still images (e.g., photographic images), and/or may be images extracted from a moving sequence of consecutive graphics (e.g., a moving picture, motion picture, movie).
  • the processor 104 may control the lighting arrangement 102 to vary at least one parameter of the lighting during or for generation of the plurality of images.
  • the at least one parameter may be varied in between generation of two images, or of two immediately adjacent images. This may mean that the at least one parameter of the lighting may be different for each of the images generated.
  • the at least one parameter of the lighting may include any one of or any combination of a lighting intensity, a lighting colour, or a lighting direction.
  • a plurality of pixels may belong to the object (“object pixels”).
  • The term "object pixel" means a pixel that defines, in an image, a physical part of the object.
  • the processor 104 may determine a respective desired (or optimum) pixel value from corresponding pixels of the respective object pixel through the plurality of images.
  • the processor 104 may be configured to determine the respective desired value for each respective pixel of the plurality of object pixels or of all the object pixels.
  • The term "pixel value" may mean the intensity value of the pixel.
  • The term "corresponding pixel" for an image, in relation to the "respective pixel", may mean a pixel corresponding to the same position or coordinate in each image of the plurality of images generated.
  • the determination of the respective desired (or optimum) pixel value may be carried out using corresponding pixels of two or more images of the plurality of images. All of the plurality of images may be used for this determination. Further, the determination of the respective desired (or optimum) pixel value may be carried out for each of all of the object pixels.
  • 10 corresponding pixels for a particular object pixel may be used for determining its desired pixel value. This may mean that, to determine the desired pixel value for the respective object pixel, the corresponding pixel from each of the 10 images, at the same position or coordinate in each image, may be used.
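The notion of "corresponding pixels" through the image stack can be sketched in Python. This is an illustrative sketch only, assuming the frames are equally sized grayscale NumPy arrays; the function name is not from the patent.

```python
import numpy as np

def corresponding_pixels(frames, y, x):
    """Return the values of the pixel at (y, x) in every frame.

    The "corresponding pixels" of a given pixel are simply the values
    found at the same position or coordinate through the whole stack
    of generated images.
    """
    stack = np.stack(frames)   # shape: (num_frames, height, width)
    return stack[:, y, x]      # one value per frame

# Example: 10 frames of a 4x4 image whose intensity rises frame by frame.
frames = [np.full((4, 4), i * 25, dtype=np.uint8) for i in range(10)]
values = corresponding_pixels(frames, 2, 3)
print(values.tolist())  # [0, 25, 50, ..., 225]
```

For 10 generated images, this yields the 10 corresponding pixel values from which the desired pixel value for that object pixel may be determined.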
  • corresponding pixels may include pixels belonging to images generated immediately adjacent to each other.
  • a uniform lighting or exposure may be provided for the object. This may help to minimize or avoid reflection effect on or from the object.
  • The term "object" may refer to a thing, a product, a person, or a subject to be imaged. Further, it should be appreciated that the term "object" may include a living thing (e.g., person, animal, plant, etc.) and/or a non-living thing (e.g., a product, item, inanimate body or object, etc.).
  • the lighting arrangement 102 may include a plurality of light sources to provide the lighting.
  • Each light source may be or may include a light emitting diode or device (LED).
  • the plurality of light sources may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs.
  • RGBW LEDs may provide a pure white colour.
  • any other types of light sources may be used, as long as the light sources may be controlled to vary at least one parameter (e.g., light intensity) of the lighting provided by the light sources.
  • Each light source (e.g., LED) may be individually addressable. This may mean that each light source may be independently switched on/off, and controlled to vary the at least one parameter of the lighting.
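Individually addressable LEDs are commonly driven as one serial chain spanning several panels, so addressing a single light source reduces to computing its index in the chain. The panel geometry and function below are illustrative assumptions, not taken from the patent.

```python
def led_index(panel, row, col, rows_per_panel=16, cols_per_panel=16):
    """Serial chain index of one LED across identical panels.

    Assumes each panel is wired row by row and the panels are chained
    in order; with this, any LED can be switched or varied
    independently by writing to its position in the chain.
    """
    leds_per_panel = rows_per_panel * cols_per_panel
    return panel * leds_per_panel + row * cols_per_panel + col

# Third LED in the second row of the second panel of 16x16 panels:
print(led_index(1, 1, 2))  # 256 + 16 + 2 = 274
```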
  • the lighting arrangement 102 may include a plurality of (separate) lighting panels to provide lighting from a plurality of directions to the object and/or background.
  • Each lighting panel may include a plurality of light sources, for example, LEDs, e.g., a combination of red LEDs, green LEDs, blue LEDs, and white LEDs.
  • the lighting arrangement 102 may be or may form part of a pixel machine.
  • a driver arrangement may be provided (e.g., in the pixel machine) for driving the lighting arrangement 102.
  • the driver arrangement may have a plurality of drivers for driving the associated light sources of the lighting arrangement 102.
  • the processor 104 may be provided with one or more communication interfaces, e.g., including at least one interface for communication with the lighting arrangement 102 or the pixel machine.
  • the processor 104 may be or may form part of a (main) controller.
  • the (main) controller may further include a display and other peripheral devices, e.g., a keyboard, a mouse, etc.
  • the pixel machine and the (main) controller may be comprised in the apparatus 100.
  • the processor 104 may be further configured to determine pixels belonging to the object, prior to determining the respective desired pixel value.
  • the processor 104 may be further configured to determine pixels belonging to the background (“background pixel”).
  • the processor 104 may remove or discard pixels belonging to the background. All pixels that are determined as background pixels may be removed or discarded.
  • background pixel may mean a pixel that defines, in an image, the background (e.g., relative to the object).
  • To "determine" a pixel as belonging to the object or the background may include to "identify" the pixel as belonging to the object or the background.
  • this may not necessarily mean to (positively) mark or tag the pixel as an object pixel or a background pixel, although this may be the case in various embodiments.
  • the processor 104 may be configured, for the respective pixel, to determine an average (or mean) pixel value from the corresponding pixels, the average pixel value being the respective desired pixel value.
  • the corresponding pixels may have their (associated) respective pixel values and the processor 104 may be configured to calculate an average pixel value using the respective pixel values, where the average pixel value may be employed as the respective desired pixel value.
  • the processor 104 may be configured to determine a minimum pixel value and a maximum (saturated) pixel value from the corresponding pixels.
  • a minimum pixel value and a maximum pixel value may be determined. This may mean that both the minimum and maximum pixel values corresponding to the respective object pixel may be determined from the plurality of images generated.
  • the average pixel value may be determined as (minimum pixel value + maximum pixel value)/2. It should be appreciated that the respective minimum pixel values for different respective object pixels may be in different images of the plurality of images, and/or the respective maximum pixel values for different respective object pixels may be in different images of the plurality of images.
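The per-pixel rule above, (minimum pixel value + maximum pixel value)/2, can be sketched as follows. This is a minimal sketch assuming grayscale NumPy frames; note that the per-pixel minimum and maximum are taken independently, so they may come from different images for different pixels, exactly as the text describes.

```python
import numpy as np

def desired_pixel_values(frames):
    """Per-pixel desired value as (min + max) / 2 over the frame stack."""
    stack = np.stack(frames).astype(np.float32)  # (num_frames, H, W)
    lo = stack.min(axis=0)                       # per-pixel minimum
    hi = stack.max(axis=0)                       # per-pixel maximum (saturation)
    return ((lo + hi) / 2.0).astype(np.uint8)

frames = [np.array([[0, 10], [20, 30]], dtype=np.uint8),
          np.array([[100, 255], [60, 50]], dtype=np.uint8)]
result = desired_pixel_values(frames)
print(result.tolist())  # [[50, 132], [40, 40]]
```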
  • the processor 104 may be further configured, for a (or each) respective pixel of pixels belonging to a background (relative to the object), to determine a respective desired pixel value from corresponding pixels of the respective pixel belonging to the background through the plurality of images.
  • the processor 104 may be configured to determine the respective desired value for each respective pixel of the plurality of background pixels or of all the background pixels.
  • the processor 104 may be further configured to differentiate between pixels belonging to the object and pixels belonging to the background. In other words, the processor 104 may determine or identify a pixel as either an object pixel or a background pixel. In this way, object pixels and background pixels may be determined and thus distinguished from each other. Accordingly, an object pixel may be distinguished from a non-object pixel e.g., a background pixel.
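The text does not spell out the differentiation rule, but one plausible heuristic follows from varying only the object lighting while the background lighting stays constant: object pixels then show a large intensity range across the stack, while background pixels stay nearly constant. The function and threshold below are assumptions for illustration.

```python
import numpy as np

def object_mask(frames, threshold=10):
    """Classify pixels as object (True) or background (False).

    Heuristic: pixels whose intensity range across the frame stack
    exceeds `threshold` are taken to respond to the varying object
    lighting and are marked as object pixels.
    """
    stack = np.stack(frames).astype(np.int16)
    pixel_range = stack.max(axis=0) - stack.min(axis=0)
    return pixel_range > threshold

# Background stays at 40; only the 2x2 centre responds to the sweep.
frames = []
for level in (20, 120, 220):
    img = np.full((4, 4), 40, dtype=np.uint8)
    img[1:3, 1:3] = level
    frames.append(img)
mask = object_mask(frames)
print(bool(mask[1, 1]), bool(mask[0, 0]))  # True False
```

Background pixels identified this way could then be removed or discarded for the automatic background elimination described earlier.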
  • the lighting may include an object lighting to illuminate (only) the object, and the at least one parameter may include at least one object parameter of the object lighting.
  • the processor 104 may be configured to control the lighting arrangement 102 to vary the at least one object parameter during or for generation of the plurality of images.
  • background lighting may be provided to illuminate the background.
  • the term “background lighting” may mean the lighting that illuminates the background, and/or the space in between the background and the object of interest, without illuminating the object. This means that the object of interest to be imaged is not illuminated by the background lighting, i.e., lighting provided from one or more light sources, to be employed as the background lighting to illuminate the background, does not illuminate the object.
  • The background lighting may be constant (i.e., not variable), or at least one background parameter of the background lighting may be varied.
  • The at least one object parameter and/or the at least one background parameter may be varied simultaneously or may be varied by the same factor or value.
  • The at least one object parameter and/or the at least one background parameter may correspond to intensity and/or colour.
  • the imaging device may be adjusted to low(er) sensitivity so that light scattered from the background that may potentially illuminate the object may not be captured or be observable in one or more images of the plurality of images generated.
  • the processor 104 may be further configured to vary the at least one parameter of the lighting through (or over) a plurality of values (associated with the at least one parameter), and, for generating the plurality of images, the processor 104 may be further configured to control the imaging device to generate a respective image of the plurality of images at a respective (different) value of the plurality of values.
  • the plurality of values may be different to one another. This may mean that the lighting for illuminating the object and/or background may be different between two (immediately adjacent) images.
  • the parameter of the lighting may be at or may have a first value for generation of one first image, and a second value for generation of one second image.
  • the processor 104 may vary the at least one parameter over a plurality of (different) values at a plurality of intervals to illuminate the object and/or the background, where each value may be associated with a respective interval, and the processor 104 may be configured to generate a respective image of the plurality of images at the respective interval.
  • the plurality of values may be within or may span a range from value 0 (minimum value) to value 255 (maximum value).
  • the at least one parameter may be varied from value 0 to value 255 at intervals of 1, 5 or any other number.
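The sweep of the lighting parameter from value 0 to value 255 at a fixed interval, with one image generated per value, can be sketched as below. The driver and camera callables are hypothetical stand-ins; only the sweep schedule itself comes from the text.

```python
def lighting_sweep(start=0, stop=255, step=5):
    """Values of the lighting parameter for the capture sweep."""
    return list(range(start, stop + 1, step))

def capture_sweep(set_intensity, capture_frame, step=5):
    """Drive a (hypothetical) lighting driver and camera through the sweep,
    generating one image at each value of the parameter."""
    frames = []
    for value in lighting_sweep(step=step):
        set_intensity(value)            # vary the lighting parameter
        frames.append(capture_frame())  # one image per value
    return frames

# With step=5 the sweep runs 0, 5, ..., 255: 52 frames in total.
values = lighting_sweep(step=5)
print(len(values), values[0], values[-1])  # 52 0 255
```

A smaller interval (e.g., 1) trades capture time for finer resolution of where each pixel saturates.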
  • the value 0 may represent the minimum intensity or darkness, while the value 255 may represent the maximum intensity or saturation.
  • the values 0 - 255 may also represent the scale or range for colour.
  • the processor 104 may be further configured to control relative movement between the imaging device and the object during or for generation of the plurality of images. For example, there may be rotational movement between the imaging device and the object. This may allow 360° generation of images of the object.
  • the imaging device may be placed on a support structure.
  • the support structure may be a movable support structure (e.g., an XYZ motorized stand), and the processor 104 may control movement of the movable support structure, relative to the object.
  • the apparatus 100 may include a turn table to support the object, and the processor 104 may control movement of the turn table, relative to the imaging device.
  • the object may be placed on the turn table to be rotated.
  • the turn table may be at least substantially transparent or transmissive to light.
  • the lighting arrangement 102 may include at least one curved lighting panel configured to provide a focused lighting towards the object.
  • the at least one curved lighting panel may include a plurality of light sources, e.g., LEDs. Four curved lighting panels may be provided.
  • the apparatus 100 may further include an actuator configured to communicate with the processor 104, wherein, in response to a single activation of the actuator, operation of the processor 104 may be (fully) automated.
  • the actuator may be a push actuator, e.g., a push button.
  • The processor 104 may be further configured to identify a flicker effect in the plurality of images generated and to remove the flicker effect. For example, when the imaging device is operated to take videos, a flicker effect may be captured in one or more images of the plurality of images due to a frequency difference between the operation of the imaging device and that of the lighting arrangement 102. Flicker may be observed continuously in the images. Rather than controlling the lighting to minimise the flicker effect, this may be achieved in the processing of the images: from the plurality of images generated, flicker which is regular and similar may be identified so that it may eventually be removed.
  • the processor 104 may be further configured to generate a resultant (or final) image of the object based on (or from or using) the pixels belonging to the object having the respective desired pixel values. In this way, a final image may be generated for the object on the basis of the desired pixel values determined for the object pixels.
  • the processor 104 may be further configured to control the lighting arrangement 102 to vary the at least one parameter to adapt the lighting according to the respective desired pixel values (to illuminate the object), and the processor 104 may be further configured to control the imaging device to generate, based on the lighting adapted, a resultant image depicting the object. For example, after determination of the respective desired pixel values, it may be possible to vary the intensity and/or colour of the lighting directed to one or more features (or elements or areas) of or within the object to highlight said feature(s) or to provide the desired (or optimum) lighting condition for the feature(s).
  • FIG. 1B shows a schematic block diagram of an apparatus 100b for imaging (and/or for image processing), according to various embodiments.
  • The apparatus 100b includes a processor 104b, and a memory 105 coupled to the processor 104b.
  • The memory 105 and the processor 104b may be coupled to each other (as represented by the line 107), e.g., physically coupled and/or electrically coupled.
  • The memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, to control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during (or for) generation of the plurality of images, and to determine, for a (or each) respective pixel of pixels belonging to the object, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • the plurality of images may be transferred to and/or stored in the memory 105 or another memory.
  • The memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b, for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, to determine, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • the plurality of images may be transferred to and/or stored in the memory 105 or another memory.
  • Description in the context of the apparatus 100 may correspondingly be applicable to the apparatus 100b, and/or description in the context of the processor 104 may correspondingly be applicable to the processor 104b.
  • FIG. 1C shows a flow chart 120 illustrating a method for imaging (and/or for image processing), according to various embodiments.
  • At 122, a plurality of images are generated, each of the plurality of images depicting an object of interest.
  • At 124, at least one parameter of lighting is varied during (or for) generation of the plurality of images.
  • At 126, for a respective pixel of pixels belonging to the object, a respective desired pixel value is determined from corresponding pixels of the respective pixel through the plurality of images.
  • the method may include providing the lighting.
  • the method may further include determining or identifying pixels belonging to the object, prior to determining the respective desired pixel value.
  • the method may further include determining or identifying pixels belonging to the background.
  • the method may further include removing or discarding pixels belonging to the background. All pixels that are determined as background pixels may be removed or discarded.
  • an average pixel value may be determined from the corresponding pixels, the average pixel value being the respective desired pixel value.
  • the method may include determining a minimum pixel value and a maximum pixel value from the corresponding pixels.
  • the method may further include determining, for a respective pixel of pixels belonging to a background (relative to the object), a respective desired pixel value from corresponding pixels of the respective pixel belonging to the background through the plurality of images.
  • the method may further include differentiating between pixels belonging to the object and pixels belonging to the background.
  • the lighting may include an object lighting to illuminate (only) the object, and the at least one parameter may include at least one object parameter of the object lighting.
  • the at least one object parameter may be varied during or for generation of the plurality of images.
  • the imaging device may be adjusted to low(er) sensitivity so that light scattered from the background that may potentially illuminate the object may not be captured or be observable in one or more images of the plurality of images generated.
  • the at least one parameter of the lighting may be varied through a plurality of (different) values, and, at 122, a respective image of the plurality of images may be generated at a respective value of the plurality of values.
  • the method may further include identifying a flicker effect in the plurality of images generated, and removing the flicker effect.
  • a resultant image of the object may be generated based on (or from or using) the pixels belonging to the object having the respective desired pixel values.
  • the at least one parameter may be varied to adapt the lighting according to the respective desired pixel values, and, based on the lighting adapted, a resultant image depicting the object may be generated.
  • FIG. 1D shows a method 130 for imaging (and/or for image processing), according to various embodiments.
  • the method 130 includes, for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, determining, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • the method may include providing the lighting. It should be appreciated that description in relation to the method described in the context of the flow chart 120 may correspondingly be applicable in relation to the method 130, and vice versa.
  • Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
  • Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to control an imaging device to generate a plurality of images depicting an object of interest, to control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during (or for) generation of the plurality of images, and to determine, for a respective pixel of pixels belonging to the object, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
  • the intensity of light directed onto the object may be varied (e.g., increased) during capture of different frames such that, from (all) the frames captured, the minimum intensity value and a maximum intensity value (saturation) for each respective pixel corresponding to the object may be determined (where saturation may be reached for different pixels in different frames).
  • a final resultant frame may then be constructed such that each pixel corresponding to the object may carry an intensity value equal to the mean value of the corresponding minimum and maximum intensity values.
  • (all) the pixels in the resultant image may be formed with uniform exposure.
  • Frames may be captured with front light increasing from a low to a saturated level. The pixel values that are desired or optimum throughout the frames may then be calculated and combined to obtain uniform exposure.
  • the object and/or the background may be lit from the front using lighting with an intensity increasing from value 0 to value 255, i.e., from a dark frame to a saturated frame.
  • different pixels may get saturated at different values of the input lighting.
  • the minimum and maximum values for (all) the pixels individually may be determined and the average (or mean) value may be determined to obtain the desired or best pixels with uniform exposure as the output.
  • a plurality of images may be captured at different lighting conditions and a resultant frame may then be formed by calculating the mean of minimum and maximum of pixel values throughout the plurality of images. Therefore, pixel values which are optimal or best throughout the images may be calculated and the associated pixels may then be combined to obtain uniform exposure for the object. In this way, a uniform exposure may be achieved for (all) the pixels without having some dark or saturated pixels.
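The min/max averaging described above may be sketched as follows. This is a hypothetical NumPy illustration, not part of the disclosure: the function name and the assumptions of aligned, same-size 8-bit frames are illustrative only.

```python
import numpy as np

def uniform_exposure(frames):
    """Combine frames captured at increasing lighting intensity into one
    uniformly exposed image: each output pixel is the mean of that pixel's
    minimum and maximum values across all frames.

    frames: list of equally sized uint8 arrays (HxW or HxWx3), one per
    lighting level, assumed pixel-aligned.
    """
    stack = np.stack([f.astype(np.uint16) for f in frames], axis=0)
    lo = stack.min(axis=0)   # per-pixel minimum intensity across frames
    hi = stack.max(axis=0)   # per-pixel maximum (possibly saturated) intensity
    # Resultant pixel = mean of the per-pixel minimum and maximum values.
    return ((lo + hi) // 2).astype(np.uint8)
```

Working in uint16 before averaging avoids overflow when the minimum and maximum are summed.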
  • various embodiments work with saturation, and determine where it appears and its relations with overall pixel values. Once it is determined which area of an image is saturating more than other areas or parts of the image at different light intensities, the group of pixels in that area may be marked and the intensity of the lighting provided to that area may be adjusted to a level optimally or best suited as per average level of pixel intensity of the whole image, and a resultant image may then be captured at the lighting intensity so adjusted.
  • the apparatus of various embodiments may include two parts: a pixel machine and a main controller, as will be described further below with reference to FIGS. 2 A to 2E and 3.
  • FIGS. 2 A to 2E show schematic perspective views of a pixel machine 201 of various embodiments. It should be appreciated that while one or more features or elements of the pixel machine 201 may not be shown in one or more of FIGS. 2 A to 2E for clarity and easier understanding purposes, and/or to illustrate an internal environment of the pixel machine 201, such features/elements may nevertheless be part of the pixel machine 201 as may be apparent from FIGS. 2A to 2E. Further, while LEDs are described in the context of the pixel machine 201, it should be appreciated that other types of light sources may be employed, as long as at least one parameter (e.g., light intensity) of the lighting provided may be variable. Nevertheless, LED lighting is preferable due to its characteristics of low power consumption, speed of control and long life.
  • the pixel machine 201 may include a front-left LED panel 202, a front-middle LED panel 204, a front-right LED panel 206, a back side LED panel 207, a left side LED panel 214, a right side LED panel 216, four (key light) LED panels 220 which may be curved, a top LED panel 222, a top-back LED panel 224, a top-left LED panel 225, a top-right LED panel 226, a top- front (key light) LED panel 227, and a bottom LED panel 228, where one or more of these LED panels may define a lighting arrangement (e.g., 102, FIG. 1A).
  • the lighting arrangement may provide omnidirectional lighting, where the intensity of the lighting may be changed.
  • Each of the LED panels 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a plurality of LEDs (illustrated as squares in the panels and represented as 229 for some LEDs).
  • Each LED panel 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs (RGBW LEDs), which may provide pure white colour, as producing white from RGB in a closed box may be challenging.
  • Each LED panel may be individually or independently controlled.
  • Each LED 229 may be individually or independently addressable.
  • the lighting arrangement of the pixel machine 201 may provide three types of lights: key light, fill light and back light, thereby providing three-point lighting.
  • the lighting arrangement may also provide background lighting.
  • the four LED panels 220 in the corner, the top-back LED panel 224, the top-left LED panel 225, the top-right LED panel 226, and the top-front LED panel 227 may provide key light.
  • the left side LED panel 214 and the right side LED panel 216 may provide fill light.
  • the back side LED panel 207 may provide back light.
  • An imaging device (e.g., camera) 210 which may be supported on a support structure (e.g., camera stand) 212, may be provided for imaging through an aperture 208 defined in the front- middle LED panel 204.
  • the imaging device 210 may have a zoom feature, meaning that the imaging device 210 is capable of zooming-in and zooming-out operations for generation of images.
  • a turn table 231 may be provided to support one or more (sample) objects (e.g., represented as 230) that are to be imaged.
  • a stepper motor 232 may be provided for rotation of the object(s) 230 or turn table 231.
  • the lighting arrangement of the pixel machine 201 may provide lighting to light up an object (or product) 230 at least substantially equally from all angles so one or more or all features of the object(s) 230 may have good resolution, contrast and colours.
  • the four side LED panels 220 (curved or round shape) employ more LEDs to provide sharp and crisp contrast of the object(s) 230.
  • the panels 220 may also help to produce useful shadows and shining areas on the object(s) 230.
  • the curved shape of the LED panels 220 may give more focused light to the object(s) 230.
  • the panels 220 may also be used to generate different light effects, e.g., highlights and long and softer shadows, Chiaroscuro effect, etc, for photography.
  • the key light provided may be angled such that it may light up any part of an object 230 with control of the intensity of the key light.
  • Each LED 229 may be addressed individually.
  • Such a technique may allow (full) control over object lighting so that an optimal lighting condition or environment suitable for product photography may be determined. Further, in various embodiments, as the object height and length may be detected using a computer application, the optimal or best lighting condition or environment may be determined or suggested automatically with minimal or without any saturation and/or dark spots.
  • An interface 234 may be provided on the pixel machine 201, for example, for cable connection to the processor to be described below with reference to FIG. 3.
  • FIG. 3 shows a schematic back view of a main controller 340, according to various embodiments.
  • the main controller 340 may include a processor 342, for example, an industrial PC with embedded LEDs and motor controllers.
  • the main controller 340 may further include a display device (e.g., a touch-screen monitor) 352, a keyboard 354 and a mouse 356.
  • An actuator for example, a single capture button 344, may be provided (integrated) with the processor 342.
  • a connector 348 may be provided for connection to a network, e.g., a local area network (LAN).
  • a power connector or socket 350 may also be provided.
  • An interface (or link connector) 346 may be provided on the processor 342 for communication with the pixel machine 201 (FIGS. 2A to 2E), for example, via the interface 234 (FIG. 2A). In this way, signals may be communicated from the processor 342 to the pixel machine 201, for example, to control movement and operation of the imaging device 210, to vary the parameter of the lighting provided by the lighting arrangement, etc.
  • a cable may be connected between the interfaces 234, 346, e.g., a multicore cable which may carry power, and control signals for LEDs 229, motor(s) (e.g., 232) and the imaging device 210 of the pixel machine 201.
  • Reflection elimination may be necessary in product photography, for example, when involving highly reflective objects such as jewellery, glass, etc.
  • the parameter (e.g., intensity) of the lighting provided may be controlled or varied
  • reflection on the product may also be varied, for example, to a level where there is no reflection anymore or reduced to its minimum level.
  • the (reflection) level may be detected based on the saturation level(s) of pixels: pixels which lie at the boundary of the product may be determined or identified, and the light may be increased to allow these pixels to reach their saturation levels or their highest intensity level. These pixels may be marked at their best contrast and resolution levels and subsequently used to replace the saturated pixels of the frame(s).
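A minimal sketch of the saturated-pixel replacement idea, assuming two pixel-aligned 8-bit frames of the same scene (the function name and the single fixed saturation level are illustrative assumptions):

```python
import numpy as np

def replace_saturated(frame, fallback, sat_level=255):
    """Replace saturated pixels in `frame` with the corresponding pixels
    from `fallback`, a frame captured at a lower lighting intensity.

    frame, fallback: equally sized uint8 arrays, assumed pixel-aligned.
    """
    out = frame.copy()
    mask = frame >= sat_level      # pixels at or above the saturation level
    out[mask] = fallback[mask]     # patch from the less-exposed frame
    return out
```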
  • FIG. 4 shows a flow chart 490 illustrating a method for imaging, as an example to minimise reflection.
  • Lighting of an increasing intensity may be provided to illuminate the object and a plurality of frames may be generated in the process. The lighting may also illuminate the background.
  • A command may be sent to the imaging device (e.g., 210) to generate images.
  • the intensity of the lighting may be varied (e.g., increased) such that, in the frames captured, each pixel defining the object and/or the background may initially have a minimum intensity value and progressively reach a saturation value or level. It should be appreciated that respective pixels defining the object and/or the background may reach respective saturation levels in different frames. Further, respective pixels having desired values may be distributed in different frames. Further, in various embodiments, the colour of the lighting on the object and/or the background may also be changed.
  • the light intensity and/or colour emitted from the associated LEDs or LED panels may be varied under the control of the processor 342 (FIG. 3) which may in turn take command via or from a software application, e.g., based on user input. It should be appreciated that key light may be provided on the object.
  • the button 344 (FIG. 3) may be actuated/pressed to perform the product imaging or photography. Actuation of the button 344 may result in automated operations.
  • the minimum intensity value and the maximum intensity value may be determined from the corresponding pixels.
  • the average or mean value for each respective pixel may be calculated based on the minimum and maximum intensity values.
  • the mean value defining the final intensity value may be an average of the minimum and maximum intensity values (i.e., (minimum intensity value + maximum intensity value)/2).
  • the desired (or optimum) value for each respective pixel may be determined, where there is no saturation of the pixels.
  • a resultant or final frame may then be generated using pixels that are determined to have the desired (or optimum) values. Therefore, the resultant frame may be defined by a combination of pixels having the desired values.
  • the resultant image may include a uniform exposure of the pixels in the image, e.g., the corresponding pixels defining the object and/or the background, or all the pixels in the image.
  • the intensity of lighting directed onto the object and/or the background may be varied (e.g., increased) during capture of different images such that, from the images captured, the minimum intensity value and a maximum intensity value (saturation) for each respective pixel corresponding to the object and/or the background may be determined, where saturation may be reached for different pixels in different frames.
  • a final resultant frame may then be constructed such that each pixel corresponding to the object and/or the background carries an intensity value equal to the mean value of the corresponding minimum and maximum intensity values.
  • various methods for imaging may deliberately create saturation, for the purpose of subsequently removing it.
  • the images may be generated via two modes: photography mode and video (or movie) mode.
  • In movie mode, the imaging device (e.g., camera) 210 may capture frames at a rate of 60 frames per second (which may be less or more depending on device speed), and the light intensity and/or colours may be changed during the making of the video.
  • The auto-iris mode of the camera may be turned off so that the effect of the rate of change of light on the background and the product may be captured.
  • the timing of the frames may be marked in the software application, which may be extracted and used to make a final or resultant 360° of still photograph.
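Under the assumption of a constant frame rate, mapping the timing marks recorded by the software application to frame indices in the captured video could be as simple as the following (a hypothetical helper, not part of the disclosure):

```python
def marked_frames(timestamps_s, fps=60.0):
    """Map timing marks (in seconds) to frame indices in a video
    captured at a constant frame rate `fps`."""
    return [round(t * fps) for t in timestamps_s]
```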
  • the various methods for imaging may be used for 360° photography of a product by rotation.
  • Each frame may have clear background elimination.
  • the apparatus may help a user to place the object in the centre of the turn table to obtain a better result in 360° photography.
  • the object may be placed anywhere at the turn table and rotated, and the object pixels may be determined or identified.
  • the image that is desired or best to put in front of the imaging device may be identified, and such image with the product may be shown on a display device in half tone, together with a half-tone live video. This provides a live view which helps the user to match the location of the object to the image on screen by overlaying both images.
  • the product may first be detected where it is in the frame.
  • the centre or middle of the product may be determined by dividing X and Y values by 2.
  • a centre line of the product may be drawn.
  • the centre line may be thick, as all the middle pixels may not lie on the same X and Y lines; the width (threshold of centre) of the product may therefore increase, but may in no case have a value outside the product boundary.
  • the difference between the current location and where the product should be if placed in the centre of the turn table may be determined, and a half-tone image may be shown to the user so that, where necessary, the user may move the product to the centre of the turn table to obtain a good 360° rotation view.
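The centring computation described above (bounding-box midpoint via division by 2, plus the offset from the frame centre) might be sketched as follows. The helper names are hypothetical, and a binary object mask from the object-pixel detection step is assumed:

```python
import numpy as np

def product_centre(object_mask):
    """Centre of the detected product: the midpoint of its bounding box,
    i.e. (min + max) // 2 along each axis of a 2D boolean mask."""
    ys, xs = np.nonzero(object_mask)
    cx = int((xs.min() + xs.max()) // 2)
    cy = int((ys.min() + ys.max()) // 2)
    return cx, cy

def offset_from_table_centre(object_mask):
    """Offset between the product centre and the frame centre, which the
    user may correct by moving the product on the turn table."""
    h, w = object_mask.shape
    cx, cy = product_centre(object_mask)
    return cx - w // 2, cy - h // 2
```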
  • In the context of the apparatus and various methods for imaging, each light source (e.g., LED) may be controlled through a touch-screen monitor or by one or more mouse clicks, which helps the user to create different lighting effects.
  • LEDs may be driven by a DC (direct current) controller to minimise or eliminate lighting non-sync locking effect.
  • The DC (direct current) controller may eliminate the flicker caused by most LED driving arrangements used in low-cost LED drivers, which drive the LEDs directly from the 50/60 Hz AC line so as to save cost.
  • The associated drivers for the light sources are located in the pixel machine and receive commands from the processor.
  • LED light control from all angles may help to light up the round edges from left, right, top and bottom of the product, and good contrast and resolution pixels are made part of the final frame.
  • a first lighting sequence may be based on feature or part detection within the object. After the detection of an object as a whole (i.e., determination or identification of object pixels), various features or parts of the object may be detected based on colour.
  • Using LEDs as a non-limiting example: as the lighting provided by the LEDs may be controlled (e.g., via the processor or a software application in the processor), it may be possible to determine the specific LEDs or LED panel illuminating a specific individual feature of the object.
  • the intensity and/or colour of the light from said specific LEDs or LED panel, which illuminates said specific individual feature may be varied to be substantially similar to the colour of the individual feature, which helps to further highlight the individual feature.
  • For example, for a yellow feature, the associated LED may be controlled to provide yellow-coloured lighting onto the individual feature so that said feature may be clearer and colour rich.
  • Such an approach may also help in determining the saturation level for the pixels corresponding to said specific feature for determining the optimum intensity and/or colour value for said pixels when constructing the final resultant frame.
  • Such an approach may be helpful for image capture of objects having reflective part(s), for example, shiny metal parts and glass products.
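One way the feature-matched lighting colour could be chosen is sketched below. The mean-colour heuristic and the function name are assumptions for illustration; the actual control path from the processor to the LEDs is not shown:

```python
import numpy as np

def feature_light_colour(image, feature_mask):
    """Suggest an RGB colour for the LEDs illuminating a detected feature:
    the mean colour of that feature's pixels (a hypothetical heuristic).

    image: HxWx3 uint8 array; feature_mask: HxW boolean array marking the
    pixels of the detected feature.
    """
    pixels = image[feature_mask]               # N x 3 array of feature pixels
    return tuple(int(c) for c in pixels.mean(axis=0))
```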
  • a second lighting sequence may be based on defined lighting patterns.
  • Different lighting patterns may be provided, for example, round circle spot lighting, line lighting (of different widths or thicknesses), and motion lighting (e.g., up, down, circular) to create different effects in photography, as well as fade-in/fade-out effects, etc., in short movie mode, as most digital cameras are capable of video recording at high resolution.
  • Lighting patterns or effects may be applied in circumstances where the object is stationary or is being rotated 360° during the image capture.
  • lighting may be provided to the sides of an object.
  • When lighting is provided from the front for a shiny object, the edges of the object may become mixed with the background.
  • the front lighting may then be turned off and side lighting may be provided for images to be generated to capture the edges of the object without saturation.
  • lighting may be provided from the left and/or right sides of the object. Subsequently, the images generated with the front lighting and the side lighting may be combined to form the complete object.
  • a camera stand may be mounted on the box defining the pixel machine, which may help in the minimisation or elimination of vibration or small differences in the position of detection of pixels in different frames.
  • the imaging device may be mounted on an XYZ motorized stand which may not only automatically zoom but adjust its physical location as per product size and/or placement on the turn table.
  • the imaging device may be moved up and down, in a linear or curved motion.
  • the imaging device may be moved in a curved motion to take a top view image of the object if desired.
  • This may mean that the front-middle LED panel 204 may be movable together with the imaging device when the XYZ motorized stand is activated to control positioning of the imaging device.
  • the front-middle LED panel 204 may be movable up and down and/or rotated in a curve. Additionally or alternatively, even if the imaging device is moved up and down with XY movement, a curved effect may be achieved using the software application.
  • the lighting may be turned on in a specific or defined manner to find the desired or best lighting based on pixel resolution and/or saturation levels.
  • front, back, left and right views of the object may be automatically captured by rotation of the turn table.
  • a user (only) needs to place the object front facing the imaging device.
  • the turn table may then rotate and adjust, for example, 5 - 10°, to determine the location where the maximum width of the object is detected. This may help to further eliminate any placement errors by the user.
  • the turn table may rotate at any angle, which may be defined in a default setting of the software. For example, the turn table may rotate 90° to capture the left, back and right views of the object for photoshoot.
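Selecting the turn-table position at which the maximum object width was detected could be sketched as follows (hypothetical helper; the widths measured at each trial angle are assumed to be available):

```python
def widest_angle(width_by_angle):
    """Given object widths measured at a series of small turn-table
    adjustments (mapping angle in degrees -> detected width in pixels),
    return the angle at which the maximum width was detected."""
    return max(width_by_angle, key=width_by_angle.get)
```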
  • camera white balance may be adjusted based on RGB colour intensity which may help in capturing close to the natural colours of the object(s) or product(s).
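The disclosure does not detail the white-balance computation; one common approximation consistent with "adjustment based on RGB colour intensity" is the grey-world method, sketched here as an assumption rather than the disclosed algorithm:

```python
import numpy as np

def grey_world_balance(image):
    """Grey-world white balance: scale each channel so that the channel
    means become equal, approximating adjustment based on RGB intensity.

    image: HxWx3 uint8 array.
    """
    img = image.astype(np.float64)
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel mean intensity
    gains = means.mean() / means              # gain that equalises the means
    return np.clip(img * gains, 0, 255).astype(np.uint8)
```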
  • a software application compatible with Windows® and Linux® operating systems may be provided.
  • the object on the turn table may be automatically determined, the camera may be adjusted to its best location, the lighting may be adjusted, object image(s) may be taken or generated, the background may be eliminated, the object may be centred in the middle of the frame, and both raw and compressed versions may be stored, for example, in the “My Pictures” folder in Windows® or on a flash drive connected to the processor or to a network-addressed folder.
  • Activating the button 344 triggers the automatic scripts written in the software application, which may include predefined template(s) to get most of product photographs correct.
  • an application programming interface (API) platform may be provided, which may be used to write plugins for web applications supporting single-button operation, so that product images (in single, quad or 360° form, compressed and optimised for web resolution) may be uploaded direct to e-commerce websites with a single press of the button 344.
  • the various methods may provide a uniform exposure by patching the best pixels from the frames.
  • the various methods may be used in a studio with bigger objects to shoot, including fashion models, clothing and cars. Background lighting may be varied with known values, and pixels may be marked and determined (or identified) as belonging to the product or the background so that there is no necessity for manual editing, e.g., to remove the background. It should be appreciated that the various methods for imaging may be implemented, either individually or a combination thereof, in the apparatus for imaging of various embodiments.
  • the intensity of the lighting may be initially set for saturation of the pixels, and the intensity may be gradually reduced.
  • a pixel may have red, green and blue components, which when saturated, each colour component may have a high or maximum (saturated) value.
  • the value for each colour (red, green, blue) component may decrease.
  • the ratio in the change of the respective value for each colour component may vary with each reduction in the lighting intensity. At a certain threshold lighting intensity and below, the ratio in the change of the respective value for each colour component may be constant with each reduction in the lighting intensity.
  • the value of the red component may change by “r1”, the value of the green component by “g1”, and the value of the blue component by “b1”, such that the change in the respective components may follow a constant ratio r1:g1:b1 with each reduction.
  • a pixel with the respective values for the red, green and blue components at the threshold lighting intensity may be identified as the best pixel or having the desired pixel value.
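The constant-ratio test described above might be sketched as follows. This is a hypothetical implementation: integer (r, g, b) pixel values ordered from the saturated setting down to dark are assumed, and proportionality of successive component changes is checked via cross-products:

```python
def threshold_intensity(values_by_step, tol=1e-6):
    """Find the first step (counting from the brightest lighting setting)
    from which the per-component changes (r, g, b) keep a constant ratio
    as the lighting intensity is reduced; return None if never constant.

    values_by_step: list of (r, g, b) tuples, ordered from saturated
    lighting down to dark.
    """
    def deltas(i):
        # Component-wise change between step i and the next (dimmer) step.
        a, b = values_by_step[i], values_by_step[i + 1]
        return tuple(x - y for x, y in zip(a, b))

    def same_ratio(d1, d2):
        # d1 and d2 are proportional iff all pairwise cross-products match.
        return all(abs(d1[i] * d2[j] - d1[j] * d2[i]) <= tol
                   for i in range(3) for j in range(3))

    for i in range(len(values_by_step) - 2):
        if same_ratio(deltas(i), deltas(i + 1)):
            return i  # step at which the constant ratio begins
    return None
```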

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An apparatus for imaging is provided. The apparatus includes a lighting arrangement configured to provide lighting, and a processor, wherein the processor is configured to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, and to control the lighting arrangement to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor is further configured, for a respective pixel of pixels belonging to the object of interest, to determine a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images. A method for imaging is also provided.

Description

APPARATUS AND METHOD FOR IMAGING
Technical Field
Various embodiments relate to an apparatus for imaging and a method for imaging.
Background
Photographing of products normally requires at least a few hours for professional photographers to adjust the lights to strike a balance between product lighting and background colour. In the worst cases, if the product is reflective or multicoloured, and/or some parts of the product have a colour matching or similar to that of the background, the photographer has to make compromises in lighting and in the quality of the product shoot. In almost every case, no matter how good, professional and skilled the photographer is, and however good the post-production tools may be, the photographer ends up having to eliminate the background by manual editing, which is bound to cause loss of product detail and resolution.
Summary
The invention is defined in the independent claims. Further embodiments of the invention are defined in the dependent claims.
According to an embodiment, an apparatus for imaging is provided. The apparatus may include a lighting arrangement configured to provide lighting, and a processor, wherein the processor is configured to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, and to control the lighting arrangement to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor is further configured, for a respective pixel of pixels belonging to the object, to determine a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
According to an embodiment, an apparatus for imaging is provided. The apparatus may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor to: control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images, and determine, for a respective pixel of pixels belonging to the object, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
According to an embodiment, an apparatus for imaging is provided. The apparatus may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor, for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, to determine, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
According to an embodiment, a method for imaging is provided. The method may include generating a plurality of images, each of the plurality of images depicting an object of interest, varying at least one parameter of lighting during generation of the plurality of images, and determining, for a respective pixel of pixels belonging to the object, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
According to an embodiment, a method for imaging is provided. The method may include, for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, determining, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
According to an embodiment, a computer program or a computer program product is provided. The computer program or computer program product may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
Brief Description of the Drawings
In the drawings, like reference characters generally refer to like parts throughout the different views. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the following description, various embodiments of the invention are described with reference to the following drawings, in which:
FIG. 1 A shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
FIG. 1B shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
FIG. 1C shows a flow chart illustrating a method for imaging, according to various embodiments.
FIG. 1D shows a method for imaging, according to various embodiments.
FIGS. 2A to 2E show schematic perspective views of a pixel machine of various embodiments from different angles.
FIG. 3 shows a schematic back view of a main controller, according to various embodiments.
FIG. 4 shows a flow chart illustrating a method for imaging, according to various embodiments.
Detailed Description
The following detailed description refers to the accompanying drawings that show, by way of illustration, specific details and embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural, logical, and electrical changes may be made without departing from the scope of the invention. The various embodiments are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments.
As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. For example, A and/or B may include A or B or both A and B.
Various embodiments may provide an imaging apparatus, for example, a product photography apparatus, e.g., an automatic photography equipment for products. The apparatus may minimise the effort and labour associated with photography of products, for example, for the fastest expanding online shopping business. Various embodiments may also provide the corresponding methods for imaging.
One or more of the following may be achieved: (1) detection of pixels belonging to object(s) for automatic background elimination (e.g., background cut); (2) detection of saturated pixel(s) to automatically eliminate reflections; (3) detection of, and elimination or maintenance of product shadows; (4) automatic centering of the object(s); (5) elimination of background and shadows of rotating object(s); (6) providing uniform exposure and colour for all the pixels.
FIG. 1A shows a schematic block diagram of an apparatus 100 for imaging, according to various embodiments. The apparatus 100 includes a lighting arrangement 102 configured to provide lighting, and a processor 104, wherein the processor 104 is configured to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, and to control the lighting arrangement 102 to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor 104 is further configured, for a respective pixel of pixels belonging to the object, to determine a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
In other words, an apparatus 100 for imaging (and/or for image processing) may be provided, having a lighting arrangement 102 and a processor 104. The apparatus (or arrangement) 100 may be employed to image or take images of an object of interest that is the subject to be imaged. The object may be positioned against a background. It should be appreciated that the term “background” may include a reference to the background relative to the object (i.e., in the presence of the object in the foreground).
The processor 104 may communicate with the lighting arrangement 102, for example, via a channel represented by the line 106. The processor 104 and the lighting arrangement 102 may be electrically coupled to one another, e.g., via a physical cable 106. The processor 104 may send one or more control signals to the lighting arrangement 102.
The lighting arrangement 102 may provide lighting to illuminate the object and/or the background. This may mean that the object and/or the background may be illuminated simultaneously or separately by the lighting. The lighting arrangement 102 may provide lighting to illuminate the object and/or the background from the front side of the object. The lighting arrangement 102 may partially or entirely surround the object. As a non-limiting example, the lighting arrangement 102 may be arranged in the form of a closed box environment, and the object may be placed within said environment for imaging purposes. However, it should be appreciated that the various methods and techniques may be used in an open studio, on a stage, or in a film set, or any other suitable environments or settings. The lighting may illuminate the object and/or the background only from the front side of the object, or provide lighting from different directions towards the object and/or the background. The processor 104 may control an (optical) imaging device (e.g., a (digital) camera capable of taking photographs and/or videos, or a (digital) video recorder) to generate a plurality of images (or frames). As non-limiting examples, the plurality of images may mean 10, 20, 50, 100 or more images. The imaging device may be separately provided or integrally provided with the apparatus 100. The processor 104 may control the imaging device to take a number of images showing the object of interest in the images. It should be appreciated that each of or all of the plurality of images may depict the object (e.g., against a background).
In the context of various embodiments, the plurality of images may be taken or obtained (directly) as still images (e.g., photographic images), and/or may be images extracted from a moving sequence of consecutive graphics (e.g., a moving picture, motion picture, movie).
The processor 104 may control the lighting arrangement 102 to vary at least one parameter of the lighting during or for generation of the plurality of images. For example, the at least one parameter may be varied in between generation of two images, or of two immediately adjacent images. This may mean that the at least one parameter of the lighting may be different for each of the images generated.
In the context of various embodiments, the at least one parameter of the lighting may include any one of or any combination of a lighting intensity, a lighting colour, or a lighting direction.
Of the pixels (“image pixels”) defining each image generated, a plurality of pixels may belong to the object (“object pixels”). The term “object pixel” means a pixel that defines, in an image, the physical part of the object. For a (or each) respective object pixel, the processor 104 may determine a respective desired (or optimum) pixel value from corresponding pixels of the respective object pixel through the plurality of images. The processor 104 may be configured to determine the respective desired value for each respective pixel of the plurality of object pixels or of all the object pixels.
In the context of various embodiments, the term “pixel value” may mean the intensity value of the pixel.
In the context of various embodiments, the term “corresponding pixel” for an image in relation to the “respective pixel” may mean a pixel corresponding to the same position or coordinate in each image of the plurality of images generated. The determination of the respective desired (or optimum) pixel value may be carried out using corresponding pixels of two or more images of the plurality of images. All of the plurality of images may be used for this determination. Further, the determination of the respective desired (or optimum) pixel value may be carried out for each of all of the object pixels.
Using 10 images as a non-limiting example, 10 corresponding pixels for a particular object pixel may be used for determining its desired pixel value. This may mean that, to determine the desired pixel value for the respective object pixel, the corresponding pixel from each of the 10 images, at the same position or coordinate in each image, may be used.
In the context of various embodiments, corresponding pixels may include pixels belonging to images generated immediately adjacent to each other.
In various embodiments, by determining the respective desired pixel values for respective object pixels, a uniform lighting or exposure may be provided for the object. This may help to minimize or avoid reflection effect on or from the object.
In the context of various embodiments, the term “object” may refer to a thing, a product, a person, or a subject to be imaged. Further, it should be appreciated that the term “object” may include a living thing (e.g., person, animal, plant, etc.) and/or a non-living thing (e.g., a product, item, inanimate body or object, etc.).
The lighting arrangement 102 may include a plurality of light sources to provide the lighting. Each light source may be or may include a light emitting diode or device (LED). In one non-limiting example, the plurality of light sources may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs. The use of RGBW LEDs may provide a pure white colour. Nevertheless, it should be appreciated that any other types of light sources may be used, as long as the light sources may be controlled to vary at least one parameter (e.g., light intensity) of the lighting provided by the light sources. Each light source (e.g., LED) may be individually addressable. This may mean that each light source may be independently switched on/off, and controlled to vary the at least one parameter of the lighting.
The lighting arrangement 102 may include a plurality of (separate) lighting panels to provide lighting from a plurality of directions to the object and/or background. Each lighting panel may include a plurality of light sources, for example, LEDs, e.g., a combination of red LEDs, green LEDs, blue LEDs, and white LEDs.
The lighting arrangement 102 may be or may form part of a pixel machine. A driver arrangement may be provided (e.g., in the pixel machine) for driving the lighting arrangement 102. The driver arrangement may have a plurality of drivers for driving the associated light sources of the lighting arrangement 102. The processor 104 may be provided with one or more communication interfaces, e.g., including at least one interface for communication with the lighting arrangement 102 or the pixel machine. The processor 104 may be or may form part of a (main) controller. The (main) controller may further include a display and other peripheral devices, e.g., a keyboard, a mouse, etc.
The pixel machine and the (main) controller may be comprised in the apparatus 100.
In various embodiments, the processor 104 may be further configured to determine pixels belonging to the object, prior to determining the respective desired pixel value. The processor 104 may be further configured to determine pixels belonging to the background (“background pixel”). In some embodiments, the processor 104 may remove or discard pixels belonging to the background. All pixels that are determined as background pixels may be removed or discarded. The term “background pixel” may mean a pixel that defines, in an image, the background (e.g., relative to the object).
In the context of various embodiments, it should be appreciated that, to “determine” a pixel as belonging to the object or the background may include, to “identify” the pixel as belonging to the object or the background. However, this may not necessarily mean to (positively) mark or tag the pixel as an object pixel or a background pixel, although this may be the case in various embodiments.
In various embodiments, for determining the respective desired pixel value, the processor 104 may be configured, for the respective pixel, to determine an average (or mean) pixel value from the corresponding pixels, the average pixel value being the respective desired pixel value. In one non-limiting example, the corresponding pixels may have their (associated) respective pixel values and the processor 104 may be configured to calculate an average pixel value using the respective pixel values, where the average pixel value may be employed as the respective desired pixel value.
In various embodiments, for determining the average pixel value, the processor 104 may be configured to determine a minimum pixel value and a maximum (saturated) pixel value from the corresponding pixels. In other words, from the respective pixel values associated with or of the corresponding pixels, a minimum pixel value and a maximum pixel value may be determined. This may mean that both the minimum and maximum pixel values corresponding to the respective object pixel may be determined from the plurality of images generated. The average pixel value may be determined as (minimum pixel value + maximum pixel value)/2. It should be appreciated that the respective minimum pixel values for different respective object pixels may be in different images of the plurality of images, and/or the respective maximum pixel values for different respective object pixels may be in different images of the plurality of images.
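The per-pixel computation described above can be sketched as follows. This is an illustrative, non-limiting example only (not the claimed implementation itself), assuming the plurality of images is available as NumPy arrays of equal shape; the function name `desired_pixel_values` is a hypothetical label introduced here for illustration.

```python
import numpy as np

def desired_pixel_values(frames):
    """For a stack of frames of the same scene taken under varying
    lighting, compute each pixel's desired value as the mean of its
    minimum and maximum value across the stack: (min + max) / 2."""
    stack = np.stack(frames).astype(np.float32)   # shape: (num_frames, H, W)
    per_pixel_min = stack.min(axis=0)             # minimum over the frames
    per_pixel_max = stack.max(axis=0)             # maximum (possibly saturated)
    return (per_pixel_min + per_pixel_max) / 2.0

# Three 2x2 frames captured at increasing light intensity:
frames = [np.array([[10, 40], [20, 30]]),
          np.array([[120, 200], [90, 110]]),
          np.array([[255, 255], [180, 240]])]
result = desired_pixel_values(frames)
# e.g., top-left pixel: (10 + 255) / 2 = 132.5
```

Note that, as the description states, the frame in which a given pixel attains its minimum or maximum may differ from pixel to pixel; the per-pixel `min`/`max` over the stack handles this automatically.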
In various embodiments, the processor 104 may be further configured, for a (or each) respective pixel of pixels belonging to a background (relative to the object), to determine a respective desired pixel value from corresponding pixels of the respective pixel belonging to the background through the plurality of images. The processor 104 may be configured to determine the respective desired value for each respective pixel of the plurality of background pixels or of all the background pixels.
In various embodiments, the processor 104 may be further configured to differentiate between pixels belonging to the object and pixels belonging to the background. In other words, the processor 104 may determine or identify a pixel as either an object pixel or a background pixel. In this way, object pixels and background pixels may be determined and thus distinguished from each other. Accordingly, an object pixel may be distinguished from a non-object pixel e.g., a background pixel.
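One possible realisation of such differentiation, offered purely as an assumed, non-limiting sketch (the disclosure does not prescribe this particular test), is to compare two frames in which only the background lighting differs: pixels that change markedly are taken as background pixels, while object pixels remain stable. The names and the threshold value below are illustrative.

```python
import numpy as np

# Two frames of the same scene: background lighting varied, object lighting fixed.
# The middle column plays the role of the object; the outer columns, the background.
frame_bg_low  = np.array([[  5, 200,   4],
                          [  6, 198,   5]])
frame_bg_high = np.array([[250, 205, 248],
                          [252, 201, 251]])

def background_mask(frame_low, frame_high, change_threshold=30):
    """Pixels whose intensity changes strongly when only the background
    lighting is varied are classified as background pixels."""
    delta = np.abs(frame_high.astype(np.int32) - frame_low.astype(np.int32))
    return delta > change_threshold   # True -> background pixel

mask = background_mask(frame_bg_low, frame_bg_high)
# mask marks the outer (background) columns True and the object column False
```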
In various embodiments, the lighting may include an object lighting to illuminate (only) the object, and the at least one parameter may include at least one object parameter of the object lighting. The processor 104 may be configured to control the lighting arrangement 102 to vary the at least one object parameter during or for generation of the plurality of images. Additionally, background lighting may be provided to illuminate the background. In the context of various embodiments, the term “background lighting” may mean the lighting that illuminates the background, and/or the space in between the background and the object of interest, without illuminating the object. This means that the object of interest to be imaged is not illuminated by the background lighting, i.e., lighting provided from one or more light sources, to be employed as the background lighting to illuminate the background, does not illuminate the object. The background lighting may be constant (i.e., not variable) or the at least one background parameter of the background lighting may be varied. The at least one object parameter and/or the at least one background parameter may be varied simultaneously or may be varied by the same factor or value. As a non-limiting example, the at least one object parameter and/or the at least one background parameter may correspond to intensity and/or colour. Further, in various embodiments, for the purpose of minimizing or avoiding the effect of the object being illuminated by scattered light from the background, the imaging device (e.g., camera) may be adjusted to low(er) sensitivity so that light scattered from the background that may potentially illuminate the object may not be captured or be observable in one or more images of the plurality of images generated.
In various embodiments, the processor 104 may be further configured to vary the at least one parameter of the lighting through (or over) a plurality of values (associated with the at least one parameter), and, for generating the plurality of images, the processor 104 may be further configured to control the imaging device to generate a respective image of the plurality of images at a respective (different) value of the plurality of values. The plurality of values may be different to one another. This may mean that the lighting for illuminating the object and/or background may be different between two (immediately adjacent) images. For example, the parameter of the lighting may be at or may have a first value for generation of one first image, and a second value for generation of one second image.
As another example, the processor 104 may vary the at least one parameter over a plurality of (different) values at a plurality of intervals to illuminate the object and/or the background, where each value may be associated with a respective interval, and the processor 104 may be configured to generate a respective image of the plurality of images at the respective interval.
The plurality of values may be within or may span a range from value 0 (minimum value) to value 255 (maximum value). In one non-limiting example, the at least one parameter may be varied from value 0 to value 255 at intervals of 1, 5 or any other number. In terms of intensity, the value 0 may represent the minimum intensity or darkness, while the value 255 may represent the maximum intensity or saturation. The values 0 - 255 may also represent the scale or range for colour.
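The sweep over the value range can be sketched as follows. This is an illustrative, non-limiting example; `set_intensity` and `capture_frame` are hypothetical callbacks standing in for whatever interface drives the lighting arrangement and the imaging device.

```python
def sweep_and_capture(set_intensity, capture_frame, step=5):
    """Step the lighting intensity from value 0 (dark) to value 255
    (saturated) and capture one frame per value."""
    frames = []
    for value in range(0, 256, step):
        set_intensity(value)             # drive the lighting arrangement
        frames.append(capture_frame())   # one image per intensity value
    return frames

# Stand-in callbacks for illustration (real ones would drive the LEDs/camera):
current = {"intensity": 0}
frames = sweep_and_capture(
    lambda v: current.update(intensity=v),   # pretend to set the lighting
    lambda: current["intensity"],            # pretend each "frame" is the level
)
# 52 frames: one per intensity value 0, 5, 10, ..., 255
```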
The processor 104 may be further configured to control relative movement between the imaging device and the object during or for generation of the plurality of images. For example, there may be rotational movement between the imaging device and the object. This may allow 360° generation of images of the object.
The imaging device may be placed on a support structure. The support structure may be a movable support structure (e.g., an XYZ motorized stand), and the processor 104 may control movement of the movable support structure, relative to the object.
The apparatus 100 may include a turn table to support the object, and the processor 104 may control movement of the turn table, relative to the imaging device. The object may be placed on the turn table to be rotated. The turn table may be at least substantially transparent or transmissive to light. The lighting arrangement 102 may include at least one curved lighting panel configured to provide a focused lighting towards the object. The at least one curved lighting panel may include a plurality of light sources, e.g., LEDs. Four curved lighting panels may be provided.
The apparatus 100 may further include an actuator configured to communicate with the processor 104, wherein, in response to a single activation of the actuator, operation of the processor 104 may be (fully) automated. The actuator may be a push actuator, e.g., a push button.
The processor 104 may be further configured to identify a flicker effect in the plurality of images generated and to remove the flicker effect. For example, when the imaging device is operated to take videos, flicker effect may be captured in one or more images of the plurality of images due to frequency difference in the respective operations between the imaging device and the lighting arrangement 102. Flicker in the images may be observed continuously. Rather than controlling the lighting to minimise the flicker effect, this may be achieved in the processing of the images. From the plurality of images generated, flicker which is regular and similar may be identified so that the flicker may be eventually removed.
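One assumed, non-limiting way such "regular and similar" flicker might be removed in post-processing (the disclosure does not specify the exact method) is to normalise every frame so its mean brightness matches the average brightness of the whole sequence; all names below are illustrative.

```python
import numpy as np

def remove_flicker(frames):
    """Equalise per-frame mean brightness across the sequence, cancelling
    a regular frame-to-frame brightness oscillation (flicker)."""
    stack = np.stack(frames).astype(np.float32)
    frame_means = stack.mean(axis=(1, 2))          # brightness of each frame
    target = frame_means.mean()                    # sequence-wide average
    gains = target / np.maximum(frame_means, 1e-6) # per-frame correction gain
    return np.clip(stack * gains[:, None, None], 0.0, 255.0)

# A sequence whose even frames are dimmer than its odd frames (regular flicker):
flickery = [np.full((4, 4), 100.0), np.full((4, 4), 120.0),
            np.full((4, 4), 100.0), np.full((4, 4), 120.0)]
steady = remove_flicker(flickery)
# every corrected frame now has mean brightness 110
```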
The processor 104 may be further configured to generate a resultant (or final) image of the object based on (or from or using) the pixels belonging to the object having the respective desired pixel values. In this way, a final image may be generated for the object on the basis of the desired pixel values determined for the object pixels.
In various embodiments, the processor 104 may be further configured to control the lighting arrangement 102 to vary the at least one parameter to adapt the lighting according to the respective desired pixel values (to illuminate the object), and the processor 104 may be further configured to control the imaging device to generate, based on the lighting adapted, a resultant image depicting the object. For example, after determination of the respective desired pixel values, it may be possible to vary the intensity and/or colour of the lighting directed to one or more features (or elements or areas) of or within the object to highlight said feature(s) or to provide the desired (or optimum) lighting condition for the feature(s).
FIG. 1B shows a schematic block diagram of an apparatus 100b for imaging (and/or for image processing), according to various embodiments. The apparatus 100b includes a processor 104b, and a memory 105 coupled to the processor 104b. The memory 105 and the processor 104b may be coupled to each other (as represented by the line 107), e.g., physically coupled and/or electrically coupled. In various embodiments, the memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, to control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during (or for) generation of the plurality of images, and to determine, for a (or each) respective pixel of pixels belonging to the object, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images. The plurality of images may be transferred to and/or stored in the memory 105 or another memory.
In various embodiments, the memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b, for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, to determine, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images. The plurality of images may be transferred to and/or stored in the memory 105 or another memory.
It should be appreciated that description in the context of the apparatus 100 may correspondingly be applicable to the apparatus 100b, and/or description in the context of the processor 104 may correspondingly be applicable to the processor 104b.
FIG. 1C shows a flow chart 120 illustrating a method for imaging (and/or for image processing), according to various embodiments.
At 122, a plurality of images are generated, each of the plurality of images depicting an object of interest.
At 124, at least one parameter of lighting is varied during (or for) generation of the plurality of images.
At 126, for a respective pixel of pixels belonging to the object, a respective desired pixel value is determined from corresponding pixels of the respective pixel through the plurality of images.
The method may include providing the lighting.
The method may further include determining or identifying pixels belonging to the object, prior to determining the respective desired pixel value. The method may further include determining or identifying pixels belonging to the background. The method may further include removing or discarding pixels belonging to the background. All pixels that are determined as background pixels may be removed or discarded.
In various embodiments, at 126, an average pixel value may be determined from the corresponding pixels, the average pixel value being the respective desired pixel value. For determining the average pixel value, the method may include determining a minimum pixel value and a maximum pixel value from the corresponding pixels.
In various embodiments, the method may further include determining, for a respective pixel of pixels belonging to a background (relative to the object), a respective desired pixel value from corresponding pixels of the respective pixel belonging to the background through the plurality of images.
The method may further include differentiating between pixels belonging to the object and pixels belonging to the background.
In the method of various embodiments, the lighting may include an object lighting to illuminate (only) the object, and the at least one parameter may include at least one object parameter of the object lighting. At 124, the at least one object parameter may be varied during or for generation of the plurality of images. Further, in various embodiments, for the purpose of minimizing or avoiding the effect of the object being illuminated by scattered light from the background, the imaging device (e.g., camera) may be adjusted to low(er) sensitivity so that light scattered from the background that may potentially illuminate the object may not be captured or be observable in one or more images of the plurality of images generated.
In various embodiments, at 124, the at least one parameter of the lighting may be varied through a plurality of (different) values, and, at 122, a respective image of the plurality of images may be generated at a respective value of the plurality of values.
The method may further include identifying a flicker effect in the plurality of images generated, and removing the flicker effect.
In various embodiments, a resultant image of the object may be generated based on (or from or using) the pixels belonging to the object having the respective desired pixel values.
In various embodiments, the at least one parameter may be varied to adapt the lighting according to the respective desired pixel values, and, based on the lighting adapted, a resultant image depicting the object may be generated.
FIG. 1D shows a method 130 for imaging (and/or for image processing), according to various embodiments. The method 130 includes, for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, determining, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images. The method may include providing the lighting. It should be appreciated that description in relation to the method described in the context of the flow chart 120 may correspondingly be applicable in relation to the method 130, and vice versa.
It should be appreciated that description in the context of the apparatus 100, 100b may correspondingly be applicable in relation to the method described in the context of the flow chart 120, and the method 130, and vice versa.
Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to control an imaging device to generate a plurality of images depicting an object of interest, to control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during (or for) generation of the plurality of images, and to determine, for a respective pixel of pixels belonging to the object, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
As described above, in various embodiments, for the purpose of minimising or eliminating reflection effect from the object, the intensity of light directed onto the object may be varied (e.g., increased) during capture of different frames such that, from (all) the frames captured, the minimum intensity value and a maximum intensity value (saturation) for each respective pixel corresponding to the object may be determined (where saturation may be reached for different pixels in different frames). A final resultant frame may then be constructed such that each pixel corresponding to the object may carry an intensity value equal to the mean value of the corresponding minimum and maximum intensity values.
In various embodiments, (all) the pixels in the resultant image may be formed with uniform exposure. Frames may be captured with front light increasing from a low to a saturated level. Then, the pixel values that are desired or optimum throughout the frames may be calculated, and the associated pixels combined to obtain uniform exposure.
In further detail, for uniform exposure of (all) the pixels in the image, the object and/or the background may be lit from the front using lighting with an intensity increasing from value 0 to value 255, i.e., from a dark to a saturated frame. In the frames captured, different pixels may become saturated at different values of the input lighting. The minimum and maximum values for (all) the pixels individually may be determined, and the average (or mean) value may be determined to obtain the desired or best pixels with uniform exposure as the output. For example, a plurality of images may be captured at different lighting conditions and a resultant frame may then be formed by calculating the mean of the minimum and maximum pixel values throughout the plurality of images. Therefore, pixel values which are optimal or best throughout the images may be calculated and the associated pixels may then be combined to obtain uniform exposure for the object. In this way, a uniform exposure may be achieved for (all) the pixels without having some dark or saturated pixels.
Further, it should be appreciated that various embodiments work with saturation, determining where it appears and its relation to overall pixel values. Once it is determined which area of an image saturates more than other areas or parts of the image at different light intensities, the group of pixels in that area may be marked and the intensity of the lighting provided to that area may be adjusted to a level best suited to the average pixel intensity of the whole image, and a resultant image may then be captured at the lighting intensity so adjusted.
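Marking the over-saturating area can be sketched as follows. This is an illustrative, non-limiting example only, assuming frames of the intensity sweep are available as NumPy arrays; the comparison against the image-wide average saturation count is one assumed way of deciding which area "saturates more than other areas".

```python
import numpy as np

def oversaturating_mask(frames, sat_value=255):
    """Count, per pixel, in how many frames of the intensity sweep it is
    saturated; pixels saturating more often than the image-wide average
    are marked so the lighting directed at them can be reduced."""
    stack = np.stack(frames)
    sat_counts = (stack >= sat_value).sum(axis=0)  # saturation count per pixel
    return sat_counts > sat_counts.mean()          # True -> over-saturating area

frames = [np.array([[255,  80], [ 90,  70]]),
          np.array([[255, 120], [130, 110]]),
          np.array([[255, 255], [180, 160]])]
mask = oversaturating_mask(frames)
# the top-left pixel saturates in all three frames and is marked True
```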
The apparatus of various embodiments may include two parts: a pixel machine and a main controller, as will be described further below with reference to FIGS. 2A to 2E and 3.
FIGS. 2A to 2E show schematic perspective views of a pixel machine 201 of various embodiments. It should be appreciated that, while one or more features or elements of the pixel machine 201 may not be shown in one or more of FIGS. 2A to 2E for clarity and easier understanding, and/or to illustrate an internal environment of the pixel machine 201, such features/elements may nevertheless be part of the pixel machine 201, as may be apparent from FIGS. 2A to 2E. Further, while LEDs are described in the context of the pixel machine 201, it should be appreciated that other types of light sources may be employed, as long as at least one parameter (e.g., light intensity) of the lighting provided may be variable. Nevertheless, LED lighting is preferable due to its low power consumption, speed of control and long life.
The pixel machine 201 may include a front-left LED panel 202, a front-middle LED panel 204, a front-right LED panel 206, a back side LED panel 207, a left side LED panel 214, a right side LED panel 216, four (key light) LED panels 220 which may be curved, a top LED panel 222, a top-back LED panel 224, a top-left LED panel 225, a top-right LED panel 226, a top-front (key light) LED panel 227, and a bottom LED panel 228, where one or more of these LED panels may define a lighting arrangement (e.g., 102, FIG. 1A). The lighting arrangement may provide omnidirectional lighting, where the intensity of the lighting may be changed.
Each of the LED panels 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a plurality of LEDs (illustrated as squares in the panels and represented as 229 for some LEDs). Each LED panel 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs (RGBW LEDs), which may provide pure white colour, as producing white from RGB in a closed box may be challenging. Each LED panel may be individually or independently controlled. Each LED 229 may be individually or independently addressable.
The lighting arrangement of the pixel machine 201 may provide three types of light: key light, fill light and back light, thereby providing three-point lighting. The lighting arrangement may also provide background lighting. The four LED panels 220 in the corners, the top-back LED panel 224, the top-left LED panel 225, the top-right LED panel 226, and the top-front LED panel 227 may provide key light. The left side LED panel 214 and the right side LED panel 216 may provide fill light. The back side LED panel 207 may provide back light.
An imaging device (e.g., camera) 210, which may be supported on a support structure (e.g., camera stand) 212, may be provided for imaging through an aperture 208 defined in the front-middle LED panel 204. The imaging device 210 may have a zoom feature, meaning that the imaging device 210 is capable of zooming-in and zooming-out operations for generation of images. A turn table 231 may be provided to support one or more (sample) objects (e.g., represented as 230) that are to be imaged. A stepper motor 232 may be provided for rotation of the object(s) 230 or turn table 231. The lighting arrangement of the pixel machine 201 may provide lighting to light up an object (or product) 230 at least substantially equally from all angles so that one or more or all features of the object(s) 230 may have good resolution, contrast and colours.
The four side LED panels 220 (curved or round shape) employ more LEDs to provide sharp and crisp contrast of the object(s) 230. The panels 220 may also help to produce useful shadows and shining areas of the object(s) 230. The curved shape of the LED panels 220 may give a more focused light to the object(s) 230. The panels 220 may also be used to generate different light effects, e.g., highlights, long and softer shadows, the chiaroscuro effect, etc., for photography. The key light provided may be angled such that it may light up any part of an object 230 with control of the intensity of the key light. Each LED 229 may be addressed individually. Such a technique may allow (full) control over object lighting so that an optimal lighting condition or environment suitable for product photography may be determined. Further, in various embodiments, as the object height and length may be detected using a computer application, the optimal or best lighting condition or environment may be determined or suggested automatically with minimal or without any saturation and/or dark spots.
An interface 234 may be provided on the pixel machine 201, for example, for cable connection to the processor to be described below with reference to FIG. 3.
FIG. 3 shows a schematic back view of a main controller 340, according to various embodiments. The main controller 340 may include a processor 342, for example, an industrial PC with embedded LEDs and motor controllers. The main controller 340 may further include a display device (e.g., a touch-screen monitor) 352, a keyboard 354 and a mouse 356.
An actuator, for example, a single capture button 344, may be provided (integrated) with the processor 342. A connector 348 may be provided for connection to a network, e.g., a local area network (LAN). A power connector or socket 350 may also be provided.
An interface (or link connector) 346 may be provided on the processor 342 for communication with the pixel machine 201 (FIGS. 2A to 2E), for example, via the interface 234 (FIG. 2A). In this way, signals may be communicated from the processor 342 to the pixel machine 201, for example, to control movement and operation of the imaging device 210, to vary the parameter of the lighting provided by the lighting arrangement, etc. A cable may be connected between the interfaces 234, 346, e.g., a multicore cable which may carry power, and control signals for LEDs 229, motor(s) (e.g., 232) and the imaging device 210 of the pixel machine 201.
The method for imaging will now be described by way of the following non-limiting examples, and with reference to FIG. 4.
Reflection elimination may be necessary in product photography, for example, when involving highly reflective objects such as jewellery, glass, etc. In the methods for imaging of various embodiments, as the parameter (e.g., intensity) of the lighting provided may be controlled or varied, reflection on the product may also be varied, for example, to a level where there is no longer any reflection, or where it is reduced to its minimum. The (reflection) level may be detected based on the saturation level(s) of pixels: pixels lying at the boundary of the product may be determined or identified, and the light may be increased to allow these pixels to reach their saturation levels or highest intensity level. These pixels may be marked at their best contrast and resolution levels and subsequently used to replace the saturated pixels of the frame(s).
FIG. 4 shows a flow chart 490 illustrating a method for imaging, as an example to minimise reflection. At 491, an object of interest (e.g., a product) may be placed in the apparatus (e.g., 100, FIG. 1A) of various embodiments (e.g., in the pixel machine 201, FIGS. 2A-2E). Lighting of an increasing intensity may be provided to illuminate the object and a plurality of frames may be generated in the process. The lighting may also illuminate the background. A command is sent to the imaging device (e.g., 210) to generate images. The intensity of the lighting may be varied (e.g., increased) such that, in the frames captured, each pixel defining the object and/or the background may initially have a minimum intensity value and progressively reach a saturation value or level. It should be appreciated that respective pixels defining the object and/or the background may reach respective saturation levels in different frames. Further, respective pixels having desired values may be distributed across different frames. Further, in various embodiments, the colour of the lighting on the object and/or the background may also be changed.
As a non-limiting example, the light intensity and/or colour emitted from the associated LEDs or LED panels may be varied under the control of the processor 342 (FIG. 3), which may in turn receive commands via a software application, e.g., based on user input. It should be appreciated that key light may be provided on the object.
As a further non-limiting example, the button 344 (FIG. 3) may be actuated/pressed to perform the product imaging or photography. Actuation of the button 344 may result in automated operations.
At 492, for each respective pixel (e.g., for the object and/or the background) throughout the frames generated, the minimum intensity value and the maximum intensity value may be determined from the corresponding pixels. The average or mean value for each respective pixel may be calculated based on the minimum and maximum intensity values. As illustrated at 493, the mean value defining the final intensity value may be an average of the minimum and maximum intensity values (i.e., (minimum intensity value + maximum intensity value)/2). As described at 494, in this way, the desired (or optimum) value for each respective pixel may be determined, where there is no saturation of the pixels.
At 495, a resultant or final frame may then be generated using pixels that are determined to have the desired (or optimum) values. Therefore, the resultant frame may be defined by a combination of pixels having the desired values. The resultant image may include a uniform exposure of the pixels in the image, e.g., the corresponding pixels defining the object and/or the background, or all the pixels in the image.
In other words, the intensity of lighting directed onto the object and/or the background may be varied (e.g., increased) during capture of different images such that, from the images captured, the minimum intensity value and a maximum intensity value (saturation) for each respective pixel corresponding to the object and/or the background may be determined, where saturation may be reached for different pixels in different frames. A final resultant frame may then be constructed such that each pixel corresponding to the object and/or the background carries an intensity value equal to the mean value of the corresponding minimum and maximum intensity values. As described above, various methods for imaging may create saturation, for the purpose of removing saturation.
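As a rough illustration of the min/max averaging described above, the following pure-Python sketch (the function name and the list-of-lists frame representation are illustrative assumptions, not taken from the application) builds a resultant frame from frames captured under increasing lighting intensity:

```python
def build_resultant_frame(frames):
    # frames: list of 2-D lists of pixel intensities (0-255), captured
    # while the lighting intensity is progressively increased.
    height, width = len(frames[0]), len(frames[0][0])
    result = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            values = [frame[y][x] for frame in frames]
            lo, hi = min(values), max(values)  # hi approximates the saturation level
            result[y][x] = (lo + hi) // 2      # mean defines the final pixel value
    return result
```

For three frames in which a pixel rises from 10 through 120 to a near-saturated 250, the resultant pixel would carry (10 + 250) // 2 = 130, avoiding saturation in the final frame.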
In the context of the methods for imaging disclosed herein, the images may be generated via two modes: photography mode and video (or movie) mode. In movie mode, the imaging device (e.g., camera) 210 may capture frames at a rate of 60 frames per second (may be less or more depending on device speed), and the light intensity and/or colours may be changed during the making of the video. In all operation modes, the auto iris mode of the camera may be turned off so that the effect of the rate of change of light on the background and the product may be captured. The timing of the frames may be marked in the software application, and marked frames may be extracted and used to make a final or resultant 360° still photograph.
The various methods for imaging may be used for making 360° photography of a product by rotation, with clear background elimination in each frame.
In the context of the apparatus and various methods for imaging, as pixels associated with or belonging to the object may be detected, a user may be helped to place the object in the centre of the turn table for a better result in 360° photography. To achieve this, the object may be placed anywhere on the turn table and rotated, and the object pixels may be determined or identified. The image that is desired or best to put in front of the imaging device may be identified, and that image of the product may be shown on a display device as a half tone overlaid with a half tone of live video; this live view helps the user to match the location of the object to the image on screen by overlaying both images. As a non-limiting example, the product may first be detected within the frame. The centre or middle of the product may be determined by dividing its X and Y extents by 2, and a centre line of the product may be drawn. The centre line may be thick, as the middle pixels may not all lie on the same X and Y lines, so the width (threshold of centre) may increase, but in no case may it have a value outside the product boundary. The difference between the current location and where the product should be if placed in the centre of the turn table may be determined, and a half tone image may be shown to the user who may then, where necessary, move the product to the centre of the turn table to obtain a good 360° rotation view.
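The centring aid described above can be sketched as follows; this illustrative pure-Python rendering (the function name, the boolean-mask representation and the bounding-box centre are assumptions, not taken from the application) computes how far the detected product must move to sit at the centre of the frame:

```python
def centring_offset(mask):
    # mask: 2-D list of booleans marking pixels identified as belonging
    # to the product (background already eliminated); assumes at least
    # one product pixel is present.
    coords = [(x, y) for y, row in enumerate(mask)
                     for x, v in enumerate(row) if v]
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    # centre of the product bounding box: divide the X and Y extents by 2
    cx = (min(xs) + max(xs)) // 2
    cy = (min(ys) + max(ys)) // 2
    frame_cx = len(mask[0]) // 2
    frame_cy = len(mask) // 2
    # (dx, dy) the product must move to reach the frame centre
    return frame_cx - cx, frame_cy - cy
```

A positive (dx, dy) would indicate moving the product towards the right and bottom of the frame; the half tone overlay then lets the user perform that move visually.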
In the context of the apparatus and various methods for imaging, each light source (e.g., LED) may be controlled through a touch-screen monitor or by one or more mouse clicks, which helps the user to create different lighting effects. The LEDs may be driven by a DC (direct current) controller to minimise or eliminate the lighting non-sync locking effect. The use of a DC controller may eliminate the flicker caused by most low-cost LED drivers, which, to save cost, drive the LEDs directly from the 50/60 Hz AC line. Each light source (e.g., LED) may be addressed and controlled independently. While the light sources (e.g., LEDs) receive commands from the processor, the associated drivers for the light sources are located in the pixel machine.
Capture of the round edges of products in good resolution is a challenge. In the context of the apparatus and various methods for imaging, LED light control from all angles may help to light up the round edges from the left, right, top and bottom of the product, and pixels with good contrast and resolution may be made part of the final frame.
In the context of the apparatus and various methods for imaging, various lighting sequences may be employed, either individually or in combination. A first lighting sequence may be based on feature or part detection within the object. After the detection of an object as a whole (i.e., determination or identification of object pixels), various features or parts of the object may be detected based on colour. Using LEDs as a non-limiting example, as lighting provided by the LEDs may be controlled (e.g., via the processor or a software application in the processor), it may be possible to determine the specific LEDs or LED panel illuminating a specific individual feature of the object. The intensity and/or colour of the light from said specific LEDs or LED panel, which illuminates said specific individual feature, may be varied to be substantially similar to the colour of the individual feature, which helps to further highlight the individual feature. For example, where the colour of the individual feature is yellow, the associated LED may be controlled to provide yellow-coloured lighting onto the individual feature so that said feature may be clearer and colour rich. Such an approach may also help in determining the saturation level for the pixels corresponding to said specific feature for determining the optimum intensity and/or colour value for said pixels when constructing the final resultant frame. Such an approach may be helpful for image capture of objects having reflective part(s), for example, shiny metal parts and glass products. A second lighting sequence may be based on defined lighting patterns. Different lighting patterns may be provided, for example, round circle spot lighting, line lighting (of different widths or thicknesses), and motion lighting (e.g., up, down, circular) to create different effects in photography, as well as fade-in/fade-out effects, etc., in short-movie mode, as most digital cameras are capable of video recording at high resolution.
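As a minimal sketch of the feature-colour matching in the first lighting sequence (the averaging heuristic and all names are illustrative assumptions; the application does not specify how the matching colour is computed), an LED colour may be suggested by averaging the RGB values of a detected feature's pixels:

```python
def matched_led_colour(feature_pixels):
    # feature_pixels: list of (r, g, b) tuples for pixels identified
    # (e.g., by colour segmentation) as one individual feature.
    n = len(feature_pixels)
    r = sum(p[0] for p in feature_pixels) // n
    g = sum(p[1] for p in feature_pixels) // n
    b = sum(p[2] for p in feature_pixels) // n
    return (r, g, b)  # colour to drive the associated LEDs with
```

For a yellowish feature this would yield a yellowish LED colour, so the feature is lit in substantially its own colour, as in the yellow example above.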
Lighting patterns or effects may be applied in circumstances where the object is stationary or is being rotated 360° during the image capture.
As the user has flexible control of lighting from any or all angles, coupled with the possibility of rotating the object(s), there is no limit on the lighting effects that may be created.
In the context of the various methods for imaging, lighting may be provided to the sides of an object. As lighting may be provided from the front for a shiny object, the edges of the object may become mixed with the background. The front lighting may then be turned off and side lighting may be provided for images to be generated to capture the edges of the object without saturation. For example, lighting may be provided from the left and/or right sides of the object. Subsequently, the images generated with the front lighting and the side lighting may be combined to form the complete object.
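The combination of the front-lit and side-lit images described above can be sketched as follows; this illustrative pure-Python fragment (the names, the single-channel representation and the saturation threshold are assumptions) keeps the front-lit pixel wherever it is below saturation and substitutes the side-lit pixel where the front lighting saturated the edges:

```python
SATURATED = 255  # assumed 8-bit saturation level

def combine_front_and_side(front, side):
    # front, side: 2-D lists of pixel intensities of the same scene,
    # captured with front lighting and side lighting respectively.
    return [[s if f >= SATURATED else f for f, s in zip(frow, srow)]
            for frow, srow in zip(front, side)]
```

In this way the edges captured without saturation under side lighting complete the object captured under front lighting.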
In the context of the apparatus, a camera stand may be mounted on the box defining the pixel machine, which may help to minimise or eliminate vibration and small differences in the detected positions of pixels in different frames.
In the context of the apparatus, the imaging device (e.g., camera) may be mounted on an XYZ motorized stand which may not only automatically zoom but also adjust its physical location according to product size and/or placement on the turn table. The imaging device may be moved up and down, in a linear or curved motion; it may be moved in a curved motion to take a top-view image of the object if desired. This may mean that the front-middle LED panel 204 may be movable together with the imaging device when the XYZ motorized stand is activated to control positioning of the imaging device. The front-middle LED panel 204 may be movable up and down and/or rotated in a curve. Additionally or alternatively, even if the imaging device is moved up and down with XY movement, a curved effect may be achieved using the software application.
In the context of the apparatus and various methods for imaging, as lighting may be provided from any or all angles, the lighting may be turned on in a specific or defined manner to find the desired or best lighting based on pixel resolution and/or saturation levels.
In the context of the apparatus and various methods for imaging, front, back, left and right views of the object may be automatically captured by rotation of the turn table. A user (only) needs to place the object front facing the imaging device. The turn table may then rotate and adjust, for example, by 5-10°, to determine the position where the maximum width of the object is detected. This may help to further eliminate any placement errors by the user, unless the user wants a front view of his or her own choice. Once the best possible front position is detected, the turn table may rotate by any angle, which may be defined in a default setting of the software. For example, the turn table may rotate in 90° steps to capture the left, back and right views of the object for the photoshoot.
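The search for the best front position can be sketched as a simple maximisation over trial angles (the dictionary representation and the function name are illustrative assumptions, not taken from the application):

```python
def best_front_angle(width_at_angle):
    # width_at_angle: mapping of trial turn-table angle (degrees) to the
    # detected pixel width of the object at that angle, measured while
    # the table is stepped through small adjustments (e.g., 5-10 degrees).
    return max(width_at_angle, key=width_at_angle.get)
```

The turn table would then be driven to the returned angle before the 90° view captures begin.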
In the context of the apparatus and various methods for imaging, imaging or photography, and/or elimination of background, may take less than a minute.
In the context of the apparatus and various methods for imaging, as the light intensity may be controlled and/or different colour combinations may be produced, the camera white balance may be adjusted based on RGB colour intensity, which may help in capturing colours close to the natural colours of the object(s) or product(s).
In the context of the apparatus and various methods for imaging, a software application compatible with Windows® and Linux® operating systems may be provided. After the single button 344 is pressed by a user, the object on the turn table may be automatically determined, the camera may be adjusted to its best location, the lighting may be adjusted, object image(s) may be taken or generated, the background may be eliminated, the object may be centred in the middle of the frame, and both raw and compressed versions may be stored, for example, in the “My Pictures” folder in Windows®, on a flash drive connected to the processor, or in a network-addressed folder. Activating the button 344 triggers the automatic scripts written in the software application, which may include predefined template(s) to get most product photographs correct.
In the context of the apparatus and various methods for imaging, an application programming interface (API) platform may be provided, which may be used to write a plugin for a web application supporting single-button operation: with a single press of the button 344, product images in single, quad or 360° form, compressed and optimised for web resolution, may be uploaded directly to e-commerce websites.
As described above, the various methods may provide a uniform exposure by patching the best pixels from the frames.
The various methods may be used in a studio with bigger objects to shoot, including fashion models, clothing and cars. Background lighting may be varied with known values, and pixels may be marked and determined (or identified) as belonging to the product or the background so that there is no necessity for manual editing, e.g., to remove the background. It should be appreciated that the various methods for imaging may be implemented, either individually or in combination, in the apparatus for imaging of various embodiments.
In further embodiments, during generation of the plurality of images, the intensity of the lighting may be initially set for saturation of the pixels, and the intensity may be gradually reduced. A pixel may have red, green and blue components, each of which, when saturated, may have a high or maximum (saturated) value. As the intensity of the lighting is reduced, the value of each colour (red, green, blue) component may decrease. The ratio of the changes in the respective values of the colour components may vary with each reduction in the lighting intensity. At a certain threshold lighting intensity and below, however, this ratio may be constant with each reduction in the lighting intensity. As a non-limiting example, with each reduction of the lighting intensity starting from the threshold lighting intensity, the value of the red component may change by “r1”, the value of the green component by “g1”, and the value of the blue component by “b1”, such that the changes in the respective components follow a constant ratio r1:g1:b1 with each reduction. A pixel with the respective values of the red, green and blue components at the threshold lighting intensity may be identified as the best pixel, i.e., as having the desired pixel value.
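The threshold detection described above can be sketched as follows; this illustrative pure-Python fragment (the names, the sample representation and the cross-multiplication ratio test are assumptions, not from the application) scans samples of a pixel's (R, G, B) values recorded as the lighting intensity is stepped down from saturation, and returns the first sample from which the per-step colour changes keep a constant ratio r1:g1:b1:

```python
def find_threshold_index(samples):
    # samples: list of (r, g, b) tuples for one pixel, ordered from the
    # initial saturated lighting intensity downwards.
    def deltas(i):
        # per-step change of each colour component between samples i and i+1
        (r0, g0, b0), (r1, g1, b1) = samples[i], samples[i + 1]
        return (r0 - r1, g0 - g1, b0 - b1)

    def same_ratio(d_a, d_b):
        # cross-multiplication checks proportionality without division
        return (d_a[0] * d_b[1] == d_a[1] * d_b[0] and
                d_a[1] * d_b[2] == d_a[2] * d_b[1])

    for start in range(len(samples) - 1):
        ref = deltas(start)
        if all(same_ratio(ref, deltas(i))
               for i in range(start + 1, len(samples) - 1)):
            return start  # first sample from which the ratio stays constant
    return len(samples) - 1
```

The pixel values at the returned sample may then be taken as the desired (best) pixel value.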
While the invention has been particularly shown and described with reference to specific embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The scope of the invention is thus indicated by the appended claims and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced.

Claims

1. An apparatus for imaging comprising:
a lighting arrangement configured to provide lighting; and
a processor,
wherein the processor is configured to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, and to control the lighting arrangement to vary at least one parameter of the lighting during generation of the plurality of images, and
wherein the processor is further configured, for a respective pixel of pixels belonging to the object of interest, to determine a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
2. The apparatus as claimed in claim 1, wherein, for determining the respective desired pixel value, the processor is configured, for the respective pixel, to determine an average pixel value from the corresponding pixels, the average pixel value being the respective desired pixel value.
3. The apparatus as claimed in claim 2, wherein, for determining the average pixel value, the processor is configured to determine a minimum pixel value and a maximum pixel value from the corresponding pixels for determining the average pixel value.
4. The apparatus as claimed in any one of claims 1 to 3, wherein the processor is further configured, for a respective pixel of pixels belonging to a background, to determine a respective desired pixel value from corresponding pixels of the respective pixel belonging to the background through the plurality of images.
5. The apparatus as claimed in any one of claims 1 to 3,
wherein the processor is further configured to vary the at least one parameter of the lighting through a plurality of values, and
wherein, for generating the plurality of images, the processor is further configured to control the imaging device to generate a respective image of the plurality of images at a respective value of the plurality of values.
6. The apparatus as claimed in any one of claims 1 to 3,
wherein the processor is further configured to control the lighting arrangement to vary the at least one parameter to adapt the lighting according to the respective desired pixel values, and
wherein the processor is further configured to control the imaging device to generate, based on the lighting adapted, a resultant image depicting the object of interest.
7. An apparatus for imaging comprising:
a processor; and
a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor to:
control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest;
control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images; and
determine, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
8. An apparatus for imaging comprising:
a processor; and
a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor, for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, to:
determine, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
9. A method for imaging comprising:
generating a plurality of images, each of the plurality of images depicting an object of interest;
varying at least one parameter of lighting during generation of the plurality of images; and
determining, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
10. The method as claimed in claim 9, wherein determining a respective desired pixel value comprises determining an average pixel value from the corresponding pixels, the average pixel value being the respective desired pixel value.
11. The method as claimed in claim 10, wherein determining an average pixel value comprises determining a minimum pixel value and a maximum pixel value from the corresponding pixels.
12. The method as claimed in any one of claims 9 to 11, further comprising determining, for a respective pixel of pixels belonging to a background, a respective desired pixel value from corresponding pixels of the respective pixel belonging to the background through the plurality of images.
13. The method as claimed in any one of claims 9 to 11,
wherein varying at least one parameter of the lighting comprises varying the at least one parameter of the lighting through a plurality of values, and
wherein generating a plurality of images comprises generating a respective image at a respective value of the plurality of values.
14. The method as claimed in any one of claims 9 to 11, further comprising generating a resultant image of the object of interest based on the pixels belonging to the object of interest having the respective desired pixel values.
15. The method as claimed in any one of claims 9 to 11, further comprising:
varying the at least one parameter to adapt the lighting according to the respective desired pixel values; and
generating, based on the lighting adapted, a resultant image depicting the object of interest.
16. A method for imaging comprising:
for a plurality of images generated at different values of at least one parameter of lighting, wherein each of the plurality of images depicts an object of interest, determining, for a respective pixel of pixels belonging to the object of interest, a respective desired pixel value from corresponding pixels of the respective pixel through the plurality of images.
17. A computer program or a computer program product comprising instructions which, when executed by a computing device, cause the computing device to carry out a method as claimed in any one of claims 9 to 11.
18. A computer program or a computer program product comprising instructions which, when executed by a computing device, cause the computing device to carry out a method as claimed in claim 16.
PCT/MY2019/000028 2018-07-31 2019-07-24 Apparatus and method for imaging WO2020027646A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI2018001379 2018-07-31
MYPI2018001379 2018-07-31

Publications (2)

Publication Number Publication Date
WO2020027646A2 (en) 2020-02-06
WO2020027646A3 WO2020027646A3 (en) 2020-03-05

Family

ID=69230723




