WO2020027645A2 - Apparatus and method for imaging - Google Patents

Apparatus and method for imaging Download PDF

Info

Publication number
WO2020027645A2
WO2020027645A2 PCT/MY2019/000027
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
images
lighting
background
image
Prior art date
Application number
PCT/MY2019/000027
Other languages
French (fr)
Other versions
WO2020027645A3 (en)
Inventor
Khurram Hamid KHOKHAR
Original Assignee
Kulim Technology Ideas Sdn Bhd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kulim Technology Ideas Sdn Bhd
Publication of WO2020027645A2
Publication of WO2020027645A3

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Definitions

  • Various embodiments relate to an apparatus for imaging and a method for imaging.
  • Photographing products normally requires a professional photographer to spend at least a few hours adjusting the lights to strike a balance between the lighting of the product and the colour of the background.
  • If the product is reflective or multicoloured, and/or some parts of the product have a colour matching or similar to that of the background, the photographer has to make compromises in the lighting and in the quality of the product shoot.
  • an apparatus for imaging may include a lighting arrangement configured to provide lighting, and a processor, wherein the processor is configured to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, and to control the lighting arrangement to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor is further configured, for a respective pixel of pixels defining the at least one image, to determine a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background.
  • an apparatus for imaging may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor to: control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images, determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background.
  • an apparatus for imaging may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor, for a plurality of images, wherein at least one image of the plurality of images depicts an object of interest, and wherein the plurality of images are generated at different values of at least one parameter of lighting, to: determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
  • a method for imaging may include generating a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, varying at least one parameter of lighting during generation of the plurality of images, determining, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determining, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background.
  • a method for imaging may include for a respective pixel of pixels defining at least one image of a plurality of images, wherein the at least one image depicts an object of interest, and wherein the plurality of images are generated at different values of at least one parameter of lighting, determining a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determining, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
  • a computer program or a computer program product may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
  • FIG. 1 A shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
  • FIG. 1B shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
  • FIG. 1C shows a flow chart illustrating a method for imaging, according to various embodiments.
  • FIG. 1D shows a flow chart illustrating a method for imaging, according to various embodiments.
  • FIGS. 2A to 2E show schematic perspective views of a pixel machine of various embodiments from different angles.
  • FIG. 3 shows a schematic back view of a main controller, according to various embodiments.
  • FIGS. 4 to 7 show respective flow charts illustrating methods for imaging, according to various embodiments.
  • FIG. 8 shows schematic plots illustrating changes in pixel values, according to various embodiments.

Detailed Description
  • “A and/or B” may include A or B, or both A and B.
  • Various embodiments may provide an imaging apparatus, for example, a product photography apparatus, e.g., automatic photography equipment for products.
  • The apparatus may minimise the effort and labour associated with photography of products, for example, for the fast-expanding online shopping business.
  • Various embodiments may also provide the corresponding methods for imaging.
  • One or more of the following may be achieved: (1) detection of pixels belonging to object(s) for automatic background elimination (e.g., background cut); (2) detection of saturated pixel(s) to automatically eliminate reflections; (3) detection of, and elimination or maintenance of product shadows; (4) automatic centering of the object(s); (5) elimination of background and shadows of rotating object(s); (6) providing uniform exposure and colour for all the pixels.
  • FIG. 1A shows a schematic block diagram of an apparatus 100 for imaging, according to various embodiments.
  • the apparatus 100 includes a lighting arrangement 102 configured to provide lighting, and a processor 104, wherein the processor 104 is configured to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, and to control the lighting arrangement 102 to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor 104 is further configured, for a respective pixel of pixels defining the at least one image, to determine a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background.
  • an apparatus 100 for imaging may be provided, having a lighting arrangement 102 and a processor 104.
  • the apparatus (or arrangement) 100 may be employed to image or take images of an object of interest, e.g., against a background, and/or of the background in the absence of the object.
  • The term “background” may mean the background relative to the object (in the presence of the object) or said background in the absence of the object, or is otherwise to be construed in the context in which the term is used.
  • the processor 104 may communicate with the lighting arrangement 102, for example, via a channel represented as 106.
  • the processor 104 and the lighting arrangement 102 may be electrically coupled to one another, e.g., via a physical cable 106.
  • the processor 104 may send one or more control signals to the lighting arrangement 102.
  • the lighting arrangement 102 may provide lighting to illuminate the object and/or the background.
  • the lighting arrangement 102 may partially or entirely surround the object to provide lighting from different directions towards the object and/or the background.
  • the lighting arrangement 102 may be arranged in the form of a closed box environment, and the object may be placed within said environment for imaging purposes.
  • the various methods and techniques may be used in an open studio, on a stage, or in a film set, or any other suitable environments or settings.
  • the processor 104 may control an (optical) imaging device (e.g., a (digital) camera capable of taking photographs and/or videos, or a (digital) video recorder) to generate a plurality of images (or frames).
  • the plurality of images may mean 100 or more images.
  • the imaging device may be separately provided or integrally provided with the apparatus 100.
  • the processor 104 may control the imaging device to take at least one image showing the object of interest in the image. It should be appreciated that one or more or all of the plurality of images may depict the object (e.g., against a background), and/or one or more of the plurality of images may depict the background without the object. An equal number of the plurality of images may depict the background without the object and with the object respectively.
  • the plurality of images may be taken or obtained (directly) as still images (e.g., photographic images), and/or may be images extracted from a moving sequence of consecutive graphics (e.g., a moving picture, motion picture, movie).
  • the processor 104 may control the lighting arrangement 102 to vary at least one parameter of the lighting for or during generation of the plurality of images.
  • the at least one parameter may be varied in between generation of two images, or of two immediately adjacent images. This may mean that the at least one parameter of the lighting is different for each of the two images generated.
  • the at least one parameter of the lighting may include any one of or any combination of a lighting intensity, a lighting colour, or a lighting direction.
  • the processor 104 may determine a change in pixel value between a respective pixel and at least one pixel across the plurality of images.
  • the pixels defining the at least one image may mean the pixels defining a portion of the at least one image (e.g., pixels defining an area in the image, the area being smaller than the (outer) boundary of the image), or all of the pixels defining the entirety of the at least one image.
  • the at least one pixel may include a pixel of any image of the plurality of images, for example, a pixel of the at least one image and/or belonging to an image generated immediately adjacent to the at least one image.
  • the at least one pixel of the plurality of images may be a corresponding pixel through or across the plurality of images.
  • The term “corresponding pixel” for an image, in relation to the respective pixel of the at least one image, may mean a pixel corresponding to the same position or coordinate as that of the respective pixel.
  • the change in pixel value may be a difference between a value of the respective pixel and a value of the at least one pixel.
  • the determination of the change in pixel value may be carried out with the respective pixel against another pixel (e.g., background pixel) of the same image, and/or against respective corresponding pixels of one or more (other) images of the plurality of images. It should be appreciated that the determination may be carried out for each of all of the plurality of pixels defining the at least one image. All of the plurality of images may be used for this determination.
  • the change in the pixel value may be defined in terms of a percentage change.
  • the processor 104 may determine whether the respective pixel is a pixel associated with the object (“object pixel”) or belonging to a background (“background pixel”) (e.g., relative to the object). In this way, object pixels and background pixels may be determined.
  • An “object pixel” may mean a pixel related to the object (e.g., the object pixel defines, in the image, the physical part of the object, and/or a shadow of the object), while a “background pixel” means a pixel defining or belonging to the background, either in the presence of the object or in the absence of the object. In this way, object pixels and background pixels may be determined or identified, and thus differentiated from each other.
  • To “determine” the respective pixel as a pixel belonging to the object or a background may include to “identify” the respective pixel as such.
  • this may not necessarily mean to (positively) mark or tag the respective pixel as an object pixel or a background pixel, although this may be the case in various embodiments.
  • The identification of the respective pixel may mean the process by which the respective pixel is determined or inferred to be an object pixel or a background pixel based on the result of the determination of the change in pixel value.
  • a certain or predetermined threshold value may be set, and the change in the pixel value may be compared against the threshold value. For example, if the change in the pixel value is less than the threshold value, the respective pixel may be determined or identified as an object pixel; otherwise, the respective pixel may be determined or identified as a background pixel.
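The threshold comparison above can be sketched in a few lines; this is a minimal illustration, assuming the lighting sweep has been captured as a NumPy stack of grayscale frames (the function name, array layout, and threshold handling are illustrative and not specified by the patent):

```python
import numpy as np

def classify_pixels(frames: np.ndarray, threshold: float) -> np.ndarray:
    """Classify each pixel as object (True) or background (False).

    frames: grayscale images of one lighting sweep, shape (n_images, H, W).
    A pixel whose value changes little across the sweep is taken to be an
    object pixel; a pixel that tracks the varying background lighting
    (large change) is taken to be a background pixel.
    """
    # Per-pixel change across the stack: range of observed values.
    change = frames.max(axis=0) - frames.min(axis=0)
    # Change below the threshold -> object pixel; otherwise background.
    return change < threshold
```

Here the change in pixel value is taken as the value range across the sweep; a percentage change relative to a reference frame, as also contemplated above, would be a one-line variation.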
  • The phrase “change in pixel value” may refer to a rate of change of pixel value.
  • The phrase “rate of change of pixel value” may mean the change in the pixel value through or across the plurality of images. This may mean that the term “rate” herein may refer to the change in pixel value determined progressively from one image to another image through the plurality of images. While the change in pixel value may vary with time, this is due to a change in the at least one parameter of the lighting over time for or during generation of the plurality of images, rather than being a direct result of the passage of time.
  • the difference between object pixels and background pixels may be widened, which may help in determining a pixel as being associated with the object or belonging to the background.
  • the lighting arrangement 102 may include a plurality of light sources to provide the lighting.
  • Each light source may be or may include a light emitting diode or device (LED).
  • the plurality of light sources may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs.
  • RGBW LEDs may provide a pure white colour.
  • any other types of light sources may be used, as long as the light sources may be controlled to vary at least one parameter (e.g., light intensity) of the lighting provided by the light sources.
  • Each light source (e.g., LED) may be individually addressable. This may mean that each light source may be independently switched on/off, and controlled to vary the at least one parameter of the lighting.
  • the lighting arrangement 102 may include a plurality of (separate) lighting panels to provide lighting from a plurality of directions to the object and/or background.
  • Each lighting panel may include a plurality of light sources, for example, LEDs, e.g., a combination of red LEDs, green LEDs, blue LEDs, and white LEDs.
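As a toy illustration of individually addressable RGBW light sources grouped into panels, a panel might be modelled as an array of per-LED intensity values; the class and method names are invented for this sketch, since the patent does not define a software interface for the driver arrangement:

```python
import numpy as np

class LightingPanel:
    """Toy model of one lighting panel of individually addressable
    RGBW light sources (illustrative only)."""

    def __init__(self, n_leds: int):
        # One (R, G, B, W) intensity per LED, each in 0..255.
        self.levels = np.zeros((n_leds, 4), dtype=np.uint8)

    def set_led(self, index: int, r: int, g: int, b: int, w: int) -> None:
        # Address a single light source independently.
        self.levels[index] = (r, g, b, w)

    def set_all(self, r: int, g: int, b: int, w: int) -> None:
        # Drive the whole panel uniformly, e.g. for a background sweep.
        self.levels[:] = (r, g, b, w)
```

A driver arrangement would translate such per-LED values into actual drive currents; that hardware layer is out of scope here.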
  • the lighting arrangement 102 may be or may form part of a pixel machine.
  • a driver arrangement may be provided (e.g., in the pixel machine) for driving the lighting arrangement 102.
  • the driver arrangement may have a plurality of drivers for driving the associated light sources of the lighting arrangement 102.
  • the processor 104 may be provided with one or more communication interfaces, e.g., including at least one interface for communication with the lighting arrangement 102 or the pixel machine.
  • the processor may be or may form part of a (main) controller.
  • the (main) controller may further include a display and other peripheral devices, e.g., a keyboard, a mouse, etc.
  • the pixel machine and the (main) controller may be comprised in the apparatus 100.
  • the at least one pixel may include at least one corresponding pixel of the respective pixel through the plurality of images.
  • the processor 104 may be further configured to determine the change in pixel value between the respective pixel and the at least one corresponding pixel, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to the background.
  • the processor 104 may be configured to vary the at least one parameter of the lighting through (or over) a plurality of values (associated with the at least one parameter). For generating the plurality of images, the processor 104 may be configured to control the imaging device to generate one or more images including the at least one image at a respective (different) value of the plurality of values.
  • the plurality of values may be different to one another. This may mean that the lighting for illuminating the object and/or background may be different between two (immediately adjacent) images.
  • the parameter of the lighting may be at or may have a first value for generation of at least one first image, and a second value for generation of at least one second image.
  • the processor 104 may vary the at least one parameter over a plurality of (different) values at a plurality of intervals to illuminate the object and/or background, where each value may be associated with a respective interval, and the processor 104 may be configured to generate one or more images of the plurality of images at the respective interval.
  • the plurality of values may be within or may span a range from value 0 (minimum value) to value 255 (maximum value).
  • the at least one parameter may be varied from value 0 to value 255 at intervals of 1, 5 or any other number.
  • the value 0 may represent the minimum intensity or darkness, while the value 255 may represent the maximum intensity or saturation.
  • the values 0 - 255 may also represent the scale or range for colour.
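The sweep of a lighting parameter from value 0 to value 255 at fixed intervals, with an image captured at each value, might look as follows; `set_parameter` and `capture` are hypothetical callables standing in for the lighting-arrangement and imaging-device interfaces, which the patent does not specify:

```python
def sweep_and_capture(set_parameter, capture, step: int = 5):
    """Vary one lighting parameter over 0..255 in fixed steps and
    capture one image at each value.

    Returns the list of parameter values and the images, in order.
    """
    values = list(range(0, 256, step))
    if values[-1] != 255:
        values.append(255)  # always include the saturation value
    images = []
    for v in values:
        set_parameter(v)    # e.g. background lighting intensity
        images.append(capture())
    return values, images
```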
  • the lighting may include a background lighting to illuminate the background, and the at least one parameter may include at least one background parameter of the background lighting.
  • the processor 104 may be configured to control the lighting arrangement 102 to vary the at least one background parameter during generation of the plurality of images. For example, the processor 104 may vary the intensity and/or colour (as the background parameter) of the background lighting. As a non-limiting example, the intensity of the background lighting may be increased, e.g., from value 0 (dark) to value 255 (saturated).
  • the lighting may further include an object lighting (to illuminate the object) which may be maintained constant, or, at least one parameter thereof may be varied.
  • The phrase “background lighting” means the lighting that illuminates the background, and/or the space in between the background and the object of interest, without illuminating the object.
  • the object of interest to be imaged is not illuminated by the background lighting, i.e., lighting provided from one or more light sources, to be employed as the background lighting to illuminate the background, does not illuminate the object.
  • the imaging device may be adjusted to low(er) sensitivity so that light scattered from the background that may potentially illuminate the object may not be captured or be observable in one or more images of the plurality of images generated.
  • the processor 104 may be configured to vary the at least one background parameter of the background lighting through a plurality of values, and further configured to control the imaging device to generate a respective image of the plurality of images at a respective value of the plurality of values, wherein the respective image depicts the object of interest.
  • the processor 104 may be further configured, for each respective image through the plurality of images, to determine a change in pixel value between a respective corresponding pixel at an edge region of the object depicted in the respective image and a respective corresponding background pixel adjacent to the edge region.
  • the processor 104 may be further configured, from the changes in pixel value determined for the respective corresponding pixel through the plurality of images, to determine the respective corresponding pixel as a pixel associated with the object or a pixel belonging to the background.
  • the edge region of the object may refer to an area including an (perceived) edge or boundary of the object.
  • the edge region may include an area of between one and twenty pixels (e.g., one, two, three, five, ten, or twenty pixels) away from a pixel at or corresponding to the edge of the object.
  • a background pixel adjacent to the edge region may refer to a background pixel that may be between one and twenty pixels (e.g., one, two, three, five, ten, or twenty pixels) away from the pixel at or corresponding to the edge of the object.
  • the at least one background parameter of the background lighting may be varied from value 0 to value 255.
  • the change in pixel value between the corresponding pixel A1 and a respective corresponding background pixel (e.g., B1) adjacent to the edge region may be determined for each image of the plurality of images generated from value 0 to value 255, and the result may be as illustrated in FIG. 8.
  • the change in pixel value may reach a maximum value at a background parameter value, i1, of the at least one background parameter. Based on such a change, the pixel A1 may be determined or identified as a pixel associated with the object.
  • the result for pixel A3 at the edge region of the object may be as illustrated in FIG. 8, where the change in pixel value may reach a maximum value at a background parameter value, i3, and the pixel A3 may be determined or identified as a pixel associated with the object.
  • the result for pixel A2 at the edge region of the object may be as illustrated in FIG. 8, where the change in pixel value may be minimal as the pixel values for pixel A2 and the associated corresponding background pixel may be at least substantially similar, and, as a result, the pixel A2 may be determined or identified as a pixel belonging to the background.
  • the respective corresponding background pixel may be adjacent to the associated corresponding pixel for determination of the change in pixel value, for example, between one and twenty pixels (e.g., one, two, three, five, ten, or twenty pixels) away from the associated corresponding pixel. It should be appreciated that, while i1, i3, and in are illustratively shown as being of different values, any two or more background parameter values may be of the same value.
  • the processor 104 may be further configured to determine, from the changes in pixel value, a desired change in pixel value (e.g., a maximum change) for the respective corresponding pixel determined as being associated with the object, and further configured to generate a resultant image depicting the object including the determined respective corresponding pixel at the respective value corresponding to the desired change in pixel value. For example, using the results of FIG. 8, the processor 104 may generate a resultant image including pixel A1 at the background parameter value, i1, pixel A3 at the background parameter value, i3, and so on with any other pixel determined as being associated with the object at the particular background parameter value with the desired change in pixel value.
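The edge-region analysis and the compositing step above can be sketched together: assuming grayscale frames stacked over the background-lighting sweep, each pixel is compared with a neighbouring (assumed background) pixel in every frame, the sweep value giving the maximum change is recorded, and object pixels are assembled into a resultant image from the frames where their change peaked. The array layout, the fixed neighbour offset, and the wrap-around at image borders are simplifications of this sketch, not features of the patent:

```python
import numpy as np

def best_value_per_pixel(frames: np.ndarray, bg_offset=(0, 1)):
    """For each pixel, compare it with an adjacent pixel (bg_offset away,
    assumed to lie in the background) in every frame of the sweep.
    Returns, per pixel, the frame index of the maximum change and that
    maximum change. frames: (n, H, W) grayscale stack."""
    dy, dx = bg_offset
    neighbour = np.roll(frames, shift=(-dy, -dx), axis=(1, 2))
    change = np.abs(frames - neighbour)   # (n, H, W)
    return change.argmax(axis=0), change.max(axis=0)

def composite(frames: np.ndarray, best_idx: np.ndarray,
              object_mask: np.ndarray) -> np.ndarray:
    """Build a resultant image taking each object pixel from the frame
    where its change peaked; background pixels are zeroed (removed)."""
    rows, cols = np.indices(frames.shape[1:])
    out = frames[best_idx, rows, cols]
    return np.where(object_mask, out, 0)
```

An object/background decision can then be made by thresholding the returned maximum change, in the spirit of FIG. 8: pixels such as A1 and A3 show a clear peak, while a pixel like A2 barely changes.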
  • the lighting may include an object lighting to illuminate the object, and a background lighting to illuminate the background, and the at least one parameter may include at least one object parameter of the object lighting, and at least one background parameter of the background lighting.
  • the processor 104 may be configured to control the lighting arrangement 102 to vary the at least one object parameter and the at least one background parameter relative to each other during generation of the plurality of images.
  • the respective colours of the object lighting and the background lighting may be different.
  • the processor 104 may be further configured to remove or discard pixels determined or identified as belonging to the background. All pixels that are determined as background pixels may be removed.
  • the processor 104 may be further configured to control relative movement between the imaging device and the object, and the processor 104 may be further configured to control the imaging device to generate the plurality of images from a plurality of directions relative to the object. For example, there may be rotational movement between the imaging device and the object. This may allow 360° generation of images of the object.
  • the imaging device may be placed on a support structure.
  • the support structure may be a movable support structure (e.g., an XYZ motorized stand), and the processor 104 may control movement of the movable support structure, relative to the object.
  • the apparatus 100 may include a turn table to support the object, and the processor 104 may control movement of the turn table, relative to the imaging device.
  • the turn table may be at least substantially transparent or transmissive to light.
  • the object may be placed on the turn table to be rotated to allow generation of images of the object at different angles of rotation.
  • the lighting arrangement 102 may include at least one curved lighting panel configured to provide a focused lighting onto the object.
  • the at least one curved lighting panel may include a plurality of light sources, e.g., LEDs. Four curved lighting panels may be provided.
  • the apparatus 100 may further include an actuator configured to communicate with the processor 104, wherein in response to a single activation of the actuator, operation of the processor 104 may be (fully) automated.
  • the actuator may be a push actuator, e.g., a push button.
  • the processor 104 may be further configured to control the imaging device to generate a number of images (or multiple images), each image depicting the object, and the processor 104 may be further configured to control the lighting arrangement 102 to vary a direction (or angle) of the lighting for or during generation of the number of images.
  • the processor 104 may be further configured, for each respective pixel of pixels determined as being associated with the object, to determine another change in pixel value between the respective pixel and at least one corresponding pixel of the number of images, and the processor 104 may be further configured to determine, based on the other change in pixel value determined, the respective pixel as a pixel belonging to the object or belonging to a shadow of the object.
  • a pixel belonging to the object refers to the pixel that defines, in the image, the physical or actual or tangible part of the object.
  • lighting may be provided from a plurality of directions onto the object. In this way, as shadow is generally formed on a side opposite to a source of lighting, the shadow that is formed moves about depending on the direction of the lighting, while the object that is lit remains stationary.
  • the processor 104 may be further configured to remove or discard one or more pixels (“shadow pixel(s)”) determined or identified as belonging to the shadow of the object. All pixels determined as belonging to the shadow may be removed. Nevertheless, it should be appreciated that one or more or all shadow pixels may be maintained or kept.
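Separating the physical object from its shadow by varying the lighting direction can be sketched as follows: since the shadow moves between direction-varied frames while the lit object stays put, a shadow pixel shows a large change across those frames. The function name and threshold are illustrative, not from the patent:

```python
import numpy as np

def split_object_and_shadow(direction_frames: np.ndarray,
                            object_mask: np.ndarray,
                            threshold: float):
    """Among pixels already determined as associated with the object,
    separate the physical object from its shadow.

    direction_frames: (n, H, W) grayscale frames, each taken with a
    different lighting direction. Stable pixels -> object body;
    strongly varying pixels -> moving shadow.
    """
    change = direction_frames.max(axis=0) - direction_frames.min(axis=0)
    body = object_mask & (change < threshold)
    shadow = object_mask & (change >= threshold)
    return body, shadow
```

The returned shadow pixels may then be removed or kept, as described above.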
  • the processor 104 may be further configured to identify a flicker effect in the plurality of images generated and to remove the flicker effect. For example, when the imaging device is operated to take videos, flicker effect may be captured in one or more images of the plurality of images due to frequency difference in the respective operations between the imaging device and the lighting arrangement 102. Flicker in the images may be observed continuously. Rather than controlling the lighting to minimise the flicker effect, this may be achieved in the processing of the images. From the plurality of images generated, flicker which is regular and similar may be identified so that the flicker may be eventually removed.
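The patent only states that regular, similar flicker is identified across the images and removed in processing; one common way to realise this, offered here purely as an assumed sketch, is a per-pixel temporal median over the stack, which recovers the baseline brightness when flicker perturbs individual frames:

```python
import numpy as np

def remove_flicker(frames: np.ndarray) -> np.ndarray:
    """Suppress regular flicker in a stack of video-derived frames.

    Assumes the flicker swings each frame's brightness around a common
    baseline; the per-pixel temporal median recovers that baseline.
    frames: (n, H, W) grayscale stack.
    """
    return np.median(frames, axis=0)
```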
  • the processor 104 may be further configured to generate a resultant image of the object based on (or using) pixels determined or identified as being associated with the object.
  • the processor 104 may be further configured to determine, from pixels determined as being associated with the object, respective pixels having respective desired (or optimum) pixel values for generation of the resultant image. From the object pixels determined, pixels (“desired pixels”) corresponding to the desired or optimum lighting condition, thereby having the desired (or optimum) pixel values, may be determined and the resultant image generated using such desired pixels. It should be appreciated that desired pixels of the object may be derived from different images of the plurality of images.
  • the processor 104 may be further configured to determine, from pixels determined as associated with the object, respective pixels having respective desired (or optimum) pixel values.
  • the processor 104 may be further configured to control the lighting arrangement 102 to vary the at least one parameter to adapt the lighting according to the respective desired pixel values determined (to illuminate the object).
  • the processor 104 may be further configured to control the imaging device to generate, based on the lighting adapted, a resultant image depicting the object.
  • the intensity and/or colour of the lighting directed to one or more features (or elements or areas) of or within the object may be varied to highlight said feature(s) or to provide the desired (or optimum) lighting condition for the feature(s), for generation of the resultant image.
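The selection of desired pixels from different images, described above, can be sketched as follows. This is a hedged illustration assuming Python with NumPy; the mid-grey exposure target and the name `compose_resultant` are assumptions for the example, since the disclosure leaves the "desired (or optimum) pixel value" criterion open.

```python
import numpy as np

def compose_resultant(frames, target=128):
    """For each pixel position, pick the value from whichever frame is
    closest to a target exposure (mid-grey here, as an illustrative
    'desired pixel value'), and assemble the resultant image from
    desired pixels drawn from different images."""
    stack = np.asarray(frames, dtype=float)          # (N, H, W)
    best = np.abs(stack - target).argmin(axis=0)     # index of best frame per pixel
    h, w = stack.shape[1:]
    rows, cols = np.indices((h, w))
    return stack[best, rows, cols]                   # gather per-pixel winners
```

Note that adjacent pixels of the resultant image may come from different source images, matching the statement that desired pixels of the object may be derived from different images of the plurality of images.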
  • FIG. 1B shows a schematic block diagram of an apparatus 100b for imaging (and/or for image processing), according to various embodiments.
  • the apparatus 100b includes a processor 104b, and a memory 105 coupled to the processor 104b.
  • the memory 105 and the processor 104b may be coupled to each other (as represented by the line 107), e.g., physically coupled and/or electrically coupled.
  • the memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images, determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background.
  • the plurality of images may be transferred to and/or stored in the memory 105 or another memory.
  • the memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b, for a plurality of images, wherein at least one image of the plurality of images depicts an object of interest, and wherein the plurality of images are generated at different values of at least one parameter of lighting, to determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
  • the plurality of images may be transferred and/or stored in the memory 105 or another memory.
  • description in the context of the apparatus 100 may correspondingly be applicable to the apparatus 100b, and description in the context of the processor 104 may correspondingly be applicable to the processor 104b.
  • object may include a living thing (e.g., person, animal, plant, etc.) and/or a non-living thing (e.g., a product, item, inanimate body or object, etc.).
  • FIG. 1C shows a flow chart 120 illustrating a method for imaging (and/or for image processing), according to various embodiments.
  • a plurality of images are generated, wherein at least one image of the plurality of images generated depicts an object of interest.
  • at least one parameter of lighting is varied during generation of the plurality of images.
  • a change in pixel value is determined between the respective pixel and at least one pixel of the plurality of images.
  • the respective pixel is determined as a pixel associated with the object or a pixel belonging to a background (relative to the object).
  • the method may include providing the lighting.
  • the at least one pixel may include at least one corresponding pixel of the respective pixel through the plurality of images.
  • the change in pixel value between the respective pixel and the at least one corresponding pixel may be determined.
  • the respective pixel may be determined as a pixel associated with the object or a pixel belonging to the background.
  • the at least one parameter of the lighting may be varied through a plurality of (different) values, and, at 122, one or more images including the at least one image may be generated at a respective value of the plurality of values.
  • first and second images depicting the background with and without the object respectively may be generated at the respective value.
  • the (only) difference is the presence or absence of the object.
  • the first image may depict the object against a background and the second image may depict the corresponding background in the absence of the object.
  • the first image may be comprised in or within the definition of the at least one image that is generated depicting the object of interest.
  • a change in pixel value may be determined between the respective pixel and a corresponding pixel of the second image.
  • the respective pixel of the first image may be determined as a pixel associated with the object or a pixel belonging to the background.
  • the lighting may include a background lighting to illuminate the background, and, at 124, at least one background parameter of the background lighting may be varied for or during generation of the plurality of images.
  • the at least one background parameter may be varied through a plurality of values.
  • a respective image of the plurality of images may be generated at a respective value of the plurality of values, wherein the respective image depicts the object of interest.
  • a change in pixel value may be determined between a respective corresponding pixel at an edge region of the object depicted in the respective image and a respective corresponding background pixel adjacent to the edge region.
  • the respective corresponding pixel may be determined as a pixel associated with the object or a pixel belonging to the background.
  • the method may further include determining, from the changes in pixel value, a desired change in pixel value for the respective corresponding pixel determined or identified as being associated with the object, and generating a resultant image depicting the object including the determined respective corresponding pixel at the respective value corresponding to the desired change in pixel value.
  • the lighting may include an object lighting to illuminate the object, and a background lighting to illuminate the background, and, at 124, at least one object parameter of the object lighting and at least one background parameter of the background lighting may be varied relative to each other for or during generation of the plurality of images. Pixels determined or identified as belonging to the background may be removed or discarded. All pixels that are determined as background pixels may be removed.
  • the plurality of images may be generated from a plurality of directions relative to the object.
  • the method may further include generating a number of images (or multiple images), each image depicting the object, varying a direction of the lighting for or during generation of the number of images, determining, for each respective pixel of pixels determined as being associated with the object, another change in pixel value between the respective pixel and at least one corresponding pixel of the number of images, and determining, based on the other change in pixel value determined, the respective pixel as a pixel belonging to the object or a shadow of the object. One or more or all pixels determined or identified as belonging to the shadow may be removed or discarded.
  • the method may further include identifying a flicker effect in the plurality of images generated, and removing the flicker effect.
  • a resultant image of the object may be generated based on pixels determined as being associated with the object.
  • the method may further include determining, from the pixels determined as being associated with the object, respective pixels having respective desired pixel values for generation of the resultant image.
  • the method may further include determining, from pixels determined as being associated with the object, respective pixels having respective desired pixel values, varying the at least one parameter to adapt the lighting according to the respective desired pixel values determined, and generating, based on the lighting adapted, a resultant image depicting the object.
  • FIG. 1D shows a flow chart 130 illustrating a method for imaging (and/or for image processing), according to various embodiments.
  • a change in pixel value is determined between the respective pixel and at least one pixel of the plurality of images.
  • the respective pixel is determined as a pixel associated with the object or a pixel belonging to a background (relative to the object).
  • the method may include providing the lighting.
  • Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
  • Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, to control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images, to determine, for a respective pixel of the pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background.
  • the apparatus of various embodiments may include two parts: a pixel machine and a main controller, as will be described further below with reference to FIGS. 2A to 2E and 3.
  • FIGS. 2A to 2E show schematic perspective views of a pixel machine 201 of various embodiments. It should be appreciated that while one or more features or elements of the pixel machine 201 may not be shown in one or more of FIGS. 2A to 2E for clarity and easier understanding purposes, and/or to illustrate an internal environment of the pixel machine 201, such features/elements may nevertheless be part of the pixel machine 201 as may be apparent from FIGS. 2A to 2E. Further, while LEDs are described in the context of the pixel machine 201, it should be appreciated that other types of light sources may be employed, as long as at least one parameter (e.g., light intensity and/or colour) of the lighting provided may be variable. Nevertheless, LED lighting is preferable due to its characteristics of low power consumption, speed of control and long life.
  • the pixel machine 201 may include a front-left LED panel 202, a front-middle LED panel 204, a front-right LED panel 206, a back side LED panel 207, a left side LED panel 214, a right side LED panel 216, four (key light) LED panels 220 which may be curved, a top LED panel 222, a top-back LED panel 224, a top-left LED panel 225, a top-right LED panel 226, a top-front (key light) LED panel 227, and a bottom LED panel 228, where one or more of these LED panels may define a lighting arrangement (e.g., 102, FIG. 1A).
  • the lighting arrangement may provide omnidirectional lighting, where the intensity of the lighting may be changed.
  • Each of the LED panels 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a plurality of LEDs (illustrated as squares in the panels and represented as 229 for some LEDs).
  • Each LED panel 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs (RGBW LEDs), which may provide pure white colour as producing white from RGB in a closed box may be challenging.
  • Each LED panel may be individually or independently controlled.
  • Each LED 229 may be individually or independently addressable.
  • the lighting arrangement of the pixel machine 201 may provide three types of lights: key light, fill light and back light, thereby providing three-point lighting.
  • the lighting arrangement may also provide background lighting.
  • the four LED panels 220 in the corner, the top-back LED panel 224, the top-left LED panel 225, the top-right LED panel 226, and the top-front LED panel 227 may provide key light.
  • the left side LED panel 214 and the right side LED panel 216 may provide fill light.
  • the back side LED panel 207 may provide back light.
  • An imaging device (e.g., camera) 210, which may be supported on a support structure (e.g., camera stand) 212, may be provided for imaging through an aperture 208 defined in the front-middle LED panel 204.
  • the imaging device 210 may have a zoom feature (e.g., including at least one of optical zoom or digital zoom), meaning that the imaging device 210 is capable of zooming-in and zooming-out operations for generation of images.
  • a turn table 231 may be provided to support one or more (sample) objects (e.g., represented as 230) that are to be imaged.
  • a stepper motor 232 may be provided for rotation of the object(s) 230 or turn table 231.
  • the lighting arrangement of the pixel machine 201 may provide lighting to light up an object (or product) 230 at least substantially equally from all angles so one or more or all features of the object(s) 230 may have good resolution, contrast and colours.
  • the four side LED panels 220 (curved or round shape) employ more LEDs to provide sharp and crisp contrast of the object(s) 230.
  • the panels 220 may also help to produce useful shadows and shining areas on the object(s) 230.
  • the curved shape of the LED panels 220 may give more focused light to the object(s) 230.
  • the panels 220 may also be used to generate different light effects, e.g., highlights, long and softer shadows, the chiaroscuro effect, etc., for photography.
  • the key light provided may be angled such that it may light up any part of an object 230 with control of the intensity of the key light.
  • Each LED 229 may be addressed individually.
  • Such a technique may allow (full) control over object lighting so that an optimal lighting condition or environment suitable for product photography may be determined.
  • the optimal or best lighting condition or environment may be determined or suggested automatically with minimal or without any saturation and/or dark spots.
  • An interface 234 may be provided on the pixel machine 201, for example, for cable connection to the processor to be described below with reference to FIG. 3.
  • FIG. 3 shows a schematic back view of a main controller 340, according to various embodiments.
  • the main controller 340 may include a processor 342, for example, an industrial PC with embedded LEDs and motor controllers.
  • the main controller 340 may further include a display device (e.g., a touch-screen monitor) 352, a keyboard 354 and a mouse 356.
  • An actuator for example, a single capture button 344, may be provided (integrated) with the processor 342.
  • a connector 348 may be provided for connection to a network, e.g., a local area network (LAN).
  • a power connector or socket 350 may also be provided.
  • An interface (or link connector) 346 may be provided on the processor 342 for communication with the pixel machine 201 (FIGS. 2A to 2E), for example, via the interface 234 (FIG. 2A). In this way, signals may be communicated from the processor 342 to the pixel machine 201, for example, to control movement and operation of the imaging device 210, to vary the parameter of the lighting provided by the lighting arrangement, etc.
  • a cable may be connected between the interfaces 234, 346, e.g., a multicore cable which may carry power, and control signals for LEDs 229, motor(s) (e.g., 232) and the imaging device 210 of the pixel machine 201.
  • a product may be placed in the apparatus (e.g., 100) of various embodiments (e.g., in the pixel machine 201).
  • the button 344 may then be actuated/pressed to perform product photography. If the button 344 is pressed for more than 5 seconds, 360° photography may be carried out.
  • the RGB LEDs providing background lighting light up with varying colour and/or intensity values.
  • the light intensity and/or colour emitted from the associated LEDs or LED panels may be varied under the control of the processor 342 which may in turn take command via or from a software application, e.g., based on user input.
  • key light may be provided on the product. Further, depending on requirements, the (intensity and/or colour of) key light on the product may be changed and the background light maintained as static, or vice versa.
  • command is sent to the imaging device (e.g., 210) to capture or generate frames or images at different intervals. Respective images may be generated with and without the product at respective intervals.
  • each pixel in the respective frames with and without the product may be compared to each other. Such respective frames may be taken at the same lighting condition (e.g., same colour and/or intensity value).
  • a change or difference between the pixel values of a pixel in a frame with the object (Pixel A) and a pixel in a frame without the object (PixelB) may be determined.
  • the change in the pixel values of PixelA and PixelB may be computed based on respective changes between the pixel values of each of the red component, green component and blue component of PixelA and PixelB (i.e., PixelA.Red and PixelB.Red, PixelA.Green and PixelB.Green, PixelA.Blue and PixelB.Blue).
  • the change in the pixel value between PixelA and PixelB may be expressed as a percentage change.
  • the red component of a pixel may mean the pixel of an image generated under (only) red light condition, or the red component of the pixel of an image generated under white light condition. This applies correspondingly to the green component and the blue component.
  • the percentage change determined at 472 may be compared against a threshold value.
  • the condition may be set as to whether the percentage change is less than the threshold value (percentage < threshold). If the result “percentage < threshold” is determined, the pixel is determined or identified as being associated with the product at 476. Otherwise, the pixel is determined or identified as belonging to the background at 478.
  • the threshold value may be set via the corresponding software application, and may be set depending on the chosen rate of change to be applied. As the lighting may be varied with known value(s), the expected rate of change between product pixels (or object pixels) and background pixels may be determined. As a non-limiting example, a threshold of 10 percent may be set.
  • one frame may be captured at, for example, 20% background and 40% product light.
  • the background lighting intensity may be reduced to 10% background and the product lighting may be increased to 50%.
  • the expected difference between the product pixels and background pixels may be 20%; however, to provide a buffer, the threshold may be set to 10%, being 50% of the actual expected change of 20%. Accordingly, the pixels may be easily determined as product pixels or background pixels.
  • the method may include varying at least the light intensity on the background. LED lights may be used in a closed box environment, e.g., within the pixel machine 201 (or may be used in a studio setting/environment).
  • At least 50 to 100 frames with different light levels may be generated within 30 to 60 seconds (depending on device capture speed per second). These frames may be later compared to find the rate of change of pixel colour values and/or intensity values. As the rate of change of pixels of the background is different than the rate of change of pixels of the product under shoot, pixels associated with the product or belonging to the background may be determined, and thus, separated from one another.
  • images may be generated starting from value 0, which is a completely dark background, to value 255, which is completely saturated.
  • 100 images or more may be captured between values 0 and 255 for the background lighting.
  • the images may be generated both with and without the object.
  • a pair of images, respectively with and without the object may be generated with the background illuminated with a background lighting at the same intensity value.
  • corresponding images with and without the product may be compared to each other, and the rate of change of each pixel throughout the frames or in all the frames may be determined.
  • the rate of change of pixels is determined through all the 100 or more frames to determine the changes that happen to each pixel. In other words, each pixel is compared in all the frames to determine how much it has changed throughout the frames.
  • the rate of change of background pixels and those for the object are different, and, in this way, object pixels and background pixels may be differentiated.
  • the background may be determined or identified and subsequently removed. Further, even when there are reflections on a product that is shiny or reflective, the rate of change of pixels associated with the reflection is different from that for the background.
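The rate-of-change separation described above can be sketched in code. This is a minimal illustration, assuming Python with NumPy and a known background lighting ramp; the tolerance factor and function name are assumptions, while the principle — background pixels track the known light variation, object pixels do not — follows the description.

```python
import numpy as np

def background_mask(frames, expected_bg_change, tol=0.5):
    """frames: (N, H, W) array-like of grayscale frames captured while the
    background light ramps through known values.  A pixel whose total
    change across the stack is close to the expected background change is
    labelled background (True); its rate of change differs for object
    pixels (including reflections on shiny products), which come out False."""
    stack = np.asarray(frames, dtype=float)
    span = stack.max(axis=0) - stack.min(axis=0)   # per-pixel change over all frames
    # background pixels should change by roughly the known ramp amount
    return np.abs(span - expected_bg_change) <= tol * expected_bg_change
```

Pixels where the mask is True can then be removed to eliminate the background, leaving only the product pixels.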
  • it may be possible to first run the method without the product in the pixel machine (e.g., 201) and to determine the rate of change of pixels by varying the light intensity and/or colour with known values and/or pattern.
  • the product may be placed on a turn table (e.g., 231), and the same lighting and/or colour intensity pattern may be run, and both frames with product and without product may be compared to find which pixel is associated with or belongs to the product and which pixel belongs to the background, so as to eliminate the background.
  • images without the product may not be generated. This may mean that, at 470, each pixel in the respective frames with the product may be compared to each other.
  • Such respective frames may be taken at different lighting parameters or conditions (e.g., different colour and/or intensity values). Further, this may mean that, at 472, a change or difference between the pixel values of a pixel in a frame with the object (PixelA’) and a pixel in a frame (generated under a different lighting condition) with the object (PixelB’) may be determined. Therefore, comparison of pixels using frames with and without the object, or using frames with the object under different lighting conditions, may be possible. Both techniques work, as the rate of change of pixel values corresponding to the product and the background is different. Further, if key light is provided facing the product directly, the variation in the key light on the product has less effect on the background lighting intensity, and this may widen the rate of change of background pixels and product pixels even further.
  • the images may be generated via two modes: photography mode and video (or movie) mode.
  • in movie mode, the imaging device (e.g., camera) 210 may capture frames at a rate of 60 frames per second (may be less or more depending on device speed), and the light intensity and/or colours may be changed during the making of the video.
  • the auto-iris mode of the camera would be turned off so that the effect of the rate of change of light in the background and on the product may be captured.
  • the timing of the frames may be marked in the software application, which may be extracted and used to make a final or resultant 360° still photograph.
  • individual LEDs may be controlled so that there may be full control over lighting.
  • Different lighting colours may be provided.
  • the gap of difference between product pixels and background pixels may widen even further.
  • This effect may be addressed or eliminated by alternating the background and product lighting colours, which may be blue light in the background and red light on the product, using the above example. This may further help in finding and eliminating the common colour mixing pixels. Any or all combination in background and products with RGBW colours may be employed in the methods.
  • an image of the product in white colour background may be generated, and images of the product with different red, green and blue (RGB) backgrounds may be further generated.
  • in this way, the mixing and common pixels (e.g., pixels having colour substantially similar to the background) may be determined.
  • the intensity of the lighting for providing the colour background(s) may be varied during generation of the images.
  • a pixel may have an (inherent) colour, where each colour component (RGB) of the pixel may have a certain (or original or inherent) colour value (e.g., corresponding to the case of the white colour background).
  • when the colour of the background is changed, the colour value of one or more of the colour components may change to a different value.
  • when the intensity of the lighting provided for the background is changed, the colour value of one or more of the colour components may also change to a different value.
  • for an object pixel, the change(s) in the colour value(s) of the colour component(s) is generally greater compared to that for a background pixel.
  • Such a difference in the change(s) may be used to determine or identify object pixels.
  • a threshold value may be set and a pixel may be determined as an object pixel if the change in the colour value corresponding to one or more colour components is at or greater than the threshold value. This approach may be helpful for object pixels located at or adjacent to the edge of the object so as to be able to distinguish the border or edge of the object from the background.
  • This technique, and the technique described above relating to alternating the background and product lighting colours may produce substantially similar results.
  • FIG. 5 shows a flow chart 560 illustrating a method for imaging, according to various embodiments.
  • a product may be placed in the apparatus (e.g., 100) of various embodiments.
  • each pixel in the respective frames with and without the product may be compared to each other. Such respective frames may be taken at the same lighting condition (e.g., same colour and/or intensity value).
  • a change or difference between the pixel values of a pixel in a frame with the object (PixelA) and a pixel in a frame without the object (PixelB) may be determined and expressed in percentage.
  • the percentage change determined at 572 may be compared against a threshold value. For example, the condition may be set as to whether the percentage change is less than the threshold value (percentage < threshold). If the result “percentage < threshold” is determined, the pixel is determined or identified as being associated with the product at 576, 586. Otherwise, the pixel is determined or identified as belonging to the background at 578. It should be appreciated that similar or like actions or features described in the context of the flow chart 460 of FIG. 4 may correspondingly be applicable here, and hence are omitted for brevity.
  • one or more pixels defining such portion(s) of the product may be mixed into the background in one or more images that are generated.
  • the colour and/or intensity of the background lighting may be varied. Accordingly, by such changes, the colour of the product or portion(s) thereof may be different to the background colour at one or more times for images to be generated.
  • product pixels may be determined, for example, using a similar (percentage < threshold) condition.
  • pixels that are consistently in the background may be determined. If a pixel is not in the background in any one image, the pixel may be determined as a product pixel.
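The "consistently in the background" test can be sketched as follows, assuming Python with NumPy and known background colours per image; the tolerance value and function name are illustrative assumptions. A pixel that matches the background colour in every image is background; a pixel that escapes the background in any one image (e.g., a white product part against a background changed to blue) is a product pixel.

```python
import numpy as np

def product_mask(frames, bg_colours, tol=30):
    """frames: (N, H, W, 3) RGB images, each generated with a known
    background colour bg_colours[i].  A pixel is background only if it
    stays close to the background colour in every image; if it is not in
    the background in any one image, it is a product pixel (True)."""
    stack = np.asarray(frames, dtype=float)
    bgs = np.asarray(bg_colours, dtype=float)[:, None, None, :]
    dist = np.abs(stack - bgs).max(axis=-1)          # worst-channel distance to bg colour
    always_background = (dist <= tol).all(axis=0)    # matches bg colour in every frame
    return ~always_background
```

This mirrors the RGBW approach above: varying the background colour guarantees that product parts sharing the original (typically white) background colour differ from the background in at least one image.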
  • the method described in the context of flow chart 560 may be employed when faced with the challenge where one or more parts of the product under shoot contains the same or similar colour as the colour of the background, which in most cases, is a white background.
  • the background colour may be changed via the use of RGBW LEDs.
  • product pixels and background pixels may be differentiated from each other.
  • Elimination of shadow is always a challenge in photography. Using the apparatus of various embodiments, the lighting may be rotated in a known pattern so that the shadow rotates as well, while the product pixels do not move; in this way, pixels belonging to the product and pixels belonging to the shadow of the product (e.g., pixels at the bottom of the product) may be determined. Different LEDs or LED panels may be turned on at different intervals to light up the product from different angles so as to rotate the lighting in a known pattern. In this way, the shadow “moves” as the source of lighting moves. A shadow always forms on the side opposite the light, and shadows are monochromatic, mostly shaded in black. Turning on the light where shadows are found thus further helps, as the shadow disappears and reappears on the opposite side behind the product.
  • FIG. 6 shows a flow chart 660 illustrating a method for imaging, according to various embodiments, as an example to remove shadow.
  • a product may be placed in the apparatus (e.g., 100) of various embodiments (e.g., in the pixel machine 201).
  • the button 344 may then be actuated/pressed to perform product photography.
  • the RGB LEDs providing background lighting light up with varying colour and/or intensity values.
  • Command is sent to the imaging device (e.g., 210) to generate images. Respective images may be generated with and without the product.
  • each pixel in the respective frames with and without the product may be compared to each other. Such respective frames may be taken with the same lighting condition (e.g., same colour and/or intensity value).
  • the change determined from the comparison carried out may be compared against a threshold value.
  • the condition may be set as to whether the change is less than the threshold value (change < threshold). If the result “change < threshold” is determined, the pixel is determined as being associated with the product at 676. Otherwise, the pixel is determined as belonging to the background at 678.
  • similar or like actions or features described in the context of the flow chart 460 of FIG. 4 may correspondingly be applicable here, and hence are omitted for brevity.
  • the lighting may be rotated relative to the product and frames may be generated with displaced shadow. Such frames may be compared to the product pixels with the shadow as determined at 676.
  • the change determined from the comparison carried out may be compared against a threshold value. For example, the condition may be set as to whether the change is less than the threshold value (change < threshold). If the result “change < threshold” is determined, the pixel is determined as belonging to the product at 684. Otherwise, the pixel is determined as belonging to the shadow at 686 and the shadow pixel may be removed or discarded.
  • the respective threshold values at 674 and 682 may be the same or may be different.
  • the lighting may be provided from the top of the product and the direction of the lighting above the product may be changed or rotated. In this way, shadows that are formed are not uniform in all images. Therefore, when the rate of change for all the frames is compared, shadows may be determined and, if desired, removed.
  • lighting may be rotated to provide the lighting from different angles, e.g., from the top, middle and bottom (but not limited to 3 positions only) of the object, or to provide top lighting, side lighting and back lighting, etc.
  • the shadow changes its position or direction every time the direction or position of the lighting is changed. Then, pixels belonging to the shadow may be defined by comparing all the frames generated. Some shadows may be removed and some may be maintained to add detail to the object. Therefore, after capturing all the shadows, user input may be obtained to either keep a particular shadow or remove it. Further, there may be shadows that fall on the object itself due to the presence of different shapes in the object, and such shadows may be maintained or removed.
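The idea that a shadow moves when the lighting direction changes, while a true object pixel does not, can be illustrated with a short sketch. All names and the stability test below are assumptions for illustration, not the patent's stated procedure: a pixel whose value swings widely across the differently-lit frames is attributed to a moving shadow.

```python
# Hedged sketch of shadow identification by rotating the lighting:
# a shadow pixel is dark in some frames but not others (the shadow
# moves with the light), while an object pixel keeps a similar value
# in every frame.

def pixel_series(frames, x, y):
    """Collect the value of pixel (x, y) across all frames."""
    return [frame[y][x] for frame in frames]

def is_shadow_pixel(frames, x, y, threshold=30):
    """True if the pixel's value swings more than `threshold` across
    the frames generated under different lighting directions."""
    values = pixel_series(frames, x, y)
    return max(values) - min(values) >= threshold
```

A user-facing implementation could then present each detected shadow region for a keep-or-remove decision, as described above.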
  • in order to eliminate the shadow, it may be possible to light up the product from the bottom, for example, using LEDs at the bottom (e.g., using the bottom LED panel 228) while the product lies on the transparent glass of the turn table (e.g., 231). As a result, the shadow is produced on top and is not part of the product frame, so that pixels belonging to the product may be clearly identified or determined.
  • FIG. 7 shows a flow chart 760 illustrating a method for imaging, according to various embodiments, as an example to remove shadow by illuminating the product from the bottom of the product.
  • a product may be placed in the apparatus (e.g., 100) of various embodiments (e.g., in the pixel machine 201).
  • the button 344 may then be actuated/pressed to perform product photography.
  • the RGB LEDs providing background lighting light up with varying colour and/or intensity values.
  • Command is sent to the imaging device (e.g., 210) to generate images. Respective images may be generated with and without the product.
  • each pixel in the respective frames with and without the product may be compared to each other.
  • Such respective frames may be taken with the same lighting condition (e.g., same colour and/or intensity value). While not shown, it should be appreciated that the actions or features described in the context of 472 (FIG. 4) may correspondingly be applicable here.
  • the change determined from the comparison carried out may be compared against a threshold value. For example, the condition may be set as to whether the change is less than the threshold value (change < threshold). If the result “change < threshold” is determined, the pixel is determined as being associated with the product at 776. Otherwise, the pixel is determined as belonging to the background at 778. It should be appreciated that similar or like actions or features described in the context of the flow chart 460 of FIG. 4 may correspondingly be applicable here, and hence are omitted for brevity.
  • the change determined from the comparison carried out may be compared against a threshold value. For example, the condition may be set as to whether the change is less than the threshold value (change < threshold). If the result “change < threshold” is determined, the pixel is determined as belonging to the product at 784. Otherwise, the pixel is determined as belonging to the shadow at 786 and the shadow pixel may be removed or discarded. It should be appreciated that the respective threshold values at 774 and 782 may be the same or may be different.
  • illuminating the product and the turn table from below for the purpose of determining or identifying shadow pixels may be challenging in some situations as shadows may fall on the object itself, which may not disappear when illuminating from below. In such situations, rotation of the lighting above the object may help with identification of the shadows.
  • various methods for imaging may work with shadow, and then remove the shadow, if desired.
  • various methods for imaging may create shadow, for the purpose of removing shadow. Nevertheless, the shadow pixels may also be maintained, if desired.
  • known methods may employ lighting in a way that eliminates shadow so that shadows are not captured in the images that are generated.
  • the various methods for imaging may be used for 360° photography of a product by rotation, where each frame has clean background elimination.
  • the apparatus may help a user to place the object in the centre of the turn table to obtain a better result in 360° photography.
  • the object may be placed anywhere at the turn table and rotated, and the object pixels may be determined or identified.
  • the image that is desired or best to put in front of the imaging device may be determined, and such image with the product may be shown on a display device in half tone together with a half-tone live video. This provides a live view which helps the user to match the location of the object to the image on screen by overlaying both images.
  • the product may first be detected where it is in the frame.
  • the centre or middle of the product may be determined by dividing X and Y values by 2.
  • a centre line of the product may be drawn.
  • the centre line may be thick, as not all the middle pixels may lie on the same X and Y lines; the width (threshold of centre) of the centre line may therefore increase, but may in no case extend beyond the product boundary.
  • the difference between the current location and where the product should be if placed in the centre of the turn table may be determined, and a half-tone image may be shown to the user, who may then, where necessary, move the product to the centre of the turn table to obtain a good 360° rotation view.
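The centring steps above can be sketched as follows. This is an illustrative sketch under assumptions: the object is represented by a boolean object-pixel mask, the product middle is taken as the centre of its bounding box (X and Y extents halved, per the bullets above), and all function names are hypothetical.

```python
# Illustrative centring sketch: find the product's middle from an
# object-pixel mask and report its offset from the frame centre, which
# stands in for the turn-table centre here.

def product_centre(mask):
    """Centre of the product's bounding box in an object-pixel mask."""
    xs = [x for row in mask for x, v in enumerate(row) if v]
    ys = [y for y, row in enumerate(mask) for v in row if v]
    cx = (min(xs) + max(xs)) // 2   # divide the X extent by 2
    cy = (min(ys) + max(ys)) // 2   # divide the Y extent by 2
    return cx, cy

def offset_from_centre(mask):
    """How far the product middle is from the frame middle."""
    h, w = len(mask), len(mask[0])
    cx, cy = product_centre(mask)
    return cx - w // 2, cy - h // 2
```

A nonzero offset would then drive the half-tone guide display that prompts the user to reposition the product.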
  • each light source (e.g., LED) may be controlled through a touch screen monitor or by one or more mouse clicks, which helps the user to create different lighting effects.
  • LEDs may be driven by a DC (direct current) controller to minimise or eliminate lighting non-sync locking effect.
  • a DC controller may eliminate the flicker seen with most LED driving arrangements used in low-cost LED drivers, which drive the LEDs directly from the 50/60 Hz AC line so as to save cost.
  • Each light source (e.g., LED) may be addressed and controlled independently. While the light sources (e.g., LEDs) receive commands from the processor, the associated drivers for the light sources are located in the pixel machine.
  • LED light control from all angles may help to light up the round edges from left, right, top and bottom of the product, and good contrast and resolution pixels are made part of the final frame.
  • a first lighting sequence may be based on feature or part detection within the object. After the detection of an object as a whole (i.e., determination or identification of object pixels), various features or parts of the object may be detected based on colour.
  • using LEDs as a non-limiting example, as the lighting provided by the LEDs may be controlled (e.g., via a processor or a software application in the processor), it may be possible to determine the specific LEDs or LED panel illuminating a specific individual feature of the object.
  • the intensity and/or colour of the light from said specific LEDs or LED panel, which illuminates said specific individual feature may be varied to be substantially similar to the colour of the individual feature, which helps to further highlight the individual feature.
  • the associated LED may be controlled to provide yellow-coloured lighting onto the individual feature so that said feature may be clearer and colour rich.
  • Such an approach may also help in determining the saturation level for the pixels corresponding to said specific feature for determining the optimum intensity and/or colour value for said pixels when constructing the final resultant frame.
  • Such an approach may be helpful for image capture of objects having reflective part(s), for example, shiny metal parts and glass products.
  • a second lighting sequence may be based on defined lighting patterns.
  • Different lighting patterns may be provided, for example, round circle spot lighting, line lighting (of different widths or thicknesses), motion lighting (e.g., up, down, circular) to create different effects in photography; fade-in/fade-out effect, etc. in short movie mode as most digital cameras are capable of video recording at high resolution.
  • Lighting patterns or effects may be applied in circumstances where the object is stationary or is being rotated 360° during the image capture.
  • all the colours in the object may be identified as separate objects, and lighting specific to such separate objects (e.g., having the same colour) may be provided. All the individual separate objects may then be patched up into a resultant frame to form the complete object with better colour pixels.
  • the intensity of the lighting may be decreased, thereby minimising reflection.
  • the lighting angle may be rotated, resulting in change in shadow position and in reflection effect.
  • Such a lighting pattern may be employed for identification of different pixels, for example, object pixels, background pixels, shadow pixels.
  • lighting may be provided to the sides of an object.
  • when lighting is provided from the front for a shiny object, the edges of the object may become mixed with the background.
  • the front lighting may then be turned off and side lighting may be provided for images to be generated to capture the edges of the object without saturation.
  • lighting may be provided from the left and/or right sides of the object. Subsequently, the images generated with the front lighting and the side lighting may be combined to form the complete object.
  • the pixel values for the object pixels may change but the internal reflection still stays at the same position. So, when there is movement, pixels inside the object which have not moved may be determined (or identified) as reflections.
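The reflection test just described can be sketched briefly. This is a hedged illustration, not the patent's implementation: the function name, the boolean object mask, and the tolerance value are assumptions. When the object moves between frames, object pixels change, but an internal reflection stays put; pixels inside the object mask that stay (nearly) constant are marked as reflections.

```python
# Sketch: mark object pixels whose value stayed constant across frames
# taken while the object moved, attributing them to reflections.

def reflection_mask(frames, object_mask, tolerance=5):
    """True where an object pixel stayed constant despite the movement."""
    h, w = len(object_mask), len(object_mask[0])
    result = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if not object_mask[y][x]:
                continue  # only object pixels can be internal reflections
            values = [frame[y][x] for frame in frames]
            if max(values) - min(values) <= tolerance:
                result[y][x] = True
    return result
```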
  • a camera stand may be mounted on the box defining the pixel machine, which may help in the minimisation or elimination of vibration or small differences in the position of detection of pixels in different frames.
  • the imaging device (e.g., camera) may be mounted on an XYZ motorized stand which may not only automatically zoom but also adjust its physical location as per product size and/or placement on the turn table.
  • the imaging device may be moved up and down, in a linear or curved motion.
  • the imaging device may be moved in a curved motion to take a top view image of the object if desired.
  • This may mean that the front-middle LED panel 204 may be movable together with the imaging device when the XYZ motorized stand is activated to control positioning of the imaging device.
  • the front-middle LED panel 204 may be movable up and down and/or rotated in curve. Additionally or alternatively, even if the imaging device is moved up and down with XY movement, a curved effect may be achieved using the software application.
  • as lighting may be provided from any or all angles, the lighting may be turned on in a specific or defined manner to find the desired or best lighting based on pixel resolution and/or saturation levels.
  • front, back, left and right views of the object may be automatically captured by rotation of the turn table.
  • a user (only) needs to place the object front facing the imaging device.
  • the turn table may then rotate and adjust, for example, 5 - 10°, to determine the location where the maximum width of the object is detected. This may help to further eliminate any placement errors by the user.
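The placement-correction step above (rotating in small increments to find where the object appears widest) can be sketched as a simple search. This is a hypothetical illustration: `measure_width` stands in for the real width-detection routine, and the step and sweep values are assumptions.

```python
# Sketch: step the turn table through a sweep in small increments and
# return the angle at which the detected object width is largest.

def find_max_width_angle(measure_width, step=5, sweep=90):
    """`measure_width(angle)` returns the detected width at that angle;
    the angle with the maximum width is kept."""
    angles = range(0, sweep + 1, step)
    return max(angles, key=measure_width)
```

For instance, with a width function peaking at 30°, the search over 5° steps returns 30.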
  • the turn table may rotate at any angle, which may be defined in a default setting of the software. For example, the turn table may rotate 90° to capture the left, back and right views of the object for photoshoot.
  • camera white balance may be adjusted based on RGB color intensity which may help in capturing close to the natural colours of the object(s) or product(s).
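One common way to derive a white-balance adjustment from RGB intensities is a "grey world" correction; this is an assumption offered for illustration, not the patent's stated formula. Each channel is scaled so that the three channel means become equal.

```python
# Grey-world sketch (illustrative assumption): compute per-channel
# gains that equalise the mean R, G and B intensities of the frame.

def grey_world_gains(mean_r, mean_g, mean_b):
    """Per-channel gains that pull each channel mean to the average."""
    target = (mean_r + mean_g + mean_b) / 3.0
    return target / mean_r, target / mean_g, target / mean_b
```

A neutral frame (equal channel means) yields unity gains; a red-heavy frame yields a reduced red gain and boosted green/blue gains.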
  • a software application compatible with Windows® and Linux® operating systems may be provided.
  • the object on the turn table may be automatically determined, the camera may be adjusted to its best location, the lighting may be adjusted, object image(s) may be taken or generated, the background may be eliminated, the object may be centred in the middle of the frame, and both raw and compressed versions may be stored, for example, in the “My Pictures” folder in Windows® or on a flash drive connected to the processor or to a network addressed folder.
  • Activating the button 344 triggers the automatic scripts written in the software application, which may include predefined template(s) to get most of product photographs correct.
  • an application programming interface (API) platform may be provided, which may be used to write a plugin for a web application supporting single-button operation to upload product images in single, quad or 360° form, compressed and optimised for web resolution, direct to e-commerce websites with a single press of the button 344.
  • the various methods and techniques may include one or more of capturing frames with and without the object, using side lights by switching off front lights to identify shiny object edges, and removing shadow by rotating lights.
  • the various methods and techniques may be used in a studio, on stage, or in a film set with bigger objects to shoot, including fashion models, clothing and cars.
  • Background lighting may be varied with known values, and pixels may be marked and determined (or identified) as belonging to the product or the background so that there is no need for manual editing, e.g., to remove the background.


Abstract

An apparatus for imaging is provided. The apparatus includes a lighting arrangement configured to provide lighting, and a processor, wherein the processor is configured to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, and to control the lighting arrangement to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor is further configured, for a respective pixel of pixels defining the at least one image, to determine a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background. A method for imaging is also provided.

Description

APPARATUS AND METHOD FOR IMAGING
Technical Field
Various embodiments relate to an apparatus for imaging and a method for imaging.
Background
Photographing products normally requires at least a few hours for professional photographers to adjust the lights and strike a balance between product lighting and background colour. In the worst cases, if the product is reflective or multicoloured, and/or some parts of the product have a matching or similar colour to that of the background, the photographer has to make compromises in the lighting and the quality of the product shoot. In almost every case, no matter how skilled the photographer and how good the post-production tools may be, the photographer ends up needing to eliminate the background by manual editing, which is bound to cause loss of product detail and resolution.
Summary
The invention is defined in the independent claims. Further embodiments of the invention are defined in the dependent claims.
According to an embodiment, an apparatus for imaging is provided. The apparatus may include a lighting arrangement configured to provide lighting, and a processor, wherein the processor is configured to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, and to control the lighting arrangement to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor is further configured, for a respective pixel of pixels defining the at least one image, to determine a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background. According to an embodiment, an apparatus for imaging is provided. The apparatus may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor to: control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images, determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background.
According to an embodiment, an apparatus for imaging is provided. The apparatus may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor, for a plurality of images, wherein at least one image of the plurality of images depicts an object of interest, and wherein the plurality of images are generated at different values of at least one parameter of lighting, to: determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
According to an embodiment, a method for imaging is provided. The method may include generating a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, varying at least one parameter of lighting during generation of the plurality of images, determining, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determining, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background.
According to an embodiment, a method for imaging is provided. The method may include for a respective pixel of pixels defining at least one image of a plurality of images, wherein the at least one image depicts an object of interest, and wherein the plurality of images are generated at different values of at least one parameter of lighting, determining a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determining, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
According to an embodiment, a computer program or a computer program product is provided. The computer program or computer program product may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
Brief Description of the Drawings
In the drawings, like reference characters generally refer to like parts throughout the different views. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the following description, various embodiments of the invention are described with reference to the following drawings, in which:
FIG. 1 A shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
FIG. 1B shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
FIG. 1C shows a flow chart illustrating a method for imaging, according to various embodiments.
FIG. 1D shows a flow chart illustrating a method for imaging, according to various embodiments.
FIGS. 2 A to 2E show schematic perspective views of a pixel machine of various embodiments from different angles.
FIG. 3 shows a schematic back view of a main controller, according to various embodiments.
FIGS. 4 to 7 show respective flow charts illustrating methods for imaging, according to various embodiments.
FIG. 8 shows schematic plots illustrating changes in pixel values, according to various embodiments.
Detailed Description
The following detailed description refers to the accompanying drawings that show, by way of illustration, specific details and embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural, logical, and electrical changes may be made without departing from the scope of the invention. The various embodiments are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments.
As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. For example, A and/or B may include A or B or both A and B.
Various embodiments may provide an imaging apparatus, for example, a product photography apparatus, e.g., an automatic photography equipment for products. The apparatus may minimise the effort and labour associated with photography of products, for example, for the fastest expanding online shopping business. Various embodiments may also provide the corresponding methods for imaging.
One or more of the following may be achieved: (1) detection of pixels belonging to object(s) for automatic background elimination (e.g., background cut); (2) detection of saturated pixel(s) to automatically eliminate reflections; (3) detection of, and elimination or maintenance of product shadows; (4) automatic centering of the object(s); (5) elimination of background and shadows of rotating object(s); (6) providing uniform exposure and colour for all the pixels.
FIG. 1A shows a schematic block diagram of an apparatus 100 for imaging, according to various embodiments. The apparatus 100 includes a lighting arrangement 102 configured to provide lighting, and a processor 104, wherein the processor 104 is configured to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, and to control the lighting arrangement 102 to vary at least one parameter of the lighting during generation of the plurality of images, and wherein the processor 104 is further configured, for a respective pixel of pixels defining the at least one image, to determine a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background. In other words, an apparatus 100 for imaging (and/or for image processing) may be provided, having a lighting arrangement 102 and a processor 104. The apparatus (or arrangement) 100 may be employed to image or take images of an object of interest, e.g., against a background, and/or of the background in the absence of the object. It should be appreciated that the term “background” may mean the background relative to the object (in the presence of the object) or said background in the absence of the object, or otherwise to be construed in the context the term is used.
The processor 104 may communicate with the lighting arrangement 102, for example, via a channel represented as 106. The processor 104 and the lighting arrangement 102 may be electrically coupled to one another, e.g., via a physical cable 106. The processor 104 may send one or more control signals to the lighting arrangement 102.
The lighting arrangement 102 may provide lighting to illuminate the object and/or the background. The lighting arrangement 102 may partially or entirely surround the object to provide lighting from different directions towards the object and/or the background. As a non-limiting example, the lighting arrangement 102 may be arranged in the form of a closed box environment, and the object may be placed within said environment for imaging purposes. However, it should be appreciated that the various methods and techniques may be used in an open studio, on a stage, or in a film set, or any other suitable environments or settings.
The processor 104 may control an (optical) imaging device (e.g., a (digital) camera capable of taking photographs and/or videos, or a (digital) video recorder) to generate a plurality of images (or frames). As non-limiting examples, the plurality of images may mean 100 or more images. The imaging device may be separately provided or integrally provided with the apparatus 100. The processor 104 may control the imaging device to take at least one image showing the object of interest in the image. It should be appreciated that one or more or all of the plurality of images may depict the object (e.g., against a background), and/or one or more of the plurality of images may depict the background without the object. An equal number of the plurality of images may depict the background without the object and with the object respectively.
In the context of various embodiments, the plurality of images may be taken or obtained (directly) as still images (e.g., photographic images), and/or may be images extracted from a moving sequence of consecutive graphics (e.g., a moving picture, motion picture, movie).
The processor 104 may control the lighting arrangement 102 to vary at least one parameter of the lighting for or during generation of the plurality of images. For example, the at least one parameter may be varied in between generation of two images, or of two immediately adjacent images. This may mean that the at least one parameter of the lighting is different for each of the two images generated.
In the context of various embodiments, the at least one parameter of the lighting may include any one of or any combination of a lighting intensity, a lighting colour, or a lighting direction.
Of the plurality of pixels (“image pixels”) defining the at least one image depicting the object, the processor 104 may determine a change in pixel value of a respective pixel with a pixel across the plurality of images. The pixels defining the at least one image may mean the pixels defining a portion of the at least one image (e.g., pixels defining an area in the image, the area being smaller than the (outer) boundary of the image), or all of the pixels defining the entirety of the at least one image. The at least one pixel may include a pixel of any image of the plurality of images, for example, a pixel of the at least one image and/or belonging to an image generated immediately adjacent to the at least one image. In one example, the at least one pixel of the plurality of images may be a corresponding pixel through or across the plurality of images. In the context of various embodiments, the term “corresponding pixel” for an image in relation to the respective pixel of the at least one image may mean a pixel corresponding to the same position or coordinate as that for the respective pixel.
The change in pixel value may be a difference between a value of the respective pixel and a value of the at least one pixel. As examples, the determination of the change in pixel value may be carried out with the respective pixel against another pixel (e.g., background pixel) of the same image, and/or against respective corresponding pixels of one or more (other) images of the plurality of images. It should be appreciated that the determination may be carried out for each of all of the plurality of pixels defining the at least one image. All of the plurality of images may be used for this determination.
In the context of various embodiments, the change in the pixel value may be defined in terms of a percentage change.
From the change in the pixel value determined, the processor 104 may determine whether the respective pixel is a pixel associated with the object (“object pixel”) or belonging to a background (“background pixel”) (e.g., relative to the object). An “object pixel” may mean a pixel related to the object (e.g., the object pixel defines, in the image, the physical part of the object, and/or a shadow of the object), while a “background pixel” means a pixel defining or belonging to the background, either in the presence of the object or in the absence of the object. In this way, object pixels and background pixels may be determined or identified, and thus differentiated from each other.
In the context of various embodiments, it should be appreciated that to “determine” the respective pixel as a pixel belonging to the object or a background may include to “identify” the respective pixel as such. However, this may not necessarily mean to (positively) mark or tag the respective pixel as an object pixel or a background pixel, although this may be the case in various embodiments. Nevertheless, in some embodiments, the identification of the respective pixel may mean the resulting selection of the respective pixel being determined or inferred to be an object pixel or a background pixel based on the result of the determination of the change in pixel value.
A certain or predetermined threshold value may be set, and the change in the pixel value may be compared against the threshold value. For example, if the change in the pixel value is less than the threshold value, the respective pixel may be determined or identified as an object pixel; otherwise, the respective pixel may be determined or identified as a background pixel.
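The determination described above can be sketched over the plurality of images. This is a minimal, hedged illustration under assumptions: the background lighting parameter is varied across the images, so a background pixel tracks the lighting while an object pixel changes comparatively little; the change measure (max minus min across the series) and all names are illustrative.

```python
# Sketch of the threshold comparison: a pixel's values are collected
# across the plurality of images taken under varied lighting, and the
# rule "change < threshold -> object pixel" is applied.

def pixel_change(values):
    """Total change of one pixel across the plurality of images."""
    return max(values) - min(values)

def label_pixel(values, threshold):
    """change < threshold -> object pixel, otherwise background pixel."""
    return "object" if pixel_change(values) < threshold else "background"
```

For example, a pixel series (120, 122, 119) changes by only 3 and is labelled an object pixel; a series (40, 180, 90) changes by 140 and is labelled a background pixel.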
The phrase “change in pixel value” may refer to a rate of change of pixel value. Generally, the phrase “rate of change of pixel value” may mean the change in the pixel value through or across the plurality of images. This may mean that the term “rate” herein may refer to the change in pixel value determined progressively from one image to another image through the plurality of images. While the change in pixel value may vary with time, this is due to a change in the at least one parameter of the lighting over time for or during generation of the plurality of images, rather than a direct result from the passage of time.
In various embodiments, by changing the at least one parameter of the lighting (e.g., intensity and/or colour) during generation of the plurality of images, the difference between object pixels and background pixels (or their associated rate of change) may be widened, which may help in determining a pixel as being associated with the object or belonging to the background.
The lighting arrangement 102 may include a plurality of light sources to provide the lighting. Each light source may be or may include a light emitting diode or device (LED). In one non-limiting example, the plurality of light sources may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs. The use of RGBW LEDs may provide a pure white colour. Nevertheless, it should be appreciated that any other types of light sources may be used, as long as the light sources may be controlled to vary at least one parameter (e.g., light intensity) of the lighting provided by the light sources. Each light source (e.g., LED) may be individually addressable. This may mean that each light source may be independently switched on/off, and controlled to vary the at least one parameter of the lighting.
The lighting arrangement 102 may include a plurality of (separate) lighting panels to provide lighting from a plurality of directions to the object and/or background. Each lighting panel may include a plurality of light sources, for example, LEDs, e.g., a combination of red LEDs, green LEDs, blue LEDs, and white LEDs.
The lighting arrangement 102 may be or may form part of a pixel machine. A driver arrangement may be provided (e.g., in the pixel machine) for driving the lighting arrangement 102. The driver arrangement may have a plurality of drivers for driving the associated light sources of the lighting arrangement 102.
The processor 104 may be provided with one or more communication interfaces, e.g., including at least one interface for communication with the lighting arrangement 102 or the pixel machine. The processor may be or may form part of a (main) controller. The (main) controller may further include a display and other peripheral devices, e.g., a keyboard, a mouse, etc.
The pixel machine and the (main) controller may be comprised in the apparatus 100.
In various embodiments, the at least one pixel may include at least one corresponding pixel of the respective pixel through the plurality of images, and the processor 104 may be further configured to determine the change in pixel value between the respective pixel and the at least one corresponding pixel, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to the background.
The processor 104 may be configured to vary the at least one parameter of the lighting through (or over) a plurality of values (associated with the at least one parameter). For generating the plurality of images, the processor 104 may be configured to control the imaging device to generate one or more images including the at least one image at a respective (different) value of the plurality of values. The plurality of values may be different to one another. This may mean that the lighting for illuminating the object and/or background may be different between two (immediately adjacent) images. For example, the parameter of the lighting may be at or may have a first value for generation of at least one first image, and a second value for generation of at least one second image.
As another example, the processor 104 may vary the at least one parameter over a plurality of (different) values at a plurality of intervals to illuminate the object and/or background, where each value may be associated with a respective interval, and the processor 104 may be configured to generate one or more images of the plurality of images at the respective interval.
The plurality of values may be within or may span a range from value 0 (minimum value) to value 255 (maximum value). In one non-limiting example, the at least one parameter may be varied from value 0 to value 255 at intervals of 1, 5 or any other number. In terms of intensity, the value 0 may represent the minimum intensity or darkness, while the value 255 may represent the maximum intensity or saturation. The values 0 - 255 may also represent the scale or range for colour.
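The sweep of a lighting parameter over the 0–255 range, with one or more images generated at each value, may be sketched as follows. The `set_parameter` and `capture_image` callbacks stand in for the real lighting-arrangement and imaging-device control and are assumptions for illustration:

```python
# Illustrative sketch of varying a lighting parameter from value 0
# (minimum) to value 255 (maximum) at a fixed interval and generating
# one image per value, as described above.

def capture_sweep(set_parameter, capture_image, step=5):
    """Vary the lighting parameter over 0..255 at the given interval and
    capture an image at each value; return the list of generated images."""
    images = []
    for value in range(0, 256, step):
        set_parameter(value)            # e.g., background LED intensity
        images.append(capture_image())  # one image per parameter value
    return images
```

With an interval of 5, this generates 52 images, at parameter values 0, 5, 10, ..., 255.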
The lighting may include a background lighting to illuminate the background, and the at least one parameter may include at least one background parameter of the background lighting. The processor 104 may be configured to control the lighting arrangement 102 to vary the at least one background parameter during generation of the plurality of images. For example, the processor 104 may vary the intensity and/or colour (as the background parameter) of the background lighting. As a non-limiting example, the intensity of the background lighting may be increased, e.g., from value 0 (dark) to value 255 (saturated). The lighting may further include an object lighting (to illuminate the object) which may be maintained constant, or, at least one parameter thereof may be varied.
In the context of various embodiments, the phrase “background lighting” means the lighting that illuminates the background, and/or the space in between the background and the object of interest, without illuminating the object. This means that the object of interest to be imaged is not illuminated by the background lighting, i.e., lighting provided from one or more light sources, to be employed as the background lighting to illuminate the background, does not illuminate the object. Further, in various embodiments, for the purpose of minimizing or avoiding the effect of the object being illuminated by scattered light from the background, the imaging device (e.g., camera) may be adjusted to low(er) sensitivity so that light scattered from the background that may potentially illuminate the object may not be captured or be observable in one or more images of the plurality of images generated.
In various embodiments, the processor 104 may be configured to vary the at least one background parameter of the background lighting through a plurality of values, and further configured to control the imaging device to generate a respective image of the plurality of images at a respective value of the plurality of values, wherein the respective image depicts the object of interest. The processor 104 may be further configured, for each respective image through the plurality of images, to determine a change in pixel value between a respective corresponding pixel at an edge region of the object depicted in the respective image and a respective corresponding background pixel adjacent to the edge region. The processor 104 may be further configured, from the changes in pixel value determined for the respective corresponding pixel through the plurality of images, to determine the respective corresponding pixel as a pixel associated with the object or a pixel belonging to the background. The edge region of the object may refer to an area including an (perceived) edge or boundary of the object. In one example, the edge region may include an area of between one and twenty pixels (e.g., one, two, three, five, ten, or twenty pixels) away from a pixel at or corresponding to the edge of the object. A background pixel adjacent to the edge region may refer to a background pixel that may be between one and twenty pixels (e.g., one, two, three, five, ten, or twenty pixels) away from the pixel at or corresponding to the edge of the object.
As non-limiting examples, referring to FIG. 8, the at least one background parameter of the background lighting may be varied from value 0 to value 255. For a pixel A1 at an edge region of the object, the change in pixel value between the corresponding pixel A1 and a respective corresponding background pixel (e.g., B1) adjacent to the edge region may be determined for each image of the plurality of images generated from value 0 to value 255, and the result may be as illustrated in FIG. 8. The change in pixel value may reach a maximum value at a background parameter value, i1, of the at least one background parameter. Based on such a change, the pixel A1 may be determined or identified as a pixel associated with the object. As a further example, the result for pixel A3 at the edge region of the object may be as illustrated in FIG. 8, where the change in pixel value may reach a maximum value at a background parameter value, i3, and the pixel A3 may be determined or identified as a pixel associated with the object. As a further example, the result for pixel A2 at the edge region of the object may be as illustrated in FIG. 8, where the change in pixel value may be minimal as the pixel values for pixel A2 and the associated corresponding background pixel may be at least substantially similar, and, as a result, the pixel A2 may be determined or identified as a pixel belonging to the background. In various embodiments, the respective corresponding background pixel may be adjacent to the associated corresponding pixel for determination of the change in pixel value, for example, between one and twenty pixels (e.g., one, two, three, five, ten, or twenty pixels) away from the associated corresponding pixel. It should be appreciated that, while the background parameter values i1, i3, etc., are illustratively shown as different values, any two or more background parameter values may be of the same value.
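The edge-region analysis described above may be sketched as follows. The inputs are the values of an edge-region pixel A and its adjacent background pixel B in each image of the sweep; the function name and the peak threshold are assumptions for illustration:

```python
# Illustrative sketch of the FIG. 8 analysis: for each image of the
# sweep, compute the change in pixel value between an edge-region pixel
# A and an adjacent background pixel B. If the change peaks strongly at
# some background parameter value (as for pixels A1 and A3), A is
# treated as an object pixel; if the change stays small throughout (as
# for pixel A2), A is treated as a background pixel.

def classify_edge_pixel(a_values, b_values, min_peak):
    """Classify edge-region pixel A from its per-image differences to B.

    Returns the classification and the index of the image (i.e., the
    background parameter value) at which the change was largest."""
    changes = [abs(a - b) for a, b in zip(a_values, b_values)]
    peak = max(changes)
    best_index = changes.index(peak)
    label = "object" if peak >= min_peak else "background"
    return label, best_index
```

For instance, an edge pixel that stays near a constant value while the adjacent background pixel ramps from 0 to 255 yields a large peak difference and is classified as an object pixel; an edge pixel that tracks the background pixel yields a near-zero difference throughout and is classified as a background pixel.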
In various embodiments, the processor 104 may be further configured to determine, from the changes in pixel value, a desired change in pixel value (e.g., maximum change) for the respective corresponding pixel determined as being associated with the object, and further configured to generate a resultant image depicting the object including the determined respective corresponding pixel at the respective value corresponding to the desired change in pixel value. For example, using the results of FIG. 8, the processor 104 may generate a resultant image including pixel A1 at the background parameter value, i1, pixel A3 at the background parameter value, i3, and so on with any other pixel determined as being associated with the object at the particular background parameter value with the desired change in pixel value.
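The composition of the resultant image from per-pixel best frames may be sketched as follows. Images are represented as nested lists for simplicity; the names and the `None` placeholder for non-object pixels are assumptions for illustration:

```python
# Illustrative sketch of building the resultant image: each object pixel
# is taken from the image in the sweep at which its change in pixel
# value was largest (its "desired" background parameter value).
# Non-object positions are left as None here for simplicity.

def compose_resultant(image_stack, object_mask, best_index):
    """image_stack: list of images (each a 2-D list of pixel values);
    object_mask[y][x]: True if the pixel was determined as an object pixel;
    best_index[y][x]: index of the image giving the desired change."""
    height = len(object_mask)
    width = len(object_mask[0])
    result = [[None] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if object_mask[y][x]:
                result[y][x] = image_stack[best_index[y][x]][y][x]
    return result
```

Note that, as stated above, different object pixels of the resultant image may be drawn from different images of the plurality of images.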
The lighting may include an object lighting to illuminate the object, and a background lighting to illuminate the background, and the at least one parameter may include at least one object parameter of the object lighting, and at least one background parameter of the background lighting. The processor 104 may be configured to control the lighting arrangement 102 to vary the at least one object parameter and the at least one background parameter relative to each other during generation of the plurality of images. As a non-limiting example, the respective colours of the object lighting and the background lighting may be different.
The processor 104 may be further configured to remove or discard pixels determined or identified as belonging to the background. All pixels that are determined as background pixels may be removed.
The processor 104 may be further configured to control relative movement between the imaging device and the object, and the processor 104 may be further configured to control the imaging device to generate the plurality of images from a plurality of directions relative to the object. For example, there may be rotational movement between the imaging device and the object. This may allow 360° generation of images of the object.
The imaging device may be placed on a support structure. The support structure may be a movable support structure (e.g., an XYZ motorized stand), and the processor 104 may control movement of the movable support structure, relative to the object.
The apparatus 100 may include a turn table to support the object, and the processor 104 may control movement of the turn table, relative to the imaging device. The turn table may be at least substantially transparent or transmissive to light. The object may be placed on the turn table to be rotated to allow generation of images of the object at different angles of rotation.
The lighting arrangement 102 may include at least one curved lighting panel configured to provide a focused lighting onto the object. The at least one curved lighting panel may include a plurality of light sources, e.g., LEDs. Four curved lighting panels may be provided. The apparatus 100 may further include an actuator configured to communicate with the processor 104, wherein in response to a single activation of the actuator, operation of the processor 104 may be (fully) automated. The actuator may be a push actuator, e.g., a push button.
The processor 104 may be further configured to control the imaging device to generate a number of images (or multiple images), each image depicting the object, and the processor 104 may be further configured to control the lighting arrangement 102 to vary a direction (or angle) of the lighting for or during generation of the number of images. The processor 104 may be further configured, for each respective pixel of pixels determined as being associated with the object, to determine another change in pixel value between the respective pixel and at least one corresponding pixel of the number of images, and the processor 104 may be further configured to determine, based on the other change in pixel value determined, the respective pixel as a pixel belonging to the object or belonging to a shadow of the object. A pixel belonging to the object refers to the pixel that defines, in the image, the physical or actual or tangible part of the object. As an example, lighting may be provided from a plurality of directions onto the object. In this way, as shadow is generally formed on a side opposite to a source of lighting, the shadow that is formed moves about depending on the direction of the lighting, while the object that is lit remains stationary.
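The object-versus-shadow discrimination described above may be sketched as follows. A lit object pixel stays roughly constant as the lighting direction changes, while a shadow pixel varies strongly because the shadow moves; the function name and tolerance are assumptions for illustration:

```python
# Illustrative sketch of distinguishing a true object pixel from a
# shadow pixel: the pixel's value is tracked across images generated
# with different lighting directions. A shadow moves with the lighting
# direction, so a shadow pixel's value varies strongly across the
# series, while a lit object pixel stays roughly constant.

def classify_object_or_shadow(values_per_direction, tolerance):
    """Classify a pixel from its values across the direction series."""
    spread = max(values_per_direction) - min(values_per_direction)
    return "object" if spread <= tolerance else "shadow"
```

For example, a pixel whose value stays near 100 across all lighting directions would be kept as an object pixel, while one that swings between dark and lit values would be identified as a shadow pixel.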
The processor 104 may be further configured to remove or discard one or more pixels (“shadow pixel(s)”) determined or identified as belonging to the shadow of the object. All pixels determined as belonging to the shadow may be removed. Nevertheless, it should be appreciated that one or more or all shadow pixels may be maintained or kept.
The processor 104 may be further configured to identify a flicker effect in the plurality of images generated and to remove the flicker effect. For example, when the imaging device is operated to take videos, flicker effect may be captured in one or more images of the plurality of images due to frequency difference in the respective operations between the imaging device and the lighting arrangement 102. Flicker in the images may be observed continuously. Rather than controlling the lighting to minimise the flicker effect, this may be achieved in the processing of the images. From the plurality of images generated, flicker which is regular and similar may be identified so that the flicker may be eventually removed.
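One simple way to suppress a regular flicker in post-processing is a temporal median filter over each pixel's value series. The disclosure does not fix a particular algorithm, so the following is only one possible sketch under that assumption:

```python
# Illustrative sketch (one possible approach, not mandated by the
# disclosure): suppress a regular, recurring brightness flicker in a
# pixel's value series across video frames by replacing each value
# with the median of itself and its two neighbours in time.

from statistics import median

def remove_flicker(series):
    """Return a copy of the per-frame pixel values with isolated
    flicker spikes replaced by the local temporal median."""
    smoothed = list(series)
    for i in range(1, len(series) - 1):
        smoothed[i] = median(series[i - 1:i + 2])
    return smoothed
```

For a pixel whose value is steady at 100 but spikes to 180 on every third frame due to flicker, the filter restores a constant series.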
The processor 104 may be further configured to generate a resultant image of the object based on (or using) pixels determined or identified as being associated with the object. The processor 104 may be further configured to determine, from pixels determined as being associated with the object, respective pixels having respective desired (or optimum) pixel values for generation of the resultant image. From the object pixels determined, pixels (“desired pixels”) corresponding to the desired or optimum lighting condition, thereby having the desired (or optimum) pixel values, may be determined and the resultant image generated using such desired pixels. It should be appreciated that desired pixels of the object may be derived from different images of the plurality of images.
The processor 104 may be further configured to determine, from pixels determined as associated with the object, respective pixels having respective desired (or optimum) pixel values. The processor 104 may be further configured to control the lighting arrangement 102 to vary the at least one parameter to adapt the lighting according to the respective desired pixel values determined (to illuminate the object). The processor 104 may be further configured to control the imaging device to generate, based on the lighting adapted, a resultant image depicting the object. For example, after the detection of an object as a whole (i.e., determination or identification of pixels associated with or belonging to the object), it may be possible to vary the intensity and/or colour of the lighting directed to one or more features (or elements or areas) of or within the object to highlight said feature(s) or to provide the desired (or optimum) lighting condition for the feature(s), for generation of the resultant image.
FIG. 1B shows a schematic block diagram of an apparatus 100b for imaging (and/or for image processing), according to various embodiments. The apparatus 100b includes a processor 104b, and a memory 105 coupled to the processor 104b. The memory 105 and the processor 104b may be coupled to each other (as represented by the line 107), e.g., physically coupled and/or electrically coupled.
In various embodiments, the memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images, determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background. The plurality of images may be transferred to and/or stored in the memory 105 or another memory. In various embodiments, the memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b, for a plurality of images, wherein at least one image of the plurality of images depicts an object of interest, and wherein the plurality of images are generated at different values of at least one parameter of lighting, to determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background. The plurality of images may be transferred to and/or stored in the memory 105 or another memory.
It should be appreciated that description in the context of the apparatus 100 may correspondingly be applicable to the apparatus 100b, and description in the context of the processor 104 may correspondingly be applicable to the processor 104b.
In the context of various embodiments, it should be appreciated that the term “object” may include a living thing (e.g., person, animal, plant, etc.) and/or a non-living thing (e.g., a product, item, inanimate body or object, etc.).
FIG. 1C shows a flow chart 120 illustrating a method for imaging (and/or for image processing), according to various embodiments. At 122, a plurality of images are generated, wherein at least one image of the plurality of images generated depicts an object of interest. At 124, at least one parameter of lighting is varied during generation of the plurality of images. At 126, for a respective pixel of pixels defining the at least one image, a change in pixel value is determined between the respective pixel and at least one pixel of the plurality of images. At 128, based on the change in pixel value determined, the respective pixel is determined as a pixel associated with the object or a pixel belonging to a background (relative to the object). The method may include providing the lighting.
In various embodiments, the at least one pixel may include at least one corresponding pixel of the respective pixel through the plurality of images. At 126, the change in pixel value between the respective pixel and the at least one corresponding pixel may be determined. At 128, based on the change in pixel value determined, the respective pixel may be determined as a pixel associated with the object or a pixel belonging to the background.
In various embodiments, at 124, the at least one parameter of the lighting may be varied through a plurality of (different) values, and, at 122, one or more images including the at least one image may be generated at a respective value of the plurality of values. For generating the one or more images at 122, first and second images depicting the background with and without the object respectively may be generated at the respective value. For generating the first and second images, the (only) difference is the presence or absence of the object. For example, the first image may depict the object against a background and the second image may depict the corresponding background in the absence of the object. The first image may be comprised in or within the definition of the at least one image that is generated depicting the object of interest. At 126, for each respective pixel of the pixels defining the first image, a change in pixel value may be determined between the respective pixel and a corresponding pixel of the second image. At 128, based on the change in pixel value determined between the respective pixel of the first image and the corresponding pixel of the second image, the respective pixel of the first image may be determined as a pixel associated with the object or a pixel belonging to the background.
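The paired first/second image comparison described above may be sketched as follows, with images represented as nested lists for simplicity. The names and the threshold are assumptions for illustration:

```python
# Illustrative sketch of the paired comparison at 126 and 128: the
# first image depicts the object against the background, the second
# image depicts the same background without the object, at the same
# lighting value. Pixels that differ markedly between the pair are
# attributed to the object; pixels that are essentially unchanged
# belong to the background.

def classify_by_pair(first_image, second_image, threshold):
    """Return a mask: True = object pixel, False = background pixel."""
    mask = []
    for row_a, row_b in zip(first_image, second_image):
        mask.append([abs(a - b) >= threshold for a, b in zip(row_a, row_b)])
    return mask
```

Since the only difference between the two images is the presence or absence of the object, any significant per-pixel difference can be attributed to the object.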
The lighting may include a background lighting to illuminate the background, and, at 124, at least one background parameter of the background lighting may be varied for or during generation of the plurality of images.
In various embodiments, the at least one background parameter may be varied through a plurality of values. At 122, a respective image of the plurality of images may be generated at a respective value of the plurality of values, wherein the respective image depicts the object of interest. At 126, for each respective image through the plurality of images, a change in pixel value may be determined between a respective corresponding pixel at an edge region of the object depicted in the respective image and a respective corresponding background pixel adjacent to the edge region. At 128, from the changes in pixel value determined for the respective corresponding pixel through the plurality of images, the respective corresponding pixel may be determined as a pixel associated with the object or a pixel belonging to the background.
In various embodiments, the method may further include determining, from the changes in pixel value, a desired change in pixel value for the respective corresponding pixel determined or identified as being associated with the object, and generating a resultant image depicting the object including the determined respective corresponding pixel at the respective value corresponding to the desired change in pixel value.
The lighting may include an object lighting to illuminate the object, and a background lighting to illuminate the background, and, at 124, at least one object parameter of the object lighting and at least one background parameter of the background lighting may be varied relative to each other for or during generation of the plurality of images. Pixels determined or identified as belonging to the background may be removed or discarded. All pixels that are determined as background pixels may be removed.
At 122, the plurality of images may be generated from a plurality of directions relative to the object.
The method may further include generating a number of images (or multiple images), each image depicting the object, varying a direction of the lighting for or during generation of the number of images, determining, for each respective pixel of pixels determined as being associated with the object, another change in pixel value between the respective pixel and at least one corresponding pixel of the number of images, and determining, based on the other change in pixel value determined, the respective pixel as a pixel belonging to the object or a shadow of the object. One or more or all pixels determined or identified as belonging to the shadow may be removed or discarded.
The method may further include identifying a flicker effect in the plurality of images generated, and removing the flicker effect.
A resultant image of the object may be generated based on pixels determined as being associated with the object. The method may further include determining, from the pixels determined as being associated with the object, respective pixels having respective desired pixel values for generation of the resultant image.
The method may further include determining, from pixels determined as being associated with the object, respective pixels having respective desired pixel values, varying the at least one parameter to adapt the lighting according to the respective desired pixel values determined, and generating, based on the lighting adapted, a resultant image depicting the object.
FIG. 1D shows a flow chart 130 illustrating a method for imaging (and/or for image processing), according to various embodiments. At 132, for a respective pixel of pixels defining at least one image of a plurality of images, wherein the at least one image depicts an object of interest, and wherein the plurality of images are generated at different values of at least one parameter of lighting, a change in pixel value is determined between the respective pixel and at least one pixel of the plurality of images. At 134, based on the change in pixel value determined, the respective pixel is determined as a pixel associated with the object or a pixel belonging to a background (relative to the object). The method may include providing the lighting. It should be appreciated that description in relation to the method described in the context of the flow chart 120 may correspondingly be applicable in relation to the method described in the context of the flow chart 130, and vice versa. It should be appreciated that description in the context of the apparatus 100, 100b may correspondingly be applicable in relation to the method described in the context of the flow chart 120, 130 and vice versa.
Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, to control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images, to determine, for a respective pixel of the pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object or a pixel belonging to a background.
The apparatus of various embodiments may include two parts: a pixel machine and a main controller, as will be described further below with reference to FIGS. 2A to 2E and 3.
FIGS. 2A to 2E show schematic perspective views of a pixel machine 201 of various embodiments. It should be appreciated that while one or more features or elements of the pixel machine 201 may not be shown in one or more of FIGS. 2A to 2E for clarity and easier understanding purposes, and/or to illustrate an internal environment of the pixel machine 201, such features/elements may nevertheless be part of the pixel machine 201 as may be apparent from FIGS. 2A to 2E. Further, while LEDs are described in the context of the pixel machine 201, it should be appreciated that other types of light sources may be employed, as long as at least one parameter (e.g., light intensity and/or colour) of the lighting provided may be variable. Nevertheless, LED lighting is preferable due to its characteristics of low power consumption, speed of control and long life.
The pixel machine 201 may include a front-left LED panel 202, a front-middle LED panel 204, a front-right LED panel 206, a back side LED panel 207, a left side LED panel 214, a right side LED panel 216, four (key light) LED panels 220 which may be curved, a top LED panel 222, a top-back LED panel 224, a top-left LED panel 225, a top-right LED panel 226, a top-front (key light) LED panel 227, and a bottom LED panel 228, where one or more of these LED panels may define a lighting arrangement (e.g., 102, FIG. 1A). The lighting arrangement may provide omnidirectional lighting, where the intensity of the lighting may be changed.
Each of the LED panels 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a plurality of LEDs (illustrated as squares in the panels and represented as 229 for some LEDs). Each LED panel 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs (RGBW LEDs), which may provide pure white colour as producing white from RGB in a closed box may be challenging. Each LED panel may be individually or independently controlled. Each LED 229 may be individually or independently addressable.
The lighting arrangement of the pixel machine 201 may provide three types of lights: key light, fill light and back light, thereby providing three-point lighting. The lighting arrangement may also provide background lighting. The four LED panels 220 in the corner, the top-back LED panel 224, the top-left LED panel 225, the top-right LED panel 226, and the top-front LED panel 227 may provide key light. The left side LED panel 214 and the right side LED panel 216 may provide fill light. The back side LED panel 207 may provide back light.
An imaging device (e.g., camera) 210, which may be supported on a support structure (e.g., camera stand) 212, may be provided for imaging through an aperture 208 defined in the front-middle LED panel 204. The imaging device 210 may have a zoom feature (e.g., including at least one of optical zoom or digital zoom), meaning that the imaging device 210 is capable of zooming-in and zooming-out operations for generation of images. A turn table 231 may be provided to support one or more (sample) objects (e.g., represented as 230) that are to be imaged. A stepper motor 232 may be provided for rotation of the object(s) 230 or turn table 231. The lighting arrangement of the pixel machine 201 may provide lighting to light up an object (or product) 230 at least substantially equally from all angles so one or more or all features of the object(s) 230 may have good resolution, contrast and colours.
The four side LED panels 220 (of curved or round shape) employ more LEDs to provide sharp and crisp contrast of the object(s) 230. The panels 220 may also help to produce useful shadows and shining areas on the object(s) 230. The curved shape and angle of the LED panels 220 may give a more focused light onto the object(s) 230. The panels 220 may also be used to generate different light effects, e.g., highlights, long and softer shadows, the Chiaroscuro effect, etc., for photography.
The key light provided may be angled such that it may light up any part of an object 230 with control of the intensity of the key light. Each LED 229 may be addressed individually. Such a technique may allow (full) control over object lighting so that an optimal lighting condition or environment suitable for product photography may be determined. Further, in various embodiments, as the object height and length may be detected using a computer application, the optimal or best lighting condition or environment may be determined or suggested automatically with minimal or without any saturation and/or dark spots.
An interface 234 may be provided on the pixel machine 201, for example, for cable connection to the processor to be described below with reference to FIG. 3.
FIG. 3 shows a schematic back view of a main controller 340, according to various embodiments. The main controller 340 may include a processor 342, for example, an industrial PC with embedded LEDs and motor controllers. The main controller 340 may further include a display device (e.g., a touch-screen monitor) 352, a keyboard 354 and a mouse 356.
An actuator, for example, a single capture button 344, may be provided (integrated) with the processor 342. A connector 348 may be provided for connection to a network, e.g., a local area network (LAN). A power connector or socket 350 may also be provided.
An interface (or link connector) 346 may be provided on the processor 342 for communication with the pixel machine 201 (FIGS. 2A to 2E), for example, via the interface 234 (FIG. 2A). In this way, signals may be communicated from the processor 342 to the pixel machine 201, for example, to control movement and operation of the imaging device 210, to vary the parameter of the lighting provided by the lighting arrangement, etc. A cable may be connected between the interfaces 234, 346, e.g., a multicore cable which may carry power, and control signals for LEDs 229, motor(s) (e.g., 232) and the imaging device 210 of the pixel machine 201.
The methods for imaging will now be described by way of the following non-limiting examples, and with reference to FIGS. 4 to 10.
Referring to the flow chart 460 of FIG. 4, at 464, a product may be placed in the apparatus (e.g., 100) of various embodiments (e.g., in the pixel machine 201). The button 344 may then be actuated/pressed to perform product photography. If the button 344 is pressed for more than 5 seconds, 360° photography may be carried out.
At 466, the RGB LEDs providing background lighting light up with varying colour and/or intensity values. The light intensity and/or colour emitted from the associated LEDs or LED panels may be varied under the control of the processor 342 which may in turn take command via or from a software application, e.g., based on user input. It should be appreciated that key light may be provided on the product. Further, depending on requirements, the (intensity and/or colour of) key light on the product may be changed and the background light maintained as static, or vice versa.
At 468, command is sent to the imaging device (e.g., 210) to capture or generate frames or images at different intervals. Respective images may be generated with and without the product at respective intervals. At 470 and 472, each pixel in the respective frames with and without the product may be compared to each other. Such respective frames may be taken at the same lighting condition (e.g., same colour and/or intensity value). As illustrated at 472, a change or difference between the pixel values of a pixel in a frame with the object (PixelA) and a pixel in a frame without the object (PixelB) may be determined. As a non-limiting example, the change in the pixel values of PixelA and PixelB may be computed based on respective changes between the pixel values of each of the red component, green component and blue component of PixelA and PixelB (i.e., PixelA.Red and PixelB.Red, PixelA.Green and PixelB.Green, PixelA.Blue and PixelB.Blue). The change in the pixel value between PixelA and PixelB may be expressed as a percentage change. The red component of a pixel may mean the pixel of an image generated under (only) red light condition, or the red component of the pixel of an image generated under white light condition. This applies correspondingly to the green component and the blue component.
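The per-pixel comparison at 472 can be sketched as follows. This is a hypothetical reading of the percentage-change computation: the 0-255 component range, the per-channel normalisation and the averaging of the three channels into one figure are illustrative assumptions, not taken from the specification.

```python
def pixel_change_percent(pixel_a, pixel_b):
    """Percentage change between PixelA and PixelB, averaged over the
    red, green and blue components.

    pixel_a, pixel_b: (R, G, B) tuples with components in 0-255.
    The normalisation and averaging scheme are assumptions for
    illustration only.
    """
    diffs = [abs(a - b) for a, b in zip(pixel_a, pixel_b)]
    # Express each channel difference relative to the full 0-255 range,
    # then average the three channels into a single percentage figure.
    return sum(d / 255.0 * 100.0 for d in diffs) / 3.0
```

Under this scheme, identical pixels give 0%, and a pixel whose red channel changes fully while green and blue are unchanged gives about 33%.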
At 474, the percentage change determined at 472 may be compared against a threshold value. For example, the condition may be set as to whether the percentage change is less than the threshold value (percentage < threshold). If the result “percentage < threshold” is determined, the pixel is determined or identified as being associated with the product at 476. Otherwise, the pixel is determined or identified as belonging to the background at 478. The threshold value may be set via the corresponding software application, and may be set depending on the chosen rate of change to be applied. As the lighting may be varied with known value(s), the expected rate of change between product pixels (or object pixels) and background pixels may be determined. As a non-limiting example, a threshold of 10 percent may be set. In order to widen the difference between the pixels of the product and the background, one frame may be captured at, for example, 20% background and 40% product light. For the next frame, the background lighting intensity may be reduced to 10% and the product lighting may be increased to 50%. The expected difference between the product pixels and background pixels may be 20%; however, to provide a buffer, the threshold may be set to 10%, being 50% of the actual expected change of 20%. Accordingly, the pixels may be easily determined as product pixels or background pixels. In further detail, the method may include varying at least the light intensity on the background. LED lights may be used in a closed box environment, e.g., within the pixel machine 201 (or may be used in a studio setting/environment). At least 50 to 100 frames with different light levels may be generated within 30 to 60 seconds (depending on device capture speed per second). These frames may be later compared to find the rate of change of pixel colour values and/or intensity values.
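The thresholding step at 474-478, including the buffered 10% threshold from the example above, might be expressed as the following sketch; the function name and default value are illustrative.

```python
def classify_pixel(change_percent, threshold=10.0):
    """Classify a pixel from its percentage change between the compared
    frames, per the condition at 474: a change below the threshold marks
    the pixel as a product pixel (476); otherwise it belongs to the
    background (478).  The default threshold of 10% mirrors the buffered
    example in the text (half of the 20% expected change)."""
    return "product" if change_percent < threshold else "background"
```

For example, a pixel changing by 5% would be classified as product, while one changing by 25% (tracking the varied background lighting) would be classified as background.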
As the rate of change of pixels of the background is different than the rate of change of pixels of the product under shoot, pixels associated with the product or belonging to the background may be determined, and thus, separated from one another.
In the context of the apparatus and methods for imaging disclosed herein, around 100 frames or images with different background light intensities (e.g., increasing illumination in the background) may be generated for the purpose of differentiating the background from the object/product. For example, images may be generated starting from value 0, which is a completely dark background, to value 255, which is completely saturated. Around 100 images or more may be captured between values 0 to 255 for the background lighting. The images may be generated both with and without the object. For example, a pair of images, respectively with and without the object, may be generated with the background illuminated at the same intensity value. Subsequently, corresponding images with and without the product may be compared to each other, and the rate of change of each pixel throughout the frames may be determined. In other words, each pixel is compared across all the 100 or more frames to determine how much it has changed throughout the frames. The rates of change of background pixels and of object pixels are different, and, in this way, object pixels and background pixels may be differentiated. The background may be determined or identified and subsequently removed. Further, even when there are reflections on a product that is shiny or reflective, the rate of change of pixels associated with the reflection is different from that for the background.
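One plausible reduction of the "compare each pixel in all the frames" step is to accumulate the absolute frame-to-frame change of every pixel across the swept sequence: background pixels, which follow the 0-255 intensity sweep, accumulate a much larger total than object pixels lit by static key light. The grid representation and the simple accumulation below are assumptions for illustration.

```python
def per_pixel_total_change(frames):
    """frames: list of 2-D grids (lists of rows) of grayscale values,
    one grid per capture while the background intensity is swept.
    Returns a grid holding the total absolute frame-to-frame change for
    each pixel; a large total suggests a background pixel, a small total
    suggests an object pixel."""
    h, w = len(frames[0]), len(frames[0][0])
    change = [[0] * w for _ in range(h)]
    for prev, cur in zip(frames, frames[1:]):
        for y in range(h):
            for x in range(w):
                change[y][x] += abs(cur[y][x] - prev[y][x])
    return change
```

A threshold on the resulting totals would then separate object pixels from background pixels, as described in the flow of FIG. 4.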
As described, it is possible to perform the method without the product in the pixel machine (e.g., 201) and to determine the rate of change of pixels by varying the light intensity and/or colour with known values and/or pattern. After that, the product may be placed on a turn table (e.g., 231), the same lighting and/or colour intensity pattern may be run, and the frames with and without the product may be compared to find which pixel is associated with or belongs to the product and which pixel belongs to the background, so as to eliminate the background. Nevertheless, it should be appreciated that images without the product need not be generated. This may mean that, at 470, each pixel in the respective frames with the product may be compared to each other. Such respective frames may be taken at different lighting parameters or conditions (e.g., different colour and/or intensity values). Further, this may mean that, at 472, a change or difference between the pixel values of a pixel in a frame with the object (PixelA’) and a pixel in a frame (generated under a different lighting condition) with the object (PixelB’) may be determined. Therefore, comparison of pixels using frames with and without the object, or using frames with the object under different lighting conditions, may be possible. Both techniques work, as the rates of change of pixel values corresponding to the product and to the background are different. Further, if key light is provided facing the product directly, the variation in the key light on the product has less effect on the background lighting intensity, and this may widen the difference between the rate of change of background pixels and that of product pixels even further.
It has been found that good results may be obtained even where a prior reference without the product on the table is not available, as comparison on the basis of frames with a change in a lighting parameter (e.g., light intensity) may assist in the determination or identification of product pixels and background pixels, for the purposes of removing the background pixels. Such a technique makes the process faster and more reliable.
In the context of the methods for imaging disclosed herein, the images may be generated via two modes: photography mode and video (or movie) mode. In movie mode, the imaging device (e.g., camera) 210 may capture frames at a rate of 60 frames per second (the rate may be less or more depending on device speed), and the light intensity and/or colours may be changed during the making of the video. In all operation modes, the auto-iris mode of the camera would be turned off so that the effect of the rate of change of light in the background and on the product may be captured. The timing of the frames may be marked in the software application, and the frames may be extracted and used to make a final or resultant 360° still photograph.
In the context of the methods for imaging disclosed herein, individual LEDs may be controlled so that there may be full control over lighting. Different lighting colours may be provided. In one example, using only red light (red LED) in the background and blue light on the product with known intensity, the difference between product pixels and background pixels may widen even further. However, it may be challenging if one or more parts of the product are red, as these may then mix in with the background. This effect may be addressed or eliminated by alternating the background and product lighting colours, i.e., using blue light in the background and red light on the product in the above example. This may further help in finding and eliminating the common colour mixing pixels. Any or all combinations of RGBW colours in the background and on the product may be employed in the methods.
As a further technique, an image of the product in a white colour background may be generated, and images of the product with different red, green and blue (RGB) backgrounds may be further generated. The mixing and common pixels (e.g., pixels having colour substantially similar to the background) may first be determined (or identified) and then subtracted, which may result in separation of background and product pixels. Further, in the technique, the intensity of the lighting for providing the colour background(s) may be varied during generation of the images. As non-limiting examples, a pixel may have an (inherent) colour, where each colour component (RGB) of the pixel may have a certain (or original or inherent) colour value (e.g., corresponding to the case of the white colour background). When lighting is supplied to provide a colour background, the colour value of one or more of the colour components may change to a different value. When the intensity of the lighting provided for the background is changed, the colour value of one or more of the colour components may also change to a different value. For an object pixel, the change(s) in the colour value(s) of the colour component(s) is generally greater compared to that for a background pixel. Such a difference in the change(s) may be used to determine or identify object pixels. For example, a threshold value may be set and a pixel may be determined as an object pixel if the change in the colour value corresponding to one or more colour components is at or greater than the threshold value. This approach may be helpful for object pixels located at or adjacent to the edge of the object so as to be able to distinguish the border or edge of the object from the background. This technique, and the technique described above relating to alternating the background and product lighting colours, may produce substantially similar results.
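The common-pixel determination above might be sketched as follows: a pixel that tracks the background colour in every generated image (white, red, green and blue backgrounds) is treated as a mixing/common pixel and subtracted, while a product pixel stands out from the background in at least one image. The data layout and the per-channel tolerance are illustrative assumptions.

```python
def tracks_background(samples, tolerance=20):
    """samples: list of (observed_pixel, background_colour) pairs, one per
    generated image (e.g., white, red, green and blue backgrounds), each
    an (R, G, B) tuple in 0-255.  Returns True if the pixel stays within
    a per-channel tolerance of the background colour in every image,
    i.e., it is a common/mixing pixel to be subtracted; the tolerance is
    a hypothetical margin."""
    for observed, background in samples:
        if any(abs(o - b) > tolerance for o, b in zip(observed, background)):
            return False  # stands out from at least one background -> product
    return True
```

For example, a white product pixel matches the white background but not the red one, so it survives the subtraction and is kept as a product pixel.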
FIG. 5 shows a flow chart 560 illustrating a method for imaging, according to various embodiments. At 564, a product may be placed in the apparatus (e.g., 100) of various embodiments
(e.g., in the pixel machine 201). The button 344 may then be actuated/pressed to perform product photography. At 566, the RGB LEDs providing background lighting light up with varying colour and/or intensity values. Command is sent to the imaging device (e.g., 210) to generate images. Respective images may be generated with and without the product. At 570 and 572, each pixel in the respective frames with and without the product may be compared to each other. Such respective frames may be taken at the same lighting condition (e.g., same colour and/or intensity value). As illustrated at 572, a change or difference between the pixel values of a pixel in a frame with the object (PixelA) and a pixel in a frame without the object (PixelB) may be determined and expressed in percentage. At 574, the percentage change determined at 572 may be compared against a threshold value. For example, the condition may be set as to whether the percentage change is less than the threshold value (percentage < threshold). If the result “percentage < threshold” is determined, the pixel is determined or identified as being associated with the product at 576, 586. Otherwise, the pixel is determined or identified as belonging to the background at 578. It should be appreciated that similar or like actions or features described in the context of the flow chart 460 of FIG. 4 may correspondingly be applicable here, and hence are omitted for brevity.
Further, as illustrated at 578, it may be possible that, due to the colour of one or more portions of the product being substantially similar to the colour of the background, one or more pixels defining such portion(s) of the product may be mixed into the background in one or more images that are generated. At 580, the colour and/or intensity of the background lighting may be varied. Accordingly, by such changes, the colour of the product or portion(s) thereof may be different to the background colour at one or more times for images to be generated. As such, product pixels may be determined, for example, using a similar (percentage < threshold) condition.
At 582, pixels that are consistently in the background may be determined. If a pixel is not in the background in any one image, the pixel may be determined as a product pixel.
At 584, for each pixel in each frame or image, a comparison is made on the condition that if the pixel colour is not equal to (represented as “!=”) the background colour, the pixel is determined to be not a background pixel, and, therefore, may be determined as a product pixel at 586.
The method described in the context of flow chart 560 may be employed when faced with the challenge where one or more parts of the product under shoot contains the same or similar colour as the colour of the background, which in most cases, is a white background. As described, in order to address this challenge, the background colour may be changed via the use of RGBW LEDs. As the rate of change of known pixel values of the background and the product may be determined, product pixels and background pixels may be differentiated from each other.
Elimination of shadow is always a challenge in any photography. Using the apparatus of various embodiments, the lighting may be rotated in a known pattern so that the shadow rotates as well; as the product pixels do not rotate, pixels belonging to the product and pixels belonging to the shadow of the product (e.g., pixels belonging to the bottom of the product) may be determined. Different LEDs or LED panels may be turned on at different intervals to light up the product from different angles so as to rotate the lighting in a known pattern. In this way, the shadow “moves” as the source of lighting moves. A shadow is always opposite to the light, and shadows are monochrome, mostly shaded in black. Turning on the light where shadows are found thus further helps, as the shadow disappears and appears on the opposite side behind the product.
FIG. 6 shows a flow chart 660 illustrating a method for imaging, according to various embodiments, as an example to remove shadow. At 664, a product may be placed in the apparatus (e.g., 100) of various embodiments (e.g., in the pixel machine 201). The button 344 may then be actuated/pressed to perform product photography. At 666, the RGB LEDs providing background lighting light up with varying colour and/or intensity values. Command is sent to the imaging device (e.g., 210) to generate images. Respective images may be generated with and without the product. At 670, each pixel in the respective frames with and without the product may be compared to each other. Such respective frames may be taken with the same lighting condition (e.g., same colour and/or intensity value). While not shown, it should be appreciated that the actions or features described in the context of 472 (FIG. 4) may correspondingly be applicable here. At 674, the change determined from the comparison carried out may be compared against a threshold value. For example, the condition may be set as to whether the change is less than the threshold value (change < threshold). If the result “change < threshold” is determined, the pixel is determined as being associated with the product at 676. Otherwise, the pixel is determined as belonging to the background at 678. It should be appreciated that similar or like actions or features described in the context of the flow chart 460 of FIG. 4 may correspondingly be applicable here, and hence are omitted for brevity.
Further, as illustrated at 676, it may be possible that there are pixels belonging to the product and pixels belonging to the shadow. At 680, the lighting may be rotated relative to the product and frames may be generated with displaced shadow. Such frames may be compared to the product pixels with the shadow as determined at 676. At 682, the change determined from the comparison carried out may be compared against a threshold value. For example, the condition may be set as to whether the change is less than the threshold value (change < threshold). If the result “change < threshold” is determined, the pixel is determined as belonging to the product at 684. Otherwise, the pixel is determined as belonging to the shadow at 686 and the shadow pixel may be removed or discarded. It should be appreciated that the respective threshold values at 674 and 682 may be the same or may be different. As an example, the lighting may be provided from the top of the product and the direction of the lighting above the product may be changed or rotated. In this way, shadows that are formed are not uniform in all images. Therefore, when the rate of change for all the frames is compared, shadows may be determined and, if desired, removed.
In further detail, lighting may be rotated to provide the lighting from different angles, e.g., from the top, middle and bottom (but not limited to 3 positions only) of the object, or to provide top lighting, side lighting and back lighting, etc. The shadow changes its position or direction every time the direction or position of the lighting is changed. Then, pixels belonging to the shadow may be defined by comparing all the frames generated. Some shadows may be removed and some may be maintained to add detail to the object. Therefore, after capturing all the shadows, user input may be obtained to either keep a particular shadow or remove it. Further, there may be shadows that fall on the object itself due to the presence of different shapes in the object, and such shadows may be maintained or removed.
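The shadow test across rotated lighting positions might be sketched as follows: among pixels already separated from the background, those whose intensity swings widely between lighting positions move with the light and are flagged as shadow, while product pixels stay roughly constant. The spread-based criterion and the threshold value are illustrative assumptions.

```python
def find_shadow_pixels(frames, threshold=30):
    """frames: list of 2-D grayscale grids, one per lighting position as
    the light is rotated around the object (background pixels assumed
    already removed).  A pixel whose value spread (max - min) over the
    lighting positions exceeds the threshold is treated as shadow;
    product pixels remain stable across positions.  Returns the set of
    (x, y) shadow coordinates."""
    h, w = len(frames[0]), len(frames[0][0])
    shadow = set()
    for y in range(h):
        for x in range(w):
            values = [f[y][x] for f in frames]
            if max(values) - min(values) > threshold:
                shadow.add((x, y))
    return shadow
```

The flagged shadow pixels could then be removed or kept, per the user-input step described above.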
Additionally, or alternatively, in order to eliminate the shadow, it may be possible to light up the product from the bottom, for example, using LEDs at the bottom (e.g., using the bottom LED panel 228) while the product is lying on a transparent glass turn table (e.g., 231). As a result, the shadow is produced on top, where it is not part of the product frame, so that pixels belonging to the product may be clearly identified or determined.
FIG. 7 shows a flow chart 760 illustrating a method for imaging, according to various embodiments, as an example to remove shadow by illuminating the product from the bottom of the product. At 764, a product may be placed in the apparatus (e.g., 100) of various embodiments (e.g., in the pixel machine 201). The button 344 may then be actuated/pressed to perform product photography. At 766, the RGB LEDs providing background lighting light up with varying colour and/or intensity values. Command is sent to the imaging device (e.g., 210) to generate images. Respective images may be generated with and without the product. At 770, each pixel in the respective frames with and without the product may be compared to each other. Such respective frames may be taken with the same lighting condition (e.g., same colour and/or intensity value). While not shown, it should be appreciated that the actions or features described in the context of 472 (FIG. 4) may correspondingly be applicable here. At 774, the change determined from the comparison carried out may be compared against a threshold value. For example, the condition may be set as to whether the change is less than the threshold value (change < threshold). If the result “change < threshold” is determined, the pixel is determined as being associated with the product at 776. Otherwise, the pixel is determined as belonging to the background at 778. It should be appreciated that similar or like actions or features described in the context of the flow chart 460 of FIG. 4 may correspondingly be applicable here, and hence are omitted for brevity.
Further, as illustrated at 776, it may be possible that there are pixels belonging to the product and pixels belonging to the shadow. At 780, lighting may be provided from beneath the product, through the turn table (e.g., 231) which may be transparent and frames may be generated. Such frames may be compared to the product pixels with the shadow as determined at 776. At 782, the change determined from the comparison carried out may be compared against a threshold value. For example, the condition may be set as to whether the change is less than the threshold value (change < threshold). If the result“change < threshold” is determined, the pixel is determined as belonging to the product at 784. Otherwise, the pixel is determined as belonging to the shadow at 786 and the shadow pixel may be removed or discarded. It should be appreciated that the respective threshold values at 774 and 782 may be the same or may be different.
However, illuminating the product and the turn table from below for the purpose of determining or identifying shadow pixels may be challenging in some situations as shadows may fall on the object itself, which may not disappear when illuminating from below. In such situations, rotation of the lighting above the object may help with identification of the shadows.
As described above, various methods for imaging may work with shadow, and then remove the shadow, if desired. In other words, various methods for imaging may create shadow, for the purpose of removing shadow. Nevertheless, the shadow pixels may also be maintained, if desired. In contrast, known methods may employ lighting in a way that eliminates shadow so that shadows are not captured in the images that are generated.
The various methods for imaging may be used for making 360° photography of product by rotation where each frame has clear background elimination.
In the context of the apparatus and various methods for imaging, as pixels associated with or belonging to the object may be detected, it may help a user to place the object in the centre of the turn table to have a better result in 360° photography. To achieve this, the object may be placed anywhere on the turn table and rotated, and the object pixels may be determined or identified. The image that is desired or best to put in front of the imaging device may be determined, and such image with the product may be shown on a display device in half tone together with a half tone of the live video, which may provide a live view that helps the user to match the location of the object to the image on screen by overlaying both images. As a non-limiting example, the product may first be detected where it is in the frame. The centre or middle of the product may be determined by dividing the X and Y values by 2. A centre line of the product may be drawn. The centre line may be thick, as all the middle pixels may not lie on the same X and Y lines, so the width (threshold of centre) of the product may increase, but may in no way have a value outside the product boundary. The difference between the current location and what it should be if the product were placed in the centre of the turn table may be determined, and a half tone image may be shown to the user so that, where necessary, the user may act to move the product to the centre of the turn table to have a good 360° rotation view.
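The centring step (divide the X and Y values by 2, then compute the difference from the turn-table centre) might look like the following; the bounding-box midpoint and the offset convention are assumptions for illustration.

```python
def centring_offset(object_pixels, frame_width, frame_height):
    """object_pixels: iterable of (x, y) coordinates identified as
    product pixels.  The product centre is taken as the midpoint of its
    bounding box (the 'divide X and Y values by 2' step), and the
    returned (dx, dy) is the shift that would bring that centre to the
    frame centre -- the difference shown to the user via the half-tone
    overlay."""
    xs = [x for x, _ in object_pixels]
    ys = [y for _, y in object_pixels]
    centre_x = (min(xs) + max(xs)) // 2
    centre_y = (min(ys) + max(ys)) // 2
    return (frame_width // 2 - centre_x, frame_height // 2 - centre_y)
```

An offset of (0, 0) would indicate the product is already centred for 360° rotation.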
In the context of the apparatus and various methods for imaging, each light source (e.g., LED) may be controlled through a touch screen monitor or by one or more mouse clicks, which helps the user to create different lighting effects. LEDs may be driven by a DC (direct current) controller to minimise or eliminate the lighting non-sync locking effect. The use of a DC controller may eliminate the flicker caused by most low-cost LED driving arrangements, which drive the LEDs directly from the 50/60 hertz AC line so as to save cost. Each light source (e.g., LED) may be addressed and controlled independently. While the light sources (e.g., LEDs) receive commands from the processor, the associated drivers for the light sources are located in the pixel machine.
Capture of round edges of products in good resolution is a challenge. In the context of the apparatus and various methods for imaging, LED light control from all angles may help to light up the round edges from left, right, top and bottom of the product, and good contrast and resolution pixels are made part of the final frame.
In the context of the apparatus and various methods for imaging, various lighting sequences may be employed, either individually or in combination. A first lighting sequence may be based on feature or part detection within the object. After the detection of an object as a whole (i.e., determination or identification of object pixels), various features or parts of the object may be detected based on colour. Using LEDs as a non-limiting example, as lighting provided by the LEDs may be controlled (e.g., via processor or software application in the processor), it may be possible to determine the specific LEDs or LED panel illuminating a specific individual feature of the object. The intensity and/or colour of the light from said specific LEDs or LED panel, which illuminates said specific individual feature, may be varied to be substantially similar to the colour of the individual feature, which helps to further highlight the individual feature. For example, where the colour of the individual feature is yellow, the associated LED may be controlled to provide yellow-coloured lighting onto the individual feature so that said feature may be clearer and colour rich. Such an approach may also help in determining the saturation level for the pixels corresponding to said specific feature for determining the optimum intensity and/or colour value for said pixels when constructing the final resultant frame. Such an approach may be helpful for image capture of objects having reflective part(s), for example, shiny metal parts and glass products.
A second lighting sequence may be based on defined lighting patterns. Different lighting patterns may be provided, for example, round circle spot lighting, line lighting (of different widths or thicknesses), motion lighting (e.g., up, down, circular) to create different effects in photography; fade-in/fade-out effect, etc. in short movie mode as most digital cameras are capable of video recording at high resolution. Lighting patterns or effects may be applied in circumstances where the object is stationary or is being rotated 360° during the image capture.
As the user has control of flexibility in lighting from any or all angles, coupled with possibility of rotation of object(s), there is no limit on the lighting effects that may be created.
In the context of the apparatus and various methods for imaging, to get better colour pixels, all the colours in the object may be identified as separate objects, and lighting specific to such separate objects (e.g., having the same colour) may be provided. All the individual separate objects may then be patched up into a resultant frame to form the complete object with better colour pixels.
In the context of the various methods for imaging, the intensity of the lighting may be decreased, thereby minimising reflection. The lighting angle may be rotated, resulting in change in shadow position and in reflection effect. Such a lighting pattern may be employed for identification of different pixels, for example, object pixels, background pixels, shadow pixels.
In the context of the various methods for imaging, lighting may be provided to the sides of an object. As lighting may be provided from the front for a shiny object, the edges of the object may become mixed with the background. The front lighting may then be turned off and side lighting may be provided for images to be generated to capture the edges of the object without saturation. For example, lighting may be provided from the left and/or right sides of the object. Subsequently, the images generated with the front lighting and the side lighting may be combined to form the complete object.
In the context of the various methods for imaging, in order to have proper details and proper pixel colours, it may be possible to zoom in on the object parts and identify the pixel colours as details, and, then, when in a “zoom out” condition, a picture or image may be formed based on the pixel values obtained in the “zoom in” condition. In this way, even when the object is looked at from far with higher resolution, details and true colours are still there.
In the context of the various methods for imaging, when the imaging device is moved, and/or the object on the turn table is turned, the pixel values for the object pixels may change but the internal reflection still stays at the same position. So, when there is a movement, pixels inside the object which have not moved may be detected and determined (or identified) as reflections.
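This reflection test might be sketched as follows: after a known movement of the imaging device or turn table, object pixels whose values are unchanged within a tolerance are flagged as internal reflections. The grid layout and the tolerance value are illustrative assumptions.

```python
def find_static_reflections(before, after, object_mask, tolerance=5):
    """before, after: 2-D grayscale grids captured before and after the
    object (or imaging device) is moved.  object_mask: set of (x, y)
    coordinates previously identified as object pixels.  Object pixels
    change with the movement; pixels inside the object that stay put
    (within the tolerance) are treated as internal reflections."""
    reflections = set()
    for (x, y) in object_mask:
        if abs(before[y][x] - after[y][x]) <= tolerance:
            reflections.add((x, y))
    return reflections
```

Pixels flagged this way could then be handled separately from true object detail when constructing the final frame.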
In the context of the apparatus, a camera stand may be mounted on the box defining the pixel machine, which may help in the minimisation or elimination of vibration or small differences in the position of detection of pixels in different frames.
In the context of the apparatus, the imaging device (e.g., camera) may be mounted on an XYZ motorized stand which may not only automatically zoom but also adjust its physical location as per the product size and/or its placement on the turn table. The imaging device may be moved up and down, in a linear or curved motion. The imaging device may be moved in a curved motion to take a top view image of the object if desired. This may mean that the front-middle LED panel 204 may be movable together with the imaging device when the XYZ motorized stand is activated to control positioning of the imaging device. The front-middle LED panel 204 may be movable up and down and/or rotated in a curve. Additionally or alternatively, even if the imaging device is moved up and down with XY movement, a curved effect may be achieved using the software application.
In the context of the apparatus, as lighting may be provided from any or all angles, the lighting may be turned on in a specific or defined manner to find the desired or best lighting based on pixel resolution and/or saturation levels.
In the context of the apparatus and various methods for imaging, front, back, left and right views of the object may be automatically captured by rotation of the turn table. A user only needs to place the object facing the imaging device. The turn table may then rotate and adjust, for example, by 5 - 10°, to determine the location where the maximum width of the object is detected. This may help to further eliminate any placement errors by the user, unless the user wants a front view of his or her own choice. Once the best possible front position is detected, the turn table may rotate at any angle, which may be defined in a default setting of the software. For example, the turn table may rotate 90° to capture the left, back and right views of the object for the photoshoot.
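The maximum-width search described above can be sketched as scanning a small set of candidate turn-table angles, measuring the silhouette width at each, and picking the widest; the function names and the silhouette-mask representation are assumptions made for the sketch:

```python
import numpy as np

def object_width(mask):
    """Width in pixels of an object silhouette, i.e. the span of columns
    containing at least one object pixel."""
    cols = np.flatnonzero(np.asarray(mask, dtype=bool).any(axis=0))
    return 0 if cols.size == 0 else int(cols[-1] - cols[0] + 1)

def best_front_angle(masks_by_angle):
    """masks_by_angle: dict mapping candidate turn-table angle (degrees)
    to the object silhouette mask detected at that angle.
    Returns the angle at which the maximum object width is detected."""
    return max(masks_by_angle, key=lambda a: object_width(masks_by_angle[a]))
```

In practice the candidate angles would come from small turn-table adjustments (e.g., a few degrees either way), with the winning angle taken as the front view before the 90° rotations.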
In the context of the apparatus and various methods for imaging, imaging or photography, and elimination of background, may take less than a minute.
In the context of the apparatus and various methods for imaging, as the light intensity may be controlled and/or different colour combinations may be produced, the camera white balance may be adjusted based on the RGB colour intensity, which may help in capturing close to the natural colours of the object(s) or product(s).
In the context of the apparatus and various methods for imaging, a software application compatible with Windows® and Linux® operating systems may be provided. After the single button 344 is pressed by a user, the object on the turn table may be automatically determined, the camera may be adjusted to its best location, the lighting may be adjusted, object image(s) may be taken or generated, the background may be eliminated, the object may be centred in the middle of the frame, and both raw and compressed versions may be stored, for example, in the “My Pictures” folder in Windows®, on a flash drive connected to the processor, or in a network-addressed folder. Activating the button 344 triggers the automatic scripts written in the software application, which may include predefined template(s) to get most product photographs correct.
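As an illustration of the white-balance adjustment based on RGB colour intensity mentioned above, one common approach (a grey-world sketch, not necessarily the technique used by the apparatus) scales each colour channel so that its mean matches the overall mean:

```python
import numpy as np

def grey_world_white_balance(image):
    """Approximate a white-balance adjustment based on RGB intensity:
    scale each channel so its mean matches the overall mean intensity
    (grey-world assumption).  image: array of shape (h, w, 3)."""
    img = np.asarray(image, dtype=np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means  # per-channel gain
    return np.clip(img * gains, 0, 255).astype(np.uint8)
```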
In the context of the apparatus and various methods for imaging, an application programming interface (API) platform may be provided, which may be used to write a plugin for a web application supporting single-button operation, whereby product images in single, quad or 360° form, compressed and optimised for web resolution, may be uploaded directly to e-commerce websites with a single press of the button 344.
As described above, the various methods and techniques may include one or more of capturing frames with and without the object, using side lights by switching off front lights to identify shiny object edges, and removing shadow by rotating lights.
The various methods and techniques may be used in a studio, on stage, or in a film set with bigger objects to shoot, including fashion models, clothing and cars. Background lighting may be varied with known values, and pixels may be marked and determined (or identified) as belonging to the product or the background so that there is no necessity for manual editing, e.g., to remove the background.
It should be appreciated that the various methods for imaging may be implemented, either individually or in combination, in the apparatus for imaging of various embodiments.
While the invention has been particularly shown and described with reference to specific embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The scope of the invention is thus indicated by the appended claims and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced.

Claims

1. An apparatus for imaging comprising:
a lighting arrangement configured to provide lighting; and
a processor,
wherein the processor is configured to control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest, and to control the lighting arrangement to vary at least one parameter of the lighting during generation of the plurality of images, and
wherein the processor is further configured, for a respective pixel of pixels defining the at least one image, to determine a change in pixel value between the respective pixel and at least one pixel of the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
2. The apparatus as claimed in claim 1,
wherein the at least one pixel comprises at least one corresponding pixel of the respective pixel through the plurality of images, and
wherein the processor is further configured to determine the change in pixel value between the respective pixel and the at least one corresponding pixel, and to determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to the background.
3. The apparatus as claimed in claim 1 or 2,
wherein the lighting comprises a background lighting to illuminate the background, wherein the at least one parameter comprises at least one background parameter of the background lighting, and
wherein the processor is configured to control the lighting arrangement to vary the at least one background parameter during generation of the plurality of images.
4. The apparatus as claimed in claim 1 or 2,
wherein the processor is configured to vary the at least one parameter of the lighting through a plurality of values, and
wherein, for generating the plurality of images, the processor is configured to control the imaging device to generate one or more images comprising the at least one image at a respective value of the plurality of values.
5. The apparatus as claimed in claim 3,
wherein the processor is configured to vary the at least one background parameter of the background lighting through a plurality of values, and further configured to control the imaging device to generate a respective image of the plurality of images at a respective value of the plurality of values, wherein the respective image depicts the object of interest,
wherein the processor is further configured, for each respective image through the plurality of images, to determine a change in pixel value between a respective corresponding pixel at an edge region of the object of interest depicted in the respective image and a respective corresponding background pixel adjacent to the edge region, and
wherein the processor is further configured, from the changes in pixel value determined for the respective corresponding pixel through the plurality of images, to determine the respective corresponding pixel as a pixel associated with the object of interest or a pixel belonging to the background.
6. The apparatus as claimed in claim 1 or 2,
wherein the processor is further configured to determine, from pixels determined as associated with the object of interest, respective pixels having respective desired pixel values,
wherein the processor is further configured to control the lighting arrangement to vary the at least one parameter to adapt the lighting according to the respective desired pixel values determined, and
wherein the processor is further configured to control the imaging device to generate, based on the lighting adapted, a resultant image depicting the object of interest.
7. An apparatus for imaging comprising:
a processor; and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor to:
control an imaging device to generate a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest;
control a lighting arrangement to vary at least one parameter of lighting provided by the lighting arrangement during generation of the plurality of images;
determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images; and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
8. An apparatus for imaging comprising:
a processor; and
a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor, for a plurality of images, wherein at least one image of the plurality of images depicts an object of interest, and wherein the plurality of images are generated at different values of at least one parameter of lighting, to:
determine, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images; and determine, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
9. A method for imaging comprising:
generating a plurality of images, wherein at least one image of the plurality of images generated depicts an object of interest;
varying at least one parameter of lighting during generation of the plurality of images;
determining, for a respective pixel of pixels defining the at least one image, a change in pixel value between the respective pixel and at least one pixel of the plurality of images; and
determining, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
10. The method as claimed in claim 9,
wherein the at least one pixel comprises at least one corresponding pixel of the respective pixel through the plurality of images,
wherein determining a change in pixel value comprises determining the change in pixel value between the respective pixel and the at least one corresponding pixel, and
wherein determining the respective pixel comprises determining, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to the background.
11. The method as claimed in claim 9 or 10,
wherein the lighting comprises a background lighting to illuminate the background, and wherein varying at least one parameter comprises varying at least one background parameter of the background lighting during generation of the plurality of images.
12. The method as claimed in claim 9 or 10,
wherein varying at least one parameter of the lighting comprises varying the at least one parameter of the lighting through a plurality of values, and
wherein generating a plurality of images comprises generating one or more images comprising the at least one image at a respective value of the plurality of values.
13. The method as claimed in claim 12,
wherein generating one or more images of the plurality of images at a respective value comprises generating first and second images depicting the background with and without the object of interest respectively,
wherein determining a change in pixel value comprises determining, for each respective pixel of the pixels defining the first image, a change in pixel value between the respective pixel and a corresponding pixel of the second image, and
wherein determining the respective pixel comprises determining, based on the change in pixel value determined between the respective pixel of the first image and the corresponding pixel of the second image, the respective pixel of the first image as a pixel associated with the object of interest or a pixel belonging to the background.
14. The method as claimed in claim 11,
wherein varying at least one background parameter of the background lighting comprises varying the at least one background parameter through a plurality of values,
wherein generating a plurality of images comprises generating a respective image of the plurality of images at a respective value of the plurality of values, wherein the respective image depicts the object of interest,
wherein determining a change in pixel value comprises determining, for each respective image through the plurality of images, a change in pixel value between a respective corresponding pixel at an edge region of the object of interest depicted in the respective image and a respective corresponding background pixel adjacent to the edge region, and
wherein determining the respective pixel comprises determining, from the changes in pixel value determined for the respective corresponding pixel through the plurality of images, the respective corresponding pixel as a pixel associated with the object of interest or a pixel belonging to the background.
15. The method as claimed in claim 14, further comprising:
determining, from the changes in pixel value, a desired change in pixel value for the respective corresponding pixel determined as being associated with the object of interest, and
generating a resultant image depicting the object of interest comprising the determined respective corresponding pixel at the respective value corresponding to the desired change in pixel value.
16. The method as claimed in claim 9 or 10, further comprising:
determining, from pixels determined as being associated with the object of interest, respective pixels having respective desired pixel values,
varying the at least one parameter to adapt the lighting according to the respective desired pixel values determined, and
generating, based on the lighting adapted, a resultant image depicting the object of interest.
17. A method for imaging comprising:
for a respective pixel of pixels defining at least one image of a plurality of images, wherein the at least one image depicts an object of interest, and wherein the plurality of images are generated at different values of at least one parameter of lighting, determining a change in pixel value between the respective pixel and at least one pixel of the plurality of images; and
determining, based on the change in pixel value determined, the respective pixel as a pixel associated with the object of interest or a pixel belonging to a background.
18. A computer program or a computer program product comprising instructions which, when executed by a computing device, cause the computing device to carry out a method as claimed in claim 9 or 10.
19. A computer program or a computer program product comprising instructions which, when executed by a computing device, cause the computing device to carry out a method as claimed in claim 17.
PCT/MY2019/000027 2018-07-31 2019-07-24 Apparatus and method for imaging WO2020027645A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI2018702672 2018-07-31
MYPI2018702672 2018-07-31

Publications (2)

Publication Number Publication Date
WO2020027645A2 true WO2020027645A2 (en) 2020-02-06
WO2020027645A3 WO2020027645A3 (en) 2020-03-05

Family

ID=69231213

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2019/000027 WO2020027645A2 (en) 2018-07-31 2019-07-24 Apparatus and method for imaging

Country Status (1)

Country Link
WO (1) WO2020027645A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4258639A1 (en) * 2022-04-08 2023-10-11 Arnold & Richter Cine Technik GmbH & Co. Betriebs KG Background reproducing device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2009272418A1 (en) * 2008-07-14 2010-01-21 Holicom Film Limited Method and system for filming
KR101252671B1 (en) * 2011-04-29 2013-04-09 주식회사 디아이랩 Vehicle imaging system and method of controlling intensity of radiation
KR101283079B1 (en) * 2011-08-17 2013-07-05 엘지이노텍 주식회사 Network camera having infrared light emitting diode illumination
US10455137B2 (en) * 2014-07-28 2019-10-22 Orbotech Ltd. Auto-focus system
KR102263537B1 (en) * 2014-09-30 2021-06-11 삼성전자주식회사 Electronic device and control method of the same

Also Published As

Publication number Publication date
WO2020027645A3 (en) 2020-03-05

Similar Documents

Publication Publication Date Title
US7224360B2 (en) Image display method, image processing method, image processing apparatus, scan capturing apparatus and image signal generation method
US7340094B2 (en) Image segmentation by means of temporal parallax difference induction
JP4301656B2 (en) Color normalization method for surface image and system suitable therefor
US7724952B2 (en) Object matting using flash and no-flash images
US7974467B2 (en) Image-taking system
EP1889471B1 (en) Method and apparatus for alternate image/video insertion
US20160014316A1 (en) Photography method using projecting light source and a photography element thereof
JP6286481B2 (en) Writing system and method for object enhancement
CN102572211A (en) Method and apparatus for estimating light source
JP2003271971A (en) Method for real-time discrimination and compensation for illuminance change in digital color image signal
TWI568260B (en) Image projection and capture with simultaneous display of led light
US10594995B2 (en) Image capture and display on a dome for chroma keying
WO2020027647A1 (en) Apparatus and method for imaging
WO2020027645A2 (en) Apparatus and method for imaging
US20170230561A1 (en) Image projection and capture with adjustment for white point
WO2020027646A2 (en) Apparatus and method for imaging
WO2020027648A1 (en) Apparatus and method for imaging
CN111034365A (en) Apparatus and method for irradiating object
JP2013026656A (en) Photographing device, photographing method, and photographing program
JPH1093859A (en) Image processing unit and image processing method
US20240171698A1 (en) Video capture
US20210185242A1 (en) Extraction of subject from background
US20080063275A1 (en) Image segmentation by means of temporal parallax difference induction
JP2023531605A (en) Halo correction in digital images and device for performing such correction
Teixeira et al. Repositioning the salient region of videos by using active illumination

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19843290

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19843290

Country of ref document: EP

Kind code of ref document: A2