WO2020027648A1 - Apparatus and method for imaging - Google Patents

Apparatus and method for imaging

Info

Publication number
WO2020027648A1
WO2020027648A1 PCT/MY2019/000030
Authority
WO
WIPO (PCT)
Prior art keywords
images
pixel
interest
processor
lighting
Prior art date
Application number
PCT/MY2019/000030
Other languages
English (en)
Inventor
Khurram Hamid KHOKHAR
Original Assignee
Kulim Technology Ideas Sdn Bhd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kulim Technology Ideas Sdn Bhd filed Critical Kulim Technology Ideas Sdn Bhd
Publication of WO2020027648A1 publication Critical patent/WO2020027648A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means

Definitions

  • Various embodiments relate to an apparatus for imaging and a method for imaging.
  • Photographing of products normally requires at least a few hours for professional photographers to adjust the lights to strike a balance between the lighting of the product and the background colour.
  • If the product is reflective or multicoloured, and/or some parts of the product have a colour matching or similar to that of the background, the photographer has to make compromises in the lighting and in the quality of the product shoot.
  • an apparatus for imaging may include a lighting arrangement configured to provide lighting, and a processor, wherein the processor is configured to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, and to vary a relative rotational movement between the imaging device and the object during generation of the plurality of images, and wherein the processor is further configured, for a respective pixel of pixels defining an image of the plurality of images, to determine a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object or a background.
  • an apparatus for imaging is provided.
  • the apparatus may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor to: control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, vary a relative rotational movement between the imaging device and the object during generation of the plurality of images, determine, for a respective pixel of pixels defining an image of the plurality of images, a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object or a background.
  • an apparatus for imaging may include a processor, and a memory coupled to the processor, the memory having stored therein instructions, which when executed by the processor, cause the processor, for a plurality of images depicting an object of interest from a plurality of directions, wherein each of the plurality of images depicts the object of interest, to: determine, for a respective pixel of pixels defining an image of the plurality of images, a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object of interest or a background.
  • a method for imaging may include generating, via an imaging device, a plurality of images, each of the plurality of images depicting an object of interest, varying a relative rotational movement between the imaging device and the object during generation of the plurality of images, determining, for a respective pixel of pixels defining an image of the plurality of images, a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and determining, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object or a background.
  • a method for imaging may include, for a respective pixel of pixels defining an image of a plurality of images, the plurality of images depicting an object of interest from a plurality of directions, wherein each of the plurality of images depicts the object of interest, determining a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and determining, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object of interest or a background.
  • a computer program or a computer program product may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
  • FIG. 1A shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
  • FIG. 1B shows a schematic block diagram of an apparatus for imaging, according to various embodiments.
  • FIG. 1C shows a flow chart illustrating a method for imaging, according to various embodiments.
  • FIG. 1D shows a flow chart illustrating a method for imaging, according to various embodiments.
  • FIGS. 2A to 2E show schematic perspective views of a pixel machine of various embodiments from different angles.
  • FIG. 3 shows a schematic back view of a main controller, according to various embodiments.
  • FIG. 4 shows a flow chart illustrating a method for imaging, according to various embodiments.
  • Various embodiments may provide an imaging apparatus, for example, a product photography apparatus, e.g., automatic photography equipment for products.
  • the apparatus may minimise the effort and labour associated with photography of products, for example, for the fast-expanding online shopping business.
  • Various embodiments may also provide the corresponding methods for imaging.
  • One or more of the following may be achieved: (1) detection of pixels belonging to object(s) for automatic background elimination (e.g., background cut); (2) detection of saturated pixel(s) to automatically eliminate reflections; (3) detection of, and elimination or maintenance of product shadows; (4) automatic centering of the object(s); (5) elimination of background and shadows of rotating object(s); (6) providing uniform exposure and colour for all the pixels.
  • FIG. 1A shows a schematic block diagram of an apparatus 100 for imaging, according to various embodiments.
  • the apparatus 100 includes a lighting arrangement 102 configured to provide lighting, and a processor 104, wherein the processor 104 is configured to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, and to vary a relative rotational movement between the imaging device and the object during generation of the plurality of images, and wherein the processor is further configured, for a respective pixel of pixels defining an image of the plurality of images, to determine a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object or a background.
  • an apparatus 100 for imaging may be provided, having a lighting arrangement 102 and a processor 104.
  • the apparatus (or arrangement) 100 may be employed to image or take images of an object of interest that is the subject to be imaged.
  • the object may be positioned against a background.
  • a reference to a “background” may include a reference to the background relative to the object (i.e., in the presence of the object in the foreground).
  • the processor 104 may communicate with the lighting arrangement 102, for example, via a channel represented by the line 106.
  • the processor 104 and the lighting arrangement 102 may be electrically coupled to one another, e.g., via a physical cable 106.
  • the processor 104 may send one or more control signals to the lighting arrangement 102.
  • the lighting arrangement 102 may provide lighting to illuminate the object and/or the background. This may mean that the object and/or the background may be illuminated simultaneously or separately by the lighting.
  • the lighting arrangement 102 may partially or entirely surround the object.
  • the lighting arrangement 102 may be arranged in the form of a closed box environment, and the object may be placed within said environment for imaging purposes.
  • the various methods and techniques may be used in an open studio, on a stage, or in a film set, or any other suitable environments or settings.
  • the lighting may illuminate the object and/or the background from different directions towards the object and/or the background.
  • the processor 104 may control an (optical) imaging device (e.g., a (digital) camera capable of taking photographs and/or videos, or a (digital) video recorder) to generate a plurality of images (or frames).
  • the plurality of images may mean 10, 20, 50, 100 or more images.
  • the imaging device may be separately provided or integrally provided with the apparatus 100.
  • the processor 104 may control the imaging device to take a number of images showing the object of interest in the images. It should be appreciated that each of or all of the plurality of images may depict the object (e.g., against a background).
  • the plurality of images may be taken or obtained (directly) as still images (e.g., photographic images), and/or may be images extracted from a moving sequence of consecutive graphics (e.g., a moving picture, motion picture, movie).
  • the processor 104 may be further configured to vary a relative rotational movement between the imaging device and the object during or for generation of the plurality of images. This may mean that, when generating the plurality of images, at least one of the object or the imaging device may be rotated relative to each other. In other words, the object may be rotated, or the imaging device may be rotated, or both may be rotated relative to one another. In various embodiments, the object may be rotated continuously or continually along one direction.
  • the imaging device may be placed on a support structure.
  • the support structure may be a movable support structure (e.g., an XYZ motorized stand), and the processor 104 may control movement (e.g., rotational movement) of the movable support structure, relative to the object.
  • the apparatus 100 may include a turn table to support the object, and the processor 104 may control movement (e.g., rotational movement) of the turn table, relative to the imaging device.
  • the object may be placed on the turn table to be rotated.
  • the turn table may be at least substantially transparent or transmissive to light.
  • for a respective pixel of the image pixels defining each image generated, the processor 104 may determine a change in pixel value between corresponding pixels of the respective pixel through (or across) the plurality of images.
  • the change in pixel value may be a difference between respective pixel values of the corresponding pixels.
  • the change in the pixel value may be defined in terms of a percentage change.
  • the term “pixel value” may mean the intensity value of the pixel.
  • the term “corresponding pixel” for an image, in relation to the “respective pixel”, may mean a pixel corresponding to the same position or coordinate in each image of the plurality of images generated.
  • the determination of the change in pixel value may be carried out using corresponding pixels of two or more images of the plurality of images. All of the plurality of images may be used for this determination. Further, the determination of the change in pixel value may be carried out for each of all of the image pixels.
  • corresponding pixels may include pixels belonging to images generated immediately adjacent to each other.
  • the processor 104 may determine whether the respective pixel is a pixel belonging to the object (“object pixel”) or a background (relative to the object) (“background pixel”). In this way, object pixels and background pixels may be determined or identified and, thus, differentiated from each other. Accordingly, an object pixel may be distinguished from a non-object pixel e.g., a background pixel.
  • to “determine” the respective pixel as a pixel belonging to the object or a background may include to “identify” the respective pixel as a pixel belonging to the object or a background.
  • this may not necessarily mean to (positively) mark or tag the respective pixel as an object pixel or a background pixel, although this may be the case in various embodiments.
  • the identification of the respective pixel may mean the result, or the selection process, of the respective pixel being determined or inferred to be an object pixel or a background pixel based on the result of the determination of the change in pixel value.
  • the term “object pixel” means a pixel that defines, in an image, the physical part of the object.
  • the term “background pixel” means a pixel that defines, in an image, the background in the presence of the object.
  • the term “object” may refer to a thing, a product, a person, or a subject to be imaged. Further, it should be appreciated that the term “object” may include a living thing (e.g., person, animal, plant, etc.) and/or a non-living thing (e.g., a product, item, inanimate body or object, etc.).
  • a certain or predetermined threshold value may be set, and the change in the pixel value may be compared against the threshold value. For example, if the change in the pixel value is less than the threshold value, the respective pixel may be determined or identified as a background pixel; otherwise, the respective pixel may be determined or identified as an object pixel.
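As an illustration of this thresholding step, the following is a minimal Python/NumPy sketch, assuming a stack of grayscale frames taken while the object rotates; the array names, the peak-to-peak measure of change and the fixed default threshold are choices made for the example, not details taken from the description above.

```python
import numpy as np

def classify_pixels(frames: np.ndarray, threshold: float = 10.0) -> np.ndarray:
    """Label each pixel position as object (True) or background (False).

    frames    : stack of grayscale frames, shape (N, H, W), taken while the
                object rotates relative to the imaging device.
    threshold : minimum change in pixel value for a position to be treated
                as belonging to the (moving) object.
    """
    frames = frames.astype(np.float32)
    # Change in pixel value between corresponding pixels across the stack,
    # here measured as the peak-to-peak variation at each (row, col) position.
    change = frames.max(axis=0) - frames.min(axis=0)
    # Positions whose value barely changes while the object rotates are taken
    # as background; positions that change are taken as object pixels.
    return change >= threshold

# usage (hypothetical data): mask = classify_pixels(np.stack(grayscale_images))
```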
  • a first image may be generated at a first angle or degree (e.g., 0°), a second image at a second angle or degree (e.g., 5°), and then a third image back at the first angle or degree (e.g., 0°).
  • This provides three images, where the first and third images should be, or are expected to be, the same, as both are generated at the first angle.
  • such a technique may help in setting the threshold and eliminating any pixel shared between the background and the object of interest, and may further help to achieve a better result in defining the edges of the object of interest (a sketch of this calibration idea follows below).
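One possible reading of this calibration step is sketched below, under the assumption that the two frames taken at the same angle differ only by noise, so that their difference sets the noise floor for the threshold; the margin parameter is illustrative.

```python
import numpy as np

def estimate_threshold(frame_0deg_first: np.ndarray,
                       frame_0deg_second: np.ndarray,
                       margin: float = 3.0) -> float:
    """Derive a change threshold from two frames taken at the same angle.

    Both frames show the scene at the first angle (e.g., 0 degrees), so any
    pixel-value difference between them is noise rather than object motion;
    the threshold is set a safety margin above that noise level.
    """
    diff = np.abs(frame_0deg_first.astype(np.float32)
                  - frame_0deg_second.astype(np.float32))
    return float(diff.mean() + margin * diff.std())
```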
  • the processor 104 may control the lighting arrangement 102 to vary at least one parameter of the lighting during or for generation of the plurality of images.
  • the at least one parameter may be varied in between generation of two images, or of two immediately adjacent images. This may mean that the at least one parameter of the lighting may be different for each of the images generated.
  • the at least one parameter of the lighting may include any one of or any combination of a lighting intensity, a lighting colour, or a lighting direction.
  • the lighting may include an object lighting to illuminate the object.
  • the processor 104 may be configured to control the lighting arrangement 102 to vary at least one object parameter of the object lighting during or for generation of the plurality of images.
  • background lighting may be provided to illuminate the background.
  • the term “background lighting” may mean the lighting that illuminates the background, and/or the space in between the background and the object of interest, without illuminating the object. This means that the object of interest to be imaged is not illuminated by the background lighting, i.e., lighting provided from one or more light sources, to be employed as the background lighting to illuminate the background, does not illuminate the object.
  • the background lighting may be constant (i.e., not varied), or at least one background parameter of the background lighting may be varied.
  • the at least one object parameter and/or the at least one background parameter may be varied simultaneously or may be varied by the same factor or value.
  • the at least one object parameter and/or the at least one background parameter may correspond to intensity and/or colour.
  • the imaging device (e.g., camera) may be adjusted to low(er) sensitivity so that light scattered from the background, which may potentially illuminate the object, may not be captured or be observable in one or more images of the plurality of images generated.
  • the lighting arrangement 102 may include a plurality of light sources to provide the lighting.
  • Each light source may be or may include a light emitting diode or device (LED).
  • the plurality of light sources may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs.
  • RGBW LEDs may provide a pure white colour.
  • any other types of light sources may be used, as long as the light sources may be controlled to vary at least one parameter (e.g., light intensity) of the lighting provided by the light sources.
  • Each light source (e.g., LED) may be individually addressable. This may mean that each light source may be independently switched on/off, and controlled to vary the at least one parameter of the lighting.
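To illustrate what individually addressable RGBW light sources could look like in software, the sketch below models per-LED channel state only; the class and method names are hypothetical and do not correspond to any interface specified in the description.

```python
from dataclasses import dataclass

@dataclass
class Led:
    # Per-channel drive levels (0.0-1.0) for a red/green/blue/white emitter.
    r: float = 0.0
    g: float = 0.0
    b: float = 0.0
    w: float = 0.0

class LightingPanel:
    """Hypothetical model of one LED panel with individually addressable LEDs."""

    def __init__(self, rows: int, cols: int):
        self.leds = [Led() for _ in range(rows * cols)]

    def set_led(self, index: int, r: float, g: float, b: float, w: float) -> None:
        # Address a single LED independently of the others.
        self.leds[index] = Led(r, g, b, w)

    def set_all(self, r: float, g: float, b: float, w: float) -> None:
        # Drive the whole panel as one light source.
        for i in range(len(self.leds)):
            self.leds[i] = Led(r, g, b, w)

# Example: dim a key-light panel to half-intensity pure white between two frames,
# then push one LED slightly brighter to fill a dark spot (index is illustrative).
panel = LightingPanel(rows=8, cols=8)
panel.set_all(r=0.0, g=0.0, b=0.0, w=0.5)
panel.set_led(0, r=0.0, g=0.0, b=0.0, w=0.8)
```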
  • the lighting arrangement 102 may include a plurality of (separate) lighting panels to provide lighting from a plurality of directions to the object and/or background.
  • Each lighting panel may include a plurality of light sources, for example, LEDs, e.g., a combination of red LEDs, green LEDs, blue LEDs, and white LEDs.
  • the lighting arrangement 102 may be or may form part of a pixel machine.
  • a driver arrangement may be provided (e.g., in the pixel machine) for driving the lighting arrangement 102.
  • the driver arrangement may have a plurality of drivers for driving the associated light sources of the lighting arrangement 102.
  • the processor 104 may be provided with one or more communication interfaces, e.g., including at least one interface for communication with the lighting arrangement 102 or the pixel machine.
  • the processor 104 may be or may form part of a (main) controller.
  • the (main) controller may further include a display and other peripheral devices, e.g., a keyboard, a mouse, etc.
  • the pixel machine and the (main) controller may be comprised in the apparatus 100.
  • the processor 104 may be configured to rotate the object relative to the imaging device about at least one of a vertical axis or a horizontal axis (associated with the object). The rotation may be in one direction, or back and forth in opposite or two directions, e.g., clockwise and/or anti-clockwise rotation.
  • the processor 104 may be configured to vary the relative rotational movement through a plurality of angles. During or for generating the plurality of images, the processor 104 may be configured to control the imaging device to generate a respective image of the plurality of images at a respective angle of the plurality of angles.
  • the plurality of angles may be different from one another. This may mean that the angle of rotation of the object and/or the imaging device may be different between two (immediately adjacent) images.
  • the plurality of angles may be within or may span a range from 0° to 360°. Adjacent (or immediately adjacent) respective angles may be spaced apart by 1°, 2°, 3°, 5°, 10° or any higher value.
  • the processor 104 may rotate the object over a plurality of angles at a plurality of intervals of 1°, 2°, 3°, 5°, 10° or any higher number, and the processor 104 may be configured to generate each image of the plurality of images at a respective interval.
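A minimal capture-loop sketch of this rotate-and-shoot sequence follows; the camera and turntable objects stand in for the imaging device and the turn table/stepper motor, and their methods (capture, rotate_to) are assumed interfaces rather than part of the described apparatus.

```python
def capture_rotation_sequence(camera, turntable, step_deg: float = 5.0):
    """Generate one image per angular interval over a full rotation.

    camera    : assumed to provide a capture() method returning an image.
    turntable : assumed to provide a rotate_to(angle_deg) method, e.g. driving
                a stepper motor under processor control.
    """
    images = []
    angle = 0.0
    while angle < 360.0:
        turntable.rotate_to(angle)        # move the object to the next angle
        images.append(camera.capture())   # one image per respective angle
        angle += step_deg
    return images
```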
  • the processor 104 may be further configured to remove or discard pixels determined or identified as belonging to the background. All pixels that are determined as background pixels may be removed or discarded.
  • the lighting arrangement 102 may include at least one curved lighting panel configured to provide a focused lighting towards the object.
  • the at least one curved lighting panel may include a plurality of light sources, e.g., LEDs. Four curved lighting panels may be provided.
  • the apparatus 100 may further include an actuator configured to communicate with the processor 104, wherein, in response to a single activation of the actuator, operation of the processor 104 may be (fully) automated.
  • the actuator may be a push actuator, e.g., a push button.
  • the processor 104 may be further configured to identify a flicker effect in the plurality of images generated and to remove the flicker effect. For example, when the imaging device is operated to take videos, a flicker effect may be captured in one or more images of the plurality of images due to a frequency difference between the respective operations of the imaging device and the lighting arrangement 102. Flicker in the images may be observed continuously. Rather than controlling the lighting to minimise the flicker effect, this may be achieved in the processing of the images. From the plurality of images generated, flicker which is regular and similar may be identified so that the flicker may eventually be removed.
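One simple way to realise such flicker removal in post-processing is to treat regular flicker as a frame-to-frame fluctuation of overall brightness and normalise it out, as sketched below; this is an illustrative approach, not necessarily the specific processing used here.

```python
import numpy as np

def remove_flicker(frames: np.ndarray) -> np.ndarray:
    """Suppress regular frame-to-frame brightness flicker in an image stack.

    frames : array of shape (N, H, W) or (N, H, W, C).
    Each frame is rescaled so that its mean brightness matches the mean
    brightness of the whole sequence, removing a global, regular flicker.
    """
    frames = frames.astype(np.float32)
    per_frame_mean = frames.reshape(frames.shape[0], -1).mean(axis=1)
    target = per_frame_mean.mean()
    gains = target / np.clip(per_frame_mean, 1e-6, None)
    return frames * gains.reshape((-1,) + (1,) * (frames.ndim - 1))
```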
  • the processor 104 may be further configured to generate a resultant (or final) image of the object based on (or from or using) the pixels determined as belonging to the object. In this way, a final image may be generated for the object on the basis of the desired pixel values determined for the object pixels.
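A short sketch of building the resultant image from the object pixels, assuming the boolean object mask produced by the earlier classification step and an RGBA output with a transparent background; the output format is an assumption of this example.

```python
import numpy as np

def compose_result(image: np.ndarray, object_mask: np.ndarray) -> np.ndarray:
    """Build an RGBA image that keeps only the pixels determined as object pixels.

    image       : uint8 RGB image of shape (H, W, 3).
    object_mask : boolean array of shape (H, W); True where the pixel belongs
                  to the object, False for background pixels (which become
                  fully transparent in the result).
    """
    alpha = object_mask.astype(np.uint8) * 255
    return np.dstack([image, alpha])
```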
  • FIG. 1B shows a schematic block diagram of an apparatus 100b for imaging (and/or for image processing), according to various embodiments.
  • the apparatus 100b includes a processor 104b, and a memory 105 coupled to the processor 104b.
  • the memory 105 and the processor 104b may be coupled to each other (as represented by the line 107), e.g., physically coupled and/or electrically coupled.
  • the memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, to vary a relative rotational movement between the imaging device and the object during (or for) generation of the plurality of images, to determine, for a (or each) respective pixel of pixels defining an image of the plurality of images, a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object or a background.
  • the plurality of images may be transferred to and/or stored in the memory 105 or another memory.
  • the processor 104b may execute the instructions to cause the processor 104b to vary the relative rotational movement through a plurality of angles, and to control the imaging device to generate a respective image of the plurality of images at a respective angle of the plurality of angles.
  • the memory 105 has stored therein instructions, which when executed by the processor 104b, cause the processor 104b, for a plurality of images depicting an object of interest from a plurality of directions, wherein each of the plurality of images depicts the object of interest, to determine, for a respective pixel of pixels defining an image of the plurality of images, a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and determine, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object of interest or a background.
  • the plurality of images may be transferred to and/or stored in the memory 105 or another memory. It should be appreciated that description in the context of the apparatus 100 may correspondingly be applicable to the apparatus 100b, and/or description in the context of the processor 104 may correspondingly be applicable to the processor 104b.
  • FIG. 1C shows a flow chart 120 illustrating a method for imaging (and/or for image processing), according to various embodiments.
  • a plurality of images are generated via (or using) an imaging device, each of the plurality of images depicting an object of interest.
  • a relative rotational movement between the imaging device and the object is varied (or changed) during (or for) generation of the plurality of images.
  • a change in pixel value between corresponding pixels of the respective pixel through the plurality of images is determined.
  • the respective pixel is determined as a pixel belonging to the object or a background
  • the method may include providing the lighting.
  • the object may be rotated relative to the imaging device about at least one of a vertical axis or a horizontal axis (associated with the object).
  • the relative rotational movement may be varied through a plurality of angles, and, at 122, a respective image of the plurality of images may be generated at a respective angle of the plurality of angles.
  • the method may further include removing or discarding pixels determined or identified as belonging to the background. All pixels that are determined as background pixels may be removed or discarded.
  • the method may further include identifying a flicker effect in the plurality of images generated and removing the flicker effect.
  • a resultant image of the object may be generated based on (or from or using) the pixels determined as belonging to the object.
  • At least one parameter of lighting provided to illuminate the object may be varied.
  • FIG. 1D shows a flow chart 130 illustrating a method for imaging (and/or for image processing), according to various embodiments.
  • a change in pixel value is determined between corresponding pixels of the respective pixel through the plurality of images.
  • the respective pixel is determined as a pixel belonging to the object of interest or a background.
  • the method may include providing the lighting. It should be appreciated that description in relation to the method described in the context of the flow chart 120 may correspondingly be applicable in relation to the method described in the context of the flow chart 130, and vice versa.
  • Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to carry out a method for imaging as described herein.
  • Various embodiments may also provide a computer program or a computer program product, which may include instructions which, when executed by a computing device, cause the computing device to control an imaging device to generate a plurality of images depicting an object of interest, to vary a relative rotational movement between the imaging device and the object during (or for) generation of the plurality of images, to determine, for a respective pixel of pixels defining an image of the plurality of images, a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object or a background.
  • the apparatus of various embodiments may include two parts: a pixel machine and a main controller, as will be described further below with reference to FIGS. 2A to 2E and 3.
  • FIGS. 2A to 2E show schematic perspective views of a pixel machine 201 of various embodiments. It should be appreciated that while one or more features or elements of the pixel machine 201 may not be shown in one or more of FIGS. 2A to 2E for clarity and easier understanding purposes, and/or to illustrate an internal environment of the pixel machine 201, such features/elements may nevertheless be part of the pixel machine 201 as may be apparent from FIGS. 2A to 2E. Further, while LEDs are described in the context of the pixel machine 201, it should be appreciated that other types of light sources may be employed, as long as at least one parameter (e.g., light intensity) of the lighting provided may be variable.
  • the pixel machine 201 may include a front-left LED panel 202, a front-middle LED panel 204, a front-right LED panel 206, a back side LED panel 207, a left side LED panel 214, a right side LED panel 216, four (key light) LED panels 220 which may be curved, a top LED panel 222, a top-back LED panel 224, a top-left LED panel 225, a top-right LED panel 226, a top-front (key light) LED panel 227, and a bottom LED panel 228, where one or more of these LED panels may define a lighting arrangement (e.g., 102, FIG. 1A).
  • the lighting arrangement may provide omnidirectional lighting, where the intensity of the lighting may be changed.
  • Each of the LED panels 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a plurality of LEDs (illustrated as squares in the panels and represented as 229 for some LEDs).
  • Each LED panel 202, 204, 206, 207, 214, 216, 220, 222, 224, 225, 226, 227, 228 may include a combination of red (R) LEDs, green (G) LEDs, blue (B) LEDs, and white (W) LEDs (RGBW LEDs), which may provide a pure white colour, as producing white from RGB in a closed box may be challenging.
  • Each LED panel may be individually or independently controlled.
  • Each LED 229 may be individually or independently addressable.
  • the lighting arrangement of the pixel machine 201 may provide three types of lights: key light, fill light and back light, thereby providing three-point lighting.
  • the lighting arrangement may also provide background lighting.
  • the four LED panels 220 in the corners, the top-back LED panel 224, the top-left LED panel 225, the top-right LED panel 226, and the top-front LED panel 227 may provide key light.
  • the left side LED panel 214 and the right side LED panel 216 may provide fill light.
  • the back side LED panel 207 may provide back light.
  • An imaging device (e.g., camera) 210, which may be supported on a support structure (e.g., camera stand) 212, may be provided for imaging through an aperture 208 defined in the front-middle LED panel 204.
  • the imaging device 210 may have a zoom feature, meaning that the imaging device 210 is capable of zooming-in and zooming-out operations for generation of images.
  • a turn table 231 may be provided to support one or more (sample) objects (e.g., represented as 230) that are to be imaged.
  • a stepper motor 232 may be provided for rotation of the object(s) 230 or turn table 231.
  • the lighting arrangement of the pixel machine 201 may provide lighting to light up an object (or product) 230 at least substantially equally from all angles, so that one or more or all features of the object(s) 230 may have good resolution, contrast and colours.
  • the four side LED panels 220 (curved or round shape) employ more LEDs to provide sharp and crisp contrast of the object(s) 230.
  • the panels 220 may also help to produce useful shadows and shining areas on the object(s) 230.
  • the curved shape angle of the LED panels 220 may give a more focused light to the object(s) 230.
  • the panels 220 may also be used to generate different light effects, e.g., highlights, long and softer shadows, chiaroscuro effects, etc., for photography.
  • the key light provided may be angled such that it may light up any part of an object 230 with control of the intensity of the key light.
  • Each LED 229 may be addressed individually.
  • Such a technique may allow (full) control over object lighting so that an optimal lighting condition or environment suitable for product photography may be determined.
  • the optimal or best lighting condition or environment may be determined or suggested automatically with minimal or without any saturation and/or dark spots.
  • An interface 234 may be provided on the pixel machine 201, for example, for cable connection to the processor to be described below with reference to FIG. 3.
  • FIG. 3 shows a schematic back view of a main controller 340, according to various embodiments.
  • the main controller 340 may include a processor 342, for example, an industrial PC with embedded LED and motor controllers.
  • the main controller 340 may further include a display device (e.g., a touch-screen monitor) 352, a keyboard 354 and a mouse 356.
  • An actuator, for example a single capture button 344, may be provided (integrated) with the processor 342.
  • a connector 348 may be provided for connection to a network, e.g., a local area network (LAN).
  • a power connector or socket 350 may also be provided.
  • An interface (or link connector) 346 may be provided on the processor 342 for communication with the pixel machine 201 (FIGS. 2A to 2E), for example, via the interface 234 (FIG. 2A). In this way, signals may be communicated from the processor 342 to the pixel machine 201, for example, to control movement and operation of the imaging device 210, to vary the parameter of the lighting provided by the lighting arrangement, etc.
  • a cable may be connected between the interfaces 234, 346, e.g., a multicore cable which may carry power, and control signals for LEDs 229, motor(s) (e.g., 232) and the imaging device 210 of the pixel machine 201.
  • FIG. 4 shows a flow chart 490 illustrating a method for imaging, as an example to determine or identify object pixels and background pixels based on object movement.
  • an object of interest (e.g., a product) may be placed in the apparatus (e.g., 100, FIG. 1A) of various embodiments (e.g., in the pixel machine 201, FIGS. 2A-2E), for example, on a turn table (e.g., 231). Lighting may be provided to illuminate the object. The background may also be illuminated by the lighting provided. The object may then be rotated (e.g., via rotation of the turn table) and a plurality of images may be generated with the rotation of the object. A command is sent to the imaging device (e.g., 210) to generate the images.
  • the light intensity and/or colour emitted from the associated LEDs or LED panels may be provided and/or varied under the control of the processor 342 (FIG. 3) which may in turn take command via or from a software application, e.g., based on user input. It should be appreciated that key light may be provided on the object.
  • the button 344 (FIG. 3) may be actuated/pressed to perform the product imaging or photography. If the button 344 is pressed for more than 5 seconds, 360° photography may be carried out. Actuation of the button 344 may result in automated operations.
  • a respective pixel value may be compared against a threshold change value. This may mean that a respective pixel value, relative to a reference value (e.g., another pixel value of a corresponding pixel), may be compared against a threshold change value. Put in a different way, the change in the pixel value may be compared against a threshold value. For example, the condition may be set as to whether the pixel value is less than the threshold change (pixel value < threshold change). If the result “pixel value < threshold change” is determined, the pixel is determined as belonging to the background at 494. Otherwise, the pixel is determined as belonging to the object at 495.
  • the threshold (change) value may be set via the corresponding software application.
  • the subject or object to be imaged may be positioned on a turn table, and then rotated at intervals of angles.
  • as the pixels corresponding to the object change their positions while the pixels corresponding to the background are “static” in terms of position and value, the pixels corresponding to the object may be determined and separated from those corresponding to the background.
  • the images may be generated via two modes: photography mode and video (or movie) mode.
  • the imaging device (e.g., camera) 210 may capture frames at a rate of 60 frames per second (which may be less or more depending on device speed), and the relative rotational movement between the imaging device and the object to be imaged may be changed during the making of the video. Further, the light intensity and/or colours may be changed during the making of the video.
  • the auto-iris mode of the camera would be turned off so that the effect of the rate of change of light in the background and on the product may be captured.
  • the timing of the frames may be marked in the software application, and the marked frames may be extracted and used to make a final or resultant 360° still photograph (a frame-extraction sketch follows below).
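A sketch of extracting the marked frames from a recorded video using OpenCV; the file name, the timestamp list and the use of millisecond seeking are placeholders for whatever the software application actually records.

```python
import cv2  # OpenCV

def extract_marked_frames(video_path: str, timestamps_ms) -> list:
    """Extract the video frames recorded at the marked timings (milliseconds)."""
    capture = cv2.VideoCapture(video_path)
    stills = []
    for t in timestamps_ms:
        capture.set(cv2.CAP_PROP_POS_MSEC, t)  # seek to the marked time
        ok, frame = capture.read()
        if ok:
            stills.append(frame)
    capture.release()
    return stills

# e.g. (placeholder values): one still per degree of a full rotation
# stills = extract_marked_frames("product.mp4", [i * 100.0 for i in range(360)])
```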
  • the various methods for imaging may be used for making 360° photography of a product by rotation, where each frame has clear background elimination.
  • various embodiments may help a user to place the object in the centre of the turn table to have a better result in 360° photography.
  • the object may be placed anywhere at the turn table and rotated, and the object pixels may be determined.
  • the image that is desired or best to put in front of the imaging device may be identified, and such image with the product may be shown on a display device in half tone, together with a half tone of the live video. This provides a live view which helps the user to match the location of the object to the image on screen by overlaying both images (see the overlay sketch below).
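The half-tone overlay described here can be approximated by blending the stored reference frame and the live frame at half weight each, for example with OpenCV; both frames are assumed to have the same size and data type.

```python
import cv2  # OpenCV

def halftone_overlay(reference_frame, live_frame):
    """Blend the stored reference image and the current live frame at half
    weight each, so the user can line the product up with the reference."""
    return cv2.addWeighted(reference_frame, 0.5, live_frame, 0.5, 0)
```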
  • the location of the product in the frame may first be detected.
  • the centre or middle of the product may be determined by dividing X and Y values by 2.
  • a centre line of the product may be drawn.
  • the centre line may be thick, as all the middle pixels may not lie on the same X and Y lines; the width (threshold of centre) of the product may therefore increase, but in no way may it have a value outside the product boundary.
  • the difference between the current location and the location the product should have if placed in the centre of the turn table may be determined, and a half-tone image may be shown to the user so that, where necessary, the user can move the product to the centre of the turn table to have a good 360° rotation view (a sketch of this centring check follows below).
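A sketch of the centring check: the product's extent is taken from the object mask, its middle found by halving the X and Y extents, and the offset to the frame centre reported; using the frame centre as a stand-in for the turn-table centre is an assumption of this example.

```python
import numpy as np

def centering_offset(object_mask: np.ndarray):
    """Return (dx, dy) from the product centre to the frame centre, in pixels.

    object_mask : boolean (H, W) array of pixels determined as object pixels.
    """
    ys, xs = np.nonzero(object_mask)
    if xs.size == 0:
        raise ValueError("no object pixels detected")
    # Middle of the product: divide the X and Y extents by 2.
    centre_x = (xs.min() + xs.max()) / 2.0
    centre_y = (ys.min() + ys.max()) / 2.0
    h, w = object_mask.shape
    return (w / 2.0 - centre_x, h / 2.0 - centre_y)
```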
  • each light source may be controlled through a touch-screen monitor or by one or more mouse clicks, which helps the user to create different lighting effects.
  • LEDs may be driven by a DC (direct current) controller to minimise or eliminate lighting non-sync locking effect.
  • a DC (direct current) controller may eliminate the flicker caused by most LED driving arrangements used in low-cost LED drivers, which drive the LEDs directly from the 50/60 hertz AC line so as to save cost.
  • Each light source may be addressed and controlled independently. While the light sources (e.g., LEDs) receive commands from the processor, the associated drivers for the light sources are located in the pixel machine.
  • LED light control from all angles may help to light up the round edges from the left, right, top and bottom of the product, and pixels with good contrast and resolution are made part of the final frame.
  • the intensity of the lighting may be decreased, thereby minimising reflection.
  • the lighting angle may be rotated, resulting in change in shadow position and in reflection effect.
  • Such a lighting pattern may be employed for identification of different pixels, for example, object pixels, background pixels, shadow pixels.
  • lighting may be provided to the sides of an object.
  • if lighting is provided from the front for a shiny object, the edges of the object may become mixed with the background.
  • the front lighting may then be turned off and side lighting may be provided for images to be generated to capture the edges of the object without saturation.
  • lighting may be provided from the left and/or right sides of the object. Subsequently, the images generated with the front lighting and the side lighting may be combined to form the complete object.
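One illustrative way to combine the front-lit and side-lit captures is to prefer, at each pixel, the capture that is not saturated; the saturation cut-off and the RGB input format below are assumptions, not the specific combination rule of the description.

```python
import numpy as np

def combine_exposures(front_lit: np.ndarray, side_lit: np.ndarray,
                      saturation: int = 250) -> np.ndarray:
    """Merge a front-lit and a side-lit capture of the same object.

    Both inputs are uint8 RGB images of the same shape. Wherever the front-lit
    image is saturated (e.g., reflective edges), the corresponding side-lit
    pixels are used instead.
    """
    saturated = front_lit.max(axis=-1) >= saturation  # per-pixel check over RGB
    combined = front_lit.copy()
    combined[saturated] = side_lit[saturated]
    return combined
```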
  • a camera stand may be mounted on the box defining the pixel machine, which may help to minimise or eliminate vibration and small differences in the position of detected pixels in different frames.
  • the imaging device may be mounted on an XYZ motorized stand, which may not only zoom automatically but also adjust its physical location according to the product size and/or placement on the turn table.
  • the imaging device may be moved up and down or sideways, in a curved motion. This may mean that the front-middle LED panel 204 may be movable together with the imaging device when the XYZ motorized stand is activated to control the positioning of the imaging device.
  • the lighting may be turned on in a specific or defined manner to find the desired or best lighting based on pixel resolution and/or saturation levels.
  • front, back, left and right views of the object may be automatically captured by rotation of the turn table.
  • a user (only) needs to place the object front facing the imaging device.
  • the turn table may then rotate and adjust by, for example, 5-10°, to determine the location where the maximum width of the object is detected. This may help to further eliminate any placement errors by the user (see the sketch below).
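A sketch of this widest-angle search; camera, turntable and the classify callable are assumed stand-ins for the imaging device, turn table and the pixel-change test sketched earlier, not interfaces defined by the description.

```python
import numpy as np

def find_widest_angle(camera, turntable, classify, step_deg: float = 5.0,
                      sweep_deg: float = 10.0) -> float:
    """Rotate the turn table in small steps and return the angle at which the
    detected object appears widest, to correct small placement errors.

    classify : callable returning a boolean object mask for a captured frame.
    """
    best_angle, best_width = 0.0, -1
    angle = -sweep_deg
    while angle <= sweep_deg:
        turntable.rotate_to(angle)
        mask = classify(camera.capture())
        cols = np.nonzero(mask.any(axis=0))[0]  # columns containing object pixels
        width = int(cols.max() - cols.min() + 1) if cols.size else 0
        if width > best_width:
            best_angle, best_width = angle, width
        angle += step_deg
    return best_angle
```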
  • the turn table may rotate at any angle, which may be defined in a default setting of the software. For example, the turn table may rotate 90° to capture the left, back and right views of the object for photoshoot.
  • camera white balance may be adjusted based on RGB colour intensity, which may help in capturing close to the natural colours of the object(s) or product(s).
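The description does not say which white-balance algorithm is used; the gray-world correction below is a common RGB-intensity-based choice and is shown purely as an illustration.

```python
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Scale each RGB channel so that all channel means become equal
    (gray-world assumption), approximating a neutral white balance."""
    img = image.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / np.clip(channel_means, 1e-6, None)
    balanced = img * gains  # broadcast the per-channel gains over every pixel
    return np.clip(balanced, 0, 255).astype(np.uint8)
```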
  • a software application compatible with Windows® and Linux® operating systems may be provided.
  • the object on the turn table may be automatically detected, the camera may be adjusted to its best location, the lighting may be adjusted, object image(s) may be taken or generated, the background may be eliminated, the object may be centred in the middle of the frame, and both raw and compressed versions may be stored, for example, in the “My Pictures” folder in Windows® or on a flash drive connected to the processor or to a network-addressed folder.
  • Activating the button 344 triggers the automatic scripts written in the software application, which may include predefined template(s) to get most product photographs right.
  • an application programming interface (API) platform may be provided, which may be used to write plugins for web applications supporting single-button operation, so that product images (in single, quad or 360° form, compressed and optimised for web resolution) can be uploaded directly to e-commerce websites with a single press of the button 344.
  • the various methods may be used in a studio with bigger objects to shoot, including fashion models, clothing and cars. Background lighting may be varied with known values, and pixels may be marked and identified as belonging to the product or the background, so that there is no need for manual editing, e.g., to remove the background. It should be appreciated that the various methods for imaging may be implemented, either individually or in combination, in the apparatus for imaging of various embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An imaging apparatus is provided. The apparatus comprises a lighting arrangement configured to provide lighting, and a processor, the processor being configured to control an imaging device to generate a plurality of images, each of the plurality of images depicting an object of interest, and to vary a relative rotational movement between the imaging device and the object during generation of the plurality of images, and the processor being further configured, for a respective pixel of pixels defining an image of the plurality of images, to determine a change in pixel value between corresponding pixels of the respective pixel through the plurality of images, and to determine, based on the change in pixel value determined, the respective pixel as a pixel belonging to the object of interest or a background. An imaging method is also provided.
PCT/MY2019/000030 2018-07-31 2019-07-24 Appareil et procédé d'imagerie WO2020027648A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI2018001381 2018-07-31
MYPI2018001381 2018-07-31

Publications (1)

Publication Number Publication Date
WO2020027648A1 true WO2020027648A1 (fr) 2020-02-06

Family

ID=69232168

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2019/000030 WO2020027648A1 (fr) 2018-07-31 2019-07-24 Appareil et procédé d'imagerie

Country Status (1)

Country Link
WO (1) WO2020027648A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020096750A (ko) * 2001-06-21 2002-12-31 백승헌 파노라마 영상 제작시 장애물 제거장치 및 제거방법
EP1524864A1 (fr) * 2003-10-17 2005-04-20 Canon Europa N.V. Système à plateau tournant pour photographier des objects tridimensionnels
KR20120122362A (ko) * 2011-04-29 2012-11-07 주식회사 디아이랩 차량 촬상 시스템 및 그를 위한 광량 제어 방법
US20160028936A1 (en) * 2014-07-28 2016-01-28 Orbotech Ltd. Auto-focus system
KR20160038460A (ko) * 2014-09-30 2016-04-07 삼성전자주식회사 전자 장치와, 그의 제어 방법

Similar Documents

Publication Publication Date Title
JP6286481B2 (ja) オブジェクト向上のためのライティング・システムおよび方法
TW201602556A (zh) 具有投影光源的攝像方法及其攝像裝置
US20160044225A1 (en) Rapid Synchronized Lighting and Shuttering
US20180332239A1 (en) Background replacement utilizing infrared light and visible light
TWI568260B (zh) 同時顯示led光下的影像投射及擷取技術
CN106604005A (zh) 一种投影电视自动对焦方法及系统
CN108604046A (zh) 摄影支持装置及其工作方法、以及工作程序
US10594995B2 (en) Image capture and display on a dome for chroma keying
JP2016181068A (ja) 学習サンプル撮影装置
WO2020027647A1 (fr) Appareil et procédé d'imagerie
CN110677949A (zh) 灯具的控制方法和控制系统以及电子设备
US20200201165A1 (en) Photography system and method
CN110264394B (zh) 图像背景去除系统及方法、图像背景更换装置
WO2020027645A2 (fr) Appareil et procédé d'imagerie
CN106796385A (zh) 具有针对白点的调整的图像投影和捕获
WO2020027648A1 (fr) Appareil et procédé d'imagerie
KR102564522B1 (ko) 다시점 촬영 시스템 및 3d 체적 객체 생성 방법
WO2020027646A2 (fr) Appareil et procédé d'imagerie
US11589444B2 (en) Apparatuses and methods for illuminating objects
KR102087822B1 (ko) 소도체 등심 영상 획득기
JP2013026656A (ja) 撮影装置、撮影方法、および撮影プログラム
CN114071128A (zh) 一种adas测试灯箱装置及系统
JP2001174881A (ja) 撮影装置、及び記憶媒体
KR102655214B1 (ko) 자동 촬영 시스템 및 그 방법
JP3236995U (ja) 容器の表面検査装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19843947

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19843947

Country of ref document: EP

Kind code of ref document: A1