WO2014035642A1 - Lightpainting live view - Google Patents

Lightpainting live view

Info

Publication number
WO2014035642A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
display
memory
live view
Prior art date
Application number
PCT/US2013/054454
Other languages
English (en)
Inventor
Ryan Harrison WARNBERG
Michelle Kirstin McSWAIN
Original Assignee
Mri Lightpainting Llc
Priority date
Filing date
Publication date
Application filed by Mri Lightpainting Llc filed Critical Mri Lightpainting Llc
Publication of WO2014035642A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof

Definitions

  • the present invention generally relates to devices having a camera feature, and more particularly to a light painting live view.
  • Smartphones, such as the Apple iPhone®, Samsung Galaxy®, Blackberry Q10® and the like, and tablet computers running, for example, Google's Android® operating system (O/S) or Apple's iOS® O/S include, among their features, built-in cameras for taking photos. Applications executing on the smartphones and tablet computers enable control of the built-in cameras, including light painting.
  • Light painting is a photographic technique, often performed at night or in a dark area, in which a photographer can introduce different lighting elements during a single long-exposure photograph. Light painting enables the capture of light trails, light graffiti tags, and so forth.
  • The invention features a method including, in a device including at least a processor, a memory, a display and a camera device having an on-screen viewfinder: accessing the camera; capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession; rendering the captured frames on a graphical processing unit (GPU); sending the captured frames through a shader program; generating at least two images, a first image saved to the memory and a second image displayed on the display; and rendering the first image into the second image to generate a final image.
  • The invention features a method including, in a device including at least a processor, a memory, a display and a camera device, executing a light painting live view process in conjunction with the camera to provide a long exposure camera that displays a creation of an exposure in real time.
  • The invention features an apparatus including a processor, a memory, a display, and a camera device, the memory including a light painting live view process, the light painting live view process including: accessing the camera; capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession; rendering the captured frames on a graphical processing unit (GPU); sending the captured frames through a shader program; generating at least two images, a first image saved to the memory and a second image displayed on the display; and rendering the first image into the second image to generate a final image.
  • FIG. 1 is a block diagram of an exemplary smartphone.
  • FIG. 2 is a flow diagram of an exemplary light painting live view process.
  • A component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • Both an application running on a server and the server itself can be a component.
  • One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
  • these components can execute from various computer readable media having various data structures stored thereon.
  • the components may communicate via local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the Internet with other systems via the signal).
  • The term "or" is intended to mean an inclusive "or" rather than an exclusive "or." That is, unless specified otherwise, or clear from context, "X employs A or B" is intended to mean any of the natural inclusive permutations.
  • an exemplary device 10 includes at least a processor 15, a memory 20, a display unit 25, a camera 30 and a graphical processing unit (GPU) 35.
  • Example devices 10 include DSLR cameras, smartphones, tablet computers, personal data assistants, digital televisions, computers, laptops, devices with an integrated digital camera such as Nintendo® DS, wearable devices, devices with a digital camera, and so forth.
  • the GPU 35 is an electronic circuit designed to rapidly manipulate and alter memory 20 to accelerate a creation of images in a frame buffer intended for output to the display unit 25.
  • the memory 20 can include at least an operating system (O/S) 40, such as Windows®, Linux®, Google's Android®, Apple's iOS®, or a proprietary OS, and a light painting live view process 100.
  • Light painting is a photographic technique in which exposures are made by moving a hand-held light source or by moving the camera.
  • the term light painting also encompasses images lit from outside the frame with hand-held light sources.
  • By moving the light source, the light can be used to selectively illuminate parts of the subject or to "paint" a picture by shining it directly into the camera lens.
  • Light painting requires a slow shutter speed, usually a second or more.
  • Light painting can take on the characteristics of a quick pencil sketch.
  • Light painting by moving the camera is the antithesis of traditional photography. At night, or in a dark room, the camera can be taken off the tripod and used like a paintbrush. An example is using the night sky as the canvas, the camera as the brush and cityscapes (amongst other light sources) as the palette. Putting energy into moving the camera by stroking lights, making patterns and laying down backgrounds can create abstract artistic images.
  • Light painting can be done interactively using a webcam. The painted image can already be seen while drawing by using a monitor or projector.
  • the light painting live view process 100 executes in conjunction with the camera 30 to provide a long exposure camera that displays the creation of the exposure in real time.
  • the device 10 can support a variety of applications, such as a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a blogging application, a web browsing application, a digital music player application, and/or a digital video player application.
  • the light painting live view process 100 is a light painting application.
  • a user can use a light source to draw shapes and patterns in front of a camera set to a long exposure.
  • the light painting live view process 100 enables the user behind the camera 30 within the device 10 (or tablet computer) to watch the shapes or patterns that are being created, as they are being created. In prior approaches, the user must wait until the end of the exposure to see what has been made or created.
  • the light painting live view process 100 accesses (105) the camera, which captures individual frames of footage, each of the captured frames displayed on a viewfinder in cumulative succession.
  • The light painting live view process 100 renders (110) the captured frames on the graphical processing unit (GPU); this rendering serves as the user-facing camera "viewfinder" feature of the light painting live view process 100.
  • The light painting live view process 100 also sends (115) the captured frames through a shader program (also referred to as a vertex and fragment program) into the graphical processing unit (GPU).
  • a shader is a computer program that is used to do shading, produce special effects and/or do postprocessing. Shaders calculate rendering effects on graphics hardware with a high degree of flexibility. Most shaders are coded for a graphics processing unit (GPU), though this is not a strict requirement.
  • the position, hue, saturation, brightness, and contrast of all pixels, vertices, or textures used to construct a final image can be altered on the fly, using algorithms defined in the shader, and can be modified by external variables or textures introduced by the program calling the shader.
  • Sending (115) the captured frames through the shader creates two images, one image saved (120) to the device's memory and the other image displayed (125) by the light painting live view process 100 for the user to see as if they were watching a video.
  • the light painting live view process 100 uses frames from the camera as the input of the shader program and a progress frame as the output of the shader program. Through additive blending, one image is rendered (130) into the other by the light painting live view process 100, i.e., the image that is being drawn progressively is rendered to the display.
  • the light painting live view process 100 converts (135) the image that is rendered into the memory to a Joint Photographic Experts Group (JPEG) file and projects (140) the JPEG file as a final image on the display.
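  • By way of illustration only, the following C sketch shows one way the per-frame flow described above (capture, shade, save to memory, display) could be arranged on top of OpenGL ES 2.0. It is not the patented implementation: the helper functions upload_camera_frame() and draw_fullscreen_quad(), the handle variables, and the use of the EXT_blend_minmax extension are assumptions introduced here.

      /* Hedged per-frame sketch: capture -> shader -> accumulate -> preview.
         Assumes an OpenGL ES 2.0 context is current; names are illustrative only. */
      #include <GLES2/gl2.h>
      #include <GLES2/gl2ext.h>      /* GL_MAX_EXT, from the EXT_blend_minmax extension */

      extern GLuint g_input_tex;     /* input image: texture receiving the raw camera frame */
      extern GLuint g_accum_fbo;     /* FBO backed by the intermediate output image texture */
      extern GLuint g_shader_prog;   /* compiled vertex + fragment shader program */

      extern void upload_camera_frame(GLuint tex);  /* assumed helper: copy raw frame into tex */
      extern void draw_fullscreen_quad(void);       /* assumed helper: issue the quad geometry */

      void light_painting_frame(float ambient_light_amount)
      {
          /* Copy the raw camera frame (from the OS-managed buffer) into the input texture. */
          upload_camera_frame(g_input_tex);

          glUseProgram(g_shader_prog);
          glBindTexture(GL_TEXTURE_2D, g_input_tex);
          glUniform1f(glGetUniformLocation(g_shader_prog, "u_brightness"),
                      ambient_light_amount);        /* "Ambient Light Amount", 0..1 */

          /* Max blending keeps the brightest value seen so far, so light trails persist. */
          glEnable(GL_BLEND);
          glBlendEquation(GL_MAX_EXT);

          /* First image: accumulate into the texture-backed FBO later compressed to JPEG. */
          glBindFramebuffer(GL_FRAMEBUFFER, g_accum_fbo);
          draw_fullscreen_quad();

          /* Second image: draw to the default frame buffer (the display), assuming its
             contents are retained between frames so the on-screen exposure also builds up. */
          glBindFramebuffer(GL_FRAMEBUFFER, 0);
          draw_fullscreen_quad();

          glDisable(GL_BLEND);
      }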
  • When launched, the light painting live view process 100 presents a home screen graphical user interface (GUI).
  • the GUI includes a main navigation bar that includes a pictorial rendering of a small camera.
  • The light painting live view process 100 opens up to the camera built into the device.
  • the camera screen appears as though it's a video screen, ready for capture.
  • the navigation bar shows a button to tap to begin image capture.
  • a video capture session is initiated and anything that passes in front of the camera will leave a trail, similar to a long exposure on a single-lens reflex/digital single-lens reflex (SLR/DSLR) camera. The difference is that the user sees the trail as it is created, in real time, like a mixture of a stop motion video and an Etch-A-Sketch®.
  • Exposures can be set for one second, or they can run as long as the user has memory in their device to store the image/video data. The exposure can also be stopped by tapping the same button used to start the exposure. The user can move their camera around to capture trails, or they can make their own trails with a light of their own.
  • the captured frame is sent through a shader program into the GPU.
  • A GL_MAX blend operation, which specifies how source and destination colors are combined, is responsible for producing the light painting, but to control the output a fragment shader program is used.
  • The fragment shader is run on each pixel of an image, producing for each input pixel a corresponding output pixel.
  • The fragment shader supports an "Ambient Light Amount" feature of the capture settings. By taking a brightness parameter between 0 and 1, the fragment shader enables throttling the effect of light input on the painting.
  • vec4 color = texture2D(u_diffuseTexture, v_uv);
  • lumIntensity = min(1.0, lumIntensity);
  • lumIntensity = lumIntensity * lumIntensity;
  • lumIntensity = max(u_brightness, lumIntensity);
  • gl_FragColor = color * lumIntensity;
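  • The statements above are only fragments of the shader. Purely as a hedged sketch, they might be assembled into a complete GLSL ES fragment shader embedded in host code as below; the declaration of lumIntensity and its initialization from the sampled color's luminance are assumptions added here and do not appear in the excerpt.

      /* Hypothetical assembly of the quoted fragment-shader statements (GLSL ES 1.0),
         stored as a host-side string constant.  The lumIntensity declaration and its
         luminance-based initial value are assumptions, not taken from the source text. */
      static const char *k_light_paint_frag_src =
          "precision mediump float;\n"
          "uniform sampler2D u_diffuseTexture;\n"
          "uniform float     u_brightness;   /* 'Ambient Light Amount', 0..1 */\n"
          "varying vec2      v_uv;\n"
          "void main() {\n"
          "    vec4 color = texture2D(u_diffuseTexture, v_uv);\n"
          "    float lumIntensity = dot(color.rgb, vec3(0.299, 0.587, 0.114)); /* assumed */\n"
          "    lumIntensity = min(1.0, lumIntensity);\n"
          "    lumIntensity = lumIntensity * lumIntensity;\n"
          "    lumIntensity = max(u_brightness, lumIntensity);\n"
          "    gl_FragColor = color * lumIntensity;\n"
          "}\n";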
  • the light painting live view process 100 then generates images in stages:
  • Raw Image - this is the image data coming from the device's video camera, frame-by-frame, stored in a buffer managed by the operating system.
  • Input Image - this is the image used as an input to the fragment shader program, stored in an OpenGL texture.
  • a texture is an OpenGL Object that contains one or more images that all have the same image format.
  • a texture can be used in two ways. It can be the source of a texture access from a shader, or it can be used as a render target. The raw image is copied into the input image.
  • Intermediate Output Image - this is the output of the fragment shader program, stored in an OpenGL texture.
  • the input image is rendered into the intermediate output image, using a custom OpenGL frame buffer backed by an OpenGL texture.
  • Frame buffer objects are a mechanism for rendering to images other than the default OpenGL frame buffer. They are OpenGL Objects that allow rendering directly to textures, as well as blitting from one frame buffer to another; a minimal creation sketch follows the list of stages below.
  • Preview Image - this is the output of the fragment shader program, shown on the device's display.
  • the input image is rendered to the screen, using the default OpenGL frame buffer backed by the device's display.
  • Output Image - this is the output of copying and compressing the data from the intermediate output image to a JPEG representation.
  • The output image may be saved to the device's camera roll, shared via email, Facebook® or Twitter®, or uploaded to a server.
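  • As an illustrative sketch only (not code from the patent), a texture-backed frame buffer object of the kind used for the intermediate output image might be created with standard OpenGL ES 2.0 calls such as the following; error handling is trimmed and the function name is hypothetical.

      /* Illustrative creation of a texture-backed frame buffer object (render target),
         assuming an OpenGL ES 2.0 context is current.  Error handling is trimmed. */
      #include <stddef.h>
      #include <GLES2/gl2.h>

      GLuint create_accumulation_target(GLsizei width, GLsizei height, GLuint *out_tex)
      {
          GLuint tex, fbo;

          /* Texture that backs the frame buffer (the intermediate output image). */
          glGenTextures(1, &tex);
          glBindTexture(GL_TEXTURE_2D, tex);
          glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                       GL_RGBA, GL_UNSIGNED_BYTE, NULL);
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

          /* Frame buffer object that renders into that texture instead of the screen. */
          glGenFramebuffers(1, &fbo);
          glBindFramebuffer(GL_FRAMEBUFFER, fbo);
          glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                 GL_TEXTURE_2D, tex, 0);

          if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
              return 0;  /* caller should treat 0 as failure */

          glBindFramebuffer(GL_FRAMEBUFFER, 0);  /* back to the default frame buffer */
          *out_tex = tex;
          return fbo;
      }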
  • the pixels of the intermediate output image are blended with the pixels of the input image.
  • the output of that blending process is then used to replace the previous value of each pixel of the intermediate output image.
  • the OpenGL blend mode "GL_MAX” is used to blend the pixels.
  • the maximum of the two pixel values is the output of the operation.
  • Gr = max(Gs, Gd), where Gs is the source pixel value and Gd is the destination pixel value; the same max operation is applied to each of the other color components.
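  • For illustration, the blend state for this max-based accumulation might be configured as below. On OpenGL ES 2.0 the max equation comes from the EXT_blend_minmax extension (GL_MAX_EXT); desktop OpenGL and ES 3.0 expose it as the core GL_MAX token. The blend factors are ignored by the min/max equations and are set only for clarity.

      #include <GLES2/gl2.h>
      #include <GLES2/gl2ext.h>

      static void enable_max_blending(void)
      {
          glEnable(GL_BLEND);
          glBlendFunc(GL_ONE, GL_ONE);   /* factors are ignored by the min/max equations */
          glBlendEquation(GL_MAX_EXT);   /* each result component = max(source, destination) */
      }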

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

L'invention concerne des procédés et un appareil, ainsi que des produits programmes d'ordinateur, en vue de la visualisation en direct de peinture en lumière. Un procédé comprend, dans un dispositif contenant au moins un processeur, une mémoire, un affichage et un dispositif d'appareil photo ayant un viseur sur l'écran, l'accès à l'appareil photo, la capture de trames individuelles de métrage, chacune des trames capturées étant affichée par le biais du viseur sur l'écran en succession cumulative, le rendu des trames capturées sur une unité de traitement graphique (GPU), l'envoi des trames capturées par le biais d'un programme de nuanceur, la génération d'au moins deux images, une première image sauvegardée dans la mémoire et une seconde image affichée sur l'affichage et le rendu de la première image dans la seconde image afin de générer une image définitive.
PCT/US2013/054454 2012-08-28 2013-08-12 Lightpainting live view WO2014035642A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261693795P 2012-08-28 2012-08-28
US61/693,795 2012-08-28

Publications (1)

Publication Number Publication Date
WO2014035642A1 true WO2014035642A1 (fr) 2014-03-06

Family

ID=50184134

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/054454 WO2014035642A1 (fr) 2012-08-28 2013-08-12 Lightpainting live view

Country Status (1)

Country Link
WO (1) WO2014035642A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104202521A (zh) * 2014-08-28 2014-12-10 深圳市中兴移动通信有限公司 Photographing method and photographing device
WO2016011859A1 (fr) * 2014-07-23 2016-01-28 努比亚技术有限公司 Method for shooting light-painting video, mobile terminal and computer storage medium
CN105959588A (zh) * 2016-05-30 2016-09-21 努比亚技术有限公司 Mobile terminal, and device and method for shooting light-painting photographs
WO2018119632A1 (fr) * 2016-12-27 2018-07-05 深圳市大疆创新科技有限公司 Image processing method, apparatus and device
CN114697555A (zh) * 2022-04-06 2022-07-01 百富计算机技术(深圳)有限公司 Image processing method, apparatus, device and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070031062A1 (en) * 2005-08-04 2007-02-08 Microsoft Corporation Video registration and image sequence stitching
US20090273686A1 (en) * 2008-05-02 2009-11-05 Nokia Corporation Methods, computer program products and apparatus providing improved image capturing
CN102497508A (zh) * 2011-12-13 2012-06-13 刘桂荣 Practical photography method for light-painting photographs

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ROGGE, LAUREN.: "Integration of visual effects into the Virtual Video Camera system", 16 December 2009 (2009-12-16), Retrieved from the Internet <URL:http://www.cg.tu-bs.de/media/publications/integration-visual-effects-virtual-video-camera-system.pdf> [retrieved on 20131027] *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016011859A1 (fr) * 2014-07-23 2016-01-28 努比亚技术有限公司 Method for shooting light-painting video, mobile terminal and computer storage medium
WO2016011877A1 (fr) * 2014-07-23 2016-01-28 努比亚技术有限公司 Method for shooting light-painting video, mobile terminal and computer storage medium
US10129488B2 (en) 2014-07-23 2018-11-13 Nubia Technology Co., Ltd. Method for shooting light-painting video, mobile terminal and computer storage medium
CN104202521A (zh) * 2014-08-28 2014-12-10 深圳市中兴移动通信有限公司 Photographing method and photographing device
CN104202521B (zh) * 2014-08-28 2016-05-25 努比亚技术有限公司 Photographing method and photographing device
CN105959588A (zh) * 2016-05-30 2016-09-21 努比亚技术有限公司 Mobile terminal, and device and method for shooting light-painting photographs
WO2018119632A1 (fr) * 2016-12-27 2018-07-05 深圳市大疆创新科技有限公司 Image processing method, apparatus and device
CN114697555A (zh) * 2022-04-06 2022-07-01 百富计算机技术(深圳)有限公司 Image processing method, apparatus, device and storage medium
CN114697555B (zh) * 2022-04-06 2023-10-27 深圳市兆珑科技有限公司 Image processing method, apparatus, device and storage medium

Similar Documents

Publication Publication Date Title
US9813638B2 (en) Lightpainting live view
US11558558B1 (en) Frame-selective camera
KR101864059B1 (ko) Mobile terminal and photographing method thereof
KR101873668B1 (ko) Photographing method of a mobile terminal and mobile terminal
US9264630B2 (en) Method and apparatus for creating exposure effects using an optical image stabilizing device
US10116879B2 (en) Method and apparatus for obtaining an image with motion blur
US9019400B2 (en) Imaging apparatus, imaging method and computer-readable storage medium
US9591347B2 (en) Displaying simulated media content item enhancements on mobile devices
KR101766614B1 (ko) Slow-shutter photographing method and photographing apparatus therefor
WO2016019770A1 (fr) Image synthesis method, device and storage medium
US10148880B2 (en) Method and apparatus for video content stabilization
JP2018513640A (ja) Automatic generation of panning shots
US9420181B2 (en) Electronic camera, computer readable medium recording imaging control program thereon and imaging control method
WO2014035642A1 (fr) Lightpainting live view
WO2016011877A1 (fr) Method for shooting light-painting video, mobile terminal and computer storage medium
JP2008217785A (ja) Display controller and image data conversion method
CN114630053B (zh) HDR image display method and display device
WO2013082832A1 (fr) Image processing method and device
CN106162024A (zh) Photo processing method and device
TW201340705A (zh) Image capture device and its image preview system and image preview method
CN103297660A (zh) Real-time interactive special-effect video and photo capture method
US11792511B2 (en) Camera system utilizing auxiliary image sensors
CN109218602A (zh) Image capture device, image processing method and electronic device
TW201722137A (zh) Method for generating a photograph containing an object movement trajectory and related photographing device
TW201724838A (zh) Method for generating a photograph containing a rotation trajectory and related photographing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13832192

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13832192

Country of ref document: EP

Kind code of ref document: A1