(19) United States
(12) Patent Application Publication: Warnberg et al.
(10) Pub. No.: US 2015/0042832 A1
(43) Pub. Date: Feb. 12, 2015

(54) LIGHTPAINTING LIVE VIEW

(71) Applicants: Ryan Harrison Warnberg, Brooklyn, NY (US); Michelle Kirstin McSwain, Brooklyn, NY (US)

(72) Inventors: Ryan Harrison Warnberg, Brooklyn, NY (US); Michelle Kirstin McSwain, Brooklyn, NY (US)

(21) Appl. No.: 13/964,155

(22) Filed: Aug. 12, 2013

Publication Classification

(51) Int. Cl.: H04N 5/262 (2006.01); H04N 5/232 (2006.01)

(52) U.S. Cl.: CPC H04N 5/2621 (2013.01); H04N 5/23293 (2013.01); USPC 348/218.1; 348/239

(57) ABSTRACT

Methods and apparatus, including computer program products, for a light painting live view. A method includes, in a device comprising at least a processor, a memory, a display and a camera device having an on-screen viewfinder, accessing the camera, capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession, rendering the captured frames on a graphical processing unit (GPU), sending the captured frames through a shader program, generating at least two images, a first image saved to the memory and a second image displayed on the display, and rendering the first image into the second image to generate a final image.

[Front-page representative drawing: the flow diagram of the light painting live view process 100, reproduced as FIG. 2 (Sheet 2 of 2).]

[FIG. 1 (Sheet 1 of 2): block diagram of an exemplary device 10, showing a Processor 15, a Memory 20 holding the O/S 40 and the light painting live view process 100, a GPU 35, a Display 25, and a Camera 30.]

[FIG. 2 (Sheet 2 of 2): flow diagram of the light painting live view process 100:
Access the camera (105);
Render the captured frames on a GPU (110);
Send captured frames through a shader program (115);
Save one image to memory (120);
Display one image on display (125);
Render one image into the other image (130);
Convert the image that is rendered into the memory to a JPEG file (135).]
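As an illustrative sketch only, and not part of the original disclosure, the flow of FIG. 2 can be modeled in a few lines of Python. The camera, shader, and display below are hypothetical stand-ins: frames are flat lists of grayscale values, and a per-pixel maximum stands in for the GL_MAX blend that the detailed description attributes to the GPU.

```python
# Hypothetical sketch of the FIG. 2 flow (steps 105-135).
# Frames are modeled as flat lists of grayscale pixel values in [0.0, 1.0];
# the "camera" and "shader" here are stand-ins, not a real device API.

def capture_frames():
    # Step 105: access the camera; here, three mock frames of 4 pixels each.
    return [[0.1, 0.0, 0.0, 0.0],
            [0.0, 0.8, 0.0, 0.0],
            [0.0, 0.0, 0.5, 0.9]]

def shader(frame, progress):
    # Steps 110-115: per-pixel maximum blend of the new frame into the
    # accumulated progress frame (the GL_MAX behavior the text describes).
    return [max(p, f) for p, f in zip(progress, frame)]

def run_light_painting():
    progress = [0.0, 0.0, 0.0, 0.0]   # image saved to memory (step 120)
    for frame in capture_frames():
        progress = shader(frame, progress)
        display = progress            # image shown on the display (125/130)
    return progress                   # would be JPEG-encoded in step 135

print(run_light_painting())
```

Because each step only takes per-pixel maxima, every bright spot that ever appears in a frame persists in the progress image, which is exactly the "light trail" effect the process is meant to show in real time.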

LIGHTPAINTING LIVE VIEW

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of U.S. Provisional Application No. 61/693,795, filed Aug. 28, 2012. The disclosure of the prior application is considered part of and is incorporated by reference in the disclosure of this application.

BACKGROUND OF THE INVENTION

[0002] The present invention generally relates to devices having a camera feature, and more particularly to a light painting live view.

[0003] Smartphones, such as the Apple iPhone®, Samsung Galaxy®, Blackberry Q10® and the like, and tablet computers running, for example, Google's Android® operating system (O/S) and Apple's iOS® O/S, include among their features built-in cameras for taking photos. Applications executing in the smartphones and tablet computers enable control of the built-in cameras, including light painting.

[0004] In general, light painting is a photographic technique, often performed at night or in a dark area, in which a photographer can introduce different lighting elements during a single long exposure photograph. Light painting enables the capture of light trails, light graffiti tags, and so forth.

SUMMARY OF THE INVENTION

[0005] The following presents a simplified summary of the innovation in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the invention. It is intended neither to identify key or critical elements of the invention nor to delineate the scope of the invention. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description that is presented later.

[0006] The present invention provides methods and apparatus, including computer program products, for a light painting live view.

[0007] In general, in one aspect, the invention features a method including, in a device including at least a processor, a memory, a display and a camera device having an on-screen viewfinder: accessing the camera; capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession; rendering the captured frames on a graphical processing unit (GPU); sending the captured frames through a shader program; generating at least two images, a first image saved to the memory and a second image displayed on the display; and rendering the first image into the second image to generate a final image.

[0008] In another aspect, the invention features a method including, in a device including at least a processor, a memory, a display and a camera device, executing a light painting live view process in conjunction with the camera to provide a long exposure camera that displays a creation of an exposure in real time.

[0009] In still another aspect, the invention features an apparatus including a processor, a memory, a display, and a camera device, the memory including a light painting live view process, the light painting live view process including: accessing the camera; capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession; rendering the captured frames on a graphical processing unit (GPU); sending the captured frames through a shader program; generating at least two images, a first image saved to the memory and a second image displayed on the display; and rendering the first image into the second image to generate a final image.

[0010] These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of aspects as claimed.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] The invention will be more fully understood by reference to the detailed description, in conjunction with the following figures, wherein:

[0012] FIG. 1 is a block diagram of an exemplary smartphone.

[0013] FIG. 2 is a flow diagram of an exemplary light painting live view process.

DETAILED DESCRIPTION

[0014] The subject innovation is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It may be evident, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the present invention.

[0015] As used in this application, the terms "component," "system," "platform," and the like can refer to a computer-related entity or an entity related to an operational machine with one or more specific functionalities. The entities disclosed herein can be either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components may reside within a process and/or thread of execution, and a component may be localized on one computer and/or distributed between two or more computers. Also, these components can execute from various computer-readable media having various data structures stored thereon. The components may communicate via local and/or remote processes, such as in accordance with a signal having one or more data packets (e.g., data from one component interacting with another component in a local system, a distributed system, and/or across a network such as the Internet with other systems via the signal).

[0016] In addition, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or." That is, unless specified otherwise, or clear from context, "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, if X employs A, X employs B, or X employs both A and B, then "X employs A or B" is satisfied under any of the foregoing instances. Moreover, articles "a" and "an" as used in the subject specification and annexed drawings should

generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form.

[0017] As shown in FIG. 1, an exemplary device 10 includes at least a processor 15, a memory 20, a display unit 25, a camera 30 and a graphical processing unit (GPU) 35. Example devices 10 include DSLR cameras, smartphones, tablet computers, personal data assistants, digital televisions, computers, laptops, devices with an integrated digital camera such as the Nintendo® DS, wearable devices, devices with a digital camera, and so forth. The GPU 35 is an electronic circuit designed to rapidly manipulate and alter the memory 20 to accelerate the creation of images in a frame buffer intended for output to the display unit 25.

[0018] The memory 20 can include at least an operating system (O/S) 40, such as Windows®, Linux®, Google's Android®, Apple's iOS®, or a proprietary O/S, and a light painting live view process 100.

[0019] Light painting is a photographic technique in which exposures are made by moving a hand-held light source or by moving the camera. The term light painting also encompasses images lit from outside the frame with hand-held light sources. By moving the light source, the light can be used to selectively illuminate parts of the subject or to "paint" a picture by shining it directly into the camera lens. Light painting requires a slow shutter speed, usually a second or more. Light painting can take on the characteristics of a quick pencil sketch.

[0020] Light painting by moving the camera, also called camera painting, is the antithesis of traditional photography. At night, or in a dark room, the camera can be taken off the tripod and used like a paintbrush. An example is using the night sky as the canvas, the camera as the brush and cityscapes (amongst other light sources) as the palette. Putting energy into moving the camera by stroking lights, making patterns and laying down backgrounds can create abstract artistic images.

[0021] Light painting can be done interactively using a webcam. The painted image can already be seen while drawing by using a monitor or projector.

[0022] Another technique used in the creation of light art is the projection of images onto irregular surfaces (faces, bodies, buildings, and so forth), in effect "painting" them with light. A photograph or other fixed portrayal of the resulting image is then made.

[0023] The light painting live view process 100 executes in conjunction with the camera 30 to provide a long exposure camera that displays the creation of the exposure in real time.

[0024] The device 10 can support a variety of applications, such as a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a blogging application, a web browsing application, a digital music player application, and/or a digital video player application.

[0025] The light painting live view process 100 is a light painting application. In light painting, a user can use a light source to draw shapes and patterns in front of a camera set to a long exposure. The light painting live view process 100 enables the user behind the camera 30 within the device 10 (or tablet computer) to watch the shapes or patterns that are being created, as they are being created. In prior approaches, the user must wait until the end of the exposure to see what has been made or created.

[0026] As shown in FIG. 2, the light painting live view process 100 accesses (105) the camera, which captures individual frames of footage, each of the captured frames displayed on a viewfinder in cumulative succession.

[0027] While frames are being captured by the camera, the light painting live view process 100 renders (110) the captured frames on a graphical processing unit (GPU), which is a user-facing camera "viewfinder" feature of the light painting live view process 100.

[0028] For every frame that is being captured to create an image, the light painting live view process 100 also sends (115) the frames through a shader program (also referred to as a vertex and fragment program) into the graphical processing unit (GPU). In general, a shader is a computer program that is used to do shading, produce special effects and/or do post-processing. Shaders calculate rendering effects on graphics hardware with a high degree of flexibility. Most shaders are coded for a graphics processing unit (GPU), though this is not a strict requirement. The position, hue, saturation, brightness, and contrast of all pixels, vertices, or textures used to construct a final image can be altered on the fly, using algorithms defined in the shader, and can be modified by external variables or textures introduced by the program calling the shader.

[0029] Sending (115) the captured frames through the shader creates two images, one image saved (120) to the device's memory and the other image displayed (125) by the light painting live view process 100 for the user to see as if they were watching a video. The light painting live view process 100 uses frames from the camera as the input of the shader program and a progress frame as the output of the shader program. Through additive blending, one image is rendered (130) into the other by the light painting live view process 100, i.e., the image that is being drawn progressively is rendered to the display.

[0030] Once the user signals the light painting live view process 100 to stop, the light painting live view process 100 converts (135) the image that is rendered into the memory to a Joint Photographic Experts Group (JPEG) file and projects (140) the JPEG file as a final image on the display.

[0031] As described above, a user initiates the light painting live view process 100, which generates a home screen graphical user interface (GUI). The GUI includes a main navigation bar that includes a pictorial rendering of a small camera. When the small camera is tapped, the light painting live view process 100 opens up the camera built into the device. The camera screen appears as though it's a video screen, ready for capture. The navigation bar shows a button to tap to begin image capture.

[0032] A video capture session is initiated, and anything that passes in front of the camera will leave a trail, similar to a long exposure on a single-lens reflex/digital single-lens reflex (SLR/DSLR) camera. The difference is that the user sees the trail as it is created, in real time, like a mixture of a stop motion video and an Etch-A-Sketch®.

[0033] This is viewed through the viewfinder of the light painting live view process 100, which is a screen that accesses the forward facing camera on the device. Anything viewed by that camera is seen through the light painting live view process 100 viewfinder.

[0034] Exposures can be set for one second, or they can run as long as the user has memory in their device to store the image/video data. The exposure can also be stopped by tapping the same button used to start the exposure.
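The per-frame shading and blending described above can be sketched in Python; this is an editor's illustration, not part of the original disclosure. The functions below mirror the fragment shader source given later in the specification (Rec. 601 luma, clamp, square, floor at the ambient brightness setting) and the GL_MAX blend equation; pixels are modeled as plain (r, g, b) tuples, and all function names are the sketch's own.

```python
# Python sketch of one step of the live view: the fragment-shader
# luminance weighting, followed by the GL_MAX blend that accumulates
# each processed frame into the progress image.
# Pixels are (r, g, b) tuples with components in [0.0, 1.0].

def fragment_shader(pixel, u_brightness):
    # Mirrors the specification's fragment shader: Rec. 601 luma,
    # clamped to 1.0, squared to exaggerate, floored at the ambient
    # "brightness" setting, then used to scale the color.
    r, g, b = pixel
    lum = r * 0.299 + g * 0.587 + b * 0.114
    lum = min(1.0, lum)
    lum = lum * lum
    lum = max(u_brightness, lum)
    return (r * lum, g * lum, b * lum)

def gl_max_blend(src, dst):
    # GL_MAX blend equation: per-component maximum of source and
    # destination, e.g. Rr = max(Rs, Rd).
    return tuple(max(s, d) for s, d in zip(src, dst))

def accumulate(frames, u_brightness=0.0):
    # Run every captured frame through the shader and blend it into
    # the progress image, as in steps 115-130 of the process.
    progress = [(0.0, 0.0, 0.0)] * len(frames[0])
    for frame in frames:
        shaded = [fragment_shader(p, u_brightness) for p in frame]
        progress = [gl_max_blend(s, d) for s, d in zip(shaded, progress)]
    return progress
```

Squaring the luminance suppresses dim, ambient pixels while leaving bright light sources near full strength, which is why only deliberate lights leave trails; raising u_brightness lets more of the ambient scene through.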

[0035] The user can move their camera around to capture trails, or they can make their own trails with a light of their own.

[0036] For every frame that is being captured to create the image, the captured frame is sent through a shader program into the GPU.

[0037] A GL_MAX blend operation, which specifies how source and destination colors are combined, is responsible for producing the light painting, but to control the output a fragment shader program is used. The fragment shader is run on each pixel of an image, producing for each input pixel a corresponding output pixel. The fragment shader supports an "Ambient Light Amount" feature of the capture settings. By taking a brightness parameter between 0 and 1, the fragment shader enables throttling the effect of light input on the painting.

[0038] The following is one example of fragment shader source code:

precision mediump float;
varying vec2 v_uv;
uniform sampler2D u_diffuseTexture;
uniform float u_brightness;
void main(void)
{
    // sample color
    vec4 color = texture2D(u_diffuseTexture, v_uv);
    // calculate luminance intensity
    float lumIntensity = color.x * 0.299 + color.y * 0.587 + color.z * 0.114;
    // clamp and exaggerate luminance intensity
    lumIntensity = min(1.0, lumIntensity);
    lumIntensity = lumIntensity * lumIntensity;
    lumIntensity = max(u_brightness, lumIntensity);
    // output final color
    gl_FragColor = color * lumIntensity;
}

[0056] The light painting live view process 100 then generates images in stages:

[0057] Image Stages/Names Stage

[0058] 1. Raw Image: this is the image data coming from the device's video camera, frame-by-frame, stored in a buffer managed by the operating system.

[0059] 2. Input Image: this is the image used as an input to the fragment shader program, stored in an OpenGL texture. A texture is an OpenGL object that contains one or more images that all have the same image format. A texture can be used in two ways: it can be the source of a texture access from a shader, or it can be used as a render target. The raw image is copied into the input image.

[0060] 3. Intermediate Output Image: this is the output of the fragment shader program, stored in an OpenGL texture. The input image is rendered into the intermediate output image, using a custom OpenGL frame buffer backed by an OpenGL texture. In general, frame buffer objects are a mechanism for rendering to images other than the default OpenGL frame buffer. They are OpenGL objects that allow you to render directly to textures, as well as blitting from one frame buffer to another.

[0061] 4. Preview Image: this is the output of the fragment shader program, shown on the device's display. The input image is rendered to the screen, using the default OpenGL frame buffer backed by the device's display.

[0062] 5. Output Image: this is the output of copying and compressing the data from the intermediate output image to a JPEG representation. The output image may be saved to the device's camera roll, shared via email, Facebook® or Twitter®, or uploaded to a server.

[0063] Through additive blending, one image is rendered into the other in the order laid out above. The image that is being drawn progressively is rendered to the display.

[0064] Blending Modes Stage

[0065] To produce a light painting, the pixels of the intermediate output image are blended with the pixels of the input image. The output of that blending process is then used to replace the previous value of each pixel of the intermediate output image.

[0066] The OpenGL blend mode GL_MAX is used to blend the pixels. The maximum of the two pixel values is the output of the operation.

[0067] The following describes the effect of the GL_MAX blend mode on pixel values (taken from the OpenGL documentation at http://www.opengl.org/sdk/docs/man/xhtml/glBlendEquation.xml):

    Mode       RGB Components        Alpha Component
    GL_MAX     Rr = max(Rs, Rd)      Ar = max(As, Ad)
               Gr = max(Gs, Gd)
               Br = max(Bs, Bd)

[0077] When all done, the output image is displayed:

[0078] 5. Output Image: this is the output of copying and compressing the data from the intermediate output image to a JPEG representation. The output image may be saved to the device's camera roll, shared via email, Facebook® or Twitter®, or uploaded to the server.

[0079] While the above describes a particular order of operations performed by certain embodiments of the invention, it should be understood that such order is exemplary, as alternative embodiments may perform the operations in a different order, combine certain operations, overlap certain operations, or the like. References in the specification to a given embodiment indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic.

[0080] While given components of the system have been described separately, one of ordinary skill will appreciate that some of the functions may be combined or shared in given instructions, program sequences, code portions, and the like.

[0081] The foregoing description does not represent an exhaustive list of all possible implementations consistent with this disclosure or of all possible variations of the implementations described. A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the systems, devices, methods and techniques described here. Accordingly, other implementations are within the scope of the following claims.

What is claimed is:

1. A method comprising:
    in a device comprising at least a processor, a memory, a display and a camera device having an on-screen viewfinder, accessing the camera;

    capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession;
    rendering the captured frames on a graphical processing unit (GPU);
    sending the captured frames through a shader program;
    generating at least two images, a first image saved to the memory and a second image displayed on the display; and
    rendering the first image into the second image to generate a final image.

2. The method of claim 1 further comprising:
    compressing the final image; and
    converting the compressed final image to a Joint Photographic Experts Group (JPEG) file.

3. The method of claim 2 further comprising projecting the JPEG file on the display.

4. The method of claim 1 wherein the shader program receives input from the camera and outputs a progress frame.

5. The method of claim 1 wherein the device is a smartphone or tablet computer.

6. The method of claim 1 wherein the device is selected from the group consisting of a DSLR camera, a smartphone, a tablet computer, a personal data assistant, a digital television, a computer, a laptop, a device with an integrated digital camera, a wearable device, and a device with a digital camera.

7. The method of claim 1 wherein generating the at least two images comprises:
    an image/name stage; and
    a blending modes stage.

8. The method of claim 7 wherein the image/name stage comprises:
    storing image data coming from the camera in a buffer in the memory;
    using the stored image as an input to the shader program; and
    outputting an intermediate image from the shader program to the display, the intermediate image blended with the input images.

9. The method of claim 8 wherein the blending modes stage comprises:
    blending pixels of the intermediate output image with pixels of the input image; and
    replacing previous values of pixels with pixels of the intermediate output image.

10. The method of claim 8 wherein the input is an OpenGL texture.

11. A method comprising:
    in a device comprising at least a processor, a memory, a display and a camera device having an on-screen viewfinder, executing a light painting live view process in conjunction with the camera to provide a long exposure camera that displays a creation of an exposure in real time.

12. The method of claim 11 wherein the device is a smartphone or tablet computer.

13. The method of claim 11 wherein the device is selected from the group consisting of a DSLR camera, a smartphone, a tablet computer, a personal data assistant, a digital television, a computer, a laptop, a device with an integrated digital camera, a wearable device, and a device with a digital camera.

14. An apparatus comprising:
    a processor;
    a memory;
    a display; and
    a camera device having an on-screen viewfinder;
    the memory comprising a light painting live view process, the light painting live view process comprising:
        accessing the camera;
        capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession;
        rendering the captured frames on a graphical processing unit (GPU);
        sending the captured frames through a shader program;
        generating at least two images, a first image saved to the memory and a second image displayed on the display; and
        rendering the first image into the second image to generate a final image.

15. The apparatus of claim 14 wherein the light painting live view process further comprises:
    compressing the final image; and
    converting the compressed final image to a Joint Photographic Experts Group (JPEG) file.

16. The apparatus of claim 15 wherein the light painting live view process further comprises projecting the JPEG file on the display.

17. The apparatus of claim 14 wherein the shader program receives input from the camera and outputs a progress frame.

18. The apparatus of claim 14 wherein generating the at least two images comprises:
    an image/name stage; and
    a blending modes stage.

19. The apparatus of claim 18 wherein the image/name stage comprises:
    storing image data coming from the camera in a buffer in the memory;
    using the stored image as an input to the shader program; and
    outputting an intermediate image from the shader program to the display, the intermediate image blended with the input images.

20. The apparatus of claim 19 wherein the blending modes stage comprises:
    blending pixels of the intermediate output image with pixels of the input image; and
    replacing previous values of pixels with pixels of the intermediate output image.

21. The apparatus of claim 19 wherein the input is an OpenGL texture.