CN106686298A - Post-shooting processing method, post-shooting processing device and mobile terminal - Google Patents

Post-shooting processing method, post-shooting processing device and mobile terminal

Info

Publication number
CN106686298A
CN106686298A (application CN201611073053.1A)
Authority
CN
China
Prior art keywords
picture
video
shooting
transparency
post
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611073053.1A
Other languages
Chinese (zh)
Inventor
艾朝
苗雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201611073053.1A
Publication of CN106686298A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present invention discloses a post-shooting processing method, a post-shooting processing device and a mobile terminal. The post-shooting processing method includes the following steps: a local play area is set on a first picture obtained after shooting; the transparency of the local play area is increased to form a second picture; a first video obtained after shooting is decoded to obtain N frames of images, and each of the N frames is composited with the second picture to generate N third pictures; the N generated third pictures are encoded to generate a second video; and the obtained second video and the first picture obtained after shooting are combined into a new file. With this method, when the new file obtained by processing a file shot with Live Photo is played, the region outside the local play area set in the second picture blocks the video content of the video file, so that only the local play area appears animated. The method supplements the existing Live Photo function, highlights the activity of a local region of the picture after shooting, and better improves user experience.

Description

Post-shooting processing method, post-shooting processing device and mobile terminal
Technical field
The present invention relates to shooting technology, and in particular to a post-shooting processing method, a post-shooting processing device and a mobile terminal.
Background
Live Photo is a mobile terminal shooting function. When the user shoots, the Live Photo function automatically saves the few seconds before and after the photo is taken and generates a dynamic effect when the user views the photo, so that every photo seems to come back to life: the photo automatically plays the dynamic effect.
A Live Photo is not a video; rather, a JPG picture file is merged with a MOV file. In other words, the file produced by Live Photo shooting contains both a JPG file and a MOV file. Live Photo uses photo + video: the JPG photo format serves as the outer display container, and MP4 video information is written inside it according to a certain rule. Outwardly, a Live Photo therefore appears as a single JPG photo.
The Live Photo function greatly improves the shooting experience and enhances user experience. However, when a file shot with Live Photo is played, every part of the picture that moved during shooting is shown in motion; it is impossible to highlight only a particular region of the scene, such as a continuously moving object (for example a rotating fan) or a region of special interest (for example someone's expression).
Summary of the invention
The main object of the present invention is to provide a post-shooting processing method, a post-shooting processing device and a mobile terminal, which can supplement the Live Photo function, highlight the activity of a local region of the picture after shooting, and better improve user experience.
To achieve the above object, an embodiment of the present invention provides a post-shooting processing method, including:
setting a local play area on a first picture obtained after shooting;
increasing the transparency of the local play area to form a second picture;
decoding a first video obtained after shooting to obtain N frames of images, compositing each obtained frame with the second picture, and generating N third pictures respectively, where N is an integer greater than 1;
encoding the generated N third pictures to generate a second video;
combining the obtained second video with the first picture obtained after shooting into a new file.
Optionally, combining the obtained second video with the first picture obtained after shooting into a new file includes: storing the second video file between the first picture and a data description area.
Optionally, the new file includes: the first picture file, followed by the second video file, followed by the tail of the first picture.
Optionally, increasing the transparency of the local play area means: setting the transparency of the local play area to a high transparency greater than a preset transparency threshold.
Optionally, the high transparency is 100%, i.e. fully transparent.
The present invention also provides a post-shooting processing device, including a setting module, a processing module, a decoding-and-compositing module, an encoding module and a synthesizing module, wherein:
the setting module is configured to set a local play area on the first picture obtained after shooting;
the processing module is configured to increase the transparency of the local play area to form a second picture;
the decoding-and-compositing module is configured to decode the first video obtained after shooting to obtain N frames of images, composite each obtained frame with the second picture, and generate N third pictures respectively;
the encoding module is configured to encode the generated N third pictures to generate a second video;
the synthesizing module is configured to combine the obtained second video and the first picture obtained after shooting into a new file.
Optionally, the synthesizing module is specifically configured to store the second video between the first picture and a data description area.
Optionally, the processing module is specifically configured to set the transparency of the local play area to a high transparency greater than a preset transparency threshold, so as to form the second picture.
Optionally, the high transparency is 100%, i.e. fully transparent.
The present invention also provides a mobile terminal, including any of the post-shooting processing devices described above.
The technical solution proposed by the present invention includes: setting a local play area on a first picture obtained after shooting; increasing the transparency of the local play area to form a second picture; decoding a first video obtained after shooting to obtain N frames of images, compositing each obtained frame with the second picture, and generating N third pictures respectively; encoding the generated N third pictures to generate a second video; and combining the obtained second video with the first picture obtained after shooting into a new file. In this way, when the new file obtained by processing a file shot with, for example, Live Photo is played, the region outside the local play area set in the second picture blocks the video content of the video file. The technical solution provided by the present invention supplements the existing Live Photo function, highlights the activity of a local region of the picture after shooting, and better improves user experience.
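For concreteness, the following is a minimal sketch of the claimed pipeline as a whole, written as Android-style Java. The class name and helper methods are hypothetical placeholders for the five steps above, not part of the claimed method; per-step sketches accompany the detailed description below.

```java
import android.graphics.Bitmap;
import android.graphics.RectF;

import java.util.List;

// Sketch of the claimed pipeline as a whole. Helper methods are hypothetical
// placeholders; concrete per-step sketches follow in the detailed description.
public abstract class PostShootingPipeline {

    public byte[] process(byte[] firstPictureJpg, byte[] firstVideoMov, RectF localPlayArea) {
        // Steps 1-2: set the local play area on the first picture and raise its
        // transparency, yielding the "second picture" (a picture with an alpha hole).
        Bitmap secondPicture = makeSecondPicture(firstPictureJpg, localPlayArea);

        // Step 3: decode the first video into N frames and composite each frame with
        // the second picture, yielding N "third pictures".
        List<Bitmap> thirdPictures = decodeAndComposite(firstVideoMov, secondPicture);

        // Step 4: encode the N third pictures into the second video.
        byte[] secondVideoMov = encodeVideo(thirdPictures);

        // Step 5: combine the second video with the first picture into the new composite file.
        return buildCompositeFile(firstPictureJpg, secondVideoMov);
    }

    // Hypothetical helpers, one per claimed step.
    protected abstract Bitmap makeSecondPicture(byte[] firstPictureJpg, RectF localPlayArea);
    protected abstract List<Bitmap> decodeAndComposite(byte[] firstVideoMov, Bitmap secondPicture);
    protected abstract byte[] encodeVideo(List<Bitmap> thirdPictures);
    protected abstract byte[] buildCompositeFile(byte[] firstPictureJpg, byte[] secondVideoMov);
}
```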
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing the optional embodiments of the present invention;
Fig. 2 is a flowchart of the post-shooting processing method of the present invention;
Fig. 3 is a schematic diagram of setting a local play area on a picture obtained after Live Photo shooting according to the present invention;
Fig. 4 is a flowchart of an embodiment of the post-shooting processing method of the present invention;
Fig. 5 is a schematic diagram of an embodiment of setting a local play area on a picture obtained after Live Photo shooting according to the present invention;
Fig. 6 is a schematic diagram of the composition of the post-shooting processing device of the present invention.
The realization of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed description of the embodiments
It should be understood that the specific embodiments described herein are only intended to explain the present invention, not to limit it.
A mobile terminal implementing the embodiments of the present invention will now be described with reference to the drawings. In the following description, suffixes such as "module", "part" or "unit" used to denote elements are used only to facilitate the description of the present invention and have no specific meaning in themselves; therefore, "module" and "part" may be used interchangeably.
A mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable media player) or a navigation device, as well as fixed terminals such as a digital TV or a desktop computer. In the following it is assumed that the terminal is a mobile terminal; however, those skilled in the art will understand that, apart from elements used particularly for mobile purposes, the construction according to the embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing the optional embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms, for example in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H). The broadcast receiving module 111 can receive broadcasts using various types of broadcast systems; in particular, it can receive digital broadcasts using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the MediaFLO forward link media data broadcast system, integrated services digital broadcasting-terrestrial (ISDB-T), and so on. The broadcast receiving module 111 may be constructed to be suitable for the various broadcast systems providing broadcast signals as well as the above digital broadcast systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g. an access point or a Node B), an external terminal and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access of the mobile terminal. This module may be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include WLAN (wireless local area network, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Examples of short-range communication technologies include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or acquiring the location information of the mobile terminal. A typical example of the location information module is the GPS (global positioning system). According to current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, so as to accurately calculate three-dimensional current location information in terms of longitude, latitude and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating the current location information in real time.
The A/V input unit 120 is used to receive audio or video signals. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided depending on the construction of the mobile terminal. The microphone 122 can receive sound (audio data) in operating modes such as a phone call mode, a recording mode and a voice recognition mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data can be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 can implement various types of noise cancelling (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The user input unit 130 can generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. caused by being touched), a jog wheel, a jog switch, and so on. In particular, when a touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e. touch input), the orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141, which will be described below in connection with the touch screen.
The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for verifying the user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, the device having the identification module (hereinafter referred to as the "identification device") may take the form of a smart card; therefore, the identification device can be connected to the mobile terminal 100 via a port or other connecting means. The interface unit 170 can be used to receive input (e.g. data information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected to an external cradle, the interface unit 170 can serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transmitted to the mobile terminal. The various command signals or power input from the cradle can serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g. audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to the call or other communication (e.g. text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 can display captured and/or received images, a UI or GUI showing video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other in the form of a layer to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and so on. Some of these displays may be constructed to be transparent so as to allow the user to view from the outside; these may be called transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light-emitting diode) display. Depending on the particular desired embodiment, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 can, when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode or the like, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (e.g. call signal reception sound, message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 can provide output to notify the occurrence of an event of the mobile terminal 100. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 can provide output in a different manner to notify the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration; when a call, a message or some other incoming communication is received, the alarm unit 153 can provide a tactile output (i.e. vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations performed by the controller 180, or temporarily store data that has been output or will be output (e.g. a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data concerning various modes of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, card-type memory (e.g. SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disk, and so on. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 through a network connection.
The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or separately from the controller 180. The controller 180 can perform pattern recognition processing to recognize handwriting input or picture-drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate each element and component.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For hardware implementation, the embodiments described herein may be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such an embodiment may be implemented in the controller 180. For software implementation, an embodiment such as a process or a function may be implemented with a separate software module that allows at least one function or operation to be performed. The software code may be implemented by a software application (or program) written in any appropriate programming language, and the software code may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described according to its functions. In the following, for the sake of brevity, a slide-type mobile terminal among the various types of mobile terminals (folder-type, bar-type, swing-type, slide-type, etc.) will be taken as an example. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 shown in Fig. 1 may be constructed to operate with wired and wireless communication systems that transmit data via frames or packets, as well as with satellite-based communication systems.
Based on the above hardware structure of the mobile terminal and the communication system, the embodiments of the method of the present invention are proposed.
Fig. 2 is a flowchart of the post-shooting processing method of the present invention. As shown in Fig. 2, the method includes:
Step 200: setting a local play area on a first picture obtained after shooting.
Fig. 3 is a schematic diagram of setting a local play area on a picture obtained after shooting with, for example, Live Photo. As shown in Fig. 3, on the first picture obtained after Live Photo shooting, the blank part is the picture region and the shaded part is the local play area that has been set.
Setting the local play area on the first picture can be realized, for example, by drawing with a Canvas. What the present invention emphasizes is that a local play area is set on the first picture obtained after Live Photo shooting.
Step 201: increasing the transparency of the local play area to form a second picture.
In this step, increasing the transparency of the local play area means setting the transparency of the local play area to a high transparency greater than a preset transparency threshold such as 80%; the transparency may be, for example, 90%, or 100% (fully transparent), as long as other scenes can be seen through the local play area. Taking full transparency as an example, this processing amounts to cutting a hole, formed by the local play area, out of the first picture; that is, the position on the second picture corresponding to the local play area is a hole. The second picture is like a painting with a hole in it: holding this painting up and looking at other scenery, only the content of the scenery is visible through the hole, while around the hole one sees the content of the painting itself.
The second picture in this step can use a picture format that supports setting transparency, for example a PNG file that supports an alpha channel.
In practical application, in order to increase the transparency of the local play area, the region whose transparency is to be increased can first be drawn out using a canvas such as Canvas and copied to a new layer; the copied layer is then loaded and selected, and the corresponding region in the original picture is deleted by clicking the original picture; afterwards, a new white layer is created as the background and placed at the bottom; finally, the transparency is adjusted in the copied layer.
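On an Android terminal the same hole-punching can also be done programmatically rather than through layer editing. The following is a minimal sketch, assuming the play area is a rectangle and the standard Android graphics API is available; the class name is illustrative only.

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.PorterDuff;
import android.graphics.PorterDuffXfermode;
import android.graphics.RectF;

import java.io.FileOutputStream;
import java.io.IOException;

// Sketch of steps 200-201: copy the first picture, erase the chosen play area to
// full transparency, and save the result as a PNG (alpha-capable) "second picture".
public class SecondPictureFactory {

    public static void makeSecondPicture(String firstPicturePath, RectF playArea,
                                         String secondPicturePath) throws IOException {
        Bitmap first = BitmapFactory.decodeFile(firstPicturePath);
        // Work on a mutable ARGB copy so the erased pixels can actually carry alpha = 0.
        Bitmap second = first.copy(Bitmap.Config.ARGB_8888, true);

        Canvas canvas = new Canvas(second);
        Paint eraser = new Paint(Paint.ANTI_ALIAS_FLAG);
        // CLEAR sets the covered pixels fully transparent: the "hole" of the local play area.
        eraser.setXfermode(new PorterDuffXfermode(PorterDuff.Mode.CLEAR));
        canvas.drawRect(playArea, eraser); // drawPath(...) would allow an irregular region, e.g. a fan

        try (FileOutputStream out = new FileOutputStream(secondPicturePath)) {
            // PNG keeps the alpha channel; JPEG would flatten the hole away.
            second.compress(Bitmap.CompressFormat.PNG, 100, out);
        }
    }
}
```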
The local play area in the present invention can be selected by the user according to his or her own wishes, and one or more areas may be selected. The main purpose is to open a "window" on the first picture at the position corresponding to the local play area, so that a corresponding dynamic effect can subsequently be displayed there.
Step 202: decoding the first video obtained after shooting to obtain N frames of images, compositing each obtained frame with the second picture, and generating N third pictures respectively, where N is an integer greater than 1.
In this step, since the local play area is provided in the second picture, for each of the N third pictures only the part corresponding to the local play area reflects the changes in the N frames of images obtained by decoding the first video obtained after, for example, Live Photo shooting; the other parts of the third picture are blocked by the picture information of the second picture, i.e. what is displayed there is the static image of the region of the second picture outside the local play area.
The decoding in this step only needs to follow the existing decoding process for files shot with Live Photo; the specific implementation does not limit the protection scope of the present invention and is not repeated here.
Briefly, this means locating the respective parts of the file according to the file format in which the existing Live Photo shooting result is stored.
Step 203: encoding the generated N third pictures to generate a second video.
The encoding in this step is the inverse of the decoding in step 202; it only needs to follow the existing encoding scheme used for Live Photo shooting. The specific implementation does not limit the protection scope of the present invention and is not repeated here.
Video playback is in fact the display of a series of pictures. From the processing of steps 202 and 203 it can be seen that the method of the present invention decodes the first video content obtained after Live Photo shooting to obtain a number of frames of image data; composites each decoded frame with the second picture in which the local play area is provided, obtaining a number of third picture frames in which the local play area still shows the decoded frame data while the other regions are blocked by the second picture; and then re-encodes the obtained third picture frames to generate a second video that presents image information only through the local play area.
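A minimal sketch of steps 202 and 203 on Android, assuming the second picture has already been scaled to the video frame size. Frame extraction uses MediaMetadataRetriever; the re-encoding into an MP4/MOV container (step 203 proper) is only indicated in a comment, since a full MediaCodec/MediaMuxer encoder is beyond the scope of this sketch. The class name and the fixed frame-rate sampling are illustrative assumptions.

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.media.MediaMetadataRetriever;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Sketch of step 202: decode the first video into frames and draw the hole-punched
// second picture over each frame, producing the N "third pictures".
public class ThirdPictureGenerator {

    public static List<Bitmap> composite(String firstVideoPath, Bitmap secondPicture, int fps)
            throws IOException {
        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        retriever.setDataSource(firstVideoPath);
        long durationMs = Long.parseLong(
                retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION));

        List<Bitmap> thirdPictures = new ArrayList<>();
        long stepUs = 1_000_000L / fps;
        for (long tUs = 0; tUs < durationMs * 1000; tUs += stepUs) {
            Bitmap frame = retriever.getFrameAtTime(tUs, MediaMetadataRetriever.OPTION_CLOSEST);
            if (frame == null) {
                continue;
            }
            Bitmap third = frame.copy(Bitmap.Config.ARGB_8888, true);
            // The opaque part of the second picture covers the frame; only the transparent
            // local play area lets the decoded frame show through.
            new Canvas(third).drawBitmap(secondPicture, 0, 0, null);
            thirdPictures.add(third);
        }
        retriever.release();
        // Step 203 (re-encoding the third pictures into the "second video") would typically
        // be done with MediaCodec + MediaMuxer and is omitted from this sketch.
        return thirdPictures;
    }
}
```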
Step 204: combining the obtained second video with the first picture obtained after shooting into a new file.
In one embodiment, the file format after, for example, Live Photo shooting is as shown in Table 1, consisting of the first picture file + video file + data description area + picture tail, for example JPG file + MOV file + data description area + JPG suffix.
JPG file | MOV file | Data description area | JPG suffix
Table 1
In Table 1, the JPG file is a complete JPG picture file; the MOV file is a complete MOV video file; the data description area describes how the JPG file and the MOV file are stored in the whole composite file, for example that the MOV data is stored from the xx-th byte to the yy-th byte; and the JPG suffix indicates that the file extension is .jpg.
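The patent does not fix a byte-level layout for the data description area. Purely for illustration, the sketch below assumes the Table 1 arrangement with a 16-byte description area at the end of the file holding the MOV offset and length, and shows how a player might split the composite file back into its JPG and MOV parts; the field layout is an assumption, not part of the claimed method.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Arrays;

// Illustrative splitter for the Table 1 layout (JPG data, then MOV data, then a
// data description area). The 16-byte trailer format (MOV offset + length as
// big-endian longs) is a hypothetical encoding chosen for this sketch only.
public class CompositeSplitter {

    static long readLong(byte[] b, int off) {
        long v = 0;
        for (int i = 0; i < 8; i++) {
            v = (v << 8) | (b[off + i] & 0xFFL);
        }
        return v;
    }

    public static void split(String compositePath, String jpgOut, String movOut) throws IOException {
        byte[] all = Files.readAllBytes(Paths.get(compositePath));
        long movOffset = readLong(all, all.length - 16); // assumed trailer field
        long movLength = readLong(all, all.length - 8);  // assumed trailer field

        // Everything before the MOV offset is the plain JPG; an ordinary viewer decodes
        // only this part, which is why the composite file still looks like a single photo.
        Files.write(Paths.get(jpgOut), Arrays.copyOfRange(all, 0, (int) movOffset));
        Files.write(Paths.get(movOut),
                Arrays.copyOfRange(all, (int) movOffset, (int) (movOffset + movLength)));
    }
}
```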
In this step, the obtained second video replaces the first video obtained after Live Photo shooting, and the format of the synthesized new file is as shown in Table 2: first picture file + second video file + data description area + picture tail, for example JPG file + second MOV file + data description area + JPG suffix.
JPG file | Second MOV file | Data description area | JPG suffix
Table 2
The synthesis in this step means storing the second video file, in place of the first video, between the JPG file obtained after Live Photo shooting and the data description area; that is, according to the format shown in Table 2, the second MOV file is stored between the JPG file and the data description area.
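A complementary sketch of step 204, writing the Table 2 layout. The 16-byte data description area used here (MOV offset followed by MOV length) is the same hypothetical encoding assumed in the splitting sketch above; the patent only requires that the area describe where the second MOV data sits in the composite file.

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.file.Files;
import java.nio.file.Paths;

// Sketch of step 204: concatenate the first picture and the second video into the
// Table 2 layout, followed by a hypothetical 16-byte data description area.
public class CompositeWriter {

    public static void write(String firstPicturePath, String secondVideoPath,
                             String newFilePath) throws IOException {
        byte[] jpg = Files.readAllBytes(Paths.get(firstPicturePath));
        byte[] mov = Files.readAllBytes(Paths.get(secondVideoPath));

        ByteBuffer description = ByteBuffer.allocate(16);
        description.putLong(jpg.length); // offset at which the second MOV data starts
        description.putLong(mov.length); // length of the second MOV data

        ByteBuffer out = ByteBuffer.allocate(jpg.length + mov.length + 16);
        out.put(jpg);                  // first picture file: what an ordinary viewer shows
        out.put(mov);                  // second video file, stored between picture and description area
        out.put(description.array());  // data description area
        // The result keeps the .jpg suffix ("JPG tail"), so outwardly it is still one picture.
        Files.write(Paths.get(newFilePath), out.array());
    }
}
```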
The file generated by the post-shooting processing method of the present invention takes the form photo + second video. The overall packaging is consistent with the packaging after Live Photo shooting, so for the user it still displays as a single JPG picture.
In this way, when the new file obtained by processing a file shot with Live Photo according to the present invention is played, the playback of the second video in the new file shows the video information of the first video only inside the local play area, while the other regions statically display the content of the first picture. In other words, the video content corresponding to the local play area can only be shown through the local play area that has been set. Put more plainly, the local playing function provided by the present invention punches a hole in the JPG file obtained after Live Photo shooting: when the Live Photo video content is played, only the video content inside this hole is shown, and everything outside the hole is the content of the original JPG picture. The technical solution provided by the present invention supplements the existing Live Photo function, highlights the activity of a local region of the picture after shooting, and better improves user experience.
Fig. 4 is a flowchart of an embodiment of the post-shooting processing method of the present invention, taking Live Photo shooting as an example. As shown in Fig. 4, the method includes:
Step 400: on the first picture obtained after Live Photo shooting, setting a local play area by drawing with Canvas.
As shown in Fig. 5, the local play area in this embodiment is the fan region in the picture.
Step 401: setting the transparency of the local play area to fully transparent to form a second picture.
The second picture in this embodiment can use a picture format that supports an alpha channel so that transparency can be set, for example a PNG file.
Step 402: according to the storage format after Live Photo shooting, decoding the first video obtained after Live Photo shooting to obtain N frames of images, compositing each obtained frame with the second picture, and generating N third pictures respectively.
Step 403: encoding the generated N third pictures to generate a second video.
Step 404: combining the obtained second video with the first picture obtained after Live Photo shooting into a new file.
The new file in this embodiment includes: the JPG file obtained after Live Photo shooting + the second MOV file obtained from the first MOV file after Live Photo shooting and the PNG file obtained by the processing of step 401 + the JPG tail.
In this embodiment, what the user sees in the synthesized new file is a complete picture, namely the JPG file obtained after Live Photo shooting.
In this way, when the new file obtained by processing the file shot with Live Photo according to the present invention is played, the playback of the second video in the new file shows the video information of the first video only inside the local play area, while the other regions statically display the content of the first picture; that is, the video content corresponding to the local play area can only be shown through the local play area that has been set. Put more plainly, the local playing function provided by the present invention punches a hole in the JPG file obtained after Live Photo shooting: when the Live Photo video content is played, only the video content inside this hole is shown, and everything outside the hole is the content of the original JPG picture. The technical solution provided by the present invention supplements the existing Live Photo function, highlights the activity of a local region of the picture after shooting, and better improves user experience.
Fig. 6 is a schematic diagram of the composition of the post-shooting processing device of the present invention. As shown in Fig. 6, the device at least includes a setting module, a processing module, a decoding-and-compositing module, an encoding module and a synthesizing module, wherein:
the setting module is configured to set a local play area on the first picture obtained after shooting;
the processing module is configured to increase the transparency of the local play area to form a second picture;
the decoding-and-compositing module is configured to decode the first video obtained after shooting to obtain N frames of images, composite each obtained frame with the second picture, and generate N third pictures respectively;
the encoding module is configured to encode the generated N third pictures to generate a second video;
the synthesizing module is configured to combine the obtained second video and the first picture obtained after shooting into a new file.
The second picture can use a picture format that supports an alpha channel so that transparency can be set, for example a PNG file.
The new file obtained after synthesis includes: first picture file + second video file + picture tail, for example JPG file + second MOV file + JPG tail.
Optionally,
the processing module is specifically configured to set the transparency of the local play area to a high transparency greater than a preset transparency threshold, so as to form the second picture.
Optionally, the high transparency is 100%, i.e. fully transparent.
The synthesizing module is specifically configured to store the obtained second video between the first picture obtained after shooting and the data description area.
The present invention also provides a mobile terminal, including any of the post-shooting processing devices described above.
The present application also provides a device for realizing post-shooting processing, at least including a memory and a processor, wherein the memory stores the following executable instructions: setting a local play area on a first picture obtained after shooting; increasing the transparency of the local play area to form a second picture; decoding a first video obtained after shooting to obtain N frames of images, compositing each obtained frame with the second picture, and generating N third pictures respectively; encoding the generated N third pictures to generate a second video; and combining the obtained second video with the first picture obtained after shooting into a new file.
It should be noted that, in this document, the terms "comprise", "include" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or also includes elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes that element.
The serial numbers of the embodiments of the present invention are for description only and do not represent the superiority or inferiority of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on such understanding, the technical solution of the present invention, or the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk or an optical disk) and includes a number of instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to perform the method described in each embodiment of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit the scope of the claims of the present invention. Any equivalent structural or flow transformation made using the contents of the description and drawings of the present invention, or any direct or indirect use in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (10)

1. A post-shooting processing method, characterized by including:
setting a local play area on a first picture obtained after shooting;
increasing the transparency of the local play area to form a second picture;
decoding a first video obtained after shooting to obtain N frames of images, compositing each obtained frame with the second picture, and generating N third pictures respectively, where N is an integer greater than 1;
encoding the generated N third pictures to generate a second video;
combining the obtained second video with the first picture obtained after shooting into a new file.
2. The post-shooting processing method according to claim 1, characterized in that combining the obtained second video with the first picture obtained after shooting into a new file includes: storing the second video file between the first picture and a data description area.
3. The post-shooting processing method according to claim 1 or 2, characterized in that
the new file includes: the first picture file, followed by the second video file, followed by the tail of the first picture.
4. The post-shooting processing method according to claim 2, characterized in that increasing the transparency of the local play area is: setting the transparency of the local play area to a high transparency greater than a preset transparency threshold.
5. The post-shooting processing method according to claim 4, characterized in that the high transparency is 100%, i.e. fully transparent.
6. A post-shooting processing device, characterized by including a setting module, a processing module, a decoding-and-compositing module, an encoding module and a synthesizing module, wherein:
the setting module is configured to set a local play area on a first picture obtained after shooting;
the processing module is configured to increase the transparency of the local play area to form a second picture;
the decoding-and-compositing module is configured to decode a first video obtained after shooting to obtain N frames of images, composite each obtained frame with the second picture, and generate N third pictures respectively;
the encoding module is configured to encode the generated N third pictures to generate a second video;
the synthesizing module is configured to combine the obtained second video and the first picture obtained after shooting into a new file.
7. The post-shooting processing device according to claim 6, characterized in that the synthesizing module is specifically configured to store the second video between the first picture and a data description area.
8. The post-shooting processing device according to claim 7, characterized in that the processing module is specifically configured to set the transparency of the local play area to a high transparency greater than a preset transparency threshold, so as to form the second picture.
9. The post-shooting processing device according to claim 8, characterized in that the high transparency is 100%, i.e. fully transparent.
10. A mobile terminal, characterized by including the post-shooting processing device according to any one of claims 6 to 9.
CN201611073053.1A 2016-11-29 2016-11-29 Post-shooting processing method, post-shooting processing device and mobile terminal Pending CN106686298A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611073053.1A CN106686298A (en) 2016-11-29 2016-11-29 Post-shooting processing method, post-shooting processing device and mobile terminal

Publications (1)

Publication Number Publication Date
CN106686298A true CN106686298A (en) 2017-05-17

Family

ID=58866987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611073053.1A Pending CN106686298A (en) 2016-11-29 2016-11-29 Post-shooting processing method, post-shooting processing device and mobile terminal

Country Status (1)

Country Link
CN (1) CN106686298A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9189852B2 (en) * 2012-02-02 2015-11-17 Google Inc. Method for manually aligning two digital images on mobile devices
CN104380728A (en) * 2012-06-01 2015-02-25 阿尔卡特朗讯公司 Method and apparatus for mixing a first video signal and a second video signal
CN103970415A (en) * 2014-04-13 2014-08-06 数源科技股份有限公司 Method for achieving fade-in and fade-out effects based on Android
CN105245777A (en) * 2015-09-28 2016-01-13 努比亚技术有限公司 Method and device for generating video image
CN105516610A (en) * 2016-02-19 2016-04-20 深圳新博科技有限公司 Method and device for shooting local dynamic image

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110248116A (en) * 2019-06-10 2019-09-17 腾讯科技(深圳)有限公司 Image processing method, device, computer equipment and storage medium
CN110248116B (en) * 2019-06-10 2021-10-26 腾讯科技(深圳)有限公司 Picture processing method and device, computer equipment and storage medium
WO2023155576A1 (en) * 2022-02-16 2023-08-24 Beijing Xiaomi Mobile Software Co., Ltd. Encoding/decoding video picture data

Legal Events

Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20170517)