CN106303290B - A kind of terminal and the method for obtaining video - Google Patents
- Publication number
- CN106303290B (grant of application CN201610865332A)
- Authority
- CN
- China
- Prior art keywords
- duration
- target
- video
- reference object
- day
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H04N5/2624—Studio circuits, e.g. for mixing, switching-over, special effects; circuits for obtaining an image composed of whole input images, e.g. splitscreen
- H04N5/265—Studio circuits: mixing
- H04N5/76—Television signal recording
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
The present invention discloses a method and a terminal for obtaining video. The method includes: obtaining a first duration; obtaining first multimedia data of a target reference object recorded within a first sub-duration; obtaining at least one target image of the target reference object captured after the first sub-duration; and obtaining second multimedia data of the target reference object recorded within a second sub-duration, the first and second sub-durations being obtained by evenly or unevenly splitting the first duration. A target video is then obtained from the first and second multimedia data and the at least one target image. The present invention at least solves the problems of limited and inflexible recorded content caused by a short, relatively fixed recording duration, thereby improving user experience and the usability of the terminal product.
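As a concrete illustration of the pipeline the abstract describes, the Python sketch below mimics how the target video's content is ordered: the recording from the first sub-duration, then the still target image(s), then the recording from the second sub-duration. All names are invented for illustration and are not from the patent.

```python
# Illustrative sketch (names are hypothetical, not from the patent):
# assemble the target video's content in the order the abstract gives.

def assemble_target_video(first_multimedia, target_images, second_multimedia):
    """Concatenate the three content sources in order: first sub-duration
    recording, target image(s), second sub-duration recording."""
    return list(first_multimedia) + list(target_images) + list(second_multimedia)

# Example: two pre-shot frames, one still image, one post-shot frame.
sequence = assemble_target_video(["f1", "f2"], ["img"], ["s1"])
```

The point of the sketch is only the ordering constraint; a real implementation would also handle encoding, timestamps, and container formats.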
Description
Technical field
The present invention relates to photographing technology, and in particular to a terminal and a method for obtaining video.
Background technique
With the development of science and technology, terminal devices such as smart phones, tablet computers (PADs), and smart glasses have become increasingly popular. At present, to meet users' shooting needs, at least some terminal devices implement the following function: video of a reference object is recorded for a specific time before and/or after a picture of the reference object is taken, and once the picture has been taken, the recorded video is displayed together with the captured picture. In this way the user can conveniently watch how the reference object changes over the whole recording and shooting process. Taking a flower as the reference object, for example, the user can watch the flower open from bud to bloom, which greatly improves the visual experience. However, the current value of the recording duration (the specific time) is small and relatively fixed, for example 1.5 s. A short recording duration leads to recorded content that is not rich enough to satisfy the user's visual experience, and a relatively fixed recording duration is inflexible and poorly suited to shooting scenes that require long recordings.
Summary of the invention
To solve the existing technical problems, embodiments of the present invention provide a terminal and a method for obtaining video, which at least solve the problems of limited and inflexible recorded content caused by a short, relatively fixed recording duration, thereby improving user experience and the usability of the terminal product.
The technical solution of the embodiment of the present invention is as follows:
An embodiment of the present invention provides a terminal, the terminal including:
a first acquisition unit, configured to obtain a first duration, the first duration being the recording duration for a target reference object; the first duration is at least greater than a second duration, the second duration being the sum of the predetermined duration for which the terminal records video of a reference object before taking a picture and the predetermined duration for which it records video of the reference object after taking the picture;
a second acquisition unit, configured to obtain first multimedia data of the target reference object recorded within a first sub-duration;
a third acquisition unit, configured to obtain at least one target image of the target reference object captured after the first sub-duration;
a fourth acquisition unit, configured to obtain second multimedia data of the target reference object recorded within a second sub-duration, the first and second sub-durations being obtained by evenly or unevenly splitting the first duration;
a first obtaining unit, configured to obtain a target video from the first and second multimedia data and the at least one target image.
Optionally:
the first multimedia data is at least one first image of the target reference object, captured within the first sub-duration at intervals of a first predetermined duration;
the second multimedia data is at least one second image of the target reference object, captured within the second sub-duration at intervals of a second predetermined duration;
correspondingly, the first obtaining unit is further configured to generate the target video from the at least one first image, the target image, and the at least one second image in that order.
Optionally, the terminal further includes:
a first detection unit, configured to trigger a first playback unit when a first operation is detected and an operational attribute of the first operation meets a first predetermined condition, the first operation being at least an operation on the target image;
the first playback unit, configured to play the target video.
Optionally:
the operational attribute is an operating force;
the first detection unit is configured to judge, after detecting the first operation, whether the operating force of the first operation reaches a predetermined force threshold, and to generate a first judging result;
when the first judging result indicates that the operating force of the first operation reaches the predetermined force threshold, the operational attribute of the first operation is judged to meet the first predetermined condition.
Optionally, the first playback unit is further configured to:
obtain the playing duration with which the terminal plays a synthetic video, the synthetic video being the synthesis of a first video recorded over the predetermined recording duration before the picture is taken and a second video recorded over the predetermined recording duration after the picture is taken;
determine that playing duration to be the target playing duration;
play the target video within the target playing duration.
An embodiment of the present invention also provides a method for obtaining video, the method including:
obtaining a first duration, the first duration being the recording duration for a target reference object; the first duration is at least greater than a second duration, the second duration being the sum of the predetermined duration for which the terminal records video of a reference object before taking a picture and the predetermined duration for which it records video of the reference object after taking the picture;
obtaining first multimedia data of the target reference object recorded within a first sub-duration;
obtaining at least one target image of the target reference object captured after the first sub-duration;
obtaining second multimedia data of the target reference object recorded within a second sub-duration, the first and second sub-durations being obtained by evenly or unevenly splitting the first duration;
obtaining a target video from the first and second multimedia data and the at least one target image;
when a first operation is detected and an operational attribute of the first operation meets a first predetermined condition, the first operation being at least an operation on the target image, playing the target video.
Optionally:
the first multimedia data is at least one first image of the target reference object, captured within the first sub-duration at intervals of a first predetermined duration;
the second multimedia data is at least one second image of the target reference object, captured within the second sub-duration at intervals of a second predetermined duration;
correspondingly, obtaining the target video from the first and second multimedia data and the at least one target image includes:
generating the target video from the at least one first image, the target image, and the at least one second image in that order.
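To make the frame-sequence variant concrete, here is a minimal Python sketch under the assumption that one frame is grabbed every scheduled interval within each sub-duration; the function names and interval values are hypothetical, not from the patent.

```python
def frame_times(sub_duration, interval):
    """Times (seconds from the start of one sub-duration) at which a
    frame is grabbed, one frame every `interval` seconds."""
    times, t = [], 0.0
    while t < sub_duration:
        times.append(t)
        t += interval
    return times

def build_sequence(first_frames, target_images, second_frames):
    """Order the frames as the patent describes: first images, then the
    target image(s), then the second images."""
    return list(first_frames) + list(target_images) + list(second_frames)

# Example: a 1.5 s sub-duration sampled every 0.5 s yields three grabs.
grabs = frame_times(1.5, 0.5)
```

In a real camera pipeline the grabs would come from the preview stream rather than a loop over wall-clock times; the sketch only fixes the ordering and sampling logic.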
Optionally, after the target video is obtained, the method further includes:
when a first operation is detected and an operational attribute of the first operation meets a first predetermined condition, the first operation being at least an operation on the target image, playing the target video.
Optionally, the method further includes:
the operational attribute is an operating force;
after the first operation is detected, judging whether the operating force of the first operation reaches a predetermined force threshold, and generating a first judging result;
when the first judging result indicates that the operating force of the first operation reaches the predetermined force threshold, judging that the operational attribute of the first operation meets the first predetermined condition.
Optionally, before the target video is played, the method further includes:
obtaining the playing duration with which the terminal plays a synthetic video, the synthetic video being the synthesis of a first video recorded over the predetermined recording duration before the picture is taken and a second video recorded over the predetermined recording duration after the picture is taken;
determining that playing duration to be the target playing duration;
correspondingly, playing the target video includes:
playing the target video within the target playing duration.
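One simple way to honor the target playing duration, sketched under the assumption that the target video is a list of frames played back at a uniform rate (all names invented):

```python
def playback_interval(frame_count, target_playing_duration):
    """Uniform inter-frame delay so that `frame_count` frames span
    exactly `target_playing_duration` seconds of playback."""
    if frame_count <= 0:
        raise ValueError("need at least one frame")
    return target_playing_duration / frame_count

# Example: 30 frames squeezed into a 3-second target playing duration.
delay = playback_interval(30, 3.0)
```

This effectively speeds up or slows down the target video so that, however long the recording was, it plays back in the same time the terminal uses for the ordinary synthetic video.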
In the terminal and the method for obtaining video provided by the embodiments of the present invention, the method includes: obtaining the recording duration for a target reference object, the recording duration being greater than the second duration; obtaining first multimedia data of the target reference object recorded within a first sub-duration of the recording duration and second multimedia data recorded within a second sub-duration; capturing at least one target image of the target reference object after the first sub-duration; and producing a target video from the first and second multimedia data and the at least one target image, the first and second sub-durations being obtained by evenly or unevenly splitting the recording duration.
Compared with the current recording duration, whose value is small and relatively fixed, the recording duration in this scheme is more flexible and suits more shooting occasions. This scheme also makes longer recordings of the target reference object possible, so the recorded content is much richer, which greatly improves user experience and the usability of the terminal product.
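The splitting step in the summary above can be sketched as follows; the function and parameter names are hypothetical, and the validity check encodes the patent's requirement that the first duration exceed the second duration (the sum of the fixed pre- and post-shot recording durations).

```python
def cut_first_duration(first_duration, pre_record, post_record, first_share=0.5):
    """Split `first_duration` into two sub-durations.

    first_share=0.5 gives the patent's 'average cutting'; any other
    value in (0, 1) gives a 'non-average cutting'.
    """
    if not 0.0 < first_share < 1.0:
        raise ValueError("first_share must lie strictly between 0 and 1")
    if first_duration <= pre_record + post_record:
        raise ValueError("first duration must exceed the fixed recording total")
    first_sub = first_duration * first_share
    return first_sub, first_duration - first_sub

# Average cutting of a 6 s recording duration (fixed segments of 1.5 s each):
even = cut_first_duration(6.0, 1.5, 1.5)
# Non-average cutting, giving the first sub-duration a quarter share:
uneven = cut_first_duration(8.0, 1.5, 1.5, first_share=0.25)
```

A non-average split lets the terminal bias the recording toward the slower or more interesting phase of the reference object's change.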
Detailed description of the invention
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing embodiments of the present invention;
Fig. 2 is the wireless communication system schematic diagram of mobile terminal 100 as shown in Figure 1;
Fig. 3 is the electrical structure block diagram of camera;
Fig. 4 is a schematic flowchart of a first embodiment of the method for obtaining video of the present invention;
Fig. 5 is a schematic flowchart of a second embodiment of the method for obtaining video of the present invention;
Fig. 6(a) is a schematic diagram of an image of the target reference object captured within the first sub-duration in an embodiment of the present invention;
Fig. 6(b) is a schematic diagram of an image of the target reference object captured after the first sub-duration in an embodiment of the present invention;
Fig. 6(c) is a schematic diagram of an image of the target reference object captured within the second sub-duration in an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of an embodiment of the terminal of the present invention;
Fig. 8 is another schematic structural diagram of an embodiment of the terminal of the present invention.
The realization of the objects, functional features, and advantages of the present invention will be further described with reference to the accompanying drawings in combination with the embodiments.
Specific embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention, it is not intended to limit the present invention.
The terminal of each embodiment of the present invention is now described with reference to the drawings. In the following description, suffixes such as "module", "component", or "unit" used for elements are intended only to facilitate the explanation of the embodiments of the present invention and have no specific meaning in themselves; thus "module" and "component" may be used interchangeably.
Terminals may be implemented in various forms. For example, the terminals described in the embodiments of the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, personal digital assistants (PDA), tablet computers (PAD), portable media players (PMP), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal; however, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 generally includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a mobile communication module 112, a wireless Internet module 113, and a short-range communication module 114.
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (for example, an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access of the mobile terminal and may be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and so on.
The A/V input unit 120 is used for receiving audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture apparatus in a video acquisition mode or an image capture mode, and the processed image frames may be displayed on a display unit 151. Image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided depending on the construction of the mobile terminal. The microphone 122 can receive sound (audio data) in operational modes such as a telephone call mode, a recording mode, and a speech recognition mode, and can process such sound into audio data. In the telephone call mode, the processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 may implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) noise or interference generated while sending and receiving audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes of resistance, pressure, capacitance, etc. caused by contact), a jog wheel, a jog switch, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, an open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of the user's contact with the mobile terminal 100 (that is, touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled with an external device.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external devices may include wired or wireless headset ports, external power supply (or battery charger) ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and so on. The identification module may store various information for authenticating the user of the mobile terminal 100 and may include a User Identify Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and so on. In addition, a device having the identification module (hereinafter referred to as an "identification device") may take the form of a smart card; therefore, the identification device can be connected with the mobile terminal 100 via a port or other connection devices. The interface unit 170 can be used to receive input (for example, data information, electric power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transmit data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (for example, audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audible, and/or tactile manner, and may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a telephone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to the call or other communication (for example, text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing the video or image and related functions, and so on.
Meanwhile when display unit 151 and touch tablet in the form of layer it is superposed on one another to form touch screen when, display unit
151 may be used as input unit and output device.Display unit 151 may include liquid crystal display (LCD, Liquid
Crystal Display), thin film transistor (TFT) LCD (TFT-LCD, Thin Film Transistor-LCD), organic light-emitting diodes
It manages in (OLED, Organic Light-Emitting Diode) display, flexible display, three-dimensional (3D) display etc.
It is at least one.Some in these displays may be constructed such that transparence to allow user to watch from outside, this is properly termed as
Transparent display, typical transparent display can be, for example, transparent organic light emitting diode (TOLED) display etc..According to
Specific desired embodiment, mobile terminal 100 may include two or more display units (or other display devices), example
Such as, mobile terminal may include outernal display unit (not shown) and inner display unit (not shown).Touch screen can be used for examining
Survey touch input pressure and touch input position and touch input area.
The audio output module 152 may, when the mobile terminal is in a mode such as a call signal reception mode, a call mode, a recording mode, a speech recognition mode, or a broadcast reception mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into audio signals and output them as sound. Moreover, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 may provide output to notify the occurrence of an event of the mobile terminal 100. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 may provide output in different manners to notify the occurrence of an event. For example, the alarm unit 153 may provide output in the form of vibration; when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (that is, vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs for the processing and control operations executed by the controller 180, or may temporarily store data that has been output or is to be output (for example, a phone book, messages, still images, video, etc.). Moreover, the memory 160 may store data about the various manners of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disc, and so on. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 over a network connection.
The controller 180 usually controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform pattern recognition processing so as to recognize handwriting input or picture drawing input performed on the touch screen as characters or images. The controller 180 further includes an image processing module 182 that can process files in forms such as photos, graphics, and text; the image processing module 182 may be constructed within the controller 180 or separately from the controller 180.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate electric power required for operating each element and component.
The terminal may be provided with at least one slot into which a subscriber identification module 100 is correspondingly inserted; through the subscriber identification module 100, the terminal connects and communicates with a network or other communication devices via the wireless communication unit 110. The subscriber identification module 100 may specifically be a Subscriber Identity Module (SIM) card, a Universal Subscriber Identity Module (USIM) card, a Removable User Identity Module (RUIM) card, or a User Identity Module (UIM) card.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For hardware implementation, the embodiments described herein may be implemented by using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, and an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For software implementation, embodiments such as processes or functions may be implemented with separate software modules that allow the execution of at least one function or operation. Software code may be implemented by a software application (or program) written in any suitable programming language, and the software code may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. In the following, for the sake of brevity, a slide-type mobile terminal among various types of mobile terminals such as folder-type, bar-type, swing-type, and slide-type mobile terminals is taken as an example. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 shown in Fig. 1 may be configured to operate with wired and wireless communication systems that transmit data via frames or packets, as well as with satellite-based communication systems.
A communication system in which the mobile terminal 100 according to the present invention can operate is now described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teaching is equally applicable to other types of systems.
Referring to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or by an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BSs 270 may also be referred to as base transceiver subsystems (BTS) or by other equivalent terms. In such a case, the term "base station" may be used to refer collectively to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, the individual sectors of a particular BS 270 may be referred to as a plurality of cell sites.
As shown in Fig. 2, a broadcasting transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 as shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signal transmitted by the BT 295. In Fig. 2, several satellites 300 are shown; for example, global positioning system (GPS) satellites 300 may be used. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Other technologies that can track the position of the mobile terminal 100 may be used instead of, or in addition to, GPS tracking technology. In addition, at least one of the GPS satellites 300 may selectively or additionally process satellite DMB transmissions.
As one typical operation of the wireless communication system, the BSs 270 receive reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communications. Each reverse link signal received by a given base station 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including the coordination of soft handoff procedures between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward link signals to the mobile terminals 100.
Fig. 3 is a block diagram of the electrical structure of the camera.
The lens 1211 is composed of a plurality of optical lenses for forming a subject image, and is a single-focus lens or a zoom lens. The lens 1211 can move in the optical-axis direction under the control of a lens driver 1221; the lens driver 1221 controls the focal position of the lens 1211 according to a control signal from a lens drive control circuit 1222 and, in the case of a zoom lens, can also control the focal length. The lens drive control circuit 1222 performs drive control of the lens driver 1221 according to control commands from a microprocessor 1217.
A photographing element 1212 is disposed on the optical axis of the lens 1211, near the position where the subject image is formed by the lens 1211. The photographing element 1212 serves to capture the subject image and acquire image data. Photodiodes constituting the individual pixels are arranged two-dimensionally in a matrix on the photographing element 1212. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and this current is charge-accumulated by a capacitor connected to each photodiode. The front surface of each pixel is provided with a Bayer-arranged RGB (red, green, blue) color filter.
The photographing element 1212 is connected to an imaging circuit 1213, which performs charge accumulation control and image signal readout control in the photographing element 1212, reduces the reset noise of the read image signal (analog image signal), performs waveform shaping, and then raises the gain and so on to obtain an appropriate signal level.
The imaging circuit 1213 is connected to an A/D converter 1214, which performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to a bus 1227.
The bus 1227 is a transfer path for transferring the various data read or generated inside the camera. Connected to the bus 1227 are the above-mentioned A/D converter 1214, as well as an image processor 1215, a JPEG processor 1216, the microprocessor 1217, a synchronous dynamic random access memory (SDRAM, Synchronous Dynamic Random Access Memory) 1218, a memory interface (hereinafter referred to as memory I/F) 1219, and a liquid crystal display (LCD, Liquid Crystal Display) driver 1220.
The image processor 1215 performs various kinds of image processing on the image data output from the photographing element 1212, such as optical black (OB, Optical Black) subtraction processing, white balance adjustment, color matrix operation, gamma conversion, color-difference signal processing, noise removal processing, synchronization processing, and edge processing. When recording image data on the storage medium 1225, the JPEG processor 1216 compresses the image data read from the SDRAM 1218 according to the JPEG compression scheme. In addition, the JPEG processor 1216 performs decompression of JPEG image data for image reproduction and display. During decompression, a file recorded in the storage medium 1225 is read out, decompressed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM 1218 and displayed on the LCD 1226. In the present embodiment, the JPEG scheme is adopted as the image compression/decompression scheme; however, the compression/decompression scheme is not limited thereto, and other schemes such as MPEG, TIFF, or H.264 may of course be adopted.
The microprocessor 1217 functions as a control unit for the camera as a whole and collectively controls the various processing sequences of the camera. The microprocessor 1217 is connected to an operating unit 1223 and a flash memory 1224.
The operating unit 1223 includes, but is not limited to, physical buttons or virtual keys. The physical or virtual keys may be operation controls such as a power button, a camera button, an edit button, a moving-image button, a playback button, a menu button, a cross key, an OK button, a delete button, an enlarge button, and various other input buttons and keys. The operating unit 1223 detects the operation states of these operation controls and outputs the detection results to the microprocessor 1217. In addition, a touch panel is provided on the front surface of the LCD 1226 serving as the display; the touch panel detects the user's touch position and outputs that touch position to the microprocessor 1217. The microprocessor 1217 executes various processing sequences corresponding to the user's operation according to the detection results of the operation positions from the operating unit 1223.
The flash memory 1224 stores programs for executing the various processing sequences of the microprocessor 1217, and the microprocessor 1217 controls the camera as a whole according to these programs. In addition, the flash memory 1224 stores various adjustment values of the camera; the microprocessor 1217 reads the adjustment values and controls the camera according to them.
The SDRAM 1218 is an electrically rewritable volatile memory for temporarily storing image data and the like. The SDRAM 1218 temporarily stores the image data output from the analog/digital (A/D) converter 1214 and the image data processed in the image processor 1215, the JPEG processor 1216, and so on.
The memory interface 1219 is connected to the storage medium 1225 and performs control for writing image data, and data such as file headers attached to the image data, into the storage medium 1225 and reading them from the storage medium 1225. The storage medium 1225 may be implemented as a storage medium such as a memory card freely attachable to and detachable from the camera body, but is not limited thereto; it may also be a hard disk or the like built into the camera body.
The LCD driver 1210 is connected to the LCD 1226. Image data processed by the image processor 1215 is stored in the SDRAM 1218; when display is required, the image data stored in the SDRAM 1218 is read out and displayed on the LCD 1226. Alternatively, image data compressed by the JPEG processor 1216 is stored in the SDRAM 1218; when display is required, the JPEG processor 1216 reads the compressed image data from the SDRAM 1218 and decompresses it, and the decompressed image data is displayed via the LCD 1226.
The LCD 1226 is arranged on the back surface of the camera body and performs image display, but is not limited thereto; various display panels based on organic EL, i.e., organic light-emitting diodes (OLED, Organic Electro-Luminescence), may also be used for image display.
Based on the above-described mobile terminal hardware configuration and the electrical structure schematic of the camera, the embodiments of the present invention are proposed; refer to the subsequent description for details.
Those skilled in the art should understand a function of the terminal involved in the embodiments of the present invention: when taking a picture, the terminal can record video of the subject for a predetermined recording duration both before and after the shot. This function is referred to as the dynamic picture (live photo) function of the terminal. Specifically, the function is: before taking a picture of the subject, a 1.5 s video segment is recorded (the predetermined recording duration is usually 1.5 s); then, after the picture is taken, another 1.5 s video segment of the subject is recorded, so that a total of 3 s of video is recorded before and after the picture is taken. During playback, the 1.5 s video recorded before the shot, the picture taken, and the 1.5 s video recorded after the shot are played in sequence, yielding a dynamically playing picture and greatly improving the user's visual experience. At present, however, when the terminal implements the live photo function, the before and after recording durations can only be 1.5 s each. This relatively fixed recording duration is inflexible, and the shooting occasions to which it is applicable are limited. Meanwhile, the short recording duration leads to insufficiently rich recorded content and cannot satisfy the user's visual experience.
Embodiment one
A first embodiment of the method for obtaining a video provided by the present invention is applied in a terminal having video recording and photographing functions. The photographing function may be implemented by the electrical structure shown in Fig. 3. The terminal may be a tablet computer, a conventional mobile phone, a smart phone, an e-reader, or the like, or a wearable device such as smart glasses, a smart watch, or smart shoes. The preferred terminal in the present invention is a smart phone. Through the method for obtaining a video shown in Fig. 4, the terminal can at least solve the problems in the current live photo function, such as insufficiently rich recorded content and poor flexibility, caused by the short and relatively fixed recording duration.
Fig. 4 is a schematic flowchart of the first embodiment of the method for obtaining a video of the present invention. As shown in Fig. 4, the method includes:
Step 401: Obtain a first duration, the first duration being the recording duration for a target subject. The first duration is at least greater than a second duration, the second duration being the sum of the predetermined recording duration for which the terminal records video of a subject before taking a picture and the predetermined recording duration for which the terminal records video of the subject after taking the picture.
Here, the executing entity of step 401 is the terminal. The second duration is the total recording duration that the terminal achieves for the target subject using the live photo function. The terminal determines the first duration (the recording duration) according to the type of the target subject. The terminal in this solution is preferably used to shoot different types of scenes such as city scenery, natural scenery, astronomical phenomena, urban life, building construction, and biological development; of course, it can also be used to shoot daily-life scenes and everyday objects, without specific limitation. Taking a flowering period in natural scenery as the target subject as an example: considering that the time from bud to full bloom is usually several days, when the target subject is a flower, its first duration (recording duration) may be set to 3 days or 5 days, etc. Taking a star in an astronomical phenomenon as an example: considering that a star usually appears and disappears over several hours, when the target subject is a star, its first duration may be set to 6 hours or 5 hours, etc. Taking cell fission in biological development as an example: considering that a cell usually takes several months from formation to fission, when the target subject is a cell, its first duration may be set to one month or three months, etc.
In a specific implementation, the user makes a selection from a list, provided by the terminal, of target subject types and first durations (in the list, a certain recording duration is set in advance for each type of target subject), choosing a target subject type and a target subject displayed under that type; that is, the user performs a selection operation, for example selecting a target subject type and a target subject. After reading this selection operation, the terminal knows the type of the current target subject and, according to the contents recorded in the list, reads the recording duration configured in advance for the selected target subject. The second duration is the total duration of video recording achieved using the live photo function, namely 3 s (1.5 s + 1.5 s). Here, in consideration of the richness of the recorded content, the recording duration in this solution is preferably greater than 3 s.
Step 402: Obtain first multimedia data of the target subject recorded within a first sub-duration.
Here, the executing entity of step 402 is the terminal. After reading the recording duration configured for the target subject, the terminal starts timing the first duration, and within a part of the first duration (the first sub-duration) records and saves the change process of the target subject during that part, forming the first multimedia data. The terminal reads the first multimedia data recorded and saved within the first sub-duration.
Step 403: Obtain at least one target image of the target subject taken after the first sub-duration.
Here, the executing entity of step 403 is the terminal. After recording the change process of the target subject within the first sub-duration, the terminal starts its photographing function and takes one or more pictures of the target subject at that moment, obtaining at least one target image. The terminal reads the captured target image(s).
Step 404: Obtain second multimedia data of the target subject recorded within a second sub-duration, where the first sub-duration and the second sub-duration are obtained by evenly or unevenly splitting the first duration.
Here, the executing entity of step 404 is the terminal. While timing the first duration, the terminal records and saves the change process of the target subject within another part of the first duration (the second sub-duration), forming the second multimedia data. The terminal reads the second multimedia data recorded and saved within the second sub-duration.
Those skilled in the art should understand that, during segmented recording, the terminal records sequentially in the chronological order indicated by the recording duration, but there is no strict order among the processes of reading the recorded multimedia data and the captured pictures, i.e., among the aforementioned steps 402, 403, and 404. It should be noted that the first sub-duration and the second sub-duration are different recording time segments within the recording duration. The first multimedia data and the second multimedia data may be video, images, text, or the like, preferably video or images.
Step 405: Obtain a target video according to the first and second multimedia data and the at least one target image.
Here, the executing entity of step 405 is the terminal. After reading the first and second multimedia data and the at least one image taken of the target subject, the terminal generates/produces a video in the order of the first multimedia data, the at least one target image, and the second multimedia data, obtaining the target video, which shows the change process of the target subject within the first duration. Picture-to-video conversion software may be used to generate video from pictures; for details, refer to the existing related technologies, which are not repeated here.
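The assembly order in step 405 (first segment, then the still image(s), then the second segment) can be sketched abstractly. This is a sketch under the assumption that each recorded segment and each still has already been decoded into a frame list (placeholder strings here); the function and parameter names are illustrative, not from the disclosure.

```python
def build_target_video(first_segment, target_images, second_segment,
                       still_hold=3):
    """Concatenate frames in step-405 order: first multimedia data,
    then each target image held for a few frames, then the second
    multimedia data."""
    frames = list(first_segment)
    for image in target_images:
        frames.extend([image] * still_hold)  # hold the still briefly
    frames.extend(second_segment)
    return frames

video = build_target_video(["f1", "f2"], ["shot"], ["f3", "f4"], still_hold=2)
print(video)  # ['f1', 'f2', 'shot', 'shot', 'f3', 'f4']
```

A real terminal would hand the ordered frames (or clip references) to an encoder rather than keep them in a Python list.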
In this embodiment, the recording duration for the target subject is obtained, this recording duration being greater than the duration of recording that the terminal achieves using the live photo function; the first multimedia data of the target subject recorded within the first sub-duration of the recording duration, the second multimedia data of the target subject recorded within the second sub-duration, and the at least one target image of the target subject taken after the first sub-duration are obtained; and the obtained first and second multimedia data and the at least one target image are used to produce/generate a video, obtaining the target video. Since the first duration in this solution is greater than 3 s, compared with the current live photo function, which can only record 3 s of video, the target subject can be recorded for a longer time and the recorded content is richer, which greatly improves the user experience and the usability of the terminal product. Meanwhile, the first duration (recording duration) can take a flexible value according to the type of the target subject; compared with the recording duration in the current live photo function, which can only be 3 s, the flexibility is greater and more shooting occasions are applicable. When the target video is played, the user can see the (dynamic) change process of the target subject within the first duration; that is, what the user watches is a dynamically playing picture of the target subject, which can effectively improve the visual effect.
Embodiment two
A second embodiment of the method for obtaining a video provided by the present invention is applied in a terminal having video recording and photographing functions. The photographing function may be implemented by the electrical structure shown in Fig. 3. The terminal may be a tablet computer, a conventional mobile phone, a smart phone, an e-reader, or the like, or a wearable device such as smart glasses, a smart watch, or smart shoes. The preferred terminal in the present invention is a smart phone. Through the method for obtaining a video shown in Fig. 5, the terminal can at least solve the problems in the current live photo function, such as insufficiently rich recorded content and poor flexibility, caused by the short and relatively fixed recording duration.
Fig. 5 is a schematic flowchart of the second embodiment of the method for obtaining a video of the present invention. As shown in Fig. 5, the method includes:
Step 501: Obtain a first duration, the first duration being the recording duration for a target subject. The first duration is at least greater than a second duration, the second duration being the sum of the predetermined recording duration for which the terminal records video of a subject before taking a picture and the predetermined recording duration for which the terminal records video of the subject after taking the picture.
Here, the executing entity of step 501 is the terminal. The second duration is the total recording duration that the terminal achieves for the target subject using the dynamic picture (live photo) function. The terminal determines the first duration (the recording duration) according to the type of the target subject. The terminal in this solution is preferably used to shoot different types of scenes such as city scenery, natural scenery, astronomical phenomena, urban life, building construction, and biological development; of course, it can also be used to shoot daily-life scenes and everyday objects, without specific limitation. Taking a flowering period in natural scenery as the target subject as an example: considering that the time from bud to full bloom is usually several days, when the target subject is a flower, its first duration (recording duration) may be set to 3 days or 5 days, etc. Taking a star in an astronomical phenomenon as an example: considering that a star usually appears and disappears over several hours, when the target subject is a star, its first duration may be set to 6 hours or 5 hours, etc. Taking cell fission in biological development as an example: considering that a cell usually takes several months from formation to fission, when the target subject is a cell, its first duration may be set to one month or three months, etc.
In a specific implementation, the user makes a selection from a list, provided by the terminal, of target subject types and first durations (in the list, a certain recording duration is set in advance for each type of target subject), choosing a target subject type and a target subject displayed under that type; that is, the user performs a selection operation, for example selecting a target subject type and a target subject. After reading this selection operation, the terminal knows the type of the current target subject and, according to the contents recorded in the list, reads the recording duration configured in advance for the selected target subject. The second duration is the total duration of video recording achieved using the live photo function, namely 3 s (1.5 s + 1.5 s). Here, in consideration of the richness of the recorded content, the recording duration in this solution is preferably greater than 3 s.
Step 502: Obtain first multimedia data of the target subject recorded within a first sub-duration.
Here, the executing entity of step 502 is the terminal. After reading the recording duration configured for the target subject, the terminal starts timing the first duration, and within a part of the first duration (the first sub-duration) records and saves the change process of the target subject during that part, forming the first multimedia data. The terminal reads the first multimedia data recorded and saved within the first sub-duration.
Step 503: Obtain at least one target image of the target subject taken after the first sub-duration.
Here, the executing entity of step 503 is the terminal. After recording the change process of the target subject within the first sub-duration, the terminal starts its photographing function and takes one or more pictures of the target subject at that moment, obtaining at least one target image. The terminal reads the captured target image(s).
Step 504: Obtain second multimedia data of the target subject recorded within a second sub-duration, where the first sub-duration and the second sub-duration are obtained by evenly or unevenly splitting the first duration.
Here, the executing entity of step 504 is the terminal. While timing the first duration, the terminal records and saves the change process of the target subject within another part of the first duration (the second sub-duration), forming the second multimedia data. The terminal reads the second multimedia data recorded and saved within the second sub-duration.
In steps 502 to 504, the first sub-duration and the second sub-duration may be obtained by the terminal evenly splitting the first duration; for example, if the first duration is 3 days, the first sub-duration and the second sub-duration are each 1.5 days. The first sub-duration and the second sub-duration may also be obtained by the terminal unevenly splitting the first duration; for example, if the first duration is 3 days, the first sub-duration may be 1 day and the second sub-duration 2 days. Compared with the current live photo function, in which the recording duration is a short 3 s and the recording times before and after the shot must split that 3 s evenly, the first duration in this embodiment is greater than 3 s, and the first and second sub-durations can be divided evenly or unevenly according to actual usage needs. In this way, the target subject can be recorded for a longer time, the values of the recording duration and of the first and second sub-durations are more flexible, and the recorded content is richer, which can effectively improve the user experience.
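The even/uneven split above can be sketched with a single split parameter. The `ratio` parameter is an illustrative assumption, not from the disclosure; a ratio of 0.5 reproduces the even split, any other value in (0, 1) an uneven one.

```python
from datetime import timedelta

def split_first_duration(first_duration: timedelta, ratio: float = 0.5):
    """Split the first duration into the first and second sub-durations.
    ratio = 0.5 gives the even split; other values give uneven splits."""
    if not 0 < ratio < 1:
        raise ValueError("ratio must be strictly between 0 and 1")
    first_sub = first_duration * ratio
    second_sub = first_duration - first_sub
    return first_sub, second_sub

# Even split of 3 days: 1.5 days + 1.5 days.
print(split_first_duration(timedelta(days=3)))
# Uneven split of 3 days: roughly 1 day + 2 days.
print(split_first_duration(timedelta(days=3), ratio=1/3))
```

By construction the two sub-durations always sum back to the first duration, matching the requirement that they are different segments of the same recording duration.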
Those skilled in the art should understand that, during segmented recording, the terminal records sequentially in the chronological order indicated by the recording duration, but there is no strict order among the processes of reading the recorded multimedia data and the captured pictures, i.e., among the aforementioned steps 502, 503, and 504. It should be noted that the first sub-duration and the second sub-duration are different recording time segments within the recording duration.
Here, it due to recording duration a length of when terminal is with first and executes sequentially in time to target reference object
It records, recording time longer such as a few minutes, a few houres, several days, even several years some months, ceaselessly recording necessarily causes in recording
Hold that shared capacity is larger, to reduce recorded content to the occupancy of terminal system capacity, in this programme in first and second period of the day from 11 p.m. to 1 a.m is long simultaneously
It is non real-time that target reference object is recorded, but target reference object is once shot at regular intervals.Specifically
, target reference object is once shot every the first scheduled duration in first period of the day from 11 p.m. to 1 a.m is long, in first period of the day from 11 p.m. to 1 a.m is long altogether
Within the first sub-duration, (first sub-duration / first scheduled duration) images are captured; these are taken as the first images. Within the second sub-duration, the target reference object is shot once every second scheduled duration, so that (second sub-duration / second scheduled duration) images are captured in total; these are taken as the second images. Alternatively, to reduce the occupancy of terminal system capacity by the recorded content, the scheme records the target reference object in real time throughout the first and second sub-durations and then performs frame extraction separately on the content recorded in each sub-duration, obtaining the first multimedia data and the second multimedia data. Because the first and second multimedia data are frame-extracted recordings, their occupancy of system capacity is substantially lower than that of the raw recorded content. In this way, system-capacity occupancy is reduced, while the images shot at different recording times still illustrate the variation of the target reference object over the recording duration. The first scheduled duration and the second scheduled duration are set flexibly according to actual use and are not specifically limited here. As the foregoing shows, this scheme can play back, as a video within a short playing time, the change process of a reference object over minutes, hours, days, months or even years; this technique is referred to as "time-lapse photography". It compresses the long-term change process of the reference object into a short playback, enabling the user to watch scenes that the naked eye cannot ordinarily perceive and improving the user experience.
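The periodic-capture arithmetic described above can be sketched as follows. The specific durations (30-minute and 1-hour scheduled durations over 1-day and 2-day sub-durations) are illustrative values taken from the flower example later in this description; the patent leaves the scheduled durations to the implementer.

```python
def num_frames(sub_duration_s: int, interval_s: int) -> int:
    """Images captured when shooting once per scheduled interval
    within a sub-duration."""
    return sub_duration_s // interval_s

# First sub-duration: 1 day, one shot every 30 minutes.
first = num_frames(24 * 3600, 30 * 60)
# Second sub-duration: 2 days, one shot every hour.
second = num_frames(2 * 24 * 3600, 3600)
print(first, second)  # 48 48
```

The same quotient governs frame extraction from a real-time recording: choosing the extraction rate fixes how many frames survive per sub-duration.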
Step 505: obtain a target video according to the first and second multimedia data and the at least one target image.
Here, the executing subject of step 505 is the terminal. After reading the first and second multimedia data and the at least one image shot for the target reference object, the terminal generates/produces a video in the order: first multimedia data, at least one target image, second multimedia data, obtaining the target video. The target video shows the change process of the target reference object within the first duration. Picture-to-video conversion software may be used to generate the video from the pictures; reference may be made to the existing related art, and details are not repeated here.
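The ordering rule of step 505 can be sketched minimally as below; the frame labels are placeholders, and encoding the ordered sequence into an actual video file is left to any picture-to-video tool, as the description notes.

```python
def order_target_video(first_frames, target_images, second_frames):
    """Concatenate frames in the order required by step 505:
    first multimedia data, then target image(s), then second
    multimedia data."""
    return list(first_frames) + list(target_images) + list(second_frames)

frames = order_target_video(["f1", "f2"], ["t"], ["s1", "s2"])
print(frames)  # ['f1', 'f2', 't', 's1', 's2']
```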
Step 506: when a first operation is detected and the operational attribute of the first operation meets a first predetermined condition, the first operation being at least an operation directed at the target image, play the target video.
Here, the executing subject of step 506 is the terminal. When the terminal detects a user operation on the target image and determines that the operation is one that displays the variation of the target reference object within the first duration (i.e., its operational attribute meets the first predetermined condition), the terminal plays the target video, i.e., the video showing the (dynamic) change process of the target reference object within the first duration.
Judging whether the operational attribute of the first operation meets the first predetermined condition includes:
the operational attribute is preferably an operating force;
judging whether the operating force of the first operation reaches a predetermined force threshold, and generating a first judgment result; when the first judgment result indicates that the operating force of the first operation reaches the predetermined force threshold, it is judged that the operational attribute of the first operation meets the first predetermined condition, and the target video is played.
After detecting the user operation on the target image, the terminal reads the operating force of the operation through a sensor such as a pressure sensor and judges whether the operating force reaches a predetermined force threshold, e.g., 1.5 N (newtons). If the threshold is reached, the operation is determined to be one that displays the variation of the target reference object within the first duration, and the terminal plays the target video, i.e., the video showing the (dynamic) change process of the target reference object within the first duration. The force threshold can be set flexibly according to the actual situation. To achieve a 3D Touch effect, the threshold is usually set relatively high, so that only a firm press triggers playback of the target video.
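The first predetermined condition above reduces to a simple threshold check; the sketch below assumes the 1.5 N example value and that the pressure-sensor reading has already been obtained (the sensor API itself is platform-specific and not specified by the patent).

```python
FORCE_THRESHOLD_N = 1.5  # example threshold from the description

def should_play(operating_force_n: float,
                threshold_n: float = FORCE_THRESHOLD_N) -> bool:
    """First predetermined condition of step 506: the operating force
    of the touch operation reaches the predetermined force threshold."""
    return operating_force_n >= threshold_n

print(should_play(2.0), should_play(1.0))  # True False
```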
In a preferred embodiment of the present invention, before playing the target video, the method further includes: obtaining the playing duration with which the terminal plays a synthetic video, i.e., the video duration obtainable with the live photo function; and determining that playing duration as the target playing duration. Correspondingly, playing the target video includes: playing the target video within the target playing duration. Here, the synthetic video is the synthesis of a first video recorded over the scheduled recording duration, e.g., 1.5 s (the recording obtained before the picture is shot), and a second video recorded over the scheduled recording duration (the recording obtained after the picture is shot), i.e., the roughly 3 s video obtainable with the live photo function. In general, the playing duration with which a terminal plays a video using the live photo function is approximately equal to the recording duration with which it records a video using that function, about 3 s. In this scheme, the change process of the reference object over minutes, hours, days, months or even years is compressed into 3 s of playback using the aforementioned "time-lapse photography" technique, enabling the user to watch scenes that the naked eye cannot ordinarily perceive and improving the viewing experience.
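The compression into the ~3 s target playing duration fixes the playback frame rate, as the sketch below shows; the 144-frame total is an assumed example (it matches 48 + 48 frames from the two sub-durations plus the target images would vary slightly).

```python
def playback_fps(total_frames: int, playing_duration_s: float = 3.0) -> float:
    """Frame rate needed to fit all captured frames into the
    live-photo-style target playing duration (~3 s)."""
    return total_frames / playing_duration_s

# e.g. 144 frames gathered over 3 days play back at 48 fps in a 3 s clip
print(playback_fps(144))  # 48.0
```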
In a preferred embodiment of the present invention, the method further includes: obtaining audio data configured for the target video, and outputting the audio data when the target video is played, preferably when it is played within the target playing duration. Here, after the target video is obtained, it is provided with suitable music, and the music is played along with the target video. This improves not only the user's visual experience but also the auditory experience, greatly enhancing the usability and user-friendliness of the terminal.
In this embodiment, the recording duration for the target reference object is obtained, this duration being greater than the second duration; the first multimedia data recorded for the target reference object within the first sub-duration of the recording duration, the second multimedia data recorded for the target reference object within the second sub-duration, and the at least one target image shot for the target reference object after the first sub-duration are obtained; a video is produced/generated from the obtained first and second multimedia data and the at least one target image to obtain the target video; and when an operation on the target image is detected and its operating force is determined to reach the predetermined force threshold, the target video is played. This scheme has the following advantages:
1) Through playback of the target video the user can see the (dynamic) change process of the target reference object within the first duration; that is, the user watches a dynamic playback picture of the target reference object, which effectively improves the visual effect.
2) The first duration (the recording duration) can take a flexible value according to the type of the target reference object. Compared with the current live photo function, in which the recording duration can only be 3 s, this is more flexible and applicable to more shooting occasions.
3) Since the first duration in this scheme is greater than 3 s, compared with the current live photo function, which can only record a 3 s video, the target reference object can be recorded for a longer time and the recorded content is richer, greatly improving the user experience and the usability of the terminal product.
4) Since this scheme incorporates the "time-lapse photography" technique, the change process of the reference object over minutes, hours, days, months or even years can be compressed into a short playback, enabling the user to watch scenes that the naked eye cannot ordinarily perceive and improving the viewing experience.
5) In this scheme, the first and second multimedia data are obtained either by shooting pictures every respective scheduled duration within the first and second sub-durations, or by frame extraction from real-time recordings. Through the setting of the respective scheduled durations and the choice of the frame-extraction rate, the capacity of the obtained content can at least be kept comparable to that obtainable with the current live photo function; compared with real-time recording throughout the first and second sub-durations, this greatly reduces the occupancy of terminal system capacity.
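A rough sketch of advantage 5): storage for periodic stills versus continuous real-time video over the same sub-duration. The per-frame size and video bitrate below are assumptions for illustration only; the patent claims only that the stills/frame-extraction approach occupies far less capacity.

```python
def stills_bytes(n_frames: int, bytes_per_frame: int) -> int:
    """Storage for periodically captured stills."""
    return n_frames * bytes_per_frame

def video_bytes(duration_s: int, bytes_per_second: int) -> int:
    """Storage for a continuous real-time recording."""
    return duration_s * bytes_per_second

day = 24 * 3600
stills = stills_bytes(48, 2_000_000)   # 48 stills of ~2 MB each
video = video_bytes(day, 500_000)      # 1 day of ~0.5 MB/s video
print(stills < video)  # True
```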
Below, with reference to Fig. 6 (a)–(c), this scheme is further described, to deepen understanding, taking as an example shooting a flower from bearing no bud, to budding, to full bloom.
The user selects the target reference object, a flower, from the list of target reference object types and first durations provided by the terminal; that is, the user performs a selection operation on the target reference object type and on the target reference object "flower" displayed under that type. After reading this selection operation, the terminal learns that the type of the current target reference object is natural scenery and, according to the content recorded in the list, reads the recording duration (the first duration) pre-configured for the selected target reference object "flower": 3 days. Having read the 3-day recording duration configured for the target reference object, the terminal starts its camera function and cuts the recording duration averagely or non-averagely. Taking a non-average cut as an example here, the 3-day first duration (recording duration) is cut into two parts of 1 day and 2 days; that is, the first sub-duration is 1 day and the second sub-duration is 2 days. On day 1, the terminal shoots a picture of the target reference object "flower" every half hour (the first scheduled duration), obtaining several first images; assume Fig. 6 (a) is one of them. At the end of day 1, one picture of the target reference object "flower" is shot, obtaining one target image; assume the target image is as shown in Fig. 6 (b). On days 2–3, the terminal shoots a picture of the target reference object "flower" every hour, obtaining several second images; assume Fig. 6 (c) is one of them. When the recording duration ends, the terminal automatically saves the first images shot on day 1, the target image, and the second images shot on days 2–3. The terminal reads the saved first images, second images and target image, and uses picture-to-video conversion software to generate/produce a video in the order: first images, target image, second images, obtaining the target video. In view of space limitations, assume the target video contains the three images shown in Fig. 6 (a), (b) and (c), where Fig. 6 (a) shows the flower before budding, Fig. 6 (b) shows it budding, and Fig. 6 (c) shows it in full bloom.
When the terminal detects a touch operation (the first operation) on the target image, such as a click, a double-click or a predetermined number of clicks, it detects the operating force of the touch operation through a built-in pressure sensor and judges whether the operating force reaches the predetermined force threshold. When the threshold is judged to be reached, the target video generated from Fig. 6 (a)–(c) is played. Suitable music may also be provided when the target video is played, further improving the user's experience both visually and aurally. The playing duration of the target video may be specified in advance, e.g., 3 s, so that the target video generated from Fig. 6 (a)–(c) is played within 3 s. Through the presentation of the target video, the user can clearly see the process of the flower from bearing no bud, to budding, to full bloom, letting the user see the dynamic changes of scenery/objects in nature and enhancing the visual experience.
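The non-average cut used in this example can be sketched as a simple split of the recording duration; the 1-day/2-day split is the example's choice, and an average cut would simply halve the first duration.

```python
def cut(first_duration_s: int, first_sub_s: int):
    """Split the recording duration into (first_sub, second_sub).
    An average cut uses first_sub_s = first_duration_s // 2;
    any other value gives a non-average cut."""
    return first_sub_s, first_duration_s - first_sub_s

day = 24 * 3600
print(cut(3 * day, day))  # (86400, 172800): 1 day + 2 days
```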
In the foregoing scheme, the first duration takes a flexible value according to the type of the target reference object; compared with the current live photo function, in which the recording duration can only be 3 s, it is more flexible and applicable to more shooting occasions. The first duration in this scheme, 3 days, is much longer than 3 s; compared with the current live photo function, which can only record a 3 s video, the target reference object can be recorded for a longer time, the recorded content is richer, the user experience is greatly improved, and the usability of the terminal product is enhanced. Through playback of the target video the user can see the (dynamic) change process of the target reference object within the first duration; that is, the user watches a dynamic playback picture of the target reference object, which effectively improves the visual effect.
Embodiment three
This is a terminal embodiment provided by the present invention; the terminal has video-recording and camera functions. The terminal may be a tablet computer, a conventional mobile phone, a smart phone, an e-reader, etc., or a wearable device such as smart glasses, a smart watch or smart shoes. The terminal of the present invention is optionally a smart phone.
Fig. 7 is a schematic diagram of the composition of a terminal embodiment of the present invention. As shown in Fig. 7, the terminal includes: a first acquiring unit 601, a second acquiring unit 602, a third acquiring unit 603, a fourth acquiring unit 604 and a first obtaining unit 605; wherein:
the first acquiring unit 601 is configured to obtain a first duration, the first duration being the recording duration for the target reference object; the first duration is at least greater than a second duration, the second duration being the sum of the scheduled recording duration for which the terminal records video of the reference object before shooting a picture and the scheduled recording duration for which it records video of the reference object after shooting the picture, i.e., the total duration for which the terminal records the target reference object using the dynamic-picture (live photo) function;
the second acquiring unit 602 is configured to obtain first multimedia data recorded for the target reference object within the first sub-duration;
the third acquiring unit 603 is configured to obtain at least one target image shot for the target reference object after the first sub-duration;
the fourth acquiring unit 604 is configured to obtain second multimedia data recorded for the target reference object within the second sub-duration, the first and second sub-durations being obtained by averagely or non-averagely cutting the first duration;
the first obtaining unit 605 is configured to obtain a target video according to the first and second multimedia data and the at least one target image.
Wherein, the first and second sub-durations are obtained by averagely or non-averagely cutting the first duration;
the first multimedia data is at least one first image of the target reference object captured every first scheduled duration within the first sub-duration;
the second multimedia data is at least one second image of the target reference object captured every second scheduled duration within the second sub-duration;
correspondingly, the first obtaining unit 605 is further configured to generate/produce a video in the order of the at least one first image, the target image and the at least one second image, obtaining the target video.
As shown in Fig. 8, the terminal further includes: a first detecting unit 606 and a first playing unit 607; wherein:
the first detecting unit 606 is configured to trigger the first playing unit 607 when a first operation is detected and the operational attribute of the first operation meets a first predetermined condition, the first operation being at least an operation directed at the target image;
the first playing unit 607 is configured to play the target video.
Wherein, the operational attribute is an operating force;
the first detecting unit 606 is configured to, after detecting the first operation, judge whether the operating force of the first operation reaches a predetermined force threshold and generate a first judgment result;
when the first judgment result indicates that the operating force of the first operation reaches the predetermined force threshold, it is judged that the operational attribute of the first operation meets the first predetermined condition.
Wherein, the first playing unit 607 is further configured to:
obtain the playing duration with which the terminal plays a synthetic video, wherein the synthetic video is the synthesis of a first video recorded over the scheduled recording duration and a second video recorded over the scheduled recording duration, i.e., the playing duration of the synthetic video is the playing duration of video playback realized with the live photo function;
determine the playing duration as the target playing duration;
play the target video within the target playing duration.
Wherein, the first playing unit 607 is further configured to:
obtain audio data configured for the target video;
output the audio data when the target video is played within the target playing duration.
In this embodiment, the first acquiring unit 601 obtains the recording duration for the target reference object, the recording duration being greater than the second duration; the second acquiring unit 602 obtains the first multimedia data recorded for the target reference object within the first sub-duration of the recording duration; the fourth acquiring unit 604 obtains the second multimedia data recorded for the target reference object within the second sub-duration; the third acquiring unit 603 obtains the at least one target image shot for the target reference object after the first sub-duration; and the first obtaining unit 605 produces/generates a video from the first and second multimedia data and the at least one target image, obtaining the target video. Since the first duration in this scheme is greater than 3 s, compared with the current live photo function, which can only record a 3 s video, the target reference object can be recorded for a longer time, the recorded content is richer, the user experience is greatly improved, and the usability of the terminal product is enhanced. Meanwhile, the first duration (the recording duration) can take a flexible value according to the type of the target reference object; compared with the current live photo function, in which the recording duration can only be 3 s, it is more flexible and applicable to more shooting occasions.
In addition, when the first detecting unit 606 detects an operation on the target image and determines that the operation is one that displays the variation of the target reference object within the first duration, it triggers the first playing unit 607, which plays the target video. Through playback of the target video the user can see the (dynamic) change process of the target reference object within the first duration; that is, the user watches a dynamic playback picture of the target reference object, which effectively improves the visual effect.
It should be noted that, since the principle by which the terminal provided in the embodiment of the present invention solves the problem is similar to the aforementioned method for obtaining a video, the implementation process and principle of the terminal can be found in the description of the implementation process and principle of the aforementioned method, and repeated parts are not described again.
Those skilled in the art should understand that the functions realized by the processing units of the terminals shown in Figs. 7 and 8 can be understood with reference to the related description of the aforementioned method for obtaining a video. The functions of the processing units of the terminals shown in Figs. 7 and 8 can be realized by a program running on a processor, or by a specific logic circuit.
In practical applications, the first acquiring unit 601, second acquiring unit 602, third acquiring unit 603, fourth acquiring unit 604, first obtaining unit 605, first detecting unit 606 and first playing unit 607 can be realized by a central processing unit (CPU, Central Processing Unit), a digital signal processor (DSP, Digital Signal Processor), a microprocessor (MPU, Micro Processor Unit), a field programmable gate array (FPGA, Field Programmable Gate Array), or the like.
It should be noted that, in this document, the terms "comprise" and "include" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not explicitly listed, or also includes elements inherent to the process, method, article or device. Without further limitation, an element defined by the sentence "including a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes the element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be realized by means of software plus a necessary general hardware platform, or of course by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product; the software product is stored in a storage medium (such as ROM/RAM, magnetic disk or optical disk) and includes several instructions to cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of the present invention. Any equivalent structural or flow transformation made using the contents of the specification and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the protection scope of the present invention.
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings. It should be understood that the preferred embodiments described below are only used to illustrate and explain the present invention and are not intended to limit the present invention.
Claims (10)
1. A terminal, characterized in that the terminal comprises:
a first acquiring unit, configured to obtain a first duration, the first duration being the recording duration for a target reference object; the first duration is at least greater than a second duration, the second duration being the sum of the scheduled recording duration for which the terminal records video of the reference object before shooting a picture and the scheduled recording duration for which it records video of the reference object after shooting the picture;
a second acquiring unit, configured to obtain first multimedia data for the target reference object recorded within a first sub-duration, wherein the target reference object is shot once at intervals within the first sub-duration;
a third acquiring unit, configured to obtain at least one target image shot for the target reference object after the first sub-duration;
a fourth acquiring unit, configured to obtain second multimedia data for the target reference object recorded within a second sub-duration, the first and second sub-durations being obtained by averagely or non-averagely cutting the first duration, the first duration being determined at least according to the type of the target reference object; wherein the target reference object is shot once at intervals within the second sub-duration;
a first obtaining unit, configured to obtain a target video according to the first and second multimedia data and the at least one target image.
2. The terminal according to claim 1, characterized in that:
the first multimedia data is at least one first image of the target reference object captured every first scheduled duration within the first sub-duration;
the second multimedia data is at least one second image of the target reference object captured every second scheduled duration within the second sub-duration;
correspondingly, the first obtaining unit is further configured to generate/produce a video in the order of the at least one first image, the target image and the at least one second image, obtaining the target video.
3. The terminal according to claim 1 or 2, characterized in that the terminal further comprises:
a first detecting unit, configured to trigger a first playing unit when a first operation is detected and the operational attribute of the first operation meets a first predetermined condition, the first operation being at least an operation directed at the target image;
the first playing unit, configured to play the target video.
4. The terminal according to claim 3, characterized in that:
the operational attribute is an operating force;
the first detecting unit is configured to, after detecting the first operation, judge whether the operating force of the first operation reaches a predetermined force threshold and generate a first judgment result;
when the first judgment result indicates that the operating force of the first operation reaches the predetermined force threshold, it is judged that the operational attribute of the first operation meets the first predetermined condition.
5. The terminal according to claim 3, characterized in that the first playing unit is further configured to:
obtain the playing duration with which the terminal plays a synthetic video, wherein the synthetic video is the synthesis of a first video recorded over the scheduled recording duration before the picture shooting and a second video recorded over the scheduled recording duration after the picture shooting;
determine the playing duration as the target playing duration;
play the target video within the target playing duration.
6. A method for obtaining a video, characterized in that the method comprises:
obtaining a first duration, the first duration being the recording duration for a target reference object; the first duration is at least greater than a second duration, the second duration being the sum of the scheduled recording duration for which the terminal records video of the reference object before shooting a picture and the scheduled recording duration for which it records video of the reference object after shooting the picture;
obtaining first multimedia data for the target reference object recorded within a first sub-duration, wherein the target reference object is shot once at intervals within the first sub-duration;
obtaining at least one target image shot for the target reference object after the first sub-duration;
obtaining second multimedia data for the target reference object recorded within a second sub-duration, the first and second sub-durations being obtained by averagely or non-averagely cutting the first duration, the first duration being determined at least according to the type of the target reference object; wherein the target reference object is shot once at intervals within the second sub-duration;
obtaining a target video according to the first and second multimedia data and the at least one target image.
7. The method according to claim 6, characterized in that:
the first multimedia data is at least one first image of the target reference object captured every first scheduled duration within the first sub-duration;
the second multimedia data is at least one second image of the target reference object captured every second scheduled duration within the second sub-duration;
correspondingly, obtaining the target video according to the first and second multimedia data and the at least one target image comprises:
generating/producing a video in the order of the at least one first image, the target image and the at least one second image, obtaining the target video.
8. The method according to claim 6 or 7, characterized in that after obtaining the target video, the method further comprises:
when a first operation is detected and the operational attribute of the first operation meets a first predetermined condition, the first operation being at least an operation directed at the target image, playing the target video.
9. The method according to claim 8, characterized in that the method further comprises:
the operational attribute is an operating force;
after detecting the first operation, judging whether the operating force of the first operation reaches a predetermined force threshold, and generating a first judgment result;
when the first judgment result indicates that the operating force of the first operation reaches the predetermined force threshold, judging that the operational attribute of the first operation meets the first predetermined condition.
10. The method according to claim 8, characterized in that before playing the target video, the method further comprises:
obtaining the playing duration with which the terminal plays a synthetic video, the synthetic video being the synthesis of a first video recorded over the scheduled recording duration before the picture shooting and a second video recorded over the scheduled recording duration after the picture shooting;
determining the playing duration as the target playing duration;
correspondingly, playing the target video comprises:
playing the target video within the target playing duration.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610865332.5A CN106303290B (en) | 2016-09-29 | 2016-09-29 | A kind of terminal and the method for obtaining video |
PCT/CN2017/100870 WO2018059206A1 (en) | 2016-09-29 | 2017-09-07 | Terminal, method of acquiring video, and data storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610865332.5A CN106303290B (en) | 2016-09-29 | 2016-09-29 | A kind of terminal and the method for obtaining video |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106303290A CN106303290A (en) | 2017-01-04 |
CN106303290B true CN106303290B (en) | 2019-10-08 |
Family
ID=57715660
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610865332.5A Active CN106303290B (en) | 2016-09-29 | 2016-09-29 | A kind of terminal and the method for obtaining video |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106303290B (en) |
WO (1) | WO2018059206A1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106303290B (en) * | 2016-09-29 | 2019-10-08 | 努比亚技术有限公司 | A kind of terminal and the method for obtaining video |
CN107370953B (en) * | 2017-08-17 | 2019-03-19 | 北京达佳互联信息技术有限公司 | Filming control method, device and intelligent glasses |
CN109068052B (en) * | 2018-07-24 | 2020-11-10 | 努比亚技术有限公司 | Video shooting method, mobile terminal and computer readable storage medium |
CN110290425B (en) * | 2019-07-29 | 2023-04-07 | 腾讯科技(深圳)有限公司 | Video processing method, device and storage medium |
CN112825081A (en) * | 2019-11-20 | 2021-05-21 | 云丁网络技术(北京)有限公司 | Video information processing method and device, electronic equipment, processor and readable medium |
CN111126807B (en) * | 2019-12-12 | 2023-10-10 | 浙江大华技术股份有限公司 | Stroke segmentation method and device, storage medium and electronic device |
CN112822511A (en) * | 2020-12-31 | 2021-05-18 | 深圳康佳电子科技有限公司 | Video processing method, system, intelligent terminal and computer readable storage medium |
CN115052198B (en) * | 2022-05-27 | 2023-07-04 | 广东职业技术学院 | Image synthesis method, device and system for intelligent farm |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104244024A (en) * | 2014-09-26 | 2014-12-24 | 北京金山安全软件有限公司 | Video cover generation method and device and terminal |
CN104967904A (en) * | 2014-04-10 | 2015-10-07 | 腾讯科技(深圳)有限公司 | Method for recording and playing back terminal video and apparatus thereof |
CN105245777A (en) * | 2015-09-28 | 2016-01-13 | 努比亚技术有限公司 | Method and device for generating video image |
CN105979138A (en) * | 2016-05-30 | 2016-09-28 | 努比亚技术有限公司 | Video shooting apparatus and method, and mobile terminal |
CN106303669A (en) * | 2016-08-17 | 2017-01-04 | 深圳鑫联迅科技有限公司 | A kind of video clipping method and device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8681234B2 (en) * | 2010-09-28 | 2014-03-25 | Sony Computer Entertainment America Llc | System and method for capturing and displaying still photo and video content |
JP5447619B2 (en) * | 2011-11-08 | 2014-03-19 | 株式会社ニコン | Imaging device |
CN104125388B (en) * | 2013-04-25 | 2019-04-05 | 广州华多网络科技有限公司 | A kind of method and apparatus for shooting and storing photograph |
CN105828010B (en) * | 2016-03-28 | 2018-11-27 | 广东欧珀移动通信有限公司 | It is a kind of based on the video recording method and device dynamically taken pictures |
CN105704387A (en) * | 2016-04-05 | 2016-06-22 | 广东欧珀移动通信有限公司 | Shooting method and device of intelligent terminal and intelligent terminal |
CN106303290B (en) * | 2016-09-29 | 2019-10-08 | 努比亚技术有限公司 | A kind of terminal and the method for obtaining video |
2016
- 2016-09-29 CN CN201610865332.5A patent/CN106303290B/en active Active
2017
- 2017-09-07 WO PCT/CN2017/100870 patent/WO2018059206A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104967904A (en) * | 2014-04-10 | 2015-10-07 | 腾讯科技(深圳)有限公司 | Method for recording and playing back terminal video and apparatus thereof |
CN104244024A (en) * | 2014-09-26 | 2014-12-24 | 北京金山安全软件有限公司 | Video cover generation method and device and terminal |
CN105245777A (en) * | 2015-09-28 | 2016-01-13 | 努比亚技术有限公司 | Method and device for generating video image |
CN105979138A (en) * | 2016-05-30 | 2016-09-28 | 努比亚技术有限公司 | Video shooting apparatus and method, and mobile terminal |
CN106303669A (en) * | 2016-08-17 | 2017-01-04 | 深圳鑫联迅科技有限公司 | A kind of video clipping method and device |
Also Published As
Publication number | Publication date |
---|---|
WO2018059206A1 (en) | 2018-04-05 |
CN106303290A (en) | 2017-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106303290B (en) | A kind of terminal and the method for obtaining video | |
CN105430295B (en) | Image processing apparatus and method | |
CN106485689B (en) | A kind of image processing method and device | |
CN104811554B (en) | The switching method and terminal of camera mode | |
CN113194242B (en) | Shooting method in long-focus scene and mobile terminal | |
CN105187724B (en) | A kind of mobile terminal and method handling image | |
CN104660903B (en) | Image pickup method and filming apparatus | |
CN104902185B (en) | Image pickup method and device | |
CN113727017B (en) | Shooting method, graphical interface and related device | |
CN109981885B (en) | Method for presenting video by electronic equipment in incoming call and electronic equipment | |
CN110471606B (en) | Input method and electronic equipment | |
CN105227865B (en) | A kind of image processing method and terminal | |
CN112580400B (en) | Image optimization method and electronic equipment | |
US11470246B2 (en) | Intelligent photographing method and system, and related apparatus | |
CN105488756B (en) | Picture synthetic method and device | |
CN106851128A (en) | A kind of video data handling procedure and device based on dual camera | |
KR20160127606A (en) | Mobile terminal and the control method thereof | |
CN106303273A (en) | A kind of mobile terminal and camera control method thereof | |
CN113170037A (en) | Method for shooting long exposure image and electronic equipment | |
CN106803879A (en) | Cooperate with filming apparatus and the method for finding a view | |
CN105407275B (en) | Photo synthesizer and method | |
CN106021292B (en) | A kind of device and method for searching picture | |
CN114064160A (en) | Application icon layout method and related device | |
CN114466101B (en) | Display method and electronic equipment | |
CN116861019A (en) | Picture display method and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||