CN105357560A - Caching processing method and device


Info

Publication number
CN105357560A
Authority
CN
China
Prior art keywords
mobile terminal
time point
time
data
user
Prior art date
Legal status
Pending
Application number
CN201510628540.9A
Other languages
Chinese (zh)
Inventor
王海滨
王猛
苗雷
里强
Current Assignee
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nubia Technology Co Ltd
Priority to CN201510628540.9A
Publication of CN105357560A


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41: Structure of client; Structure of client peripherals
    • H04N 21/414: Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N 21/41407: Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance, embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/433: Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N 21/4331: Caching operations, e.g. of an advertisement for later insertion during playback
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/433: Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N 21/4334: Recording operations
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47: End-user applications
    • H04N 21/472: End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/47217: End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content, for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)

Abstract

The embodiment of the invention provides a caching processing method applied to a mobile terminal. The method comprises the following steps: acquiring data information collected by the mobile terminal; processing the data information according to a first preset rule to generate cached information; and receiving a manipulation instruction from a user and controlling the mobile terminal to read the cached information according to a second preset rule. The invention also provides a caching processing device. With the caching processing method provided by the invention, the data information acquired in real time is cached, and, according to the user's manipulation instruction, the required scene information can be extracted from the cached data information and processed.

Description

Caching processing method and device
Technical field
The present invention relates to data processing, and in particular to a caching processing method and device.
Background art
As the camera functions of mobile terminals have grown more powerful, taking photographs with a mobile terminal has become an increasingly common way of shooting. At present, a mobile terminal typically takes a photograph as follows: after the camera application of the mobile terminal is opened, the camera acquires image data in real time, and once the mobile terminal detects the user's photographing instruction, it directly generates a photo from the currently acquired image data. With this approach, a user often misses a specific scene because of slow reaction, and once the scene is missed, its data can no longer be obtained.
Summary of the invention
The main purpose of the present invention is to provide a caching processing method and device that enable a mobile terminal to extract the required scene information from cached data information and process it.
To achieve the above object, the invention provides a caching processing method applied to a mobile terminal, comprising: acquiring data information collected by the mobile terminal; processing the data information according to a first preset rule to generate cached information; and receiving a manipulation instruction from the user and controlling the mobile terminal to read the cached information according to a second preset rule.
Optionally, the first preset rule is: acquiring a first time point T1 and a second time point T2 at which the data information is generated; intercepting the data information between the first time point T1 and the second time point T2; and storing the data information between the first time point T1 and the second time point T2 in the memory of the mobile terminal to generate the cached information.
Optionally, the first time point T1 and the second time point T2 are separated by an interval N, and the second time point T2 is the real-time time point, so that the cached information stored in the cache is continuously updated.
Optionally, the second preset rule is: when the user's manipulation instruction is detected, acquiring the time point t at which the user triggered the manipulation instruction; and reading the specified cached information present in the cache at time point t and/or the cached information generated in the cache after time point t.
Optionally, acquiring the data information collected by the mobile terminal comprises: receiving a start instruction from the user and entering a shooting/recording mode; and, when it is detected that the mobile terminal has entered the shooting/recording mode, acquiring video data and/or audio data in real time through the camera and/or microphone of the mobile terminal.
The invention also provides a caching processing device, comprising: a data acquisition unit for acquiring data information collected by the mobile terminal; a cache generation unit for processing the data information according to the first preset rule to generate cached information; and a processing unit for receiving the user's manipulation instruction and controlling the mobile terminal to read the cached information according to the second preset rule.
Optionally, the first preset rule is: acquiring a first time point T1 and a second time point T2 at which the data information is generated; intercepting the data information between the first time point T1 and the second time point T2; and storing the data information between the first time point T1 and the second time point T2 in the memory of the mobile terminal to generate the cached information.
Optionally, the first time point T1 and the second time point T2 are separated by an interval N, and the second time point T2 is the real-time time point, so that the cached information stored in the cache is continuously updated.
Optionally, the second preset rule is: when the user's manipulation instruction is detected, acquiring the time point t at which the user triggered the manipulation instruction; and reading the specified cached information present in the cache at time point t and/or the cached information generated in the cache after time point t.
Optionally, the data acquisition unit is further configured to: receive a start instruction from the user and enter the shooting/recording mode; and, when it is detected that the mobile terminal has entered the shooting/recording mode, acquire video data and/or audio data in real time through the camera and/or microphone of the mobile terminal.
In the embodiments of the present invention, data information about the current scene is collected in real time and, as required, a specified portion of the collected data information is stored in the cache, so that the mobile terminal can, according to the user's manipulation instruction, extract the required scene information from the cached data information and process it.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware configuration of a mobile terminal implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a flow diagram of a first embodiment of the caching processing method of the present invention;
Fig. 4 is a flowchart of step S310 in the first embodiment of the caching processing method of the present invention;
Fig. 5 is a flowchart of step S320 in the first embodiment of the caching processing method of the present invention;
Fig. 6 is a flowchart of step S330 in the first embodiment of the caching processing method of the present invention;
Fig. 7 is a schematic diagram of the photographing interface of the mobile terminal in the first embodiment of the caching processing method of the present invention;
Fig. 8 is a schematic diagram of the initial video-recording interface of the mobile terminal in the first embodiment of the caching processing method of the present invention;
Fig. 9 is a schematic diagram of the mobile terminal during video recording in the first embodiment of the caching processing method of the present invention;
Fig. 10 is a schematic diagram of the initial audio-recording interface of the mobile terminal in the first embodiment of the caching processing method of the present invention;
Fig. 11 is a schematic diagram of the mobile terminal during audio recording in the first embodiment of the caching processing method of the present invention;
Fig. 12 is a functional block diagram of a first embodiment of the caching processing device of the present invention.
The realization of the objects, functional features and advantages of the invention will be further described below with reference to the embodiments and the accompanying drawings.
Detailed description of the embodiments
It should be understood that the specific embodiments described herein are intended only to explain the present invention and are not intended to limit it.
A mobile terminal implementing embodiments of the present invention is now described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component" or "unit" are used to denote elements only to facilitate the explanation of the present invention and have no specific meaning in themselves. Therefore, "module" and "component" may be used interchangeably.
A mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following, the terminal is assumed to be a mobile terminal. However, those skilled in the art will appreciate that, except for elements specifically intended for mobile use, configurations according to the embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 1 is a schematic diagram of the hardware configuration of a mobile terminal implementing embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H). The broadcast receiving module 111 may receive broadcast signals using various types of broadcast systems. In particular, it may receive digital broadcasts using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the MediaFLO forward link media data broadcast system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be configured to be suitable for the various broadcast systems providing broadcast signals as well as the above-mentioned digital broadcast systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access for the mobile terminal. This module may be internally or externally coupled to the terminal. The wireless internet access technologies involved may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Examples of short-range communication technologies include Bluetooth™, radio-frequency identification (RFID), the Infrared Data Association (IrDA) standard, ultra-wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or acquiring the location information of the mobile terminal. A typical example of the location information module is GPS (global positioning system). With current technology, the GPS module 115 computes distance information from three or more satellites together with accurate time information, and applies triangulation to the computed information, thereby calculating three-dimensional position information precisely in terms of longitude, latitude and altitude. At present, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information by means of one further satellite. In addition, the GPS module 115 can calculate speed information by continuously computing the current position in real time.
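Purely as an illustration of the positioning principle mentioned above (standard satellite-positioning math, not part of this patent's disclosure): with range measurements to at least four satellites, the receiver position (x, y, z) and its clock bias b satisfy

```latex
% Pseudorange equations for satellite positioning (illustrative only):
% rho_i is the measured range to satellite i at known position (x_i, y_i, z_i),
% (x, y, z) the receiver position, b the receiver clock bias, c the speed of light.
\rho_i = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} + c\,b, \qquad i = 1, \dots, 4
```

Three of these equations fix the three position coordinates; the fourth is what allows the receiver clock error b to be corrected, which is the role of the additional satellite described above.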
The A/V input unit 120 is used to receive audio or video signals. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on a display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided depending on the structure of the mobile terminal. The microphone 122 receives sound (audio data) in operating modes such as a phone call mode, a recording mode and a voice recognition mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. caused by touch), a jog wheel, a jog stick, and the like. In particular, when the touch pad is superimposed on the display unit 151 as a layer, a touch screen may be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., its open or closed state), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or speed and direction of movement of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 1410, which will be described below in connection with the touch screen.
The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for authenticating a user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (hereinafter referred to as the "identification device") may take the form of a smart card; accordingly, the identification device can be connected to the mobile terminal 100 via a port or other connecting means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and to transfer the received input to one or more elements within the mobile terminal 100, or to transfer data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. The various command signals or the power input from the cradle may serve as signals for recognizing whether the mobile terminal is correctly mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audible and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 may display a user interface (UI) or graphical user interface (GUI) related to the call or to other communication (such as text messaging or multimedia file downloading). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 may display captured and/or received images, a UI or GUI showing the video or image and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as layers to form a touch screen, the display unit 151 may serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film-transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow the user to view through them from the outside; such displays may be called transparent displays, a typical transparent display being, for example, a TOLED (transparent organic light-emitting diode) display. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display means); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 may, when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode or the like, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into audio signals and output them as sound. Moreover, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 may provide output to notify of the occurrence of an event of the mobile terminal 100. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 may provide output in different manners to notify of the occurrence of an event. For example, the alarm unit 153 may provide output in the form of vibration; when a call, a message or some other incoming communication is received, the alarm unit 153 may provide tactile output (i.e., vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs for the processing and control operations performed by the controller 180, and may temporarily store data that has been output or is to be output (e.g., phonebooks, messages, still images, videos, etc.). Moreover, the memory 160 may store data on the vibration and audio signals of various patterns output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, hard disk, multimedia card, card-type memory (e.g., SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disk, and so on. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 over a network connection.
The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be configured within the controller 180 or configured separately from the controller 180. The controller 180 may perform pattern recognition processing to recognize handwriting input or picture-drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate the various elements and components.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors and electronic units designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, embodiments such as processes or functions may be implemented with separate software modules that allow at least one function or operation to be performed. The software code may be implemented by a software application (or program) written in any suitable programming language, and may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. In the following, for the sake of brevity, a slide-type mobile terminal is taken as an example among the various types of mobile terminals, such as folder-type, bar-type, swing-type and slide-type mobile terminals. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 shown in Fig. 1 may be configured to operate with wired and wireless communication systems that transmit data via frames or packets, as well as with satellite-based communication systems.
A communication system in which the mobile terminal according to the present invention can operate is now described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), the universal mobile telecommunications system (UMTS) (in particular, long-term evolution (LTE)), the global system for mobile communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275 and a mobile switching center (MSC) 280. The MSC 280 is configured to interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL or xDSL. It will be appreciated that the system shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support multiple frequency assignments, each frequency assignment having a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. A BS 270 may also be referred to as a base transceiver station (BTS) or by other equivalent terms. In such a case, the term "base station" may be used to broadly denote a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, each sector of a particular BS 270 may be referred to as a plurality of cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 transmits broadcast signals to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signals transmitted by the BT 295. Fig. 2 also shows several global positioning system (GPS) satellites 300. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it will be understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Other techniques capable of tracking the position of the mobile terminal may be used instead of, or in addition to, GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally handle satellite DMB transmission.
As a typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100, which typically engage in calls, messaging and other types of communication. Each reverse link signal received by a particular BS 270 is processed by that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including the coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward link signals to the mobile terminals 100.
Based on the hardware structure of the mobile terminal and the communication system described above, embodiments of the method of the present invention are proposed below.
As shown in Fig. 3, the present invention provides a caching processing method applied to the mobile terminal 100, comprising:
Step S310, acquiring the data information collected by the mobile terminal 100. The data information includes video data and audio data, or may consist of video data only or audio data only. The mobile terminal 100 may decide, according to the user's specific manipulation instruction, whether to collect video data and audio data simultaneously, or to collect only video data or only audio data. For example, the mobile terminal 100 may prompt the user to select the type in a pop-up window, or the type of data information to be collected may be controlled by a preset gesture.
As shown in Figure 4, step S310 specifically comprises:
Step S410, receiving a start instruction from the user and entering the shooting/recording mode. The user can take photos or record video in the shooting mode, and can record audio in the recording mode. The user may enter the shooting/recording mode by triggering a specific physical key, or by operating an icon on the touch screen of the mobile terminal 100 to open a specific app.
Step S420, detecting whether the mobile terminal 100 has entered the shooting/recording mode. When it is detected that the mobile terminal 100 has entered the shooting/recording mode, the method proceeds to step S430.
Step S430, the camera and/or microphone of the mobile terminal 100 acquires video data and/or audio data in real time.
The mobile terminal 100 may determine its current mode state by calling an internal interface. In the present embodiment, the currently displayed interface is taken to indicate that the corresponding mode has been entered. As shown in Figs. 7 and 8, the photographing interface and the video-recording interface are displayed in the shooting mode, so the terminal is regarded as having entered the shooting mode. As shown in Fig. 9, when the recording interface is displayed, the terminal is regarded as having entered the recording mode. In other embodiments, if an app of the corresponding mode is detected running in the background of the mobile terminal 100, the mobile terminal 100 is regarded as having entered the corresponding operating mode; for example, when it is detected that the camera application has been opened in the background of the mobile terminal 100, the mobile terminal 100 is regarded as having entered the shooting mode even if the user is currently using the WeChat application.
As shown in Fig. 7, when the mobile terminal 100 is in the photographing interface, the mobile terminal 100 has entered the shooting mode: the camera starts acquiring video data in real time and the microphone starts acquiring audio data. In the shooting mode, the user may also trigger a corresponding instruction to disable real-time acquisition of audio data by the microphone, so that only the camera acquires video data in real time. As shown in Fig. 8, when the mobile terminal 100 is in the video-recording interface, the mobile terminal 100 likewise enters the shooting mode: the camera starts acquiring video data in real time, the microphone starts acquiring audio data, and the user may again trigger a corresponding instruction to disable real-time acquisition of audio data so that only video data is acquired. As shown in Fig. 9, when the mobile terminal 100 is in the recording interface, the mobile terminal 100 enters the recording mode, and the microphone starts acquiring audio data in real time.
It should be noted that while the mobile terminal 100 is in the photographing interface and the camera is acquiring video data and/or the microphone is acquiring audio data in real time, the user has not yet triggered the start-shooting button. Likewise, when the mobile terminal 100 is in the video-recording or recording interface, the user has not yet triggered the button that starts video recording or audio recording.
Step S320, processing the data information according to the first preset rule to generate the cached information. The camera or microphone continuously acquires video data or audio data in real time. According to the generation time of the data information, the processor of the mobile terminal 100 discards data information older than a first time period P1 and stores the data information collected within the first time period P1 in the cache to form the cached information. The first time period P1 refers to the period from a certain time point after the mobile terminal 100 has entered the shooting/recording mode up to the current time. For example, if the mobile terminal 100 enters the shooting mode at 10:00:00 and the current time is 10:00:10, the first time period P1 may be the period 10:00:03 to 10:00:07. The first time period P1 may start being timed once the mobile terminal 100 enters the shooting/recording mode, or timing may start when the camera focuses on a certain object, for example when it focuses on a face or on a moving object.
As shown in Figure 5, step S320 specifically comprises:
Step S510, acquiring the first time point T1 and the second time point T2 at which the data information is generated. After the mobile terminal 100 enters the shooting/recording mode, the camera and/or microphone begins to collect video data and/or audio data. At a certain moment, the first time point T1 at which data information is generated is recorded; after a period of time, the second time point T2 at which data information is generated is recorded. In the present embodiment, the first time point T1 and the second time point T2 are separated by an interval N, where N may be 2 s, 3 s or another interval set according to actual needs. The operation interface of the mobile terminal 100 may provide a corresponding option for the user to input the preset interval N. The interval N may also be set by a rule of the mobile terminal 100: for example, when it is detected that the user is focusing the camera on a person while shooting/recording, the interval N is automatically set to 2 s; when it is detected that the user is shooting/recording a vehicle, an animal or the like, the interval N is automatically set to 3 s. The second time point T2 is always the real-time time, and the first time point T1 is the time lagging the real-time time T2 by N seconds; for example, when N = 2 s and the real-time time T2 is 10:00:02, the first time point T1 is 10:00:00.
Step S520, intercepting the data information between the first time point T1 and the second time point T2. By acquiring the first time point T1 and the second time point T2, the mobile terminal 100 obtains the first time period P1; that is, the first time period P1 is the period between the first time point T1 and the second time point T2. Since the first time point T1 lies N seconds behind the second time point T2, the data information obtained by the camera/microphone before the first time point T1 is discarded, and the data information within the first time period P1 is retained.
Step S530, storing the data information between the first time point T1 and the second time point T2 in the memory of the mobile terminal 100 to generate the cached information. The data information obtained within the first time period P1 is stored in the cache to form the cached information, from which the required data is read during subsequent user operations. Since the second time point T2 is the continuously updated real-time time and the interval N between the first time point T1 and the second time point T2 remains constant, the first time period P1 is always the most recent period; the cached information stored in the cache is therefore updated constantly and stays synchronized with the data information currently being acquired by the camera/microphone. For example, if the camera/microphone has acquired 10 seconds of data, only the most recent 2 seconds of data are cached and the first 8 seconds are discarded.
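As an illustrative sketch only (the class and field names below are assumptions, not part of the patent text), the rolling cache of steps S510 to S530 behaves like a ring buffer that keeps only the data generated between T1 = T2 - N and the real-time point T2:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch of the first preset rule: keep only the data generated
// between T1 = now - N and T2 = now, discarding anything older.
class CacheBuffer {
    private final long windowMillis;              // interval N, e.g. 2000 ms
    private final Deque<Frame> frames = new ArrayDeque<>();

    CacheBuffer(long windowMillis) { this.windowMillis = windowMillis; }

    // Called for every video/audio frame the camera or microphone produces.
    synchronized void append(Frame frame) {
        frames.addLast(frame);
        long t1 = frame.timestampMillis - windowMillis;   // first time point T1
        // Drop frames generated before T1 so the cache always spans [T1, T2].
        while (!frames.isEmpty() && frames.peekFirst().timestampMillis < t1) {
            frames.removeFirst();
        }
    }

    synchronized Deque<Frame> snapshot() { return new ArrayDeque<>(frames); }
}

// Minimal stand-in for one captured video or audio sample.
class Frame {
    final long timestampMillis;
    final byte[] payload;
    Frame(long timestampMillis, byte[] payload) {
        this.timestampMillis = timestampMillis;
        this.payload = payload;
    }
}
```

With N = 2000 ms this reproduces the example above: of 10 seconds of captured data, only the most recent 2 seconds remain in the buffer at any moment.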
Step S330, receiving the user's manipulation instruction and controlling the mobile terminal 100 to read the cached information according to the second preset rule. When the mobile terminal 100 receives the user's manipulation instruction, it obtains the time point t at that moment; for example, the user taps the photographing virtual key or the video-recording virtual key on the touch screen to send the corresponding manipulation instruction to the mobile terminal 100. The mobile terminal 100 reads the cached information of a second time period P2 from the cache, where the second time period is calculated on the basis of this time point t. This ensures that the cached information read by the mobile terminal 100 is always video data or audio data closely related to the picture or sound being acquired by the camera/microphone at time point t.
As shown in Figure 6, step S330 specifically comprises:
Step S610, when the user's manipulation instruction is detected, acquiring the time point t at which the user triggered the manipulation instruction. The user's manipulation instruction may be a photographing instruction triggered by pressing the photographing key in the photographing interface, a recording instruction triggered by pressing the video-recording key in the video-recording interface, or a recording instruction triggered by pressing the recording key in the recording interface. The time point t is the time at which the user triggers the corresponding instruction, i.e., the real-time time at the moment of triggering. The time point t coincides with the second time point T2 of the first time period P1 at that moment; that is, the cached information in the cache at that moment covers the period from t-N to t.
Step S620, reading the specified cached information present in the cache at time point t and/or the cached information generated in the cache after time point t. At time point t, the processor of the mobile terminal 100 may read all of the cached data in the cache, or part of the cached data in the cache, or may take time point t as the starting point and wait to obtain the data subsequently written into the cache, or may both read the cached data present in the cache at time point t and wait, starting from time point t, for the data about to be written into the cache. For example, if the cache always holds 2 seconds of data information, the mobile terminal 100 may at this point read all 2 seconds of cached data, or only 1 second of it, or wait for the data about to be written into the cache, or read both the 2 seconds of data and the data about to be written into the cache; that is, any combination of these portions of data may be taken.
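Continuing the same hedged sketch (assumed names, building on the CacheBuffer above, not the patent's own code), step S620 amounts to taking the frames already cached at the trigger time t and optionally continuing to collect until a later deadline, in any combination:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical reader for the second preset rule: combine cached data present
// at trigger time t with data generated after t.
class TriggerReader {
    private final CacheBuffer cache;

    TriggerReader(CacheBuffer cache) { this.cache = cache; }

    // Frames cached at trigger time t (the [t - N, t] window of step S610).
    List<Frame> readBeforeTrigger(long triggerMillis) {
        List<Frame> out = new ArrayList<>();
        for (Frame f : cache.snapshot()) {
            if (f.timestampMillis <= triggerMillis) {
                out.add(f);
            }
        }
        return out;
    }

    // Frames written after t, collected until t + afterMillis. This simple
    // version assumes afterMillis <= N, so the frames are still in the cache
    // when the deadline is reached (otherwise they would already be evicted).
    List<Frame> collectAfterTrigger(long triggerMillis, long afterMillis) throws InterruptedException {
        long deadline = triggerMillis + afterMillis;
        while (System.currentTimeMillis() < deadline) {
            Thread.sleep(50);                      // wait for new frames to arrive
        }
        List<Frame> out = new ArrayList<>();
        for (Frame f : cache.snapshot()) {
            if (f.timestampMillis > triggerMillis && f.timestampMillis <= deadline) {
                out.add(f);
            }
        }
        return out;
    }
}
```

For instance, with N = 2 s, a trigger at 10:00:20 would yield the cached 10:00:18 to 10:00:20 frames from readBeforeTrigger and, with afterMillis = 2000, the 10:00:20 to 10:00:22 frames from collectAfterTrigger, which matches the photographing example described next.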
To illustrate the present embodiment clearly, the three concrete scenarios of photographing, video recording and audio recording are described below, taking a first time period of 2 s as an example, to explain the caching processing method of the present invention.
As shown in Fig. 7, the user enters the photographing interface via a physical key of the mobile terminal 100 or the camera application on the system desktop of the mobile terminal 100. Suppose it is detected that the mobile terminal 100 enters the photographing mode at 10:00:00 and the user starts using the mobile terminal 100 to frame the picture to be shot. The processor of the mobile terminal 100 controls the camera and microphone to start acquiring video data and audio data in real time and continuously stores the video data and audio data acquired by the camera and microphone within the most recent 2 s in the cache. When the user takes a photo at 10:00:20 by touching the photographing virtual key or a physical key (the home key), the mobile terminal 100 records the time point 10:00:20 at which the key was triggered; the processor of the mobile terminal 100 then reads the video data and audio data of the period 10:00:18 to 10:00:20 from the cache, and continues to read in real time the video data and audio data successively stored in the cache from 10:00:20 until 10:00:22. The second time period P2 in this example is, in effect, the cached information generated within the 2 s before and the 2 s after the preset time point at which the virtual or physical key was triggered. With this scheme, the video data and audio data read by the processor are the data associated with the 2 s before and the 2 s after the photo being taken. In other embodiments, after the mobile terminal 100 obtains the trigger time point 10:00:20, the processor may read only the cached information of the period 10:00:19 to 10:00:20 from the cache, or only the cached information successively stored in the cache from 10:00:20 until 10:00:21, or only the cached information of the period 10:00:18 to 10:00:19.
In other embodiments, the processor of the mobile terminal 100 reads cached data of different time periods depending on the object being focused on; for example, when the focused object is detected to be a person, the second time period P2 consists of the N seconds before the photographing time point and the N seconds after the photographing time point.
In other embodiments, when it is detected that the user keeps pressing the photographing key for n seconds after touching it, the second time period P2 of cached data read by the processor consists of the N seconds before the photographing time point and the n seconds after the photographing time point.
The video-recording scenario is described next, with reference to the video-recording interfaces shown in Figs. 8 and 10, again taking a first time period of 2 s as an example, to illustrate the caching processing method of the present invention.
The operation by which the user enters the video-recording interface is the same as the operation for entering the photographing interface described above; the difference lies in how the processor reads the second time period P2 from the cache. When the user starts video recording at 10:00:20 by touching the video-recording virtual key or a physical key (the home key), the mobile terminal 100 records the trigger time point 10:00:20, and the processor of the mobile terminal 100 reads the video data and audio data of the period 10:00:18 to 10:00:20 from the cache. The mobile terminal 100 then remains in the recording process until the user touches the video-recording virtual key or physical key (home key) again at 10:10:00 to stop recording; the processor then continues to read in real time the video data and audio data successively stored in the cache from 10:10:00 until 10:10:02. That is, when the mobile terminal 100 is in the video-recording mode, the second time period P2 is formed by the N seconds before the recording start time point and the N seconds after the recording end time point.
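A minimal sketch, again with assumed names rather than the patent's own, of how the second time period P2 is delimited in the recording modes described here (a pre-roll of N seconds before the record-start key press, then everything up to N seconds after the record-stop key press):

```java
// Illustrative only: boundaries of the second time period P2 in record mode.
// recordStartMillis/recordEndMillis are the times the user starts and stops recording.
class RecordWindow {
    final long startMillis;   // recordStart - N, served from the rolling cache
    final long endMillis;     // recordEnd + N, read from the cache as it fills

    RecordWindow(long recordStartMillis, long recordEndMillis, long nMillis) {
        this.startMillis = recordStartMillis - nMillis;
        this.endMillis = recordEndMillis + nMillis;
    }
}

// e.g. new RecordWindow(t("10:00:20"), t("10:10:00"), 2000) would span
// 10:00:18 to 10:10:02, as in the video-recording example above
// (t(...) stands for a timestamp helper assumed purely for illustration).
```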
In other embodiments, the processor of the mobile terminal 100 reads cached data of different time periods depending on the object being focused on; for example, when the focused object is detected to be a person, the second time period P2 consists of the N seconds before the recording start time point and the N seconds after the recording end time point.
In other embodiments, when it is detected that the user keeps pressing the recording key for n seconds after touching the video-recording key, the second time period P2 of cached data read by the processor consists of the N seconds before the recording start time point and the n seconds after the recording end time point.
The audio-recording scenario is described below, with reference to the recording interfaces shown in Figs. 9 and 11, again taking a first time period of 2 s as an example, to illustrate the caching processing method of the present invention.
The user enters the recording interface via a physical key of the mobile terminal 100 or the corresponding application on the system desktop of the mobile terminal 100. Suppose it is detected that the mobile terminal 100 enters the recording mode at 10:00:00; the processor of the mobile terminal 100 controls the microphone to start acquiring audio data in real time and continuously stores the audio data acquired by the microphone within the most recent 2 s in the cache. When the user starts recording at 10:00:20 by touching the recording virtual key or a physical key (the home key), the mobile terminal 100 records the trigger time point 10:00:20, and the processor of the mobile terminal 100 reads the audio data of the period 10:00:18 to 10:00:20 from the cache. The mobile terminal 100 then remains in the recording process until the user touches the recording virtual key or physical key (home key) again at 10:10:00 to stop recording; the processor then continues to read in real time the audio data successively stored in the cache from 10:10:00 until 10:10:02. That is, when the mobile terminal 100 is in the recording mode, the second time period P2 is formed by the N seconds before the recording start time point and the N seconds after the recording end time point.
In other embodiments, when it is detected that the user keeps pressing the recording key for n seconds after touching it, the second time period P2 of cached data read by the processor consists of the N seconds before the recording start time point and the n seconds after the recording end time point.
It should be noted that, in the various embodiments of the present invention, the pressing region may be implemented by integrating a pressure sensor into a physical key (which may be the home key) as described with reference to Figs. 7-11; it may also be implemented by arranging a pressure sensor at a certain position on any side of the mobile terminal 100 (the position may be determined by the manufacturer according to what is convenient for the user in practice); or it may simply be one or more preset virtual regions on the touch screen of the terminal. Its specific location does not affect the implementation of the embodiments of the present invention, and the present invention places no limitation on this.
With the caching processing method provided by the present invention, after the mobile terminal 100 enters the shooting/recording mode, the camera and microphone start acquiring video data and audio data in real time. According to the generation time of the data, the processor discards data older than the specified time, i.e., it always caches the most recently generated data. When the user triggers the photographing key or the recording key, the processor reads the video data and audio data of the N seconds before and the N seconds after the time point at which the key was triggered. With this scheme, the video data and audio data read by the processor are the video data and audio data related to the photo, video or audio being captured.
As shown in Fig. 12, a second embodiment of the invention provides a caching processing device, comprising a data acquisition unit 101, a cache generation unit 102 and a processing unit 103.
The data acquisition unit 101 is configured to acquire the data information collected by the mobile terminal 100.
The data information includes video data and audio data, or may consist of video data only or audio data only. The mobile terminal 100 may decide, according to the user's specific manipulation instruction, whether to collect video data and audio data simultaneously, or to collect only video data or only audio data. For example, the mobile terminal 100 may prompt the user to select the type in a pop-up window, or the type of data information to be collected may be controlled by a preset gesture.
The data acquisition unit 101 is also configured to receive a start instruction from the user and enter the shooting/recording mode. The user can take photos or record video in the shooting mode, and can record audio in the recording mode. The user may enter the shooting/recording mode by triggering a specific physical key, or by operating an icon on the touch screen of the mobile terminal 100 to open a specific app.
The data acquisition unit 101 is also configured to detect whether the mobile terminal 100 has entered the shooting/recording mode.
When it is detected that the mobile terminal 100 has entered the shooting/recording mode, the data acquisition unit 101 is also configured to acquire video data and/or audio data in real time through the camera and/or microphone of the mobile terminal 100.
The mobile terminal 100 may determine its current mode state by calling an internal interface. In the present embodiment, the currently displayed interface is taken to indicate that the corresponding mode has been entered. As shown in Figs. 7 and 8, the photographing interface and the video-recording interface are displayed in the shooting mode, so the terminal is regarded as having entered the shooting mode. As shown in Fig. 9, when the recording interface is displayed, the terminal is regarded as having entered the recording mode. In other embodiments, if an app of the corresponding mode is detected running in the background of the mobile terminal 100, the mobile terminal 100 is regarded as having entered the corresponding operating mode; for example, when it is detected that the camera application has been opened in the background of the mobile terminal 100, the mobile terminal 100 is regarded as having entered the shooting mode even if the user is currently using the WeChat application.
As shown in Fig. 7, when the mobile terminal 100 is in the photographing interface, the mobile terminal 100 has entered the shooting mode: the camera starts acquiring video data in real time and the microphone starts acquiring audio data. In the shooting mode, the user may also trigger a corresponding instruction to disable real-time acquisition of audio data by the microphone, so that only the camera acquires video data in real time. As shown in Fig. 8, when the mobile terminal 100 is in the video-recording interface, the mobile terminal 100 likewise enters the shooting mode: the camera starts acquiring video data in real time, the microphone starts acquiring audio data, and the user may again trigger a corresponding instruction to disable real-time acquisition of audio data so that only video data is acquired. As shown in Fig. 9, when the mobile terminal 100 is in the recording interface, the mobile terminal 100 enters the recording mode, and the microphone starts acquiring audio data in real time.
It should be noted that while the mobile terminal 100 is in the photographing interface and the camera is acquiring video data and/or the microphone is acquiring audio data in real time, the user has not yet triggered the start-shooting button. Likewise, when the mobile terminal 100 is in the video-recording or recording interface, the user has not yet triggered the button that starts video recording or audio recording.
Buffer memory generation unit 102, for processing to generate cache information to data message according to the first preset rules.Camera or the real-time acquisition video data of microphone or speech data.The data message exceeding first time period P1, according to the rise time of data message, is given up by the processor of mobile terminal 100, by the data information memory that collects in first time period P1 in buffer memory, to form cache information.First time period P1 refers to this time period of putting sometime to current time after mobile terminal 100 enters into shooting/recording mode.For example, mobile terminal 100 enters image pickup mode in 10:00:00, and current time is 10:00:10, and so first time period P1 can be this time period of 10:00:07 ~ 10:00:03.Certainly, first time period P1 can start timing after mobile terminal 100 enters shooting/recording mode.Also can work as camera focusing to starting during certain objects to start timing, such as: focusing to during face or the object of motion time.
Buffer memory generation unit 102 is further configured to obtain a first time point T1 and a second time point T2 at which the data information is generated. After mobile terminal 100 enters the shooting/recording mode, the camera and/or the microphone begin to collect video data and/or audio data. At a certain moment, the first time point T1 at which data information is generated is recorded; after a period of time, the second time point T2 at which data information is generated is recorded. In this embodiment, the interval between the first time point T1 and the second time point T2 is N, where N may be 2 s, 3 s, or another interval set according to actual needs. The operation interface of mobile terminal 100 may provide a corresponding option interface for the user to enter the preset interval N. Alternatively, mobile terminal 100 may set the interval N according to preset rules; for example, when it is detected that the user is focusing on a person while shooting/recording, the interval N is automatically set to 2 s, and when it is detected that the user is shooting/recording a vehicle, an animal, or the like, the interval N is automatically set to 3 s. The second time point T2 is always the real-time time, and the first time point T1 lags behind the real-time time T2 by N seconds; for example, when N = 2 s and the real-time time T2 is 10:00:02, the first time point T1 is 10:00:00.
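By way of illustration only, the following Python sketch shows how the interval N and the first time point T1 might be derived under the rules just described; the subject labels and helper names are assumptions of this sketch rather than part of the embodiment.

# Illustrative sketch: choose the interval N and derive T1 = T2 - N.
# The subject labels ("person", "vehicle", "animal") are assumed examples.

def choose_interval(subject, user_setting=None):
    """Return the interval N in seconds."""
    if user_setting is not None:          # value entered via the option interface
        return float(user_setting)
    if subject == "person":
        return 2.0                        # focusing on a person: N = 2 s
    if subject in ("vehicle", "animal"):
        return 3.0                        # shooting a vehicle or an animal: N = 3 s
    return 2.0                            # fallback default

def first_time_point(t2, n):
    """The first time point T1 lags the real-time point T2 by N seconds."""
    return t2 - n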
Buffer memory generation unit 102 is further configured to intercept the data information between the first time point T1 and the second time point T2. By obtaining the first time point T1 and the second time point T2, mobile terminal 100 obtains the first time period P1; that is, the first time period P1 is the period between the first time point T1 and the second time point T2. Because the first time point T1 lags behind the second time point T2, the data information acquired by the camera/microphone before the first time point T1 is discarded, and the data information within the first time period P1 is retained.
Buffer memory generation unit 102 is further configured to store the data information between the first time point T1 and the second time point T2 in the memory of mobile terminal 100 to generate the cache information. The data information obtained within the first time period P1 is stored in the cache to form the cache information, and the required data is read from this cache during subsequent user operations. Because the second time point T2 is a continuously updated real-time time and the interval N between the first time point T1 and the second time point T2 remains constant, the first time period P1 is always the most recent time period; consequently, the cache information stored in the cache is also updated continuously and stays synchronized with the data information currently being acquired by the camera/microphone. For example, if the camera/microphone has acquired 10 seconds of data but only the most recent 2 seconds are cached, the first 8 seconds of data are discarded.
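The rolling behaviour of the cache can be pictured with the following minimal Python sketch; the class name, method names, and sample representation are illustrative assumptions, not part of the embodiment.

# Illustrative sketch of the rolling cache: only samples whose timestamps fall
# within the last N seconds (the first time period P1, i.e. T1..T2) are kept.

from collections import deque

class RollingCache:
    def __init__(self, n_seconds):
        self.n = n_seconds                # interval N between T1 and T2
        self.buffer = deque()             # (timestamp, sample) pairs

    def append(self, timestamp, sample):
        """Store a new sample and discard everything generated before T1."""
        self.buffer.append((timestamp, sample))
        t1 = timestamp - self.n           # T1 = T2 - N, T2 being the newest time
        while self.buffer and self.buffer[0][0] < t1:
            self.buffer.popleft()         # drop data older than T1

    def snapshot(self):
        """Return the cached samples between T1 and T2."""
        return list(self.buffer)

# After 10 s of acquisition with n_seconds=2, only the last 2 s remain cached;
# the first 8 s of data have been discarded.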
Processing unit 103 is configured to receive a manipulation instruction from the user and to control mobile terminal 100 to read the cache information according to a second preset rule. When mobile terminal 100 receives the user's manipulation instruction, it obtains the current time point a; for example, the user taps the take-photo virtual key or the record-video virtual key on the touch screen to send the corresponding manipulation instruction to mobile terminal 100. Mobile terminal 100 then reads the cache information of a second time period P2 from the cache, where the second time period is calculated based on the time point a. This ensures that the cache information read by mobile terminal 100 is always video data or audio data closely related to the picture or sound being captured by the camera/microphone at the time point a.
Processing unit 103 is further configured to, when the user's manipulation instruction is detected, obtain the time point t at which the user triggers the manipulation instruction.
The user's manipulation instruction may be a photographing instruction triggered by the take-photo key at the photographing interface, a recording instruction triggered by the record key at the video-recording interface, or a recording instruction triggered by the record button at the sound-recording interface. The time point t is the time at which the user triggers the corresponding instruction, that is, the real-time time at the moment of triggering. The time point t coincides with the second time point T2 of the first time period P1 at that moment. In other words, the cache information held in the cache at this moment covers the time period t-N to t.
Processing unit 103 is further configured to read the specific cache information held in the cache at the time point t and/or the cache information that will be generated and written to the cache after the time point t.
When reading, the processor of mobile terminal 100 may read all of the cached data in the cache at the time point t, or only part of the cached data, or it may take the time point t as a starting point and wait to obtain the data that will be written to the cache, or it may read the cached data at the time point t and also take the time point t as a starting point to obtain the data that will be written to the cache. For example, if the cache always holds 2 seconds of data information, mobile terminal 100 may read out the full 2 seconds of cached data, or only 1 second of it, or wait for the data that is about to be written to the cache, or read the 2 seconds of data together with the data to be written; that is, any combination of these portions may be taken.
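A minimal Python sketch of these read options is given below; it assumes the RollingCache from the previous sketch and a hypothetical acquire_after callable that yields the samples produced after the trigger time t. Which branches are combined is a design choice, since any combination is permitted above.

# Illustrative sketch of the read options at trigger time t; acquire_after is
# a hypothetical callable, not a real API.

def read_on_trigger(cache, t, want_cached=True, want_post=False,
                    post_duration=0.0, acquire_after=None):
    """Combine cached pre-trigger data and/or data produced after time t."""
    data = []
    if want_cached:
        # Everything currently cached, i.e. the data information in t-N .. t.
        data.extend(sample for ts, sample in cache.snapshot() if ts <= t)
    if want_post and acquire_after is not None:
        # Wait for the data generated after time point t and append it as well.
        data.extend(acquire_after(t, post_duration))
    return data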
It should be noted that, as used herein, the terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. In the absence of further limitations, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that comprises that element.
The serial numbers of the above embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus the necessary general hardware platform, and certainly also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part thereof that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit the scope of the claims of the present invention. Any equivalent structural or equivalent process transformation made using the contents of the specification and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims (10)

1. A caching processing method, applied to a mobile terminal, characterized in that the method comprises:
obtaining data information collected by the mobile terminal;
processing the data information according to a first preset rule to generate cache information;
receiving a manipulation instruction of a user, and controlling the mobile terminal to read the cache information according to a second preset rule.
2. The caching processing method according to claim 1, characterized in that the first preset rule is:
obtaining a first time point T1 and a second time point T2 at which the data information is generated;
intercepting the data information between the first time point T1 and the second time point T2;
storing the data information between the first time point T1 and the second time point T2 in a memory of the mobile terminal to generate the cache information.
3. The caching processing method according to claim 2, characterized in that the interval between the first time point T1 and the second time point T2 is N, and the second time point T2 is a real-time time point, so that the cache information stored in the cache is continuously updated.
4. The caching processing method according to claim 3, characterized in that the second preset rule is:
when the manipulation instruction of the user is detected, obtaining a time point t at which the user triggers the manipulation instruction;
reading specific cache information held in the cache at the time point t and/or cache information to be generated after the time point t.
5. The caching processing method according to claim 1, characterized in that obtaining the data information collected by the mobile terminal comprises:
receiving a start instruction of the user, and entering a shooting/recording mode;
when it is detected that the mobile terminal has entered the shooting/recording mode, acquiring video data and/or audio data in real time through a camera and/or a microphone of the mobile terminal.
6. A caching processing device, characterized by comprising:
a data acquisition unit, configured to obtain data information collected by the mobile terminal;
a buffer memory generation unit, configured to process the data information according to a first preset rule to generate cache information;
a processing unit, configured to receive a manipulation instruction of a user and control the mobile terminal to read the cache information according to a second preset rule.
7. The caching processing device according to claim 6, characterized in that the first preset rule is:
obtaining a first time point T1 and a second time point T2 at which the data information is generated;
intercepting the data information between the first time point T1 and the second time point T2;
storing the data information between the first time point T1 and the second time point T2 in a memory of the mobile terminal to generate the cache information.
8. The caching processing device according to claim 7, characterized in that the interval between the first time point T1 and the second time point T2 is N, and the second time point T2 is a real-time time point, so that the cache information stored in the cache is continuously updated.
9. The caching processing device according to claim 8, characterized in that the second preset rule is:
when the manipulation instruction of the user is detected, obtaining a time point t at which the user triggers the manipulation instruction;
reading specific cache information held in the cache at the time point t and/or cache information to be generated after the time point t.
10. The caching processing device according to claim 9, characterized in that the data acquisition unit is further configured to:
receive a start instruction of the user and enter the shooting/recording mode;
when it is detected that the mobile terminal has entered the shooting/recording mode, acquire video data and/or audio data in real time through the camera and/or the microphone of the mobile terminal.
CN201510628540.9A 2015-09-28 2015-09-28 Caching processing method and device Pending CN105357560A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510628540.9A CN105357560A (en) 2015-09-28 2015-09-28 Caching processing method and device


Publications (1)

Publication Number Publication Date
CN105357560A true CN105357560A (en) 2016-02-24

Family

ID=55333399

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510628540.9A Pending CN105357560A (en) 2015-09-28 2015-09-28 Caching processing method and device

Country Status (1)

Country Link
CN (1) CN105357560A (en)



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0799628A (en) * 1993-09-28 1995-04-11 Hitachi Ltd Image pickup device
CN101795356A (en) * 2009-01-14 2010-08-04 索尼公司 Image-capture device, image-capture method, and image-capture program

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106157986A (en) * 2016-03-29 2016-11-23 联想(北京)有限公司 A kind of information processing method and device, electronic equipment
CN106157986B (en) * 2016-03-29 2020-05-26 联想(北京)有限公司 Information processing method and device and electronic equipment
CN105704387A (en) * 2016-04-05 2016-06-22 广东欧珀移动通信有限公司 Shooting method and device of intelligent terminal and intelligent terminal
CN106028098A (en) * 2016-05-26 2016-10-12 努比亚技术有限公司 Video recording method, device, and terminal
CN106412207A (en) * 2016-11-30 2017-02-15 努比亚技术有限公司 Recording method and terminal
CN110324552A (en) * 2018-03-29 2019-10-11 沈阳美行科技有限公司 Grasping means, grasp shoot method, relevant apparatus equipment and the system of audio, video data


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20160224)