CN106534674A - Method for displaying focus area and mobile terminal - Google Patents


Info

Publication number
CN106534674A
Authority
CN
China
Prior art keywords
area
camera
interface
viewing area
focusing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610946035.3A
Other languages
Chinese (zh)
Inventor
魏宇虹
里强
苗雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Publication of CN106534674A
Legal status: Pending


Classifications

    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality, with interactive means for internal management of messages, for image or video messaging
    • G06T 7/40 Image analysis; Analysis of texture
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
    • G06F 9/451 Execution arrangements for user interfaces
    • H04N 23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/62 Control of parameters via user interfaces
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/80 Camera processing pipelines; Components thereof
    • G06F 2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

Embodiments of the invention disclose a method for displaying a focus area, applied to a mobile terminal. The mobile terminal is provided with a camera and with a second display area shown on the shooting interface of the camera; the second display area is used to display the magnified focus area. The method comprises: when shooting with the camera, focusing on a first region of the shooting interface and obtaining the position of the second display area on the shooting interface; when the position of the second display area on the shooting interface overlaps the first region, and after a focus confirmation message is received, controlling the second display area to move to a second region of the shooting interface. Alternatively, the position of the second display area on the shooting interface is the position of the focus area, and when shooting with the camera the second display area directly displays the magnified focus area, so that the focus area is displayed intuitively. Embodiments of the invention also disclose a corresponding mobile terminal.

Description

Method for displaying a focus area, and mobile terminal
Technical field
The present invention relates to focusing technology, and in particular to a method for displaying a focus area and to a mobile terminal.
Background art
At present, when a person or object is photographed with the camera of a mobile terminal, a focus area can be selected according to the actual shooting scene in order to obtain a better shooting effect. However, if the user needs to fine-tune the position of the focus area, the user must know that position precisely. Because the focus area is usually a rather small region, it is not easy for the user to determine its position simply by looking at the shooting interface.
Summary of the invention
To solve the above technical problem, embodiments of the present invention are expected to provide a method for displaying a focus area and a mobile terminal that can display the focus area intuitively.
To achieve the above purpose, the technical solutions of the embodiments of the present invention are implemented as follows.
An embodiment of the present invention provides a method for displaying a focus area, applied to a mobile terminal. The mobile terminal is provided with a camera and with a second display area shown on the shooting interface of the camera; the second display area is used to display the magnified focus area. The method includes:
when shooting with the camera, focusing on a first region of the shooting interface and obtaining the position of the second display area on the shooting interface; when the position of the second display area on the shooting interface overlaps the first region, and after a focus confirmation message is received, controlling the second display area to move to a second region of the shooting interface;
or, the position of the second display area on the shooting interface being the position of the focus area, displaying the magnified focus area in the second display area when shooting with the camera.
In the above solution, after focusing on the first region of the shooting interface and obtaining the position of the second display area on the shooting interface, the method further includes:
when a preset overlap condition is met, determining that the position of the second display area on the shooting interface overlaps the first region;
the overlap condition includes at least one of the following: the position of the second display area on the shooting interface and the first region have an overlapping part; the center of the position of the second display area on the shooting interface lies within the first region.
In the above solution, the overlap condition further includes: the area of the overlapping part between the position of the second display area on the shooting interface and the first region is greater than or equal to a set area threshold, the set area threshold being the product of the area of the first region and a set coefficient, the set coefficient being greater than 0 and less than or equal to 1.
In the above solution, controlling the second display area to move to the second region of the shooting interface includes:
generating second-display-area movement indication information, the second-display-area movement indication information being used to indicate the position of the second region of the shooting interface; and controlling the second display area to move to the position indicated by the second-display-area movement indication information.
In the above solution, after the magnified focus area is displayed in the second display area, the method further includes:
receiving focus-area movement indication information, the focus-area movement indication information being used to indicate the position of the focus area after movement;
controlling the second display area and the focus area to move simultaneously to the position indicated by the focus-area movement indication information.
An embodiment of the present invention further provides a mobile terminal. The mobile terminal includes a camera and a controller; a second display area is shown on the shooting interface of the camera, and the second display area is used to display the magnified focus area;
the controller is configured to, when it is determined that the camera is shooting, focus on a first region of the shooting interface and obtain the position of the second display area on the shooting interface; and, when the position of the second display area on the shooting interface overlaps the first region and after a focus confirmation message is received, control the second display area to move to a second region of the shooting interface;
or, the position of the second display area on the shooting interface is the position of the focus area, and the controller is configured to, when it is determined that the camera is shooting, control the second display area to display the magnified focus area.
In the above solution, the controller is further configured to, after focusing on the first region of the shooting interface and obtaining the position of the second display area on the shooting interface, determine, when a preset overlap condition is met, that the position of the second display area on the shooting interface overlaps the first region;
the overlap condition includes at least one of the following: the position of the second display area on the shooting interface and the first region have an overlapping part; the center of the position of the second display area on the shooting interface lies within the first region.
In the above solution, the overlap condition further includes: the area of the overlapping part between the position of the second display area on the shooting interface and the first region is greater than or equal to a set area threshold, the set area threshold being the product of the area of the first region and a set coefficient, the set coefficient being greater than 0 and less than or equal to 1.
In the above solution, the controller is specifically configured to generate second-display-area movement indication information, the second-display-area movement indication information being used to indicate the position of the second region of the shooting interface, and to control the second display area to move to the position indicated by the second-display-area movement indication information.
In the above solution, the controller is further configured to receive, after controlling the second display area to display the magnified focus area, focus-area movement indication information, the focus-area movement indication information being used to indicate the position of the focus area after movement;
the controller is further configured to control the second display area and the focus area to move simultaneously to the position indicated by the focus-area movement indication information.
In the method for displaying a focus area and the mobile terminal provided by the embodiments of the present invention, the mobile terminal is provided with a camera and with a second display area shown on the shooting interface of the camera, and the second display area is used to display the magnified focus area. The method includes: when shooting with the camera, focusing on a first region of the shooting interface and obtaining the position of the second display area on the shooting interface; when the position of the second display area on the shooting interface overlaps the first region, and after a focus confirmation message is received, controlling the second display area to move to a second region of the shooting interface; or, the position of the second display area on the shooting interface being the position of the focus area, displaying the magnified focus area in the second display area when shooting with the camera. In this way, the focus area can be displayed intuitively.
Description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing the embodiments of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a front view of the mobile terminal involved in the first embodiment of the present invention;
Fig. 4 is a rear view of the mobile terminal involved in the first embodiment of the present invention;
Fig. 5 is a schematic diagram of the second display area shown on the shooting interface of the camera in the first embodiment of the present invention;
Fig. 6 is a flow chart of one implementation of the method for displaying a focus area in the first embodiment of the present invention;
Fig. 7 is a schematic diagram of a user selecting the focus area by clicking in the first embodiment of the present invention;
Fig. 8 is a schematic diagram of the text prompt information sent to the user in an embodiment of the present invention;
Fig. 9 is a flow chart of the method for displaying a focus area in the second embodiment of the present invention;
Fig. 10 is a flow chart of the method for displaying a focus area in the third embodiment of the present invention;
Fig. 11 is a schematic diagram of the composition of the mobile terminal of an embodiment of the present invention.
Detailed description of the embodiments
It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it.
A mobile terminal implementing the embodiments of the present invention is now described with reference to the drawings. In the following description, suffixes such as "module", "part" or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves; therefore, "module" and "part" may be used interchangeably.
A mobile terminal may be implemented in various forms. For example, the terminals described in the embodiments of the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, personal digital assistants (PDA), tablet computers (PAD), portable media players (PMP) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, except for elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing the embodiments of the present invention.
The mobile terminal 100 may include an audio/video (A/V) input unit 120, a user input unit 130, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190 and the like. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.
The A/V input unit 120 is used to receive audio or video signals. The A/V input unit 120 may include a camera 121, which processes image data of still pictures or video obtained by an image capture apparatus in a video capture mode or an image capture mode. The processed image frames may be displayed on a display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium), and two or more cameras 121 may be provided according to the construction of the mobile terminal.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance and the like caused by being touched), a jog wheel, a jog switch and the like. In particular, when the touch pad is superposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The identification module may store various information for verifying that the user is authorized to use the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and the like. In addition, the device having the identification module (hereinafter referred to as the "identification device") may take the form of a smart card; therefore, the identification device can be connected to the mobile terminal 100 via a port or other connection means. The interface unit 170 can be used to receive input (for example, data, information, electric power and the like) from the external device and transmit the received input to one or more elements in the mobile terminal 100, or it can be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which electric power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or electric power input from the cradle may serve as signals for recognizing whether the mobile terminal is correctly mounted on the cradle. The output unit 150 is configured to provide output signals (for example, audio signals, video signals, alarm signals, vibration signals and the like) in a visual, audible and/or tactile manner. The output unit 150 may include a display unit 151 and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a telephone call mode, the display unit 151 may display a user interface (UI) or a graphical user interface (GUI) related to the call or to other communication (for example, text messaging, multimedia file downloading and the like). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 may display captured and/or received images, or a UI or GUI showing video or images and related functions.
Meanwhile, when the display unit 151 and the touch pad are superposed on each other in the form of a layer to form a touch screen, the display unit 151 may serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display and the like. Some of these displays may be constructed to be transparent so that the user can see through them from the outside; such a display may be called a transparent display, and a typical transparent display is, for example, a transparent organic light-emitting diode (TOLED) display. Depending on the desired implementation, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect the touch input pressure as well as the touch input position and touch input area.
The memory 160 may store software programs for the processing and control operations performed by the controller 180, or may temporarily store data that has been output or is to be output (for example, a phone book, messages, still images, video and the like). Moreover, the memory 160 may store data about the vibration and audio signals of various modes that are output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disk and the like. Moreover, the mobile terminal 100 may cooperate, through a network connection, with a network storage device that performs the storage function of the memory 160.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed inside the controller 180, or may be constructed separately from the controller 180. The controller 180 may also perform pattern recognition processing, recognizing handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180, and provides the appropriate electric power required to operate the various elements and components.
The various embodiments described here may be implemented in a computer-readable medium using, for example, computer software, hardware or any combination thereof. For a hardware implementation, the embodiments described here may be implemented by using at least one of application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field programmable gate arrays (FPGA), processors, controllers, microcontrollers, microprocessors and electronic units designed to perform the functions described here; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, embodiments such as processes or functions may be implemented with separate software modules that allow at least one function or operation to be performed. The software code may be implemented by a software application (or program) written in any appropriate programming language, and may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. In the following, for the sake of brevity, a slide-type mobile terminal among the various types of mobile terminals (folder-type, bar-type, swing-type, slide-type and so on) is taken as an example. The present invention can nevertheless be applied to any type of mobile terminal, and is not limited to slide-type mobile terminals.
The mobile terminal 100 shown in Fig. 1 may be constructed to operate with wired and wireless communication systems that transmit data via frames or packets, as well as with satellite-based communication systems.
A communication system in which the mobile terminal according to the present invention is operable is now described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by the communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), the universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), the global system for mobile communications (GSM) and the like. As a non-limiting example, the following description relates to a CDMA communication system, but the teaching applies equally to other types of systems.
Referring to Fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275 and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL or xDSL. It will be understood that the system may include a plurality of BSCs 275, as shown in Fig. 2.
Each BS 270 may serve one or more sectors (or regions), each sector being covered by an omnidirectional antenna or by an antenna pointing in a specific direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, each frequency assignment having a specific spectrum (for example, 1.25 MHz, 5 MHz and the like).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or by another equivalent term. In this case, the term "base station" may be used to refer broadly to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, the individual sectors of a specific BS 270 may be referred to as a plurality of cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 sends a broadcast signal to the mobile terminals 100 operating in the system. A broadcast receiving module 111 is provided at the mobile terminal 100, as shown in Fig. 1, to receive the broadcast signal sent by the BT 295. Fig. 2 also shows several global positioning system (GPS) satellites 300. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
Although a plurality of satellites 300 are depicted in Fig. 2, it should be understood that useful positioning information can be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is generally configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking technology, other technologies capable of tracking the position of the mobile terminal may be used. In addition, at least one GPS satellite 300 may optionally or additionally handle satellite DMB transmissions.
In a typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 generally engage in calls, messaging and other types of communication. Each reverse link signal received by a given base station 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including the coordination of soft handover procedures between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to send forward link signals to the mobile terminals 100.
The embodiments of the present invention are proposed on the basis of the above mobile terminal hardware structure and communication system.
First embodiment
The technical solutions in the embodiments of the present invention are described below clearly and completely in conjunction with the accompanying drawings.
The first embodiment of the present invention provides a method for displaying a focus area, which can be applied to a mobile terminal having a shooting function.
Here, the above-mentioned mobile terminal includes, but is not limited to, a mobile phone, a notebook computer, a camera, a PDA, a PAD, a PMP, a navigation device and the like.
Here, if the mobile terminal has an operating system, the operating system may be UNIX, Linux, Windows, Android, Windows Phone or the like.
In the first embodiment of the present invention, when the above-mentioned mobile terminal is a mobile phone, Fig. 3 is a front view of the mobile terminal involved in the first embodiment of the present invention, and Fig. 4 is a rear view of the mobile terminal involved in the first embodiment of the present invention.
In the first embodiment of the present invention, the mobile terminal is provided with a camera; here, a focus area may also be determined for the scene currently shot by the camera. A second display area may also be shown on the shooting interface of the camera, and the second display area may be used to display the magnified focus area. For example, a magnifier is provided on the shooting interface of the terminal camera, and the second display area is the display area of the magnifier.
Fig. 5 is a schematic diagram of the second display area shown on the shooting interface of the camera in the first embodiment of the present invention. As shown in Fig. 5, when the mobile terminal shoots, the second display area may be shown on a layer above the shooting interface of the camera; that is, the region of the second display area blocks the corresponding part of the camera preview.
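As a rough illustration of what such a magnifier-style second display area could render (this sketch is not part of the patent; the Android Bitmap calls and all names are assumptions made for the example), the focus region can be cropped out of the current preview frame and scaled up to the size of the display area:

```kotlin
import android.graphics.Bitmap

// Illustrative focus-region type: a square crop window around a chosen centre point.
data class FocusRegion(val centerX: Int, val centerY: Int, val radius: Int)

fun renderMagnifiedFocusArea(previewFrame: Bitmap, focus: FocusRegion, displayAreaSize: Int): Bitmap {
    // Clamp the crop rectangle so it stays inside the preview frame.
    val left = (focus.centerX - focus.radius).coerceIn(0, previewFrame.width - 1)
    val top = (focus.centerY - focus.radius).coerceIn(0, previewFrame.height - 1)
    val size = (2 * focus.radius)
        .coerceAtMost(previewFrame.width - left)
        .coerceAtMost(previewFrame.height - top)
        .coerceAtLeast(1)
    val cropped = Bitmap.createBitmap(previewFrame, left, top, size, size)
    // Enlarge the cropped focus region to the size of the second display area.
    return Bitmap.createScaledBitmap(cropped, displayAreaSize, displayAreaSize, true)
}
```

The returned bitmap would then be drawn in the layer above the preview, which is exactly what blocks the corresponding part of the camera picture.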
It should be noted that the type, position and number of cameras on the mobile terminal are not limited. For example, the camera on the terminal may be a front camera or a rear camera; in addition, the specific parameters of the camera are not limited either.
It can be understood that, in the prior art, when the position of the second display area on the shooting interface of the camera blocks the focus area, the position of the focus area cannot be displayed intuitively. To solve the above problem, two implementations for displaying the focus area are proposed in the first embodiment of the present invention; they are described separately below.
First implementation.
Fig. 6 is a flow chart of one implementation of the method for displaying a focus area in the first embodiment of the present invention. As shown in Fig. 6, the flow includes:
Step 601: when shooting with the camera, focus on a first region of the shooting interface and obtain the position of the second display area on the shooting interface.
Here, the position of the second display area on the shooting interface may be a default position, and the default position may be predetermined by the user; for example, the default position is the upper-left corner, the lower-left corner, the upper-right corner, the lower-right corner or the center of the shooting interface.
The first region represents the local region of the shooting interface on which focusing is to be performed. It can be understood that, in an image shot according to the focus area, the focus area is sharper than the other parts.
Further, the position of the first region may be determined by the actual shooting scene, or may be determined according to the shooting needs of the user.
In actual implementation, the user can select the focus area according to the current actual shooting scene; the user can input the position of the focus area to the mobile terminal through the user input unit, so that the mobile terminal determines the position of the focus area. In an optional implementation, when the display screen of the user terminal is a touch display screen, the mobile terminal can send the user, via the display screen, indication information for selecting the focus area. After receiving the indication information, the user can, based on the shooting needs, click the region of the shooting interface on which focusing is to be performed and confirm the selected focus area; after receiving the confirmation, the mobile terminal determines the clicked region as the focus area.
Fig. 7 is a schematic diagram of the user selecting the focus area by clicking in the first embodiment of the present invention. As shown in Fig. 7, the circle on the display interface represents the focus area chosen by the user's click.
It should be noted that the size of the focus area can be configured in advance, and the shape of the focus area may be a circle, an ellipse, a rectangle or the like.
The size of the second display area shown on the shooting interface may be a default value or may be set in advance, and the shape of the second display area may be a circle, an ellipse, a rectangle or the like.
Further, the shape of the focus area may be the same as or different from the shape of the second display area. In an optional implementation, both the focus area and the second display area are circles.
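A minimal sketch of the tap-to-select step described above, assuming circular regions and a preset radius (the types, helper name and the radius value are illustrative, not taken from the patent):

```kotlin
// Turn a tap on the shooting interface into a circular focus area of preset size,
// clamped so the whole circle stays inside the interface bounds.
data class Circle(val cx: Float, val cy: Float, val r: Float)

fun focusAreaFromTap(
    tapX: Float,
    tapY: Float,
    interfaceWidth: Float,
    interfaceHeight: Float,
    presetRadius: Float = 80f  // assumed preconfigured focus-area size
): Circle {
    val cx = tapX.coerceIn(presetRadius, interfaceWidth - presetRadius)
    val cy = tapY.coerceIn(presetRadius, interfaceHeight - presetRadius)
    return Circle(cx, cy, presetRadius)
}
```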
Step 602: when the position of the second display area on the shooting interface overlaps the first region, and after a focus confirmation message is received, control the second display area to move to a second region of the shooting interface.
In a specific implementation, when a preset overlap condition is met, it may be determined that the position of the second display area on the shooting interface overlaps the first region.
Here, the overlap condition includes at least one of the following: the position of the second display area on the shooting interface and the first region have an overlapping part; the center of the position of the second display area on the shooting interface lies within the first region.
For example, when the first region and the second display area are both circles, if the center of the second display area lies within the circular first region, it can be determined that the preset overlap condition is met, and it can then be determined that the position of the second display area on the shooting interface overlaps the first region.
Further, when the overlap condition includes that the position of the second display area on the shooting interface and the first region have an overlapping part, the overlap condition may also include: the area of the overlapping part between the position of the second display area on the shooting interface and the first region is greater than or equal to a set area threshold;
here, the set area threshold is the product of the area of the first region and a set coefficient; the set coefficient is denoted p, where p is greater than 0 and less than or equal to 1.
In an optional implementation, the set coefficient p is equal to 0.5. Thus, when the area of the overlapping part between the position of the second display area on the shooting interface and the first region is greater than or equal to half of the area of the first region, it can be determined that the preset overlap condition is met, and it can then be determined that the position of the second display area on the shooting interface overlaps the first region.
It should be noted that, when the preset overlap condition is not met, it is determined that the position of the second display area on the shooting interface does not overlap the first region.
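The following plain-Kotlin sketch (not part of the patent; the names and the circle-intersection helper are assumptions made for illustration) shows how the overlap condition could be evaluated for the optional case where both the first region and the second display area are circles, with the set coefficient p defaulting to 0.5 as in the example above:

```kotlin
import kotlin.math.*

data class Circle(val cx: Double, val cy: Double, val r: Double)

// Area of the intersection of two circles (standard lens-area formula).
fun intersectionArea(a: Circle, b: Circle): Double {
    val d = hypot(a.cx - b.cx, a.cy - b.cy)
    if (d >= a.r + b.r) return 0.0                              // no overlap at all
    if (d <= abs(a.r - b.r)) return PI * min(a.r, b.r).pow(2)   // one circle inside the other
    val part1 = a.r.pow(2) * acos((d * d + a.r * a.r - b.r * b.r) / (2 * d * a.r))
    val part2 = b.r.pow(2) * acos((d * d + b.r * b.r - a.r * a.r) / (2 * d * b.r))
    val part3 = 0.5 * sqrt((-d + a.r + b.r) * (d + a.r - b.r) * (d - a.r + b.r) * (d + a.r + b.r))
    return part1 + part2 - part3
}

fun overlapConditionMet(firstRegion: Circle, secondDisplayArea: Circle, p: Double = 0.5): Boolean {
    // Condition 1: the centre of the second display area lies inside the first region.
    val centerInside =
        hypot(secondDisplayArea.cx - firstRegion.cx, secondDisplayArea.cy - firstRegion.cy) <= firstRegion.r
    // Condition 2: the overlapping part is at least p times the area of the first region.
    val threshold = p * PI * firstRegion.r.pow(2)
    val bigEnoughOverlap = intersectionArea(firstRegion, secondDisplayArea) >= threshold
    return centerInside || bigEnoughOverlap
}
```

Either branch returning true corresponds to the preset overlap condition being met; with p = 0.5 the area test requires the overlap to cover at least half of the first region, matching the optional implementation above.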
In this step, after the focus area is determined, the user can input a focus confirmation message to the mobile terminal to indicate that focusing according to the current focus area is allowed; here, the way in which the user inputs the focus confirmation message to the mobile terminal is not limited.
Here, controlling the second display area to move to the second region of the shooting interface may include:
generating second-display-area movement indication information, the second-display-area movement indication information being used to indicate the position of the second region of the shooting interface; and controlling the second display area to move to the position indicated by the second-display-area movement indication information.
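One possible way to generate the movement indication information, sketched under the assumption that the candidate second regions are the four corners of the shooting interface (the patent itself does not prescribe how the second region is chosen; all names below are illustrative):

```kotlin
data class Circle(val cx: Float, val cy: Float, val r: Float)

// For relocation we simply avoid any intersection at all between the two circles.
fun overlaps(a: Circle, b: Circle): Boolean {
    val dx = a.cx - b.cx
    val dy = a.cy - b.cy
    return dx * dx + dy * dy < (a.r + b.r) * (a.r + b.r)
}

// Returns the position of the "second region" that the movement indication information points to.
fun chooseSecondRegion(focusArea: Circle, displayRadius: Float, width: Float, height: Float): Circle {
    val candidates = listOf(
        Circle(displayRadius, displayRadius, displayRadius),                   // top-left corner
        Circle(width - displayRadius, displayRadius, displayRadius),           // top-right corner
        Circle(displayRadius, height - displayRadius, displayRadius),          // bottom-left corner
        Circle(width - displayRadius, height - displayRadius, displayRadius)   // bottom-right corner
    )
    return candidates.firstOrNull { !overlaps(it, focusArea) } ?: candidates.last()
}
```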
In another optional implementation, when the position of the second display area on the shooting interface overlaps the first region and a focus confirmation message has been received, prompt information may be sent to the user, prompting the user to move the second display area to the second region of the shooting interface.
In actual implementation, the prompt information may be text prompt information shown on the display screen, or sound prompt information. Fig. 8 is a schematic diagram of the text prompt information sent to the user in an embodiment of the present invention. As shown in Fig. 8, the text prompt information may be "Please move the second display area to a position that does not overlap the focus area". A confirm button is also provided in Fig. 8; after the user clicks the confirm button, the second display area can be moved.
It can be seen from the above description that, with the method for displaying a focus area of the first implementation, when the second display area on the shooting interface blocks the first region of the shooting interface, the second display area can be moved to the second region of the shooting interface. This prevents the position of the second display area on the shooting interface from blocking the focus area, so that the focus area can be displayed intuitively.
Second implementation.
The position of the second display area on the shooting interface is the position of the focus area; thus, when shooting with the camera, the second display area can directly display the magnified focus area.
Here, the display size of the second display area may be consistent with that of the focus area. In this way, when shooting with the camera, since the position of the second display area on the shooting interface is the position of the focus area, the position of the focus area can be determined directly from the position of the second display area, and the magnified focus area can be displayed directly.
Second embodiment
To better illustrate the purpose of the present invention, a further example is given on the basis of the first embodiment of the present invention.
In the second embodiment of the present invention, the mobile terminal is a mobile phone, the camera of the mobile terminal is a rear camera, the second display area is a circle, and the position of the second display area on the shooting interface is a preset position.
Fig. 9 is a flow chart of the method for displaying a focus area in the second embodiment of the present invention. As shown in Fig. 9, the method includes:
Step 901: when it is determined that the camera is shooting, send the user a prompt to select a focus area.
Here, the focusing mode of the camera may be a manual focusing mode; thus, the prompt to select a focus area can be sent to the user by means of a text prompt, a sound prompt or the like.
Step 902: determine the position of the focus area based on the received focus area information selected by the user.
Here, the focus area corresponds to the first region of the shooting interface.
Step 903: judge whether the center of the second display area lies within the determined focus area; if so, execute step 904; otherwise, end the flow directly.
Step 904: judge whether a focus confirmation message has been received; if so, execute step 905; otherwise, end the flow directly.
Here, the focus confirmation message is input by the user. For example, the mobile terminal may send the user a prompt such as "whether to focus according to the currently selected focus area", and the user gives a response to the received prompt. If the response given by the user allows focusing according to the currently selected focus area, the response is the focus confirmation message; otherwise, the response given by the user is not the focus confirmation message.
In an optional implementation, the mobile terminal may start a timer at the moment it determines that the center of the second display area lies within the determined focus area. If the mobile terminal receives the focus confirmation message before the timer expires, step 905 is executed; if the mobile terminal does not receive the focus confirmation message before the timer expires, the flow ends. Here, the duration of the timer can be set in advance.
Step 905: control the second display area to move so that the center of the moved second display area no longer lies within the determined focus area; then end the flow.
Here, the moved second display area is located in the second region of the shooting interface.
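A hedged sketch of steps 903 to 905 with the optional timer, using Kotlin coroutines; the deferred-based confirmation signal and the 3-second timeout are assumptions made for this example, not requirements of the patent:

```kotlin
import kotlinx.coroutines.*

// Wait a bounded time for the user's focus confirmation; if it arrives, move the second
// display area out of the focus area (step 905), otherwise the flow simply ends.
suspend fun handleFocusConfirmation(
    confirmationReceived: CompletableDeferred<Boolean>,
    moveSecondDisplayArea: () -> Unit,
    timeoutMillis: Long = 3_000
) {
    val confirmed = withTimeoutOrNull(timeoutMillis) { confirmationReceived.await() } ?: false
    if (confirmed) {
        moveSecondDisplayArea()  // centre of the display area leaves the focus area
    }
}

fun main() = runBlocking {
    val confirmation = CompletableDeferred<Boolean>()
    launch { delay(500); confirmation.complete(true) }  // simulate the user confirming
    handleFocusConfirmation(confirmation, moveSecondDisplayArea = { println("second display area moved") })
}
```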
It can be seen that, with the method for displaying a focus area of the second embodiment of the present invention, when the position of the second display area on the shooting interface blocks the focus area, the second display area can be moved to the second region of the shooting interface. This prevents the position of the second display area on the shooting interface from blocking the focus area, so that the focus area can be displayed intuitively.
Third embodiment
To better illustrate the purpose of the present invention, a further example is given on the basis of the first embodiment of the present invention.
In the third embodiment of the present invention, the mobile terminal is a mobile phone, the camera of the mobile terminal is a rear camera, the size of the second display area is consistent with that of the focus area, both the second display area and the focus area are circles, and the position of the second display area on the shooting interface is a preset position.
Fig. 10 is a flow chart of the method for displaying a focus area in the third embodiment of the present invention. As shown in Fig. 10, the method includes:
Step 1001: when it is determined that the camera is shooting, send the user the prompt "the position of the focus area is the position of the second display area".
Step 1002: receive focus-area movement indication information, the focus-area movement indication information being used to indicate the position of the focus area after movement.
In an optional implementation, the user can input the movement indication information to the mobile terminal through the user input unit.
Step 1003: based on the received focus-area movement indication information, control the second display area and the focus area to move simultaneously to the position indicated by the focus-area movement indication information.
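A small illustrative sketch of step 1003 (the types and field names are assumptions): a single movement indication repositions the focus area and the second display area together, since the two coincide in this embodiment.

```kotlin
data class Position(val x: Float, val y: Float)

class FocusTracker(var focusCenter: Position, var displayAreaCenter: Position) {
    // The focus-area movement indication carries the position after the move.
    fun onFocusAreaMoveIndication(target: Position) {
        focusCenter = target
        displayAreaCenter = target  // the second display area follows the focus area
    }
}
```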
It can be seen that, with the method for displaying a focus area of the third embodiment of the present invention, since the position of the second display area on the shooting interface is the position of the focus area, the position of the focus area can be determined directly from the position of the second display area, and the magnified focus area can be displayed directly. Further, when the focus area moves, the second display area moves with it; the position of the moved focus area can then likewise be determined directly from the position of the moved second display area.
Fourth embodiment
Corresponding to the method for displaying a focus area of the first embodiment of the present invention, the fourth embodiment of the present invention provides a mobile terminal, which can be applied as a mobile terminal having a shooting function.
Here, the above-mentioned mobile terminal includes, but is not limited to, a mobile phone, a notebook computer, a camera, a PDA, a PAD, a PMP, a navigation device and the like.
Here, if the mobile terminal has an operating system, the operating system may be UNIX, Linux, Windows, Android, Windows Phone or the like.
In the fourth embodiment of the present invention, when the above-mentioned mobile terminal is a mobile phone, Fig. 3 is a front view of the mobile terminal involved in the fourth embodiment of the present invention, and Fig. 4 is a rear view of the mobile terminal involved in the first embodiment of the present invention.
In the fourth embodiment of the present invention, the mobile terminal is provided with a camera; here, a focus area may also be determined for the scene currently shot by the camera. A second display area may also be shown on the shooting interface of the camera, and the second display area may be used to display the magnified focus area.
Referring to Fig. 5, when the mobile terminal shoots, the second display area may be shown on a layer above the shooting interface of the camera; that is, the region of the second display area blocks the corresponding part of the camera preview.
It should be noted that the type, position and number of cameras on the mobile terminal are not limited. For example, the camera on the terminal may be a front camera or a rear camera; in addition, the specific parameters of the camera are not limited either.
It can be understood that, in the prior art, when the position of the second display area on the shooting interface of the camera blocks the focus area, the position of the focus area cannot be displayed intuitively. To solve the above problem, the mobile terminal of the embodiment of the present invention can be used. Fig. 11 is a schematic diagram of the composition of the mobile terminal of an embodiment of the present invention. As shown in Fig. 11, the mobile terminal includes a camera 1101 and a controller 1102; a second display area is shown on the shooting interface of the camera, and the second display area is used to display the magnified focus area. Two implementations of the fourth embodiment of the present invention are described below.
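Purely as an illustration of the composition in Fig. 11 (the interfaces below are assumptions, not the patent's API), the terminal can be modelled as a camera plus a controller that either relocates the second display area or pins it to the focus area:

```kotlin
interface Camera {
    fun focusOn(regionId: Int)  // focus on the selected first region
}

class Controller(private val camera: Camera) {
    fun onShootingStarted(firstRegionId: Int, displayAreaOverlapsFocus: Boolean, pinnedToFocusArea: Boolean) {
        camera.focusOn(firstRegionId)
        when {
            pinnedToFocusArea -> showMagnifiedFocusArea()          // second implementation
            displayAreaOverlapsFocus -> moveSecondDisplayArea()    // first implementation, after confirmation
        }
    }
    private fun showMagnifiedFocusArea() { /* draw the magnified focus area in the display area */ }
    private fun moveSecondDisplayArea() { /* relocate the display area to a second region */ }
}

class MobileTerminal(val camera: Camera, val controller: Controller)
```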
First implementation.
The controller 1102 is configured to, when it is determined that the camera is shooting, focus on a first region of the shooting interface and obtain the position of the second display area on the shooting interface; and, when the position of the second display area on the shooting interface overlaps the first region and after a focus confirmation message is received, control the second display area to move to a second region of the shooting interface.
Here, the position of the second display area on the shooting interface may be a default position, and the default position may be predetermined by the user; for example, the default position is the upper-left corner, the lower-left corner, the upper-right corner, the lower-right corner or the center of the shooting interface.
The first region represents the local region of the shooting interface on which focusing is to be performed. It can be understood that, in an image shot according to the focus area, the focus area is sharper than the other parts.
Further, the position of the first region may be determined by the actual shooting scene, or may be determined according to the shooting needs of the user.
In actual implementation, the user can select the focus area according to the current actual shooting scene; the user can input the position of the focus area to the mobile terminal through the user input unit, so that the mobile terminal determines the position of the focus area. In an optional implementation, when the display screen of the user terminal is a touch display screen, the mobile terminal can send the user, via the display screen, indication information for selecting the focus area. After receiving the indication information, the user can, based on the shooting needs, click the region of the shooting interface on which focusing is to be performed and confirm the selected focus area; after receiving the confirmation, the mobile terminal determines the clicked region as the focus area.
It should be noted that the size of the focus area can be configured in advance, and the shape of the focus area may be a circle, an ellipse, a rectangle or the like.
The size of the second display area shown on the shooting interface may be a default value or may be set in advance, and the shape of the second display area may be a circle, an ellipse, a rectangle or the like.
Further, the shape of the focus area may be the same as or different from the shape of the second display area. In an optional implementation, both the focus area and the second display area are circles.
In a specific implementation, when a preset overlap condition is met, it may be determined that the position of the second display area on the shooting interface overlaps the first region.
Here, the overlap condition includes at least one of the following: the position of the second display area on the shooting interface and the first region have an overlapping part; the center of the position of the second display area on the shooting interface lies within the first region.
For example, when the focus area and the second display area are both circles, if the center of the second display area lies within the circular first region, it can be determined that the preset overlap condition is met, and it can then be determined that the position of the second display area on the shooting interface overlaps the first region.
Further, when the overlap condition includes that the position of the second display area on the shooting interface and the first region have an overlapping part, the overlap condition may also include: the area of the overlapping part between the position of the second display area on the shooting interface and the first region is greater than or equal to a set area threshold;
here, the set area threshold is the product of the area of the first region and a set coefficient; the set coefficient is denoted p, where p is greater than 0 and less than or equal to 1.
In an optional implementation, the set coefficient p is equal to 0.5. Thus, when the area of the overlapping part between the position of the second display area on the shooting interface and the first region is greater than or equal to half of the area of the first region, it can be determined that the preset overlap condition is met, and it can then be determined that the position of the second display area on the shooting interface overlaps the first region.
It should be noted that, when the preset overlap condition is not met, it is determined that the position of the second display area on the shooting interface does not overlap the first region.
Here, after focusing area determination, user can be to mobile terminal input focusing confirmation, to represent that permission is pressed Focused according to current focusing area;Here, user is not limited to the mode that mobile terminal is input into focusing confirmation System.
In actual enforcement, the controller 1102 moves configured information, institute specifically for generating second viewing area State the position of the second area that the second viewing area movement configured information is used to indicate that the camera shoots interface;Control described second Viewing area is moved to the position that the second viewing area movement configured information is indicated.
In another optional embodiment, the position at interface and institute are shot in the camera in second viewing area State first area and formed and overlap, and after receiving focusing confirmation, information can be issued the user with, user is pointed out by the Two viewing areas are moved to the second area that the camera shoots interface.
In actual enforcement, information can be shown in text prompt information, or sound on display screen Information;As shown in figure 8, the word information can be " please by the second viewing area be moved to one not with the focusing Regional location forms the position for overlapping ", confirming button is additionally provided with Fig. 8, after user clicks on confirming button, the can be moved Two viewing areas.
As can be seen from the foregoing description, when the second viewing area on the camera shooting interface blocks the first area of the camera shooting interface, the second viewing area can be moved to the second area of the camera shooting interface, so that the second viewing area is prevented from blocking the focusing area and the focusing area can be displayed intuitively.
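As a purely illustrative sketch of this behaviour, the snippet below (reusing the Circle and meetsOverlapCondition helpers from the previous example) picks one candidate second area that no longer satisfies the overlap condition. The corner-candidate strategy and all names are hypothetical; the patent does not specify how the second area is selected.

```kotlin
// Pick a second area for the second viewing area that no longer overlaps the
// focusing (first) area. Candidates are the four corners of the shooting
// interface, inset so the viewing area stays fully on screen.
fun chooseSecondArea(
    current: Circle,        // current position of the second viewing area
    firstArea: Circle,      // focusing area on the camera shooting interface
    previewWidth: Double,
    previewHeight: Double
): Circle {
    val margin = current.radius
    val candidates = listOf(
        Circle(margin, margin, current.radius),
        Circle(previewWidth - margin, margin, current.radius),
        Circle(margin, previewHeight - margin, current.radius),
        Circle(previewWidth - margin, previewHeight - margin, current.radius)
    )
    // Return the first candidate that fails the overlap condition, i.e. a
    // position that no longer blocks the focusing area; fall back to the
    // current position if every corner would still overlap.
    return candidates.firstOrNull { !meetsOverlapCondition(it, firstArea) } ?: current
}
```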
Second embodiment
In this embodiment, the position of the second viewing area on the camera shooting interface is the position of the focusing area; thus, when shooting with the camera, the magnified focusing area can be displayed directly in the second viewing area.
Specifically, the controller 1102 is configured to, when it is determined that the camera is shooting, control the second viewing area to display the magnified focusing area. In this way, when shooting with the camera, since the position of the second viewing area on the camera shooting interface is the position of the focusing area, the position of the focusing area can be determined directly from the position of the second viewing area, and the magnified focusing area can be displayed directly.
Further, the controller 1102 is also configured to, after controlling the second viewing area to display the magnified focusing area, receive focusing-area movement indication information, the focusing-area movement indication information being used to indicate the position of the focusing area after movement.
The controller 1102 is also configured to control the second viewing area and the focusing area to move simultaneously to the position indicated by the focusing-area movement indication information.
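A minimal sketch of a controller along the lines of the second embodiment is given below, assuming hypothetical MagnifierView and CameraSession abstractions rather than any real camera API: the second viewing area's own position is used as the focusing position, and a focusing-area movement indication moves the viewing area and the focusing area together.

```kotlin
// All types below are placeholders assumed for this sketch, not a real API.

// The second viewing area shown on the camera shooting interface.
interface MagnifierView {
    var centerX: Double
    var centerY: Double
    val radius: Double
    fun showMagnified(pixels: FloatArray)   // display magnified focusing-area pixels
}

// Wraps focusing and preview cropping for the camera.
interface CameraSession {
    fun focusAt(x: Double, y: Double)
    fun cropPreview(x: Double, y: Double, radius: Double): FloatArray
}

class FocusMagnifierController(
    private val magnifier: MagnifierView,
    private val camera: CameraSession
) {
    // When shooting starts, the position of the second viewing area *is* the
    // focusing-area position, so focusing and magnification both read it.
    fun onShootingStarted() {
        camera.focusAt(magnifier.centerX, magnifier.centerY)
        magnifier.showMagnified(
            camera.cropPreview(magnifier.centerX, magnifier.centerY, magnifier.radius)
        )
    }

    // Focusing-area movement indication information carries the new position;
    // the second viewing area and the focusing area move to it simultaneously.
    fun onFocusAreaMoveIndicated(newX: Double, newY: Double) {
        magnifier.centerX = newX
        magnifier.centerY = newY
        camera.focusAt(newX, newY)
        magnifier.showMagnified(camera.cropPreview(newX, newY, magnifier.radius))
    }
}
```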
Those skilled in the art should appreciate that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to operate in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The above are only preferred embodiments of the present invention and are not intended to limit the protection scope of the present invention.

Claims (10)

1. A method for displaying a focusing area, characterized in that the method is applied to a mobile terminal, the mobile terminal being provided with a camera and a second viewing area displayed on a shooting interface of the camera, the second viewing area being used to display a magnified focusing area; the method comprises:
when shooting with the camera, focusing on a first area of the camera shooting interface and obtaining a position of the second viewing area on the camera shooting interface; and, when the position of the second viewing area on the camera shooting interface overlaps the first area and a focusing confirmation is received, controlling the second viewing area to move to a second area of the camera shooting interface;
or, the position of the second viewing area on the camera shooting interface being the position of the focusing area, and, when shooting with the camera, displaying the magnified focusing area by using the second viewing area.
2. The method according to claim 1, characterized in that, after focusing on the first area of the camera shooting interface and obtaining the position of the second viewing area on the camera shooting interface, the method further comprises:
when a preset overlap condition is met, determining that the position of the second viewing area on the camera shooting interface overlaps the first area;
wherein the overlap condition comprises at least one of the following: the position of the second viewing area on the camera shooting interface overlaps the first area; the center of the position of the second viewing area on the camera shooting interface is within the first area.
3. The method according to claim 2, characterized in that the overlap condition further comprises: an overlapping area between the position of the second viewing area on the camera shooting interface and the first area is greater than or equal to a set area threshold, the set area threshold being the product of the area of the first area and a setting coefficient, and the setting coefficient being greater than 0 and less than or equal to 1.
4. The method according to claim 1, characterized in that controlling the second viewing area to move to the second area of the camera shooting interface comprises:
generating second-viewing-area movement indication information, the second-viewing-area movement indication information being used to indicate the position of the second area of the camera shooting interface; and controlling the second viewing area to move to the position indicated by the second-viewing-area movement indication information.
5. The method according to claim 1, characterized in that, after displaying the magnified focusing area by using the second viewing area, the method further comprises:
receiving focusing-area movement indication information, the focusing-area movement indication information being used to indicate the position of the focusing area after movement; and
controlling the second viewing area and the focusing area to move simultaneously to the position indicated by the focusing-area movement indication information.
6. A mobile terminal, characterized in that the mobile terminal comprises a camera and a controller; a second viewing area is provided on a shooting interface of the camera, the second viewing area being used to display a magnified focusing area;
the controller is configured to, when it is determined that the camera is shooting, focus on a first area of the camera shooting interface and obtain a position of the second viewing area on the camera shooting interface; and, when the position of the second viewing area on the camera shooting interface overlaps the first area and a focusing confirmation is received, control the second viewing area to move to a second area of the camera shooting interface;
or, the position of the second viewing area on the camera shooting interface is the position of the focusing area, and the controller is configured to, when it is determined that the camera is shooting, control the second viewing area to display the magnified focusing area.
7. The mobile terminal according to claim 6, characterized in that the controller is further configured to, after focusing on the first area of the camera shooting interface and obtaining the position of the second viewing area on the camera shooting interface, determine, when a preset overlap condition is met, that the position of the second viewing area on the camera shooting interface overlaps the first area;
wherein the overlap condition comprises at least one of the following: the position of the second viewing area on the camera shooting interface overlaps the first area; the center of the position of the second viewing area on the camera shooting interface is within the first area.
8. The mobile terminal according to claim 7, characterized in that the overlap condition further comprises: an overlapping area between the position of the second viewing area on the camera shooting interface and the first area is greater than or equal to a set area threshold, the set area threshold being the product of the area of the first area and a setting coefficient, and the setting coefficient being greater than 0 and less than or equal to 1.
9. The mobile terminal according to claim 6, characterized in that the controller is specifically configured to generate second-viewing-area movement indication information, the second-viewing-area movement indication information being used to indicate the position of the second area of the camera shooting interface, and to control the second viewing area to move to the position indicated by the second-viewing-area movement indication information.
10. The mobile terminal according to claim 6, characterized in that the controller is further configured to, after controlling the second viewing area to display the magnified focusing area, receive focusing-area movement indication information, the focusing-area movement indication information being used to indicate the position of the focusing area after movement;
the controller being further configured to control the second viewing area and the focusing area to move simultaneously to the position indicated by the focusing-area movement indication information.
CN201610946035.3A 2016-10-17 2016-11-02 Method for displaying focus area and mobile terminal Pending CN106534674A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201610907296 2016-10-17
CN2016109072964 2016-10-17
CN201610941145 2016-10-25
CN2016109411450 2016-10-25

Publications (1)

Publication Number Publication Date
CN106534674A true CN106534674A (en) 2017-03-22

Family

ID=57892765

Family Applications (11)

Application Number Title Priority Date Filing Date
CN201610958157.4A Pending CN106504280A (en) 2016-10-17 2016-11-02 A kind of method and terminal for browsing video
CN201610947314.1A Active CN106572303B (en) 2016-10-17 2016-11-02 Picture processing method and terminal
CN201610946494.1A Active CN106572302B (en) 2016-10-17 2016-11-02 A kind of image information processing method and equipment
CN201610958160.6A Pending CN106572249A (en) 2016-10-17 2016-11-02 Region enlargement method and apparatus
CN201610952598.3A Active CN106412324B (en) 2016-10-17 2016-11-02 Device and method for prompting focusing object
CN201610944748.6A Active CN106453924B (en) 2016-10-17 2016-11-02 A kind of image capturing method and device
CN201610947313.7A Pending CN106534675A (en) 2016-10-17 2016-11-02 Method and terminal for microphotography background blurring
CN201610945947.9A Pending CN106375595A (en) 2016-10-17 2016-11-02 Auxiliary focusing apparatus and method
CN201610946035.3A Pending CN106534674A (en) 2016-10-17 2016-11-02 Method for displaying focus area and mobile terminal
CN201610946668.4A Active CN106502693B (en) 2016-10-17 2016-11-02 A kind of image display method and device
CN201610946623.7A Active CN106375596B (en) 2016-10-17 2016-11-02 Device and method for prompting focusing object

Family Applications Before (8)

Application Number Title Priority Date Filing Date
CN201610958157.4A Pending CN106504280A (en) 2016-10-17 2016-11-02 A kind of method and terminal for browsing video
CN201610947314.1A Active CN106572303B (en) 2016-10-17 2016-11-02 Picture processing method and terminal
CN201610946494.1A Active CN106572302B (en) 2016-10-17 2016-11-02 A kind of image information processing method and equipment
CN201610958160.6A Pending CN106572249A (en) 2016-10-17 2016-11-02 Region enlargement method and apparatus
CN201610952598.3A Active CN106412324B (en) 2016-10-17 2016-11-02 Device and method for prompting focusing object
CN201610944748.6A Active CN106453924B (en) 2016-10-17 2016-11-02 A kind of image capturing method and device
CN201610947313.7A Pending CN106534675A (en) 2016-10-17 2016-11-02 Method and terminal for microphotography background blurring
CN201610945947.9A Pending CN106375595A (en) 2016-10-17 2016-11-02 Auxiliary focusing apparatus and method

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN201610946668.4A Active CN106502693B (en) 2016-10-17 2016-11-02 A kind of image display method and device
CN201610946623.7A Active CN106375596B (en) 2016-10-17 2016-11-02 Device and method for prompting focusing object

Country Status (1)

Country Link
CN (11) CN106504280A (en)

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106909274B (en) * 2017-02-27 2020-12-15 南京车链科技有限公司 Image display method and device
CN106973164B (en) * 2017-03-30 2019-03-01 维沃移动通信有限公司 A kind of take pictures weakening method and the mobile terminal of mobile terminal
CN107145285B (en) * 2017-05-12 2019-12-03 维沃移动通信有限公司 A kind of information extracting method and terminal
CN107222676B (en) * 2017-05-26 2020-06-02 Tcl移动通信科技(宁波)有限公司 Blurred picture generation method, storage device and mobile terminal
CN107247535B (en) * 2017-05-31 2021-11-30 北京小米移动软件有限公司 Intelligent mirror adjusting method and device and computer readable storage medium
WO2019014861A1 (en) * 2017-07-18 2019-01-24 Hangzhou Taruo Information Technology Co., Ltd. Intelligent object tracking
CN107613202B (en) * 2017-09-21 2020-03-10 维沃移动通信有限公司 Shooting method and mobile terminal
CN107807770A (en) * 2017-09-27 2018-03-16 阿里巴巴集团控股有限公司 A kind of screenshot method, device and electronic equipment
WO2019113746A1 (en) * 2017-12-11 2019-06-20 深圳市大疆创新科技有限公司 Manual-focus prompt method, control apparatus, photography device, and controller
CN109963200A (en) * 2017-12-25 2019-07-02 上海全土豆文化传播有限公司 Video broadcasting method and device
CN108536364A (en) * 2017-12-28 2018-09-14 努比亚技术有限公司 A kind of image pickup method, terminal and computer readable storage medium
CN108093181B (en) * 2018-01-16 2021-03-30 奇酷互联网络科技(深圳)有限公司 Picture shooting method and device, readable storage medium and mobile terminal
CN108471524B (en) * 2018-02-28 2020-08-07 北京小米移动软件有限公司 Focusing method and device and storage medium
CN108495029B (en) 2018-03-15 2020-03-31 维沃移动通信有限公司 Photographing method and mobile terminal
CN110349223B (en) * 2018-04-08 2021-04-30 中兴通讯股份有限公司 Image processing method and device
CN108876782A (en) * 2018-06-27 2018-11-23 Oppo广东移动通信有限公司 Recall video creation method and relevant apparatus
CN108989674A (en) * 2018-07-26 2018-12-11 努比亚技术有限公司 A kind of browsing video method, terminal and computer readable storage medium
CN109525888A (en) * 2018-09-28 2019-03-26 Oppo广东移动通信有限公司 Image display method, device, electronic equipment and storage medium
CN109816485B (en) * 2019-01-17 2021-06-15 口碑(上海)信息技术有限公司 Page display method and device
CN109648568B (en) * 2019-01-30 2022-01-04 深圳镁伽科技有限公司 Robot control method, system and storage medium
CN110333813A (en) * 2019-05-30 2019-10-15 平安科技(深圳)有限公司 Method, electronic device and the computer readable storage medium of invoice picture presentation
CN111355998B (en) * 2019-07-23 2022-04-05 杭州海康威视数字技术股份有限公司 Video processing method and device
CN110908558B (en) * 2019-10-30 2022-10-18 维沃移动通信(杭州)有限公司 Image display method and electronic equipment
CN112770042B (en) * 2019-11-05 2022-11-15 RealMe重庆移动通信有限公司 Image processing method and device, computer readable medium, wireless communication terminal
CN110896451B (en) * 2019-11-20 2022-01-28 维沃移动通信有限公司 Preview picture display method, electronic device and computer readable storage medium
CN111026316A (en) * 2019-11-25 2020-04-17 维沃移动通信有限公司 Image display method and electronic equipment
CN113132618B (en) * 2019-12-31 2022-09-09 华为技术有限公司 Auxiliary photographing method and device, terminal equipment and storage medium
CN111182211B (en) * 2019-12-31 2021-09-24 维沃移动通信有限公司 Shooting method, image processing method and electronic equipment
CN111526425B (en) * 2020-04-26 2022-08-09 北京字节跳动网络技术有限公司 Video playing method and device, readable medium and electronic equipment
CN111722775A (en) * 2020-06-24 2020-09-29 维沃移动通信(杭州)有限公司 Image processing method, device, equipment and readable storage medium
CN112188260A (en) * 2020-10-26 2021-01-05 咪咕文化科技有限公司 Video sharing method, electronic device and readable storage medium
CN114666490B (en) * 2020-12-23 2024-02-09 北京小米移动软件有限公司 Focusing method, focusing device, electronic equipment and storage medium
CN116055869B (en) * 2022-05-30 2023-10-20 荣耀终端有限公司 Video processing method and terminal

Family Cites Families (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN87200129U (en) * 1987-01-08 1988-01-27 李传琪 Multifunction enlarger
JP4012015B2 (en) * 2002-08-29 2007-11-21 キヤノン株式会社 Image forming apparatus
JP2006295242A (en) * 2005-04-05 2006-10-26 Olympus Imaging Corp Digital camera
CN101202873B (en) * 2006-12-13 2012-07-25 株式会社日立制作所 Method and device for information record reproduction
JP4678603B2 (en) * 2007-04-20 2011-04-27 富士フイルム株式会社 Imaging apparatus and imaging method
CN101398527B (en) * 2007-09-27 2011-09-21 联想(北京)有限公司 Method for implementing zooming-in function on photo terminal and photo terminal thereof
CN101247489A (en) * 2008-03-20 2008-08-20 南京大学 Method for detail real-time replay of digital television
JP2010041175A (en) * 2008-08-01 2010-02-18 Olympus Imaging Corp Image reproducing apparatus, image reproducing method, and program
CN101778214B (en) * 2009-01-09 2011-08-31 华晶科技股份有限公司 Digital image pick-up device having brightness and focusing compensation function and image compensation method thereof
CN101895723A (en) * 2009-05-22 2010-11-24 深圳市菲特数码技术有限公司 Monitoring device
JP5460173B2 (en) * 2009-08-13 2014-04-02 富士フイルム株式会社 Image processing method, image processing apparatus, image processing program, and imaging apparatus
JP5546692B2 (en) * 2011-09-30 2014-07-09 富士フイルム株式会社 Imaging apparatus, imaging method, and program
JP2013093819A (en) * 2011-10-05 2013-05-16 Sanyo Electric Co Ltd Electronic camera
JP5936404B2 (en) * 2012-03-23 2016-06-22 キヤノン株式会社 Imaging apparatus, control method thereof, and program
CN103366352B (en) * 2012-03-30 2017-09-22 北京三星通信技术研究有限公司 Apparatus and method for producing the image that background is blurred
CN102932541A (en) * 2012-10-25 2013-02-13 广东欧珀移动通信有限公司 Mobile phone photographing method and system
CN103049175B (en) * 2013-01-22 2016-08-10 华为终端有限公司 Preview screen rendering method, device and terminal
CN103135927B (en) * 2013-01-25 2015-09-30 广东欧珀移动通信有限公司 A kind of mobile terminal rapid focus photographic method and system
CN103211621B (en) * 2013-04-27 2015-07-15 上海市杨浦区中心医院 Ultrasound directed texture quantitative measuring instrument and method thereof
CN104185981A (en) * 2013-10-23 2014-12-03 华为终端有限公司 Method and terminal selecting image from continuous captured image
CN103631599B (en) * 2013-12-11 2017-12-12 Tcl通讯(宁波)有限公司 One kind is taken pictures processing method, system and mobile terminal
CN104731494B (en) * 2013-12-23 2019-05-31 中兴通讯股份有限公司 A kind of method and apparatus of preview interface selection area amplification
JP6151176B2 (en) * 2013-12-27 2017-06-21 株式会社 日立産業制御ソリューションズ Focus control apparatus and method
CN103777865A (en) * 2014-02-21 2014-05-07 联想(北京)有限公司 Method, device, processor and electronic device for displaying information
CN104333689A (en) * 2014-03-05 2015-02-04 广州三星通信技术研究有限公司 Method and device for displaying preview image during shooting
CN103929596B (en) * 2014-04-30 2016-09-14 努比亚技术有限公司 Guide the method and device of shooting composition
CN104038699B (en) * 2014-06-27 2016-04-06 努比亚技术有限公司 The reminding method of focusing state and filming apparatus
CN104023172A (en) * 2014-06-27 2014-09-03 深圳市中兴移动通信有限公司 Shooting method and shooting device of dynamic image
CN104243825B (en) * 2014-09-22 2017-11-14 广东欧珀移动通信有限公司 A kind of mobile terminal Atomatic focusing method and system
CN104243827A (en) * 2014-09-23 2014-12-24 深圳市中兴移动通信有限公司 Shooting method and device
CN105512136A (en) * 2014-09-25 2016-04-20 中兴通讯股份有限公司 Method and device for processing based on layer
EP3018892A1 (en) * 2014-10-31 2016-05-11 Canon Kabushiki Kaisha Image capturing apparatus and control method thereof
CN104618627B (en) * 2014-12-31 2018-06-08 小米科技有限责任公司 Method for processing video frequency and device
CN105872349A (en) * 2015-01-23 2016-08-17 中兴通讯股份有限公司 Photographing method, photographing device and mobile terminal
CN104660913B (en) * 2015-03-18 2016-08-24 努比亚技术有限公司 Focus adjustment method and apparatus
CN104702846B (en) * 2015-03-20 2018-05-08 惠州Tcl移动通信有限公司 Mobile terminal camera preview image processing method and system
CN104754227A (en) * 2015-03-26 2015-07-01 广东欧珀移动通信有限公司 Method and device for shooting video
CN104836956A (en) * 2015-05-09 2015-08-12 陈包容 Processing method and device for cellphone video
CN104883619B (en) * 2015-05-12 2018-02-09 广州酷狗计算机科技有限公司 Audio-video frequency content commending system, method and device
CN104954672B (en) * 2015-06-10 2020-06-02 惠州Tcl移动通信有限公司 Manual focusing method of mobile terminal and mobile terminal
CN105100615B (en) * 2015-07-24 2019-02-26 青岛海信移动通信技术股份有限公司 A kind of method for previewing of image, device and terminal
CN105141858B (en) * 2015-08-13 2018-10-12 上海斐讯数据通信技术有限公司 The background blurring system and method for photo
CN105611145A (en) * 2015-09-21 2016-05-25 宇龙计算机通信科技(深圳)有限公司 Multi-graphic layer shooting method, multi-graphic layer shooting apparatus and terminal
CN105578275A (en) * 2015-12-16 2016-05-11 小米科技有限责任公司 Video display method and apparatus
CN105843501B (en) * 2016-02-03 2019-11-29 维沃移动通信有限公司 A kind of method of adjustment and mobile terminal of parameter of taking pictures
CN105979165B (en) * 2016-06-02 2019-02-05 Oppo广东移动通信有限公司 Blur photograph generation method, device and mobile terminal

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11196301A (en) * 1997-12-26 1999-07-21 Casio Comput Co Ltd Electronic camera device
JP2003143144A (en) * 2001-11-01 2003-05-16 Matsushita Electric Ind Co Ltd Transmission system and method for detecting delay amount of signal propagation
JP2004064259A (en) * 2002-07-26 2004-02-26 Kyocera Corp System for confirming focus of digital camera
CN101340520A (en) * 2007-07-03 2009-01-07 佳能株式会社 Image data management apparatus and method, and recording medium
CN101494734A (en) * 2008-01-22 2009-07-29 佳能株式会社 Image-pickup apparatus and display controlling method for image-pickup apparatus
US20130100319A1 (en) * 2009-05-15 2013-04-25 Canon Kabushiki Kaisha Image pickup apparatus and control method thereof
JP2011234083A (en) * 2010-04-27 2011-11-17 Canon Inc Imaging apparatus and control method thereof
CN102289336A (en) * 2010-06-17 2011-12-21 昆达电脑科技(昆山)有限公司 picture management system and method
CN103842907A (en) * 2011-09-30 2014-06-04 富士胶片株式会社 Imaging device for three-dimensional image and image display method for focus state confirmation
CN104104787A (en) * 2013-04-12 2014-10-15 上海果壳电子有限公司 Shooting method, shooting system and hand-held device
CN103595919A (en) * 2013-11-15 2014-02-19 深圳市中兴移动通信有限公司 Manual focusing method and shooting device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112584043A (en) * 2020-12-08 2021-03-30 维沃移动通信有限公司 Auxiliary focusing method and device, electronic equipment and storage medium
CN112584043B (en) * 2020-12-08 2023-03-24 维沃移动通信有限公司 Auxiliary focusing method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN106453924A (en) 2017-02-22
CN106572303B (en) 2020-02-18
CN106534675A (en) 2017-03-22
CN106412324B (en) 2020-02-14
CN106453924B (en) 2019-11-15
CN106412324A (en) 2017-02-15
CN106504280A (en) 2017-03-15
CN106375596A (en) 2017-02-01
CN106572249A (en) 2017-04-19
CN106572303A (en) 2017-04-19
CN106375595A (en) 2017-02-01
CN106502693B (en) 2019-07-19
CN106502693A (en) 2017-03-15
CN106572302A (en) 2017-04-19
CN106375596B (en) 2020-04-24
CN106572302B (en) 2019-07-30

Similar Documents

Publication Publication Date Title
CN106534674A (en) Method for displaying focus area and mobile terminal
US9467812B2 (en) Mobile terminal and method for controlling the same
CN103037075B (en) Mobile terminal and method for generating an out-of-focus image
CN106909274A (en) A kind of method for displaying image and device
KR20120075775A (en) Mobile terminal and method for controlling the same
CN106506927A (en) A kind of terminal and the method shot using terminal
CN106775391A (en) Changing interface apparatus and method
CN104991772B (en) Remote operation bootstrap technique and device
CN106713716A (en) Double cameras shooting control method and device
CN106534552B (en) Mobile terminal and its photographic method
CN106406737A (en) A screen operating method and device and a mobile terminal
CN106911881A (en) A kind of an action shot filming apparatus based on dual camera, method and terminal
CN106506965A (en) A kind of image pickup method and terminal
CN106506945A (en) A kind of control method and terminal
CN106502526A (en) A kind of back-stage management method and terminal
CN105681654A (en) Photographing method and mobile terminal
CN106791149A (en) A kind of method of mobile terminal and control screen
CN106557261A (en) Rimless mobile terminal and its touch control method
CN107168612A (en) A kind of image acquisition method and terminal
CN106375608A (en) Terminal and method for using terminal to shoot
CN106897135A (en) Restoration methods and device after a kind of application interruption
CN106454087A (en) Shooting device and method
CN105430379A (en) Method and device for obtaining smoothness of image sensor
CN106791732A (en) A kind of image processing method and device
CN106651823A (en) Device and method for eliminating picture light spot and mobile terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170322