WO2020108261A1 - Shooting method and terminal - Google Patents

Shooting method and terminal

Info

Publication number
WO2020108261A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
input
shooting
preview interface
preview
Prior art date
Application number
PCT/CN2019/116123
Other languages
English (en)
French (fr)
Inventor
崔晓东
Original Assignee
维沃移动通信(杭州)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信(杭州)有限公司
Priority to EP19890183.7A (EP3890303A4)
Publication of WO2020108261A1
Priority to US17/330,602 (US11689649B2)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/02 Constructional features of telephone sets
    • H04M 1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M 1/026 Details of the structure or mounting of specific components
    • H04M 1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/62 Control of parameters via user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/163 Wearable computers, e.g. on a belt
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F 1/1686 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
    • G06F 1/1698 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F 3/005 Input arrangements through a video camera
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N 1/00132 Connection or combination of a still picture apparatus with another apparatus in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N 1/00183 Photography assistance, e.g. displaying suggestions to the user
    • H04N 1/00347 Connection or combination of a still picture apparatus with another still picture apparatus, e.g. hybrid still picture apparatus
    • H04N 1/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 1/32101 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 1/32106 Display, printing, storage or transmission of additional information separate from the image data, e.g. in a different computer file
    • H04N 1/32122 Display, printing, storage or transmission of additional information separate from the image data, in a separate device, e.g. in a memory or on a display separate from image data
    • H04N 23/50 Constructional details
    • H04N 23/51 Housings
    • H04N 23/61 Control of cameras or camera modules based on recognised objects
    • H04N 23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/52 Details of telephonic subscriber devices including functional features of a camera
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/0008 Connection or combination of a still picture apparatus with another apparatus
    • H04N 2201/0013 Arrangements for the control of the connected apparatus by the still picture apparatus
    • H04N 2201/0074 Arrangements for the control of a still picture apparatus by the connected apparatus
    • H04N 2201/0077 Types of the still picture apparatus
    • H04N 2201/0084 Digital still camera
    • H04N 2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 2201/3225 Display, printing, storage or transmission of additional information of data relating to an image, a page or a document
    • H04N 2201/325 Modified version of the image, e.g. part of the image, image reduced in size or resolution, thumbnail or screennail
    • H04N 2201/3273 Display
    • H04N 2201/3278 Transmission

Definitions

  • Embodiments of the present disclosure relate to the field of communication technologies, and in particular, to a shooting method and terminal.
  • In the related art, shooting usually requires the user to tap a shooting control on the mobile terminal, or to use an external tool such as a selfie stick, to perform shooting control. This shooting mode is limited by the distance between the mobile terminal and the photographer, so remote shooting control cannot be achieved.
  • Embodiments of the present disclosure provide a shooting method and terminal to solve the problem that remote shooting control cannot be achieved in the related art.
  • An embodiment of the present disclosure provides a shooting method, which is applied to a first terminal and includes: in a state where a remote connection for sharing a shooting preview screen has been established with a second terminal, receiving and displaying a first preview interface of the second terminal sent by the second terminal; receiving a first input of a user; and, in response to the first input, outputting first target shooting data;
  • where the first target shooting data includes part or all of image information of the first preview interface, and the first target shooting data is a video or an image.
  • An embodiment of the present disclosure also provides a shooting method, which is applied to a second terminal and includes: in a state where a remote connection for sharing a shooting preview screen has been established with a first terminal, sending a first preview interface of the second terminal to the first terminal; and receiving and displaying a second preview interface of the first terminal sent by the first terminal.
  • An embodiment of the present disclosure also provides a terminal, where the terminal is a first terminal among a first terminal and a second terminal that establish a remote connection for sharing a shooting preview screen.
  • an embodiment of the present disclosure further provides a terminal.
  • the terminal is a second terminal among a first terminal and a second terminal that establish a remote connection for sharing a preview image, and includes:
  • a third sending module configured to send the first preview interface of the second terminal to the first terminal
  • the seventh receiving module is configured to receive and display the second preview interface of the first terminal sent by the first terminal.
  • An embodiment of the present disclosure also provides another terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the above shooting method.
  • Embodiments of the present disclosure also provide a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the above-described shooting method.
  • In the embodiments of the present disclosure, in a state where a remote connection for sharing the shooting preview screen has been established with the second terminal, the first terminal receives and displays the first preview interface of the second terminal sent by the second terminal;
  • receives a first input of the user and, in response to the first input, outputs first target shooting data; where the first target shooting data includes part or all of the image information of the first preview interface, and the first target shooting data is a video or an image.
  • FIG. 1 is a flowchart of a shooting method provided by an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of a display interface provided by an embodiment of the present disclosure
  • FIG. 3 is a second schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 4 is a third schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 5 is a fourth schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 6 is a fifth schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 7 is a sixth schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 8 is a seventh schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 9 is an eighth schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 10 is a ninth schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 11 is a tenth schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 12 is an eleventh schematic diagram of a display interface provided by an embodiment of the present disclosure.
  • FIG. 13 is a flowchart of another shooting method provided by an embodiment of the present disclosure.
  • FIG. 14 is a structural diagram of a terminal provided by an embodiment of the present disclosure.
  • FIG. 15 is a structural diagram of another terminal provided by an embodiment of the present disclosure.
  • FIG. 16 is a structural diagram of another terminal provided by an embodiment of the present disclosure.
  • FIG. 17 is a structural diagram of another terminal provided by an embodiment of the present disclosure.
  • FIG. 1 is a flowchart of a shooting method provided by an embodiment of the present disclosure.
  • the shooting method is applied to the first terminal. As shown in FIG. 1, the shooting method includes:
  • Step 101: In a state where a remote connection for sharing a shooting preview screen has been established with a second terminal, receive and display a first preview interface of the second terminal sent by the second terminal.
  • A camera sharing connection can be established with the second terminal through the camera application itself. For example, a sharing control is set on the interface of the camera application, a second terminal to share with is added by clicking the sharing control, and a sharing request is sent to the second terminal to achieve camera sharing. In another embodiment, camera sharing can also be performed through a social platform.
  • For example, a camera sharing control is set in the chat interface of the social platform; by clicking the camera sharing control, a camera sharing connection is established with a second terminal corresponding to at least one contact in the social platform.
  • If the current interface is a one-to-one chat interface, a camera sharing connection can be established with the terminal corresponding to the contact in the chat interface; if it is currently a group chat interface, a camera sharing connection can be established with the terminal corresponding to at least one contact in the group. That is to say, the first terminal may establish a camera sharing connection with one or more second terminals.
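  • As an illustration only (not part of the disclosed method), the following Kotlin sketch models how a first terminal might track which contacts' second terminals have accepted a camera sharing request; the class and function names are hypothetical.

```kotlin
// Hypothetical sketch: tracking camera-sharing sessions between a first terminal
// and one or more second terminals selected from a chat or group chat.
data class Contact(val id: String, val displayName: String)

class CameraShareSession {
    private val connectedPeers = mutableSetOf<Contact>()

    // Send a sharing request to each selected contact; peers that accept join the session.
    fun requestSharing(contacts: List<Contact>, accepts: (Contact) -> Boolean) {
        for (contact in contacts) {
            if (accepts(contact)) connectedPeers += contact
        }
    }

    fun peers(): Set<Contact> = connectedPeers
}

fun main() {
    val session = CameraShareSession()
    val group = listOf(Contact("a", "User A"), Contact("b", "User B"))
    // In a real app the acceptance would come from the network; here every contact accepts.
    session.requestSharing(group) { true }
    println("Sharing preview with: ${session.peers().map { it.displayName }}")
}
```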
  • the first terminal can acquire and display the first preview interface collected by the second terminal in real time.
  • the first terminal has a camera, and the first terminal may obtain and display a second preview interface collected by the camera of the first terminal.
  • The manner in which the first terminal displays the first preview interface and the second preview interface may be as shown in FIGS. 2 to 6. When the first terminal is a single-screen terminal, as shown in FIG. 2, the first terminal may display the first preview interface, where the first preview interface includes a first portrait picture 11 and a first background 12; after receiving a switching command, the second preview interface can be displayed, as shown in FIG. 3, where the second preview interface includes a second portrait picture 21 and a second background 22.
  • When the first terminal has two display screens, the first preview interface may be displayed on one screen, and the second preview interface may be displayed on the other screen.
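  • A minimal sketch, assuming a simple frame-routing abstraction, of how the first terminal could place the shared first preview interface and the local second preview interface on one or two screens; the types below are illustrative and not the patent's implementation.

```kotlin
// Illustrative sketch: routing preview frames to one or two screens depending on
// whether the first terminal is a single-screen or dual-screen device.
data class PreviewFrame(val sourceTerminalId: String, val pixels: ByteArray)

interface Screen { fun render(frame: PreviewFrame) }

class PreviewRouter(private val screens: List<Screen>) {
    // remoteFirst: frames from the second terminal (first preview interface)
    // localSecond: frames from the local camera (second preview interface)
    fun display(remoteFirst: PreviewFrame, localSecond: PreviewFrame, showRemote: Boolean) {
        if (screens.size >= 2) {
            screens[0].render(remoteFirst)   // dual screen: one interface per screen
            screens[1].render(localSecond)
        } else {
            // single screen: show one interface; a switch command toggles showRemote
            screens[0].render(if (showRemote) remoteFirst else localSecond)
        }
    }
}
```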
  • Step 102: Receive a first input of the user.
  • the first input is used to control the first terminal to take a video or take a photo.
  • the user can make the first input through a shooting button, voice, or gesture, which is not further limited herein.
  • Step 103: In response to the first input, output first target shooting data.
  • the first target shooting data includes part or all of image information of the first preview interface, and the first target shooting data is a video or an image.
  • In response to the first input, the first terminal may directly perform video recording or photographing on part or all of the image information of the first preview interface and output the first target shooting data. The first target shooting data may be stored in the cache of the first terminal and saved locally through a saving operation of the user, or it may be directly saved locally.
  • the first terminal may also output a shooting instruction to the second terminal to control the camera of the second terminal to perform a shooting action, and update the first preview interface of the first terminal in real time.
  • the first terminal may perform video recording or photo processing on part or all of the image information of the current first preview interface.
  • The second terminal may locally save the shooting data generated by this shooting action.
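  • The following hedged Kotlin sketch illustrates steps 102 and 103 as described above: in response to the first input, the first terminal either captures the displayed first preview interface locally or sends a shooting instruction so the second terminal performs the shooting action; all names are assumptions made for illustration.

```kotlin
// Sketch of handling the first input on the first terminal (illustrative only).
sealed class FirstTargetShootingData {
    data class Image(val bytes: ByteArray) : FirstTargetShootingData()
    data class Video(val bytes: ByteArray) : FirstTargetShootingData()
}

class RemoteShootingController(
    private val captureLocalPreview: () -> ByteArray,     // grabs the displayed first preview interface
    private val sendShootInstruction: (String) -> Unit    // notifies the second terminal
) {
    fun onFirstInput(recordVideo: Boolean, shootRemotely: Boolean): FirstTargetShootingData? {
        if (shootRemotely) {
            // The second terminal performs the shooting action and saves the data itself.
            sendShootInstruction(if (recordVideo) "RECORD" else "PHOTO")
            return null
        }
        val data = captureLocalPreview()
        return if (recordVideo) FirstTargetShootingData.Video(data)
               else FirstTargetShootingData.Image(data)
    }
}
```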
  • first target shooting data may include part or all image information of the first preview interface.
  • When the first target shooting data includes all of the image information of the first preview interface, the first terminal may directly capture the displayed first preview interface to obtain the captured image.
  • When the first target shooting data includes part of the image information of the first preview interface, an image obtained by stitching and synthesizing the first preview interface and the second preview interface is captured, which will be described in detail below.
  • Optionally, before the receiving of the first input of the user, the method further includes: displaying a target preview interface;
  • where the target preview interface is an interface obtained by synthesizing image data of the first preview interface and the second preview interface;
  • the second preview interface is a preview interface collected by the camera of the first terminal
  • the first target shooting data is all image data of the target preview interface.
  • the target preview interface may include part of the image data of the first preview interface and part of the image data of the second preview interface.
  • the first preview interface may include a first portrait screen 11 and a first background 12; the above-mentioned second preview interface may include a second portrait screen 21 and a second background 22.
  • The above target preview interface may be synthesized in, but is not limited to, any of the following manners:
  • Manner 1: the second portrait picture 21 and the first background 12;
  • Manner 2: the first portrait picture 11, the second portrait picture 21 and the first background 12;
  • Manner 3: the first portrait picture 11, the second portrait picture 21 and the second background 22;
  • Manner 4: the first portrait picture 11, the second portrait picture 21, part of the first background 12 and part of the second background 22.
  • In this way, the first preview interface and the second preview interface can be stitched and synthesized into the target preview interface, and the target preview interface is then processed to obtain the first target shooting data, which improves the flexibility of shooting.
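  • A minimal sketch of the four synthesis manners listed above, assuming the portraits and backgrounds are available as separate layers that a compositor draws from bottom to top; the layer and enum names are hypothetical.

```kotlin
// Illustrative mapping from a synthesis manner to an ordered layer stack (bottom to top).
data class Layer(val id: String)  // e.g. "firstPortrait", "secondBackground", ...

enum class ComposeManner {
    REMOTE_PORTRAIT_ON_FIRST_BG,   // Manner 1
    BOTH_ON_FIRST_BG,              // Manner 2
    BOTH_ON_SECOND_BG,             // Manner 3
    BOTH_ON_MIXED_BG               // Manner 4
}

fun composeTargetPreview(
    manner: ComposeManner,
    firstPortrait: Layer, secondPortrait: Layer,
    firstBackground: Layer, secondBackground: Layer
): List<Layer> = when (manner) {
    ComposeManner.REMOTE_PORTRAIT_ON_FIRST_BG -> listOf(firstBackground, secondPortrait)
    ComposeManner.BOTH_ON_FIRST_BG -> listOf(firstBackground, firstPortrait, secondPortrait)
    ComposeManner.BOTH_ON_SECOND_BG -> listOf(secondBackground, firstPortrait, secondPortrait)
    ComposeManner.BOTH_ON_MIXED_BG -> listOf(firstBackground, secondBackground, firstPortrait, secondPortrait)
}
```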
  • Optionally, the background in the target preview interface can also be updated. Specifically, after the target preview interface is displayed, the method further includes: receiving a second input of the user; and, in response to the second input, updating the background of a target area in the target preview interface;
  • where the updated background of the target area includes part or all of the background in the first preview interface, or part or all of the background in the second preview interface; the target area is the display area of the image data of the first preview interface or the display area of the image data of the second preview interface.
  • updating the background of the target area in the target preview interface refers to switching the background of the target area.
  • The background of the target preview interface may be set to the first background 12 or the second background 22; in this embodiment, the background of the target area can be switched through the above-mentioned second input, for example, the first background 12 can be switched to the second background 22, or the second background 22 can be switched to the first background 12. Since the background of the target area can be switched, the shooting flexibility is further improved.
  • the default is the single-person shooting mode.
  • the target preview interface shown in FIG. 6 includes a second portrait screen 21 and a first background 12.
  • the user can click the shooting mode switch to switch to the group photo shooting mode (for example, click the control B in the upper left corner of the figure), thereby displaying the target preview interface as shown in FIG. 7.
  • the target area is the entire display interface, and the user can update the display interface of the target area through the second input.
  • The user can click the first background 12 to switch the background to the second background 22; at this time, the target preview interface shown in FIG. 8 is displayed.
  • the input method of the second input may be set according to actual conditions, for example, the target area may be directly clicked, an operation control may be set, or the second input may be realized by an operation gesture or the like.
  • The target preview interface may further include a dividing line for distinguishing the image data of the first preview interface from the image data of the second preview interface, and the target preview interface is divided by the dividing line into a first preview sub-region and a second preview sub-region;
  • Optionally, the method further includes: receiving a third input of the user to the dividing line; and, in response to the third input, updating the display areas of the image data of the first preview sub-region and the second preview sub-region.
  • the dividing line is the broken line shown in FIG. 9.
  • The dividing line is located in the middle of the screen, and the target preview interface is divided into the first preview sub-region and the second preview sub-region.
  • the user can adjust the position of the first dividing line through a drag operation to adjust the area of the image data of the first preview sub-region and the second preview sub-region.
  • the target preview interface after the adjustment is shown in FIG. 10. In this way, different segmentation methods can be set according to different preferences of the user, thereby making the background segmentation method more flexible.
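  • A small illustrative sketch of the draggable dividing line described above: the third input moves the line and the two preview sub-regions are resized accordingly (the normalized coordinates and clamping values are assumptions).

```kotlin
// Minimal sketch of a draggable divider between the two preview sub-regions.
// Positions are normalized to the range 0..1 across the target preview interface.
class SplitPreview(var divider: Float = 0.5f) {
    fun onDividerDragged(newPosition: Float) {
        divider = newPosition.coerceIn(0.1f, 0.9f)  // keep both sub-regions visible
    }
    fun firstSubRegionFraction() = divider          // area showing the first preview interface
    fun secondSubRegionFraction() = 1f - divider    // area showing the second preview interface
}
```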
  • the number of the above-mentioned second terminals may be one or more.
  • the above-mentioned first terminals may include a first screen and a second screen.
  • Optionally, before the displaying of the preview interface of the second terminal, preview interfaces of at least two second terminals are displayed on the first screen. The displaying of the preview interface of the second terminal includes: receiving a fourth input of the user to the preview interfaces of the at least two second terminals; and, in response to the fourth input, displaying the first preview interface of the second terminal selected by the fourth input.
  • The user of the first terminal can directly establish a remote connection for sharing the shooting preview screen with multiple second terminals through social software.
  • The preview interfaces of all of the second terminals can be displayed on the first screen.
  • the above-mentioned first screen may display a preview interface of all second terminals and the first preview interface.
  • the fourth input may be a touch input to the preview interface of the second terminal.
  • A switching control A can also be displayed on the first screen, and the switching control A can be used to switch the first preview interface.
  • Since the preview interfaces of at least two second terminals are displayed on the first screen and the first preview interface can be selected through the fourth input, multiple terminals can simultaneously share shooting preview images, which facilitates user operations.
  • the shooting data may be stitched together.
  • Optionally, the method further includes: receiving N fifth inputs of the user to preview interfaces of N second terminals; in response to the N fifth inputs, respectively displaying the preview interfaces of the N second terminals; acquiring N pieces of shooting data generated based on the preview interfaces of the N second terminals; and stitching the N pieces of shooting data to generate second target shooting data.
  • the fifth input may be an input for performing a touch operation on the preview interface of the second terminal.
  • The above-mentioned N is an integer greater than 1. Specifically, the size of N may be preset, for example, 2 or 4; that is, after the user performs the fixed number of fifth inputs, the preview interfaces of the N second terminals are respectively displayed on the second screen in response to the N fifth inputs.
  • Alternatively, the above N may not be fixed. For example, after the fifth input is performed for the M-th time, a timer may be started to determine whether another fifth input is received within a preset period of time;
  • if another fifth input is received, the timer is restarted; if no fifth input is received, M is determined as N, and the N fifth inputs are responded to. In other embodiments, the above N may also be the number of times the fifth input is received within a preset time period.
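  • One possible way to realize the non-fixed N described above is an inactivity timeout after the last fifth input, as in the following hedged sketch; the timeout handling and callback shape are assumptions.

```kotlin
// Illustrative collector: after each fifth input a quiet-period window restarts; once no
// further input arrives within the window, the count so far becomes N and is responded to.
class FifthInputCollector(private val timeoutMs: Long, private val onDone: (List<String>) -> Unit) {
    private val selected = mutableListOf<String>()
    private var lastInputAt = 0L

    fun onFifthInput(terminalId: String, nowMs: Long) {
        selected += terminalId
        lastInputAt = nowMs
    }

    // Called periodically (e.g. from a UI timer); fires once the quiet period has elapsed.
    fun tick(nowMs: Long) {
        if (selected.isNotEmpty() && nowMs - lastInputAt >= timeoutMs) {
            onDone(selected.toList())   // N = selected.size
            selected.clear()
        }
    }
}
```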
  • the fifth input may be a touch input to the second terminal, for example, it may be a click operation on the preview interface of the second terminal.
  • The display mode of the preview interfaces of the second terminals can be set according to the actual situation. For example, in an embodiment, after each fifth input, the preview interface of the currently clicked second terminal can be displayed, and after the i-th fifth input, the preview interface of the second terminal corresponding to the i-th fifth input may be photographed to obtain the i-th piece of shooting data.
  • Optionally, the acquiring of the N pieces of shooting data generated based on the preview interfaces of the N second terminals includes: in a case where the i-th fifth input to the i-th second terminal is received and the preview interface of the i-th second terminal is displayed, receiving a sixth input of the user; and, in response to the sixth input, performing a shooting operation to generate the i-th piece of shooting data; where:
  • the i-th shooting data is an image or a video
  • i is a positive integer
  • i is less than or equal to N.
  • the sixth input is used to control the first terminal to take a video or take a photo, and the user can make the sixth input through a shooting button, voice, or gesture, which is not further limited herein.
  • the preview interface of the second terminal is photographed through the sixth input, thereby improving the pertinence of shooting and ensuring the shooting effect.
  • The display and shooting process of the preview interfaces of the second terminals may also be implemented in other manners. For example, each time the user clicks the preview interface of a second terminal, a display of that second terminal's preview interface is added on the second screen; after the fifth input has been performed N times, each preview interface is photographed or video-recorded to obtain N pieces of shooting data. Alternatively, the preview interface of each second terminal may be displayed for a preset duration and then switched to the next preview interface, with corresponding shooting data generated while each preview interface is displayed, so that N pieces of shooting data are obtained. Finally, the N pieces of shooting data are stitched together.
  • For example, the first terminal may capture the preview interface of user A (the user of one second terminal) to obtain a first photo, capture the preview interface of user B (the user of another second terminal) to obtain a second photo, and then stitch the two photos to obtain the final photo. Since multiple shootings can be performed to obtain multiple pieces of shooting data and the shooting data can be stitched, the operation is convenient, and any terminal can obtain an image synthesized from shots of the other terminals. In addition, the user can save only the second target shooting data, which reduces memory consumption.
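  • A simplified sketch of stitching the N pieces of shooting data side by side into the second target shooting data; a real implementation would also handle videos and photos of different sizes.

```kotlin
// Illustrative horizontal stitching of N captured photos of equal height.
data class Photo(val width: Int, val height: Int, val pixels: IntArray)

fun stitchHorizontally(photos: List<Photo>): Photo {
    require(photos.isNotEmpty() && photos.all { it.height == photos[0].height })
    val height = photos[0].height
    val width = photos.sumOf { it.width }
    val out = IntArray(width * height)
    var xOffset = 0
    for (photo in photos) {
        for (y in 0 until height) {
            // copy one row of this photo into the output at the current horizontal offset
            System.arraycopy(photo.pixels, y * photo.width, out, y * width + xOffset, photo.width)
        }
        xOffset += photo.width
    }
    return Photo(width, height, out)
}
```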
  • Optionally, the shooting field of view of the second terminal may also be adjusted through the first terminal. Specifically, after the above step 101, the method further includes: receiving a first touch input of the user; acquiring a first input trajectory of the first touch input; generating a first shooting adjustment message based on the first input trajectory; and sending the first shooting adjustment message to the second terminal;
  • where the first shooting adjustment message is used by the second terminal to adjust the shooting field of view, and the first shooting adjustment message carries the first input trajectory.
  • the first touch input is a sliding input.
  • the user may perform the first touch input on the first preview interface or the first touch input on the second preview interface.
  • the above-mentioned first input trajectory may be an actual sliding trajectory or a straight-line trajectory from the starting point to the ending point.
  • The first touch input can be performed on the first terminal; the first terminal generates a first shooting adjustment message according to the first input trajectory of the first touch input, and then sends the first shooting adjustment message to the second terminal.
  • The second terminal first receives the first shooting adjustment message sent by the first terminal, then extracts the first input trajectory from the first shooting adjustment message, and determines the target adjustment angle based on the first input trajectory; depending on the size of the target adjustment angle, the second terminal can perform different operations.
  • In a case where the target adjustment angle is less than or equal to a preset threshold, the position of the camera lens of the second terminal is adjusted based on the first input trajectory.
  • the lens of the camera can be mounted on a position adjustment component, which can control the lens to move within a certain range.
  • the position adjustment component can be an optical image stabilization component (OIS).
  • OIS can control the camera to pan to achieve optical image stabilization.
  • the position of the lens can be adjusted through OIS, thereby changing the shooting angle.
  • OIS can only control the lens of the camera to move in a smaller range.
  • the above-mentioned preset threshold is related to the range of the lens movement of the camera controlled by OIS, and the specific size will not be further described here.
  • In this way, the user of the first terminal can remotely control the OIS of the second terminal through the first shooting adjustment message to adjust the shooting angle of the second terminal, so that the shooting operation can be completed automatically, which reduces the shooting difficulty.
  • In a case where the target adjustment angle is greater than the preset threshold, the first movement distance and the first movement direction of the second terminal are determined based on the target adjustment angle, and first prompt information is displayed, where the first prompt information is used to prompt the user of the second terminal to move the second terminal according to the first movement distance and the first movement direction.
  • The prompt method of the first prompt information may be set according to actual needs; for example, the prompt may be given by text, or the direction of movement may be indicated by an arrow and the first movement distance prompted by text, for example, by displaying the first prompt message in the form of a pop-up window.
  • A voice prompt may also be used.
  • the above-mentioned second terminal can also detect the current movement state through a built-in sensor to determine whether the user has moved to a specified position, thereby improving the accuracy of movement.
  • Since the first shooting adjustment message can be sent through the first terminal to prompt the second terminal to move or to control the OIS to adjust the position of the camera, shooting is made more convenient.
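  • The following sketch illustrates the decision described above on the second terminal side: a first input trajectory is mapped to a target adjustment angle, small angles are handed to OIS and large ones become a movement prompt; the trajectory-to-angle mapping and the threshold value are purely illustrative assumptions.

```kotlin
import kotlin.math.atan2

// Illustrative handling of a first shooting adjustment message on the second terminal.
data class Point(val x: Float, val y: Float)

sealed class Adjustment {
    data class OisShift(val angleDeg: Double) : Adjustment()                       // within OIS range
    data class MovePrompt(val distanceCm: Double, val direction: String) : Adjustment()  // needs the user to move
}

fun handleFirstAdjustment(trajectory: List<Point>, thresholdDeg: Double = 5.0): Adjustment {
    val start = trajectory.first()
    val end = trajectory.last()
    // Map the swipe (start -> end) to a target adjustment angle; the scaling constant is illustrative.
    val angleDeg = Math.toDegrees(atan2((end.y - start.y).toDouble(), 1000.0))
    return if (kotlin.math.abs(angleDeg) <= thresholdDeg) {
        Adjustment.OisShift(angleDeg)                       // small angle: shift the lens via OIS
    } else {
        val direction = if (end.y > start.y) "down" else "up"
        Adjustment.MovePrompt(distanceCm = kotlin.math.abs(angleDeg) * 2.0, direction = direction)
    }
}
```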
  • the user of the first terminal can also guide the second terminal to move to improve the pertinence of shooting.
  • Optionally, the method may further include: receiving a second touch input of the user; acquiring a second input trajectory of the second touch input; generating a second shooting adjustment message based on the second input trajectory; and sending the second shooting adjustment message to the second terminal;
  • where the second shooting adjustment message is used by the second terminal to record or take pictures based on the second input trajectory, and the second shooting adjustment message carries the second input trajectory.
  • the above-mentioned second touch input is a sliding input.
  • the user can perform the second touch input on the first preview interface or the second touch input on the second preview interface.
  • the above second input trajectory may be an actual sliding trajectory.
  • The second terminal first receives the above-mentioned second shooting adjustment message, then extracts the second input trajectory from the second shooting adjustment message and determines the second movement direction and the second movement distance of the second terminal based on the second input trajectory, and finally displays second prompt information, where the second prompt information is used to prompt the user of the second terminal to move the second terminal according to the second movement distance and the second movement direction during the recording or photographing process.
  • the user of the second terminal can move according to the second input trajectory, and take pictures or record according to the second input trajectory.
  • the user of the second terminal may be instructed to move the second terminal from top to bottom through the second trajectory, so as to realize the shooting of the specified trajectory.
  • For photos, the second input trajectory can be used to instruct the user to take a picture at the top, then a picture in the middle, and finally a picture at the bottom; for video, the user can be instructed by the second input trajectory to move the second terminal along the trajectory while recording, so that the camera first shoots the top of the building and the shooting picture then gradually moves to the bottom of the building.
  • In the embodiment of the present disclosure, the second shooting adjustment message is sent to the second terminal through the first terminal to prompt the moving distance and moving direction of the second terminal, which can improve the pertinence and convenience of shooting. At the same time, since there is no need to make a call during the shooting process, the impact of the user's speaking on video shooting is avoided.
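  • On the sending side, the second shooting adjustment message essentially packages the user's sliding trajectory; the following hedged sketch shows one possible message shape, which is an assumption rather than the disclosed format.

```kotlin
// Illustrative construction of a second shooting adjustment message on the first terminal.
data class TrackPoint(val x: Float, val y: Float)

data class ShootingAdjustmentMessage(val kind: String, val trajectory: List<TrackPoint>)

class AdjustmentSender(private val send: (ShootingAdjustmentMessage) -> Unit) {
    fun onSecondTouchInput(trajectory: List<TrackPoint>) {
        if (trajectory.size < 2) return   // ignore taps; a sliding input is required
        send(ShootingAdjustmentMessage(kind = "SECOND_ADJUSTMENT", trajectory = trajectory))
    }
}
```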
  • In the embodiments of the present disclosure, in a state where a remote connection for sharing the shooting preview screen has been established with the second terminal, the first terminal receives and displays the first preview interface of the second terminal sent by the second terminal;
  • receives the first input of the user and, in response to the first input, outputs the first target shooting data; where the first target shooting data includes part or all of the image information of the first preview interface, and the first target shooting data is a video or an image.
  • an embodiment of the present disclosure also provides a shooting method, which is applied to the second terminal. As shown in FIG. 13, the method includes:
  • Step 1301: Send the first preview interface of the second terminal to the first terminal in a state where a remote connection for sharing the shooting preview screen has been established with the first terminal.
  • Step 1302: Receive and display a second preview interface of the first terminal sent by the first terminal.
  • the preview interface of the second terminal may be adjusted. Specifically, when adjustment is required, the first terminal may send the first shooting adjustment information to the second terminal.
  • the shooting adjustment information may carry the first input track input by the user on the preview interface of the second terminal.
  • the second terminal may perform on-site adjustment according to the first shooting adjustment information.
  • Optionally, the method further includes: receiving the first shooting adjustment message sent by the first terminal; extracting the first input trajectory from the first shooting adjustment message; determining a target adjustment angle based on the first input trajectory; and, in a case where the target adjustment angle is less than or equal to a preset threshold, controlling the OIS of the second terminal to adjust the position of the camera of the second terminal based on the first input trajectory.
  • After receiving the first shooting adjustment message, the second terminal determines the target adjustment angle according to the first shooting adjustment message.
  • If the target adjustment angle is small and the second terminal has a hardware device capable of small-angle adjustment, such as an OIS optical image stabilization component, the adjustment can be completed automatically and shooting can then be performed. If the adjustment angle is large, manual adjustment is required.
  • Optionally, the method further includes: in a case where the target adjustment angle is greater than the preset threshold, determining the first movement distance and the first movement direction of the second terminal based on the target adjustment angle; and displaying first prompt information;
  • where the first prompt information is used to prompt the user of the second terminal to move the second terminal according to the first movement distance and the first movement direction.
  • the user of the second terminal can move the second terminal according to the first prompt information to adjust the shooting angle to meet the shooting needs of the user of the first terminal.
  • Optionally, the method further includes: receiving the second shooting adjustment message sent by the first terminal; extracting the second input trajectory from the second shooting adjustment message; determining a second movement direction and a second movement distance of the second terminal based on the second input trajectory; and displaying second prompt information;
  • where the second prompt information is used to prompt the user of the second terminal to move the second terminal according to the second movement distance and the second movement direction during recording or photographing.
  • this embodiment is an implementation manner of the second terminal corresponding to the embodiment shown in FIG. 1.
  • FIG. 14 is a structural diagram of a terminal provided by an embodiment of the present disclosure.
  • The terminal is a first terminal among a first terminal and a second terminal that establish a remote connection for sharing a shooting preview screen, as shown in FIG. 14.
  • the terminal 1400 includes:
  • the processing module 1401 is configured to receive and display the first preview interface of the second terminal sent by the second terminal;
  • the first receiving module 1402 is configured to receive the user's first input
  • the output module 1403 is configured to output first target shooting data in response to the first input
  • the first target shooting data includes part or all of image information of the first preview interface, and the first target shooting data is a video or an image.
  • the terminal 1400 further includes:
  • the first display module is used to display the target preview interface
  • the target preview interface is an interface obtained by synthesizing image data of the first preview interface and the second preview interface
  • the second preview interface is a preview interface collected by the camera of the first terminal
  • the first target shooting data is all image data of the target preview interface.
  • the terminal 1400 further includes:
  • the second receiving module is used to receive the second input of the user
  • a first update module configured to update the background of the target area in the target preview interface in response to the second input
  • where the updated background of the target area includes part or all of the background in the first preview interface, or part or all of the background in the second preview interface; the target area is the display area of the image data of the first preview interface or the display area of the image data of the second preview interface.
  • The target preview interface includes a dividing line for distinguishing the image data of the first preview interface from the image data of the second preview interface, and the target preview interface is divided by the dividing line into a first preview sub-region and a second preview sub-region;
  • the terminal 1400 further includes:
  • a third receiving module configured to receive a user's third input to the dividing line
  • the second update module is configured to update the display area of the image data of the first preview sub-region and the second preview sub-region in response to the third input.
  • the terminal 1400 further includes:
  • a second display module configured to display at least two preview interfaces of the second terminal on the first screen
  • the processing module is specifically configured to: receive a user's fourth input to the preview interface of the at least two second terminals; in response to the fourth input, display a first preview of the second terminal selected by the fourth input interface.
  • the terminal 1400 further includes:
  • a fourth receiving module configured to receive N fifth inputs of the user to preview interfaces of N second terminals
  • a third display module configured to respectively display preview interfaces of the N second terminals in response to the N fifth inputs
  • a first obtaining module configured to obtain N shooting data generated based on the preview interfaces of the N second terminals
  • a stitching module is used to stitch the N pieces of shooting data to generate second target shooting data.
  • the first obtaining module includes:
  • the display unit is configured to receive the sixth input of the user when receiving the i-th fifth input of the i-th second terminal and displaying the preview interface of the i-th second terminal;
  • the processing unit is configured to perform a shooting operation in response to the sixth input and generate the i-th shooting data
  • the i-th shooting data is an image or a video
  • i is a positive integer
  • i is less than or equal to N.
  • the terminal 1400 further includes:
  • a fifth receiving module configured to receive the first touch input of the user
  • a second obtaining module configured to obtain a first input track of the first touch input
  • a first message generating module configured to generate a first shooting adjustment message based on the first input trajectory
  • a first sending module configured to send the first shooting adjustment message to the second terminal
  • the first shooting adjustment information is used by the second terminal to adjust the shooting field of view, and the first shooting adjustment message carries the first input trajectory.
  • the terminal 1400 further includes:
  • a sixth receiving module configured to receive the second touch input of the user
  • a third obtaining module configured to obtain a second input track of the second touch input
  • a second message generation module configured to generate a second shooting adjustment message based on the second input trajectory
  • a second sending module configured to send the second shooting adjustment message to the second terminal
  • the second shooting adjustment information is used for the second terminal to record or take pictures based on the second input track, and the second shooting adjustment message carries the second input track.
  • the terminal 1400 can implement various processes implemented by the first terminal in the method embodiments of FIG. 1 to FIG. 12, and to avoid repetition, details are not described herein again.
  • FIG. 15 is a structural diagram of a terminal provided by an embodiment of the present disclosure.
  • The terminal is a second terminal among a first terminal and a second terminal that establish a remote connection for sharing a shooting preview screen, as shown in FIG. 15.
  • the terminal 1500 includes:
  • the third sending module 1501 is configured to send the first preview interface of the second terminal to the first terminal;
  • the seventh receiving module 1502 is configured to receive and display the second preview interface of the first terminal sent by the first terminal.
  • the terminal 1500 further includes:
  • An eighth receiving module configured to receive the first shooting adjustment message sent by the first terminal
  • a first extraction module configured to extract the first input trajectory in the first shooting adjustment message
  • a first determining module configured to determine a target adjustment angle based on the first input trajectory
  • the first control module is configured to control the OIS of the second terminal to adjust the position of the camera of the second terminal based on the first input trajectory when the target adjustment angle is less than or equal to a preset threshold.
  • the terminal 1500 further includes:
  • a second determining module configured to determine the first moving distance and the first moving direction of the second terminal based on the target adjusting angle when the target adjusting angle is greater than a preset threshold
  • the fourth display module is configured to display first prompt information, and the first prompt information is used to prompt a second terminal user to move the second terminal according to the first moving distance and the first moving direction.
  • the terminal 1500 further includes:
  • a ninth receiving module configured to receive the second shooting adjustment message sent by the first terminal
  • a second extraction module configured to extract the second input trajectory in the second shooting adjustment message
  • a third determining module configured to determine a second moving direction and a second moving distance of the second terminal based on the second input trajectory
  • the fifth display module is used to display second prompt information, and the second prompt information is used to prompt the second terminal user to move the second terminal according to the second moving distance and the second moving direction during recording or photographing .
  • the terminal 1500 can implement various processes implemented by the second terminal in the method embodiment of FIG. 13. To avoid repetition, details are not described here.
  • The terminal 1600 includes but is not limited to: a radio frequency unit 1601, a network module 1602, an audio output unit 1603, an input unit 1604, a sensor 1605, a display unit 1606, a user input unit 1607, an interface unit 1608, a memory 1609, a processor 1610, a power supply 1611 and other components.
  • the terminal structure shown in FIG. 16 does not constitute a limitation on the terminal, and the terminal may include more or less components than those illustrated, or combine some components, or arrange different components.
  • the terminals include but are not limited to mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, pedometers, and the like.
  • The processor 1610 is configured to: receive and display the first preview interface of the second terminal sent by the second terminal in a state where a remote connection for sharing the shooting preview screen has been established with the second terminal; receive a first input of the user; and, in response to the first input, output first target shooting data;
  • the first target shooting data includes part or all of image information of the first preview interface, and the first target shooting data is a video or an image.
  • Optionally, before receiving the first input of the user, the processor 1610 is further configured to display a target preview interface, where:
  • the target preview interface is an interface obtained by synthesizing image data of the first preview interface and the second preview interface
  • the second preview interface is a preview interface collected by the camera of the first terminal
  • the first target shooting data is all image data of the target preview interface.
  • Optionally, the processor 1610 is further configured to: receive a second input of the user; and, in response to the second input, update the background of a target area in the target preview interface;
  • where the updated background of the target area includes part or all of the background in the first preview interface, or part or all of the background in the second preview interface; the target area is the display area of the image data of the first preview interface or the display area of the image data of the second preview interface.
  • The target preview interface includes a dividing line for distinguishing the image data of the first preview interface from the image data of the second preview interface, and the target preview interface is divided by the dividing line into a first preview sub-region and a second preview sub-region;
  • Optionally, the processor 1610 is further configured to: receive a third input of the user to the dividing line; and, in response to the third input, update the display areas of the image data of the first preview sub-region and the second preview sub-region.
  • Optionally, the first terminal includes a first screen and a second screen; before displaying the preview interface of the second terminal, the processor 1610 is further configured to display preview interfaces of at least two second terminals on the first screen.
  • The displaying of the preview interface of the second terminal includes: receiving a fourth input of the user to the preview interfaces of the at least two second terminals; and, in response to the fourth input, displaying the first preview interface of the second terminal selected by the fourth input.
  • Optionally, the processor 1610 is further configured to: receive N fifth inputs of the user to preview interfaces of N second terminals; in response to the N fifth inputs, respectively display the preview interfaces of the N second terminals; acquire N pieces of shooting data generated based on the preview interfaces of the N second terminals; and stitch the N pieces of shooting data to generate second target shooting data.
  • Optionally, the processor 1610 is specifically configured to: in a case where the i-th fifth input to the i-th second terminal is received and the preview interface of the i-th second terminal is displayed, receive a sixth input of the user; and, in response to the sixth input, perform a shooting operation to generate the i-th piece of shooting data; where:
  • the i-th shooting data is an image or a video
  • i is a positive integer
  • i is less than or equal to N.
  • Optionally, the processor 1610 is further configured to: receive a first touch input of the user; acquire a first input trajectory of the first touch input; generate a first shooting adjustment message based on the first input trajectory; and send the first shooting adjustment message to the second terminal;
  • where the first shooting adjustment message is used by the second terminal to adjust the shooting field of view, and the first shooting adjustment message carries the first input trajectory.
  • Optionally, the processor 1610 is further configured to: receive a second touch input of the user; acquire a second input trajectory of the second touch input; generate a second shooting adjustment message based on the second input trajectory; and send the second shooting adjustment message to the second terminal;
  • where the second shooting adjustment message is used by the second terminal to record or take pictures based on the second input trajectory, and the second shooting adjustment message carries the second input trajectory.
  • In the embodiments of the present disclosure, in a state where a remote connection for sharing the shooting preview screen has been established with the second terminal, the first terminal receives and displays the first preview interface of the second terminal sent by the second terminal;
  • receives the first input of the user and, in response to the first input, outputs the first target shooting data; where the first target shooting data includes part or all of the image information of the first preview interface, and the first target shooting data is a video or an image.
  • the remote shooting function can be realized.
  • The radio frequency unit 1601 may be used to receive and send signals during information transmission and reception or during a call; specifically, downlink data received from a base station is forwarded to the processor 1610 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 1601 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 1601 can also communicate with the network and other devices through a wireless communication system.
  • the terminal provides users with wireless broadband Internet access through the network module 1602, such as helping users send and receive e-mail, browse web pages, and access streaming media.
  • the audio output unit 1603 may convert the audio data received by the radio frequency unit 1601 or the network module 1602 or stored in the memory 1609 into an audio signal and output as sound. Moreover, the audio output unit 1603 may also provide audio output related to a specific function performed by the terminal 1600 (eg, call signal reception sound, message reception sound, etc.).
  • the audio output unit 1603 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 1604 is used to receive audio or video signals.
  • the input unit 1604 may include a graphics processing unit (GPU) 16041 and a microphone 16042; the graphics processor 16041 processes the image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame may be displayed on the display unit 1606.
  • the image frame processed by the graphics processor 16041 may be stored in the memory 1609 (or other storage medium) or sent via the radio frequency unit 1601 or the network module 1602.
  • the microphone 16042 can receive sound, and can process such sound into audio data.
  • in a telephone call mode, the processed audio data can be converted into a format that can be sent to the mobile communication base station via the radio frequency unit 1601 and output.
  • the terminal 1600 further includes at least one sensor 1605, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 16061 according to the ambient light, and the proximity sensor can turn off the display panel 16061 and/or the backlight when the terminal 1600 is moved to the ear;
  • as one kind of motion sensor, the accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and the magnitude and direction of gravity when the terminal is at rest, and can be used to recognize the terminal posture (such as portrait/landscape switching, related games, magnetometer attitude calibration) and vibration-related functions (such as pedometer, tap detection); a small posture-recognition sketch follows below;
  • the sensor 1605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
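A minimal sketch of how an accelerometer gravity reading could drive portrait/landscape recognition. The thresholds and the posture labels are assumptions for illustration, not values taken from the disclosure.

```python
import math

def classify_posture(gx: float, gy: float, gz: float) -> str:
    """Classify a rough terminal posture from a gravity reading (m/s^2)."""
    g = math.sqrt(gx * gx + gy * gy + gz * gz)
    if g < 1e-6:
        return "unknown"
    # Gravity mostly along the device's y axis -> held upright (portrait).
    if abs(gy) / g > 0.8:
        return "portrait"
    if abs(gx) / g > 0.8:
        return "landscape"
    return "flat"

# Example: a device lying flat on a table reports gravity mostly along z.
print(classify_posture(0.2, 0.3, 9.7))  # -> "flat"
```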
  • the display unit 1606 is used to display information input by the user or information provided to the user.
  • the display unit 1606 may include a display panel 16061, and the display panel 16061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
  • the user input unit 1607 can be used to receive input numeric or character information, and generate key signal input related to user settings and function control of the terminal.
  • the user input unit 1607 includes a touch panel 16071 and other input devices 16072.
  • the touch panel 16071, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user with a finger, a stylus, or any other suitable object or accessory on or near the touch panel 16071).
  • the touch panel 16071 may include a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position and the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1610, and receives and executes commands sent by the processor 1610;
  • the touch panel 16071 may be implemented with various types of technologies such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 1607 may further include other input devices 16072.
  • other input devices 16072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which are not repeated here.
  • the touch panel 16071 may be overlaid on the display panel 16061; when the touch panel 16071 detects a touch operation on or near it, it reports the operation to the processor 1610 to determine the type of the touch event, and the processor 1610 then provides a corresponding visual output on the display panel 16061 according to the type of the touch event, as sketched below;
  • although the touch panel 16071 and the display panel 16061 are described here as two independent components implementing the input and output functions of the terminal, in some embodiments the touch panel 16071 and the display panel 16061 may be integrated to implement the input and output functions of the terminal, which is not limited here.
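A rough sketch of classifying a raw touch into an event type before the processor drives a visual output. The event names, thresholds, and responses are assumptions for illustration; the disclosure only says the processor determines the type of touch event and updates the display accordingly.

```python
from enum import Enum, auto

class TouchEvent(Enum):
    TAP = auto()
    LONG_PRESS = auto()
    SWIPE = auto()

def classify_touch(duration_ms: float, travel_px: float) -> TouchEvent:
    """Classify a raw touch into an event type."""
    if travel_px > 30:
        return TouchEvent.SWIPE
    if duration_ms > 500:
        return TouchEvent.LONG_PRESS
    return TouchEvent.TAP

def visual_output_for(event: TouchEvent) -> str:
    # A stand-in for the display update the processor would trigger.
    return {
        TouchEvent.TAP: "highlight the tapped control",
        TouchEvent.LONG_PRESS: "show a context menu",
        TouchEvent.SWIPE: "scroll or drag the preview",
    }[event]

print(visual_output_for(classify_touch(duration_ms=120, travel_px=4)))
```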
  • the interface unit 1608 is an interface for connecting an external device to the terminal 1600.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 1608 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal 1600, or may be used to transmit data between the terminal 1600 and an external device.
  • the memory 1609 can be used to store software programs and various data.
  • the memory 1609 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required by at least one function (such as a sound playback function, an image playback function, etc.), and the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.);
  • the memory 1609 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 1610 is the control center of the terminal; it connects all parts of the entire terminal through various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing the software programs and/or modules stored in the memory 1609 and calling the data stored in the memory 1609, thereby monitoring the terminal as a whole;
  • the processor 1610 may include one or more processing units; optionally, the processor 1610 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication; it is understandable that the modem processor may alternatively not be integrated into the processor 1610.
  • the terminal 1600 may further include a power supply 1611 (such as a battery) that supplies power to the various components; optionally, the power supply 1611 may be logically connected to the processor 1610 through a power management system, so that functions such as charge management, discharge management, and power consumption management are implemented through the power management system.
  • the terminal 1600 includes some function modules not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides a terminal, including a processor 1610, a memory 1609, and a computer program stored in the memory 1609 and executable on the processor 1610; when the computer program is executed by the processor 1610, the processes of the above shooting method embodiments are implemented and the same technical effects can be achieved, which are not repeated here to avoid repetition.
  • the terminal 1700 includes, but is not limited to, a radio frequency unit 1701, a network module 1702, an audio output unit 1703, an input unit 1704, a sensor 1705, a display unit 1706, a user input unit 1707, an interface unit 1708, a memory 1709, a processor 1710, a power supply 1711, and other components;
  • the terminal structure shown in FIG. 17 does not constitute a limitation on the terminal; the terminal may include more or fewer components than shown, combine some components, or use a different arrangement of components.
  • the terminals include but are not limited to mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, pedometers, and the like.
  • the processor 1710 is configured to: in a state where a remote connection for sharing the shooting preview screen has been established with the first terminal, send the first preview interface of the second terminal to the first terminal; and receive and display the second preview interface of the first terminal sent by the first terminal.
  • the processor 1710 is further configured to: receive a first shooting adjustment message sent by the first terminal; extract the first input trajectory from the first shooting adjustment message; determine a target adjustment angle based on the first input trajectory; and, when the target adjustment angle is less than or equal to a preset threshold, control the OIS of the second terminal to adjust the position of the camera of the second terminal based on the first input trajectory.
  • the processor 1710 is further configured to: when the target adjustment angle is greater than the preset threshold, determine a first movement distance and a first movement direction of the second terminal based on the target adjustment angle, and display first prompt information;
  • the first prompt information is used to prompt the user of the second terminal to move the second terminal according to the first movement distance and the first movement direction.
  • the processor 1710 is further configured to: receive a second shooting adjustment message sent by the first terminal; extract the second input trajectory from the second shooting adjustment message; determine a second movement direction and a second movement distance of the second terminal based on the second input trajectory; and display second prompt information;
  • the second prompt information is used to prompt the user of the second terminal to move the second terminal according to the second movement distance and the second movement direction while recording video or taking pictures; a decision sketch follows below.
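A minimal sketch of how the second terminal might react to a first shooting adjustment message: small adjustments are handled by shifting the OIS lens, larger ones by prompting the user to move. The angle-per-pixel mapping, the threshold value, and the prompt wording are assumptions for illustration; the disclosure only distinguishes the two cases by a preset threshold.

```python
import math

OIS_THRESHOLD_DEG = 3.0  # assumed preset threshold; the disclosure does not fix a value

def handle_adjustment(trajectory: list[tuple[float, float]],
                      degrees_per_pixel: float = 0.02) -> str:
    """Decide how the second terminal reacts to a shooting adjustment trajectory."""
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0
    angle = math.hypot(dx, dy) * degrees_per_pixel
    if angle <= OIS_THRESHOLD_DEG:
        return f"shift OIS lens by ({dx:.0f}, {dy:.0f}) px equivalent ({angle:.1f} deg)"
    direction = "right" if dx > 0 else "left"
    distance_cm = angle * 2.0  # assumed coarse mapping from angle to movement distance
    return f"prompt user: move about {distance_cm:.0f} cm to the {direction}"

print(handle_adjustment([(540.0, 960.0), (900.0, 960.0)]))
```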
  • in the embodiments of the present disclosure, the first terminal, in a state where a remote connection for sharing the shooting preview screen has been established with the second terminal, receives and displays the first preview interface sent by the second terminal; receives a first input from the user; and in response to the first input, outputs first target shooting data, where the first target shooting data includes part or all of the image information of the first preview interface and is a video or an image; in this way, a remote shooting function can be realized.
  • the radio frequency unit 1701 may be used to receive and send signals while sending and receiving information or during a call; specifically, downlink data from the base station is received and then passed to the processor 1710 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 1701 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 1701 can also communicate with the network and other devices through a wireless communication system.
  • the terminal provides users with wireless broadband Internet access through the network module 1702, such as helping users send and receive e-mail, browse web pages, and access streaming media.
  • the audio output unit 1703 may convert the audio data received by the radio frequency unit 1701 or the network module 1702 or stored in the memory 1709 into an audio signal and output as sound. Moreover, the audio output unit 1703 may also provide audio output related to a specific function performed by the terminal 1700 (eg, call signal reception sound, message reception sound, etc.).
  • the audio output unit 1703 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 1704 is used to receive audio or video signals.
  • the input unit 1704 may include a graphics processing unit (GPU) 17041 and a microphone 17042; the graphics processor 17041 processes the image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame may be displayed on the display unit 1706.
  • the image frame processed by the graphics processor 17041 may be stored in the memory 1709 (or other storage medium) or sent via the radio frequency unit 1701 or the network module 1702.
  • the microphone 17042 can receive sound, and can process such sound into audio data.
  • in a telephone call mode, the processed audio data can be converted into a format that can be sent to the mobile communication base station via the radio frequency unit 1701 and output.
  • the terminal 1700 further includes at least one sensor 1705, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 17061 according to the ambient light, and the proximity sensor can turn off the display panel 17061 and/or the backlight when the terminal 1700 is moved to the ear;
  • as one kind of motion sensor, the accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and the magnitude and direction of gravity when the terminal is at rest, and can be used to recognize the terminal posture (such as portrait/landscape switching, related games, magnetometer attitude calibration) and vibration-related functions (such as pedometer, tap detection); the sensor 1705 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
  • the display unit 1706 is used to display information input by the user or information provided to the user.
  • the display unit 1706 may include a display panel 17061, and the display panel 17061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
  • the user input unit 1707 can be used to receive input numeric or character information, and generate key signal input related to user settings and function control of the terminal.
  • the user input unit 1707 includes a touch panel 17071 and other input devices 17072.
  • the touch panel 17071, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user with a finger, a stylus, or any other suitable object or accessory on or near the touch panel 17071).
  • the touch panel 17071 may include a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position and the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1710, and receives and executes commands sent by the processor 1710;
  • the touch panel 17071 may be implemented with various types of technologies such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 1707 may include other input devices 17072.
  • other input devices 17072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be repeated here.
  • the touch panel 17071 may be overlaid on the display panel 17061; when the touch panel 17071 detects a touch operation on or near it, it reports the operation to the processor 1710 to determine the type of the touch event, and the processor 1710 then provides a corresponding visual output on the display panel 17061 according to the type of the touch event;
  • although the touch panel 17071 and the display panel 17061 are described here as two independent components implementing the input and output functions of the terminal, in some embodiments the touch panel 17071 and the display panel 17061 may be integrated to implement the input and output functions of the terminal, which is not limited here.
  • the interface unit 1708 is an interface for connecting an external device to the terminal 1700.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 1708 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal 1700, or may be used to transmit data between the terminal 1700 and an external device.
  • the memory 1709 can be used to store software programs and various data.
  • the memory 1709 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required by at least one function (such as a sound playback function, an image playback function, etc.), and the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.);
  • the memory 1709 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 1710 is the control center of the terminal; it connects all parts of the entire terminal through various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing the software programs and/or modules stored in the memory 1709 and calling the data stored in the memory 1709, thereby monitoring the terminal as a whole;
  • the processor 1710 may include one or more processing units; optionally, the processor 1710 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication; it is understandable that the modem processor may alternatively not be integrated into the processor 1710.
  • the terminal 1700 may further include a power supply 1711 (such as a battery) that supplies power to the various components; optionally, the power supply 1711 may be logically connected to the processor 1710 through a power management system, so that functions such as charge management, discharge management, and power consumption management are implemented through the power management system.
  • the terminal 1700 includes some function modules not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides a terminal, including a processor 1710, a memory 1709, and a computer program stored in the memory 1709 and executable on the processor 1710; when the computer program is executed by the processor 1710, the processes of the above shooting method embodiments are implemented and the same technical effects can be achieved, which are not repeated here to avoid repetition.
  • an embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the processes of the above shooting method embodiments are implemented and the same technical effects can be achieved, which are not repeated here to avoid repetition;
  • the computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Abstract

本公开提供一种拍摄方法及终端,其中,该拍摄方法应用于第一终端,所述第一终端与第二终端通过网络建立相机共享连接,该方法包括:在与第二终端已建立拍摄预览画面共享的远程连接的状态下,接收所述第二终端发送的所述第二终端的第一预览界面并显示;接收用户的第一输入;响应于所述第一输入,输出第一目标拍摄数据;其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。

Description

拍摄方法及终端
相关申请的交叉引用
本申请主张在2018年11月28日在中国提交的中国专利申请号No.201811433213.8的优先权,其全部内容通过引用包含于此。
技术领域
本公开实施例涉及通信技术领域,尤其涉及一种拍摄方法及终端。
背景技术
随着移动终端的相机拍照技术不断发展,通过移动终端进行拍摄照片或者视频已成为移动终端重要的功能之一。相关技术中的拍摄方式通常需要点击移动终端上拍摄控件进行拍摄控制,或者借助自拍杆等外部工具进行拍摄控制。这种拍摄方式受到了移动终端与拍摄者的距离限制,无法实现远程拍摄控制。
发明内容
本公开实施例提供一种拍摄方法及终端,以解决相关技术中无法实现远程拍摄控制的问题。
为了解决上述技术问题,本公开是这样实现的:
第一方面,本公开实施例提供了一种拍摄方法,应用于第一终端,包括:
在与第二终端已建立拍摄预览画面共享的远程连接的状态下,接收所述第二终端发送的所述第二终端的第一预览界面并显示;
接收用户的第一输入;
响应于所述第一输入,输出第一目标拍摄数据;
其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。
第二方面,本公开实施例还提供了一种拍摄方法,应用于第二终端,包括:
在与第一终端已建立拍摄预览画面共享的远程连接的状态下,将所述第二终端的第一预览界面发送至所述第一终端;
接收所述第一终端发送的所述第一终端的第二预览界面并显示。
第三方面,本公开实施例还提供一种终端,
第四方面,本公开实施例还提供一种终端,所述终端为建立拍摄预览画面共享的远程连接的第一终端和第二终端中的第二终端,包括:
第三发送模块,用于将所述第二终端的第一预览界面发送至所述第一终端;
第七接收模块,用于接收所述第一终端发送的所述第一终端的第二预览界面并显示。
第五方面,本公开实施例还提供另一种终端,包括处理器,存储器,存储在所述存储器上并可在所述处理器上运行的计算机程序,所述计算机程序被所述处理器执行时实现上述拍摄方法的步骤。
第六方面,本公开实施例还提供一种计算机可读存储介质,所述计算机可读存储介质上存储有计算机程序,所述计算机程序被处理器执行时实现上述拍摄方法的步骤。
本公开实施例中,第一终端在与第二终端已建立拍摄预览画面共享的远程连接的状态下,接收所述第二终端发送的所述第二终端的第一预览界面并显示;接收用户的第一输入;响应于所述第一输入,输出第一目标拍摄数据;其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。这样,由于在第一终端上显示第二终端的第一预览画面,并在第一终端上拍摄,从而可以实现远程拍摄功能。
附图说明
图1是本公开实施例提供的一种拍摄方法的流程图;
图2是本公开实施例提供的显示界面示意图之一;
图3是本公开实施例提供的显示界面示意图之二;
图4是本公开实施例提供的显示界面示意图之三;
图5是本公开实施例提供的显示界面示意图之四;
图6是本公开实施例提供的显示界面示意图之五;
图7是本公开实施例提供的显示界面示意图之六;
图8是本公开实施例提供的显示界面示意图之七;
图9是本公开实施例提供的显示界面示意图之八;
图10是本公开实施例提供的显示界面示意图之九;
图11是本公开实施例提供的显示界面示意图之十;
图12是本公开实施例提供的显示界面示意图之十一;
图13是本公开实施例提供的另一种拍摄方法的流程图;
图14是本公开实施例提供的一种终端的结构图;
图15是本公开实施例提供的另一种终端的结构图;
图16是本公开实施例提供的另一种终端的结构图;
图17是本公开实施例提供的另一种终端的结构图。
具体实施方式
下面将结合本公开实施例中的附图,对本公开实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本公开一部分实施例,而不是全部的实施例。基于本公开中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获取的所有其他实施例,都属于本公开保护的范围。
参见图1,图1是本公开实施例提供的一种拍摄方法的流程图。该拍摄方法,应用于第一终端,如图1所示,该拍摄方法,包括:
步骤101,在与第二终端已建立拍摄预览画面共享的远程连接的状态下,接收所述第二终端发送的所述第二终端的第一预览界面并显示;
本公开实施例中,上述第一终端与第二终端通过网络建立相机共享连接的方式可以根据实际需要进行设置。在一实施例中,可以通过相机应用本身与第二终端的建立相机共享连接,例如,在相机应用的界面上设置有共享控件,通过点击共享控件添加共享的第二终端,并向第二终端发送共享请求,从而实现相机共享;在另一实施例中,还可以通过社交平台进行相机共享,例如,在社交平台的聊天界面中设置有相机共享控件,通过点击该相机共享控件,使得用户与社交平台中的至少一个联系人对应的第二终端建立相机共 享连接。具体的,在一可选实施例中,若当前位于单人聊天界面中,可以建立与该聊天界面中联系人对应的终端建立相机共享连接;若当前位于群组聊天界面中,则可以建立与该群组中的至少一个联系人对应的终端的相机共享连接。也就是说,上述第一终端可以与一个或者多个第二终端建立相机共享连接。
在第一终端与第二终端建立相机共享连接后,第一终端可以实时获取并显示第二终端采集的第一预览界面。应理解,在本实施例中,上述第一终端上具有摄像头,第一终端可以获取并显示第一终端的摄像头采集的第二预览界面。具体的,第一终端显示第一预览界面和第二预览界面的方式可以如图2至6所示,其中,当第一终端为单屏幕的终端时,如图2所示,第一终端可以显示第一预览界面,该第一预览界面包括第一人像画面11和第一背景12;在接收到切换命令后,可以显示第二预览界面,如图3所示,该第二预览界面包括第二人像画面21和第二背景22。当第一终端为折叠屏时,如图4所示,可以在一屏幕上显示第一预览界面,在另一屏幕上显示第二预览界面。当第一终端为多面屏时,如图5所示,可以在一屏幕上显示第一预览界面,在另一屏幕上显示第二预览界面。
步骤102,接收用户的第一输入;
本实施例中,该第一输入用于控制第一终端进行拍摄视频或者拍摄照片,用户可以通过拍摄按钮、语音或手势等方式进行第一输入,在此不做进一步的限定。
步骤103,响应于所述第一输入,输出第一目标拍摄数据;
其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。
本实施例中,第一终端响应该第一输入,可以直接针对所述第一预览界面的部分或全部图像信息进行视频录制或者拍照处理,可以输出第一目标拍摄数据,该第一目标拍摄数据可以储存在第一终端的缓存中,通过用户的保存操作保存到本地;也可以直接保存到本地。此外第一终端还可以向第二终端输出拍摄指令,以控制第二终端的摄像头执行拍摄动作,并实时更新第一终端的第一预览界面。当达到满足拍摄条件时,第一终端可以针对当前的第 一预览界面的部分或全部图像信息进行视频录制或者拍照处理。其中,第二终端可以在本地保存本次拍摄动作产生的拍摄数据,该拍摄数据可以保存在本地。
应当说明的是,上述第一目标拍摄数据可以包括第一预览界面的部分或者全图像信息,当第一目标拍摄数据包括第一预览界面的全部图像信息时,是通过第一终端对第二终端拍摄的图像进行拍摄。当第一目标拍摄数据包括第一预览界面的部分图像信息时,是将第一预览界面和第二预览界面进行拼接合成后的图像进行拍摄,以下对此进行详细说明。
具体的,本实施例中,在上述步骤102之前,该方法还包括:
显示目标预览界面;
其中,所述目标预览界面为所述第一预览界面和第二预览界面的图像数据进行合成处理后得到的界面,所述第二预览界面为所述第一终端的摄像头采集的预览界面;所述第一目标拍摄数据为所述目标预览界面的所有图像数据。
本实施例中,上述目标预览界面可以包括第一预览界面的部分图像数据以及第二预览界面的部分图像数据。例如,第一预览界面可以包括第一人像画面11和第一背景12;上述第二预览界面可以包括第二人像画面21和第二背景22。上述目标预览界面包括但不限于以下任一种方式:
方式1,第二人像画面21以及第一背景12;
方式2,第一人像画面11、第二人像画面21以及第一背景12;
方式3,第一人像画面11、第二人像画面21以及第二背景22;
方式4,第一人像画面11、第二人像画面21、部分第一背景12以及部分第二背景22。
由于在本实施例中,可以将第一预览界面和第二预览界面进行拼接合成为目标预览界面,然后对目标预览界面进行拍摄处理,得到第一目标拍摄数据,这样,提高了拍摄的灵活性。
进一步的,在本实施例中,还可以对目标预览界面中的背景进行更新,具体的,在上述显示目标预览界面之后,所述方法还包括:
接收用户的第二输入;
响应于所述第二输入,更新所述目标预览界面中目标区域的背景;
其中,更新后的所述目标区域的背景包括第一预览界面中的部分或全部背景,或者包括第二预览界面中的部分或全部背景;所述目标区域为第一预览界面的图像数据的显示区或所述第二预览界面的图像数据的显示区。
应当说明的是,对目标预览界面中的目标区域的背景进行更新是指对该目标区域的背景进行切换,例如,上述目标预览界面的背景可以设置为第一背景12,也可以设置为第二背景22;本实施例中,可以通过上述第一输入,对目标区域的背景进行切换,例如可以将第一背景12切换为第二背景22,也可以将第二背景22切换为第一背景12。由于可以对目标区域的背景进行切换,从而进一步提高拍摄的灵活性。
为了更好的理解本公开,针对上述方式1至方式4对背景更新的方案进行详细说明。
例如,在一实施例中,默认为单人拍摄模式,上述目标预览界面如图6所示,包括第二人像画面21以及第一背景12。用户可以点击拍摄模式切换,切换到合影拍摄模式(例如,点击图中左上角的控件B),从而显示如图7所示的目标预览界面。此时,上述目标区域为整个显示界面,用户可以通过上述第二输入,更新目标区域的显示界面,具体的,在图7所示的状态下,用户可以点击第一背景12,从而将背景切换为第二背景21,此时显示如图8所示的目标预览界面。
本实施例中,上述第二输入的输入方式可以根据实际情况进行设置,例如可以直接点击目标区域,还可以设置操作控件,或者通过操作手势等方式实现第二输入。
进一步的,上述目标预览界面还可以包括用于区分所述第一预览界面的图像数据和所述第二预览界面的图像数据的分割线,所述目标预览界面被所述分割线划分为第一预览子区域和第二预览子区域;
所述显示目标预览界面之后,所述方法还包括:
接收用户对所述分割线的第三输入;
响应于所述第三输入,更新所述第一预览子区域和所述第二预览子区域的图像数据的显示面积。
如图9所示,上述分割线为图9所示的虚线,在默认状态下,上述分割线位于屏幕的中间,在宽度方向上将目标预览界面分割为面积相等的第一预览子区域和第二预览子区域。用户可以通过拖拽操作调整第一分割线的位置,从而调整第一预览子区域和第二预览子区域的图像数据的面积,具体的,调整之后的目标预览界面如图10所示。这样,可以根据用户不同的喜好设置不同的分割方式,从而使得对背景的分割方式更灵活。
应当说明的是,上述第二终端的数量可以有一个或者多个,例如,在本实施例中,上述第二终端具体至少为两个,上述第一终端可以包括第一屏和第二屏,在显示所述第二终端的预览界面之前,该方法还包括
在第一屏上显示至少两个第二终端的预览界面;
该显示所述第二终端的预览界面包括:
接收用户对所述至少两个第二终端的预览界面的第四输入;
响应于所述第四输入,显示所述第四输入选取的第二终端的第一预览界面。
本实施例中,第一终端的用户可以通过社交软件直接与多个第二终端建立拍摄预览画面共享的远程连接,在建立连接可以,可以通过第一屏显示所有的第二终端的预览界面。具体如图11所示,上述第一屏可以显示所有第二终端的预览界面以及第一预览界面。上述第四输入可以为对第二终端的预览界面的触控输入。当用户需要采用哪个预览界面作为第一预览界面进行显示时,可以直接点击相应的预览界面。此外,还可以在第一屏上显示切换控件A,通过切换控件A实现对第一预览界面的切换。本实施例中,由于通过在第一屏上显示至少两个第二终端的预览界面,并通过第四输入可以选择第一预览界面,这样可以实现多个终端同时进行拍摄预览画面共享,从而方便用户操作。
进一步的,在一可选实施例中,还可以对每一第二终端进行拍摄后,进行拍摄的数据进行拼接。例如,本实施例中,在第一屏上显示至少两个第二终端的预览界面之后,该方法还包括:
接收用户对N个第二终端的预览界面的N次第五输入;
响应于所述N次第五输入,分别显示所述N个第二终端的预览界面;
获取基于所述N个第二终端的预览界面生成的N个拍摄数据;
将所述N个拍摄数据进行拼接,生成第二目标拍摄数据。
本实施例中,上述第五输入可以是对第二终端的预览界面进行触控操作的输入。上述N为大于1的整数,具体的,该N的大小可以预先设置,例如可以为2或者4,也就是说用户在执行固定次数的第五输入后,将会响应该N次的第五输入,在第二屏上分别显示N个第二终端的预览界面。此外,上述N也可以不是固定的,例如可以在第M次进行第五输入后,开始计时,判断在预设时间段内是否有第五输入,若接收到第五输入,则重新计时;若没有接收到第五输入,将M确定为N,并响应N次第五输入。在其他实施例中,上述N还可以为预设时间段内接收到第五输入的次数。
其中,上述第五输入可以为对第二终端的触控输入,例如可以是对第二终端的预览界面进行点击的操作。显示第二终端的预览界面的显示方式可以根据实际情况进行设置,例如,在一实施例中,可以每次进行第五输入后,显示当前点击的第二终端的预览界面,在显示第i次第五输入后,可以对该第i次第五输入所对应的第二终端的预览界面进行拍摄,获得第i拍摄数据。具体的,在本实施例中,上述获取基于所述N个第二终端的预览界面生成的N个拍摄数据,包括:
在接收用户对第i个第二终端的第i次第五输入,并显示所述第i个第二终端的预览界面的情况下,接收用户的第六输入;
响应于所述第六输入,执行拍摄操作,生成第i拍摄数据;
其中,所述第i拍摄数据为图像或视频,i为正整数,i小于等于N。
本实施例中,上述第六输入用于控制第一终端进行拍摄视频或者拍摄照片,用户可以通过拍摄按钮、语音或手势等方式进行第六输入,在此不做进一步的限定。由于本实施例中,在每一次进行第五输入后,通过第六输入对第二终端的预览界面进行拍摄,从而可以提高拍摄的针对性,保证拍摄效果。
应当说明的是,在其他实施例中,对于第二终端的预览界面的显示以及拍摄过程,还可以其他实时方式,例如可以是:用户每一次点击第二终端的预览界面,可以在第二屏上增加一预览界面的显示,该增加的预览界面被点击的第二终端的预览界面。在进行N次第五输入后,对每一预览界面进行拍 照或者拍摄视频从而得到N个拍摄数据。还可以是:每一第二终端的预览界面显示预设时长,然后切换到下一预览界面,在每一预览界面显示的过程中,将会生成对应的拍摄数据,进而得到N个拍摄数据。最后将N个拍摄数据进行拼接。
本公开实施例中,第一终端在与多个第二终端建立拍摄预览画面共享的情况下,可以分别拍摄用户A(第二终端的用户)的预览界面得到第一照片和用户B(第二终端的用户)的预览界面得到第二照片,然后将两次拍摄的照片进行拼接,得到最终的照片。由于可以进行多次拍摄得到多个拍摄数据,并对拍摄数据进行拼接,其操作方便,任一终端可获得多个其他端的拍摄的图像经过合成得到的图像。此外用户可以仅保存第二目标拍摄数据,这样可以降低内存消耗。
进一步的,本公开实施例中,还可以通过第一终端对第二终端的拍摄现场进行调整。具体的,在上述步骤101之后,还包括:
接收用户的第一触控输入;
获取所述第一触控输入的第一输入轨迹;
基于所述第一输入轨迹,生成第一拍摄调整消息;
将所述第一拍摄调整消息发送至所述第二终端;
其中,所述第一拍摄调整信息用于所述第二终端调整拍摄视场,所述第一拍摄调整消息携带有所述第一输入轨迹。
本实施例中,上述第一触控输入为滑动输入,具体的,用户可以在第一预览界面上进行第一触控输入,也可以在第二预览界面上进行第一触控输入。上述第一输入轨迹可以为实际滑动轨迹,也可以是从起点到终点的直线轨迹。如图12所示,当第一终端的用户需要调整第二终端的拍摄角度时,可以在第一终端上进行第一触控输入,第一终端将会根据第一触控输入的第一输入轨迹生成第一拍摄调整消息,然后将该第一拍摄调整消息发送给第二终端。
具体的,第二终端首先接收第一终端发送的第一拍摄调整消息;然后提取所述第一拍摄调整消息中的第一输入轨迹;并基于所述第一输入轨迹,确定目标调整角度;根据调整角度的大小,第二终端可以执行不同的操作。
例如,一实施例中,在所述目标调整角度小于或等于预设阈值的情况下, 基于所述第一输入轨迹,调整所述第二终端的摄像头镜头的位置。
本实施例中,上述摄像头的镜头可以安装在位置调整组件上,该位置调整组件可以控制镜头在一定范围内平移,例如该位置调整组件可以为光学防抖组件(Optical image stabilization,OIS),通过OIS可以控制摄像进行平移可以实现光学防抖。在本实施例中可以通过OIS调整镜头的位置,从而改变拍摄角度。需要说明的是,OIS仅可以控制摄像头的镜头进行较小范围的移动,上述预设阈值与OIS控制摄像头的镜头移动的范围相关,具体大小在此不做进一步的说明。由于在本实施例中,第一终端的用户可以通过第一拍摄调整消息实现对第二终端远程对OIS进行控制,以调整第二终端的拍摄角度,从而可以自动完成拍摄操作,简化了拍摄难度。
另一实施例中,在所述目标调整角度大于预设阈值的情况下,基于目标调整角度,确定第二终端的第一移动距离和第一移动方向;显示第一提示信息,所述第一提示信息用于提示第二终端用户按照所述第一移动距离和第一移动方向移动所述第二终端。
本实施例中,上述第一提示信息的提示方式可以根据实际需要进行设置,例如可以通过文字进行提示,也可以通过箭头指示移动方向,通过文字提示第一移动距离,例如可以通过弹窗的形式显示第一提示信息。此外,还可以进行语言提示。上述第二终端还可以通过内置的传感器检测当前的移动状态,判断用户是否移动到指定位置,从而提高移动的准确性。
在本实施例中,由于可以通过第一终端发送第一拍摄调整信息,对第二拍摄终端的移动进行提醒或者控制OIS对摄像头的位置进行调整,从而可以方便拍摄。
进一步的,除了可以对第二终端的拍摄现场进行调整以外,还可以由第一终端的用户指导第二终端进行移动,以提高拍摄针对性。具体的,本实施例中,在上述步骤101之后,该方法还可以包括:
接收用户的第二触控输入;
获取所述第二触控输入的第二输入轨迹;
基于所述第二输入轨迹,生成第二拍摄调整消息;
将所述第二拍摄调整消息发送至所述第二终端;
其中,所述第二拍摄调整信息用于所述第二终端基于所述第二输入轨迹录像或者拍照,所述第二拍摄调整消息携带有所述第二输入轨迹。
本实施例中,上述第二触控输入为滑动输入,具体的,用户可以在第一预览界面上进行第二触控输入,也可以在第二预览界面上进行第二触控输入。上述第二输入轨迹可以为实际滑动轨迹。具体的,第二终端首先接收上述第二拍摄调整消息;然后提取所述第二拍摄调整消息中的第二输入轨迹;基于所述第二输入轨迹,确定所述第二终端的第二移动方向和第二移动距离;最后显示第二提示信息,所述第二提示信息用于提示第二终端用户在录像或者拍照过程中按照所述第二移动距离和第二移动方向移动所述第二终端。
这样第二终端的用户可以根据该第二输入轨迹进行移动,按照该第二输入轨迹进行拍照或者录制。例如在拍摄一个建筑物时,可以通过第二轨迹指示第二终端的用户从上至下移动第二终端,以实现指定轨迹的拍摄。具体的,对于拍照,可以通过第二轨迹指示用户在顶部拍摄一张照片,然后在中部拍摄一张照片,最后在底部拍摄一张照片;对于录像,可以通过第二轨迹指示用户录像所移动的轨迹,开始录像时,摄像头用于拍摄建筑物的顶部,然后移动第二终端,从而使得拍摄画面逐步向建筑物的底部移动。由于本公开实施例通过第一终端向第二终端发送第二拍摄调整信息,从而提示第二终端移动距离和移动方向,这样可以提高拍摄的针对性,以及便利性。同时,由于拍摄过程中无需进行通话,避免了拍摄视频时用户说话对视频拍摄造成的影响。
本公开实施例中,第一终端在与第二终端已建立拍摄预览画面共享的远程连接的状态下,接收所述第二终端发送的所述第二终端的第一预览界面并显示;接收用户的第一输入;响应于所述第一输入,输出第一目标拍摄数据;其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。这样,由于在第一终端上显示第二终端的第一预览画面,并在第一终端上拍摄,从而可以实现远程拍摄功能。
进一步的,本公开实施例还提供一种拍摄方法,应用于第二终端,如图13所示,该方法包括:
步骤1301,在与第一终端已建立拍摄预览画面共享的远程连接的状态下,将所述第二终端的第一预览界面发送至所述第一终端;
步骤1302,接收所述第一终端发送的所述第一终端的第二预览界面并显示。
应当说明的是,在第一终端进行拍摄时,可以对第二终端的预览界面进行调整,具体的,在需要调整时,第一终端可以向第二终端发送第一拍摄调整信息,该第一拍摄调整信息可以携带有用户在第二终端的预览界面输入的第一输入轨迹。第二终端可以根据该第一拍摄调整信息进行现场调整。
可选的,在一实施例中,所述接收所述第一终端发送的所述第一终端的第二预览界面并显示之后,还包括:
接收第一终端发送的第一拍摄调整消息;
提取所述第一拍摄调整消息中的第一输入轨迹;
基于所述第一输入轨迹,确定目标调整角度;
在所述目标调整角度小于或等于预设阈值的情况下,基于所述第一输入轨迹,控制所述第二终端的OIS调整所述第二终端的摄像头的位置。
在本实施例中,第二终端接收到上述第一拍摄调整消息后,将会根据该第一拍摄调整信息确定调整目标调整角度,当目标调整角度较小,且第二终端自带OIS光学防抖功能等可以小角度调整的硬件设备时,可以自动调整到位,然后进行拍摄。如果调整的角度较大,则需要进行手动调整。
可选的,在另一实施例中,所述提取所述第一拍摄调整消息中的第一输入轨迹之后,还包括:
在所述目标调整角度大于预设阈值的情况下,基于目标调整角度,确定第二终端的第一移动距离和第一移动方向;
显示第一提示信息,所述第一提示信息用于提示第二终端用户按照所述第一移动距离和第一移动方向移动所述第二终端。
在本实施例中,第二终端的用户可以根据第一提示信息,对第二终端进行移动,以调整拍摄角度,满足第一终端的用户的拍摄需求。
可选的,所述接收所述第一终端发送的所述第一终端的第二预览界面并显示之后,还包括:
接收第一终端发送的第二拍摄调整消息;
提取所述第二拍摄调整消息中的第二输入轨迹;
基于所述第二输入轨迹,确定所述第二终端的第二移动方向和第二移动距离;
显示第二提示信息,所述第二提示信息用于提示第二终端用户在录像或者拍照过程中按照所述第二移动距离和第二移动方向移动所述第二终端。
需要说明的是,本实施例作为图1所示的实施例对应的第二终端的实施方式,其具体的实施方式可以参见图2所示的实施例相关说明,以及达到相同的有益效果,为了避免重复说明,此处不再赘述。
参见图14,图14是本公开实施例提供的一种终端的结构图,所述终端为建立拍摄预览画面共享的远程连接的第一终端和第二终端中的第一终端,如图14所示,终端1400包括:
处理模块1401,用于接收所述第二终端发送的所述第二终端的第一预览界面并显示;
第一接收模块1402,用于接收用户的第一输入;
输出模块1403,用于响应于所述第一输入,输出第一目标拍摄数据;
其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。
可选的,所述终端1400还包括:
第一显示模块,用于显示目标预览界面;
其中,所述目标预览界面为所述第一预览界面和第二预览界面的图像数据进行合成处理后得到的界面,所述第二预览界面为所述第一终端的摄像头采集的预览界面;所述第一目标拍摄数据为所述目标预览界面的所有图像数据。
可选的,所述终端1400还包括:
第二接收模块,用于接收用户的第二输入;
第一更新模块,用于响应于所述第二输入,更新所述目标预览界面中目标区域的背景;
其中,更新后的所述目标区域的背景包括第一预览界面中的部分或全部背景,或者包括第二预览界面中的部分或全部背景;所述目标区域为第一预览界面的图像数据的显示区或所述第二预览界面的图像数据的显示区。
可选的,所述目标预览界面包括用于区分所述第一预览界面的图像数据和所述第二预览界面的图像数据的分割线,所述目标预览界面被所述分割线划分为第一预览子区域和第二预览子区域;
所述终端1400还包括:
第三接收模块,用于接收用户对所述分割线的第三输入;
第二更新模块,用于响应于所述第三输入,更新所述第一预览子区域和所述第二预览子区域的图像数据的显示面积。
可选的,所述终端1400还包括:
第二显示模块,用于在第一屏上显示至少两个第二终端的预览界面;
所述处理模块具体用于:接收用户对所述至少两个第二终端的预览界面的第四输入;响应于所述第四输入,显示所述第四输入选取的第二终端的第一预览界面。
可选的,所述终端1400还包括:
第四接收模块,用于接收用户对N个第二终端的预览界面的N次第五输入;
第三显示模块,用于响应于所述N次第五输入,分别显示所述N个第二终端的预览界面;
第一获取模块,用于获取基于所述N个第二终端的预览界面生成的N个拍摄数据;
拼接模块,用于将所述N个拍摄数据进行拼接,生成第二目标拍摄数据。
可选的,所述第一获取模块包括:
显示单元,用于在接收用户对第i个第二终端的第i次第五输入,并显示所述第i个第二终端的预览界面的情况下,接收用户的第六输入;
处理单元,用于响应于所述第六输入,执行拍摄操作,生成第i拍摄数据;
其中,所述第i拍摄数据为图像或视频,i为正整数,i小于等于N。
可选的,所述终端1400还包括:
第五接收模块,用于接收用户的第一触控输入;
第二获取模块,用于获取所述第一触控输入的第一输入轨迹;
第一消息生成模块,用于基于所述第一输入轨迹,生成第一拍摄调整消息;
第一发送模块,用于将所述第一拍摄调整消息发送至所述第二终端;
其中,所述第一拍摄调整信息用于所述第二终端调整拍摄视场,所述第一拍摄调整消息携带有所述第一输入轨迹。
可选的,所述终端1400还包括:
第六接收模块,用于接收用户的第二触控输入;
第三获取模块,用于获取所述第二触控输入的第二输入轨迹;
第二消息生成模块,用于基于所述第二输入轨迹,生成第二拍摄调整消息;
第二发送模块,用于将所述第二拍摄调整消息发送至所述第二终端;
其中,所述第二拍摄调整信息用于所述第二终端基于所述第二输入轨迹录像或者拍照,所述第二拍摄调整消息携带有所述第二输入轨迹。
终端1400能够实现图1至图12的方法实施例中第一终端实现的各个过程,为避免重复,这里不再赘述。
参见图15,图15是本公开实施例提供的一种终端的结构图,所述终端为建立拍摄预览画面共享的远程连接的第一终端和第二终端中的第二终端,如图15所示,终端1500包括:
第三发送模块1501,用于将所述第二终端的第一预览界面发送至所述第一终端;
第七接收模块1502,用于接收所述第一终端发送的所述第一终端的第二预览界面并显示。
可选的,所述终端1500还包括:
第八接收模块,用于接收第一终端发送的第一拍摄调整消息;
第一提取模块,用于提取所述第一拍摄调整消息中的第一输入轨迹;
第一确定模块,用于基于所述第一输入轨迹,确定目标调整角度;
第一控制模块,用于在所述目标调整角度小于或等于预设阈值的情况下,基于所述第一输入轨迹,控制所述第二终端的OIS调整所述第二终端的摄像头的位置。
可选的,所述终端1500还包括:
第二确定模块,用于在所述目标调整角度大于预设阈值的情况下,基于目标调整角度,确定第二终端的第一移动距离和第一移动方向;
第四显示模块,用于显示第一提示信息,所述第一提示信息用于提示第二终端用户按照所述第一移动距离和第一移动方向移动所述第二终端。
可选的,所述终端1500还包括:
第九接收模块,用于接收第一终端发送的第二拍摄调整消息;
第二提取模块,用于提取所述第二拍摄调整消息中的第二输入轨迹;
第三确定模块,用于基于所述第二输入轨迹,确定所述第二终端的第二移动方向和第二移动距离;
第五显示模块,用于显示第二提示信息,所述第二提示信息用于提示第二终端用户在录像或者拍照过程中按照所述第二移动距离和第二移动方向移动所述第二终端。
终端1500能够实现图13的方法实施例中第二终端实现的各个过程,为避免重复,这里不再赘述。
图16为实现本公开各个实施例的一种终端的硬件结构示意图,该终端1600包括但不限于:射频单元1601、网络模块1602、音频输出单元1603、输入单元1604、传感器1605、显示单元1606、用户输入单元1607、接口单元16016、存储器1609、处理器1610、以及电源1611等部件。本领域技术人员可理解,图16中示出的终端结构并不构成对终端的限定,终端可包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。在本公开实施例中,终端包括但不限于手机、平板电脑、笔记本电脑、掌上电脑、车载终端、可穿戴设备、以及计步器等。
处理器1610,用于:在与第二终端已建立拍摄预览画面共享的远程连接的状态下,接收所述第二终端发送的所述第二终端的第一预览界面并显示;
接收用户的第一输入;
响应于所述第一输入,输出第一目标拍摄数据;
其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。
可选的,所述接收用户的第一输入之前,所述处理器1610还用于:
显示目标预览界面;
其中,所述目标预览界面为所述第一预览界面和第二预览界面的图像数据进行合成处理后得到的界面,所述第二预览界面为所述第一终端的摄像头采集的预览界面;所述第一目标拍摄数据为所述目标预览界面的所有图像数据。
可选的,所述显示目标预览界面之后,所述处理器1610还用于:
接收用户的第二输入;
响应于所述第二输入,更新所述目标预览界面中目标区域的背景;
其中,更新后的所述目标区域的背景包括第一预览界面中的部分或全部背景,或者包括第二预览界面中的部分或全部背景;所述目标区域为第一预览界面的图像数据的显示区或所述第二预览界面的图像数据的显示区。
可选的,所述目标预览界面包括用于区分所述第一预览界面的图像数据和所述第二预览界面的图像数据的分割线,所述目标预览界面被所述分割线划分为第一预览子区域和第二预览子区域;
所述显示目标预览界面之后,处理器1610还用于:
接收用户对所述分割线的第三输入;
响应于所述第三输入,更新所述第一预览子区域和所述第二预览子区域的图像数据的显示面积。
可选的,所述第一终端包括第一屏和第二屏;显示所述第二终端的预览界面之前,所述处理器1610还用于:
在第一屏上显示至少两个第二终端的预览界面;
所述显示所述第二终端的预览界面,包括:
接收用户对所述至少两个第二终端的预览界面的第四输入;
响应于所述第四输入,显示所述第四输入选取的第二终端的第一预览界面。
可选的,所述在第一屏上显示至少两个第二终端的预览界面之后,处理器1610还用于:
接收用户对N个第二终端的预览界面的N次第五输入;
响应于所述N次第五输入,分别显示所述N个第二终端的预览界面;
获取基于所述N个第二终端的预览界面生成的N个拍摄数据;
将所述N个拍摄数据进行拼接,生成第二目标拍摄数据。
可选的,处理器1610具体用于:
在接收用户对第i个第二终端的第i次第五输入,并显示所述第i个第二终端的预览界面的情况下,接收用户的第六输入;
响应于所述第六输入,执行拍摄操作,生成第i拍摄数据;
其中,所述第i拍摄数据为图像或视频,i为正整数,i小于等于N。
可选的,所述接收所述第二终端发送的所述第二终端的第一预览界面并显示之后,处理器1610还用于:
接收用户的第一触控输入;
获取所述第一触控输入的第一输入轨迹;
基于所述第一输入轨迹,生成第一拍摄调整消息;
将所述第一拍摄调整消息发送至所述第二终端;
其中,所述第一拍摄调整信息用于所述第二终端调整拍摄视场,所述第一拍摄调整消息携带有所述第一输入轨迹。
可选的,所述接收所述第二终端发送的所述第二终端的第一预览界面并显示之后,处理器1610还用于:
接收用户的第二触控输入;
获取所述第二触控输入的第二输入轨迹;
基于所述第二输入轨迹,生成第二拍摄调整消息;
将所述第二拍摄调整消息发送至所述第二终端;
其中,所述第二拍摄调整信息用于所述第二终端基于所述第二输入轨迹录像或者拍照,所述第二拍摄调整消息携带有所述第二输入轨迹。
本公开实施例中,第一终端在与第二终端已建立拍摄预览画面共享的远程连接的状态下,接收所述第二终端发送的所述第二终端的第一预览界面并显示;接收用户的第一输入;响应于所述第一输入,输出第一目标拍摄数据;其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。这样,由于在第一终端上显示第二 终端的第一预览画面,并在第一终端上拍摄,从而可以实现远程拍摄功能。
应理解的是,本公开实施例中,射频单元1601可用于收发信息或通话过程中,信号的接收和发送,具体的,将来自基站的下行数据接收后,给处理器1610处理;另外,将上行的数据发送给基站。通常,射频单元1601包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频单元1601还可通过无线通信系统与网络和其他设备通信。
终端通过网络模块1602为用户提供了无线的宽带互联网访问,如帮助用户收发电子邮件、浏览网页和访问流式媒体等。
音频输出单元1603可将射频单元1601或网络模块1602接收的或者在存储器1609中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元1603还可提供与终端1600执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元1603包括扬声器、蜂鸣器以及受话器等。
输入单元1604用于接收音频或视频信号。输入单元1604可包括图形处理器(Graphics Processing Unit,GPU)16041和麦克风16042,图形处理器16041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可显示在显示单元1606上。经图形处理器16041处理后的图像帧可存储在存储器1609(或其它存储介质)中或者经由射频单元1601或网络模块1602进行发送。麦克风16042可接收声音,并且能够将这样的声音处理为音频数据。处理后的音频数据可在电话通话模式的情况下转换为可经由射频单元1601发送到移动通信基站的格式输出。
终端1600还包括至少一种传感器1605,比如光传感器、运动传感器以及其他传感器。具体地,光传感器包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板16061的亮度,接近传感器可在终端1600移动到耳边时,关闭显示面板16061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别终端姿态(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等; 传感器1605还可包括指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等,在此不再赘述。
显示单元1606用于显示由用户输入的信息或提供给用户的信息。显示单元1606可包括显示面板16061,可采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板16061。
用户输入单元1607可用于接收输入的数字或字符信息,以及产生与终端的用户设置以及功能控制有关的键信号输入。具体地,用户输入单元1607包括触控面板16071以及其他输入设备16072。触控面板16071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板16071上或在触控面板16071附近的操作)。触控面板16071可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器1610,接收处理器1610发来的命令并加以执行。此外,可采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板16071。除了触控面板16071,用户输入单元1607还可包括其他输入设备16072。具体地,其他输入设备16072可包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
进一步的,触控面板16071可覆盖在显示面板16061上,当触控面板16071检测到在其上或附近的触摸操作后,传送给处理器1610以确定触摸事件的类型,随后处理器1610根据触摸事件的类型在显示面板16061上提供相应的视觉输出。虽然在图16中,触控面板16071与显示面板16061是作为两个独立的部件来实现终端的输入和输出功能,但是在某些实施例中,可将触控面板16071与显示面板16061集成而实现终端的输入和输出功能,具体此处不做限定。
接口单元1608为外部装置与终端1600连接的接口。例如,外部装置可包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输 出(I/O)端口、视频I/O端口、耳机端口等等。接口单元1608可用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到终端1600内的一个或多个元件或者可用于在终端1600和外部装置之间传输数据。
存储器1609可用于存储软件程序以及各种数据。存储器1609可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器1609可包括高速随机存取存储器,还可包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
处理器1610是终端的控制中心,利用各种接口和线路连接整个终端的各个部分,通过运行或执行存储在存储器1609内的软件程序和/或模块,以及调用存储在存储器1609内的数据,执行终端的各种功能和处理数据,从而对终端进行整体监控。处理器1610可包括一个或多个处理单元;可选的,处理器1610可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可理解的是,上述调制解调处理器也可不集成到处理器1610中。
终端1600还可包括给各个部件供电的电源1611(比如电池),可选的,电源1611可通过电源管理系统与处理器1610逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
另外,终端1600包括一些未示出的功能模块,在此不再赘述。
可选的,本公开实施例还提供一种终端,包括处理器1610,存储器1609,存储在存储器1609上并可在所述处理器1610上运行的计算机程序,该计算机程序被处理器1610执行时实现上述拍摄方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
图17为实现本公开各个实施例的一种终端的硬件结构示意图,该终端1700包括但不限于:射频单元1701、网络模块1702、音频输出单元1703、输入单元1704、传感器1705、显示单元1706、用户输入单元1707、接口单元17017、存储器1709、处理器1710、以及电源1711等部件。本领域技术人员可理解,图17中示出的终端结构并不构成对终端的限定,终端可包括比图 示更多或更少的部件,或者组合某些部件,或者不同的部件布置。在本公开实施例中,终端包括但不限于手机、平板电脑、笔记本电脑、掌上电脑、车载终端、可穿戴设备、以及计步器等。
处理器1710,用于:在与第一终端已建立拍摄预览画面共享的远程连接的状态下,将所述第二终端的第一预览界面发送至所述第一终端;
接收所述第一终端发送的所述第一终端的第二预览界面并显示。
可选的,所述接收所述第一终端发送的所述第一终端的第二预览界面并显示之后,处理器1710还用于:
接收第一终端发送的第一拍摄调整消息;
提取所述第一拍摄调整消息中的第一输入轨迹;
基于所述第一输入轨迹,确定目标调整角度;
在所述目标调整角度小于或等于预设阈值的情况下,基于所述第一输入轨迹,控制所述第二终端的OIS调整所述第二终端的摄像头的位置。
可选的,所述提取所述第一拍摄调整消息中的第一输入轨迹之后,处理器1710还用于:
在所述目标调整角度大于预设阈值的情况下,基于目标调整角度,确定第二终端的第一移动距离和第一移动方向;
显示第一提示信息,所述第一提示信息用于提示第二终端用户按照所述第一移动距离和第一移动方向移动所述第二终端。
可选的,所述接收所述第一终端发送的所述第一终端的第二预览界面并显示之后,处理器1710还用于:
接收第一终端发送的第二拍摄调整消息;
提取所述第二拍摄调整消息中的第二输入轨迹;
基于所述第二输入轨迹,确定所述第二终端的第二移动方向和第二移动距离;
显示第二提示信息,所述第二提示信息用于提示第二终端用户在录像或者拍照过程中按照所述第二移动距离和第二移动方向移动所述第二终端。
本公开实施例中,第一终端在与第二终端已建立拍摄预览画面共享的远程连接的状态下,接收所述第二终端发送的所述第二终端的第一预览界面并 显示;接收用户的第一输入;响应于所述第一输入,输出第一目标拍摄数据;其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。这样,由于在第一终端上显示第二终端的第一预览画面,并在第一终端上拍摄,从而可以实现远程拍摄功能。
应理解的是,本公开实施例中,射频单元1701可用于收发信息或通话过程中,信号的接收和发送,具体的,将来自基站的下行数据接收后,给处理器1710处理;另外,将上行的数据发送给基站。通常,射频单元1701包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频单元1701还可通过无线通信系统与网络和其他设备通信。
终端通过网络模块1702为用户提供了无线的宽带互联网访问,如帮助用户收发电子邮件、浏览网页和访问流式媒体等。
音频输出单元1703可将射频单元1701或网络模块1702接收的或者在存储器1709中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元1703还可提供与终端1700执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元1703包括扬声器、蜂鸣器以及受话器等。
输入单元1704用于接收音频或视频信号。输入单元1704可包括图形处理器(Graphics Processing Unit,GPU)17041和麦克风17042,图形处理器17041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可显示在显示单元1706上。经图形处理器17041处理后的图像帧可存储在存储器1709(或其它存储介质)中或者经由射频单元1701或网络模块1702进行发送。麦克风17042可接收声音,并且能够将这样的声音处理为音频数据。处理后的音频数据可在电话通话模式的情况下转换为可经由射频单元1701发送到移动通信基站的格式输出。
终端1700还包括至少一种传感器1705,比如光传感器、运动传感器以及其他传感器。具体地,光传感器包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板17061的亮度,接近传感器可在终端1700移动到耳边时,关闭显示面板17061和/或背光。作为运动传 感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别终端姿态(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;传感器1705还可包括指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等,在此不再赘述。
显示单元1706用于显示由用户输入的信息或提供给用户的信息。显示单元1706可包括显示面板17061,可采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板17061。
用户输入单元1707可用于接收输入的数字或字符信息,以及产生与终端的用户设置以及功能控制有关的键信号输入。具体地,用户输入单元1707包括触控面板17071以及其他输入设备17072。触控面板17071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板17071上或在触控面板17071附近的操作)。触控面板17071可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器1710,接收处理器1710发来的命令并加以执行。此外,可采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板17071。除了触控面板17071,用户输入单元1707还可包括其他输入设备17072。具体地,其他输入设备17072可包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
进一步的,触控面板17071可覆盖在显示面板17061上,当触控面板17071检测到在其上或附近的触摸操作后,传送给处理器1710以确定触摸事件的类型,随后处理器1710根据触摸事件的类型在显示面板17061上提供相应的视觉输出。虽然在图17中,触控面板17071与显示面板17061是作为两个独立的部件来实现终端的输入和输出功能,但是在某些实施例中,可将触控面板17071与显示面板17061集成而实现终端的输入和输出功能,具体此处不做限定。
接口单元1708为外部装置与终端1700连接的接口。例如,外部装置可包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。接口单元1708可用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到终端1700内的一个或多个元件或者可用于在终端1700和外部装置之间传输数据。
存储器1709可用于存储软件程序以及各种数据。存储器1709可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器1709可包括高速随机存取存储器,还可包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
处理器1710是终端的控制中心,利用各种接口和线路连接整个终端的各个部分,通过运行或执行存储在存储器1709内的软件程序和/或模块,以及调用存储在存储器1709内的数据,执行终端的各种功能和处理数据,从而对终端进行整体监控。处理器1710可包括一个或多个处理单元;可选的,处理器1710可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可理解的是,上述调制解调处理器也可不集成到处理器1710中。
终端1700还可包括给各个部件供电的电源1711(比如电池),可选的,电源1711可通过电源管理系统与处理器1710逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
另外,终端1700包括一些未示出的功能模块,在此不再赘述。
可选的,本公开实施例还提供一种终端,包括处理器1710,存储器1709,存储在存储器1709上并可在所述处理器1710上运行的计算机程序,该计算机程序被处理器1710执行时实现上述拍摄方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
本公开实施例还提供一种计算机可读存储介质,计算机可读存储介质上存储有计算机程序,该计算机程序被处理器执行时实现上述拍摄方法实施例 的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。其中,所述的计算机可读存储介质,如只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。
通过以上的实施方式的描述,本领域的技术人员可清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本公开的技术方案本质上或者说对相关技术做出贡献的部分可以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端(可为手机,计算机,服务器,空调器,或者网络设备等)执行本公开各个实施例所述的方法。
以上所述,仅为本公开的具体实施方式,但本公开的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本公开揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本公开的保护范围之内。因此,本公开的保护范围应以权利要求的保护范围为准。

Claims (17)

  1. 一种拍摄方法,应用于第一终端,包括:
    在与第二终端已建立拍摄预览画面共享的远程连接的状态下,接收所述第二终端发送的所述第二终端的第一预览界面并显示;
    接收用户的第一输入;
    响应于所述第一输入,输出第一目标拍摄数据;
    其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。
  2. 根据权利要求1所述的方法,其中,所述接收用户的第一输入之前,所述方法还包括:
    显示目标预览界面;
    其中,所述目标预览界面为所述第一预览界面和第二预览界面的图像数据进行合成处理后得到的界面,所述第二预览界面为所述第一终端的摄像头采集的预览界面;所述第一目标拍摄数据为所述目标预览界面的所有图像数据。
  3. 根据权利要求2所述的方法,其中,所述显示目标预览界面之后,所述方法还包括:
    接收用户的第二输入;
    响应于所述第二输入,更新所述目标预览界面中目标区域的背景;
    其中,更新后的所述目标区域的背景包括第一预览界面中的部分或全部背景,或者包括第二预览界面中的部分或全部背景;所述目标区域为第一预览界面的图像数据的显示区或所述第二预览界面的图像数据的显示区。
  4. 根据权利要求2所述的方法,其中,所述目标预览界面包括用于区分所述第一预览界面的图像数据和所述第二预览界面的图像数据的分割线,所述目标预览界面被所述分割线划分为第一预览子区域和第二预览子区域;
    所述显示目标预览界面之后,所述方法还包括:
    接收用户对所述分割线的第三输入;
    响应于所述第三输入,更新所述第一预览子区域和所述第二预览子区域 的图像数据的显示面积。
  5. 根据权利要求1所述的方法,其中,所述第一终端包括第一屏和第二屏;显示所述第二终端的预览界面之前,所述方法还包括:
    在第一屏上显示至少两个第二终端的预览界面;
    所述显示所述第二终端的预览界面,包括:
    接收用户对所述至少两个第二终端的预览界面的第四输入;
    响应于所述第四输入,显示所述第四输入选取的第二终端的第一预览界面。
  6. 根据权利要求5所述的方法,其中,所述在第一屏上显示至少两个第二终端的预览界面之后,还包括:
    接收用户对N个第二终端的预览界面的N次第五输入;
    响应于所述N次第五输入,分别显示所述N个第二终端的预览界面;
    获取基于所述N个第二终端的预览界面生成的N个拍摄数据;
    将所述N个拍摄数据进行拼接,生成第二目标拍摄数据。
  7. 根据权利要求6所述的方法,其中,所述获取基于所述N个第二终端的预览界面生成的N个拍摄数据,包括:
    在接收用户对第i个第二终端的第i次第五输入,并显示所述第i个第二终端的预览界面的情况下,接收用户的第六输入;
    响应于所述第六输入,执行拍摄操作,生成第i拍摄数据;
    其中,所述第i拍摄数据为图像或视频,i为正整数,i小于等于N。
  8. 根据权利要求1所述的方法,其中,所述接收所述第二终端发送的所述第二终端的第一预览界面并显示之后,还包括:
    接收用户的第一触控输入;
    获取所述第一触控输入的第一输入轨迹;
    基于所述第一输入轨迹,生成第一拍摄调整消息;
    将所述第一拍摄调整消息发送至所述第二终端;
    其中,所述第一拍摄调整信息用于所述第二终端调整拍摄视场,所述第一拍摄调整消息携带有所述第一输入轨迹。
  9. 根据权利要求1所述的方法,其中,所述接收所述第二终端发送的所 述第二终端的第一预览界面并显示之后,还包括:
    接收用户的第二触控输入;
    获取所述第二触控输入的第二输入轨迹;
    基于所述第二输入轨迹,生成第二拍摄调整消息;
    将所述第二拍摄调整消息发送至所述第二终端;
    其中,所述第二拍摄调整信息用于所述第二终端基于所述第二输入轨迹录像或者拍照,所述第二拍摄调整消息携带有所述第二输入轨迹。
  10. 一种拍摄方法,应用于第二终端,包括:
    在与第一终端已建立拍摄预览画面共享的远程连接的状态下,将所述第二终端的第一预览界面发送至所述第一终端;
    接收所述第一终端发送的所述第一终端的第二预览界面并显示。
  11. 根据权利要求10所述的方法,其中,所述接收所述第一终端发送的所述第一终端的第二预览界面并显示之后,还包括:
    接收第一终端发送的第一拍摄调整消息;
    提取所述第一拍摄调整消息中的第一输入轨迹;
    基于所述第一输入轨迹,确定目标调整角度;
    在所述目标调整角度小于或等于预设阈值的情况下,基于所述第一输入轨迹,调整所述第二终端的摄像头镜头的位置。
  12. 根据权利要求11所述的方法,其中,所述提取所述第一拍摄调整消息中的第一输入轨迹之后,还包括:
    在所述目标调整角度大于预设阈值的情况下,基于目标调整角度,确定第二终端的第一移动距离和第一移动方向;
    显示第一提示信息,所述第一提示信息用于提示第二终端用户按照所述第一移动距离和第一移动方向移动所述第二终端。
  13. 根据权利要求10所述的方法,其中,所述接收所述第一终端发送的所述第一终端的第二预览界面并显示之后,还包括:
    接收第一终端发送的第二拍摄调整消息;
    提取所述第二拍摄调整消息中的第二输入轨迹;
    基于所述第二输入轨迹,确定所述第二终端的第二移动方向和第二移动 距离;
    显示第二提示信息,所述第二提示信息用于提示第二终端用户在录像或者拍照过程中按照所述第二移动距离和第二移动方向移动所述第二终端。
  14. 一种终端,所述终端为建立拍摄预览画面共享的远程连接的第一终端和第二终端中的第一终端,包括:
    处理模块,用于接收所述第二终端发送的所述第二终端的第一预览界面并显示;
    第一接收模块,用于接收用户的第一输入;
    输出模块,用于响应于所述第一输入,输出第一目标拍摄数据;
    其中,所述第一目标拍摄数据包括所述第一预览界面的部分或全部图像信息,所述第一目标拍摄数据为视频或者图像。
  15. 一种终端,所述终端为建立拍摄预览画面共享的远程连接的第一终端和第二终端中的第二终端,包括:
    第三发送模块,用于将所述第二终端的第一预览界面发送至所述第一终端;
    第七接收模块,用于接收所述第一终端发送的所述第一终端的第二预览界面并显示。
  16. 一种终端,包括处理器,存储器,存储在所述存储器上并可在所述处理器上运行的计算机程序,所述计算机程序被所述处理器执行时实现权利要求1至13中任一项所述拍摄方法的步骤。
  17. 一种计算机可读存储介质,所述计算机可读存储介质上存储有计算机程序,所述计算机程序被处理器执行时实现权利要求1至13任一项所述的拍摄方法的步骤。
PCT/CN2019/116123 2018-11-28 2019-11-07 拍摄方法及终端 WO2020108261A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19890183.7A EP3890303A4 (en) 2018-11-28 2019-11-07 PHOTOGRAPHY PROCESS AND TERMINAL
US17/330,602 US11689649B2 (en) 2018-11-28 2021-05-26 Shooting method and terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811433213.8A CN109361869B (zh) 2018-11-28 2018-11-28 一种拍摄方法及终端
CN201811433213.8 2018-11-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/330,602 Continuation US11689649B2 (en) 2018-11-28 2021-05-26 Shooting method and terminal

Publications (1)

Publication Number Publication Date
WO2020108261A1 true WO2020108261A1 (zh) 2020-06-04

Family

ID=65343354

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/116123 WO2020108261A1 (zh) 2018-11-28 2019-11-07 拍摄方法及终端

Country Status (4)

Country Link
US (1) US11689649B2 (zh)
EP (1) EP3890303A4 (zh)
CN (1) CN109361869B (zh)
WO (1) WO2020108261A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112188100A (zh) * 2020-09-29 2021-01-05 维沃移动通信有限公司 联合拍摄方法、联合拍摄装置及电子设备
CN114554097A (zh) * 2022-02-28 2022-05-27 维沃移动通信有限公司 显示方法、显示装置、电子设备和可读存储介质

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109361869B (zh) 2018-11-28 2021-04-06 维沃移动通信(杭州)有限公司 一种拍摄方法及终端
CN109976829A (zh) * 2019-03-18 2019-07-05 百度在线网络技术(北京)有限公司 一种语音技能提示语配置方法及装置
CN110798622B (zh) * 2019-11-29 2022-01-25 维沃移动通信有限公司 一种共享拍摄方法及电子设备
CN110971823B (zh) * 2019-11-29 2021-06-29 维沃移动通信(杭州)有限公司 一种参数调整方法及终端设备
CN111263093B (zh) * 2020-01-22 2022-04-01 维沃移动通信有限公司 一种录像方法及电子设备
CN111988528B (zh) 2020-08-31 2022-06-24 北京字节跳动网络技术有限公司 拍摄方法、装置、电子设备及计算机可读存储介质
CN114449134B (zh) * 2020-10-30 2024-02-13 华为技术有限公司 一种拍摄方法和终端设备
CN115514882A (zh) * 2021-01-30 2022-12-23 华为技术有限公司 一种分布式拍摄方法,电子设备及介质
CN114265538A (zh) * 2021-12-21 2022-04-01 Oppo广东移动通信有限公司 拍照控制方法及装置、存储介质和电子设备
CN114520878A (zh) * 2022-02-11 2022-05-20 维沃移动通信有限公司 视频拍摄方法、装置及电子设备
CN115002443B (zh) * 2022-04-29 2023-05-12 北京城市网邻信息技术有限公司 图像采集的处理方法、装置、电子设备及存储介质
CN115002347B (zh) * 2022-05-26 2023-10-27 深圳传音控股股份有限公司 图像处理方法、智能终端及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070139369A1 (en) * 2005-12-19 2007-06-21 Te-Yu Kao Portable electronic device and method for sharing a common display screen
CN102231038A (zh) * 2009-10-14 2011-11-02 鸿富锦精密工业(深圳)有限公司 摄影机调整系统及方法
CN104796610A (zh) * 2015-04-20 2015-07-22 广东欧珀移动通信有限公司 一种移动终端的摄像头共享方法、装置、系统及移动终端
CN109361869A (zh) * 2018-11-28 2019-02-19 维沃移动通信(杭州)有限公司 一种拍摄方法及终端

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1774037A (zh) * 2004-09-30 2006-05-17 英业达股份有限公司 照相手机的摄像头微调装置
CN103078924A (zh) * 2012-12-28 2013-05-01 华为技术有限公司 视野共享方法及设备
KR20160034065A (ko) 2014-09-19 2016-03-29 엘지전자 주식회사 이동 단말기 및 그 제어 방법
KR20170099330A (ko) * 2016-02-22 2017-08-31 김철원 미리보기 영상의 단말기간 공유 방법
KR20180094340A (ko) * 2017-02-15 2018-08-23 엘지전자 주식회사 이동단말기 및 그 제어 방법
CN107197144A (zh) * 2017-05-24 2017-09-22 珠海市魅族科技有限公司 拍摄控制方法及装置、计算机装置和可读存储介质
CN107659769B (zh) * 2017-09-07 2019-07-26 维沃移动通信有限公司 一种拍摄方法、第一终端及第二终端
CN108874343B (zh) * 2018-06-15 2022-04-29 Oppo广东移动通信有限公司 预览视窗共享方法及装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070139369A1 (en) * 2005-12-19 2007-06-21 Te-Yu Kao Portable electronic device and method for sharing a common display screen
CN102231038A (zh) * 2009-10-14 2011-11-02 鸿富锦精密工业(深圳)有限公司 摄影机调整系统及方法
CN104796610A (zh) * 2015-04-20 2015-07-22 广东欧珀移动通信有限公司 一种移动终端的摄像头共享方法、装置、系统及移动终端
CN109361869A (zh) * 2018-11-28 2019-02-19 维沃移动通信(杭州)有限公司 一种拍摄方法及终端

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3890303A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112188100A (zh) * 2020-09-29 2021-01-05 维沃移动通信有限公司 联合拍摄方法、联合拍摄装置及电子设备
CN114554097A (zh) * 2022-02-28 2022-05-27 维沃移动通信有限公司 显示方法、显示装置、电子设备和可读存储介质

Also Published As

Publication number Publication date
EP3890303A4 (en) 2022-04-13
CN109361869B (zh) 2021-04-06
US11689649B2 (en) 2023-06-27
US20210281669A1 (en) 2021-09-09
CN109361869A (zh) 2019-02-19
EP3890303A1 (en) 2021-10-06

Similar Documents

Publication Publication Date Title
WO2020108261A1 (zh) 拍摄方法及终端
CN108668083B (zh) 一种拍照方法及终端
CN108513070B (zh) 一种图像处理方法、移动终端及计算机可读存储介质
CN111541845B (zh) 图像处理方法、装置及电子设备
WO2020042890A1 (zh) 视频处理方法、终端及计算机可读存储介质
WO2019192590A1 (zh) 拍照方法及移动终端
WO2021036536A1 (zh) 视频拍摄方法及电子设备
WO2019174628A1 (zh) 拍照方法及移动终端
CN111182205B (zh) 拍摄方法、电子设备及介质
WO2020020134A1 (zh) 拍摄方法及移动终端
JP2021516374A (ja) 画像処理方法及びフレキシブルスクリーン端末
WO2021036623A1 (zh) 显示方法及电子设备
WO2019196929A1 (zh) 一种视频数据处理方法及移动终端
WO2021197121A1 (zh) 图像拍摄方法及电子设备
CN111010523B (zh) 一种视频录制方法及电子设备
CN111031253B (zh) 一种拍摄方法及电子设备
CN111597370B (zh) 一种拍摄方法及电子设备
US11863901B2 (en) Photographing method and terminal
KR20220005087A (ko) 촬영 방법 및 단말
CN108881721B (zh) 一种显示方法及终端
WO2020011080A1 (zh) 显示控制方法及终端设备
WO2021017730A1 (zh) 截图方法及终端设备
WO2019120190A1 (zh) 拨号方法及移动终端
WO2020259162A1 (zh) 图片显示方法及终端
WO2020063136A1 (zh) 应用程序的启动方法及终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19890183

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019890183

Country of ref document: EP

Effective date: 20210628