CN106973226B - Shooting method and terminal - Google Patents

Shooting method and terminal

Info

Publication number
CN106973226B
CN106973226B (application CN201710200826.6A)
Authority
CN
China
Prior art keywords
shooting
images
target
angle
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710200826.6A
Other languages
Chinese (zh)
Other versions
CN106973226A (en)
Inventor
陈仕心
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Mingdian Culture Communication Co., Ltd
Original Assignee
Shanghai Mingdian Culture Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Mingdian Culture Communication Co Ltd
Priority to CN201710200826.6A
Publication of CN106973226A
Application granted
Publication of CN106973226B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H04N23/80: Camera processing pipelines; Components thereof

Abstract

The embodiment of the invention discloses a shooting method and a terminal. When a multi-angle shooting mode is started, a shooting instruction is received; a target shooting object is shot from multiple angles according to the shooting instruction and preset shooting parameters to obtain multiple frames of captured images of the target object; a shooting-end instruction is acquired and, in response to it, the instances of the target object in the multiple frames are unfolded according to the overlapping areas between them to generate a multi-angle unfolded image of the target object. The problem that a multi-angle unfolded image of a rotating target shooting object cannot be obtained is thereby solved, and the application range of terminal shooting is broadened.

Description

Shooting method and terminal
Technical Field
The present invention relates to image capturing technologies, and in particular, to a capturing method and a terminal.
Background
With the continuous progress and rapid development of mobile communication technology, intelligent mobile terminals such as mobile phones and tablet computers are no longer simple communication devices. Functions such as positioning and entertainment are increasingly widely used, and photographing is among the functions used most frequently in daily life. To make shooting with a terminal more interesting and practical, terminals offer ever richer shooting modes, such as time-lapse, slow-motion, and shallow depth-of-field modes, through which a user can obtain images with a variety of shooting effects.
In the prior art, when a user shoots a target object with a terminal, the resulting image captures the object at a single fixed angle relative to the terminal. For a rotating target object, one image can therefore record the object's appearance at only one instant, so the terminal cannot capture all of the image information of a rotating object in a single shot, which limits the application range of terminal shooting.
Disclosure of Invention
In view of this, embodiments of the present invention provide a shooting method and a terminal. While a target shooting object rotates, it can be shot from multiple angles to obtain multiple frames of captured images, and those frames can be unfolded onto a single image to present multi-angle image information of the object. This solves the problem that a multi-angle unfolded image of a rotating target object cannot be obtained, and broadens the application range of terminal shooting.
The technical scheme of the embodiment of the invention is realized as follows:
the embodiment of the invention provides a shooting method, which comprises the following steps:
when the multi-angle shooting mode is started, receiving a shooting instruction;
performing multi-angle shooting on a target shooting object according to the shooting instruction and preset shooting parameters to obtain multi-frame shooting images of the target shooting object;
and acquiring a shooting end instruction, responding to the shooting end instruction, and performing unfolding processing on the plurality of target shooting objects according to a plurality of overlapping areas among the plurality of target shooting objects in the multi-frame shooting images to generate a multi-angle unfolded image of the target shooting objects.
In the above scheme, the preset shooting parameters include a shooting angle and a shooting time interval; the multi-angle shooting of the target shooting object according to the shooting instruction and the preset shooting parameters comprises the following steps:
responding to the shooting instruction;
determining a reference point corresponding to the target shooting object, and determining the real-time angle between the horizontal line and the line connecting a preset shooting center to the reference point;
and when the real-time angle is the same as the shooting angle, shooting the target shooting object at multiple angles according to the shooting time interval.
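The angle-matching trigger described by these steps can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the 2D coordinate convention for the shooting center and reference point, and the angular tolerance are all assumptions introduced here.

```python
import math

def realtime_angle(center, ref_point):
    """Real-time angle (degrees, 0-360) between the horizontal line and the
    line connecting the preset shooting center to the target's reference point."""
    dx = ref_point[0] - center[0]
    dy = ref_point[1] - center[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def should_capture(angle_now, shooting_angle, tolerance_deg=1.0):
    """Trigger a capture when the real-time angle matches the preset shooting
    angle. A small tolerance is used because exact equality is unlikely with
    sampled sensor data; the wrap-around at 0/360 degrees is handled."""
    diff = abs(angle_now - shooting_angle) % 360.0
    return min(diff, 360.0 - diff) <= tolerance_deg
```

In a real implementation, the reference point would be tracked frame by frame and, once the trigger fires, frames would be captured repeatedly at the preset shooting time interval.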
In the foregoing solution, the acquiring of the shooting end instruction includes:
acquiring the sum of pixel values of the multiple frames of shot images;
comparing a preset pixel threshold value with the sum of the pixel values;
and when the sum of the pixel values is greater than or equal to the preset pixel threshold value, generating the shooting end instruction.
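This pixel-sum end condition can be sketched as below. The patent does not specify how the sum of pixel values is computed; summing raw intensities over all frames with NumPy is an assumption made for illustration.

```python
import numpy as np

def end_instruction_by_pixel_sum(frames, pixel_threshold):
    """Generate the shooting-end instruction once the sum of pixel values
    over all captured frames is greater than or equal to the preset
    pixel threshold."""
    total = sum(int(np.asarray(f, dtype=np.int64).sum()) for f in frames)
    return total >= pixel_threshold
```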
In the above solution, after the comparing the preset pixel threshold value and the sum of the pixel values, the method further includes:
when the sum of the pixel values is smaller than the preset pixel threshold, acquiring a plurality of coincidence proportions among a plurality of target shooting objects in the multi-frame shooting image, and comparing the coincidence proportions with a preset coincidence threshold respectively;
and when at least one coincidence proportion in the multiple coincidence proportions is greater than or equal to a preset coincidence threshold value, generating the shooting ending instruction.
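A sketch of this coincidence-based end condition follows. The patent does not define how a coincidence proportion is measured; representing each target object as a binary mask and dividing the shared area by the smaller object area is an assumption for illustration only.

```python
import numpy as np

def coincidence_proportion(mask_a, mask_b):
    """Coincidence proportion between two binary masks of the target object:
    the shared (overlapping) area relative to the smaller object area."""
    inter = int(np.logical_and(mask_a, mask_b).sum())
    denom = int(min(mask_a.sum(), mask_b.sum()))
    return inter / denom if denom else 0.0

def end_instruction_by_coincidence(masks, coincidence_threshold=0.9):
    """Generate the shooting-end instruction when at least one pair of
    consecutive frames coincides by at least the preset threshold."""
    return any(
        coincidence_proportion(a, b) >= coincidence_threshold
        for a, b in zip(masks, masks[1:])
    )
```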
In the foregoing solution, the expanding the target photographic objects according to multiple overlapping areas between the target photographic objects in the multiple frames of photographic images to generate a multi-angle expanded image of the target photographic objects includes:
performing background filling processing on areas outside the target shooting objects in the multi-frame shooting images to generate multi-frame images to be spliced after the background is filled;
setting the overlapping area in two continuous frames of images to be spliced in the plurality of frames of images to be spliced as an unfolding boundary;
and unfolding the non-overlapped area of the plurality of frames of images to be spliced along the plurality of corresponding unfolded boundaries in the plurality of frames of images to be spliced to generate a multi-angle unfolded image of the target shooting object.
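The fill-then-unfold pipeline above can be sketched as follows, under simplifying assumptions not stated in the patent: the object rotates about a vertical axis, the overlap between consecutive frames is a fixed-width vertical band, and unfolding reduces to splicing the non-overlapping strips side by side.

```python
import numpy as np

def fill_background(frame, object_mask, fill_value=255):
    """Background filling: replace everything outside the target object
    with a uniform value, producing an image ready for splicing."""
    out = frame.copy()
    out[~object_mask] = fill_value
    return out

def unfold(images_to_splice, overlap_px):
    """Splice the frames into one unfolded image. The overlap between
    consecutive frames serves as the unfolding boundary, so each frame
    after the first contributes only its non-overlapping strip."""
    strips = [images_to_splice[0]]
    strips += [img[:, overlap_px:] for img in images_to_splice[1:]]
    return np.hstack(strips)
```

A production implementation would estimate the overlap per frame pair (e.g. by feature matching) and blend along the boundary rather than hard-cutting it.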
An embodiment of the present invention provides a terminal, where the terminal includes: a receiving unit, a shooting unit, an acquisition unit, a response unit, a processing unit,
the receiving unit is used for receiving a shooting instruction when the multi-angle shooting mode is started;
the shooting unit is used for shooting a target shooting object in multiple angles according to the shooting instruction and preset shooting parameters;
the acquisition unit is used for acquiring multi-frame shooting images of the target shooting object;
the acquisition unit is also used for acquiring a shooting ending instruction;
the response unit is used for responding to the shooting ending instruction;
the processing unit is used for expanding the target shooting objects according to a plurality of overlapping areas among the target shooting objects in the multi-frame shooting images to generate a multi-angle expanded image of the target shooting objects.
In the above scheme, the preset shooting parameters include a shooting angle and a shooting time interval; the photographing unit includes: a response subunit, a determination subunit, a capture subunit,
the response subunit is used for responding to the shooting instruction;
the determining subunit is configured to determine a reference point corresponding to the target shooting object, and determine a real-time angle between a horizontal line and a connection line between a preset shooting center and the reference point;
and the shooting subunit is used for shooting the target shooting object at multiple angles according to the shooting time interval when the real-time angle is the same as the shooting angle.
In the above scheme, the acquiring unit includes: an acquisition subunit, a comparison subunit, a generation subunit,
the acquiring subunit is configured to acquire a sum of pixel values of the multiple frames of captured images;
the comparison subunit is used for comparing a preset pixel threshold value with the pixel value sum;
and the generating subunit is configured to generate the shooting end instruction when the sum of the pixel values is greater than or equal to the preset pixel threshold.
In the foregoing solution, the obtaining subunit is further configured to, after comparing a preset pixel threshold with the pixel value sum, obtain a plurality of coincidence proportions between a plurality of target photographic objects in the multiple frames of photographic images when the pixel value sum is smaller than the preset pixel threshold;
the comparison subunit is further configured to compare the multiple coincidence proportions with a preset coincidence threshold value respectively;
the generating subunit is further configured to generate the shooting end instruction when at least one of the multiple coincidence proportions is greater than or equal to a preset coincidence threshold.
In the above aspect, the processing unit includes: a filling subunit, a setting subunit, a processing subunit,
the filling subunit is configured to perform background filling processing on an area outside the target photographic objects in the multiple frames of photographic images, and generate multiple frames of to-be-spliced images with the background filled;
the setting subunit is configured to set the overlapping area in two consecutive images to be stitched in the multiple frames of images to be stitched as an expansion boundary;
and the processing subunit is configured to perform expansion processing on a non-overlapping area of the multiple frames of images to be stitched according to the multiple expansion boundaries corresponding to the multiple frames of images to be stitched, so as to generate a multi-angle expansion image of the target shooting object.
Therefore, in the technical solution of the embodiment of the present invention, when the multi-angle shooting mode is started, a shooting instruction is received; a target shooting object is shot from multiple angles according to the shooting instruction and preset shooting parameters to obtain multiple frames of captured images of the target object; a shooting-end instruction is acquired and, in response to it, the instances of the target object in the multiple frames are unfolded according to the overlapping areas between them to generate a multi-angle unfolded image of the target object. In this way, the shooting method and terminal provided by the embodiment of the present invention can, while the target object rotates, shoot it from multiple angles to obtain multiple frames of images and unfold those frames onto one image to present multi-angle image information of the object. This solves the problem that a multi-angle unfolded image of a rotating target object cannot be obtained, and broadens the application range of terminal shooting; moreover, the method is simple and convenient to implement, easy to popularize, and widely applicable.
Drawings
Fig. 1 is a schematic hardware configuration diagram of an alternative mobile terminal 100 for implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal 100 shown in FIG. 1;
fig. 3 is a first schematic flow chart illustrating an implementation of a shooting method according to an embodiment of the present invention;
fig. 4 is a second schematic flow chart illustrating an implementation of the shooting method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a display interface for a terminal to obtain a real-time angle according to an embodiment of the present invention;
fig. 6 is a third schematic flow chart illustrating an implementation of the shooting method according to an embodiment of the present invention;
fig. 7 is a fourth schematic flow chart illustrating an implementation of the shooting method according to an embodiment of the present invention;
fig. 8 is a fifth schematic flow chart illustrating an implementation of the shooting method according to an embodiment of the present invention;
fig. 9 is a schematic diagram of background filling processing performed on a captured image according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of generating a multi-angle unfolded image of a target photographic object in an embodiment of the present invention;
fig. 11 is a schematic flow chart illustrating a sixth implementation of the shooting method according to the embodiment of the present invention;
fig. 12 is a first schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 13 is a second schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 14 is a third schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 15 is a schematic structural diagram of a terminal according to a fourth embodiment of the present invention.
Detailed Description
It should be understood that the embodiments described herein are only for explaining the technical solutions of the present invention, and are not intended to limit the scope of the present invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Multimedia Player (PMP), a navigation device, etc., and a stationary terminal such as a digital TV, a desktop computer, etc. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiment of the present invention can be applied to a fixed type terminal in addition to elements particularly used for moving purposes.
Fig. 1 is a schematic hardware configuration of a mobile terminal 100 implementing various embodiments of the present invention, and as shown in fig. 1, the mobile terminal 100 may include a wireless communication unit 110, an audio/video (a/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. Fig. 1 illustrates the mobile terminal 100 having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. The elements of the mobile terminal 100 will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information, or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a data broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms, for example, in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and the like. The broadcast receiving module 111 may receive signals broadcast by various types of broadcasting systems. In particular, it may receive digital broadcasts of systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), Media Forward Link Only (MediaFLO™), Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like. The broadcast receiving module 111 may be constructed to be suitable for the above-mentioned digital broadcasting systems as well as other broadcasting systems that provide broadcast signals.
The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access of the mobile terminal 100 and may be internally or externally coupled to the terminal. The wireless internet access technologies referred to by the wireless internet module 113 may include Wireless Local Area Network (WLAN), Wireless Fidelity (Wi-Fi), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and the like.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal 100. A typical example of the location information module 115 is a Global Positioning System (GPS) module 115. According to the current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current position information in real time.
The a/V input unit 120 is used to receive an audio or video signal and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode, and the processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided depending on the construction of the mobile terminal 100. The microphone 122 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data to control various operations of the mobile terminal 100 according to a command input by a user. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port (a typical example is a Universal Serial Bus (USB) port), a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means.
The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal 100. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal 100 is accurately mounted on the cradle.
The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. Depending on the particular desired implementation, mobile terminal 100 may include two or more display units (or other display devices), for example, mobile terminal 100 may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The memory 160 may store software programs or the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, etc.) that has been output or is to be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal 100. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing or playing back multimedia data, and the multimedia module 181 may be constructed within the controller 180 or may be constructed to be separated from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to this point, the mobile terminal 100 has been described in terms of its functions. Hereinafter, for the sake of brevity, the slide-type mobile terminal 100 will be described as an example among the various types of mobile terminals 100, such as folder-type, bar-type, swing-type, and slide-type terminals. However, the present invention can be applied to any type of mobile terminal 100 and is not limited to the slide-type mobile terminal 100.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which the mobile terminal 100 according to the present invention is capable of operating will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by such communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS270 may serve one or more sectors (or regions), each sector being covered by an omni-directional antenna or by an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular spectrum (e.g., 1.25MHz, 5MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS270 may also be referred to as a Base Transceiver Subsystem (BTS) or by other equivalent terminology. In such a case, the term "base station" may be used to refer collectively to a single BSC275 and at least one BS 270. The base stations may also be referred to as "cells". Alternatively, the individual sectors of a particular BS270 may be referred to as cell sites.
As a typical operation of the wireless communication system, the BS270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 are generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular BS270 is processed within that BS 270, and the obtained data is forwarded to the associated BSC 275. The BSC275 provides call resource allocation and mobility management functions, including the coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC280, the MSC280 interfaces with the BSCs 275, and the BSCs 275 accordingly control the BSs 270 to transmit forward link signals to the mobile terminals 100.
The mobile communication module 112 of the wireless communication unit 110 in the mobile terminal accesses the mobile communication network based on the necessary data (including the user identification information and the authentication information) of the mobile communication network (such as the mobile communication network of 2G/3G/4G, etc.) built in the mobile terminal, so as to transmit the mobile communication data (including the uplink mobile communication data and the downlink mobile communication data) for the services of web browsing, network multimedia playing, etc. of the mobile terminal user.
The wireless internet module 113 of the wireless communication unit 110 implements a wireless hotspot function by running the related hotspot protocols. The hotspot supports access by a plurality of other mobile terminals (any mobile terminal other than this mobile terminal) and, by multiplexing the mobile communication connection between the mobile communication module 112 and the mobile communication network, transmits mobile communication data (including uplink and downlink mobile communication data) for services such as web browsing and network multimedia playing on behalf of the users of those terminals. Because the mobile terminal essentially multiplexes its own mobile communication connection to the network for this traffic, the consumed mobile communication data is charged to the mobile terminal's communication tariff by the charging entity on the network side, thereby drawing down the data allowance included in the tariff contracted for the mobile terminal.
Based on the above hardware structure of the mobile terminal 100 and the communication system, various embodiments of the method of the present invention are proposed.
Example one
Fig. 3 is a first schematic flow chart illustrating an implementation process of the shooting method according to an embodiment of the present invention, as shown in fig. 3, in an embodiment of the present invention, a method for a terminal to perform multi-angle shooting may include the following steps:
Step 101: when the multi-angle shooting mode is started, receive a shooting instruction.
In the specific embodiment of the invention, when the multi-angle shooting mode is started, the terminal can receive the shooting instruction according to the selection operation of the user. The shooting instruction is an instruction for shooting a target shooting object in multiple angles.
It should be noted that, in an embodiment of the present invention, the terminal may be any smart terminal having a photographing function, for example, a smartphone or a tablet (PAD) with a photographing function. Further, the photographing function in the terminal may include a plurality of different shooting modes, such as panorama shooting, time-lapse shooting, depth-of-field shooting, and the like.
Further, in an embodiment of the present invention, the target photographic object is a person or an object that performs a rotational motion about a fixed central axis, for example, a dancer who rotates on the spot, a vase which rotates about a central axis, or the like.
It should be noted that, in an embodiment of the present invention, the multi-angle shooting mode may be used to shoot the target object in a rotating motion state at various angles, so as to obtain an image that can represent images of the target object at various angles.
Step 102: perform multi-angle shooting of the target photographic object according to the shooting instruction and preset shooting parameters, and acquire multi-frame photographic images of the target photographic object.
In the specific embodiment of the invention, after receiving the shooting instruction, the terminal can carry out multi-angle shooting on the target shooting object according to the shooting instruction and the preset shooting parameters to obtain multi-frame shooting images of the target shooting object. The shooting parameters corresponding to the target shooting object can be a shooting angle and a shooting time interval.
Further, in an embodiment of the present invention, when the preset shooting parameters are a shooting angle and a shooting time interval, the terminal may adjust a relative angle with respect to the target shooting object according to the shooting angle, and after determining the relative angle, shoot the target shooting object according to the shooting time interval.
In an embodiment of the present invention, the shooting angle is a relative angle between a shooting device of the terminal and the target shooting object when the terminal shoots the target shooting object. Specifically, in order to ensure that the images of the target object at various angles in the multi-angle unfolded image obtained by the terminal in the multi-angle shooting mode are all at the same horizontal height, the terminal needs to keep the relative angle between the target object and the terminal unchanged when shooting the target object, and therefore the shooting angle needs to be determined and the target object needs to be shot according to the shooting angle.
It should be noted that, in an embodiment of the present invention, the shooting time interval is used to determine a frequency of acquiring each frame of image by the terminal when the terminal shoots the target shooting object. Specifically, in the specific embodiment of the present invention, when the target photographic object is in a rotational motion state, the terminal photographs the target photographic object according to the above-mentioned photographing time interval, so as to obtain the multi-frame photographic images of the target photographic object at different angles.
It should be noted that, in the embodiment of the present invention, the user may determine the shooting time interval according to the movement speed of the target shooting object, that is, when the target shooting object rotates faster, the user may set a smaller time interval, so as to increase the shooting frequency and obtain images of the target shooting object at more angles.
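The relationship above can be sketched as follows. This is an illustrative example only, not the patent's implementation; the function name and the default angular step are assumptions.

```python
# Hypothetical sketch: derive the shooting time interval from the
# target object's rotation speed, so that a faster rotation yields a
# smaller interval (a higher shooting frequency) and more angles of
# the target are captured per revolution.

def shooting_interval(angular_speed_deg_per_s: float,
                      angle_step_deg: float = 30.0) -> float:
    """Seconds between frames so the rotating target advances by
    roughly `angle_step_deg` between consecutive frames."""
    if angular_speed_deg_per_s <= 0:
        raise ValueError("angular speed must be positive")
    return angle_step_deg / angular_speed_deg_per_s

# A target rotating at 90 deg/s, sampled every 30 degrees:
interval = shooting_interval(90.0, angle_step_deg=30.0)
# interval == 1/3 second; doubling the rotation speed halves it
```

For a dancer spinning quickly, the interval shrinks accordingly, which is exactly the behavior the embodiment describes.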
Step 103: acquire a shooting end instruction and, in response to the shooting end instruction, perform expansion processing on the plurality of target photographic objects according to a plurality of overlapping areas among the plurality of target photographic objects in the multi-frame photographic images, to generate a multi-angle expanded image of the target photographic object.
In a specific embodiment of the present invention, after the terminal performs multi-angle shooting on the target photographic object according to the shooting instruction and the preset shooting parameters and acquires the multi-frame photographic image of the target photographic object, the terminal may first acquire a shooting end instruction corresponding to the shooting instruction and then respond to the shooting end instruction, and specifically, the terminal may perform expansion processing on a plurality of target photographic objects according to a plurality of overlapping areas between the plurality of target photographic objects in the multi-frame photographic image to generate the multi-angle expanded image of the target photographic object.
Further, in an embodiment of the present invention, the shooting end instruction is an instruction, corresponding to the shooting instruction, that instructs the terminal to stop shooting. The terminal may generate the shooting end instruction in several ways: for example, when the accumulated shooting time under the shooting instruction exceeds a preset shooting time threshold, the terminal may generate the shooting end instruction; or the terminal may calculate the pixel values of the multi-frame photographic images obtained under the shooting instruction, and generate the shooting end instruction when those pixel values meet a preset stitching condition; further, the terminal may also acquire the shooting end instruction from a user operation.
Further, in the embodiment of the present invention, the multi-angle expanded image of the target photographic object, generated by expanding the multi-frame photographic images obtained by shooting, is a single image representing the target photographic object at various angles, for example, a 360-degree image of a vase.
In the embodiment of the present invention, further, after the terminal generates a multi-angle unfolded image corresponding to the target photographic object, the multi-angle unfolded image may be displayed on a display interface of the terminal.
According to the shooting method provided by the embodiment of the invention, when a multi-angle shooting mode is started, a shooting instruction is received; carrying out multi-angle shooting on a target shooting object according to a shooting instruction and preset shooting parameters to obtain multi-frame shooting images of the target shooting object; and acquiring a shooting end instruction, responding to the shooting end instruction, and performing expansion processing on a plurality of target shooting objects according to a plurality of overlapping areas among the plurality of target shooting objects in the multi-frame shooting images to generate a multi-angle expanded image of the target shooting objects. Therefore, the shooting method provided by the embodiment of the invention can acquire multi-frame shooting images by shooting the target shooting object at multiple angles under the condition that the target shooting object rotates, and can expand the multi-frame shooting images on one image to present multi-angle image information of the target shooting object, so that the problem that the multi-angle expanded images of the target shooting object which rotates and moves cannot be acquired is solved, and the application range of terminal shooting is expanded; moreover, the method is simple and convenient to realize, convenient to popularize and wide in application range.
Example two
Fig. 4 is a schematic diagram illustrating a second implementation flow of the shooting method according to an embodiment of the present invention, as shown in fig. 4, in a specific embodiment of the present invention, a method for a terminal to perform multi-angle shooting on a target shooting object according to a shooting instruction and preset shooting parameters may include the following steps:
Step 102a: respond to the shooting instruction.
In a specific embodiment of the present invention, after acquiring the shooting instruction, the terminal may respond to the shooting instruction first, and then perform subsequent shooting processing.
Step 102b: determine a reference point corresponding to the target photographic object, and determine the real-time angle between the horizontal line and the line connecting the preset shooting center and the reference point.
In an embodiment of the present invention, the terminal may determine a reference point of the target photographic object first, and then determine a real-time angle between a horizontal line and a connection line between the preset photographic center and the reference point.
Further, in an embodiment of the present invention, the terminal may determine the reference point of the target photographic object through various methods, for example, the terminal may determine the reference point of the target photographic object through a selection operation of a user.
Further, in an embodiment of the present invention, after determining the reference point of the target photographic object, the terminal may determine the real-time angle between the horizontal line and the line connecting the preset shooting center and the reference point. Fig. 5 is a schematic view of a display interface on which a terminal acquires the real-time angle in an embodiment of the present invention. As shown in fig. 5, the reference point determined by the terminal is the top of the target photographic object, and the real-time angle detected by the terminal between the horizontal line and the line connecting the preset shooting center and the reference point is 30°. If the preset shooting angle of the terminal is 20°, the real-time angle is not equal to the preset shooting angle, so the terminal needs to adjust the real-time angle before shooting.
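One way such a real-time angle could be computed is sketched below. This is an assumption for illustration (the patent does not specify the geometry computation); the coordinate convention and function name are hypothetical.

```python
# Hypothetical sketch: the real-time angle between the horizontal line
# and the segment joining the preset shooting center and the reference
# point, using quadrant-aware atan2.
import math

def real_time_angle(center: tuple[float, float],
                    reference: tuple[float, float]) -> float:
    """Angle in degrees between the horizontal and the line from the
    shooting center to the reference point."""
    dx = reference[0] - center[0]
    dy = reference[1] - center[1]
    return math.degrees(math.atan2(dy, dx))

# A reference point up and to the right at 45 degrees:
angle = real_time_angle((0.0, 0.0), (1.0, 1.0))
# angle == 45.0; shooting proceeds only when this matches the preset
# shooting angle (step 102c)
```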
Step 102c: when the real-time angle is the same as the shooting angle, perform multi-angle shooting of the target photographic object at the shooting time interval.
In a specific embodiment of the present invention, after the terminal determines the reference point of the target photographic object and determines the real-time angle between the horizontal line and the connection line between the preset photographing center and the reference point, if the terminal determines that the real-time angle and the photographing angle are the same, the terminal performs multi-angle photographing on the target photographic object according to the photographing time interval.
Further, in the embodiment of the present invention, before the terminal photographs the target photographic object, it is determined whether the real-time angle is the same as the photographic angle, and when the real-time angle is the same as the photographic angle, the terminal can photograph the target photographic object, thereby ensuring that images of each angle in the multi-angle image of the target photographic object obtained by the terminal in the multi-angle photographing mode are at the same level.
It should be noted that, in an embodiment of the present invention, the shooting time interval may be set by the terminal according to a selection operation of the user, or may be the terminal's default shooting time interval. Specifically, in order to acquire multi-frame photographic images of the target photographic object at different angles, the terminal may determine the corresponding shooting time interval according to the movement speed of the target photographic object.
In summary, in the embodiment of the present invention, through the steps 102a-102c, the shooting command is responded; then determining a reference point corresponding to the target shooting object, and determining a real-time angle between a connecting line between a preset shooting center and the reference point and a horizontal line; when the real-time angle is the same as the shooting angle, shooting the target shooting object at multiple angles according to the shooting time interval; therefore, the problem that multi-angle expanded images of the target shooting object in rotary motion cannot be obtained is solved, and the application range of terminal shooting is expanded; moreover, the method is simple and convenient to realize, convenient to popularize and wide in application range.
Example three
Fig. 6 is a schematic flow chart of a third implementation of the shooting method according to the embodiment of the present invention, as shown in fig. 6, in a specific embodiment of the present invention, a method for a terminal to obtain a shooting end instruction may include the following steps:
Step 103a: acquire the sum of the pixel values of the multi-frame photographic images.
In an embodiment of the present invention, after acquiring the multiple frames of captured images, the terminal may acquire a sum of pixel values of the multiple frames of captured images in advance according to a pixel value of each of the multiple frames of captured images.
Step 103b: compare the sum of the pixel values with a preset pixel threshold.
In a specific embodiment of the present invention, after acquiring the sum of pixel values of the multiple frames of captured images, the terminal compares the sum of pixel values with a preset pixel threshold, so as to determine whether a capture end instruction is generated for the multiple frames of captured images.
Step 103c: when the sum of the pixel values is greater than or equal to the preset pixel threshold, generate a shooting end instruction.
In an embodiment of the present invention, after the terminal compares the sum of the pixel values with a preset pixel threshold, if the sum of the pixel values is greater than or equal to the preset pixel threshold, the terminal generates a shooting end instruction, and stops shooting the image.
It should be noted that, in the embodiment of the present invention, if the sum of pixel values exceeds the pixel capacity of the terminal's display module, the multi-angle unfolded image obtained from the processing cannot be displayed normally; therefore, the sum of pixel values of the multi-frame photographic images needs to be calculated in advance, so as to determine whether the terminal should stop shooting and begin the expansion processing of the multi-frame photographic images.
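Steps 103a-103c can be sketched as the following check. This is an illustrative assumption, treating the "sum of pixel values" as the accumulated pixel count of the captured frames; the names and the threshold value are hypothetical.

```python
# Hypothetical sketch of steps 103a-103c: stop shooting once the
# accumulated pixel count of the captured frames reaches the budget
# the display module can accommodate.
def should_stop_by_pixels(frame_sizes: list[tuple[int, int]],
                          pixel_threshold: int) -> bool:
    """frame_sizes: (width, height) of each captured frame; returns
    True when a shooting end instruction should be generated."""
    total_pixels = sum(w * h for w, h in frame_sizes)
    return total_pixels >= pixel_threshold

# Three 1920x1080 frames (~6.2 MP) against a 10 MP budget: keep going.
print(should_stop_by_pixels([(1920, 1080)] * 3, 10_000_000))  # False
# Five such frames (~10.4 MP) cross the threshold: stop.
print(should_stop_by_pixels([(1920, 1080)] * 5, 10_000_000))  # True
```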
Fig. 7 is a schematic flow chart of an implementation flow of the shooting method according to the embodiment of the present invention, as shown in fig. 7, in the specific embodiment of the present invention, after the terminal compares the preset pixel threshold with the sum of the pixel values, that is, after step 103b, the method for the terminal to obtain the shooting end instruction may further include the following steps:
and 103d, when the sum of the pixel values is smaller than the preset pixel threshold, acquiring a plurality of coincidence proportions among a plurality of target shooting objects in the multi-frame shooting image, and comparing the coincidence proportions with the preset coincidence threshold respectively.
In an embodiment of the present invention, after the terminal compares the sum of the pixel values with a preset pixel threshold, if the sum of the pixel values is smaller than the preset pixel threshold, the terminal may obtain a plurality of coincidence ratios between a plurality of target photographic objects in the multi-frame photographic image, and then compare the plurality of coincidence ratios with the preset coincidence threshold respectively.
Further, in an embodiment of the present invention, the terminal may perform feature point identification and comparison on every two captured images in the obtained multiple captured images, so as to determine a coincidence portion of every two images to be stitched, and calculate and obtain multiple coincidence proportions between multiple target captured objects in the multiple captured images.
Further, in an embodiment of the present invention, after acquiring multiple coincidence ratios between multiple target photographic subjects in the multiple frames of photographic images, the terminal may compare the multiple coincidence ratios with preset coincidence thresholds, respectively, so as to further determine whether to generate a photographing end instruction.
Step 103e: when at least one of the plurality of coincidence ratios is greater than or equal to the preset coincidence threshold, generate a shooting end instruction.
In a specific embodiment of the present invention, after comparing the multiple coincidence proportions with the preset coincidence threshold respectively, the terminal generates a shooting end instruction if at least one coincidence proportion in the multiple coincidence proportions is greater than or equal to the preset coincidence threshold.
Further, in an embodiment of the present invention, the preset coincidence threshold may be set before the terminal shoots the target photographic object. Specifically, when at least one of the plurality of coincidence ratios is greater than or equal to the preset coincidence threshold, the target photographic object may be considered to have rotated through more than 360 degrees, that is, the terminal has acquired multi-frame photographic images of the target photographic object at multiple angles within the range [0, 2π]; the terminal then generates a shooting end instruction and stops capturing images.
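A minimal sketch of how a coincidence ratio between the target object in two frames might be computed is given below. The patent describes feature-point identification and comparison; as a simplified, hypothetical stand-in, this version measures the overlap of two binary foreground masks of the target object, normalized by the smaller foreground area.

```python
# Hypothetical sketch: coincidence ratio of the target object between
# two frames, from 0/1 foreground masks (1 = target object pixel).
def coincidence_ratio(mask_a: list[list[int]],
                      mask_b: list[list[int]]) -> float:
    """Fraction of the smaller foreground covered by the overlap."""
    overlap = sum(a and b for row_a, row_b in zip(mask_a, mask_b)
                  for a, b in zip(row_a, row_b))
    area_a = sum(sum(row) for row in mask_a)
    area_b = sum(sum(row) for row in mask_b)
    smaller = min(area_a, area_b)
    return overlap / smaller if smaller else 0.0

a = [[1, 1, 0],
     [1, 1, 0]]
b = [[0, 1, 1],
     [0, 1, 1]]
print(coincidence_ratio(a, b))  # 0.5: half of each object coincides
```

A ratio at or above the preset coincidence threshold would then indicate, per step 103e, that the object has completed a full rotation.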
In summary, in the embodiment of the present invention, through the above steps 103a to 103e, the terminal obtains the sum of pixel values of multiple frames of captured images; comparing a preset pixel threshold value with the sum of pixel values; when the sum of the pixel values is greater than or equal to a preset pixel threshold value, generating a shooting ending instruction; when the sum of the pixel values is smaller than a preset pixel threshold value, acquiring a plurality of coincidence proportions among a plurality of target shooting objects in a multi-frame shooting image, and comparing the coincidence proportions with the preset coincidence threshold value respectively; when at least one coincidence proportion in the multiple coincidence proportions is greater than or equal to a preset coincidence threshold value, generating a shooting ending instruction; therefore, the problem that multi-angle expanded images of the target shooting object in rotary motion cannot be obtained is solved, and the application range of terminal shooting is expanded; moreover, the method is simple and convenient to realize, convenient to popularize and wide in application range.
Example four
Fig. 8 is a schematic flow chart of an implementation of the shooting method according to the embodiment of the present invention, as shown in fig. 8, in an embodiment of the present invention, a method for generating a multi-angle unfolded image of a target shooting object by a terminal may include the following steps:
Step 105a: perform background filling processing on the areas outside the target photographic objects in the multi-frame photographic images, to generate the multi-frame images to be stitched with the background filled.
In the specific embodiment of the invention, after the terminal acquires the shooting ending instruction and stops shooting, the terminal performs background filling processing on the areas except for the target shooting objects in the multi-frame shooting images to generate the multi-frame images to be spliced after the background is filled.
Further, in an embodiment of the present invention, the terminal performs a background filling process on one of the multiple frames of shot images, and specifically, the terminal performs a background filling process on an area outside a target shot object in the shot image, so as to obtain an image to be stitched after the image fills the background. Further, the terminal may sequentially perform background filling processing on each of the multiple frames of shot images, and finally obtain the multiple frames of images to be stitched after filling the background.
It should be noted that, in the specific embodiment of the present invention, the terminal may perform background filling processing on the multiple frames of captured images through multiple methods, and specifically, the terminal may determine a common image area in each frame of image to be stitched, that is, a background image outside a target captured object, and then fill the common image area with a solid color, so as to obtain the multiple frames of images to be stitched after filling the background.
Fig. 9 is a schematic diagram of background filling processing performed on a captured image in the embodiment of the present invention, and as shown in fig. 9, a terminal performs solid color filling on a background of the captured image in fig. (a), and then obtains an image to be stitched after the background filling, that is, a diagram (b).
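The solid-color fill of step 105a can be sketched as below. This is an illustrative assumption using plain Python lists in place of real image buffers; the function name and mask convention are hypothetical.

```python
# Hypothetical sketch of step 105a: fill every pixel outside the
# target object's mask with one solid background color, leaving the
# object itself untouched.
def fill_background(image, mask, color):
    """image: rows of (r, g, b) pixels; mask: rows of 0/1 where
    1 marks a target-object pixel; color: the solid fill."""
    return [[px if m else color for px, m in zip(img_row, mask_row)]
            for img_row, mask_row in zip(image, mask)]

img = [[(10, 10, 10), (200, 0, 0)],
       [(20, 20, 20), (0, 200, 0)]]
obj = [[0, 1],
       [0, 1]]
filled = fill_background(img, obj, (255, 255, 255))
# the left (background) column becomes white; the object column stays
```

Filling every frame with the same solid color gives all images to be stitched a common background, which simplifies locating the overlapping areas in step 105b.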
Step 105b: set the overlapping area of each two consecutive images to be stitched in the multi-frame images to be stitched as an expansion boundary.
In a specific embodiment of the present invention, after the terminal performs background filling processing on the area outside the target photographic objects in the multi-frame photographic image and generates the multi-frame to-be-stitched image after the background is filled, the terminal may set the overlapping area in two consecutive to-be-stitched images in the multi-frame to-be-stitched image as the expansion boundary.
Further, in an embodiment of the present invention, the terminal may determine an overlapping area of two consecutive images to be stitched of the multiple images to be stitched, and then set the overlapping area as an unfolding boundary of the two consecutive images to be stitched. Further, the terminal may set a plurality of expansion boundaries of every two consecutive images to be stitched in the plurality of frames of images to be stitched.
Step 105c: expand the non-overlapping areas of the multi-frame images to be stitched along the corresponding plurality of expansion boundaries in the multi-frame images to be stitched, to generate the multi-angle expanded image of the target photographic object.
In a specific embodiment of the present invention, after the terminal sets a plurality of expansion boundaries of every two consecutive images to be stitched in the plurality of images to be stitched, the terminal may perform expansion processing on the non-overlapping area of the plurality of images to be stitched along the plurality of corresponding expansion boundaries in the plurality of images to be stitched, so as to generate a multi-angle expansion image of the target photographic object.
It should be noted that, in the specific embodiment of the present invention, before the terminal performs the expansion processing on the multi-frame images to be stitched with the background filled, several processing steps need to be applied to the images to be stitched, for example, distortion correction, perspective transformation, alignment and stitching, brightness and color equalization, and feature point identification and matching of the images, where preferably at least 4 feature points are used.
Fig. 10 is a schematic diagram of generating a multi-angle spread image of a target photographic object in an embodiment of the present invention, as shown in fig. 10, fig. (a), (b), and (c) are three frames of photographic images of different angles of a target photographic object rotating counterclockwise, respectively, and the terminal performs the background filling process on the multi-frame photographic images through the above steps 105a to 105c, and performs the spread process on a plurality of target photographic objects in the multi-frame photographic images to generate a multi-angle spread image of the target photographic object, that is, fig. (d).
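Steps 105a-105c can be sketched as the following concatenation. This is a deliberately simplified assumption (horizontal strips of already aligned frames, with the overlap width given directly); the patent's actual expansion involves the corrections listed above.

```python
# Hypothetical sketch of the expansion: each frame contributes only
# the strip that does not overlap the next frame (its expansion
# boundary), and the strips are concatenated into one multi-angle
# expanded image.
def unfold(frames: list[list[list[int]]],
           overlaps: list[int]) -> list[list[int]]:
    """frames: images as rows of pixel columns; overlaps[i] = number
    of columns frame i shares with frame i+1."""
    out_rows = [[] for _ in frames[0]]
    for i, frame in enumerate(frames):
        # the last frame has no following overlap to trim
        cut = len(frame[0]) - (overlaps[i] if i < len(overlaps) else 0)
        for row_out, row in zip(out_rows, frame):
            row_out.extend(row[:cut])  # keep the non-overlapping part
    return out_rows

f1 = [[1, 2, 3]]   # one-row "images", three columns each,
f2 = [[3, 4, 5]]   # each sharing its last column with the next
f3 = [[5, 6, 7]]
print(unfold([f1, f2, f3], overlaps=[1, 1]))  # [[1, 2, 3, 4, 5, 6, 7]]
```

The duplicated boundary columns (3 and 5 here) appear only once in the output, mirroring how the overlapping areas serve as expansion boundaries rather than being repeated.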
In summary, in the specific embodiment of the present invention, through the steps 105a to 105c, the terminal performs background filling processing on the areas outside the multiple target photographic objects in the multiple frames of photographic images, so as to generate multiple frames of images to be stitched after the background is filled; setting the overlapping area of two continuous frames of images to be spliced in the multi-frame images to be spliced as an expansion boundary; expanding non-overlapping areas of the multiple frames of images to be spliced along multiple corresponding expanded boundaries in the multiple frames of images to be spliced to generate a multi-angle expanded image of the target shooting object; therefore, the problem that multi-angle expanded images of the target shooting object in rotary motion cannot be obtained is solved, and the application range of terminal shooting is expanded; moreover, the method is simple and convenient to realize, convenient to popularize and wide in application range.
EXAMPLE five
Fig. 11 is a schematic diagram illustrating a sixth implementation flow of the shooting method according to an embodiment of the present invention. As shown in fig. 11, after acquiring multiple frames of captured images, the terminal may first screen them according to a preset rule to determine the usable frames among them. Specifically, when the terminal selects usable frames from the multiple captured images according to the preset rule, the method for generating a multi-angle expanded image of the target photographic object may include the following steps:
step 201, determining the coincidence proportion of the two target photographic objects corresponding to every two consecutive frames in the multiple captured images.
In a specific embodiment of the present invention, after shooting the target photographic object according to the preset shooting parameters and acquiring the multiple frames of captured images, the terminal may determine, in sequence, the coincidence proportion of the two target photographic objects corresponding to every two consecutive frames, so as to identify the usable frames among the captured images according to this proportion.
Further, in an embodiment of the present invention, the terminal may perform feature point identification and comparison on the two target photographic objects corresponding to every two consecutive frames of the acquired images, so as to determine the overlapping portion of the two target photographic objects and calculate their coincidence proportion.
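As a sketch of one way step 201 could be realized: assuming the two target objects have already been segmented into aligned binary masks (the patent does not fix the exact formula for the coincidence proportion), the proportion can be taken as the intersection over the smaller object area.

```python
import numpy as np

def coincidence_ratio(mask_a, mask_b):
    """Coincidence proportion of two target-object masks from
    consecutive frames.

    mask_a, mask_b: boolean arrays marking the pixels of the target
    photographic object in each (already aligned) frame.  The ratio is
    taken here as intersection over the smaller object area -- one
    plausible reading of the patent's "coincidence proportion".
    """
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    inter = np.logical_and(a, b).sum()
    denom = min(a.sum(), b.sum())
    return inter / denom if denom else 0.0
```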
Step 202, when the overlapping proportion is greater than or equal to a first preset threshold value and less than or equal to a second preset threshold value, determining the two shot images as available images.
In a specific embodiment of the invention, after determining the coincidence proportion of two target photographic objects corresponding to two continuous photographic images in a multi-frame photographic image, if the coincidence proportion is greater than or equal to a first preset threshold value and less than or equal to a second preset threshold value, the terminal determines the two photographic images as usable images.
Further, in an embodiment of the present invention, the first preset threshold and the second preset threshold may be set before the terminal shoots the target photographic object. Together they define a threshold range for the coincidence proportion of the two target photographic objects corresponding to two consecutive frames: whenever that proportion falls within the range, the terminal treats the two frames as usable images. For example, with a first preset threshold of 1/4 and a second preset threshold of 1/3, if the terminal determines the coincidence proportion of the two target photographic objects corresponding to two consecutive frames to be 0.3, then since 0.3 lies within the preset range [1/4, 1/3], the terminal may determine that the two frames are usable images.
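The screening rule of step 202, with the example thresholds 1/4 and 1/3, amounts to a simple closed-interval test; the function name and defaults below are illustrative, not from the patent.

```python
from fractions import Fraction

# First and second preset thresholds, taken from the example in the text.
FIRST_THRESHOLD = Fraction(1, 4)
SECOND_THRESHOLD = Fraction(1, 3)

def is_usable(coincidence_ratio,
              lo=FIRST_THRESHOLD, hi=SECOND_THRESHOLD):
    """Two consecutive frames are kept as usable images when the
    coincidence proportion of their target objects lies in [lo, hi]."""
    return lo <= coincidence_ratio <= hi
```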
Step 203, acquiring a shooting end instruction, responding to the shooting end instruction, and expanding the target photographic objects according to the overlapping areas among the target photographic objects in the usable frames, to generate a multi-angle expanded image of the target photographic object.
Therefore, the shooting method and terminal provided by the embodiments of the present invention can, while the target photographic object rotates, shoot it from multiple angles to acquire multiple frames of captured images, and expand those frames onto a single image presenting multi-angle image information of the object. This solves the problem that a multi-angle expanded image of a rotating target photographic object cannot otherwise be obtained, and broadens the application range of terminal shooting; moreover, the method is simple and convenient to implement, easy to popularize, and widely applicable.
EXAMPLE six
Fig. 12 is a first schematic structural diagram of a terminal according to an embodiment of the present invention. As shown in fig. 12, the terminal 3 according to the embodiment of the present invention includes: a receiving unit 31, a shooting unit 32, an acquisition unit 33, a response unit 34, and a processing unit 35. As can be seen from the schematic diagrams, described above, of the hardware structure and wireless communication system of the mobile terminal 100 implementing various embodiments of the present invention, the receiving unit 31, shooting unit 32, acquisition unit 33, response unit 34, and processing unit 35 in the terminal 3 may be implemented by a processor in the mobile terminal executing corresponding program code; of course, they may also be implemented by dedicated logic circuits. In particular embodiments, the processor may be a Central Processing Unit (CPU), a Microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
A receiving unit 31 for receiving a photographing instruction when the multi-angle photographing mode is turned on.
And a shooting unit 32, configured to perform multi-angle shooting on the target shooting object according to the shooting instruction and preset shooting parameters after the receiving unit 31 receives the shooting instruction.
An obtaining unit 33, configured to obtain a multi-frame shot image of the target shot object after the shooting unit 32 performs multi-angle shooting on the target shot object according to the shooting instruction and the preset shooting parameters; and is also used for acquiring a shooting end instruction.
A response unit 34 for responding to the photographing end instruction after the acquisition unit 33 acquires the photographing end instruction.
And the processing unit 35 is used for performing expansion processing on the plurality of target shooting objects according to a plurality of overlapping areas among the plurality of target shooting objects in the multi-frame shooting images to generate a multi-angle expanded image of the target shooting objects.
Fig. 13 is a schematic diagram of a second composition structure of the terminal according to the embodiment of the present invention, and as shown in fig. 13, the shooting unit 32 according to the embodiment of the present invention includes: a response subunit 321, a determination subunit 322, and a shooting subunit 323.
A response subunit 321, configured to respond to the shooting instruction.
And a determining subunit 322, configured to determine a reference point corresponding to the target photographic object after the response subunit 321 responds to the photographing instruction, and determine a real-time angle between a horizontal line and a connecting line between the preset photographing center and the reference point.
And a shooting subunit 323, configured to shoot the target photographic object from multiple angles at the shooting time interval when the real-time angle is the same as the shooting angle, after the determining subunit 322 determines the real-time angle between the horizontal line and the line connecting the preset shooting center and the reference point.
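One plausible reading of this angle test, assuming planar coordinates for the preset shooting center and the reference point (the patent defines neither the coordinate frame nor a tolerance, so both are assumptions here), is:

```python
import math

def real_time_angle(center, reference_point):
    """Angle, in degrees, between the horizontal line and the line
    connecting the preset shooting center to the reference point.

    Assumes a mathematical (y-up) coordinate frame; with image
    coordinates (y-down) the sign of dy would be flipped.
    """
    dx = reference_point[0] - center[0]
    dy = reference_point[1] - center[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def should_capture(center, reference_point, shooting_angle, tol=1.0):
    """Fire the shutter when the real-time angle matches the preset
    shooting angle.  A small tolerance stands in for exact equality,
    which never holds for measured angles."""
    diff = abs(real_time_angle(center, reference_point) - shooting_angle) % 360.0
    return min(diff, 360.0 - diff) <= tol
```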
Fig. 14 is a schematic diagram of a third configuration of a terminal according to an embodiment of the present invention, and as shown in fig. 14, the obtaining unit 33 according to an embodiment of the present invention includes: an acquisition sub-unit 331, a comparison sub-unit 332 and a generation sub-unit 333,
the acquiring subunit 331 is configured to acquire a sum of pixel values of multiple frames of captured images.
A comparison sub-unit 332 for comparing the preset pixel threshold value and the pixel value sum after the acquisition sub-unit 331 acquires the pixel value sum of the plurality of frames of captured images.
A generating subunit 333, configured to generate a shooting end instruction when the sum of the pixel values is greater than or equal to the preset pixel threshold, after the comparison subunit 332 compares the preset pixel threshold with that sum.
Further, in the embodiment of the present invention, the obtaining subunit 331 is further configured to, after the comparing subunit 332 compares the preset pixel threshold value and the pixel value sum, obtain a plurality of coincidence ratios between a plurality of target photographic subjects in the multi-frame photographic image when the pixel value sum is smaller than the preset pixel threshold value.
The comparison subunit 332 is further configured to, after the acquisition subunit 331 acquires a plurality of coincidence ratios between a plurality of target photographic subjects in the multi-frame photographic image, compare the plurality of coincidence ratios with preset coincidence thresholds, respectively.
The generating subunit 333 is configured to, after the comparing subunit 332 compares the multiple overlap ratios with the preset overlap threshold, generate a shooting end instruction when at least one overlap ratio in the multiple overlap ratios is greater than or equal to the preset overlap threshold.
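The two end-of-shooting conditions handled by these subunits can be condensed into one decision function; this is a minimal sketch, with the function and parameter names illustrative rather than from the patent.

```python
def shooting_should_end(pixel_sum, preset_pixel_threshold,
                        coincidence_ratios, preset_coincidence_threshold):
    """Decide whether to issue the shooting end instruction.

    Mirrors the two conditions described for the acquisition unit: the
    running sum of pixel values is checked against the preset pixel
    threshold first; if it falls short, the end instruction is still
    generated when at least one coincidence proportion between target
    objects reaches the preset coincidence threshold.
    """
    if pixel_sum >= preset_pixel_threshold:
        return True
    return any(r >= preset_coincidence_threshold for r in coincidence_ratios)
```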
Fig. 15 is a schematic diagram of a composition structure of a terminal according to an embodiment of the present invention, and as shown in fig. 15, a processing unit 35 according to an embodiment of the present invention includes: a padding subunit 351, a setting subunit 352 and a processing subunit 353,
the filling subunit 351 is configured to perform background filling processing on regions other than the target photographic objects in the multi-frame photographic image, and generate a background-filled multi-frame image to be stitched.
The setting subunit 352 is configured to set, after the filling subunit 351 generates the multiple frames of images to be stitched after the background is filled, an overlapped area in two consecutive frames of images to be stitched in the multiple frames of images to be stitched as an expansion boundary.
And the processing subunit 353, configured to expand the non-overlapping areas of the multiple frames of images to be stitched along the corresponding expansion boundaries, after the setting subunit 352 sets the overlapping area of every two consecutive frames to be stitched as an expansion boundary, thereby generating a multi-angle expanded image of the target photographic object.
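A minimal, deliberately simplified sketch of the expansion step performed by the processing subunit: it assumes purely horizontal overlaps of known width (the "expansion boundary") and omits the feature matching and seam blending a real implementation would need, so every name and shape here is an assumption for illustration.

```python
import numpy as np

def expand_frames(frames, overlap_widths):
    """Expand background-filled frames along their expansion boundaries.

    frames: list of H x W arrays (background already filled);
    overlap_widths[i]: number of overlapping columns between
    frames[i] and frames[i + 1].  Each frame contributes only its
    non-overlapping columns beyond the boundary to the multi-angle
    expanded image.
    """
    out = np.asarray(frames[0])
    for frame, w in zip(frames[1:], overlap_widths):
        out = np.hstack([out, np.asarray(frame)[:, w:]])
    return out
```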
The receiving unit 31, the shooting unit 32, the obtaining unit 33, the responding unit 34 and the processing unit 35 provided by the embodiment of the present invention can be implemented in the form of program codes by executing corresponding functions by a processor in the mobile terminal; of course, the implementation can also be realized through a specific logic circuit; in the course of a particular embodiment, the processor may be a central processing unit, a microprocessor, a digital signal processor, a field programmable gate array, or the like.
The terminal provided by the embodiment of the present invention receives a shooting instruction when the multi-angle shooting mode is enabled; shoots a target photographic object from multiple angles according to the shooting instruction and preset shooting parameters, acquiring multiple frames of captured images of the object; and, on acquiring and responding to a shooting end instruction, expands the target photographic objects according to the overlapping areas among the target photographic objects in the captured frames, generating a multi-angle expanded image of the target photographic object. The terminal can therefore, while the target photographic object rotates, capture it from multiple angles and expand the captured frames onto a single image presenting multi-angle image information of the object. This solves the problem that a multi-angle expanded image of a rotating target photographic object cannot otherwise be obtained, and broadens the application range of terminal shooting; moreover, the method is simple and convenient to implement, easy to popularize, and widely applicable.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention. The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as a removable Memory device, a Read Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated unit of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (8)

1. A photographing method, characterized in that the method comprises:
when the multi-angle shooting mode is started, receiving a shooting instruction;
performing multi-angle shooting on a target shooting object according to the shooting instruction and preset shooting parameters to obtain multi-frame shooting images of the target shooting object;
acquiring a shooting end instruction, responding to the shooting end instruction, and performing unfolding processing on a plurality of target shooting objects according to a plurality of overlapping areas among the plurality of target shooting objects in the multi-frame shooting images to generate a multi-angle unfolding image of the target shooting objects;
the acquiring of the shooting end instruction includes:
acquiring the sum of pixel values of the multiple frames of shot images;
comparing a preset pixel threshold value with the sum of the pixel values;
and when the sum of the pixel values is greater than or equal to the preset pixel threshold value, generating the shooting end instruction.
2. The method according to claim 1, wherein the preset photographing parameters include a photographing angle and a photographing time interval; the multi-angle shooting of the target shooting object according to the shooting instruction and the preset shooting parameters comprises the following steps:
responding to the shooting instruction;
determining a reference point corresponding to the target shooting object, and determining a real-time angle between a connecting line between a preset shooting center and the reference point and a horizontal line; wherein the reference point may be determined according to a selection operation;
and when the real-time angle is the same as the shooting angle, shooting the target shooting object at multiple angles according to the shooting time interval.
3. The method of claim 1, wherein after comparing the preset pixel threshold value and the pixel value sum, the method further comprises:
when the sum of the pixel values is smaller than the preset pixel threshold, acquiring a plurality of coincidence proportions among a plurality of target shooting objects in the multi-frame shooting image, and comparing the coincidence proportions with a preset coincidence threshold respectively;
and when at least one coincidence proportion in the multiple coincidence proportions is greater than or equal to a preset coincidence threshold value, generating the shooting ending instruction.
4. The method according to claim 1, wherein said expanding said plurality of target photographic objects according to a plurality of overlapping regions between said plurality of target photographic objects in said multi-frame photographic image to generate a multi-angle expanded image of said target photographic objects comprises:
performing background filling processing on areas outside the target shooting objects in the multi-frame shooting images to generate multi-frame images to be spliced after the background is filled;
setting the overlapping area in two continuous frames of images to be spliced in the plurality of frames of images to be spliced as an unfolding boundary;
and unfolding the non-overlapped area of the plurality of frames of images to be spliced along the plurality of corresponding unfolded boundaries in the plurality of frames of images to be spliced to generate a multi-angle unfolded image of the target shooting object.
5. A terminal, characterized in that the terminal comprises: a receiving unit, a shooting unit, an acquisition unit, a response unit, a processing unit,
the receiving unit is used for receiving a shooting instruction when the multi-angle shooting mode is started;
the shooting unit is used for shooting a target shooting object in multiple angles according to the shooting instruction and preset shooting parameters;
the acquisition unit is used for acquiring multi-frame shooting images of the target shooting object;
the acquisition unit is also used for acquiring a shooting ending instruction;
the response unit is used for responding to the shooting ending instruction;
the processing unit is used for expanding the target shooting objects according to a plurality of overlapping areas among the target shooting objects in the multi-frame shooting images to generate a multi-angle expanded image of the target shooting objects;
the acquisition unit includes: an acquisition subunit, a comparison subunit, a generation subunit,
the acquiring subunit is configured to acquire a sum of pixel values of the multiple frames of captured images;
the comparison subunit is used for comparing a preset pixel threshold value with the pixel value sum;
and the generating subunit is configured to generate the shooting end instruction when the sum of the pixel values is greater than or equal to the preset pixel threshold.
6. The terminal of claim 5, wherein the preset shooting parameters comprise a shooting angle and a shooting time interval; the photographing unit includes: a response subunit, a determination subunit, a capture subunit,
the response subunit is used for responding to the shooting instruction;
the determining subunit is configured to determine a reference point corresponding to the target shooting object, and determine a real-time angle between a horizontal line and a connection line between a preset shooting center and the reference point; wherein the reference point may be determined according to a selection operation;
and the shooting subunit is used for shooting the target shooting object at multiple angles according to the shooting time interval when the real-time angle is the same as the shooting angle.
7. The terminal of claim 5,
the obtaining subunit is further configured to, after comparing a preset pixel threshold value with the pixel value sum, obtain a plurality of coincidence proportions between a plurality of target photographic objects in the multi-frame photographic image when the pixel value sum is smaller than the preset pixel threshold value;
the comparison subunit is further configured to compare the multiple coincidence proportions with a preset coincidence threshold value respectively;
the generating subunit is further configured to generate the shooting end instruction when at least one of the multiple coincidence proportions is greater than or equal to a preset coincidence threshold.
8. The terminal of claim 5, wherein the processing unit comprises: a filling subunit, a setting subunit, a processing subunit,
the filling subunit is configured to perform background filling processing on an area outside the target photographic objects in the multiple frames of photographic images, and generate multiple frames of to-be-spliced images with the background filled;
the setting subunit is configured to set the overlapping area in two consecutive images to be stitched in the multiple frames of images to be stitched as an expansion boundary;
and the processing subunit is configured to perform expansion processing on a non-overlapping area of the multiple frames of images to be stitched according to the multiple expansion boundaries corresponding to the multiple frames of images to be stitched, so as to generate a multi-angle expansion image of the target shooting object.
CN201710200826.6A 2017-03-30 2017-03-30 Shooting method and terminal Active CN106973226B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710200826.6A CN106973226B (en) 2017-03-30 2017-03-30 Shooting method and terminal

Publications (2)

Publication Number Publication Date
CN106973226A CN106973226A (en) 2017-07-21
CN106973226B true CN106973226B (en) 2020-01-24

Family

ID=59337188

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710200826.6A Active CN106973226B (en) 2017-03-30 2017-03-30 Shooting method and terminal

Country Status (1)

Country Link
CN (1) CN106973226B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108163203B (en) * 2017-12-31 2020-10-13 深圳市道通智能航空技术有限公司 Shooting control method and device and aircraft
CN109302511A (en) * 2018-09-26 2019-02-01 努比亚技术有限公司 A kind of image capturing method, device and computer readable storage medium
TWI747039B (en) * 2019-09-02 2021-11-21 財團法人印刷創新科技硏究發展中心 Multi-angle mobile electronic device spectrum shooting system and image display method
CN113240940A (en) * 2021-05-07 2021-08-10 恒大恒驰新能源汽车研究院(上海)有限公司 Automobile reminding monitoring method, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1242074A (en) * 1996-10-25 2000-01-19 威伍沃克斯股份有限公司 Method and apparatus for three-dimensional color scanning
CN102905079A (en) * 2012-10-16 2013-01-30 北京小米科技有限责任公司 Method, device and mobile terminal for panorama shooting
CN103377469A (en) * 2012-04-23 2013-10-30 宇龙计算机通信科技(深圳)有限公司 Terminal and image processing method
CN104660897A (en) * 2013-11-20 2015-05-27 浪潮乐金数字移动通信有限公司 Acquisition method of 360-degree panoramic image based on mobile terminal

Also Published As

Publication number Publication date
CN106973226A (en) 2017-07-21

Similar Documents

Publication Publication Date Title
CN106454121B (en) Double-camera shooting method and device
CN106909274B (en) Image display method and device
WO2018019124A1 (en) Image processing method and electronic device and storage medium
WO2017050115A1 (en) Image synthesis method
CN105468158B (en) Color adjustment method and mobile terminal
CN106713716B (en) Shooting control method and device for double cameras
CN107071263B (en) Image processing method and terminal
CN106413128B (en) Projection method and mobile terminal
CN106973226B (en) Shooting method and terminal
CN106302651B (en) Social picture sharing method and terminal with social picture sharing system
CN106657782B (en) Picture processing method and terminal
CN106911881B (en) Dynamic photo shooting device and method based on double cameras and terminal
CN106534553B (en) Mobile terminal and shooting method thereof
CN106131327B (en) Terminal and image acquisition method
CN106851114B (en) Photo display device, photo generation device, photo display method, photo generation method and terminal
CN106993134B (en) Image generation device and method and terminal
CN107018326B (en) Shooting method and device
CN106937056B (en) Focusing processing method and device for double cameras and mobile terminal
CN106980460B (en) Mobile terminal and image processing method
CN106455009B (en) Network searching device and method
CN105791541B (en) Screenshot method and mobile terminal
CN105722246B (en) Network speed superposition device and method
CN109168029B (en) Method, device and computer-readable storage medium for adjusting resolution
CN106791567B (en) Switching method and terminal
CN106990896B (en) Stereo photo display method and device based on double cameras and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20191225

Address after: 200335 building C20, Huangchao villa, No. 2388, Hongqiao Road, Changning District, Shanghai

Applicant after: Shanghai Mingdian Culture Communication Co., Ltd

Address before: 518000 Guangdong Province, Shenzhen high tech Zone of Nanshan District City, No. 9018 North Central Avenue's innovation building A, 6-8 layer, 10-11 layer, B layer, C District 6-10 District 6 floor

Applicant before: Nubian Technologies Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant