CN106131327B - Terminal and image acquisition method - Google Patents

Terminal and image acquisition method

Info

Publication number
CN106131327B
CN106131327B
Authority
CN
China
Prior art keywords
terminal
image
touch screen
information
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610511117.5A
Other languages
Chinese (zh)
Other versions
CN106131327A (en)
Inventor
赵欣 (Zhao Xin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Shengli Yinghua culture media Co., Ltd
Original Assignee
Ningbo Shengli Yinghua Culture Media Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Shengli Yinghua Culture Media Co Ltd filed Critical Ningbo Shengli Yinghua Culture Media Co Ltd
Priority to CN201610511117.5A priority Critical patent/CN106131327B/en
Publication of CN106131327A publication Critical patent/CN106131327A/en
Application granted granted Critical
Publication of CN106131327B publication Critical patent/CN106131327B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The invention discloses a terminal and an image acquisition method. The terminal comprises: two touch screens oppositely arranged on the two sides of the terminal, an image capturing device, and an interaction unit. The image capturing device is used for acquiring an image. The interaction unit is used for setting interaction information on one touch screen while the image capturing device acquires an image, and projecting the set interaction information onto the other touch screen for display. With the terminal and the image acquisition method provided by the invention, the interaction information set on one touch screen during image acquisition is projected onto the other touch screen for display, so that the photographed user can obtain the interaction information in real time, which effectively improves the user experience.

Description

Terminal and image acquisition method
Technical Field
The invention relates to the field of terminal application, in particular to a terminal and an image acquisition method.
Background
At present, most mobile phones have only one screen. When a user uses the rear camera to take a picture of a photographed user, the photographed user cannot see the content displayed on the screen and does not know whether the pose in the current picture meets the photographed user's requirements. When the user takes a self-portrait with the front camera, the image is not as clear as that of the rear camera, and making the front camera as clear as the rear camera would increase the cost.
Therefore, whether an existing single-screen mobile phone is used for self-portraits or for photographing others, the shooting means are monotonous, which affects the user experience.
Disclosure of Invention
The invention mainly aims to provide a terminal and an image acquisition method, so as to enrich the shooting means of existing terminals and improve the user experience.
In order to achieve the above object, the present invention provides a terminal, including: two touch screens oppositely arranged on the two sides of the terminal, an image capturing device, and an interaction unit;
the image capturing device is used for acquiring an image;
the interaction unit is used for setting interaction information on one touch screen when the image capture device carries out image acquisition, and projecting the set interaction information on the other touch screen for display.
As an improvement of the terminal of the present invention, the interaction unit includes a setting module and a projection module; the two touch screens are respectively a first touch screen and a second touch screen;
the setting module is specifically used for extracting face data of the portrait on a first touch screen displaying the portrait when the terminal collects the image of the portrait, enlarging the extracted face data by a preset multiple, and setting the enlarged face data as interaction information;
when the terminal collects images, extracting time data on a first touch screen displaying the images, and setting the extracted time data as interactive information; and/or
When the terminal collects images, materials are selected from a preset material library on a first touch screen displaying the images, or input data information is received from a preset interaction frame, and the selected materials or the received data information are set as interaction information;
and the projection module is used for projecting the set interaction information to a second touch screen for display.
As a further improvement of the terminal of the present invention, the material includes pictures, animation, music and back-decorated scene patterns;
the input data information includes input text, voice and pictures.
As another improvement of the terminal of the present invention, the terminal further comprises a display control unit for displaying the image captured by the image capturing device on one touch screen and mapping the captured image onto another touch screen.
As a further improvement of the terminal of the present invention, the terminal further includes two image control units respectively disposed on the two touch screens and configured to control the image capturing device to perform image capturing.
In addition, to achieve the above object, the present invention further provides an image capturing method for a terminal having two touch screens, including:
when the terminal collects images, setting interaction information on one touch screen;
and projecting the set interactive information to another touch screen for display.
As an improvement of the method of the present invention, the step of setting the interaction information on one of the touch screens when the terminal performs image acquisition includes:
when the terminal collects the portrait image, extracting the face data of the portrait on a touch screen displaying the portrait image, enlarging the extracted face data by a preset multiple, and setting the enlarged face data as interaction information;
when the terminal collects images, extracting time data on the touch screen displaying the images, and setting the extracted time data as interactive information; and/or
When the terminal collects images, materials are selected from a preset material library on the touch screen displaying the images, or input data information is received from a preset interaction frame, and the selected materials or the received data information are set as interaction information.
As a further improvement of the method of the present invention, the material includes pictures, animation, music and back-decorated scene patterns;
the input data information includes input text, voice and pictures.
As another improvement of the method of the present invention, the step of setting the interaction information on one of the touch screens when the terminal performs image acquisition further includes:
displaying the image captured by the image capturing device on one touch screen, and mapping the captured image on another touch screen.
As a further improvement of the method of the present invention, the method further comprises:
two image control units are respectively arranged on the two touch screens;
and controlling the image capturing device to perform image acquisition through any one image control unit.
According to the terminal and the image acquisition method provided by the invention, when an image is acquired, interaction information is set on one touch screen and the set interaction information is projected onto the other touch screen for display, so that the photographed user can obtain the interaction information in real time; the interaction process is interesting, convenient and fast, and the user experience is effectively improved.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an alternative mobile terminal for implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
fig. 3 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 4 is a flowchart of an image acquisition method of a terminal in an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the explanation of the present invention, and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a navigation device, and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiment of the present invention can be applied to a fixed type terminal in addition to elements particularly used for moving purposes.
Fig. 1 is a schematic hardware configuration of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an a/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc. Fig. 1 illustrates a mobile terminal having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. Elements of the mobile terminal will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like. Also, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112. The broadcast signal may exist in various forms, for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of digital video broadcasting-handheld (DVB-H), and the like. The broadcast receiving module 111 may receive a signal broadcast by using various types of broadcasting systems. In particular, the broadcast receiving module 111 may receive digital broadcasting by using a digital broadcasting system such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the data broadcasting system of media forward link only (MediaFLO®), integrated services digital broadcasting-terrestrial (ISDB-T), and the like. The broadcast receiving module 111 may be constructed to be suitable for various broadcasting systems that provide broadcast signals as well as the above-mentioned digital broadcasting systems. The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access of the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technology to which the module relates may include WLAN (wireless LAN) (Wi-Fi), Wibro (wireless broadband), Wimax (worldwide interoperability for microwave access), HSDPA (high speed downlink packet access), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system). According to the current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current position information in real time.
The a/V input unit 120 is used to receive an audio or video signal. The a/V input unit 120 may include a camera 121 and a microphone 122, and the camera 121 processes image data of still pictures or video obtained by an image capturing apparatus 1210 (not shown in the drawings) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the cameras 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 may receive sounds (audio data) via the microphone in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. The processed audio (voice) data may be converted into a format output transmittable to a mobile communication base station via the mobile communication module 112 in case of a phone call mode. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, the touch screen 1510 (not shown in the drawing) may be formed, and of course, two touch screens 1510, 1511 (not shown in the drawing) may be used in the mobile terminal.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 1410 as will be described below in connection with a touch screen.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, an interaction unit 154, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 may provide an output in the form of vibration, and when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to inform the user thereof. By providing such a tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide an output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The interaction unit 154 may provide interaction information on two touch screens.
The memory 160 may store software programs and the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, and the like) that has been or will be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, and the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC280 is also configured to interface with a BSC275, which may be coupled to the base station 270 via a backhaul. The backhaul may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS270 may serve one or more sectors (or regions), each sector covered by a multi-directional antenna or an antenna pointing in a particular direction being radially distant from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular frequency spectrum (e.g., 1.25MHz,5MHz, etc.).
The intersection of partitions with frequency allocations may be referred to as a CDMA channel. The BS270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology. In such a case, the term "base station" may be used to generically refer to a single BSC275 and at least one BS 270. The base stations may also be referred to as "cells". Alternatively, each sector of a particular BS270 may be referred to as a plurality of cell sites.
As shown in fig. 2, a Broadcast Transmitter (BT)295 transmits a broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 assists in locating at least one of the plurality of mobile terminals 100.
In fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in fig. 1 is generally configured to cooperate with satellites 300 to obtain desired positioning information. Other techniques that can track the location of the mobile terminal may be used instead of or in addition to GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BS270 receives reverse link signals from various mobile terminals 100. The mobile terminal 100 is generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular base station 270 is processed within the particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC280, the MSC interfaces with the BSCs 275, and the BSCs 275 accordingly control the BS270 to transmit forward link signals to the mobile terminal 100.
Based on the above-described mobile terminal hardware structure and communication system, various embodiments of the present invention are proposed.
As shown in fig. 3, a first embodiment of the present invention proposes a terminal, which includes: two touch screens 1510 and 1511, an image capturing device 1210, and an interaction unit 154, which are disposed opposite to both sides of a side of the terminal;
the image capturing device 1210 is used for acquiring images;
the interaction unit is used for setting interaction information on one touch screen when the image capture device carries out image acquisition, and projecting the set interaction information on the other touch screen for display.
That is, compared with a related-art terminal having a single touch screen, in the terminal of this embodiment the rear case of the related-art terminal is also replaced with a touch screen.
The terminal in this embodiment exploits the dual-screen nature of the device and is designed around a pain-point scenario in image acquisition (where the acquired image data may be a still picture or a video). The two screens are used for front-and-back interaction: when an image is acquired, interaction information is set on one touch screen through the interaction unit, and the set interaction information is projected onto the other touch screen for display, so that the photographed user can obtain the interaction information in real time. The interaction process is interesting, convenient and fast, and the user experience is effectively improved.
For example, in the terminal according to the embodiment of the present invention, a dual-sided display photographing mode may be set, and when the interaction unit detects that the dual-sided display photographing mode is enabled, interaction information set on a current touch screen by a user is received, and the set interaction information is projected onto another touch screen to be displayed.
Specifically, on the current screen, the face in the central area of the screen is recognized by face recognition technology, the face is enlarged to nearly the size of the screen (whether there is one face, two faces or several faces, they can be enlarged to a suitable size according to their number), and the enlarged face (the set interaction information) is projected onto the back screen by mapping.
In this way, the photographed user can see more conveniently and clearly on the back screen whether he or she looks right; of course, the enlarged-face display function can also be left disabled.
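As a rough illustration of this enlargement step, the following Kotlin sketch scales the union of the detected face rectangles toward the rear screen size. The FaceRect and Screen types, the enlargeFacesForRearScreen function and the 0.9 fill ratio are assumptions made for this example, not details taken from the patent.

```kotlin
// Illustrative sketch only: compute a zoom factor and crop region so that the
// detected faces fill most of the rear screen. Types and the fill ratio are
// assumptions, not the patent's own definitions.

data class FaceRect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    val width: Int get() = right - left
    val height: Int get() = bottom - top
}

data class Screen(val width: Int, val height: Int)

/**
 * Returns a zoom factor and the crop rectangle (the union of the detected face
 * bounds) that enlarges the faces to roughly [fillRatio] of the rear screen.
 * With several faces the union is used, so the zoom adapts to their number,
 * as the description above suggests.
 */
fun enlargeFacesForRearScreen(
    faces: List<FaceRect>,
    rearScreen: Screen,
    fillRatio: Double = 0.9
): Pair<Double, FaceRect>? {
    if (faces.isEmpty()) return null
    val union = FaceRect(
        left = faces.minOf { it.left },
        top = faces.minOf { it.top },
        right = faces.maxOf { it.right },
        bottom = faces.maxOf { it.bottom }
    )
    val zoom = minOf(
        rearScreen.width * fillRatio / union.width,
        rearScreen.height * fillRatio / union.height
    )
    return zoom to union
}

fun main() {
    val faces = listOf(FaceRect(420, 300, 660, 560), FaceRect(700, 320, 900, 540))
    val (zoom, crop) = enlargeFacesForRearScreen(faces, Screen(1080, 1920))!!
    println("zoom=%.2f, crop=%s".format(zoom, crop))
}
```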
For example, in the terminal according to the embodiment of the present invention, a timed shooting, delayed (time-lapse) shooting or video recording mode may be set, and when the interaction unit detects that such a mode is enabled, the time-related content of that mode is extracted and displayed on the other screen.
Specifically, for a countdown shot of 3, 2, 1, large characters with strong black-and-white contrast can be displayed on the back screen to prompt the photographed person;
or the elapsed recording time can be shown, for example that 23 minutes and 15 seconds have been recorded;
or the progress of the time-lapse photography can be presented, for example that 567 time-lapse pictures have been taken over an accumulated 3 hours and 14 minutes.
Therefore, even if the front screen is turned off during recording or shooting to save power, this content can still be shown on the back screen (the current status can be conveyed by a small amount of digital information displayed on an AMOLED screen or an electronic ink screen), consuming only a small amount of power.
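A minimal sketch of this time-related interaction information, assuming a hypothetical TimeInfo type and rearScreenText formatter that the patent itself does not define, might look as follows:

```kotlin
// Illustrative sketch: the three kinds of time information mentioned above,
// rendered as short high-contrast strings for the rear screen.

sealed class TimeInfo {
    data class Countdown(val secondsLeft: Int) : TimeInfo()
    data class Recording(val elapsedSeconds: Long) : TimeInfo()
    data class TimeLapse(val framesTaken: Int, val elapsedSeconds: Long) : TimeInfo()
}

fun rearScreenText(info: TimeInfo): String = when (info) {
    is TimeInfo.Countdown -> info.secondsLeft.toString()            // "3", "2", "1"
    is TimeInfo.Recording -> {
        val m = info.elapsedSeconds / 60
        val s = info.elapsedSeconds % 60
        "REC %d:%02d".format(m, s)                                   // e.g. "REC 23:15"
    }
    is TimeInfo.TimeLapse -> {
        val h = info.elapsedSeconds / 3600
        val m = (info.elapsedSeconds % 3600) / 60
        "%d frames, %dh %dm".format(info.framesTaken, h, m)          // e.g. "567 frames, 3h 14m"
    }
}

fun main() {
    println(rearScreenText(TimeInfo.Countdown(3)))
    println(rearScreenText(TimeInfo.Recording(23L * 60 + 15)))
    println(rearScreenText(TimeInfo.TimeLapse(567, 3L * 3600 + 14 * 60)))
}
```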
For another example, in the terminal according to the embodiment of the present invention, an interactive shooting mode may be set. When the interaction unit detects that the interactive shooting mode is enabled, an interaction frame appears on the screen the user is operating, and the user only needs to manually select a material from the material library or draw related content (i.e., a hand-drawn picture) in the interaction frame.
The content is projected onto the other screen just like a chat dialog box; it can be static or dynamic, and a video or a song can also be selected.
Interactive shooting can be used, for example, for photographing children or animals: the picture on the back screen attracts the subject's attention, so that wonderful moments can be captured.
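The following Kotlin sketch illustrates this interactive shooting mode under stated assumptions: the Material, RearScreen and InteractiveShootingMode names are invented for the example, and the rear screen is stubbed with a println rather than a real display.

```kotlin
// Illustrative sketch: pick a library material or submit a hand drawing on the
// front screen, and push it to the rear screen like a chat bubble.

sealed class Material {
    data class Picture(val name: String) : Material()
    data class Animation(val name: String) : Material()
    data class Music(val name: String) : Material()
    data class HandDrawing(val strokes: List<Pair<Int, Int>>) : Material()
}

interface RearScreen {
    fun show(material: Material)
}

class InteractiveShootingMode(
    private val library: List<Material>,
    private val rearScreen: RearScreen
) {
    /** Projects a library item chosen on the front screen onto the rear screen. */
    fun pickFromLibrary(index: Int) = rearScreen.show(library[index])

    /** Projects content drawn in the interaction frame onto the rear screen. */
    fun submitDrawing(strokes: List<Pair<Int, Int>>) =
        rearScreen.show(Material.HandDrawing(strokes))
}

fun main() {
    val rear = object : RearScreen {
        override fun show(material: Material) = println("rear screen shows: $material")
    }
    val mode = InteractiveShootingMode(
        library = listOf(Material.Animation("dancing cat"), Material.Music("lullaby")),
        rearScreen = rear
    )
    mode.pickFromLibrary(0)
    mode.submitDrawing(listOf(10 to 10, 20 to 25))
}
```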
For another example, in the terminal according to the embodiment of the present invention, a personality shooting mode may be set. When the interaction unit detects that the personality shooting mode is enabled, personalized back-decoration scene patterns appear on the screen the user is operating for the user to choose from, and the selected patterns are displayed on the other screen during shooting, for example various styles of camera front views, mobile phone desktop wallpaper, or some personalized large characters such as "don't move" or "strike a POSE", which makes the mobile phone more fun to use.
In the embodiment of the present invention, the interaction unit may further include two interaction modules with completely the same function, and the two interaction modules are respectively disposed on the two touch screens.
In another embodiment of the terminal of the present invention, the terminal includes: two touch screens 1510 and 1511, an image capturing device 1210, and an interaction unit 154, which are disposed opposite to both sides of a side of the terminal;
the image capturing device 1210 is used for acquiring images;
the interaction unit 154 is configured to set interaction information on one of the touch screens when the image capturing apparatus performs image capturing, and project the set interaction information onto the other touch screen for display.
The interaction unit comprises a setting module and a projection module; the two touch screens are respectively a first touch screen 1510 and a second touch screen 1511;
the setting module is specifically configured to, when the terminal performs image acquisition of a portrait, extract face data of the portrait on a first touch screen 1510 that displays the portrait, amplify the extracted face data to a preset multiple, and set the amplified face data as interaction information;
when the terminal performs image acquisition, extracting time data on a first touch screen 1510 displaying an image, and setting the extracted time data as interaction information; and/or
When the terminal performs image acquisition, on a first touch screen 1510 displaying an image, a material is selected from a preset material library, or input data information is received from a preset interaction frame, and the selected material or the received data information is set as interaction information;
the projection module is configured to project the set interaction information onto a second touch screen 1511 for display.
In the embodiment of the invention, setting the enlarged face data as the interaction information allows the photographed user to see clearly on the screen, during image acquisition, whether he or she looks right, which further increases the fun of image acquisition and further improves the user experience.
By extracting time data on the first touch screen 1510 displaying the image and setting the extracted time data as interaction information, the photographed user can know the time information during image acquisition, which further improves the user experience.
By selecting a material from the preset material library, or receiving input data information from the preset interaction frame, and setting the selected material or the received data information as interaction information, image acquisition becomes more interesting and the user experience is further improved.
Further, the material includes pictures, animations, music and back decoration scene patterns;
the input data information includes input text, voice and pictures.
In another embodiment of the terminal of the present invention, the terminal includes: two touch screens 1510 and 1511 disposed oppositely on the two sides of the terminal, an image capturing device 1210, an interaction unit 154, and a display control unit;
the image capturing device 1210 is used for acquiring images;
the interaction unit 154 is configured to set interaction information on one touch screen when the image capture device performs image acquisition, and project the set interaction information onto the other touch screen for display;
and the display control unit is used for displaying the image acquired by the image capturing device on one touch screen and mapping the acquired image on the other touch screen.
Of course, in this embodiment, the interaction unit may also include the setting module and the projection module in the previous embodiment;
When a self-portrait is taken with the front camera of a prior-art single-screen terminal, the image is not as clear as that of the rear camera, and making the front camera as clear as the rear camera would increase the cost. In the embodiment of the invention, the display control unit displays the image acquired by the image capturing device on one touch screen and maps the acquired image onto the other touch screen, so only one image capturing device needs to be arranged in the terminal to overcome this shortcoming of the prior art, which effectively saves cost.
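A minimal sketch of this mirroring behaviour, assuming hypothetical Frame and Display types rather than any real camera API, could look like this:

```kotlin
// Illustrative sketch: every captured frame is shown on the operator's screen
// and simultaneously mapped onto the other screen, so a single image capturing
// device can serve both the photographer and the photographed user.

data class Frame(val index: Int)

interface Display {
    fun render(frame: Frame)
}

class DisplayControlUnit(
    private val operatorScreen: Display,
    private val mirroredScreen: Display
) {
    fun onFrameCaptured(frame: Frame) {
        operatorScreen.render(frame)   // preview for the photographer
        mirroredScreen.render(frame)   // mapped copy for the photographed user
    }
}

fun main() {
    val front = object : Display { override fun render(frame: Frame) = println("front: $frame") }
    val rear = object : Display { override fun render(frame: Frame) = println("rear:  $frame") }
    val unit = DisplayControlUnit(front, rear)
    (1..3).forEach { unit.onFrameCaptured(Frame(it)) }
}
```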
In still another embodiment of the terminal of the present invention, the terminal includes: two touch screens 1510 and 1511 disposed oppositely on the two sides of the terminal, an image capturing device 1210, an interaction unit 154, and two image control units;
the image capturing device 1210 is used for acquiring images;
the interaction unit 154 is configured to set interaction information on one touch screen when the image capture device performs image acquisition, and project the set interaction information onto the other touch screen for display;
and the two image control units are respectively arranged on the two touch screens and are used for controlling the image capturing device to acquire images.
Specifically, an image control unit may include:
the camera touch module, used for controlling the image capturing device to capture a still photo after receiving a touch operation from the user;
the video touch module, used for controlling the image capturing device to capture dynamic image (video) data after receiving a touch operation from the user;
and the flash control module, used for switching the flash lamp on and off, as sketched below.
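One image control unit might then be sketched in Kotlin as follows; the ImageCaptureDevice and ImageControlUnit names are invented for this example, and the device is stubbed with println calls:

```kotlin
// Illustrative sketch: one image control unit per touch screen, each driving
// the same single image capturing device.

interface ImageCaptureDevice {
    fun captureStill()
    fun startVideo()
    fun stopVideo()
    fun setFlash(enabled: Boolean)
}

class ImageControlUnit(private val device: ImageCaptureDevice) {
    private var recording = false

    /** Camera touch module: a tap captures a still photo. */
    fun onCameraTouch() = device.captureStill()

    /** Video touch module: a tap toggles video recording. */
    fun onVideoTouch() {
        recording = !recording
        if (recording) device.startVideo() else device.stopVideo()
    }

    /** Flash control module: switch the flash lamp on or off. */
    fun onFlashToggle(enabled: Boolean) = device.setFlash(enabled)
}

fun main() {
    val device = object : ImageCaptureDevice {
        override fun captureStill() = println("still captured")
        override fun startVideo() = println("video started")
        override fun stopVideo() = println("video stopped")
        override fun setFlash(enabled: Boolean) = println("flash: $enabled")
    }
    // Either screen's control unit can drive the one shared device.
    val frontUnit = ImageControlUnit(device)
    val rearUnit = ImageControlUnit(device)
    frontUnit.onFlashToggle(true)
    rearUnit.onCameraTouch()
}
```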
Of course, in the embodiment of the present invention, the terminal may further include a display control unit, and the interaction unit 154 may also include a setting module and a projection module.
With the two image control units, the terminal in the embodiment of the invention needs only one image capturing device to achieve high-quality photographing of others as well as self-portraits, which saves the manufacturing cost of the terminal and improves the user experience.
The invention further provides an image acquisition method.
Fig. 4 shows an image acquisition method for a terminal having two touch screens; as shown in fig. 4, the method includes:
s401, setting interaction information on one touch screen when the terminal collects images;
s402, projecting the set interactive information to another touch screen for display.
The method in this embodiment exploits the dual-screen nature of the device and is designed around a pain-point scenario in image acquisition. The two screens are used for front-and-back interaction: interaction information is set on one touch screen and the set interaction information is projected onto the other touch screen for display, so that the photographed user can obtain the interaction information in real time. The interaction process is interesting, convenient and fast, and the user experience is effectively improved.
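Read as code, steps S401 and S402 are simply two stages executed during capture; the sketch below uses hypothetical function names (setInteractionInfo, projectToOtherScreen) that are not defined by the patent:

```kotlin
// Illustrative sketch of the two method steps.

fun setInteractionInfo(frontScreenContent: String): String =
    "interaction info derived from: $frontScreenContent"   // S401

fun projectToOtherScreen(info: String) =
    println("rear screen <- $info")                         // S402

fun main() {
    val info = setInteractionInfo("live preview frame")
    projectToOtherScreen(info)
}
```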
In another embodiment of the method of the present invention, an image acquisition method comprises:
s401, setting interaction information on one touch screen when the terminal collects images;
s402, projecting the set interactive information to another touch screen for display.
When the terminal collects images, the step of setting interactive information on one touch screen comprises the following steps:
when the terminal collects the portrait image, extracting the face data of the portrait on a touch screen displaying the portrait image, enlarging the extracted face data by a preset multiple, and setting the enlarged face data as interaction information;
when the terminal collects images, extracting time data on the touch screen displaying the images, and setting the extracted time data as interactive information; and/or
When the terminal collects images, materials are selected from a preset material library on the touch screen displaying the images, or input data information is received from a preset interaction frame, and the selected materials or the received data information are set as interaction information.
Wherein the material comprises pictures, animations, music and back decoration scene patterns;
the input data information includes input text, voice and pictures.
In yet another embodiment of the method of the present invention, an image acquisition method comprises:
s401, setting interaction information on one touch screen when the terminal collects images;
s402, projecting the set interactive information to another touch screen for display.
When the terminal acquires an image, the step of setting interactive information on one of the touch screens may further include:
displaying the image captured by the image capturing device on one touch screen, and mapping the captured image on another touch screen.
In yet another embodiment of the method of the present invention, an image acquisition method comprises:
s400, respectively arranging two image control units on the two touch screens; controlling the image capturing device to acquire images through any one image control unit;
s401, setting interaction information on one touch screen when the terminal collects images;
s402, projecting the set interactive information to another touch screen for display.
The method provided by the embodiment of the invention makes image acquisition on the terminal more interesting and effectively improves the user experience.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (8)

1. A terminal, characterized in that the terminal comprises: two touch screens oppositely arranged on the two sides of the terminal, an image capturing device and an interaction unit, wherein an electronic ink screen is arranged on the back surface of the terminal;
the image capturing device is used for acquiring an image;
the interaction unit is used for setting interaction information on one touch screen when the image capture device carries out image acquisition, and projecting the set interaction information onto the other touch screen for display;
wherein, when the interaction unit detects that an interactive shooting mode is enabled, an interaction frame appears on the screen used by the user, the user selects a material from a material library or draws related content in the interaction frame as the interaction information, and the interaction information is projected onto the other display screen; the interaction unit comprises a setting module and a projection module; the two touch screens are respectively a first touch screen and a second touch screen;
the setting module is specifically used for extracting face data of the portrait on a first touch screen displaying the portrait when the terminal collects the image of the portrait, enlarging the extracted face data by a preset multiple, and setting the enlarged face data as interaction information;
when the terminal collects images, extracting time data on a first touch screen displaying the images, and setting the extracted time data as interactive information;
and the projection module is used for projecting the set interaction information to a second touch screen for display.
2. The terminal of claim 1, wherein the material includes pictures, animations, music and back-decoration scene patterns;
the input data information includes input text, voice and pictures.
3. The terminal according to any of claims 1-2, wherein the terminal further comprises a display control unit for displaying the image captured by the image capturing device on one touch screen and mapping the captured image onto another touch screen.
4. The terminal according to claim 3, further comprising two image control units respectively disposed on the two touch screens and configured to control the image capturing device to perform image capturing.
5. An image acquisition method is used for a terminal with two touch screens, an electronic ink screen is arranged on the back of the terminal, and the method comprises the following steps:
when the terminal collects images, setting interaction information on one touch screen;
projecting the set interactive information to another touch screen for display;
when the interactive unit detects that the interactive shooting mode is started, an interactive frame appears on a screen used by a user, the user selects a material in a material library or draws related content in the interactive frame as interactive information, and the interactive information is projected to another display screen;
when the terminal collects images, the step of setting interactive information on one of the touch screens comprises the following steps:
when the terminal collects the portrait image, extracting the face data of the portrait on a touch screen displaying the portrait image, enlarging the extracted face data by a preset multiple, and setting the enlarged face data as interaction information;
and when the terminal acquires an image, extracting time data on the touch screen displaying the image, and setting the extracted time data as interactive information.
6. The method of claim 5, wherein the material includes pictures, animations, music and back-decoration scene patterns;
the input data information includes input text, voice and pictures.
7. The method according to any one of claims 5 to 6, wherein the step of setting interactive information on one of the touch screens when the terminal performs image acquisition further comprises:
displaying the image captured by the image capturing device on one touch screen, and mapping the captured image on another touch screen.
8. The method of claim 7, wherein the method further comprises:
two image control units are respectively arranged on the two touch screens;
and controlling the image capturing device to perform image acquisition through any one image control unit.
CN201610511117.5A 2016-06-30 2016-06-30 Terminal and image acquisition method Active CN106131327B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610511117.5A CN106131327B (en) 2016-06-30 2016-06-30 Terminal and image acquisition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610511117.5A CN106131327B (en) 2016-06-30 2016-06-30 Terminal and image acquisition method

Publications (2)

Publication Number Publication Date
CN106131327A CN106131327A (en) 2016-11-16
CN106131327B true CN106131327B (en) 2020-02-14

Family

ID=57469079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610511117.5A Active CN106131327B (en) 2016-06-30 2016-06-30 Terminal and image acquisition method

Country Status (1)

Country Link
CN (1) CN106131327B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018145375A1 (en) * 2017-02-13 2018-08-16 广东美的制冷设备有限公司 Touch projection device and control method therefor, air conditioner controller and air conditioner
CN108234875B (en) * 2018-01-15 2021-02-02 Oppo广东移动通信有限公司 Shooting display method and device, mobile terminal and storage medium
CN108900765A (en) * 2018-06-12 2018-11-27 努比亚技术有限公司 A kind of shooting based reminding method, mobile terminal and computer readable storage medium
CN114237530A (en) * 2020-01-21 2022-03-25 华为技术有限公司 Display method and related device of folding screen
CN112702636B (en) * 2020-10-10 2022-12-20 广州奥缔飞梭数字科技有限公司 Intelligent digital display system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203057309U (en) * 2012-12-21 2013-07-10 天津三星光电子有限公司 Digital camera
CN104917974A (en) * 2015-05-13 2015-09-16 青岛海信移动通信技术股份有限公司 Long exposure method and camera device
CN105120180A (en) * 2015-09-24 2015-12-02 广东欧珀移动通信有限公司 Method and device for self-shooting based on rear camera, and mobile terminal
CN105491284A (en) * 2015-11-30 2016-04-13 小米科技有限责任公司 Preview image display method and device
CN105635409A (en) * 2014-10-28 2016-06-01 西安景行数创信息科技有限公司 Photographing method of mobile terminal
CN105677112A (en) * 2016-02-24 2016-06-15 上海天马微电子有限公司 Touch display panel and touch display device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9252547B2 (en) * 2014-07-04 2016-02-02 Cheng Uei Precision Industry Co., Ltd. Universal serial bus connector
CN104539844A (en) * 2014-12-18 2015-04-22 深圳市金立通信设备有限公司 Terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203057309U (en) * 2012-12-21 2013-07-10 天津三星光电子有限公司 Digital camera
CN105635409A (en) * 2014-10-28 2016-06-01 西安景行数创信息科技有限公司 Photographing method of mobile terminal
CN104917974A (en) * 2015-05-13 2015-09-16 青岛海信移动通信技术股份有限公司 Long exposure method and camera device
CN105120180A (en) * 2015-09-24 2015-12-02 广东欧珀移动通信有限公司 Method and device for self-shooting based on rear camera, and mobile terminal
CN105491284A (en) * 2015-11-30 2016-04-13 小米科技有限责任公司 Preview image display method and device
CN105677112A (en) * 2016-02-24 2016-06-15 上海天马微电子有限公司 Touch display panel and touch display device

Also Published As

Publication number Publication date
CN106131327A (en) 2016-11-16

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TA01 Transfer of patent application right

Effective date of registration: 20200121

Address after: 315000 room 210-047, floor 2, building 003, no.750 Chuangyuan Road, high tech Zone, Ningbo City, Zhejiang Province

Applicant after: Ningbo Shengli Yinghua culture media Co., Ltd

Address before: 518057 Guangdong Province, Shenzhen high tech Zone of Nanshan District City, No. 9018 North Central Avenue's innovation building A, 6-8 layer, 10-11 layer, B layer, C District 6-10 District 6 floor

Applicant before: Nubian Technologies Ltd.

TA01 Transfer of patent application right