CN106980460B - Mobile terminal and image processing method - Google Patents

Info

Publication number
CN106980460B
CN106980460B
Authority
CN
China
Prior art keywords
image
pixel
color information
touch
mobile terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201710211154.9A
Other languages
Chinese (zh)
Other versions
CN106980460A (en)
Inventor
陈盼
王猛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yancheng Zhenghao Network Technology Co., Ltd.
Original Assignee
Yancheng Zhenghao Network Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yancheng Zhenghao Network Technology Co Ltd filed Critical Yancheng Zhenghao Network Technology Co Ltd
Priority to CN201710211154.9A priority Critical patent/CN106980460B/en
Publication of CN106980460A publication Critical patent/CN106980460A/en
Application granted granted Critical
Publication of CN106980460B publication Critical patent/CN106980460B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72427 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

The present invention provides a mobile terminal comprising a control module and a processing module. The control module continuously acquires a plurality of images, detects a first image processing instruction, selects a basic image from the plurality of images according to the first image processing instruction, and acquires color information of the basic image. The processing module detects a second image processing instruction, selects an image to be processed from the plurality of images according to the second image processing instruction, and applies the acquired color information to the image to be processed. The invention also provides an image processing method. The mobile terminal and the image processing method provided by the invention can quickly adjust the colors of a plurality of continuous images, improving image processing efficiency and accuracy.

Description

Mobile terminal and image processing method
Technical Field
The present invention relates to the field of communications, and in particular, to a mobile terminal and an image processing method.
Background
The mobile phone has become an indispensable part of daily life. Current mobile terminals provide a photographing function, which includes continuous (burst) shooting. However, after a plurality of photos are captured in succession, one photo must be selected from among them and edited on its own, and such photo editing consumes considerable resources, takes a long time, and is inefficient.
Disclosure of Invention
In view of this, the present invention provides a mobile terminal and an image processing method that can perform image processing quickly and accurately while consuming fewer resources.
The present invention provides a mobile terminal, comprising:
a control module, configured to continuously acquire a plurality of images, detect a first image processing instruction, select a basic image according to the first image processing instruction, and acquire color information of the basic image; and
a processing module, configured to detect a second image processing instruction, select an image to be processed from the plurality of images according to the second image processing instruction, and apply the acquired color information to the image to be processed.
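For illustration only, the control-module/processing-module flow described above can be sketched in Python. The class names, the list-of-rows image model, and the RGB-triple pixels are assumptions of this sketch rather than details disclosed by the invention:

```python
# Illustrative sketch only: an "image" is modeled as a list of pixel rows,
# each row a list of (R, G, B) tuples; class and method names are invented.

class ControlModule:
    def __init__(self, images):
        self.images = images                    # continuously acquired images

    def select_basic_image(self, index):
        """First image processing instruction: select the basic image."""
        return self.images[index]

    @staticmethod
    def get_color_info(image):
        """Acquire the color information of the basic image (here, a copy)."""
        return [row[:] for row in image]


class ProcessingModule:
    def __init__(self, images):
        self.images = images

    def apply_color_info(self, index, color_info):
        """Second instruction: apply the color information to a target image."""
        target = self.images[index]
        for r, row in enumerate(color_info):
            target[r] = row[:]
        return target


# Usage: a burst of two 1x3 images; the first image's colors are applied
# to the second.
burst = [
    [[(255, 0, 0), (0, 255, 0), (0, 0, 255)]],
    [[(10, 10, 10), (20, 20, 20), (30, 30, 30)]],
]
control = ControlModule(burst)
color_info = ControlModule.get_color_info(control.select_basic_image(0))
processed = ProcessingModule(burst).apply_color_info(1, color_info)
```

Here the "color information" is simply a copy of the basic image's pixel rows; the invention leaves the concrete representation open.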
Optionally, the mobile terminal includes an array of infrared sensing components, each infrared sensing component including a touch sensor and one or more pixel points, the pixel points being arranged in sequence into pixel rows. The control module is further configured to: control the touch sensor to detect a first touch signal while the basic image is displayed; determine a corresponding first touch point according to the first touch signal; determine a first pixel point corresponding to the first touch point; determine a first pixel row where the first pixel point is located; activate the infrared sensing components corresponding to all pixel points in the first pixel row; acquire the color information of the pixel points of the activated infrared sensing components; and record the color information of all pixel points in the activated first pixel row as first color information;
the processing module is further configured to control the touch sensor to detect a second touch signal when the to-be-processed image is displayed, determine a corresponding second touch point according to the second touch signal, determine whether the positions of the first touch point and the second touch point are the same, determine a second pixel point corresponding to the second touch point when the positions of the first touch point and the second touch point are the same, determine a second pixel row where the second pixel point is located, and perform color adjustment on the second pixel row according to the first color information.
Optionally, the control module is further configured to: control the touch sensor to detect a third touch signal while the basic image is displayed; determine a corresponding first touch trajectory according to the third touch signal; determine a plurality of third pixel points corresponding to the first touch trajectory; determine the third pixel row in which each third pixel point is located; activate the infrared sensing components corresponding to all pixel points in the determined third pixel rows; acquire the color information of the pixel points of the activated infrared sensing components; and record the color information of all pixel points in the activated third pixel rows as second color information;
the processing module is further configured to control the touch sensor to detect a fourth touch signal when the to-be-processed image is displayed, determine a corresponding second touch trajectory according to the fourth touch signal, determine a plurality of fourth pixel points corresponding to the second touch trajectory, determine a fourth pixel row in which each fourth pixel point is located, determine whether screen positions of the fourth pixel row and the third pixel row are the same, query color information corresponding to a screen position in which the third pixel row is located from the second color information when the screen positions of the fourth pixel row and the third pixel row are the same, and perform color adjustment on the fourth pixel row according to the queried color information.
Optionally, the mobile terminal further includes:
a display module, configured to display the basic image and the image to be processed in sequence and at the same scale;
a judging module, configured to judge whether the difference between the color information values of the same screen pixel point, as displayed in the basic image and in the image to be processed, is within a preset range;
the processing module is further configured to, when that difference is within the preset range, adjust the image to be processed at that pixel point according to the color information value of the basic image.
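The preset-range comparison can be illustrated as follows; the per-channel absolute-difference test and the threshold value are assumptions of this sketch:

```python
# Sketch of the preset-range check: a target pixel is adjusted to the basic
# image's value only when the per-channel difference is within the range.
# The comparison rule and the threshold of 5 are invented for illustration.

def within_preset_range(a, b, preset_range):
    """True when every color channel differs by at most preset_range."""
    return all(abs(x - y) <= preset_range for x, y in zip(a, b))

def adjust_to_basic(basic, target, preset_range):
    """Walk the two images (displayed at the same scale) pixel by pixel."""
    return [
        [b if within_preset_range(b, t, preset_range) else t
         for b, t in zip(basic_row, target_row)]
        for basic_row, target_row in zip(basic, target)
    ]

basic = [[(100, 100, 100), (200, 0, 0)]]
target = [[(104, 98, 101), (90, 90, 90)]]
adjusted = adjust_to_basic(basic, target, preset_range=5)
# The first pixel differs by at most 4 per channel, so it snaps to the
# basic value; the second differs far more, so it is left unchanged.
```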
Optionally, the control module is further configured to number the acquired plurality of images and to acquire association information among the plurality of images;
the processing module is further configured to select the image to be processed from the plurality of images according to its number, determine the color information of the image to be processed according to the color information of the basic image and the association information among the plurality of images, and activate the corresponding pixel points on the screen according to the color information of the image to be processed.
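A sketch of the numbering-and-association idea; the association measure used here (the ratio of mean channel sums between two frames) is an invented placeholder, since the invention does not define how the association information is computed:

```python
# Sketch of number-and-associate color propagation; the association measure
# is an invented placeholder, not the patent's definition.

def number_images(images):
    """Number the continuously acquired images 0, 1, 2, ..."""
    return dict(enumerate(images))

def association(img_a, img_b):
    """Invented association info: ratio of the images' mean channel sums."""
    def mean_sum(img):
        pixels = [px for row in img for px in row]
        return sum(sum(px) for px in pixels) / len(pixels)
    return mean_sum(img_b) / mean_sum(img_a)

def propagate(basic, factor):
    """Derive the target's color info by scaling the basic image's colors."""
    return [[tuple(min(255, round(c * factor)) for c in px) for px in row]
            for row in basic]

numbered = number_images([
    [[(100, 100, 100)]],     # basic image, number 0
    [[(50, 50, 50)]],        # image to be processed, number 1
])
factor = association(numbered[0], numbered[1])
derived = propagate(numbered[0], factor)
```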
The invention also provides an image processing method, which is applied to the mobile terminal and comprises the following steps:
the mobile terminal continuously acquires a plurality of images, detects a first image processing instruction, selects a basic image according to the first image processing instruction, and acquires color information of the basic image; and
detects a second image processing instruction, selects an image to be processed from the plurality of images according to the second image processing instruction, and applies the acquired color information to the image to be processed.
Optionally, the mobile terminal includes an array of infrared sensing components, each infrared sensing component including a touch sensor and one or more pixel points, the pixel points being arranged in sequence into pixel rows, and the image processing method further includes:
when the basic image is displayed, the mobile terminal controls the touch sensor to detect a first touch signal, and determines a corresponding first touch point according to the first touch signal;
determining a first pixel point corresponding to the first touch point;
determining the first pixel row where the first pixel point is located, and activating the infrared sensing components corresponding to all pixel points in the first pixel row;
acquiring the color information of the pixel points of the activated infrared sensing components, and recording the color information of all pixel points in the activated first pixel row as first color information;
while the image to be processed is displayed, controlling, by the mobile terminal, the touch sensor to detect a second touch signal, and determining a corresponding second touch point according to the second touch signal;
determining whether the first touch point and the second touch point are at the same position;
when the first touch point and the second touch point are at the same position, determining a second pixel point corresponding to the second touch point; and
determining the second pixel row where the second pixel point is located, and performing color adjustment on the second pixel row according to the first color information.
Optionally, the image processing method further includes:
while the basic image is displayed, controlling, by the mobile terminal, the touch sensor to detect a third touch signal, and determining a corresponding first touch trajectory according to the third touch signal;
determining a plurality of third pixel points corresponding to the first touch trajectory;
determining the third pixel row in which each third pixel point is located, and activating the infrared sensing components corresponding to all pixel points in the determined third pixel rows;
acquiring the color information of the pixel points of the activated infrared sensing components, and recording the color information of all pixel points in the activated third pixel rows as second color information;
while the image to be processed is displayed, controlling, by the mobile terminal, the touch sensor to detect a fourth touch signal, and determining a corresponding second touch trajectory according to the fourth touch signal;
determining a plurality of fourth pixel points corresponding to the second touch trajectory;
determining the fourth pixel row in which each fourth pixel point is located, and determining whether the fourth pixel row and a third pixel row occupy the same screen position; and
when they occupy the same screen position, querying the second color information for the color information corresponding to the screen position of that third pixel row, and performing color adjustment on the fourth pixel row according to the queried color information.
Optionally, the image processing method further includes:
displaying the basic image and the image to be processed in sequence and at the same scale;
judging whether the difference between the color information values of the same screen pixel point, as displayed in the basic image and in the image to be processed, is within a preset range; and
when that difference is within the preset range, adjusting the image to be processed at that pixel point according to the color information value of the basic image.
Optionally, when acquiring a plurality of images, the image processing method further includes:
numbering the acquired plurality of images and acquiring association information among the plurality of images; and
selecting the image to be processed from the plurality of images according to its number, determining the color information of the image to be processed according to the color information of the basic image and the association information among the plurality of images, and activating the corresponding pixel points on the screen according to the color information of the image to be processed.
The mobile terminal and the image processing method provided by the invention can process other images according to the color information of one image in a plurality of continuously acquired images, thereby improving the accuracy and efficiency of processing continuous images.
Drawings
FIG. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
FIG. 3 is a functional block diagram of a mobile terminal according to a first embodiment of the present invention;
FIG. 4 is a functional block diagram of a mobile terminal according to a second embodiment of the present invention;
FIG. 5 is a schematic distribution diagram of an infrared sensing array of a mobile terminal according to a third embodiment of the present invention;
FIG. 6 is a schematic diagram of a pixel row distribution of a mobile terminal according to a fourth embodiment of the present invention;
FIG. 7 is a schematic diagram of a distribution of a plurality of continuous images of a mobile terminal according to a fifth embodiment of the present invention;
FIG. 8 is a schematic diagram of a first image processing screen of a mobile terminal according to a sixth embodiment of the present invention;
FIG. 9 is a schematic diagram of a second image processing screen of a mobile terminal according to a seventh embodiment of the present invention;
FIG. 10 is a schematic diagram of a third image processing screen of a mobile terminal according to an eighth embodiment of the present invention;
FIG. 11 is a schematic diagram of a fourth image processing screen of a mobile terminal according to a ninth embodiment of the present invention;
FIG. 12 is a flowchart of an image processing method according to a tenth embodiment of the present invention;
FIG. 13 is a flowchart of an image processing method according to an eleventh embodiment of the present invention;
FIG. 14 is a flowchart of an image processing method according to a twelfth embodiment of the present invention;
FIG. 15 is a flowchart of an image processing method according to a thirteenth embodiment of the present invention;
FIG. 16 is a flowchart of an image processing method according to a fourteenth embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
An alternative mobile terminal for implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are adopted only to facilitate the explanation of the present invention and have no specific meaning in themselves; thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), and a navigation device, as well as stationary terminals such as a digital TV and a desktop computer. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals, except for any elements particularly configured for mobile purposes.
Fig. 1 is a schematic diagram of a hardware structure of an optional mobile terminal for implementing various embodiments of the present invention.
The mobile terminal 10 may include, but is not limited to, a memory 20, a controller 30, a wireless communication unit 40, an output unit 50, an input unit 60, a camera 70, a microphone 71, an interface unit 80, and a power supply unit 90. FIG. 1 illustrates the mobile terminal 10 as having various components, but it is to be understood that not all illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal 10 will be described in detail below.
The wireless communication unit 40 typically includes one or more components that allow wireless communication between the mobile terminal 10 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module, a mobile communication module, a wireless internet module, a short-range communication module, and a location information module.
The broadcast receiving module receives a broadcast signal and/or broadcast-associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast-associated information, or a server that receives a previously generated broadcast signal and/or broadcast-associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a signal combined with a TV or radio broadcast signal. The broadcast-associated information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module. The broadcast signal may exist in various forms; for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H). The broadcast receiving module may receive broadcasts from various types of broadcasting systems; in particular, it may receive digital broadcasts from systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), Media Forward Link Only (MediaFLO), and Integrated Services Digital Broadcasting-Terrestrial (ISDB-T). The broadcast receiving module may be constructed to be suitable not only for the above-mentioned digital broadcasting systems but also for other broadcasting systems that provide broadcast signals.
Broadcast signals and/or broadcast-associated information received via this module may be stored in the memory 20 (or another type of storage medium).
The mobile communication module transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video or audio call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module supports wireless internet access of the mobile terminal and may be internally or externally coupled to the terminal. The wireless internet access technologies involved may include WLAN (Wireless LAN, e.g. Wi-Fi), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
The short-range communication module is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-Wideband (UWB), ZigBee™, and so on.
The location information module is a module for checking or acquiring the location of the mobile terminal; a typical example is the GPS (Global Positioning System) module. According to current technology, the GPS module calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately determining a three-dimensional current location in terms of longitude, latitude, and altitude. At present, the method for calculating position and time uses three satellites and corrects the error of the calculated position and time by using a further satellite. In addition, the GPS module can calculate speed information by continuously computing the current location in real time.
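As a simplified illustration of the triangulation step described above (in two dimensions, with invented coordinates; real GPS positioning works in three dimensions and must also correct the receiver clock bias):

```python
# 2D trilateration sketch: three known anchor positions plus measured
# distances pin down one location. All coordinates are invented.
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise yields two linear
    # equations in (x, y), solved here by Cramer's rule.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Anchors at (0, 0), (10, 0), (0, 10); distances measured from the
# (unknown) true position (3, 4).
pos = trilaterate((0, 0), 5.0,
                  (10, 0), math.hypot(7, 4),
                  (0, 10), math.hypot(3, 6))
```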
The output unit 50 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audible, and/or tactile manner. The output unit 50 may include a display unit 51, an audio output module 52, an alarm unit 53, and the like.
The display unit 51 may display information processed in the mobile terminal 10. For example, when the mobile terminal 10 is in a phone call mode, the display unit 51 may display a User Interface (UI) or Graphical User Interface (GUI) associated with a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 10 is in a video call mode or an image capturing mode, the display unit 51 may display a captured image and/or a received image, a UI or GUI showing the video or image and related functions, and the like.
Meanwhile, when the display unit 51 and a touch panel are superimposed on each other in the form of layers to form a touch screen, the display unit 51 may function as both an input device and an output device. The display unit 51 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent so that the outside can be seen through them; these may be referred to as transparent displays, a typical example being a TOLED (Transparent Organic Light Emitting Diode) display. In some embodiments, the mobile terminal 10 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 52 may convert audio data received by the wireless communication unit 40 or stored in the memory 20 into an audio signal and output it as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 52 may provide audio output related to specific functions performed by the mobile terminal 10 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 52 may include a speaker, a buzzer, and the like.
The alarm unit 53 may provide an output to notify the user that an event has occurred in the mobile terminal 10. Typical events include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 53 may provide output in other ways to signal the occurrence of an event. For example, the alarm unit 53 may provide an output in the form of vibration: when a call, a message, or some other incoming communication is received, the alarm unit 53 may provide a tactile output (i.e., vibration) to notify the user. By providing such a tactile output, the user can recognize the occurrence of various events even when the mobile phone is in the user's pocket. The alarm unit 53 may also provide an output notifying the occurrence of an event via the display unit 51 or the audio output module 52.
The input unit 60 may generate key input data to control various operations of the mobile terminal according to commands input by the user. The input unit 60 allows the user to input various types of information and may include a keyboard, a dome sheet, a touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like caused by a touch), a scroll wheel, a joystick, and the like. In particular, when the touch pad is superimposed on the display unit 51 in the form of a layer, a touch screen may be formed. In an embodiment of the present invention, the input unit 60 includes a touch screen and an ink-jet screen. The camera 70 is used for capturing image data, and the microphone 71 is used for recording audio data.
The interface unit 80 serves as an interface through which at least one external device can be connected to the mobile terminal 10. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may be a module that stores various information for authenticating the user of the mobile terminal 10 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter referred to as an "identification device") may take the form of a smart card; thus, the identification device may be connected to the mobile terminal 10 via a port or other connection means. The interface unit 80 may be used to receive input (e.g., data, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 10, or may be used to transmit data between the mobile terminal and external devices.
In addition, when the mobile terminal 10 is connected with an external cradle, the interface unit 80 may serve as a path through which power is supplied from the cradle to the mobile terminal 10 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle.
The memory 20 may store software programs and the like for the processing and control operations performed by the controller 30, or may temporarily store data (e.g., a phonebook, messages, still images, video, etc.) that has been or will be output. Also, the memory 20 may store data regarding the various patterns of vibration and audio signals output when a touch is applied to the touch screen.
The memory 20 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, etc. Also, the mobile terminal 10 may cooperate with a network storage device that performs a storage function of the memory 20 through a network connection.
The controller 30 generally controls the overall operation of the mobile terminal. For example, the controller 30 performs control and processing related to a voice call, a data communication, a video or audio call, and the like. In addition, the controller 30 may include a multimedia module for reproducing (or playing back) multimedia data, and the multimedia module may be constructed within the controller 30 or may be constructed separately from the controller 30. The controller 30 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 90 receives external power or internal power and provides appropriate power required to operate the respective elements and components under the control of the controller 30.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 30. For a software implementation, procedures or functions may be implemented with separate software modules, each of which performs at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, and may be stored in the memory 20 and executed by the controller 30.
Up to this point, mobile terminals have been described in terms of their functionality. Hereinafter, among the various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type mobile terminals, a slide-type mobile terminal will be described as an example for the sake of brevity. However, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 10 as shown in fig. 1 may be configured to operate with systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 10, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, European/American standard high-capacity digital lines (E1/T1), Asynchronous Transfer Mode (ATM), Internet Protocol (IP), Point-to-Point Protocol (PPP), frame relay, High-rate Digital Subscriber Line (HDSL), Asymmetric Digital Subscriber Line (ADSL), or various types of Digital Subscriber Lines (xDSL). It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by an omni-directional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or by other equivalent terminology. In such a case, the term "base station" may be used to refer collectively to a BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, individual sectors of a particular BS 270 may be referred to as cell sites.
As shown in fig. 2, a Broadcast Transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 10 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 10 to receive the broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellites 300 facilitate locating at least one of the plurality of mobile terminals 10.
In fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking techniques, other techniques that can track the location of the mobile terminal may be used. In addition, at least one GPS satellite 300 may alternatively or additionally handle satellite DMB transmissions.
In typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 10. The mobile terminals 10 are typically engaged in calls, messaging, and other types of communications. Each reverse link signal received by a particular BS 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC 275 provides call resource allocation and mobility management functions, including coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC 280 interfaces with the BSCs 275, and the BSCs 275 accordingly control the BSs 270 to transmit forward link signals to the mobile terminals 10.
Based on the above mobile terminal hardware structure and communication system, the present invention provides various embodiments of the method.
Referring to fig. 3, fig. 3 is a functional block diagram of a mobile terminal according to an embodiment of the invention. The mobile terminal 10 shown in fig. 3 includes a control module 101 and a processing module 102. The functional modules are explained in detail below.
The control module 101 of the mobile terminal 10 continuously acquires a plurality of images, detects a first image processing instruction, selects a basic image from the plurality of images according to the first image processing instruction, and acquires color information of the basic image. The processing module 102 of the mobile terminal 10 detects the second image processing instruction, selects an image to be processed from the plurality of images according to the second image processing instruction, and applies the acquired color information to the image to be processed.
The mobile terminal provided by this embodiment can process the other images in a continuously acquired series according to color information acquired from one of those images, thereby improving the accuracy and efficiency of processing continuous images.
Referring to fig. 4-6, fig. 4 is a functional block diagram of a mobile terminal according to a second embodiment of the present invention, fig. 5 is a schematic diagram of the infrared sensing assembly array distribution of a mobile terminal according to a third embodiment of the present invention, and fig. 6 is a schematic diagram of the pixel row distribution of a mobile terminal according to a fourth embodiment of the present invention. The mobile terminal 10 shown in fig. 4 includes a control module 101, a processing module 102, a display module 103, and a determination module 104. The mobile terminal 10 shown in fig. 5 includes a screen 105 and an infrared sensing assembly array 106, where A1, A2, …, An, …, A2n; B1, B2, …, Bn, …, B2n; C1, C2, …, Cn, …, C2n; D1, D2, …, Dn, …, D2n; E1, E2, …, En, …, E2n; and F1, F2, …, Fn, …, F2n each represent an infrared sensing assembly. The infrared sensing assembly array 106 is composed of a plurality of infrared sensing assemblies arranged together; each assembly includes an infrared transmitter, an infrared receiver, a touch sensor, pixel points, and the like. The infrared sensing assembly array 106 is distributed under the screen of the mobile terminal 10; fig. 5 schematically shows one arrangement, but other arrangements are not excluded.
The mobile terminal 10 shown in fig. 6 includes a screen 105 and pixel rows 107. In fig. 6, a1, a2, …, an, …, a2n; b1, b2, …, bn, …, b2n; c1, c2, …, cn, …, c2n; d1, d2, …, dn, …, d2n; e1, e2, …, en, …, e2n; and f1, f2, …, fn, …, f2n each represent a pixel point, and each lettered group (a1 through a2n, b1 through b2n, and so on) constitutes one pixel row. This is described in detail below in conjunction with fig. 4-6.
Referring to fig. 4, the control module 101 of the mobile terminal 10 continuously obtains a plurality of images, detects a first image processing instruction, and selects a basic image from the plurality of images according to the first image processing instruction to obtain color information of the basic image. The processing module 102 of the mobile terminal 10 detects the second image processing instruction, selects an image to be processed from the plurality of images according to the second image processing instruction, and applies the acquired color information to the image to be processed.
Specifically, referring to fig. 5, the mobile terminal 10 of fig. 5 includes an infrared sensing assembly array 106; each assembly includes a touch sensor and one or more pixel points, and the pixel points are arranged in pixel rows, e.g., the pixel row 107 shown in fig. 6. When the basic image is displayed, the control module 101 of the mobile terminal 10 controls the touch sensors of the infrared sensing assemblies to detect a first touch signal, determines a corresponding first touch point according to the first touch signal, determines a first pixel point corresponding to the first touch point, and determines the first pixel row where the first pixel point is located. It then activates the infrared sensing assemblies corresponding to all pixel points in the first pixel row, acquires the color information of the pixel points in the activated infrared sensing assemblies, and records the color information of all pixel points in the activated first pixel row as first color information.
When the image to be processed is displayed, the processing module 102 of the mobile terminal 10 controls the touch sensors of the infrared sensing assemblies to detect a second touch signal, determines a corresponding second touch point according to the second touch signal, and determines whether the positions of the first touch point and the second touch point are the same. When the positions are the same, it determines the second pixel point corresponding to the second touch point, determines the second pixel row in which the second pixel point is located, and performs color adjustment on the second pixel row according to the first color information.
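The capture-and-apply flow of the two paragraphs above can be sketched as follows. This is a minimal illustration only: the image is modeled as a hypothetical 2D list of RGB tuples, touch points as (x, y) coordinates, and the infrared sensing hardware is abstracted away; none of these names come from the patent.

```python
def capture_row_color(image, touch_point):
    """Record the color of every pixel in the row containing the first
    touch point: the 'first color information' of the basic image."""
    _, row = touch_point  # the first pixel row, from the first pixel point
    return row, list(image[row])

def apply_row_color(image, touch_point, first_touch_point, first_color_info):
    """On the image to be processed: only when the second touch point matches
    the first, recolor the touched pixel row using the recorded color info."""
    if touch_point == first_touch_point:
        _, row = touch_point
        image[row] = list(first_color_info)
    return image
```

For example, touching the same point on a base image and then on an image to be processed copies the base image's touched row of colors onto the corresponding row of the target.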
It should be additionally explained that, when the basic image is displayed, the control module 101 of the mobile terminal 10 may further control the touch sensors of the infrared sensing assemblies to detect a third touch signal, determine a corresponding first touch trajectory according to the third touch signal, determine a plurality of third pixel points corresponding to the first touch trajectory, and determine the third pixel row in which each third pixel point is located. It then activates the infrared sensing assemblies corresponding to all pixel points in the determined third pixel rows, acquires the color information of the pixel points in the activated infrared sensing assemblies, and records the color information of all pixel points in the activated third pixel rows as second color information.
When the image to be processed is displayed, the processing module 102 of the mobile terminal 10 may further control the touch sensors of the infrared sensing assemblies to detect a fourth touch signal, determine a corresponding second touch trajectory according to the fourth touch signal, determine a plurality of fourth pixel points corresponding to the second touch trajectory, and determine the fourth pixel row in which each fourth pixel point is located. It then determines whether the screen positions of the fourth pixel rows and the third pixel rows are the same; when they are, it queries the second color information for the color information corresponding to the screen positions of the third pixel rows and performs color adjustment on the fourth pixel rows according to the queried color information.
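The trajectory-based variant of the two paragraphs above can be sketched similarly. The dictionary keyed by pixel-row index, and all the names, are illustrative assumptions rather than the patent's actual implementation; trajectories are modeled as lists of (x, y) points.

```python
def record_trajectory_colors(image, trajectory):
    """Record color information keyed by pixel-row index for every row the
    first touch trajectory passes through: the 'second color information'."""
    return {y: list(image[y]) for (_, y) in trajectory}

def apply_trajectory_colors(image, trajectory, second_color_info):
    """For each row touched by the second trajectory, apply the recorded
    colors only when that row's screen position was also covered on the
    basic image (i.e., it appears in the recorded second color information)."""
    for (_, y) in trajectory:
        if y in second_color_info:  # same screen position for both rows
            image[y] = list(second_color_info[y])
    return image
```

Rows touched on the target image that were not covered by the first trajectory are left unchanged, mirroring the same-screen-position check described above.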
It should further be added that, after the basic image is determined, the display module of the mobile terminal 10 displays the basic image, and after the image to be processed is determined, the display module of the mobile terminal 10 displays the image to be processed; that is, the basic image and the image to be processed are displayed in sequence at the same scale. The determining module 104 of the mobile terminal 10 determines whether the difference between the color information values of the same on-screen pixel point when displaying the basic image and when displaying the image to be processed is within a preset range. When that difference is within the preset range, the processing module 102 of the mobile terminal 10 adjusts the image to be processed at that pixel point according to the color information value of the basic image.
It should be further added that, after the plurality of images are continuously acquired, the control module 101 of the mobile terminal 10 numbers the plurality of acquired images and acquires the association information between the plurality of images. The processing module 102 of the mobile terminal 10 selects an image to be processed from the plurality of images according to the serial number, determines color information of the image to be processed according to the color information of the base image and the association information between the plurality of images, and activates corresponding pixel points on the screen according to the color information of the image to be processed.
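One way the numbering-and-association step above might work is sketched below. The patent does not specify the form of the association information between the numbered images; here it is modeled, purely for illustration, as a per-image brightness offset relative to the base image, and all names are hypothetical.

```python
def derive_color_info(base_color_info, associations, target_no):
    """Derive the to-be-processed image's color information from the base
    image's color information plus the recorded association for the image
    with the given number. The association is modeled as a brightness
    offset applied to every channel, clipped to the 0-255 range."""
    offset = associations[target_no]
    return [tuple(min(255, max(0, c + offset)) for c in px)
            for px in base_color_info]
```

The derived per-pixel colors would then be used to activate the corresponding pixel points on the screen, as described above.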
The mobile terminal provided by this embodiment can process the other images in a continuously acquired series according to color information acquired from one of those images, thereby improving the accuracy and efficiency of processing continuous images.
Referring to fig. 7-9, fig. 7 is a schematic diagram of the distribution of a plurality of continuous images according to a fifth embodiment of the present invention, fig. 8 is a schematic diagram of a first screen for performing image processing according to a sixth embodiment of the present invention, and fig. 9 is a schematic diagram of a second screen for performing image processing according to a seventh embodiment of the present invention. The mobile terminal 10 of fig. 7-9 includes the functional modules shown in fig. 3-4. In fig. 7, 4 images continuously photographed of the same object, namely image one, image two, image three, and image four, are displayed simultaneously on the screen 105 of the mobile terminal 10; in fig. 8, image one is displayed on the screen 105 at a preset screen scale; and in fig. 9, image three is displayed on the screen 105 at the screen scale of fig. 8. This is described in detail below in conjunction with fig. 7-9.
Referring to fig. 7, the control module 101 of the mobile terminal 10 continuously acquires a plurality of images; the continuous images acquired by the mobile terminal 10 include image one, image two, image three, and image four, though 10 or 20 images may also be continuously acquired, without limitation. The control module 101 of the mobile terminal 10 detects the first image processing instruction, selects image one from image one, image two, image three, and image four as the basic image according to the instruction, and displays it on the screen 105; referring specifically to fig. 8, image one is displayed on the screen 105 at a preset scale. The control module 101 of the mobile terminal 10 controls the touch sensors of the infrared sensing assemblies to detect a first touch signal, determines a corresponding first touch point 1051 according to the first touch signal, determines a first pixel point corresponding to the first touch point 1051, determines the first pixel row in which the first pixel point is located, activates the infrared sensing assemblies corresponding to all pixel points in the first pixel row, acquires the color information of the pixel points in the activated infrared sensing assemblies, and records the color information of all pixel points in the activated first pixel row as first color information.
Referring to fig. 9, when image three is selected as the image to be processed, the mobile terminal 10 displays image three on the screen 105 at the same scale at which image one was displayed. The processing module 102 of the mobile terminal 10 controls the touch sensors of the infrared sensing assemblies to detect a second touch signal, determines a corresponding second touch point 1052 according to the second touch signal, and determines whether the positions of the first touch point 1051 and the second touch point 1052 are the same. When the positions are the same, it determines the second pixel point corresponding to the second touch point, determines the second pixel row in which the second pixel point is located, and performs color adjustment on the second pixel row according to the first color information.
The mobile terminal provided by this embodiment can process the other images in a continuously acquired series according to color information acquired from one of those images, thereby improving the accuracy and efficiency of processing continuous images.
Referring to fig. 10-11, fig. 10 is a schematic diagram of a third screen for performing image processing according to an eighth embodiment of the present invention, and fig. 11 is a schematic diagram of a fourth screen for performing image processing according to a ninth embodiment of the present invention. The mobile terminal 10 of fig. 10-11 includes the functional modules shown in fig. 3-4. As in fig. 7, 4 images continuously photographed of the same object, namely image one, image two, image three, and image four, are displayed simultaneously on the screen 105 of the mobile terminal 10. In fig. 10, image two is selected as the basic image and displayed on the screen 105 of the mobile terminal 10 at a preset screen scale, and in fig. 11, image four is displayed on the screen 105 of the mobile terminal 10 at the screen scale of fig. 10. This is explained in detail below with reference to fig. 7 and 10-11.
Referring to fig. 7, the control module 101 of the mobile terminal 10 continuously acquires a plurality of images; the continuous images acquired by the mobile terminal 10 include image one, image two, image three, and image four, though 10 or 20 images may also be continuously acquired, without limitation. After the plurality of images are continuously acquired, the control module 101 of the mobile terminal 10 numbers the acquired images and acquires the association information between them. The processing module 102 of the mobile terminal 10 selects image four from the plurality of images as the image to be processed according to its number, determines the color information of image four according to the color information of the basic image and the association information among the images, and activates the corresponding pixel points on the screen according to the color information of the image to be processed, so that image four can be displayed in accordance with the basic image.
Specifically, the control module 101 of the mobile terminal 10 detects the first image processing instruction, selects image two from image one, image two, image three, and image four as the basic image, and displays the selected image two on the screen 105; referring specifically to fig. 10, image two is displayed on the screen 105 at a preset scale and serves as the basic image. The control module 101 of the mobile terminal 10 controls the touch sensors of the infrared sensing assemblies to detect a third touch signal, determines a corresponding first touch trajectory 1053 according to the third touch signal, and determines a plurality of third pixel points corresponding to the first touch trajectory 1053, where the first touch trajectory may be a straight line or a curve. It determines the third pixel row in which each third pixel point is located; assume the corresponding rows are pixel rows 2-6. It then activates the infrared sensing assemblies corresponding to all pixel points in the determined pixel rows 2-6, acquires the color information of the pixel points in the activated infrared sensing assemblies, and records the color information of all pixel points in the activated pixel rows 2-6 as second color information.
Referring to fig. 11, the mobile terminal 10 selects image four as the image to be processed and displays image four on the screen 105 at the display scale of image two. When image four is selected as the image to be processed, the processing module 102 of the mobile terminal 10 controls the touch sensors of the infrared sensing assemblies to detect a fourth touch signal and determines a corresponding second touch trajectory according to the fourth touch signal, where the second touch trajectory may be a straight line or a curve. It determines a plurality of fourth pixel points corresponding to the second touch trajectory and the fourth pixel row in which each fourth pixel point is located; assume the determined rows are pixel rows 2-4. It then determines whether the screen positions of the fourth pixel rows and the third pixel rows are the same, i.e., whether the screen positions of pixel rows 2-4 and pixel rows 2-6 coincide. When the screen positions of the fourth pixel rows and the third pixel rows are the same, it queries the second color information for the color information corresponding to the screen positions of the third pixel rows and performs color adjustment on the fourth pixel rows according to the queried color information; for example, the color information corresponding to the screen positions of pixel rows 2-4 is queried from the second color information, and color adjustment is performed on pixel rows 2-4 in image four according to the queried color information.
It should further be added that, after the basic image is determined, the display module of the mobile terminal 10 displays the basic image, and after the image to be processed is determined, the display module of the mobile terminal 10 displays the image to be processed; that is, the basic image and the image to be processed are displayed in sequence at the same scale. The determining module 104 of the mobile terminal 10 determines whether the difference between the color information values of the same on-screen pixel point when displaying the basic image and when displaying the image to be processed is within a preset range. When that difference is within the preset range, the processing module 102 of the mobile terminal 10 adjusts the image to be processed at that pixel point according to the color information value of the basic image.
For example, for the same pixel point, if the basic image displays red and the image to be processed displays green, the difference between the two color information values exceeds the preset range, so the basic image and the image to be processed are not considered to show the same object and no color adjustment is performed. By contrast, if for the same pixel point the basic image displays red and the image to be processed displays dark red, the difference between the two color information values does not exceed the preset range, so the two are considered to show the same object, and color adjustment is performed on the image to be processed according to the color information value of the basic image.
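The preset-range comparison in the example above can be sketched as a simple per-channel threshold test. The threshold value of 60 and the specific color triples are illustrative assumptions; the patent does not fix either.

```python
def within_preset_range(color_a, color_b, threshold=60):
    """True when two color values at the same pixel point differ by no more
    than the preset range on every channel (threshold=60 is assumed)."""
    return all(abs(a - b) <= threshold for a, b in zip(color_a, color_b))

# Illustrative RGB values: red vs. dark red passes, red vs. green fails.
RED, DARK_RED, GREEN = (255, 0, 0), (200, 0, 0), (0, 255, 0)
```

With these values, `within_preset_range(RED, DARK_RED)` holds, so the pixel would be adjusted toward the basic image's color, while `within_preset_range(RED, GREEN)` fails and the pixel is left alone.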
The mobile terminal provided by this embodiment can process the other images in a continuously acquired series according to color information acquired from one of those images, thereby improving the accuracy and efficiency of processing continuous images.
The present invention also provides an image processing method, which is applied to the mobile terminal 10 shown in fig. 3-5, and the image processing method of the present embodiment is described in detail below.
Referring to fig. 12, fig. 12 is a flowchart of an image processing method according to a tenth embodiment of the present invention.
In step S1201, the control module 101 of the mobile terminal 10 continuously acquires a plurality of images, detects a first image processing instruction, selects a base image from the plurality of images according to the first image processing instruction, and acquires color information of the base image.
In step S1202, the processing module 102 of the mobile terminal 10 detects a second image processing instruction, selects an image to be processed from the plurality of images according to the second image processing instruction, and applies the acquired color information to the image to be processed.
The mobile terminal provided by this embodiment can process the other images in a continuously acquired series according to color information acquired from one of those images, thereby improving the accuracy and efficiency of processing continuous images.
Referring to fig. 13, fig. 13 is a flowchart of an image processing method according to an eleventh embodiment of the present invention. The method is applied to the mobile terminal 10 shown in fig. 3 to 5, and the image processing method of the present embodiment will be described in detail below.
In step S1301, when the base image is displayed, the control module 101 of the mobile terminal 10 controls the touch sensor of the infrared sensing assembly to detect a first touch signal, and determines a corresponding first touch point according to the first touch signal.
In step S1302, a first pixel point corresponding to the first touch point is determined.
In step S1303, a first pixel row where the first pixel point is located is determined, and the infrared sensing assemblies corresponding to all the pixel points in the first pixel row are activated.
In step S1304, color information of pixel points in the activated infrared sensing assembly is obtained, and color information of all pixel points in the activated first pixel row is recorded as first color information.
In step S1305, when the image to be processed is displayed, the processing module 102 of the mobile terminal 10 controls the touch sensor of the infrared sensing assembly to detect a second touch signal, and determines a corresponding second touch point according to the second touch signal.
In step S1306, it is determined whether the positions of the first touch point and the second touch point are the same.
In step S1307, when the positions of the first touch point and the second touch point are the same, a second pixel point corresponding to the second touch point is determined.
In step S1308, the second pixel row where the second pixel point is located is determined, and color adjustment is performed on the second pixel row according to the first color information.
The mobile terminal provided by this embodiment can process the other images in a continuously acquired series according to color information acquired from one of those images, thereby improving the accuracy and efficiency of processing continuous images.
Referring to fig. 14, fig. 14 is a flowchart of an image processing method according to a twelfth embodiment of the present invention. The method is applied to the mobile terminal 10 shown in fig. 3 to 5, and the image processing method of the present embodiment will be described in detail below.
In step S1401, when the base image is displayed, the control module 101 of the mobile terminal 10 further controls the touch sensor of the infrared sensing assembly to detect a third touch signal, and determines a corresponding first touch trajectory according to the third touch signal.
In step S1402, a plurality of third pixel points corresponding to the first touch trajectory are determined.
In step S1403, the third pixel row where each third pixel point is located is determined, and the infrared sensing assemblies corresponding to all the pixel points in the determined third pixel row are activated.
In step S1404, color information of the pixel points in the activated infrared sensing assembly is obtained, and color information of all pixel points in the activated third pixel row is recorded as second color information.
In step S1405, when the to-be-processed image is displayed, the processing module 102 of the mobile terminal 10 further controls the touch sensor of the infrared sensing component to detect a fourth touch signal, and determines a corresponding second touch trajectory according to the fourth touch signal.
In step S1406, a plurality of fourth pixel points corresponding to the second touch trajectory are determined.
In step S1407, it is determined whether the fourth pixel row where each fourth pixel point is located is the same as the screen position where the third pixel row is located.
In step S1408, when the screen positions of the fourth pixel line and the third pixel line are the same, color information corresponding to the screen position of the third pixel line is inquired from the second color information, and the color of the fourth pixel line is adjusted according to the inquired color information.
The mobile terminal provided by this embodiment can process the remaining images of a continuously captured sequence according to the color information acquired from one image in that sequence, improving the accuracy and efficiency of processing continuous images.
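Steps S1401 to S1408 amount to recording the colors of the pixel rows crossed by one touch trajectory on the base image, then replaying those colors onto matching rows of a second image. A minimal sketch in Python, assuming images are lists of pixel rows addressed by row index; all function and variable names are illustrative, not from the patent:

```python
# Hypothetical sketch of the trajectory-based row color transfer (S1401-S1408).
# Row indices stand in for screen positions; RGB tuples stand in for the
# color information read from the infrared sensing assemblies.

def record_row_colors(base_image, touched_rows):
    """S1401-S1404: for each pixel row crossed by the first touch
    trajectory, save the colors of every pixel in that row."""
    return {row: list(base_image[row]) for row in touched_rows}

def apply_row_colors(target_image, touched_rows, second_color_info):
    """S1405-S1408: for each row crossed by the second trajectory that
    matches a recorded screen position, copy the recorded row colors."""
    adjusted = [list(row) for row in target_image]
    for row in touched_rows:
        if row in second_color_info:                      # same position (S1407)
            adjusted[row] = list(second_color_info[row])  # adjust (S1408)
    return adjusted
```

Rows with no recorded counterpart are left untouched, matching the condition in step S1408 that adjustment only happens when the screen positions coincide.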
Referring to fig. 15, fig. 15 is a flowchart of a thirteenth image processing method according to an embodiment of the present invention. The method is applied to the mobile terminal 10 shown in fig. 3 to 5, and the image processing method of the present embodiment will be described in detail below.
In step S1501, after the base image is determined, the display module of the mobile terminal 10 displays the base image, and after the image to be processed is determined, the display module displays the image to be processed; that is, the base image and the image to be processed are displayed in sequence at the same scale.
In step S1502, the determining module 104 of the mobile terminal 10 determines whether the difference between the color information values of the same pixel point on the screen when displaying the basic image and the image to be processed is within a preset range.
In step S1503, when the difference between the color information values of the same pixel point on the screen when displaying the basic image and the image to be processed is within the preset range, the processing module 102 of the mobile terminal 10 adjusts the image to be processed according to the color information value of the basic image for the same pixel point.
The mobile terminal provided by this embodiment can process the remaining images of a continuously captured sequence according to the color information acquired from one image in that sequence, improving the accuracy and efficiency of processing continuous images.
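The comparison in steps S1502 and S1503 can be sketched as a per-pixel threshold test. In the sketch below, grayscale integer values stand in for the patent's "color information values", and the preset range is modeled as a single numeric threshold; both simplifications are assumptions for illustration:

```python
# Hypothetical sketch of S1501-S1503: where the per-pixel difference
# between the base image and the image to be processed falls within the
# preset range, the target value is replaced by the base value.

def adjust_within_range(base_image, target_image, preset_range):
    """Compare same-position pixels of two equally scaled images and
    adjust the target pixel to the base value when the difference is
    within the preset range; otherwise leave the target pixel as-is."""
    out = []
    for base_row, target_row in zip(base_image, target_image):
        new_row = []
        for b, t in zip(base_row, target_row):
            new_row.append(b if abs(b - t) <= preset_range else t)
        out.append(new_row)
    return out
```

Displaying both images at the same scale (step S1501) is what makes this same-position comparison meaningful: pixel (x, y) of one image corresponds to pixel (x, y) of the other.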
Referring to fig. 16, fig. 16 is a flowchart of a fourteenth image processing method according to an embodiment of the present invention. The method is applied to the mobile terminal 10 shown in fig. 3 to 5, and the image processing method of the present embodiment will be described in detail below.
In step S1601, after continuously acquiring a plurality of images, the control module 101 of the mobile terminal 10 numbers the acquired images and obtains the association information between them.
In step S1602, the processing module 102 of the mobile terminal 10 selects an image to be processed from the plurality of images according to the serial number, determines color information of the image to be processed according to the color information of the base image and the association information between the plurality of images, and activates corresponding pixel points on the screen according to the color information of the image to be processed.
The mobile terminal provided by this embodiment can display the remaining images of a continuously captured sequence by deriving their color information from the color information of one image in that sequence, improving the accuracy and efficiency of processing continuous images.
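Steps S1601 and S1602 derive the color information of each numbered image from the base image plus per-image association information. The patent does not specify the form of that association information; the sketch below models it as a simple additive brightness offset per image number, which is purely an illustrative assumption:

```python
# Hypothetical sketch of S1601-S1602: number the captured images, then
# derive each image's color information from the base image and the
# association information (modeled here as a brightness offset).

def process_sequence(images, base_index, association):
    """For each numbered image, compute its color information by applying
    that image's association offset to the base image's pixel values,
    clamped to the 0-255 range of 8-bit color channels."""
    base = images[base_index]
    derived = {}
    for number, offset in association.items():
        derived[number] = [
            [min(255, max(0, value + offset)) for value in row]
            for row in base
        ]
    return derived
```

Only the base image's color information and the lightweight association data need to be stored, which is where the claimed efficiency gain for continuous capture would come from.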
The present invention is not limited to the above preferred embodiments, and any modifications, equivalent substitutions and improvements made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (8)

1. A mobile terminal, comprising:
an infrared sensing assembly array, wherein each infrared sensing assembly comprises a touch sensor and one or more pixel points, and the pixel points are sequentially arranged into pixel rows;
the control module is used for continuously acquiring a plurality of images, detecting a first image processing instruction, selecting a basic image according to the first image processing instruction, controlling the touch sensor to detect a first touch signal when the basic image is displayed, determining a corresponding first touch point according to the first touch signal, determining a first pixel point corresponding to the first touch point, determining a first pixel row where the first pixel point is located, activating the infrared sensing assemblies corresponding to all the pixel points in the first pixel row, acquiring color information of the pixel points in the activated infrared sensing assemblies, and recording the color information of all the pixel points in the activated first pixel row as first color information; and
the processing module is used for detecting a second image processing instruction, selecting an image to be processed from the plurality of images according to the second image processing instruction, controlling the touch sensor to detect a second touch signal when the image to be processed is displayed, determining a corresponding second touch point according to the second touch signal, determining whether the positions of the first touch point and the second touch point are the same, determining a second pixel point corresponding to the second touch point when the positions of the first touch point and the second touch point are the same, determining a second pixel row where the second pixel point is located, and performing color adjustment on the second pixel row according to the first color information.
2. The mobile terminal according to claim 1, wherein the control module is further configured to control the touch sensor to detect a third touch signal when the base image is displayed, determine a corresponding first touch trajectory according to the third touch signal, determine a plurality of third pixel points corresponding to the first touch trajectory, determine a third pixel row in which each of the third pixel points is located, activate the infrared sensing assemblies corresponding to all the pixel points in the determined third pixel row, obtain color information of the pixel points in the activated infrared sensing assembly, and record the color information of all the pixel points in the activated third pixel row as second color information;
the processing module is further configured to control the touch sensor to detect a fourth touch signal when the to-be-processed image is displayed, determine a corresponding second touch trajectory according to the fourth touch signal, determine a plurality of fourth pixel points corresponding to the second touch trajectory, determine a fourth pixel row in which each fourth pixel point is located, determine whether screen positions of the fourth pixel row and the third pixel row are the same, query color information corresponding to a screen position in which the third pixel row is located from the second color information when the screen positions of the fourth pixel row and the third pixel row are the same, and perform color adjustment on the fourth pixel row according to the queried color information.
3. The mobile terminal of any of claims 1-2, further comprising:
the display module is used for displaying the base image and the image to be processed in sequence at the same scale;
the determining module is used for determining whether the difference between the color information values of the same pixel point on the screen when the base image and the image to be processed are displayed is within a preset range;
the processing module is further configured to adjust the image to be processed according to the color information value of the base image for the same pixel point when the difference value of the color information values of the same pixel point on the screen when the base image and the image to be processed are displayed is within a preset range.
4. The mobile terminal of claim 3, wherein the control module is further configured to number the obtained plurality of images and obtain association information between the plurality of images;
the processing module is further configured to select the image to be processed from the plurality of images according to the serial number, determine color information of the image to be processed according to the color information of the base image and the association information between the plurality of images, and activate a corresponding pixel point on the screen according to the color information of the image to be processed.
5. An image processing method applied to a mobile terminal, the mobile terminal comprising an infrared sensing assembly array, wherein each infrared sensing assembly comprises a touch sensor and one or more pixel points, and the pixel points are sequentially arranged into pixel rows, the image processing method comprising:
the mobile terminal continuously acquires a plurality of images, detects a first image processing instruction, selects a basic image according to the first image processing instruction,
when the basic image is displayed, the mobile terminal controls the touch sensor to detect a first touch signal, and determines a corresponding first touch point according to the first touch signal;
determining a first pixel point corresponding to the first touch point;
determining a first pixel row where the first pixel point is located, and activating the infrared sensing assemblies corresponding to all pixel points in the first pixel row;
acquiring color information of pixel points in the activated infrared sensing assembly, and recording the color information of all the pixel points in the activated first pixel row as first color information; and
detecting a second image processing instruction, selecting an image to be processed from the plurality of images according to the second image processing instruction,
when the image to be processed is displayed, the mobile terminal controls the touch sensor to detect a second touch signal, and a corresponding second touch point is determined according to the second touch signal;
determining whether the positions of the first touch point and the second touch point are the same;
when the positions of the first touch point and the second touch point are the same, determining a second pixel point corresponding to the second touch point;
and determining a second pixel row where the second pixel point is located, and performing color adjustment on the second pixel row according to the first color information.
6. The image processing method according to claim 5, further comprising:
when the basic image is displayed, the mobile terminal controls the touch sensor to detect a third touch signal, and a corresponding first touch track is determined according to the third touch signal;
determining a plurality of third pixel points corresponding to the first touch track;
determining a third pixel row where each third pixel point is located, and activating the infrared sensing assemblies corresponding to all the pixel points in each determined third pixel row;
acquiring color information of pixel points in the activated infrared sensing assembly, and recording color information of all pixel points in the activated third pixel row as second color information;
when the image to be processed is displayed, the mobile terminal controls the touch sensor to detect a fourth touch signal, and a corresponding second touch track is determined according to the fourth touch signal;
determining a plurality of fourth pixel points corresponding to the second touch track;
determining a fourth pixel row where each fourth pixel point is located, and determining whether the screen positions of the fourth pixel row and the third pixel row are the same;
and when the screen positions of the fourth pixel row and the third pixel row are the same, querying color information corresponding to the screen position of the third pixel row from the second color information, and adjusting the color of the fourth pixel row according to the queried color information.
7. The image processing method of any of claims 5 to 6, further comprising:
displaying the base image and the image to be processed in sequence at the same scale;
determining whether the difference between the color information values of the same pixel point on the screen when the base image and the image to be processed are displayed is within a preset range;
and when the difference between the color information values of the same pixel point on the screen when the base image and the image to be processed are displayed is within the preset range, adjusting the image to be processed for that pixel point according to the color information value of the base image.
8. The image processing method according to claim 7, wherein when a plurality of images are acquired, the image processing method further comprises:
numbering the obtained images and obtaining the association information between them;
and selecting the image to be processed from the plurality of images according to the serial number, determining the color information of the image to be processed according to the color information of the base image and the association information between the plurality of images, and activating the corresponding pixel points on the screen according to the color information of the image to be processed.
CN201710211154.9A 2017-03-31 2017-03-31 Mobile terminal and image processing method Expired - Fee Related CN106980460B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710211154.9A CN106980460B (en) 2017-03-31 2017-03-31 Mobile terminal and image processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710211154.9A CN106980460B (en) 2017-03-31 2017-03-31 Mobile terminal and image processing method

Publications (2)

Publication Number Publication Date
CN106980460A CN106980460A (en) 2017-07-25
CN106980460B true CN106980460B (en) 2020-06-02

Family

ID=59343631

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710211154.9A Expired - Fee Related CN106980460B (en) 2017-03-31 2017-03-31 Mobile terminal and image processing method

Country Status (1)

Country Link
CN (1) CN106980460B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019023968A1 (en) * 2017-08-02 2019-02-07 深圳传音通讯有限公司 Image color adjustment system and color adjustment method for smart terminal
US11081516B2 (en) 2018-08-10 2021-08-03 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Display screen, electronic device and method for three-dimensional feature recognition
CN112783459A (en) * 2019-11-08 2021-05-11 北京汉美奥科节能设备有限公司 Device for changing color of designated position of screen by transmitting position data through screen and receiving position information point by point
CN111597618B (en) * 2020-05-19 2022-06-03 翟玉华 Bridge monitoring system based on BIM-GIS

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104023217A (en) * 2013-03-01 2014-09-03 联想(北京)有限公司 Information processing method and electronic equipment
CN105681582A (en) * 2016-03-18 2016-06-15 努比亚技术有限公司 Control color adjusting method and terminal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012181717A (en) * 2011-03-02 2012-09-20 Nikon Corp Image display device, imaging apparatus, and image correction program
US9128713B2 (en) * 2013-01-15 2015-09-08 Synaptics Incorporated Method and circuit to optimize N-line LCD power consumption
CN104407767A (en) * 2014-10-22 2015-03-11 深圳市金立通信设备有限公司 Method for regulating user interface

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104023217A (en) * 2013-03-01 2014-09-03 联想(北京)有限公司 Information processing method and electronic equipment
CN105681582A (en) * 2016-03-18 2016-06-15 努比亚技术有限公司 Control color adjusting method and terminal

Also Published As

Publication number Publication date
CN106980460A (en) 2017-07-25

Similar Documents

Publication Publication Date Title
CN106504280A (en) A kind of method and terminal for browsing video
CN106713716B (en) Shooting control method and device for double cameras
CN106302651B (en) Social picture sharing method and terminal with social picture sharing system
CN106980460B (en) Mobile terminal and image processing method
CN105045509A (en) Picture editing apparatus and method
CN106534553B (en) Mobile terminal and shooting method thereof
CN105138255A (en) Terminal and image information acquisition method
CN105975500A (en) Data processing method, data statistical system and backstage management system
CN106028090A (en) Mobile terminal and video recording method thereof
CN106993094B (en) Mobile terminal and scanning method
CN106851114B (en) Photo display device, photo generation device, photo display method, photo generation method and terminal
CN105261054A (en) Device and method for compositing audio GIF image
CN105242483A (en) Focusing realization method and device and shooting realization method and device
CN105743170B (en) A kind of control method and terminal to charge parallel
CN105072279A (en) Backup device and method based on mobile terminal
CN106791149A (en) A kind of method of mobile terminal and control screen
CN105262953B (en) A kind of mobile terminal and its method of control shooting
CN106131305A (en) A kind of control method and electronic equipment
CN107241497B (en) Mobile terminal and loudness output adjusting method
CN107194243B (en) Mobile terminal and method for installing application program
CN106990896B (en) Stereo photo display method and device based on double cameras and mobile terminal
CN106792878B (en) Data traffic monitoring method and device
CN105898158B (en) A kind of data processing method and electronic equipment
CN105141834A (en) Device and method for controlling picture shooting
CN106649753B (en) Data processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200507

Address after: Room 132, building 12, Greenland Business City, No. 58, hope Avenue, Yancheng City, Jiangsu Province

Applicant after: Yancheng Zhenghao Network Technology Co.,Ltd.

Address before: 518057 Guangdong Province, Shenzhen high tech Zone of Nanshan District City, No. 9018 North Central Avenue's innovation building A, 6-8 layer, 10-11 layer, B layer, C District 6-10 District 6 floor

Applicant before: NUBIA TECHNOLOGY Co.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200602