CN111723035A - Image processing method, mobile terminal and wearable device

Info

Publication number
CN111723035A
Authority
CN
China
Prior art keywords
image
processing
wearable device
mobile terminal
instruction
Legal status
Withdrawn
Application number
CN201910223405.4A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee
Qiku Internet Technology Shenzhen Co Ltd
Original Assignee
Qiku Internet Technology Shenzhen Co Ltd
Application filed by Qiku Internet Technology Shenzhen Co Ltd
Priority to CN201910223405.4A
Publication of CN111723035A
Status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 13/38 Information transfer, e.g. on bus
    • G06F 13/382 Information transfer, e.g. on bus using universal interface adapter
    • G06F 13/385 Information transfer, e.g. on bus using universal interface adapter for adaptation of a particular data processing system to different peripheral devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F 9/4806 Task transfer initiation or dispatching
    • G06F 9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F 9/4881 Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues

Abstract

The application discloses an image processing method, a mobile terminal and a wearable device. The image processing method includes: receiving a processing instruction generated by the wearable device based on a sensed user action; processing a first image of the current application into a second image according to the processing instruction; and sending the second image to the wearable device. In this way, when a user operates the mobile terminal in cooperation with the wearable device, the image processing functions of the wearable device can be realized without adding an extra module, improving the user experience.

Description

Image processing method, mobile terminal and wearable device
Technical Field
The application relates to the technical field of intelligent terminals, and in particular to an image processing method, a mobile terminal and a wearable device.
Background
With the rapid development of mobile communication technology, intelligent terminals such as mobile terminals and wearable devices have become widely popularized. Accordingly, richer functions have gradually been developed by having a mobile terminal and a wearable device work in cooperation. For example, a mobile terminal can be paired with VR (Virtual Reality) glasses to deliver immersive game and movie experiences. However, most wearable devices are constrained by manufacturing technology and cost, and their processing capability is sometimes insufficient to fully handle image processing tasks, which degrades the image processing effect. Furthermore, users often need to resort to additional modules, such as handheld controllers, to adjust the image processing effect they desire. Both issues reduce the user experience.
Disclosure of Invention
The main technical problem solved by the present application is how to perform image processing when a user operates a mobile terminal in cooperation with a wearable device, so that the image processing functions of the wearable device can be realized without adding an extra module, thereby improving the user experience.
In order to solve the above technical problem, a technical solution adopted by the present application is to provide an image processing method, which includes: receiving a processing instruction generated by a wearable device based on a sensed user action; processing a first image of the current application into a second image according to the processing instruction; and sending the second image to the wearable device.
In order to solve the above technical problem, another technical solution adopted by the present application is to provide an image processing method, which includes: generating a processing instruction corresponding to a sensed user action; sending the processing instruction to a mobile terminal, so that the mobile terminal processes a first image of the current application into a second image according to the processing instruction; and receiving the second image sent by the mobile terminal.
In order to solve the above technical problem, yet another technical solution adopted by the present application is to provide a mobile terminal, which includes a processing circuit and a communication circuit coupled to each other; when the processing circuit and the communication circuit operate, the above image processing method can be implemented.
In order to solve the above technical problem, still another technical solution adopted by the present application is to provide a wearable device, which includes a processing circuit, a communication circuit and a human-computer interaction circuit; the communication circuit and the human-computer interaction circuit are coupled to the processing circuit, and when the processing circuit, the communication circuit and the human-computer interaction circuit operate, the above image processing method can be implemented.
The beneficial effects of the present application are as follows. Different from the prior art, the image processing method provided by the present application receives a processing instruction generated by a wearable device based on a sensed user action, processes a first image of the current application into a second image according to the processing instruction, and then sends the second image to the wearable device. Because the wearable device generates the processing instruction based on the sensed user action and sends it to the mobile terminal, and the mobile terminal performs the image processing after receiving the instruction, the image processing task is executed by the mobile terminal, whose processing capability is higher, which preserves the image processing effect as far as possible; the processed image is then transmitted back to the wearable device. Thus, the image processing functions of the wearable device can be realized without adding an extra module, improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort. In the drawings:
FIG. 1 is a schematic flow chart diagram illustrating an embodiment of an image processing method according to the present application;
FIG. 2 is a schematic flow chart diagram illustrating another embodiment of an image processing method according to the present application;
FIG. 3 is a schematic flow chart diagram illustrating another embodiment of an image processing method according to the present application;
FIG. 4 is a schematic flow chart diagram illustrating another embodiment of an image processing method according to the present application;
FIG. 5 is a schematic flow chart diagram illustrating a further embodiment of an image processing method according to the present application;
FIG. 6 is a schematic flow chart diagram illustrating another embodiment of an image processing method according to the present application;
FIG. 7 is a block diagram of an embodiment of a mobile terminal according to the present application;
FIG. 8 is a block diagram of another embodiment of a mobile terminal according to the present application;
FIG. 9 is a block diagram of an embodiment of a wearable device according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
Referring to FIG. 1, FIG. 1 is a schematic flowchart of an embodiment of an image processing method according to the present application. Specifically, the image processing method of this embodiment includes:
Step S11: receive a processing instruction generated by the wearable device based on a sensed user action.
Wearable devices may include, but are not limited to, VR glasses, 3D (three-dimensional) glasses, and the like. In one implementation scenario, the wearable device is integrated with a posture sensor that can recognize user actions such as nodding and shaking the head. In another implementation scenario, the wearable device is integrated with an infrared sensor that can recognize user actions such as blinking; this embodiment is not limited in this respect.
Correspondences between user actions and processing instructions may be pre-stored in the wearable device. In one implementation scenario, the correspondence between a user action and a processing instruction is set by default; in another implementation scenario, it can be customized by the user. For example, blinking may correspond to a processing instruction for adjusting image extraction, and nodding may correspond to a processing instruction for adjusting image resolution; this embodiment is not specifically limited in this respect.
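By way of illustration only, a minimal Kotlin sketch of such a pre-stored correspondence follows; the patent does not define a concrete data model, so the enum values, map and helper names here are all hypothetical:

```kotlin
// Hypothetical action and instruction types; the patent only requires that
// some action-to-instruction correspondence be pre-stored on the device.
enum class UserAction { BLINK, NOD, SHAKE_HEAD }
enum class ProcessingInstruction { EXTRACTION_ADJUST, RESOLUTION_ADJUST, APP_SWITCH }

// Default correspondence, overridable by user customization.
val actionToInstruction = mutableMapOf(
    UserAction.BLINK to ProcessingInstruction.EXTRACTION_ADJUST,
    UserAction.NOD to ProcessingInstruction.RESOLUTION_ADJUST,
    UserAction.SHAKE_HEAD to ProcessingInstruction.APP_SWITCH,
)

// User-defined override, e.g. remapping a gesture to a different instruction.
fun customize(action: UserAction, instruction: ProcessingInstruction) {
    actionToInstruction[action] = instruction
}

fun instructionFor(action: UserAction): ProcessingInstruction? = actionToInstruction[action]
```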
The wearable device may be connected to the mobile terminal via Bluetooth, or the two may form a local area network through a router to establish the connection; this embodiment does not specifically limit the connection mode between the wearable device and the mobile terminal.
Step S12: process the first image of the current application into a second image according to the processing instruction.
The mobile terminal processes the first image of the current application into a second image according to the received processing instruction. The current application may be a game application, or another image-related application such as a movie application. Correspondingly, the first image may be an image frame of the game application or a video frame of the movie application; this embodiment is not specifically limited in this respect.
Step S13: send the second image to the wearable device.
The mobile terminal sends the second image, processed according to the processing instruction, to the wearable device, so that the user obtains the desired image.
Through this embodiment, a processing instruction generated by the wearable device based on a sensed user action is received, the first image of the current application is processed into a second image according to the processing instruction, and the second image is then sent to the wearable device. Because the wearable device generates the processing instruction based on the sensed user action and sends it to the mobile terminal, and the mobile terminal performs the image processing after receiving the instruction, the image processing task is executed by the mobile terminal with its higher processing capability, which preserves the image processing effect as far as possible; the processed image is then transmitted back to the wearable device, so that the image processing functions of the wearable device are realized without adding an extra module, improving the user experience.
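Putting steps S11 to S13 together, the mobile-terminal side can be pictured as the following schematic Kotlin loop. This is a sketch only: the Image class is a placeholder for an application frame, and the function parameters stand in for whatever Bluetooth or WiFi transport and rendering pipeline the devices actually use (none of which the patent specifies); ProcessingInstruction reuses the earlier sketch.

```kotlin
// Placeholder frame type; in practice this might be a Bitmap or raw buffer.
class Image

fun mobileTerminalLoop(
    receiveInstruction: () -> ProcessingInstruction, // from the wearable (step S11)
    currentFrame: () -> Image,                       // first image of the current application
    process: (Image, ProcessingInstruction) -> Image,
    sendImage: (Image) -> Unit,                      // back to the wearable (step S13)
) {
    while (true) {
        val instruction = receiveInstruction()   // step S11
        val first = currentFrame()
        val second = process(first, instruction) // step S12
        sendImage(second)                        // step S13
    }
}
```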
In an embodiment, the first image is a two-dimensional image, and the second image obtained according to the processing instruction is also a two-dimensional image. In this case, step S13 of the above embodiment may specifically include: sending the second image and a format conversion instruction to the wearable device, so that the wearable device converts the second image into a three-dimensional image in response to the format conversion instruction.
In one implementation scenario, the wearable device is integrated with a GPU (Graphics Processing Unit), which can perform the complex mathematical and geometric computations needed for image processing operations such as graphics rendering; the GPU may also provide 2D (two-dimensional) or 3D graphics acceleration. By integrating a GPU in the wearable device, two-dimensional images can be quickly converted into three-dimensional images.
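The patent does not specify a conversion algorithm. As an illustration only, one trivial baseline used by head-mounted displays is to render the same 2D frame for both eyes as a side-by-side stereo pair (no depth estimation); a sketch using the standard Android Bitmap and Canvas APIs:

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Duplicate a 2D frame into a side-by-side stereo pair: the same image is
// drawn once for the left eye and once for the right eye.
fun toSideBySideStereo(frame: Bitmap): Bitmap {
    val stereo = Bitmap.createBitmap(frame.width * 2, frame.height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(stereo)
    canvas.drawBitmap(frame, 0f, 0f, null)                     // left eye
    canvas.drawBitmap(frame, frame.width.toFloat(), 0f, null)  // right eye
    return stereo
}
```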
In one implementation scenario, the wearable device and the mobile terminal may negotiate the allocation of format conversion tasks when establishing a communication connection, so as to decide whether the image format conversion is performed by the mobile terminal, the wearable device, or both. For example, by exchanging internal parameters such as CPU (Central Processing Unit) capability, GPU processing capability and memory size, the two sides decide which of them performs the operation of converting a two-dimensional image into a three-dimensional image. In this embodiment, for instance, it is determined through negotiation that the wearable device alone performs the conversion; or, in one implementation scenario, the user may specify through customization that the wearable device alone completes the conversion; or, in another implementation scenario, in order to share the load, the wearable device may take on the conversion task in a time-shared manner through negotiation with the mobile terminal. For example, after the mobile terminal performs the conversion task three times, the wearable device performs it twice, i.e., the two devices alternate format conversion according to a count of conversions; or, after the mobile terminal performs the format conversion task for three minutes, the wearable device performs it for one minute, i.e., the two devices alternate format conversion according to elapsed time. This embodiment is not limited in this respect.
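The count-based alternation described above can be sketched as a tiny round-robin scheduler; the 3:2 split mirrors the example in the text and is not a fixed requirement:

```kotlin
// Decide, frame by frame, which side performs the 2D-to-3D conversion.
// After `mobileRuns` conversions on the mobile terminal, the wearable
// device takes the next `wearableRuns` conversions, and so on.
class ConversionScheduler(
    private val mobileRuns: Int = 3,
    private val wearableRuns: Int = 2,
) {
    private var counter = 0

    /** Returns true if the mobile terminal should convert the next frame. */
    fun mobileTurn(): Boolean {
        val slot = counter % (mobileRuns + wearableRuns)
        counter++
        return slot < mobileRuns
    }
}
```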
In one implementation scenario, when it has been negotiated with the mobile terminal, or customized by the user, that the wearable device takes on the task of converting two-dimensional images into three-dimensional images, the mobile terminal may send only the second image to the wearable device without a format conversion instruction; after receiving the second image, the wearable device converts it into a three-dimensional image by default. In another implementation scenario, when the wearable device takes on the conversion task in a time-shared manner with the mobile terminal, through negotiation or user customization, then during the periods in which the wearable device is responsible for the conversion, the mobile terminal sends the second image together with a format conversion instruction to the wearable device, so that the wearable device converts the second image into a three-dimensional image in response to the format conversion instruction.
In another embodiment, please refer to FIG. 2, which is a schematic flowchart of another embodiment of an image processing method according to the present application. In this embodiment, the first image is a two-dimensional image, and the second image transmitted to the wearable device is a three-dimensional image.
Before the above step S13, the image processing method of the present application may further include:
step S21: and converting the second image from the two-dimensional image into a three-dimensional image.
The mobile terminal converts the second image from a two-dimensional image into a three-dimensional image. In this embodiment, the fact that the mobile terminal executes this conversion task may be determined through the format conversion task allocation negotiation completed when the wearable device and the mobile terminal establish a communication connection, or the user may specify that the mobile terminal completes the conversion; this is not limited here.
In one implementation scenario, the mobile terminal and the wearable device may also take on the conversion task in a time-shared manner; reference may be made to the foregoing description, which is not repeated here.
The above step S13 may include:
step S22: and sending the second image converted into the three-dimensional image to the wearable device.
The mobile terminal sends the second image, converted into a three-dimensional image, to the wearable device. In one implementation scenario, the mobile terminal transmits it over Bluetooth; in another implementation scenario, it may transmit it over WiFi (Wireless Fidelity).
In another embodiment, if the processing instruction is an extraction adjustment instruction, the above step S12 may specifically include: extracting partial images from a plurality of first images according to the extraction adjustment instruction to obtain the second image.
The extraction adjustment instruction is an instruction for adjusting how many partial images are extracted per second from the plurality of first images of the current application; when the mobile terminal receives this processing instruction, it performs the operation of adjusting the number of images extracted from the plurality of first images. Specifically, referring to FIG. 3, the step of extracting partial images from the plurality of first images according to the extraction adjustment instruction to obtain the second image may include the following steps:
the mobile terminal has a first correspondence between the reception frequency of the processing instructions and the decimation rate pre-stored. Each time the wearable device generates a processing instruction corresponding to the sensed user action, for example, each time the user blinks his or her eye, a decimation adjustment instruction is generated, so that the frequency of eye blinking of the user is the frequency of generating the processing instruction, that is, the frequency of receiving the processing instruction by the mobile terminal, for example: 1/second, 2/second, etc.; or, the user generates an extraction adjustment command once per click, so that the frequency of user's click is the frequency of generating the processing command, that is, the frequency of receiving the processing command by the mobile terminal, for example: 1/sec, 2/sec, etc., and the present embodiment is not particularly limited herein. The extraction rate is the number of frames per second, e.g. 10 frames/second, 20 frames/second, at which the mobile terminal extracts a partial image from the plurality of first images currently applied. When the current application is a game application or a movie application, the game or the movie has a certain frame rate, which indicates the frequency of continuous appearance of the applied images, for example, 35 frames/second, and in this embodiment, indicates that 35 frames of first images continuously appear per second; 30 frames/second, it means that 30 first images occur per second. The mobile terminal extracts a partial image from the first images to obtain a second image. In one implementation scenario, the decimation rate may be preset to be the same as the frame rate of the current application.
Step S31: acquire the receiving frequency of the processing instruction within a set time, and obtain the extraction rate corresponding to the receiving frequency according to the first correspondence.
The mobile terminal acquires the receiving frequency of the processing instruction within a set time, where the set time may be 3 seconds, 5 seconds, 7 seconds, etc., and the receiving frequency may be once per second, twice per second, etc. According to the first correspondence, the extraction rate corresponding to the receiving frequency is obtained; for example, a receiving frequency of once per second corresponds to an extraction rate of 20 frames/second, and a receiving frequency of twice per second corresponds to an extraction rate of 25 frames/second. This is not limited here.
Step S32: extract partial images from the plurality of first images according to the extraction adjustment instruction and the extraction rate to obtain the second image.
The mobile terminal extracts partial images from the plurality of first images according to the extraction adjustment instruction and the extraction rate to obtain the second images. For example, if the frame rate of the current application is 35 frames/second, i.e., 35 first images appear every second, the mobile terminal adjusts the number of frames extracted from the plurality of first images according to the extraction adjustment instruction, with the specific number of extracted frames determined by the extraction rate. With an extraction rate of 20 frames/second, the mobile terminal extracts 20 of the 35 first images appearing each second as second images and sends them to the wearable device, so that 20 frames appear per second on the wearable device. The frame rate on the wearable device is thereby adjusted, and ultimately the picture frame rate is adjusted according to the user's personal experience.
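Steps S31 and S32 amount to a table lookup followed by an evenly spaced frame selection. A hedged Kotlin sketch, with the correspondence values taken from the examples above (real values are a design choice):

```kotlin
// First correspondence: receiving frequency (instructions per second)
// -> extraction rate (frames per second).
val firstCorrespondence = mapOf(1 to 20, 2 to 25)

// Keep `rate` frames out of one second's worth of frames, evenly spaced.
fun <T> extractFrames(framesInOneSecond: List<T>, receivingFrequency: Int): List<T> {
    val rate = firstCorrespondence[receivingFrequency] ?: framesInOneSecond.size
    if (rate >= framesInOneSecond.size) return framesInOneSecond
    val stride = framesInOneSecond.size.toDouble() / rate
    return (0 until rate).map { framesInOneSecond[(it * stride).toInt()] }
}
```

With a 35-frame second and a receiving frequency of once per second, this keeps 20 evenly spaced frames, matching the example above.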
In another embodiment, the processing instruction is a resolution adjustment instruction, and the above step S12 may specifically include: adjusting the resolution of the first image according to the resolution adjustment instruction to obtain the second image.
The resolution adjustment instruction is a processing instruction for adjusting the resolution of an image; when the mobile terminal receives this processing instruction, it performs the operation of adjusting the resolution of the first image. Specifically, referring to FIG. 4, the step of adjusting the resolution of the first image according to the resolution adjustment instruction to obtain the second image may include the following steps:
the mobile terminal is prestored with a second corresponding relation between the receiving frequency of the processing instruction and the resolution parameter. Each time the wearable device generates a processing instruction corresponding to the sensed user action, for example, each time the user blinks his or her eye, a decimation adjustment instruction is generated, so that the frequency of eye blinking of the user is the frequency of generating the processing instruction, that is, the frequency of receiving the processing instruction by the mobile terminal, for example: 1/second, 2/second, etc.; or, the user generates an extraction adjustment command once per click, so that the frequency of user's click is the frequency of generating the processing command, that is, the frequency of receiving the processing command by the mobile terminal, for example: 1/sec, 2/sec, etc., and the present embodiment is not particularly limited herein. The resolution determines the fineness of the bitmap image, and generally, the higher the resolution of the image, the more pixels are included, the sharper the image becomes, and of course, the storage space occupied by the image file also increases. The resolution parameters are typically expressed in pixels per inch (ppi) or dots per inch (dpi).
Step S41: acquire the receiving frequency of the processing instruction within a set time, and obtain the resolution parameter corresponding to the receiving frequency according to the second correspondence.
The mobile terminal acquires the receiving frequency of the processing instruction within a set time, where the set time may be 3 seconds, 5 seconds, 7 seconds, etc., and the receiving frequency may be once per second, twice per second, etc. According to the second correspondence, the resolution parameter corresponding to the receiving frequency is obtained; for example, a receiving frequency of once per second corresponds to a resolution of 480 × 272 pixels, and a receiving frequency of twice per second corresponds to 854 × 480 pixels. This embodiment is not limited here.
Step S42: adjust the resolution of the first image according to the resolution adjustment instruction and the resolution parameter to obtain a second image having that resolution parameter.
The mobile terminal adjusts the resolution of the first image according to the resolution adjustment instruction and the resolution parameter to obtain a second image having that resolution parameter. For example, if the resolution of the current application's images is 854 × 480 and the resolution parameter is 480 × 272, the mobile terminal adjusts the resolution of the first image of the current application from 854 × 480 to 480 × 272, takes the adjusted image as the second image, and transmits it to the wearable device. The image resolution on the wearable device is thereby adjusted, and ultimately the image resolution is adjusted according to the user's personal experience.
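Steps S41 and S42 follow the same lookup pattern; a sketch using Android's Bitmap.createScaledBitmap, with the correspondence values taken from the examples above:

```kotlin
import android.graphics.Bitmap

// Second correspondence: receiving frequency (instructions per second)
// -> target resolution in pixels.
val secondCorrespondence = mapOf(
    1 to Pair(480, 272),
    2 to Pair(854, 480),
)

fun adjustResolution(first: Bitmap, receivingFrequency: Int): Bitmap {
    val (w, h) = secondCorrespondence[receivingFrequency] ?: return first
    // Resample to the target size; filter = true selects bilinear filtering.
    return Bitmap.createScaledBitmap(first, w, h, true)
}
```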
Referring to FIG. 5, FIG. 5 is a schematic flowchart of yet another embodiment of an image processing method according to the present application. After the above step S11, the method may further include:
step S51: and judging whether the processing instruction is an application switching instruction, if so, only needing to be executed in the step S52, and if not, executing the step S53.
An application switching instruction is an instruction for switching the application currently running on the mobile terminal to one the user desires, for example switching from a game application to a movie application, or vice versa; this embodiment is not specifically limited here.
Step S52: switch the current application.
If the processing instruction is an application switching instruction, the current application is switched. In one implementation scenario, during the switching process the screen of the mobile terminal is transmitted to the wearable device and the application being switched to is displayed on the wearable device, so that the user explicitly knows which application is currently selected; when the desired application is reached, the wearable device senses a preset action associated with a "confirm" operation and generates a corresponding "confirm" processing instruction, thereby switching to the desired application. In another implementation scenario, a third correspondence between the receiving frequency of application switching instructions and the target application may be pre-stored in the mobile terminal, so that the mobile terminal switches to the application desired by the user according to the application switching instruction and the third correspondence. For example, the wearable device generates an application switching instruction based on a sensed head-shaking motion, one instruction per shake; the mobile terminal determines the target application, e.g., a movie application, from the frequency of received switching instructions, e.g., twice per second, and the third correspondence, and then switches from the current application to the movie application. In other implementation scenarios, application switching may also be implemented in other manners; this embodiment is not specifically limited here.
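The third correspondence is again a small lookup table; a hypothetical Kotlin sketch (the application identifiers are placeholders, not names from the patent):

```kotlin
// Third correspondence: receiving frequency of application switching
// instructions (per second) -> target application package.
val thirdCorrespondence = mapOf(
    1 to "com.example.game",   // placeholder package names
    2 to "com.example.movie",
)

fun targetApplication(switchFrequency: Int): String? = thirdCorrespondence[switchFrequency]
```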
Step S53: execute the step of processing the first image of the current application into a second image according to the processing instruction, and the subsequent steps.
If the processing instruction is not an application switching instruction, the above step S12 and its subsequent steps are executed.
Through this embodiment, the current application can be switched according to the user's action, meeting the user's needs across different types of applications or within the same type of application.
Referring to FIG. 6, FIG. 6 is a schematic flowchart of another embodiment of an image processing method according to the present application. Specifically, the image processing method of this embodiment includes:
step S61: processing instructions corresponding to the sensed user actions are generated.
Wearable devices may include, but are not limited to, VR glasses, 3D (three-dimensional) glasses, and the like. In one implementation scenario, the wearable device is integrated with a posture sensor that can recognize user actions such as nodding and shaking the head. In another implementation scenario, the wearable device is integrated with an infrared sensor that can recognize user actions such as blinking; this embodiment is not limited in this respect.
Correspondences between user actions and processing instructions may be pre-stored in the wearable device. In one implementation scenario, the correspondence between a user action and a processing instruction is set by default; in another implementation scenario, it can be customized by the user. For example, blinking may correspond to a processing instruction for adjusting image extraction, and nodding may correspond to a processing instruction for adjusting image resolution; this embodiment is not specifically limited in this respect.
The wearable device may be connected to the mobile terminal via Bluetooth, or the two may form a local area network through a router to establish the connection; this embodiment does not specifically limit the connection mode between the wearable device and the mobile terminal.
Step S62: send the processing instruction to the mobile terminal, so that the mobile terminal processes the first image of the current application into a second image according to the processing instruction.
The wearable device may send the processing instruction to the mobile terminal via Bluetooth or via WiFi (Wireless Fidelity); this is not specifically limited here.
The mobile terminal processes the first image of the current application into a second image according to the received processing instruction. The current application may be a game application, or another image-related application such as a movie application. Correspondingly, the first image may be an image frame of the game application or a video frame of the movie application; this embodiment is not specifically limited in this respect.
Step S63: receive the second image sent by the mobile terminal.
The wearable device receives the second image processed by the mobile terminal according to the processing instruction, so that the user obtains the desired image.
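The wearable side of steps S61 to S63 mirrors the mobile-terminal loop sketched earlier; senseAction, sendInstruction, receiveImage and display are placeholders for the device's sensor and transport layers, and UserAction, instructionFor and Image reuse the earlier sketches:

```kotlin
fun wearableLoop(
    senseAction: () -> UserAction,
    sendInstruction: (ProcessingInstruction) -> Unit,
    receiveImage: () -> Image,
    display: (Image) -> Unit,
) {
    while (true) {
        val action = senseAction()
        val instruction = instructionFor(action) ?: continue // step S61
        sendInstruction(instruction)                         // step S62
        display(receiveImage())                              // step S63
    }
}
```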
Through this embodiment, the wearable device generates a processing instruction corresponding to a sensed user action and sends it to the mobile terminal, so that the mobile terminal processes the first image of the current application into a second image according to the processing instruction, and the wearable device finally receives the second image sent by the mobile terminal. Because the image processing task is executed by the mobile terminal with its higher processing capability, the image processing effect is preserved as far as possible; the processed image is then transmitted back to the wearable device, so that the image processing functions of the wearable device are realized without adding an extra module, improving the user experience.
In another embodiment, the first image is a two-dimensional image and the received second image is a three-dimensional image; that is, the mobile terminal completes the operation of converting the two-dimensional image into a three-dimensional image.
In yet another embodiment, the first image and the second image are both two-dimensional images, and after step S63 the method may further include: converting the format of the second image to obtain a three-dimensional image; that is, the wearable device completes the operation of converting the two-dimensional image into a three-dimensional image.
In one implementation scenario, the wearable device is integrated with a GPU (Graphics Processing Unit), which can perform the complex mathematical and geometric computations needed for image processing operations such as graphics rendering; the GPU may also provide 2D (two-dimensional) or 3D graphics acceleration. By integrating a GPU in the wearable device, two-dimensional images can be quickly converted into three-dimensional images.
In one implementation scenario, the wearable device and the mobile terminal may negotiate the allocation of format conversion tasks when establishing a communication connection, so as to decide whether the image format conversion is performed by the mobile terminal, the wearable device, or both. For example, by exchanging internal parameters such as CPU (Central Processing Unit) capability, GPU processing capability and memory size, the two sides decide which of them performs the operation of converting a two-dimensional image into a three-dimensional image. For example, it may be determined through negotiation that the wearable device alone performs the conversion; or, in one implementation scenario, the user may specify through customization that the wearable device alone completes the conversion; or, in another implementation scenario, in order to share the load, the wearable device may take on the conversion task in a time-shared manner through negotiation with the mobile terminal. For example, after the mobile terminal performs the conversion task three times, the wearable device performs it twice, i.e., the two devices alternate format conversion according to a count of conversions; or, after the mobile terminal performs the format conversion task for three minutes, the wearable device performs it for one minute, i.e., the two devices alternate format conversion according to elapsed time. This embodiment is not limited in this respect.
In one implementation scenario, when it has been negotiated with the mobile terminal, or customized by the user, that the wearable device takes on the task of converting two-dimensional images into three-dimensional images, the mobile terminal may send only the second image to the wearable device without a format conversion instruction; after receiving the second image, the wearable device converts it into a three-dimensional image by default. In another implementation scenario, when the wearable device takes on the conversion task in a time-shared manner with the mobile terminal, through negotiation or user customization, then during the periods in which the wearable device is responsible for the conversion, the mobile terminal sends the second image together with a format conversion instruction to the wearable device, so that the wearable device converts the second image into a three-dimensional image in response to the format conversion instruction.
Referring to FIG. 7, FIG. 7 is a schematic block diagram of an embodiment of a mobile terminal according to the present application. In this embodiment, the mobile terminal includes a processing circuit 71 and a communication circuit 72 coupled to each other; when the processing circuit 71 and the communication circuit 72 operate, the image processing method of the above embodiments can be implemented. Specifically, the processing circuit 71 is configured to control the communication circuit 72 to receive a processing instruction generated by the wearable device based on a sensed user action; the processing circuit 71 is further configured to process the first image of the current application into a second image according to the processing instruction; and the processing circuit 71 is further configured to control the communication circuit 72 to send the second image to the wearable device. The mobile terminal may be a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), etc.; this embodiment is not limited here.
The processing circuit 71 may also be referred to as a CPU (Central Processing Unit). The processing circuit 71 may be an integrated circuit chip with signal processing capability. The processing circuit 71 may also be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, the processing circuit 71 may be implemented jointly by a plurality of integrated circuit chips.
Through this embodiment, the wearable device generates a processing instruction based on a sensed user action and sends it to the mobile terminal, and the mobile terminal performs the image processing after receiving the instruction. The image processing task is thus executed by the mobile terminal with its higher processing capability, which preserves the image processing effect as far as possible; the processed image is then transmitted back to the wearable device, so that the image processing functions of the wearable device are realized without adding an extra module, improving the user experience.
In another embodiment, the first image is a two-dimensional image and the second image obtained according to the processing instruction is also a two-dimensional image; in this case, the processing circuit 71 controlling the communication circuit 72 to send the second image to the wearable device includes: the processing circuit 71 controls the communication circuit 72 to send the second image and a format conversion instruction to the wearable device, so that the wearable device converts the second image into a three-dimensional image in response to the format conversion instruction. Alternatively, the first image is a two-dimensional image and the second image sent to the wearable device is a three-dimensional image; in this case, before the processing circuit 71 controls the communication circuit 72 to send the second image to the wearable device, the processing circuit 71 converts the second image from a two-dimensional image into a three-dimensional image, and the processing circuit 71 controlling the communication circuit 72 to send the second image to the wearable device includes: the processing circuit 71 controls the communication circuit 72 to send the second image, converted into a three-dimensional image, to the wearable device.
In another embodiment, the processing instruction is an extraction adjustment instruction, and the processing circuit 71 processing the first image of the current application into the second image according to the processing instruction includes: the processing circuit 71 extracts partial images from the plurality of first images according to the extraction adjustment instruction to obtain the second image.
In another embodiment, the processing instruction is a resolution adjustment instruction, and the processing circuit 71 processing the first image of the current application into the second image according to the processing instruction includes: the processing circuit 71 adjusts the resolution of the first image according to the resolution adjustment instruction to obtain the second image.
In another embodiment, the mobile terminal is provided with a memory in which a first correspondence between the receiving frequency of processing instructions and the extraction rate is pre-stored, and the processing circuit 71 extracting partial images from the plurality of first images according to the extraction adjustment instruction to obtain the second image includes: the processing circuit 71 acquires the receiving frequency of the processing instruction within a set time, and obtains the extraction rate corresponding to the receiving frequency according to the first correspondence; the processing circuit 71 then extracts partial images from the plurality of first images according to the extraction adjustment instruction and the extraction rate to obtain the second image.
In another embodiment, a second correspondence between the receiving frequency of processing instructions and resolution parameters is pre-stored in the mobile terminal, and the processing circuit 71 adjusting the resolution of the first image according to the resolution adjustment instruction to obtain the second image includes: the processing circuit 71 acquires the receiving frequency of the processing instruction within a set time, and obtains the resolution parameter corresponding to the receiving frequency according to the second correspondence; the processing circuit 71 then adjusts the resolution of the first image according to the resolution adjustment instruction and the resolution parameter to obtain a second image having that resolution parameter.
In another embodiment, after the processing circuit 71 controls the communication circuit 72 to receive the processing instruction generated by the wearable device based on the sensed user action, the processing circuit 71 determines whether the processing instruction is an application switching instruction; if so, the processing circuit 71 switches the current application; otherwise, the processing circuit 71 controls itself and the communication circuit 72 to execute the step of processing the first image of the current application into a second image according to the processing instruction and the subsequent steps.
Referring to FIG. 8, FIG. 8 is a schematic block diagram of another embodiment of a mobile terminal according to the present application. The mobile terminal includes: a Radio Frequency (RF) circuit 810, a memory 820, an input unit 830, a display unit 840, a sensor 850, an audio circuit 860, a Wireless Fidelity (WiFi) module 870, a processor 880, and a power supply 890. Those skilled in the art will appreciate that the architecture shown in FIG. 8 does not limit the mobile terminal, which may include more or fewer components than those shown, combine some components, or arrange the components differently. Each component of the mobile terminal of this embodiment is described below with reference to FIG. 8:
The RF circuit 810 may be used to receive and transmit signals during information transmission and reception; in particular, it receives downlink information from a base station and passes it to the processor 880 for processing, and transmits uplink data to the base station. In general, the RF circuit 810 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 810 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 820 may be used to store software programs and modules, and the processor 880 executes the various functional applications and data processing of the mobile terminal by running the software programs and modules stored in the memory 820. The memory 820 may mainly include a program storage area and a data storage area: the program storage area may store the operating system and the application programs required by at least one function (such as a sound playing function or an image playing function), while the data storage area may store data created according to the use of the mobile terminal (such as audio data or a phonebook). Furthermore, the memory 820 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input unit 830 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the input unit 830 may include a touch panel 831 and other input devices 832. The touch panel 831, also referred to as a touch screen, can collect touch operations performed by the user on or near it (for example, operations performed on or near the touch panel 831 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 831 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends these to the processor 880, and can receive and execute commands sent by the processor 880. In addition, the touch panel 831 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 831, the input unit 830 may include other input devices 832, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse, a joystick, and the like.
The display unit 840 may be used to display information input by the user or provided to the user, as well as the various menus of the mobile terminal. The display unit 840 may include a display panel 841, which may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel 831 can overlay the display panel 841; when the touch panel 831 detects a touch operation on or near it, it passes the operation to the processor 880 to determine the type of touch event, and the processor 880 then provides a corresponding visual output on the display panel 841 according to the type of touch event. Although in FIG. 8 the touch panel 831 and the display panel 841 are two separate components implementing the input and output functions of the mobile terminal, in some embodiments they may be integrated to implement both functions.
The mobile terminal may also include at least one sensor 850, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel 841 according to the brightness of ambient light. The audio circuit 860, the speaker 861 and the microphone 862 may provide an audio interface between the user and the mobile terminal. The audio circuit 860 can transmit the electrical signal converted from received audio data to the speaker 861, which converts it into a sound signal for output; conversely, the microphone 862 converts a collected sound signal into an electrical signal, which is received by the audio circuit 860 and converted into audio data; the audio data is then processed by the processor 880 and transmitted via the RF circuit 810 to, for example, another mobile terminal, or output to the memory 820 for further processing.
WiFi is a short-range wireless transmission technology, and the mobile terminal can provide the user with wireless broadband Internet access through the wireless fidelity module 870. Although FIG. 8 shows the wireless fidelity module 870, it is understood that it is not an essential component of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The processor 880 is the control center of the mobile terminal. It connects the various parts of the entire terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 820 and calling the data stored in the memory 820, thereby monitoring the mobile terminal as a whole. Optionally, the processor 880 may include one or more processing units; preferably, the processor 880 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like. A modem processor may also not be integrated into the processor 880.
The mobile terminal also includes a power supply 890 (such as a battery) for powering the various components. Preferably, the power supply may be logically connected to the processor 880 through a power management system, so that charging, discharging, and power consumption management are handled by the power management system. Although not shown, the mobile terminal may further include a camera, a Bluetooth module, and the like, which are not described here.
Referring to fig. 9, fig. 9 is a schematic frame diagram of a wearable device according to an embodiment of the present application. In this embodiment, the wearable device includes a processing circuit 91, a communication circuit 92, and a human-computer interaction circuit 93, where the communication circuit 92 and the human-computer interaction circuit 93 are coupled to the processing circuit 91, and when the processing circuit 91, the communication circuit 92, and the human-computer interaction circuit 93 operate, the image processing method in the above embodiment can be implemented. Specifically, the processing circuit 91 is configured to generate a processing instruction corresponding to a user action sensed by the human-computer interaction circuit 93, and the processing circuit 91 is further configured to control the communication circuit 92 to send the processing instruction to the mobile terminal, so that the mobile terminal processes a currently applied first image into a second image according to the processing instruction; the processing circuit 91 is further configured to control the communication circuit 92 to receive the second image transmitted by the mobile terminal. Wearable devices include, but are not limited to, VR glasses, 3D glasses, and the present embodiment is not limited thereto.
The Processing circuit 91 may also be referred to as a CPU (Central Processing Unit). The processing circuit 91 may be an integrated circuit chip having signal processing capabilities. The processing Circuit 91 may also be a general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. In addition, the processing circuit 91 may be commonly implemented by a plurality of circuit-forming chips.
Through the above embodiment, the wearable device generates a processing instruction based on a sensed user action and sends the instruction to the mobile terminal, which performs the image processing after receiving it. Because the image processing task is executed by the mobile terminal, which has the greater processing capability, the image processing effect is ensured as far as possible, and the processed image is then transmitted back to the wearable device. The image processing function of the wearable device is thus realized without adding an extra module, and the user experience is improved.
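Purely as illustration of this round-trip, a minimal sketch follows; the embodiment specifies no concrete API, so the function names, the dict stand-ins for images, and the instruction format below are all assumptions (Python):

    from dataclasses import dataclass

    @dataclass
    class ProcessingInstruction:
        kind: str      # e.g. "resolution_adjustment" (hypothetical label)
        payload: dict  # instruction-specific parameters

    def mobile_terminal_process(first_image: dict, instr: ProcessingInstruction) -> dict:
        # The terminal, with its greater processing capability, performs the
        # image processing and returns the processed second image.
        second_image = dict(first_image)
        if instr.kind == "resolution_adjustment":
            second_image["size"] = instr.payload["resolution"]
        return second_image

    def wearable_on_user_action(first_image: dict) -> dict:
        # 1. Generate an instruction corresponding to the sensed user action;
        # 2. send it to the mobile terminal; 3. receive the second image back.
        instr = ProcessingInstruction("resolution_adjustment", {"resolution": (1280, 720)})
        return mobile_terminal_process(first_image, instr)

    print(wearable_on_user_action({"size": (3840, 2160), "format": "2D"}))
    # -> {'size': (1280, 720), 'format': '2D'}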
In another embodiment, the first image is a two-dimensional image, and the second image that the processing circuit 91 controls the communication circuit 92 to receive is a three-dimensional image.
In yet another embodiment, the first image and the second image received by the processing circuit 91 and controlling the communication circuit 92 are two-dimensional images, and after the processing circuit 91 and controlling the communication circuit 92 to receive the second image sent by the mobile terminal, the processing circuit 91 is further configured to format convert the second image to obtain a three-dimensional image.
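The embodiment does not fix a conversion algorithm; one simple possibility, assumed here purely for illustration, is side-by-side duplication of the 2D frame into identical left-eye and right-eye views:

    def to_side_by_side_3d(frame_2d):
        # Duplicate each pixel row so the frame carries identical
        # left-eye and right-eye views placed side by side.
        return [row + row for row in frame_2d]

    frame = [[1, 2],
             [3, 4]]                  # a tiny 2x2 stand-in for a 2D image
    print(to_side_by_side_3d(frame))  # -> [[1, 2, 1, 2], [3, 4, 3, 4]]

A real stereoscopic conversion would typically synthesize a parallax-shifted second view; the sketch only shows the format change the embodiment describes.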
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. An image processing method, characterized in that the method comprises:
receiving a processing instruction generated by a wearable device based on a sensed user action;
processing the currently applied first image into a second image according to the processing instruction;
sending the second image to the wearable device.
2. The method of claim 1,
the first image is a two-dimensional image, and the second image obtained by processing according to the processing instruction is a two-dimensional image;
the sending the second image to the wearable device comprises:
sending the second image and a format conversion instruction to the wearable device, so that the wearable device converts the second image into a three-dimensional image in response to the format conversion instruction;
or, the first image is a two-dimensional image, and the second image sent to the wearable device is a three-dimensional image;
before the sending the second image to the wearable device, the method further comprises:
converting the second image from a two-dimensional image into a three-dimensional image;
and the sending the second image to the wearable device comprises:
sending the second image converted into the three-dimensional image to the wearable device.
3. The method of claim 1,
the processing instruction is an extraction adjustment instruction, and the processing the currently applied first image into a second image according to the processing instruction comprises:
extracting partial images from a plurality of first images according to the extraction adjustment instruction to obtain the second image;
and/or the processing instruction is a resolution adjustment instruction, and the processing the currently applied first image into the second image according to the processing instruction comprises:
performing resolution adjustment on the first image according to the resolution adjustment instruction to obtain the second image.
4. The method of claim 3,
the mobile terminal prestores a first correspondence between a reception frequency of the processing instruction and an extraction rate, and the extracting partial images from the plurality of first images according to the extraction adjustment instruction to obtain the second image comprises:
acquiring the reception frequency of the processing instruction within a set time, and acquiring the extraction rate corresponding to the reception frequency according to the first correspondence;
and extracting partial images from the plurality of first images according to the extraction adjustment instruction and the extraction rate to obtain the second image.
5. The method of claim 3,
the mobile terminal prestores a second correspondence between the reception frequency of the processing instruction and a resolution parameter, and the performing resolution adjustment on the first image according to the resolution adjustment instruction to obtain the second image comprises:
acquiring the reception frequency of the processing instruction within a set time, and acquiring the resolution parameter corresponding to the reception frequency according to the second correspondence;
and performing resolution adjustment on the first image according to the resolution adjustment instruction and the resolution parameter to obtain the second image having the resolution parameter.
6. The method of claim 1, wherein after receiving the processing instruction generated by the wearable device based on the sensed user action, the method further comprises:
judging whether the processing instruction is an application switching instruction;
if yes, switching the current application;
and if not, executing the step of processing the currently applied first image into the second image according to the processing instruction and the subsequent steps.
7. An image processing method, characterized in that the method comprises:
generating a processing instruction corresponding to a sensed user action;
sending the processing instruction to a mobile terminal, so that the mobile terminal processes a currently applied first image into a second image according to the processing instruction;
and receiving the second image sent by the mobile terminal.
8. The method of claim 7,
the first image is a two-dimensional image, and the received second image is a three-dimensional image;
or, the first image and the received second image are both two-dimensional images, and after the second image sent by the mobile terminal is received, the method further comprises:
and performing format conversion on the second image to obtain a three-dimensional image.
9. A mobile terminal, characterized in that the mobile terminal comprises:
a processing circuit and a communication circuit coupled to each other;
wherein the processing circuit and the communication circuit are operative to implement the image processing method of any one of claims 1-6.
10. A wearable device, characterized in that the wearable device comprises:
the device comprises a processing circuit, a communication circuit and a human-computer interaction circuit;
the communication circuit and the human-computer interaction circuit are coupled to the processing circuit, and when the processing circuit, the communication circuit, and the human-computer interaction circuit operate, the image processing method of any one of claims 7-8 can be implemented.
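The frequency-dependent lookup of claim 4 can be sketched as follows; the table values and names are invented for illustration, since the claim fixes only that a first correspondence between reception frequency and extraction rate is prestored, not its contents. The terminal counts the processing instructions received within the set time, maps that reception frequency to an extraction rate, and keeps only that fraction of the first images:

    FIRST_CORRESPONDENCE = {  # reception frequency (per window) -> extraction rate
        1: 1.0,    # infrequent instructions: keep every image
        5: 0.5,    # moderate: keep every other image
        10: 0.2,   # frequent: keep one image in five
    }

    def extraction_rate(frequency: int) -> float:
        # Use the largest prestored frequency not exceeding the observed one.
        keys = [k for k in sorted(FIRST_CORRESPONDENCE) if k <= frequency]
        return FIRST_CORRESPONDENCE[keys[-1]] if keys else 1.0

    def extract_images(first_images: list, frequency: int) -> list:
        rate = extraction_rate(frequency)
        step = max(1, round(1 / rate))
        return first_images[::step]  # the extracted "second image(s)"

    print(extract_images(list(range(10)), frequency=5))  # -> [0, 2, 4, 6, 8]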
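A companion sketch for claim 5, with equally invented values: the same reception frequency, looked up in a second prestored correspondence, selects the resolution parameter applied when producing the second image:

    SECOND_CORRESPONDENCE = {  # reception frequency -> resolution parameter (w, h)
        1: (1920, 1080),
        5: (1280, 720),
        10: (640, 360),
    }

    def resolution_for(frequency: int) -> tuple:
        keys = [k for k in sorted(SECOND_CORRESPONDENCE) if k <= frequency]
        return SECOND_CORRESPONDENCE[keys[-1]] if keys else (1920, 1080)

    print(resolution_for(7))  # -> (1280, 720)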
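Claim 6's branch admits a similarly hedged sketch (all names are placeholders): an application switching instruction short-circuits the image-processing path, while any other instruction proceeds to the processing of claims 1-5:

    def handle_instruction(instruction: dict, first_image):
        # An application switching instruction bypasses image processing.
        if instruction.get("kind") == "application_switch":
            return f"switched to {instruction['target']}"
        # Otherwise fall through to processing the first image into the second.
        return first_image  # stand-in for the processing of claims 1-5

    print(handle_instruction({"kind": "application_switch", "target": "gallery"}, None))
    # -> switched to gallery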
CN201910223405.4A 2019-03-22 2019-03-22 Image processing method, mobile terminal and wearable device Withdrawn CN111723035A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910223405.4A CN111723035A (en) 2019-03-22 2019-03-22 Image processing method, mobile terminal and wearable device

Publications (1)

Publication Number Publication Date
CN111723035A (en) 2020-09-29

Family

ID=72563068

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910223405.4A Withdrawn CN111723035A (en) 2019-03-22 2019-03-22 Image processing method, mobile terminal and wearable device

Country Status (1)

Country Link
CN (1) CN111723035A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116107535A (en) * 2022-12-26 2023-05-12 深圳市长丰影像器材有限公司 Method, system, equipment and storage medium for displaying image of microphone

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104866262A (en) * 2014-02-21 2015-08-26 索尼公司 Wearable Device
US20170272784A1 (en) * 2016-03-16 2017-09-21 Xiaomi Inc. Live video broadcasting method and device
CN108055587A (en) * 2017-11-30 2018-05-18 星潮闪耀移动网络科技(中国)有限公司 Sharing method, device, mobile terminal and the storage medium of image file
CN108881781A (en) * 2018-07-17 2018-11-23 广东小天才科技有限公司 The determination method and device of video call process intermediate-resolution
CN108900850A (en) * 2018-05-31 2018-11-27 北京达佳互联信息技术有限公司 A kind of live broadcasting method, device and intelligent glasses
US20190087147A1 (en) * 2014-12-09 2019-03-21 Samsung Electronics Co., Ltd. Mobile device and method for operating mobile device

Similar Documents

Publication Publication Date Title
CN108234276B (en) Method, terminal and system for interaction between virtual images
CN107256555B (en) Image processing method, device and storage medium
CN107817939B (en) Image processing method and mobile terminal
US10956025B2 (en) Gesture control method, gesture control device and gesture control system
CN107886321B (en) Payment method and mobile terminal
CN108196750B (en) Screen interface screenshot method and device and computer-readable storage medium
CN108418969B (en) Antenna feed point switching method and device, storage medium and electronic equipment
CN110750189B (en) Icon display method and device
CN111371705B (en) Download task execution method and electronic device
CN106101776B (en) Method, smart machine and the speaker of volume adjustment
CN108832297B (en) Antenna working method and mobile terminal
CN110806832A (en) Parameter adjusting method and electronic equipment
CN113542825B (en) Screen projection display method, system, terminal device and storage medium
CN106997750B (en) Terminal screen backlight adjusting method, mobile terminal and computer readable storage medium
CN111966436A (en) Screen display control method and device, terminal equipment and storage medium
CN111324407A (en) Animation display method, terminal and computer readable storage medium
CN109343811B (en) Display adjustment method and terminal equipment
CN110198560B (en) Power configuration method and terminal
CN108419283B (en) WIFI hotspot scanning method and mobile terminal
CN108984075B (en) Display mode switching method and device and terminal
CN111447598B (en) Interaction method and display device
CN111723035A (en) Image processing method, mobile terminal and wearable device
WO2020052635A1 (en) Communication method for mobile terminal, and mobile terminal
CN109491631B (en) Display control method and terminal
CN109462727B (en) Filter adjusting method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200929