CN114827468A - Control method, control device, computer-readable storage medium, and mobile terminal - Google Patents


Info

Publication number: CN114827468A
Application number: CN202210443632.XA
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 韩旭
Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50: Constructional details
    • H04N 23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes

Abstract

The application discloses a control method, a control device, a computer-readable storage medium, and a mobile terminal. The control method is applied to the mobile terminal. The mobile terminal comprises a water environment sensor and a camera system, and can operate in an underwater mode. The control method comprises the following steps: in the underwater mode, turning on the camera system in response to a user input; searching a water environment-camera parameter database for a camera parameter corresponding to the current sensor signal output by the water environment sensor; and applying the camera parameter to the camera system. With the control method, the control device, the computer-readable storage medium, and the mobile terminal, the camera parameters in the underwater mode can be determined from the current sensor signal and the water environment-camera parameter database, so that the user does not need to adjust the camera parameters manually, the imaging quality is guaranteed, and the user experience is improved.

Description

Control method, control device, computer-readable storage medium, and mobile terminal
Technical Field
The present application relates to the field of imaging technologies, and in particular, to a control method, a control device, a computer-readable storage medium, and a mobile terminal.
Background
At present, some mobile phones are waterproof, and users can take pictures and record videos with them underwater. However, when a user starts shooting underwater, the shooting environment already differs from that above water, and the quality of pictures shot with the above-water camera parameters is poor.
Disclosure of Invention
The embodiment of the application provides a control method, a control device, a computer readable storage medium and a mobile terminal.
The control method of the embodiment of the application is applied to the mobile terminal. The mobile terminal comprises a water environment sensor and a camera system. The mobile terminal may operate in an underwater mode. The control method comprises the following steps: in the underwater mode, turning on the camera system in response to a user input; searching a camera parameter corresponding to a current sensor signal output by the water environment sensor in a water environment-camera parameter database; applying the camera parameters to the camera system.
The control device of the embodiment of the application comprises a memory and a processor. The memory stores therein a computer program that, when executed by the processor, causes the processor to execute the control method of any of the above embodiments.
The computer-readable storage medium of the embodiments of the present application has stored thereon a computer program that, when executed by a processor, implements the control method of any of the above-described embodiments.
The mobile terminal of the embodiment of the application comprises a water environment sensor, a camera system and a processor. The mobile terminal may operate in an underwater mode. The processor is configured to: in the underwater mode, turning on the camera system in response to a user input; searching a camera parameter corresponding to a current sensor signal output by the water environment sensor in a water environment-camera parameter database; applying the camera parameters to the camera system.
According to the control method, the control device, the computer readable storage medium and the mobile terminal, when in an underwater mode, the camera parameters can be determined through the current sensor signals and the water environment-camera parameter database, so that the imaging quality can be ensured, and the user experience is improved.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart of a control method according to certain embodiments of the present application;
FIGS. 2 and 3 are schematic diagrams of a mobile terminal according to some embodiments of the present application;
FIGS. 4-9 are schematic flow charts of control methods according to certain embodiments of the present application;
FIG. 10 is a schematic view of a control device according to certain embodiments of the present application;
FIG. 11 is a schematic connection diagram of a mobile terminal and a computer-readable storage medium according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
Referring to fig. 1, the present application discloses a control method applied to a mobile terminal 100, wherein the mobile terminal 100 includes a water environment sensor 10 and a camera system 20, the mobile terminal 100 can operate in an underwater mode, and the control method includes:
01: in the underwater mode, the camera system 20 is turned on in response to user input;
02: searching a camera parameter corresponding to a current sensor signal output by the water environment sensor 10 in a water environment-camera parameter database;
03: the camera parameters are applied to the camera system 20.
Referring to fig. 2 and fig. 3 together, the present application discloses a mobile terminal 100. The mobile terminal 100 includes a water environment sensor 10, a camera system 20, and a processor 30, and the mobile terminal 100 may operate in an underwater mode. The control method according to the embodiment of the present application can be implemented by the mobile terminal 100 according to the embodiment of the present application. Wherein, step 01, step 02 and step 03 can be implemented by the processor 30. That is, the processor 30 is configured to: in the underwater mode, the camera system 20 is turned on in response to user input; searching a camera parameter corresponding to a current sensor signal output by the water environment sensor 10 in a water environment-camera parameter database; the camera parameters are applied to the camera system 20.
According to the control method and the mobile terminal 100, in the underwater mode, the camera parameters can be determined through the current sensor signals and the water environment-camera parameter database, so that the imaging quality can be ensured, and the user experience is improved.
Specifically, the user may use the mobile terminal 100 underwater. In one embodiment, the mobile terminal 100 may include a plurality of operating modes, such as a land mode and an underwater mode, and the mobile terminal 100 may switch between them automatically. The water environment sensor 10 may remain on at all times, acquire environmental information, and determine whether the mobile terminal 100 is underwater. If it recognizes that the current environment has changed from land to underwater, the mobile terminal 100 may switch automatically from the land mode to the underwater mode, which makes it convenient for the user to use the mobile terminal 100 underwater.
Referring to fig. 2 and 3 again, the user may take pictures or record video underwater using the mobile terminal 100. The camera system 20 includes a camera, which may be disposed on the front surface of the mobile terminal 100 so that the user can take self-portraits underwater, or on the rear surface so that the user can photograph underwater scenes. The camera system 20 may also include two cameras, one on the front and one on the back, which is not limited herein.
When the user uses the mobile terminal 100 underwater, the mobile terminal 100 enters the underwater mode, and the user may trigger a virtual photographing key or press a photographing button to turn on the camera system 20. The water environment sensor 10 then outputs a current sensor signal, the camera parameter corresponding to that signal is searched for in the water environment-camera parameter database, and the camera parameter is applied to the camera system 20.
Therefore, the water environment sensor 10 can acquire the current sensor signal in real time, and the current sensor signal reflects the current shooting environment. Appropriate camera parameters generated for the current shooting environment can then be used to adjust the camera system 20, which improves the imaging quality of underwater shooting and allows the user to complete underwater shooting conveniently and quickly.
In some embodiments, the water environment-camera parameter database includes a mapping relationship between calibrated sensor signals and calibrated camera parameters. The current sensor signal can therefore be compared with the calibrated sensor signals to find the identical or closest calibrated sensor signal. That calibrated sensor signal corresponds to calibrated camera parameters, which can be taken as the camera parameters corresponding to the current sensor signal and applied to the camera system 20. In this way, the camera parameters can be adjusted to the underwater environment. Storing the mapping relationship in the water environment-camera parameter database speeds up the generation of camera parameters and improves the imaging quality of underwater shooting.
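As a minimal sketch of the closest-calibration lookup described above (in Python; the signal tuples, parameter names, and all numeric values are illustrative assumptions, not values from the patent):

```python
import math

# Hypothetical calibration table: each entry maps a calibrated sensor
# signal (temperature in deg C, pressure in kPa, ambient light in lux)
# to calibrated camera parameters. All values are illustrative.
CALIBRATION_TABLE = [
    ((19.0, 111.0, 800.0), {"iso": 400,  "white_balance": "underwater_shallow"}),
    ((15.0, 131.0, 300.0), {"iso": 800,  "white_balance": "underwater_mid"}),
    ((12.0, 151.0, 80.0),  {"iso": 1600, "white_balance": "underwater_deep"}),
]

def lookup_camera_params(current_signal):
    """Return the calibrated camera parameters whose calibrated sensor
    signal is closest (Euclidean distance) to the current signal."""
    _, params = min(CALIBRATION_TABLE,
                    key=lambda entry: math.dist(entry[0], current_signal))
    return params
```

A current signal near a calibration point, e.g. `lookup_camera_params((18.5, 112.0, 750.0))`, resolves to the shallow-water entry; in a real terminal the table would be the calibrated data set of the water environment-camera parameter database.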
In some embodiments, the water environment-camera parameter database includes a plurality of algorithms, such as an exposure compensation algorithm. The water environment sensor 10 can output a current sensor signal that includes a current ambient light signal. The current ambient light signal may be analyzed to obtain current ambient light information, exposure parameters are then calculated from the current ambient light information and the exposure compensation algorithm in the water environment-camera parameter database, and the exposure parameters are finally applied to the camera system 20. In this way, if the current ambient light is dark, appropriate exposure parameters can be obtained from the current ambient light information and the exposure compensation algorithm, and after the camera system 20 is adjusted with them, the shot picture becomes brighter, which improves the imaging quality of underwater shooting and ensures the visual effect.
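The patent does not give the exposure compensation algorithm itself; a toy stand-in for it, assuming a target light level and a clamp range that are purely illustrative, could be:

```python
import math

def exposure_compensation_ev(current_lux, target_lux=500.0):
    """Toy exposure-compensation rule: one EV step for each factor of
    two between the target level and the measured ambient light,
    clamped to a typical +/-3 EV range. Constants are illustrative."""
    ev = math.log2(target_lux / max(current_lux, 1e-3))
    return max(-3.0, min(3.0, ev))
```

With this rule a dim underwater scene of 125 lux yields +2 EV of compensation, brightening the shot picture as described above.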
Referring to fig. 4, in some embodiments, the water environment sensor 10 includes a temperature sensor, an air pressure sensor, and an ambient light sensor. The processor 30 may determine whether the mobile terminal 100 should be switched to the underwater mode by using the water environment-camera parameter database together with the signals generated by the three sensors. The water environment-camera parameter database may store data sets for all modes, from which the current operating mode is determined, or may store only the data set of the underwater mode, in which case the signals generated by the three sensors are used to judge whether the terminal is underwater. If the mobile terminal 100 is judged not to be in water, it can remain in the land mode; if it is judged to be underwater, it can switch to the underwater mode. When the mobile terminal 100 switches to the underwater mode, the camera system 20 may automatically turn on the photographing function, and the image signal processing unit of the camera system 20 may be prepared according to the signals output by the three sensors. In this way, underwater shooting can be realized simply and quickly, optimal camera parameters can be output according to the signals from the three sensors, and the imaging quality of underwater shooting is improved.
It is noted that, in some embodiments, the mobile terminal 100 may include not only a land mode and an underwater mode but also a day mode, a night mode, a sport mode, and the like, which is not limited herein. The mobile terminal 100 may be used for photographing in each of these modes. Switching the operating mode of the mobile terminal 100 and shooting in different modes can make shooting more interesting and improve the user experience.
Referring to fig. 5, in some embodiments, the water environment sensor 10 includes an ambient light sensor and a gyroscope, the camera parameters include focusing parameters, and step 02 includes:
021: acquiring a current ambient light value by using an ambient light sensor;
022: acquiring a current angular velocity by using a gyroscope to obtain a jitter value;
023: and searching focusing parameters corresponding to the current ambient light value and the jitter value in a water environment-camera parameter database.
In some embodiments, the control method of the embodiments of the present application may be implemented by the mobile terminal 100 of the embodiments of the present application. The mobile terminal 100 includes a water environment sensor 10, the water environment sensor 10 includes an ambient light sensor and a gyroscope, and the camera parameters include focusing parameters. Step 021, step 022, and step 023 may all be implemented by processor 30. That is, the processor 30 is configured to: acquiring a current ambient light value by using an ambient light sensor; acquiring a current angular velocity by using a gyroscope to obtain a jitter value; and searching focusing parameters corresponding to the current ambient light value and the jitter value in a water environment-camera parameter database.
In the related art, focusing is difficult in underwater photography because of insufficient underwater light, the user's movement relative to the water, and the flow of the water itself. The current ambient light value can be obtained through the ambient light sensor, and the current angular velocity can be obtained through the gyroscope to derive the jitter value. The water environment-camera parameter database includes a mapping relationship between calibrated ambient light values and calibrated jitter values on one side and calibrated focusing parameters on the other. The corresponding calibrated ambient light value can therefore be found with the current ambient light value, the corresponding calibrated jitter value with the jitter value, and the calibrated focusing parameters can then be determined from the two. The calibrated focusing parameters are applied to the camera system 20 as the focusing parameters. In this way, accurate focusing can be achieved during underwater shooting, improving the imaging quality of underwater shooting.
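A hedged sketch of the two-key lookup, using coarse light and shake buckets in place of the patent's calibrated values (the thresholds, bucket names, and parameter fields are all assumptions):

```python
# Hypothetical calibration grid keyed by coarse light and shake buckets.
FOCUS_TABLE = {
    ("bright", "steady"): {"mode": "phase_detect", "scan_range_mm": 2},
    ("bright", "shaky"):  {"mode": "phase_detect", "scan_range_mm": 5},
    ("dim",    "steady"): {"mode": "contrast",     "scan_range_mm": 8},
    ("dim",    "shaky"):  {"mode": "contrast",     "scan_range_mm": 12},
}

def focus_parameters(ambient_lux, jitter_deg_per_s):
    """Map the current ambient light value and jitter value to the
    calibrated focusing parameters of the matching bucket."""
    light = "bright" if ambient_lux >= 200.0 else "dim"
    shake = "steady" if jitter_deg_per_s < 5.0 else "shaky"
    return FOCUS_TABLE[(light, shake)]
```

A dim, shaky scene then maps to a contrast-detection mode with a wider focus scan, which is the kind of adjustment the paragraph above describes.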
In some implementations, the water environment-camera parameter database can include a pre-trained focus learning model. The ambient light values may include a water-flow refraction light value and an ambient brightness value. The pre-trained focus learning model can be obtained by training in advance on a training set of three values, namely the water-flow refraction light value, the ambient brightness value, and the jitter value, and the focusing parameters corresponding to the current ambient light value and jitter value can then be calculated with the focus learning model. In this way, accurate focusing can be achieved during underwater shooting, improving the imaging quality of underwater shooting.
Referring to fig. 6, in one embodiment, when the user takes the mobile terminal 100 into the water, the mobile terminal 100 can switch to the underwater mode and turn on the camera system 20. The user can preset underwater shooting shortcut keys, including a user-defined image/video switching key and a quick shooting key, to make underwater shooting more convenient. After the camera system 20 is turned on, camera parameters, including focusing parameters, may be generated based on information from the ambient light sensor, the air pressure sensor, and the gyroscope. The camera parameters are then applied to the camera system 20, for example: turning on the flash of the camera system 20 according to the camera parameters, controlling the image signal processing unit to process the picture, turning on OIS optical image stabilization, enabling exposure compensation, realizing auto-focus, and the like, which is not limited herein.
Referring to fig. 7, in some embodiments, the water environment sensor 10 includes an ambient light sensor, the camera parameters include filter parameters, and step 02 includes:
024: acquiring a current color value and a current light transmittance value by using an ambient light sensor;
025: and searching filter parameters corresponding to the current color value and the light transmittance value in a water environment-camera parameter database.
In some embodiments, the control method of the embodiments of the present application may be implemented by the mobile terminal 100 of the embodiments of the present application. The mobile terminal 100 includes a water environment sensor 10, the water environment sensor 10 includes an ambient light sensor, and the camera parameters include filter parameters. Both steps 024 and 025 may be implemented by processor 30. That is, the processor 30 is configured to: acquiring a current color value and a transmittance value by using an ambient light sensor; and searching filter parameters corresponding to the current color value and the light transmittance value in a water environment-camera parameter database.
Specifically, in underwater shooting, as the depth increases, the red-orange-yellow part of the sunlight spectrum is absorbed by the water, which causes poor color and even blur in the pictures the user shoots. Generating filter parameters that match the current shooting environment and applying them to the camera system 20 can therefore restore the true colors of the picture, make it clearer, and improve the imaging quality. The ambient light sensor may obtain a current color value and a current light transmittance value. The water environment-camera parameter database includes a mapping relationship between calibrated color values and calibrated light transmittance values on one side and calibrated filter parameters on the other, so the corresponding calibrated color value can be found with the current color value, the corresponding calibrated light transmittance value with the current light transmittance value, and the calibrated filter parameters can then be determined from the two. The calibrated filter parameters are applied to the camera system 20 as the filter parameters. In this way, the true colors of the picture can be restored during underwater shooting, so that the picture is clearer and the imaging quality is improved.
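The patent only specifies a calibrated lookup; as a stand-in illustration of what the resulting filter parameters might contain, a gray-world style computation (all formulas and field names are assumptions, not the patent's method) could look like:

```python
def filter_parameters(avg_rgb, transmittance):
    """Stand-in for the calibrated filter lookup: gray-world channel
    gains boost the red light absorbed by water, and a dehazing
    strength grows as the water's light transmittance drops.
    Formulas and field names are illustrative, not from the patent."""
    r, g, b = avg_rgb
    mean = (r + g + b) / 3.0
    return {
        "gain_r": mean / max(r, 1e-3),   # red is weakest underwater
        "gain_g": mean / max(g, 1e-3),
        "gain_b": mean / max(b, 1e-3),
        "dehaze_strength": round(1.0 - transmittance, 2),
    }
```

For a scene averaging (60, 120, 180) in RGB, the red channel gets the largest gain, matching the observation above that the red end of the spectrum is absorbed first.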
Referring to fig. 6 again, in some embodiments, because water quality differs between regions, the light transmittance of the water also differs. When shooting underwater with poor transmittance, filter parameters may be generated and used to apply dehazing processing to the captured picture to make it clearer. Filter parameters may also be generated to apply other filter processing that optimizes the captured picture, including but not limited to hue contrast processing, brightness contrast processing, and depth-of-field processing. When shooting underwater, the user may turn on the camera system 20 and have the mobile terminal 100 apply the filter processing in real time; alternatively, the captured image data can be saved at shooting time and a filter selected for processing later, which is not limited here.
Referring to fig. 8, in some embodiments, the mobile terminal 100 includes a touch screen, and the control method further includes:
041: acquiring a current touch signal in an underwater mode;
042: searching a touch setting parameter corresponding to the current touch signal in a water environment-touch database;
043: applying touch setting parameters to the touch screen.
In some embodiments, the control method of the embodiments of the present application may be implemented by the mobile terminal 100 of the embodiments of the present application. The mobile terminal 100 includes a touch screen. Step 041, step 042 and step 043 may all be implemented by processor 30. That is, the processor 30 is configured to: acquiring a current touch signal in an underwater mode; searching a touch setting parameter corresponding to the current touch signal in a water environment-touch database; applying touch setting parameters to the touch screen.
In one embodiment, the touch screen comprises a capacitive screen. When a user touches the capacitive screen, the touched area generates a touch signal that includes a capacitance value signal. When the capacitive screen is used in the land mode, the generated capacitance signal usually falls within an expected capacitance signal range, within which the corresponding touch operation can be recognized. When the user uses the mobile terminal 100 underwater, however, the capacitive screen may mistake pressure changes caused by water flow for touch signals, so touches made with a finger may be insensitive or may fail to register. The mobile terminal 100 according to the embodiment of the present application can be switched to the underwater mode. In the underwater mode, the user's touch on the touch screen generates a current touch signal, the corresponding touch setting parameters can be searched for in the water environment-touch database according to the current touch signal, and the touch setting parameters are finally applied to the touch screen. This preserves recognition of the user's touches underwater, prevents touch insensitivity, and avoids unresponsive touches.
Specifically, the water environment-touch database may include a pre-trained deep-learning model trained on samples of three basic scenarios: land-mode touch signals, pure water-flow pressure-change signals, and underwater-mode touch signals. The current touch signal and the pre-trained deep-learning model can thus be used to obtain the touch setting parameters, which are applied to the touch screen to guarantee underwater touch operation, so that the mobile terminal 100 can be used normally underwater and the user can start shooting with the camera system 20 by touch.
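A rule-based stand-in for that deep-learning classifier can illustrate the idea of separating finger touches from water-flow pressure changes (the thresholds and returned fields are illustrative assumptions, not the trained model):

```python
def touch_setting_parameters(cap_deltas):
    """Rule-based stand-in for the pre-trained deep-learning model:
    classify one frame of capacitance deltas from the panel and return
    touch settings for the underwater mode. Thresholds and returned
    fields are illustrative assumptions."""
    peak = max(cap_deltas)
    # Count cells carrying a significant share of the peak change.
    spread = sum(1 for v in cap_deltas if v > 0.2 * peak)
    if spread > len(cap_deltas) // 2:
        # Broad, diffuse change across the panel: likely water flow.
        return {"source": "water_flow", "report_touch": False}
    # Sharp, localized peak: likely a finger; raise the sensitivity to
    # offset the attenuation caused by the surrounding water.
    return {"source": "finger", "report_touch": True,
            "sensitivity_gain": 1.8}
```

A localized spike is reported as a touch with boosted sensitivity, while a uniform change across the panel is suppressed, which is exactly the failure mode the capacitive screen exhibits underwater.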
Referring again to fig. 6, in some embodiments, when the user takes the mobile terminal 100 into the water, the mobile terminal 100 may switch to the underwater mode automatically. The user can generate a current touch signal by touching any position on the touch screen; touch setting parameters are then generated and applied to the touch screen, so that the touch screen enters an underwater touch mode. The camera system 20 can then be started by touch, which meets the user's needs and improves the user experience.
Referring to fig. 9, in some embodiments, the water environment sensor 10 includes a temperature sensor, an air pressure sensor, and an ambient light sensor, and the control method includes:
051: acquiring a current temperature value by using a temperature sensor;
052: acquiring a current air pressure value by using an air pressure sensor;
053: acquiring a current ambient light value by using an ambient light sensor;
054: it is determined that the mobile terminal 100 operates in the underwater mode when the current temperature value falls within the predetermined temperature range, when the current atmospheric pressure value falls within the predetermined atmospheric pressure range, and when the current ambient light value falls within the predetermined ambient light range.
In some embodiments, the control method of the embodiments of the present application may be implemented by the mobile terminal 100 of the embodiments of the present application. The mobile terminal 100 includes a water environment sensor 10, and the water environment sensor 10 includes a temperature sensor, an air pressure sensor, and an ambient light sensor. Step 051, step 052, step 053 and step 054 may all be implemented by processor 30. That is, the processor 30 is configured to: acquiring a current temperature value by using a temperature sensor; acquiring a current air pressure value by using an air pressure sensor; acquiring a current ambient light value by using an ambient light sensor; it is determined that the mobile terminal 100 is operated in the underwater mode when the current temperature value falls within the predetermined temperature range, when the current air pressure value falls within the predetermined air pressure range, and when the current ambient light value falls within the predetermined ambient light range.
Referring again to fig. 4, in some embodiments, the mobile terminal 100 includes a predetermined database. The current temperature value, current air pressure value, and current ambient light value can be obtained, and whether the mobile terminal 100 operates in the underwater mode can be judged against the pre-stored data in the predetermined database. A data set of the underwater mode is stored in the predetermined database, the data set including a predetermined temperature range, a predetermined air pressure range, and a predetermined ambient light range. These ranges may be obtained through repeated experimental analysis, and the mobile terminal 100 is determined to operate in the underwater mode when the current temperature value falls within the predetermined temperature range, the current air pressure value falls within the predetermined air pressure range, and the current ambient light value falls within the predetermined ambient light range.
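The three-range check reduces to a simple conjunction; a sketch with illustrative ranges (the real predetermined ranges would come from the experimental analyses mentioned above, not these numbers):

```python
# Illustrative predetermined ranges; not values from the patent.
PREDETERMINED_RANGES = {
    "temp_c":       (0.0, 30.0),
    "pressure_kpa": (105.0, 400.0),
    "ambient_lux":  (0.0, 1000.0),
}

def is_underwater(temp_c, pressure_kpa, ambient_lux):
    """The mobile terminal operates in the underwater mode only when
    all three current sensor values fall within their ranges."""
    readings = {"temp_c": temp_c, "pressure_kpa": pressure_kpa,
                "ambient_lux": ambient_lux}
    return all(lo <= readings[key] <= hi
               for key, (lo, hi) in PREDETERMINED_RANGES.items())
```

A reading of (19 deg C, 111 kPa, 800 lux) satisfies all three ranges, while a bright surface reading at atmospheric pressure does not.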
The current temperature value is obtained by the temperature sensor. In one embodiment, in summer the temperature value obtained on land is typically higher than that obtained underwater. For example, taking seawater: the temperature obtained on land in summer may be 25 DEG C, while the temperature obtained underwater may be 19 DEG C. In winter, the temperature obtained on land is generally lower than that obtained underwater. For example, taking seawater: the temperature obtained on land in winter may be -10 DEG C, at which point the seawater is an ice-water mixture, and the temperature obtained underwater may be 0 DEG C. Therefore, when the land mode changes to the underwater mode, the temperature value acquired by the temperature sensor changes. It is worth mentioning that within the same body of water, the temperature in shallow water differs from that in deep water.
The current air pressure value is obtained by the air pressure sensor. The underwater air pressure value differs from the value on land, so when the land mode changes to the underwater mode, the air pressure value acquired by the air pressure sensor changes. It is worth mentioning that within the same body of water, the air pressure value in deep water differs from that in shallow water, so the water depth can be detected with the air pressure sensor.
The current ambient light value is obtained by the ambient light sensor. Water is a transparent medium, but ambient light is continuously absorbed and attenuated as it propagates through water, so the underwater ambient light value differs from the value on land. The ambient light value also varies with the water depth and with the clarity of the water.
In this way, it may be determined that the mobile terminal 100 operates in the underwater mode by combining the current temperature value, the current air pressure value, and the current ambient light value.
When it is determined that the mobile terminal 100 is operating in the underwater mode, the current underwater conditions can be preliminarily judged by acquiring the current air pressure value with the air pressure sensor and the current ambient light value with the ambient light sensor. For example, the air pressure value at a water depth of 1 meter differs from that at a water depth of 10 meters, so the water depth can be accurately judged from the current air pressure value; the clarity of the water can then be judged by combining the water depth with the current ambient light value. This facilitates generating suitable camera parameters, ensuring imaging quality and improving the user experience.
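The three-sensor mode decision described above can be sketched as a conjunction of range checks: the terminal is judged to be underwater only when the temperature, air pressure, and ambient light readings all fall into preset ranges. The ranges and function names below are illustrative assumptions, not thresholds from the patent.

```python
# Hypothetical sketch of the underwater-mode decision: all three current
# sensor values must fall into their preset ranges. The ranges are
# illustrative placeholders, not values disclosed in the patent.

TEMP_RANGE_C = (-2.0, 30.0)                  # plausible water temperatures
PRESSURE_RANGE_PA = (102_000.0, 400_000.0)   # above surface atmospheric pressure
AMBIENT_LIGHT_RANGE_LUX = (0.0, 5_000.0)     # attenuated underwater light levels

def in_range(value: float, lo_hi: tuple) -> bool:
    lo, hi = lo_hi
    return lo <= value <= hi

def is_underwater(temp_c: float, pressure_pa: float, ambient_lux: float) -> bool:
    """True only when every reading falls into its preset range."""
    return (in_range(temp_c, TEMP_RANGE_C)
            and in_range(pressure_pa, PRESSURE_RANGE_PA)
            and in_range(ambient_lux, AMBIENT_LIGHT_RANGE_LUX))
```

Requiring all three conditions simultaneously guards against false positives: a dark, cool room might satisfy the light and temperature checks, but not the elevated pressure check.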
Referring to fig. 1 and 10 together, the present application discloses a control device 200, which includes a memory 220 and a processor 240. The memory 220 stores a computer program; when the computer program is executed by the processor 240, the processor 240 performs the control method of any one of the above embodiments of the present application. For example, the computer program may be executed to perform the following control method:
01: in the underwater mode, the camera system 20 is turned on in response to user input;
02: searching a camera parameter corresponding to a current sensor signal output by the water environment sensor 10 in a water environment-camera parameter database;
03: the camera parameters are applied to the camera system 20.
Referring to fig. 1 and 11 together, the present application discloses a computer-readable storage medium 300 on which a computer program is stored. When executed by a processor 240, the computer program implements the control method of any one of the above embodiments. For example, the computer program may be executed by the processor 240 of the mobile terminal 100 to perform the following control method:
01: in the underwater mode, the camera system 20 is turned on in response to user input;
02: searching a camera parameter corresponding to a current sensor signal output by the water environment sensor 10 in a water environment-camera parameter database;
03: the camera parameters are applied to the camera system 20.
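Steps 01-03 can be sketched as a table lookup: the current sensor signal is bucketed into a water-environment key, and the matching camera parameters are read from a "water environment - camera parameter" database. The keys, thresholds, and parameter names below are illustrative assumptions, not contents of the database disclosed in the patent.

```python
# Hypothetical sketch of the "water environment - camera parameter" lookup
# in steps 02-03. Bucket boundaries, keys, and parameter values are
# illustrative placeholders.

WATER_ENV_CAMERA_DB = {
    ("shallow", "bright"): {"iso": 200,  "white_balance": "underwater_warm"},
    ("shallow", "dim"):    {"iso": 800,  "white_balance": "underwater_warm"},
    ("deep",    "bright"): {"iso": 400,  "white_balance": "underwater_cool"},
    ("deep",    "dim"):    {"iso": 1600, "white_balance": "underwater_cool"},
}

def lookup_camera_parameters(depth_m: float, ambient_lux: float) -> dict:
    """Map the current sensor signal to a database key and return the entry."""
    depth_key = "shallow" if depth_m < 3.0 else "deep"
    light_key = "bright" if ambient_lux >= 500.0 else "dim"
    return WATER_ENV_CAMERA_DB[(depth_key, light_key)]

# Step 02: look up parameters for the current readings;
# step 03 would then apply them to the camera system.
params = lookup_camera_parameters(depth_m=1.2, ambient_lux=1500.0)
```

A production database would likely use finer-grained keys (and interpolation between entries), but the lookup structure is the same: sensor signal in, camera parameters out.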
As shown in fig. 11, the control method according to the embodiment of the present application may be implemented by the mobile terminal 100 according to the embodiment of the present application. Note that the computer-readable storage medium 300 may be a storage medium built in the mobile terminal 100, or may be a storage medium that can be inserted into and removed from the mobile terminal 100.
In the description of the embodiments of the present application, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present application, "a plurality" means two or more unless specifically defined otherwise.
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flowcharts or otherwise described herein may be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as would be understood by those skilled in the art of the present application.
The processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, a system containing a processing module, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber device, and a portable Compact Disc Read-Only Memory (CD-ROM). The computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be captured electronically, for instance by optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and stored in a computer memory.
It should be understood that portions of the embodiments of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented by software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application specific integrated circuit having suitable combinational logic gates, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), and the like.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be implemented by a program instructing the related hardware. The program may be stored in a computer-readable storage medium and, when executed, performs one of or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations of the above embodiments may be made by those of ordinary skill in the art within the scope of the present application.

Claims (12)

1. A control method is applied to a mobile terminal, and is characterized in that the mobile terminal comprises a water environment sensor and a camera system, and can work in an underwater mode, and the control method comprises the following steps:
in the underwater mode, turning on the camera system in response to a user input;
searching a camera parameter corresponding to a current sensor signal output by the water environment sensor in a water environment-camera parameter database; and
applying the camera parameters to the camera system.
2. The control method of claim 1, wherein the water environment sensor comprises an ambient light sensor and a gyroscope, the camera parameters comprise focusing parameters, and the step of searching the water environment-camera parameter database for the camera parameters corresponding to the current sensor signal output by the water environment sensor comprises:
acquiring a current ambient light value by using the ambient light sensor;
acquiring a current angular velocity by using the gyroscope to obtain a jitter value;
and searching the focusing parameters corresponding to the current ambient light value and the jitter value in a water environment-camera parameter database.
3. The control method of claim 1, wherein the water environment sensor comprises an ambient light sensor, the camera parameters comprise filter parameters, and the step of searching the water environment-camera parameter database for the camera parameters corresponding to the current sensor signal output by the water environment sensor comprises:
acquiring a current color value and a current light transmittance value by using the ambient light sensor;
and searching the filter parameters corresponding to the current color value and the current light transmittance value in a water environment-camera parameter database.
4. The control method according to claim 1, wherein the mobile terminal includes a touch screen, the control method further comprising:
acquiring a current touch signal in the underwater mode;
searching a touch setting parameter corresponding to the current touch signal in a water environment-touch database;
and applying the touch setting parameters to the touch screen.
5. The control method of claim 1, wherein the water environment sensor comprises a temperature sensor, an air pressure sensor, and an ambient light sensor, the control method comprising:
acquiring a current temperature value by using the temperature sensor;
acquiring a current air pressure value by using the air pressure sensor;
acquiring a current ambient light value by using the ambient light sensor;
and when the current temperature value falls into a preset temperature range, the current air pressure value falls into a preset air pressure range and the current environment light value falls into a preset environment light range, determining that the mobile terminal works in the underwater mode.
6. A control device comprising a memory and a processor, wherein the memory stores a computer program, and wherein the computer program, when executed by the processor, implements the control method according to any one of claims 1 to 5.
7. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the control method of any one of claims 1 to 5.
8. A mobile terminal comprising a water environment sensor, a camera system, and a processor, the mobile terminal operable in an underwater mode, the processor configured to:
in the underwater mode, turning on the camera system in response to a user input;
searching a camera parameter corresponding to a current sensor signal output by the water environment sensor in a water environment-camera parameter database; and
applying the camera parameters to the camera system.
9. The mobile terminal of claim 8, wherein the water environment sensor comprises an ambient light sensor and a gyroscope, wherein the camera parameters comprise focusing parameters, and wherein the processor is configured to:
acquiring a current ambient light value by using the ambient light sensor;
acquiring a current angular velocity by using the gyroscope to obtain a jitter value;
and searching the focusing parameters corresponding to the current ambient light value and the jitter value in a water environment-camera parameter database.
10. The mobile terminal of claim 8, wherein the water environment sensor comprises an ambient light sensor, wherein the camera parameters comprise filter parameters, and wherein the processor is configured to:
acquiring a current color value and a current light transmittance value by using the ambient light sensor;
and searching the filter parameters corresponding to the current color value and the current light transmittance value in a water environment-camera parameter database.
11. The mobile terminal of claim 8, wherein the mobile terminal comprises a touch screen, and wherein the processor is configured to:
acquiring a current touch signal in the underwater mode;
searching a touch setting parameter corresponding to the current touch signal in a water environment-touch database;
and applying the touch setting parameters to the touch screen.
12. The mobile terminal of claim 8, wherein the water environment sensor comprises a temperature sensor, an air pressure sensor, and an ambient light sensor, and wherein the processor is further configured to:
acquiring a current temperature value by using the temperature sensor;
acquiring a current air pressure value by using the air pressure sensor;
acquiring a current ambient light value by using the ambient light sensor;
and when the current temperature value falls into a preset temperature range, the current air pressure value falls into a preset air pressure range and the current environment light value falls into a preset environment light range, determining that the mobile terminal works in the underwater mode.
CN202210443632.XA 2022-04-25 2022-04-25 Control method, control device, computer-readable storage medium, and mobile terminal Pending CN114827468A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210443632.XA CN114827468A (en) 2022-04-25 2022-04-25 Control method, control device, computer-readable storage medium, and mobile terminal


Publications (1)

Publication Number Publication Date
CN114827468A true CN114827468A (en) 2022-07-29

Family

ID=82508402

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210443632.XA Pending CN114827468A (en) 2022-04-25 2022-04-25 Control method, control device, computer-readable storage medium, and mobile terminal

Country Status (1)

Country Link
CN (1) CN114827468A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004336326A (en) * 2003-05-07 2004-11-25 Canon Inc Image processor
CN101115148A (en) * 2006-07-25 2008-01-30 富士胶片株式会社 Image-taking apparatus and image display control method
CN102263899A (en) * 2010-05-25 2011-11-30 奥林巴斯映像株式会社 Photographing device and control method therefor
CN102870044A (en) * 2010-03-22 2013-01-09 伊斯曼柯达公司 Underwater camera with pressure sensor
CN104469173A (en) * 2014-12-31 2015-03-25 上海青橙实业有限公司 Image information obtaining method and terminal
CN105208287A (en) * 2015-10-15 2015-12-30 广东欧珀移动通信有限公司 Photographing method and device
CN110213480A (en) * 2019-04-30 2019-09-06 华为技术有限公司 A kind of focusing method and electronic equipment
CN111699671A (en) * 2019-05-31 2020-09-22 深圳市大疆创新科技有限公司 Exposure control method of shooting device and shooting device
CN111882489A (en) * 2020-05-15 2020-11-03 东北石油大学 Super-resolution graph recovery method for simultaneously enhancing underwater images
CN112154403A (en) * 2019-09-11 2020-12-29 深圳市大疆创新科技有限公司 Screen control method and terminal
CN114064175A (en) * 2021-11-11 2022-02-18 上海传英信息技术有限公司 Intelligent terminal control method, intelligent terminal and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination