CN111447439B - Image coding method, image coding device and mobile terminal - Google Patents


Info

Publication number
CN111447439B
CN111447439B
Authority
CN
China
Prior art keywords
camera
image data
abstraction layer
hardware abstraction
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010418153.3A
Other languages
Chinese (zh)
Other versions
CN111447439A (en)
Inventor
邹剑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN202010418153.3A
Publication of CN111447439A
Application granted
Publication of CN111447439B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103 Selection of coding mode or of prediction mode
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/617 Upgrading or updating of programs or applications for camera control

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The present application is applicable to the technical field of image coding, and provides an image coding method, an image coding device, a mobile terminal, and a computer-readable storage medium. The method includes: if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application, acquiring the state of the camera hardware abstraction layer; determining a target encoding mode from a first encoding mode and a second encoding mode according to the state of the camera hardware abstraction layer; and encoding the image data to be encoded according to the target encoding mode to generate image data in a target format. The method and device can address the poor flexibility of image encoding in the prior art.

Description

Image coding method, image coding device and mobile terminal
Technical Field
The present application belongs to the field of image coding technology, and in particular, to an image coding method, an image coding device, a mobile terminal, and a computer-readable storage medium.
Background
With the rapid development of mobile terminals such as mobile phones and tablet computers, the camera function of the mobile terminal is increasingly widely used. When taking a picture with the camera of a mobile terminal, the image data of the photographed object generally needs to be encoded first and then stored in the memory of the mobile terminal, in order to reduce the memory the image data occupies. However, in the prior art, a fixed encoding mode is usually used when encoding image data, which offers poor flexibility.
Disclosure of Invention
The application provides an image coding method, an image coding device, a mobile terminal and a computer readable storage medium, which are used for solving the problem of poor flexibility of image coding in the prior art.
In a first aspect, an embodiment of the present application provides an image encoding method, where the image encoding method includes:
if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to a camera application, acquiring the state of the camera hardware abstraction layer, where the state of the camera hardware abstraction layer includes an open state and a closed state;
determining a target encoding mode from a first encoding mode and a second encoding mode according to the state of the camera hardware abstraction layer, where the first encoding mode indicates encoding the image data to be encoded received by the camera application, and the second encoding mode indicates encoding the image data to be encoded acquired by the camera hardware abstraction layer;
and coding the image data to be coded according to the target coding mode to generate image data in a target format.
In a second aspect, an embodiment of the present application provides an image encoding apparatus, including:
the state obtaining module is configured to acquire the state of the camera hardware abstraction layer if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application, where the state of the camera hardware abstraction layer includes an open state and a closed state;
the encoding determining module is configured to determine a target encoding mode from a first encoding mode and a second encoding mode according to the state of the camera hardware abstraction layer, where the first encoding mode indicates encoding the image data to be encoded received by the camera application, and the second encoding mode indicates encoding the image data to be encoded acquired by the camera hardware abstraction layer;
and the image generation module is used for coding the image data to be coded according to the target coding mode to generate image data in a target format.
In a third aspect, an embodiment of the present application provides a mobile terminal, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the image encoding method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the image encoding method according to the first aspect.
In a fifth aspect, the present application provides a computer program product, which when run on a mobile terminal, causes the mobile terminal to perform the steps of the image encoding method according to the first aspect.
Therefore, when it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application, the state of the camera hardware abstraction layer is acquired, and a target encoding mode that can successfully encode the image data can be adaptively selected from the first encoding mode and the second encoding mode according to that state. This improves the flexibility of image encoding while ensuring that the image data to be encoded is encoded successfully.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a schematic flow chart of an implementation of an image encoding method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of an implementation of an image encoding method provided in the second embodiment of the present application;
fig. 3 is a schematic diagram of an implementation architecture of an image encoding method according to a second embodiment of the present application;
fig. 4 is a schematic flow chart illustrating an implementation of an image encoding method according to a third embodiment of the present application;
fig. 5 is a schematic diagram of an implementation architecture of an image encoding method according to a third embodiment of the present application;
fig. 6 is a schematic structural diagram of an image encoding device according to a fourth embodiment of the present application;
fig. 7 is a schematic structural diagram of a mobile terminal according to a fifth embodiment of the present application;
fig. 8 is a schematic structural diagram of a mobile terminal according to a sixth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail. It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In particular implementations, the mobile terminals described in embodiments of the present application include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having touch-sensitive surfaces (e.g., touch-screen displays and/or touch pads). It should also be understood that in some embodiments, the device is not a portable communication device, but a desktop computer having a touch-sensitive surface (e.g., a touch-screen display and/or touchpad).
In the discussion that follows, a mobile terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the mobile terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The mobile terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the mobile terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that the sequence numbers of the steps in this embodiment do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, which is a schematic diagram illustrating an implementation flow of an image encoding method provided in an embodiment of the present application, where the image encoding method is applied to a mobile terminal, and as shown in the figure, the image encoding method may include the following steps:
step 101, if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, acquiring a state of the camera hardware abstraction layer.
The state of the camera hardware abstraction layer represents the data-interaction capability between the camera hardware abstraction layer and the camera application, and includes an open state and a closed state. The open state indicates that data interaction is possible between the camera hardware abstraction layer and the camera application; the closed state indicates that it is not. The camera hardware abstraction layer is the interface layer that connects the higher-level camera framework used by the camera application to the underlying camera driver and hardware. The camera application is a piece of camera software.
Optionally, if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, acquiring the state of the camera hardware abstraction layer includes:
if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application, acquiring the state of the camera;
if the camera is turned on and is not in a called state, determining that the camera hardware abstraction layer is in the open state;
if the camera is turned on but in a called state, or the camera is turned off, determining that the camera hardware abstraction layer is in the closed state.
In the embodiment of the application, if the camera is turned on and not called, the camera application can be used normally (i.e., operations such as photographing and recording can be performed through it), and data interaction is possible between the camera hardware abstraction layer and the camera application. If the camera is turned on but called, or is turned off, the camera application cannot be used normally, and no data interaction is possible between the camera hardware abstraction layer and the camera application. "The camera is called" means that, while the camera application is using the camera, the camera is invoked by another application, for example, WeChat calling the camera for a video call.
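The decision rule above can be sketched as a small function. This is a sketch only; the enum and method names are illustrative, not from the patent — the hardware abstraction layer counts as open exactly when the camera is turned on and not called by another application.

```java
// Sketch of the HAL-state rule: open only when the camera is on and
// not being called by another application (e.g. a video call).
class HalStateRule {
    enum HalState { OPEN, CLOSED }

    // isOn: the camera is turned on; isCalled: another application is
    // currently using the camera while the camera application runs.
    static HalState halStateOf(boolean isOn, boolean isCalled) {
        if (isOn && !isCalled) {
            return HalState.OPEN;   // normal photographing is possible
        }
        return HalState.CLOSED;     // on-but-called, or turned off
    }
}
```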
Before step 101 is executed, it may be detected whether the camera hardware abstraction layer has acquired image data to be encoded. If it has, it is then detected whether the camera hardware abstraction layer sends the acquired image data to the camera application. If it has not, the detection step is repeated until any one of the following three conditions is met: the camera hardware abstraction layer acquires the image data to be encoded, the camera is turned off, or the camera is occupied by another application.
Optionally, before it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, an embodiment of the present application includes:
acquiring the image data to be encoded through the camera;
and sending the image data to be coded to a camera hardware abstraction layer.
When the camera is started by the camera application, the image data to be encoded of the object to be photographed is obtained through the camera, and the mobile terminal sends it to the camera hardware abstraction layer to facilitate subsequent encoding. The mobile terminal starts the camera when it detects that the camera application enables any function that needs the camera, for example, the photographing or video-recording function. Acquiring the image data to be encoded through the camera may refer to acquiring it through a photosensitive element (i.e., an image sensor) in the camera, where the photosensitive element includes, but is not limited to, a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor, and the like.
And 102, determining a target coding mode from the first coding mode and the second coding mode according to the state of the camera hardware abstraction layer.
The first encoding mode indicates encoding the image data to be encoded received by the camera application; the second encoding mode indicates encoding the image data to be encoded acquired by the camera hardware abstraction layer. The image data acquired by the camera hardware abstraction layer may be obtained from the camera application, or through another path (for example, directly from the camera), which is not limited here. The target encoding mode may be an encoding mode that can successfully encode the image data to be encoded in the current state of the camera hardware abstraction layer.
Correspondences between the different encoding modes and the different states may be established in advance; when the state of the camera hardware abstraction layer is obtained, the encoding mode corresponding to that state is retrieved from the correspondences. Each encoding mode can successfully encode the image data to be encoded in its corresponding state of the camera hardware abstraction layer. Successful encoding means that the encoding of the image data can be completed and the application layer of the mobile terminal is guaranteed to obtain the encoded image data, so that it can be stored for subsequent viewing and use by the user.
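The pre-established correspondence described above can be sketched as a lookup table (illustrative names; the patent does not prescribe a data structure). Here the closed state maps to the first mode and the open state to the second mode, matching the two embodiments described later in the document.

```java
import java.util.EnumMap;
import java.util.Map;

// Sketch of a pre-established state-to-mode correspondence: the closed HAL
// state maps to the first (software) mode, the open state to the second
// (hardware) mode.
class ModeTable {
    enum HalState { OPEN, CLOSED }
    enum Mode { FIRST_SOFTWARE, SECOND_HARDWARE }

    private static final Map<HalState, Mode> TABLE = new EnumMap<>(HalState.class);
    static {
        TABLE.put(HalState.CLOSED, Mode.FIRST_SOFTWARE); // encode in app layer
        TABLE.put(HalState.OPEN, Mode.SECOND_HARDWARE);  // encode at the HAL
    }

    static Mode targetMode(HalState state) {
        return TABLE.get(state);
    }
}
```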
And 103, coding the image data to be coded according to the target coding mode to generate image data in a target format.
Encoding the image data to be encoded may refer to converting its image format into a target format; the image data in the target format is the image data obtained by encoding according to the target encoding mode, that is, the image format of the encoded image data is the target format. The target format differs from the format of the image data to be encoded, and image data in the target format occupies less memory than the image data to be encoded. The target format may be JPEG, a lossy compression format that compresses image data into a smaller memory footprint; it may also be any other image format into which the user wants the image data encoded, which is not limited here.
The image data to be encoded may be image data output by the camera, generally in RAW or YUV format. Image data in these two formats occupies a large amount of memory and is therefore slow to transmit over a network. Because image data in the target format occupies less memory than the image data to be encoded, encoding the image data into the target format reduces the memory it occupies in the mobile terminal and increases its transmission speed.
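A back-of-the-envelope comparison illustrates the memory gap the paragraph describes. The YUV 4:2:0 size (1.5 bytes per pixel) follows from the format itself; the ~10:1 JPEG ratio is a common rule of thumb assumed here, not a figure from the patent:

```java
// Rough sizes: an uncompressed YUV 4:2:0 frame stores 1.5 bytes per pixel
// (full-resolution Y plane plus quarter-resolution U and V planes); the
// assumed ~10:1 JPEG ratio is a rule of thumb, not a patent figure.
class FrameSize {
    static long yuv420Bytes(int width, int height) {
        return (long) width * height * 3 / 2; // Y + U/4 + V/4
    }

    static long roughJpegBytes(int width, int height) {
        return yuv420Bytes(width, height) / 10; // assumed compression ratio
    }
}
```

For a 12-megapixel frame (4000 x 3000), the raw YUV buffer is 18 MB, which is why encoding before storage matters on a memory-constrained terminal.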
In this embodiment, when it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application, the state of the camera hardware abstraction layer is acquired, and a target encoding mode that can successfully encode the image data is adaptively selected from the first encoding mode and the second encoding mode according to that state, improving the flexibility of image encoding while ensuring successful encoding.
Referring to fig. 2, it is a schematic diagram of an implementation flow of an image encoding method provided in the second embodiment of the present application, where the image encoding method is applied to a mobile terminal, and as shown in the figure, the image encoding method may include the following steps:
step 201, if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, acquiring a state of the camera hardware abstraction layer.
The step is the same as step 101, and reference may be made to the related description of step 101, which is not described herein again.
Step 202, if the camera hardware abstraction layer is in a closed state, acquiring image data to be encoded from the camera application program.
When the camera hardware abstraction layer is in the closed state, data interaction cannot be carried out between the camera hardware abstraction layer and the camera application. The image data to be encoded can then be obtained from the camera application and encoded in the application layer of the mobile terminal, so that the encoded image data is obtained and no image data is lost.
And 203, coding the image data to be coded according to the first coding mode to generate the image data in the target format.
The first encoding mode is software encoding: the image data to be encoded is encoded by the central processing unit (CPU) of the mobile terminal, that is, encoding is implemented in the application layer of the mobile terminal.
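A minimal sketch of software encoding in the application layer, using the plain-Java `javax.imageio` writer as a stand-in for whatever CPU-side JPEG library the terminal actually uses (the patent does not name one):

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;

// CPU-only JPEG encoding in the application layer, sketched with the
// standard-library ImageIO writer.
class SoftwareEncoder {
    static byte[] encodeToJpeg(BufferedImage frame) {
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            if (!ImageIO.write(frame, "jpeg", out)) {
                throw new IOException("no JPEG writer available");
            }
            return out.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Helper producing a blank RGB frame standing in for camera output.
    static BufferedImage sampleFrame(int w, int h) {
        return new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
    }
}
```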
Step 204, saving the image data in the target format.
After the image data in the target format is generated, the image data is saved, so that the subsequent viewing and use by a user can be facilitated.
As shown in fig. 3, which is a schematic view of an implementation architecture of the image encoding method provided in the second embodiment of the present application: after a photographing instruction is received, the camera acquires the image data to be encoded of the object to be photographed and sends it to the camera hardware abstraction layer. The camera hardware abstraction layer sends the received image data to the camera application through a stream pipeline, a Java native interface, and an image reading function. The state of the camera hardware abstraction layer is then acquired; if it is in the closed state, the image data to be encoded is obtained from the camera application through a data acquisition function, encoded using the first encoding mode to obtain image data in the target format, and the image data in the target format is stored. The stream pipeline is a buffer queue (BQ): image data to be encoded flows into the pipeline from the camera hardware abstraction layer and flows out of the pipeline to the Java native interface. The Java native interface localizes the image data to be encoded in Java; the image reading function packages the image data to be encoded into a structure; the data acquisition function acquires data from the camera application.
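The stream pipeline (buffer queue) between the hardware abstraction layer and the application can be modeled with a bounded queue. `ArrayBlockingQueue` here merely stands in for the actual buffer queue, and the class and method names are illustrative:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Toy model of the stream pipeline (buffer queue): the HAL side enqueues
// frames, the application side drains them; the queue is bounded like a
// real buffer queue.
class FramePipeline {
    private final BlockingQueue<byte[]> queue;

    FramePipeline(int capacity) {
        this.queue = new ArrayBlockingQueue<>(capacity);
    }

    boolean offerFromHal(byte[] frame) { // producer: HAL side
        return queue.offer(frame);       // false when the queue is full
    }

    byte[] pollInApp() {                 // consumer: application side
        return queue.poll();             // null when the queue is empty
    }
}
```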
It should be noted that when a photographing instruction is received, parameter information of the camera is also acquired. The parameter information may be passed to the camera framework layer through a notification thread (i.e., a notify thread) and then transmitted to the camera application. Because the camera application receives the image data to be encoded and the parameter information in the same thread, a correct time sequence can be ensured, and the parameter information corresponding to the image data in the target format is obtained. The parameter information refers to the setting options in the camera application (for example, resolution, whether geographical location information is recorded, whether a watermark is automatically applied, and the like).
According to this embodiment, when the camera hardware abstraction layer is in the closed state, the image data to be encoded is obtained from the camera application and software-encoded, so that the image data can still be encoded successfully when no data interaction is possible between the camera hardware abstraction layer and the camera application, avoiding the image loss that may occur with hardware encoding.
Referring to fig. 4, which is a schematic diagram of an implementation flow of an image encoding method provided in the third embodiment of the present application, where the image encoding method is applied to a mobile terminal, as shown in the figure, the image encoding method may include the following steps:
step 401, if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, acquiring a state of the camera hardware abstraction layer.
The step is the same as step 101, and reference may be made to the related description of step 101, which is not described herein again.
Step 402, if the camera hardware abstraction layer is in an open state, sending the image data to be encoded received by the camera application back to the camera hardware abstraction layer.
And 403, coding the image data to be coded according to the second coding mode at the camera hardware abstraction layer to generate the image data in the target format.
The second encoding mode is hardware encoding: the image data to be encoded is encoded by a graphics processing unit (GPU) integrated with the camera, that is, encoding is performed at the camera hardware abstraction layer.
When the camera hardware abstraction layer is in the open state, data interaction is possible between the camera hardware abstraction layer and the camera application. At this time, the image data to be encoded received by the camera application can be returned to the camera hardware abstraction layer, where hardware encoding is performed, so that the encoded image data is obtained.
Step 404, sending the image data in the target format generated at the camera hardware abstraction layer to the camera application.
Step 405, acquiring image data in a target format from the camera application program, and saving the image data in the target format.
After the image data in the target format is generated, the image data is saved, so that the subsequent viewing and use by a user can be facilitated.
As shown in fig. 5, which is a schematic diagram of an implementation architecture of the image encoding method provided in the third embodiment of the present application: after a photographing instruction is received, the camera acquires the image data to be encoded of the object to be photographed and sends it to the camera hardware abstraction layer. The camera hardware abstraction layer sends the received image data to the camera application through a stream pipeline, a Java native interface, and an image reading function, and the state of the camera hardware abstraction layer is acquired. If the camera hardware abstraction layer is in the open state, the image data to be encoded received by the camera application is sent back to the camera hardware abstraction layer, where it is encoded in the second encoding mode to obtain image data in the target format. The image data in the target format generated at the camera hardware abstraction layer is sent to the camera application through the stream pipeline, the Java native interface, and the image reading function; finally, it is acquired from the camera application through a data acquisition function and stored. The stream pipeline, Java native interface, image reading function, and data acquisition function are described in the second embodiment and are not repeated here.
It should be noted that when a photographing instruction is received, parameter information of the camera is also acquired. This parameter information may be delivered to the camera framework layer through a notification thread (i.e., a notify thread) and then passed on to the camera application. Because the camera application receives the image data in the target format and the parameter information on the same thread, the correct ordering is guaranteed, and the parameter information corresponding to the target-format image data is obtained. The parameter information refers to the setting options in the camera application (for example, resolution, whether geographical location information is recorded, whether a watermark is applied automatically, and so on).
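The same-thread delivery described above can be illustrated with a minimal Java sketch. The class name and string payloads below are assumptions for illustration, not identifiers from the patent: posting both the encoded image and its parameter information to one single-threaded executor guarantees the application observes them in submission order.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch (names and payloads are illustrative): delivering the
// target-format image and its parameter information on the same thread, as
// described above, so the application receives them in a fixed order.
public class SameThreadDelivery {
    public static List<String> deliver() {
        List<String> received = new ArrayList<>();
        // One single-threaded executor stands in for the shared delivery thread.
        ExecutorService appThread = Executors.newSingleThreadExecutor();
        appThread.submit(() -> received.add("image:target-format"));
        appThread.submit(() -> received.add("params:resolution,geo,watermark"));
        appThread.shutdown();
        try {
            appThread.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return received; // deterministic order: image first, then its parameters
    }
}
```

Because both items are serialized through a single thread, no extra synchronization is needed to pair a target-format image with its parameter information.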
According to this embodiment of the application, when the camera hardware abstraction layer is in the open state, the image data to be encoded that the camera application received is sent to the camera hardware abstraction layer and hardware-encoded there, so that the image data is encoded successfully while the image encoding efficiency is improved.
Fig. 6 is a schematic structural diagram of an image encoding device according to the fourth embodiment of the present application, and only the portions related to the embodiments of the present application are shown for convenience of description.
The image encoding device includes:
the state obtaining module 61 is configured to obtain a state of the camera hardware abstraction layer if it is detected that the camera hardware abstraction layer sends the obtained image data to be encoded to the camera application program, where the state of the camera hardware abstraction layer includes that the camera hardware abstraction layer is in an open state and the camera hardware abstraction layer is in a closed state;
the encoding determining module 62 is configured to determine a target encoding mode from a first encoding mode and a second encoding mode according to the state of the camera hardware abstraction layer, where the first encoding mode is used to indicate that encoding is performed on the image data to be encoded received by the camera application, and the second encoding mode is used to indicate that encoding is performed on the image data to be encoded acquired by the camera hardware abstraction layer;
and the image generating module 63 is configured to encode the image data to be encoded according to the target encoding mode, and generate image data in the target format.
Optionally, the code determining module 62 is specifically configured to:
if the camera hardware abstraction layer is in a closed state, determining that the first coding mode is a target coding mode;
and if the camera hardware abstraction layer is in the open state, determining that the second coding mode is the target coding mode.
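The selection rule just stated can be sketched as a small Java method. The enum and method names below are assumptions for illustration, not identifiers from the patent:

```java
// Hypothetical sketch of choosing the target encoding mode from the state of
// the camera hardware abstraction layer; all names are illustrative.
public class EncodingModeSelector {
    public enum HalState { OPEN, CLOSED }
    public enum EncodingMode { FIRST_SOFTWARE, SECOND_HARDWARE }

    // Closed HAL: no data interaction is possible, so fall back to software
    // encoding in the application (first mode). Open HAL: hand the data back
    // for hardware encoding (second mode).
    public static EncodingMode select(HalState state) {
        return state == HalState.CLOSED
                ? EncodingMode.FIRST_SOFTWARE
                : EncodingMode.SECOND_HARDWARE;
    }
}
```

The point of the design is that software encoding is only a fallback: whenever the hardware path is available, the data is routed back to the abstraction layer for the faster hardware encoder.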
Optionally, when the first encoding method is the target encoding method, the image encoding device further includes:
the first acquisition module is used for acquiring image data to be encoded from a camera application program;
correspondingly, the image generating module 63 is specifically configured to:
and coding the image data to be coded according to the first coding mode to generate the image data in the target format.
Optionally, when the second encoding method is the target encoding method, the image encoding device further includes:
the first sending module is used for sending the image data to be coded received by the camera application program to the camera hardware abstraction layer;
correspondingly, the image generating module 63 is specifically configured to:
and at the camera hardware abstraction layer, coding the image data to be coded according to the second coding mode to generate the image data in the target format.
Optionally, after generating the image data in the target format, the image encoding device further includes:
the second sending module is used for sending the image data in the target format generated at the camera hardware abstraction layer to the camera application program;
and the data processing module is used for acquiring the image data in the target format from the camera application program and storing the image data in the target format.
Optionally, the state obtaining module 61 is specifically configured to:
if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application, acquire the state of the camera module of the camera;
if the camera is in the open state and not in a called state, determine that the camera hardware abstraction layer is in the open state;
and if the camera is in an open and called state, or in the closed state, determine that the camera hardware abstraction layer is in the closed state.
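The two rules above amount to a simple predicate; a minimal sketch follows, where the class, method, and parameter names are assumptions for illustration:

```java
// Hypothetical sketch of deriving the hardware-abstraction-layer state from
// the camera's state; all names are illustrative only.
public class HalStateResolver {
    // The HAL is treated as open only when the camera is open and not
    // currently called (occupied) by another client; in every other case
    // it is treated as closed.
    public static boolean isHalOpen(boolean cameraOpen, boolean cameraCalled) {
        return cameraOpen && !cameraCalled;
    }
}
```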
Optionally, before it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, the image encoding apparatus further includes:
the second acquisition module is used for acquiring the image data to be encoded through the camera module of the camera;
and the third sending module is used for sending the image data to be coded to the camera hardware abstraction layer.
The image encoding device provided in the embodiment of the present application can be applied to the foregoing method embodiments, and for details, refer to the description of the foregoing method embodiments, which are not described herein again.
Fig. 7 is a schematic structural diagram of a mobile terminal according to the fifth embodiment of the present application. The mobile terminal shown in the figure may include: one or more processors 701 (only one shown), one or more input devices 702 (only one shown), one or more output devices 703 (only one shown), and a memory 704. The processor 701, the input device 702, the output device 703, and the memory 704 are connected by a bus 705. The memory 704 is used for storing instructions, and the processor 701 is used for executing the instructions stored in the memory 704. Wherein:
the processor 701 is configured to, if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, acquire a state of the camera hardware abstraction layer, where the state of the camera hardware abstraction layer includes that the camera hardware abstraction layer is in an on state and that the camera hardware abstraction layer is in an off state;
determine a target encoding mode from a first encoding mode and a second encoding mode according to the state of the camera hardware abstraction layer, where the first encoding mode is used to indicate that encoding is performed on the image data to be encoded received by the camera application, and the second encoding mode is used to indicate that encoding is performed on the image data to be encoded acquired by the camera hardware abstraction layer;
and coding the image data to be coded according to the target coding mode to generate the image data in the target format.
Optionally, the processor 701 is specifically configured to:
if the camera hardware abstraction layer is in a closed state, determining that the first coding mode is a target coding mode;
and if the camera hardware abstraction layer is in the open state, determining that the second coding mode is the target coding mode.
Optionally, when the first encoding manner is the target encoding manner, before encoding the image data to be encoded according to the target encoding manner, the processor 701 is further configured to:
acquiring image data to be encoded from a camera application;
correspondingly, the processor 701 is specifically configured to:
and coding the image data to be coded according to the first coding mode to generate the image data in the target format.
Optionally, when the second encoding mode is the target encoding mode, before encoding the image data to be encoded according to the target encoding mode, the processor 701 is further configured to:
Sending image data to be coded received by a camera application program to a camera hardware abstraction layer;
correspondingly, the processor 701 is specifically configured to:
and at the camera hardware abstraction layer, coding the image data to be coded according to the second coding mode to generate the image data in the target format.
Optionally, after generating the image data in the target format, the processor 701 is further configured to:
sending the image data in the target format generated at the camera hardware abstraction layer to a camera application program;
image data in a target format is acquired from a camera application and stored.
Optionally, the processor 701 is specifically configured to:
if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application, acquire the state of the camera module of the camera;
if the camera is in the open state and not in a called state, determine that the camera hardware abstraction layer is in the open state;
and if the camera is in an open and called state, or in the closed state, determine that the camera hardware abstraction layer is in the closed state.
Optionally, before detecting that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, the processor 701 is further configured to:
acquiring the image data to be encoded through the camera module of the camera;
and sending the image data to be coded to a camera hardware abstraction layer.
It should be understood that in this embodiment of the application, the processor 701 may be a CPU, or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 702 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, a data receiving interface, and the like. The output devices 703 may include a display (LCD, etc.), speakers, a data transmission interface, and so forth.
The memory 704 may include both read-only memory and random-access memory, and provides instructions and data to the processor 701. A portion of the memory 704 may also include non-volatile random access memory. For example, the memory 704 may also store device type information.
In a specific implementation, the processor 701, the input device 702, the output device 703, and the memory 704 described in this embodiment may execute the implementations described in the image encoding method embodiments of the present application, or may execute the implementation described for the image encoding apparatus in the fourth embodiment, which is not repeated here.
Fig. 8 is a schematic structural diagram of a mobile terminal according to the sixth embodiment of the present application. As shown in fig. 8, the mobile terminal 8 of this embodiment includes: one or more processors 80 (only one of which is shown), a memory 81, and a computer program 82 stored in the memory 81 and executable on the processor 80. The processor 80, when executing the computer program 82, implements the steps in the image encoding method embodiments described above, or, alternatively, implements the functions of the modules/units in the apparatus embodiments described above.
The mobile terminal 8 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or another computing device. The mobile terminal may include, but is not limited to, the processor 80 and the memory 81. Those skilled in the art will appreciate that fig. 8 is merely an example of the mobile terminal 8 and does not constitute a limitation of it; the mobile terminal may include more or fewer components than shown, combine some components, or use different components. For example, the mobile terminal may also include input/output devices, network access devices, buses, and the like.
The processor 80 may be a central processing unit CPU, but may also be other general purpose processors, digital signal processors DSP, application specific integrated circuits ASIC, off-the-shelf programmable gate arrays FPGA or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 81 may be an internal storage unit of the mobile terminal 8, such as a hard disk or a memory of the mobile terminal 8. The memory 81 may also be an external storage device of the mobile terminal 8, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the mobile terminal 8. Further, the memory 81 may also include both an internal storage unit and an external storage device of the mobile terminal 8. The memory 81 is used for storing the computer program and other programs and data required by the mobile terminal. The memory 81 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/mobile terminal and method may be implemented in other ways. For example, the above-described apparatus/mobile terminal embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow in the methods of the above embodiments may be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunications signals.
When the computer program product runs on a mobile terminal, the mobile terminal, by executing it, implements the steps in the method embodiments described above.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (8)

1. An image encoding method, characterized in that the image encoding method comprises:
if the camera hardware abstraction layer is detected to send the acquired image data to be coded to a camera application program, acquiring the state of the camera hardware abstraction layer, wherein the state of the camera hardware abstraction layer represents the data interaction characteristic between the camera hardware abstraction layer and the camera application program, and the state of the camera hardware abstraction layer comprises that the camera hardware abstraction layer is in an open state and the camera hardware abstraction layer is in a closed state;
determining a target coding mode from a first coding mode and a second coding mode according to the state of the camera hardware abstraction layer, wherein the first coding mode is used for indicating that software coding is carried out according to the image data to be coded received by the camera application program, and the second coding mode is used for indicating that hardware coding is carried out according to the image data to be coded acquired by the camera hardware abstraction layer;
coding the image data to be coded according to the target coding mode to generate image data in a target format;
the determining a target encoding mode from the first encoding mode and the second encoding mode according to the state of the camera hardware abstraction layer includes:
if the camera hardware abstraction layer is in a closed state, determining that the first coding mode is the target coding mode;
if the camera hardware abstraction layer is in an open state, determining that the second coding mode is the target coding mode;
if it is detected that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, acquiring the state of the camera hardware abstraction layer includes:
if the camera hardware abstraction layer is detected to send the acquired image data to be coded to the camera application program, acquiring the state of a camera of the camera;
if the camera is in an open state and is not in a called state, determining that the camera hardware abstraction layer is in an open state;
and if the camera is in an open and called state or in a closed state, determining that the camera hardware abstraction layer is in a closed state.
2. The image encoding method of claim 1, wherein if the first encoding scheme is the target encoding scheme, before the encoding of the image data to be encoded according to the target encoding scheme, the method comprises:
acquiring the image data to be encoded from the camera application;
correspondingly, the encoding the image data to be encoded according to the target encoding mode, and generating the image data in the target format includes:
and carrying out software coding on the image data to be coded according to the first coding mode to generate the image data in the target format.
3. The image encoding method of claim 1, wherein if the second encoding scheme is the target encoding scheme, before encoding the image data to be encoded according to the target encoding scheme, the method comprises:
sending the image data to be encoded received by the camera application to the camera hardware abstraction layer;
correspondingly, the encoding the image data to be encoded according to the target encoding mode, and generating the image data in the target format includes:
and at the camera hardware abstraction layer, carrying out hardware coding on the image data to be coded according to the second coding mode to generate the image data in the target format.
4. The image encoding method according to claim 3, comprising, after the generating of the image data of the target format:
sending the image data in the target format generated at the camera hardware abstraction layer to the camera application;
and acquiring the image data in the target format from the camera application program, and storing the image data in the target format.
5. The image encoding method of any one of claims 1 to 4, wherein before detecting that the camera hardware abstraction layer sends the acquired image data to be encoded to the camera application program, the method comprises:
acquiring image data to be encoded through a camera of a camera;
and sending the image data to be coded to the camera hardware abstraction layer.
6. An image encoding device, characterized by comprising:
the camera hardware abstraction layer state obtaining module is used for obtaining the state of the camera hardware abstraction layer if the camera hardware abstraction layer is detected to send the obtained image data to be coded to a camera application program, wherein the state of the camera hardware abstraction layer represents the data interaction characteristic between the camera hardware abstraction layer and the camera application program, and the state of the camera hardware abstraction layer includes that the camera hardware abstraction layer is in an open state and the camera hardware abstraction layer is in a closed state;
the encoding determining module is used for determining a target encoding mode from a first encoding mode and a second encoding mode according to the state of the camera hardware abstraction layer, wherein the first encoding mode is used for indicating that software encoding is carried out on image data to be encoded received according to the camera application program, and the second encoding mode is used for indicating that hardware encoding is carried out on the image data to be encoded obtained according to the camera hardware abstraction layer;
the image generation module is used for coding the image data to be coded according to the target coding mode and generating image data in a target format;
the code determination module is specifically configured to:
if the camera hardware abstraction layer is in a closed state, determining that the first coding mode is the target coding mode;
if the camera hardware abstraction layer is in an open state, determining that the second coding mode is the target coding mode;
the state acquisition module is specifically configured to:
if the camera hardware abstraction layer is detected to send the acquired image data to be coded to the camera application program, acquiring the state of a camera of the camera;
if the camera is in an open state and is not in a called state, determining that the camera hardware abstraction layer is in an open state;
and if the camera is in an open and called state or in a closed state, determining that the camera hardware abstraction layer is in a closed state.
7. A mobile terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the image encoding method according to any one of claims 1 to 5 when executing the computer program.
8. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the image coding method according to any one of claims 1 to 5.
CN202010418153.3A 2020-05-18 2020-05-18 Image coding method, image coding device and mobile terminal Active CN111447439B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010418153.3A CN111447439B (en) 2020-05-18 2020-05-18 Image coding method, image coding device and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010418153.3A CN111447439B (en) 2020-05-18 2020-05-18 Image coding method, image coding device and mobile terminal

Publications (2)

Publication Number Publication Date
CN111447439A CN111447439A (en) 2020-07-24
CN111447439B true CN111447439B (en) 2022-08-09

Family

ID=71655264

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010418153.3A Active CN111447439B (en) 2020-05-18 2020-05-18 Image coding method, image coding device and mobile terminal

Country Status (1)

Country Link
CN (1) CN111447439B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113626649A (en) * 2021-08-02 2021-11-09 Oppo广东移动通信有限公司 Data storage method, data storage device, storage medium and electronic equipment

Citations (2)

Publication number Priority date Publication date Assignee Title
CN105338249A (en) * 2015-11-24 2016-02-17 努比亚技术有限公司 Independent camera system-based shooting method and mobile terminal
CN110771174A (en) * 2018-11-21 2020-02-07 深圳市大疆创新科技有限公司 Video processing method, ground control terminal and storage medium

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US9232177B2 (en) * 2013-07-12 2016-01-05 Intel Corporation Video chat data processing
US10547657B2 (en) * 2016-05-25 2020-01-28 Mark Nataros System and method for video gathering and processing
CN106303226A (en) * 2016-08-01 2017-01-04 乐视控股(北京)有限公司 Image processing method and device
CN109672884B (en) * 2017-10-13 2022-05-10 斑马智行网络(香港)有限公司 Image hardware coding processing method and device
CN110955541B (en) * 2019-12-09 2022-04-15 Oppo广东移动通信有限公司 Data processing method, device, chip, electronic equipment and readable storage medium

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN105338249A (en) * 2015-11-24 2016-02-17 努比亚技术有限公司 Independent camera system-based shooting method and mobile terminal
CN110771174A (en) * 2018-11-21 2020-02-07 深圳市大疆创新科技有限公司 Video processing method, ground control terminal and storage medium

Also Published As

Publication number Publication date
CN111447439A (en) 2020-07-24

Similar Documents

Publication Publication Date Title
JP7429676B2 (en) Adaptive transfer functions for video encoding and decoding
CN108900770B (en) Method and device for controlling rotation of camera, smart watch and mobile terminal
US9906713B2 (en) Camera command set host command translation
CN107925749B (en) Method and apparatus for adjusting resolution of electronic device
CN108038112B (en) File processing method, mobile terminal and computer readable storage medium
CN112770059B (en) Photographing method and device and electronic equipment
WO2017202175A1 (en) Method and device for video compression and electronic device
CN112887608A (en) Image processing method and device, image processing chip and electronic equipment
JP2022547923A (en) Face image transmission method, value transfer method, device, electronic device
CN111447439B (en) Image coding method, image coding device and mobile terminal
CN112055156B (en) Preview image updating method and device, mobile terminal and storage medium
US9600296B2 (en) Executing a command within a transport mechanism based on a get and set architecture
KR20200108348A (en) Data transfer
CN109559319A (en) A kind of processing method and terminal of normal map
CN112419134A (en) Image processing method and device
CN112328351A (en) Animation display method, animation display device and terminal equipment
CN108958746B (en) Configuration file processing method, mobile terminal and computer readable storage medium
CN113438419B (en) Camera starting method and device and electronic equipment
CN108352161B (en) Dynamic audio codec enumeration
CN112199127A (en) Image data processing method and device, mobile terminal and storage medium
CN117412132A (en) Video generation method and device, electronic equipment and storage medium
CN117278693A (en) Image data processing circuit, method, electronic device, and medium
CN112258408A (en) Information display method and device and electronic equipment
KR20140042429A (en) Apparatus and method for capturing an image in a portable terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant