WO2023123787A1 - Image processing method, apparatus, electronic device and storage medium - Google Patents


Info

Publication number
WO2023123787A1
WO2023123787A1 (PCT/CN2022/090696)
Authority
WO
WIPO (PCT)
Prior art keywords
image
target
hardware abstraction
image processing
abstraction layer
Prior art date
Application number
PCT/CN2022/090696
Other languages
English (en)
French (fr)
Inventor
杨国昊
高川
张志辉
Original Assignee
北京小米移动软件有限公司 (Beijing Xiaomi Mobile Software Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 北京小米移动软件有限公司 (Beijing Xiaomi Mobile Software Co., Ltd.)
Publication of WO2023123787A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/54 Interprogram communication
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • the present disclosure relates to the technical field of electronic equipment, and in particular, to an image processing method, device, electronic equipment, and storage medium.
  • the present disclosure provides an image processing method, device, electronic equipment and storage medium.
  • an image processing method is provided, which is applied to an electronic device, and the operating system of the electronic device includes a camera capability interface module, a framework layer, a hardware abstraction layer, and an application layer provided with a third-party application,
  • the method includes:
  • the third-party application calls an interface provided by the target camera capability interface module to send an image processing request to the framework layer;
  • the framework layer sends the image processing request to the hardware abstraction layer
  • the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the image processing request, obtains the target image, and sends the target image to the third-party application, and the target processing strategy is a processing strategy supported by the native camera application.
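The three steps above can be sketched as a layered message flow. The following is a minimal, hypothetical Python model; none of the class or method names come from the patent or from the real Android APIs, and the camera capability interface is modeled as a direct call into the framework layer.

```python
# Hypothetical sketch of the request flow described above; all names are
# illustrative, not real Android or patent identifiers.

class HardwareAbstractionLayer:
    # Maps a request's task to a processing strategy supported by the
    # native camera application (stand-in string transforms).
    STRATEGIES = {
        "hdr": lambda img: f"hdr({img})",
        "blur": lambda img: f"blur({img})",
    }

    def handle(self, request, image):
        strategy = self.STRATEGIES[request["task"]]
        return strategy(image)            # the "target image"

class FrameworkLayer:
    def __init__(self, hal):
        self.hal = hal

    def handle(self, request, image):
        # The framework layer forwards the request down to the HAL.
        return self.hal.handle(request, image)

class ThirdPartyApp:
    def __init__(self, camera_interface):
        self.camera_interface = camera_interface   # models the interface module

    def shoot(self, task, image):
        request = {"task": task}                   # the image processing request
        return self.camera_interface.handle(request, image)

framework = FrameworkLayer(HardwareAbstractionLayer())
app = ThirdPartyApp(framework)
print(app.shoot("hdr", "raw"))   # -> hdr(raw)
```

The point of the sketch is only the direction of travel: the request descends application → framework → HAL, and the target image returns along the same path.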
  • the image processing request carries a target task identifier
  • the target task identifier is an identifier corresponding to a shooting function supported by a native camera application
  • the hardware abstraction layer processes the image to be processed according to a target processing strategy, including:
  • the hardware abstraction layer processes the image to be processed according to the target processing policy corresponding to the target task identifier.
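The task-identifier dispatch described above can be pictured as a lookup table. This is a hypothetical sketch: the identifier values and strategy names are invented for illustration (the patent later lists HDR, beautification, blur, super night scene and automatic processing as example functions).

```python
# Hypothetical mapping from target task identifiers to processing
# strategies; all values are illustrative.

TASK_STRATEGIES = {
    "TASK_HDR": "hdr_pipeline",
    "TASK_BEAUTIFY": "beautify_pipeline",
    "TASK_BLUR": "blur_pipeline",
    "TASK_SUPER_NIGHT": "super_night_pipeline",
    "TASK_AUTO": "auto_pipeline",
}

def select_strategy(image_processing_request: dict) -> str:
    """Return the target processing strategy for the request's task identifier."""
    task_id = image_processing_request["target_task_id"]
    try:
        return TASK_STRATEGIES[task_id]
    except KeyError:
        raise ValueError(f"no native strategy for task identifier {task_id!r}")

assert select_strategy({"target_task_id": "TASK_BLUR"}) == "blur_pipeline"
```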
  • the framework layer sends the image processing request to the hardware abstraction layer, including:
  • the framework layer encapsulates the image processing request to obtain a first image processing request
  • the framework layer sends the first image processing request to the hardware abstraction layer
  • the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the image processing request, including:
  • the hardware abstraction layer processes the image to be processed according to the target processing policy corresponding to the first image processing request.
  • the target processing strategy includes a target hardware processing strategy, a target software processing strategy, and a policy execution sequence
  • the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the image processing request, including:
  • the hardware abstraction layer processes the image to be processed according to the target hardware processing policy and the target software processing policy according to the policy execution sequence to obtain the target image.
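A target processing strategy that bundles a hardware strategy, a software strategy and an execution sequence can be sketched as an ordered pipeline. The following is a hypothetical model, with stand-in functions in place of the real hardware-layer and HAL algorithms.

```python
# Hypothetical sketch: apply the target hardware and software processing
# strategies in the order given by the policy execution sequence.

def hardware_process(image):
    # Stands in for the algorithm executed by the hardware layer.
    return f"hw({image})"

def software_process(image):
    # Stands in for the software algorithm held by the hardware abstraction layer.
    return f"sw({image})"

def apply_target_strategy(image, execution_order):
    steps = {"hardware": hardware_process, "software": software_process}
    for step in execution_order:      # e.g. ["hardware", "software"]
        image = steps[step](image)
    return image                      # the target image

assert apply_target_strategy("raw", ["hardware", "software"]) == "sw(hw(raw))"
assert apply_target_strategy("raw", ["software", "hardware"]) == "hw(sw(raw))"
```

The execution sequence is data, not code, so the same dispatcher covers both the hardware-first and software-first orderings that the detailed description discusses.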
  • the response to the first operation on the third-party application includes:
  • the camera capability interface module includes Camera X and a camera software development kit
  • the Camera X is set on the framework layer
  • the camera software development kit is set on the application layer.
  • the method also includes:
  • the third-party application calls the interface provided by the Camera X to send a query request to the framework layer, and the query request is used to query the framework layer's processing capability for the image processing request;
  • the third-party application receives the query result from the framework layer through the interface provided by the Camera X, and, in response to a query result indicating that the framework layer can process the image processing request, calls the interface provided by the Camera X to send the image processing request to the framework layer.
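The query-then-request handshake above can be sketched as follows. This is a hypothetical model; the method names are invented and the real Camera X interface is not reproduced here.

```python
# Hypothetical sketch of the capability query preceding the image
# processing request; all names are illustrative.

class Framework:
    SUPPORTED = {"TASK_HDR", "TASK_BLUR"}   # illustrative capability set

    def can_process(self, task_id):
        return task_id in self.SUPPORTED

    def process(self, task_id):
        return f"target image for {task_id}"

class CameraXInterface:
    def __init__(self, framework):
        self.framework = framework

    def query(self, task_id):               # models the query request
        return self.framework.can_process(task_id)

    def request(self, task_id):             # models the image processing request
        return self.framework.process(task_id)

def shoot(camera_x, task_id):
    # Send the image processing request only if the framework layer
    # reports that it can process it.
    if camera_x.query(task_id):
        return camera_x.request(task_id)
    return None

cx = CameraXInterface(Framework())
assert shoot(cx, "TASK_HDR") == "target image for TASK_HDR"
assert shoot(cx, "TASK_UNKNOWN") is None
```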
  • the hardware abstraction layer includes a first hardware abstraction layer and a second hardware abstraction layer
  • the framework layer sends the image processing request to the hardware abstraction layer, including:
  • the framework layer sends the image processing request to the first hardware abstraction layer
  • the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the image processing request to obtain the target image, including:
  • the first hardware abstraction layer obtains the target image processing policy corresponding to the image processing request, generates a second image processing request according to the image processing request and the target image processing policy, and sends it to the second hardware abstraction layer;
  • the second hardware abstraction layer acquires an image to be processed according to the second image processing request
  • the second hardware abstraction layer sends the image to be processed to the first hardware abstraction layer
  • the first hardware abstraction layer processes the image to be processed according to the target processing strategy to obtain a target image.
  • the method also includes:
  • the second hardware abstraction layer calls the camera to collect the original image through the camera driver
  • the second hardware abstraction layer receives the original image from the camera, and determines the original image as the image to be processed.
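The round trip between the two hardware abstraction layers can be sketched as below. This is a hypothetical model of the described split (the first HAL owning the processing strategies, the second HAL acquiring images via the camera driver); every name is invented for illustration.

```python
# Hypothetical model of the split hardware abstraction layer.

class SecondHAL:
    """Acquires the image to be processed via the camera driver."""

    def acquire(self, second_request):
        # Stand-in for calling the camera through the camera driver and
        # receiving the original image back.
        original = f"original_image({second_request['task']})"
        return original          # determined as the image to be processed

class FirstHAL:
    def __init__(self, second_hal):
        self.second_hal = second_hal

    def handle(self, image_processing_request):
        task = image_processing_request["task"]
        strategy = f"strategy_for_{task}"        # target image processing policy
        # Generate the second image processing request and send it down.
        second_request = {"task": task, "policy": strategy}
        image = self.second_hal.acquire(second_request)
        # Process the returned image according to the target strategy.
        return f"{strategy}({image})"            # the target image

hal = FirstHAL(SecondHAL())
assert hal.handle({"task": "hdr"}) == "strategy_for_hdr(original_image(hdr))"
```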
  • an image processing apparatus which is applied to an electronic device, and the operating system of the electronic device includes a camera capability interface module, a framework layer, a hardware abstraction layer, and an application layer provided with a third-party application,
  • the devices include:
  • the first sending module is configured such that, in response to the first operation on the third-party application, the third-party application calls an interface provided by the target camera capability interface module to send an image processing request to the framework layer;
  • the second sending module is configured to send the image processing request to the hardware abstraction layer by the framework layer;
  • the target image generating module is configured such that the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the image processing request to obtain a target image, and sends the target image to the third-party application,
  • the target processing strategies mentioned above are the processing strategies supported by native camera applications.
  • the image processing request carries a target task identifier
  • the target task identifier is an identifier corresponding to a shooting function supported by a native camera application
  • the target image generation module includes:
  • the first processing sub-module is configured such that the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the target task identifier.
  • the second sending module includes:
  • the first sending sub-module is configured such that the framework layer encapsulates the image processing request to obtain a first image processing request; the framework layer sends the first image processing request to the hardware abstraction layer.
  • the target image generation module includes:
  • the second processing sub-module is configured such that the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the first image processing request.
  • the target processing strategy includes a target hardware processing strategy, a target software processing strategy and a strategy execution sequence
  • the target image generation module includes:
  • the third processing sub-module is configured such that the hardware abstraction layer processes the image to be processed according to the target hardware processing policy and the target software processing policy according to the policy execution order to obtain the target image.
  • the first sending module includes:
  • the second sending submodule is configured to respond to a trigger operation on a target function control in the interface of the third-party application.
  • the camera capability interface module includes Camera X and a camera software development kit
  • the Camera X is set on the framework layer
  • the camera software development kit is set on the application layer.
  • the device also includes:
  • the query module is configured such that, when the target camera capability interface module is Camera X, the third-party application calls the interface provided by the Camera X to send a query request to the framework layer, and the query request is used to query the framework layer's processing capability for the image processing request;
  • the response module is configured such that the third-party application receives the query result from the framework layer through the interface provided by the Camera X, and, in response to a query result indicating that the framework layer can process the image processing request, calls the interface provided by the Camera X to send the image processing request to the framework layer.
  • the hardware abstraction layer includes a first hardware abstraction layer and a second hardware abstraction layer
  • the second sending module includes: a third sending submodule, configured such that the framework layer sends the image processing request to the first hardware abstraction layer.
  • the target image generation module includes: a target image generation sub-module configured such that the first hardware abstraction layer obtains a target image processing policy corresponding to the image processing request, generates a second image processing request according to the image processing request and the target image processing policy, and sends it to the second hardware abstraction layer; the second hardware abstraction layer acquires an image to be processed according to the second image processing request; the second hardware abstraction layer sends the image to be processed to the first hardware abstraction layer; and the first hardware abstraction layer processes the image to be processed according to the target processing strategy to obtain a target image.
  • the device also includes:
  • the image acquisition module is configured such that the second hardware abstraction layer calls the camera through the camera driver to acquire the original image.
  • the image-to-be-processed determining module is configured such that the second hardware abstraction layer receives the original image from the camera, and determines the original image as the image to be processed.
  • a computer-readable storage medium on which computer program instructions are stored, and when the program instructions are executed by a processor, the steps of the image processing method provided in the first aspect of the present disclosure are implemented.
  • an electronic device, the operating system of which includes a camera capability interface module, a framework layer, a hardware abstraction layer, and an application layer provided with a third-party application, and the electronic device includes: a memory, on which a computer program is stored; and a processor, configured to execute the computer program in the memory to realize the following steps through the camera capability interface module, the framework layer, the hardware abstraction layer, and the application layer provided with the third-party application:
  • the third-party application calls an interface provided by the target camera capability interface module to send an image processing request to the framework layer;
  • the framework layer sends the image processing request to the hardware abstraction layer
  • the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the image processing request, obtains the target image, and sends the target image to the third-party application, and the target processing strategy is a processing strategy supported by the native camera application.
  • a computer program including computer-readable code which, when run on a computing processing device, causes the computing processing device to perform the image processing method provided in the first aspect of the present disclosure.
  • a third-party application calls the interface provided by the camera capability interface module to send an image processing request to the framework layer; then the framework layer sends the image processing request to the hardware abstraction layer; finally, the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the image processing request to obtain a target image, and sends the target image to the third-party application.
  • the image processing request can be sent to the hardware abstraction layer through the camera capability interface module and the framework layer, so that the hardware abstraction layer can call the processing strategy supported by the native camera to process the image processing request initiated by the third-party application.
  • In this way, the functions of the camera's underlying hardware/software algorithms are opened to third-party applications.
  • Fig. 1 is a schematic structural diagram of an operating system of an electronic device according to an exemplary embodiment of the present disclosure.
  • Fig. 2 is a flow chart showing an image processing method according to an exemplary embodiment of the present disclosure.
  • Fig. 3 is a schematic diagram of an interface of a third-party application proposed according to an embodiment of the present disclosure.
  • Fig. 4 is a schematic structural diagram of another electronic device operating system proposed according to an embodiment of the present disclosure.
  • Fig. 5 is a flow chart showing another image processing method according to an exemplary embodiment of the present disclosure.
  • Fig. 6 is a structural block diagram of an image processing device according to an exemplary embodiment of the present disclosure.
  • Fig. 7 is a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of a computing processing device provided by an embodiment of the present disclosure.
  • FIG. 9 is a schematic diagram of a storage unit for portably or fixedly storing program code implementing the method according to the present disclosure, provided by an embodiment of the present disclosure.
  • FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
  • the electronic device includes an operating system, which may be an Android system.
  • the operating system includes a camera capability interface module, an application layer, a framework layer, a hardware abstraction layer, a kernel layer, and a hardware layer.
  • the application layer is provided with a third-party application and a native camera application of the terminal.
  • the framework layer includes various native application interfaces, such as camera application interfaces, application services (such as camera application services), and framework layer interfaces.
  • the framework layer is also provided with a camera capability expansion module.
  • the hardware abstraction layer includes a hardware abstraction layer interface (for example: HAL3.0).
  • an image processing strategy matching module and an image processing strategy module are also set in the hardware abstraction layer.
  • the kernel layer includes various drivers (such as a display driver, an audio driver, etc.).
  • the hardware layer includes various hardware (such as an image signal processor ISP+, front-end image sensors, etc.).
  • Fig. 2 is a flow chart of an image processing method according to an exemplary embodiment.
  • the method can be applied to electronic devices, and the electronic devices can include mobile phones, notebooks, tablet computers, smart wearable devices, and the like.
  • the method includes:
  • the third-party application calls an interface provided by the target camera capability interface module to send an image processing request to the framework layer.
  • the electronic device may include an operating system as shown in FIG. 1 , and the operating system includes a camera capability interface module, a framework layer, a hardware abstraction layer, and an application layer configured with a third-party application.
  • the third-party application may be an application with a camera shooting function.
  • it may be a third-party camera application, or an application with a camera function such as WeChat, QQ, and Alipay.
  • the first operation may be used to trigger the start of the third-party application, such as clicking an application icon or starting by voice.
  • the first operation may also be used to trigger the start of a camera function of the third-party application, such as entering a shooting interface.
  • responding to the first operation on the third-party application includes a step of: responding to a trigger operation on the target function control in the interface of the third-party application.
  • the camera control in the function selection area of the chat interface can be triggered, so that the third-party application can call the interface provided by the target camera capability interface module to send an image processing request to the framework layer.
  • the camera capability interface module may provide an interface for data transmission between the third-party application and the framework layer, so that the third-party application can send an image processing request to the framework layer through the interface.
  • the camera capability interface module may be Camera X (also called Android Camera X), an open channel for camera capabilities provided by Google.
  • Camera X can be integrated in the operating system. In some implementations, Camera X can be set in the framework layer.
  • Camera X provides interfaces for various standard image processing tasks, and the image processing request may be a processing request corresponding to one of the various standard image processing tasks provided by Camera X.
  • the camera capability interface module can also be a camera software development kit (Camera SDK).
  • the camera software development kit can also provide an interface between third-party applications and the framework layer, so that the third-party application can send image processing requests to the framework layer through this interface.
  • the camera software development kit can be set at the application layer, that is, run in a third-party application.
  • the framework layer sends the image processing request to the hardware abstraction layer.
  • the framework layer may send the image processing request to the hardware abstraction layer.
  • the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the image processing request, obtains the target image, and sends the target image to a third-party application, where the target processing strategy is a processing strategy supported by the native camera application.
  • the hardware abstraction layer may first obtain the target processing policy corresponding to the image processing request, and then process the image to be processed according to the target processing policy to obtain the target image.
  • the hardware abstraction layer can then send the target image to a third-party application.
  • the third-party application calls the interface provided by the camera capability interface module to send an image processing request to the framework layer; then the framework layer sends the image processing request to the hardware abstraction layer; finally, the hardware abstraction layer processes the image according to the target processing strategy corresponding to the image processing request Process the image to be processed to obtain the target image, and send the target image to a third-party application.
  • the image processing request can be sent to the hardware abstraction layer through the camera capability interface module and the framework layer, so that the hardware abstraction layer can call the processing strategy supported by the native camera to process the image processing request initiated by the third-party application.
  • the third-party application opens the function of the camera's underlying hardware/software algorithm processing strategy.
  • the image processing request can carry a target task identifier, which is the identifier corresponding to the shooting function, supported by the underlying hardware/software algorithms of the native camera application, that the third-party application wants to call.
  • processing the image to be processed by the hardware abstraction layer according to the target processing strategy includes the step of: the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the target task identifier.
  • a field can be allocated in the image processing request to store the target task identifier, so that the third-party application can accurately call the functional interface provided by the camera capability interface module and send the image processing request to the hardware abstraction layer through the framework layer; after receiving the image processing request, the hardware abstraction layer can then accurately call the target processing strategy corresponding to the target task identifier to process the image to be processed.
  • the target processing policy corresponding to the target task identifier may be determined by an image processing policy matching module in the hardware abstraction layer.
  • the target task identifier may include identifiers corresponding to the HDR function, the beautification function, the blur function, the super night scene function, and the automatic processing function.
  • the third-party application may generate an image processing request corresponding to the blur function, and the image processing request may carry an identifier corresponding to the blur function.
  • the third-party application can call the interface corresponding to the blur function provided by the camera capability interface module according to the identifier corresponding to the blur function, and send an image processing request to the framework layer.
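The blur example above can be sketched from the application side. This is a hypothetical sketch: the identifier values, interface names and request fields are all invented for illustration.

```python
# Hypothetical app-side sketch: build an image processing request carrying
# the identifier of the desired function, then pick the matching interface
# exposed by the camera capability interface module.

FUNCTION_INTERFACES = {
    "TASK_BLUR": "blur_interface",
    "TASK_HDR": "hdr_interface",
}

def build_request(task_id, params=None):
    # A field in the request stores the target task identifier.
    return {"target_task_id": task_id, "params": params or {}}

def send_to_framework(request):
    # Choose the interface corresponding to the function identifier; the
    # chosen interface carries the request to the framework layer.
    interface = FUNCTION_INTERFACES[request["target_task_id"]]
    return interface, request

interface, request = send_to_framework(build_request("TASK_BLUR"))
assert interface == "blur_interface"
assert request["target_task_id"] == "TASK_BLUR"
```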
  • sending the image processing request from the framework layer to the hardware abstraction layer includes: the framework layer encapsulates the image processing request to obtain the first image processing request; the framework layer sends the first image processing request to the hardware abstraction layer.
  • the hardware abstraction layer processes the image to be processed according to the target processing policy corresponding to the image processing request, including: the hardware abstraction layer processes the image to be processed according to the target processing policy corresponding to the first intermediate image processing request.
  • the image processing request can be parsed through the framework layer first, and then the framework layer encapsulates the parsed content according to the request type that the hardware abstraction layer can recognize to obtain the first image processing request, and then the framework layer Then send the first image processing request to the hardware abstraction layer, so that the hardware abstraction layer can analyze the received first image processing request, thereby obtaining the target processing strategy corresponding to the first image processing request, and according to the target The processing strategy processes the image to be processed.
  • the parsing and encapsulation of the image processing request may be performed by the camera capability expansion module in the framework layer.
  • after the framework layer encapsulates the parsed content and obtains the first image processing request, it can send the first image processing request to the hardware abstraction layer through the Android native information link, that is, through the camera application interface, the framework layer interface, and the hardware abstraction layer interface in turn.
  • although the image processing request is parsed and encapsulated at the framework layer to obtain the first image processing request, the underlying camera hardware/software algorithm processing strategy requested by the first image processing request is the same as that requested by the original image processing request. That is, the framework layer parses and encapsulates the image processing request through different protocols without changing the substance of the request: substantive content such as the target task identifier and function execution parameters does not change, and therefore the target processing strategy finally invoked by the hardware abstraction layer does not change either.
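This substance-preserving re-wrap can be sketched as follows. The protocol tag and field names are hypothetical; the only point is that the payload carried inside the wrapper is unchanged.

```python
# Hypothetical sketch of the framework-layer encapsulation: the request is
# re-wrapped in a protocol the HAL understands, but the substantive fields
# (target task identifier, function execution parameters) are untouched.

def encapsulate(image_processing_request: dict) -> dict:
    # Parse, then wrap into the "first image processing request".
    return {
        "protocol": "hal_v1",                       # illustrative protocol tag
        "payload": dict(image_processing_request),  # substance copied verbatim
    }

original = {"target_task_id": "TASK_HDR", "params": {"ev": 0}}
first_request = encapsulate(original)

# The wrapper changes; the substance does not, so the HAL resolves the
# same target processing strategy from either form of the request.
assert first_request["payload"] == original
```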
  • the application layer can provide the user with an interface corresponding to the application program.
  • FIG. 3 shows a camera function interface of a third-party application.
  • the interface includes an image preview area 10.
  • the task function selection control 20 near the image preview area 10 includes camera function controls originally belonging to the native camera application, such as a camera function 1 control, a camera function 2 control, and a camera function 3 control. Since the operating system architecture and the image processing method provided by the embodiments of the present disclosure can be applied, task functions originally belonging to the native camera application, such as HDR, beautification, blur, super night scene, and automation, can also be used in the third-party application.
  • in response to the first operation on the third-party application, the application layer displays the interface shown in Fig. 3 to the user; the electronic device can then respond to a trigger operation on the target camera function control and generate an image processing request.
  • the target camera function control can be understood as the camera function control originally belonging to the native camera application.
  • the target camera function control can be displayed in the interface of the third-party application, so that the user can trigger it to realize the desired native camera functionality. Furthermore, after the user triggers the target camera function control, the third-party application can generate an image processing request.
  • the target processing strategy includes a target hardware processing strategy, a target software processing strategy, and a strategy execution sequence. In this case, the hardware abstraction layer processing the image to be processed according to the target processing strategy corresponding to the image processing request may include the step of:
  • the hardware abstraction layer processes the image to be processed according to the target hardware processing strategy and the target software processing strategy according to the policy execution sequence to obtain the target image.
  • the hardware processing strategy can be understood as a strategy for processing using a hardware algorithm provided by the hardware layer.
  • the hardware abstraction layer processes the image to be processed according to the target hardware processing strategy, which may include steps:
  • the hardware abstraction layer sends the target hardware processing strategy to the hardware layer; the hardware layer processes the image to be processed according to the target hardware processing strategy.
  • the hardware abstraction layer can encapsulate the target processing strategy, generate a new image processing request, and then send the new image processing request to the kernel layer for processing, so that the kernel layer can call the corresponding hardware, so that the corresponding hardware processes the image to be processed according to the target hardware algorithm.
  • the target hardware algorithm is a hardware algorithm corresponding to the target task identifier.
  • the hardware abstraction layer encapsulates the target hardware processing strategy according to the protocol between different layers, without changing the content of the target hardware processing strategy.
  • the software processing strategy can be understood as a strategy for processing using software algorithms provided by the hardware abstraction layer.
  • when the hardware abstraction layer processes the image to be processed according to the target software processing policy, it may call the target software algorithm stored by itself to process the image to be processed.
  • the target software algorithm is a software algorithm corresponding to the target task identifier.
  • the image to be processed can be processed according to the target software processing strategy through the image processing strategy module in the hardware abstraction layer.
  • the process of the hardware abstraction layer processing the image to be processed according to the target hardware processing policy and the process of the hardware abstraction layer processing the image to be processed according to the target software processing policy may be performed sequentially or alternately.
  • the image to be processed is firstly processed by the hardware algorithm of the hardware layer, and then the processed intermediate image is processed by the software algorithm of the hardware abstraction layer, so as to obtain the target image.
  • the image to be processed is firstly processed by the software algorithm of the hardware abstraction layer, and then the processed intermediate image is processed by the hardware algorithm of the hardware layer, so as to obtain the target image.
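The two execution orders above can be condensed into one sketch. This is a minimal illustration, not the disclosed implementation: the function names and the list-based stand-in for an image are assumptions made here for clarity.

```python
def apply_hardware_strategy(image):
    # Stand-in for the hardware layer's processing step (e.g. an ISP stage).
    return image + ["hw"]

def apply_software_strategy(image):
    # Stand-in for a software algorithm held by the hardware abstraction layer.
    return image + ["sw"]

def process(image, execution_order):
    # Run the strategies in the requested order; each step consumes the
    # intermediate image produced by the previous step.
    steps = {"hardware": apply_hardware_strategy,
             "software": apply_software_strategy}
    for name in execution_order:
        image = steps[name](image)
    return image
```

With `["hardware", "software"]` the hardware algorithm produces the intermediate image and the software algorithm then yields the target image; reversing the list models the opposite order described above.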
  • the image to be processed targeted by the above hardware algorithm or software algorithm may be the original image collected by the camera.
  • the method of the embodiment of the present disclosure may further include the following steps:
  • the hardware abstraction layer calls the camera to collect the original image through the camera driver; the hardware abstraction layer receives the original image from the camera, and determines the original image as the image to be processed.
  • the process of the hardware abstraction layer calling the camera to acquire the original image through the camera driver is different from the process of the hardware layer processing the image to be processed according to the target hardware processing policy.
  • the former is the process of generating the original image
  • the latter is the process of processing the image to be processed according to the target hardware processing strategy.
  • an image may be collected by a camera, and sent to the hardware abstraction layer as an image to be processed.
  • the image to be processed targeted by the hardware algorithm or the software algorithm may also be a network image sent from a third-party application.
  • the image to be processed may be the original image collected by calling the camera through the camera driver, or may be a network image sent from a third-party application, which improves the diversity of image data sources.
  • the image to be processed targeted by the above hardware algorithm or software algorithm may also be an intermediate image obtained after the original image collected by the camera is first processed by the hardware algorithm or software algorithm.
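The three possible sources of the image to be processed can be sketched as a simple dispatch. The source tags and the capture stub below are hypothetical names introduced only for illustration.

```python
def capture_via_camera_driver():
    # Stub for the hardware abstraction layer calling the camera through
    # the camera driver and receiving the original image.
    return "original-image"

def resolve_image_to_be_processed(source, payload=None):
    # Select the image to be processed from one of the three sources above.
    if source == "camera":
        return capture_via_camera_driver()
    if source == "network":
        return payload  # network image sent by a third-party application
    if source == "intermediate":
        return payload  # output of an earlier hardware/software step
    raise ValueError(f"unknown image source: {source}")
```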
  • the hardware abstraction layer includes a first hardware abstraction layer and a second hardware abstraction layer, wherein the first hardware abstraction layer can be understood as a new hardware abstraction layer added by an electronic device manufacturer, and the second hardware abstraction layer can be understood as the hardware abstraction layer provided by the platform.
  • the framework layer sends the image processing request to the hardware abstraction layer, including the step: the framework layer sends the image processing request to the first hardware abstraction layer; in this case, the hardware abstraction layer processing the image to be processed according to the target processing strategy to obtain the target image includes the steps of: the first hardware abstraction layer obtains the target image processing strategy corresponding to the image processing request, generates a second image processing request according to the image processing request and the target image processing strategy, and sends it to the second hardware abstraction layer; the second hardware abstraction layer obtains the image to be processed according to the second image processing request; the second hardware abstraction layer sends the image to be processed to the first hardware abstraction layer; the first hardware abstraction layer processes the image to be processed according to the target processing strategy to obtain the target image.
  • the image processing request sent by the framework layer can be received through the first hardware abstraction layer, and then the first hardware abstraction layer obtains the target image processing strategy corresponding to the image processing request.
  • the first hardware abstraction layer can generate a second image processing request according to the image processing request and the target image processing strategy, and send it to the second hardware abstraction layer
  • after receiving the second image processing request, the second hardware abstraction layer can acquire the image to be processed according to the second image processing request, and then send the image to be processed to the first hardware abstraction layer, so that the first hardware abstraction layer can process the image to be processed according to the target processing strategy to obtain the target image.
  • the image processing request received by the first hardware abstraction layer from the framework layer may be the aforementioned first image processing request.
  • the first hardware abstraction layer may acquire a target image processing policy corresponding to the image processing request according to the target task identifier carried in the image processing request.
  • the image to be processed obtained by the second hardware abstraction layer according to the second image processing request may be the original image collected by calling the camera through the camera driver, or a network image sent from a third-party application, or an image that the hardware layer has processed using the target hardware processing strategy.
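The cooperation between the two hardware abstraction layers might be sketched as below. The class names, dictionary fields, and the `"hdr"` task identifier are assumptions for illustration only; the real layers are platform components, not Python classes.

```python
class SecondHardwareAbstractionLayer:
    # Platform-provided layer: acquires the image to be processed.
    def acquire_image(self, second_request):
        # Stand-in: hand back the raw image named in the original request.
        return second_request["request"]["image"]

class FirstHardwareAbstractionLayer:
    # Manufacturer-added layer: matches and applies the processing strategy.
    def __init__(self, second_hal, strategies):
        self.second_hal = second_hal
        self.strategies = strategies  # target task identifier -> strategy

    def handle(self, request):
        # 1. Obtain the target processing strategy for the request.
        strategy = self.strategies[request["task_id"]]
        # 2. Build the second image processing request and send it down.
        second_request = {"request": request, "strategy": strategy}
        image = self.second_hal.acquire_image(second_request)
        # 3. Process the returned image according to the target strategy.
        return strategy(image)
```

For example, `FirstHardwareAbstractionLayer(SecondHardwareAbstractionLayer(), {"hdr": lambda img: img + "+hdr"})` would turn a request for task `"hdr"` on image `"raw"` into `"raw+hdr"`.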
  • the hardware abstraction layer in the operating system of the electronic device may include a first hardware abstraction layer and a second hardware abstraction layer, wherein the first hardware abstraction layer is provided with a first hardware abstraction layer interface (for example: HAL3.0), an image processing strategy matching module and an image processing strategy module, wherein the image processing strategy module includes a variety of native camera application bottom-layer processing algorithms; the second hardware abstraction layer is provided with a second hardware abstraction layer interface (for example: HAL3.0); in addition, in Fig. 4, the camera capability interface module is Camera X.
  • by setting the first hardware abstraction layer and the second hardware abstraction layer, a cross-platform design is adopted: when the platform is replaced or upgraded, the first hardware abstraction layer can still be combined with the second hardware abstraction layer provided by any platform to realize the image processing method of the embodiment of the present disclosure, which expands the scope of application of the image processing method of the embodiment of the present disclosure.
  • Camera X provides a standard interface through which the framework layer can be accessed.
  • the framework layer of the electronic device can process the image processing request, and sends the processed first image processing request to the hardware abstraction layer.
  • the method of the embodiment of the present disclosure further includes the step: in the case that the target camera capability interface module corresponding to the interface identifier is Camera X, the third-party application calls the interface provided by Camera X to send a query request to the framework layer, where the query request is used to query the processing capability of the framework layer for the image processing request; the third-party application receives the query result from the framework layer through the interface provided by Camera X, and in response to the query result indicating that the framework layer can process the image processing request, calls the interface provided by Camera X to send the image processing request to the framework layer.
  • Camera X can provide a query interface, so that the third-party application can send a query request to the framework layer through the interface to query whether the framework layer can process the image processing request from the third-party application; if the framework layer returns a query result indicating that the image processing request can be processed, the third-party application can call the interface provided by Camera X to send the image processing request to the framework layer.
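The query-then-send handshake could look roughly like this. The `supported_tasks` set and the tuple-shaped result are illustrative assumptions, not the actual Camera X interface.

```python
class FrameworkLayer:
    def __init__(self, supported_tasks):
        self.supported_tasks = set(supported_tasks)

    def can_process(self, request):
        # Answer the query: can this image processing request be handled?
        return request["task_id"] in self.supported_tasks

def send_image_processing_request(app_request, framework):
    # Query first through the Camera X interface; only send the real
    # request when the framework reports that it can process it.
    if framework.can_process(app_request):
        return ("sent", app_request)
    return ("unsupported", None)
```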
  • the third-party application sends an image processing request to Camera X.
  • the image processing request includes a target task identifier.
  • Camera X sends an image processing request to the camera capability expansion module in the framework layer.
  • the camera capability expansion module parses and encapsulates the image processing request to obtain a first image processing request.
  • the first image processing request includes a target task identifier.
  • the camera capability expansion module sends a first image processing request to the first hardware abstraction layer.
  • the first hardware abstraction layer parses the first image processing request to obtain a target task identifier, and acquires a target processing strategy corresponding to the target task identifier.
  • obtaining the target processing strategy corresponding to the target task identifier may be realized by an image processing strategy matching module in the first hardware abstraction layer.
  • the first hardware abstraction layer encapsulates the first image processing request and the target processing strategy to obtain a second image processing request.
  • the first hardware abstraction layer sends the second image processing request to the second hardware abstraction layer.
  • the second image processing request includes a target hardware processing policy, a target software processing policy, and a policy execution sequence.
  • the following takes the case in which the target hardware processing policy is executed first and the target software processing policy is executed afterwards as an example for illustration.
  • the second hardware abstraction layer sends the target hardware processing policy to the hardware layer.
  • the hardware layer processes the image to be processed according to the target hardware processing strategy to obtain an intermediate image.
  • the hardware layer returns the intermediate image to the first hardware abstraction layer.
  • the image processing strategy module in the first hardware abstraction layer processes the intermediate image according to the target software processing strategy to obtain the target image.
  • the target software processing strategy in the first hardware abstraction layer may be stored by the first hardware abstraction layer itself, or may be returned by the second hardware abstraction layer.
  • the first hardware abstraction layer returns the target image to the third-party application.
  • the path through which the first hardware abstraction layer returns the target image to the third-party application is just the reverse of the path through which the third-party application sends the first image processing request to the first hardware abstraction layer; that is, the target image also passes through the camera capability expansion module in the framework layer and Camera X along the way.
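The step-by-step flow above, with the hardware strategy executed before the software strategy, can be condensed into one sketch. All names are illustrative, and the layers are collapsed into plain functions purely for readability.

```python
def run_pipeline(task_id, raw_image, hw_strategies, sw_strategies):
    # Framework layer: parse and encapsulate the app's request into the
    # first image processing request carrying the target task identifier.
    first_request = {"task_id": task_id}
    # First hardware abstraction layer: match both target strategies.
    hw = hw_strategies[first_request["task_id"]]
    sw = sw_strategies[first_request["task_id"]]
    # Second hardware abstraction layer forwards the hardware strategy to
    # the hardware layer, which produces the intermediate image.
    intermediate = hw(raw_image)
    # The image processing strategy module in the first hardware abstraction
    # layer applies the software strategy to obtain the target image.
    target = sw(intermediate)
    # The target image travels the reverse path back to the application.
    return target
```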
  • the execution order of some processes in the embodiments of the present disclosure may also be executed in a different execution order than that recorded in the foregoing specific embodiments.
  • two sequential processes can actually be executed in parallel, or they can sometimes be executed in reverse order, depending on the functions involved.
  • Fig. 6 is a structural block diagram of an image processing device according to an exemplary embodiment.
  • the device 200 is applied to an electronic device, and the operating system of the electronic device includes a camera capability interface module, a framework layer, a hardware abstraction layer, and an application layer provided with a third-party application.
  • the device 200 includes:
  • the first sending module 210 is configured to respond to the first operation to the third-party application, and the third-party application calls an interface provided by the target camera capability interface module to send an image processing request to the framework layer;
  • the second sending module 220 is configured so that the framework layer sends the image processing request to the hardware abstraction layer;
  • the target image generation module 230 is configured such that the hardware abstraction layer processes the image to be processed according to the target processing policy corresponding to the image processing request to obtain a target image, and sends the target image to the third-party application,
  • the target processing strategy is a processing strategy supported by native camera applications.
  • the image processing request carries a target task identifier
  • the target task identifier is an identifier corresponding to a shooting function supported by a native camera application
  • the target image generation module 230 includes:
  • the first processing sub-module is configured such that the hardware abstraction layer processes the image to be processed according to the target processing policy corresponding to the target task identifier.
  • the second sending module 220 includes:
  • the first sending sub-module is configured such that the framework layer encapsulates the image processing request to obtain a first image processing request; the framework layer sends the first image processing request to the hardware abstraction layer.
  • the target image generation module includes:
  • the second processing sub-module is configured such that the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the first image processing request.
  • the target processing strategy includes a target hardware processing strategy, a target software processing strategy, and a strategy execution sequence
  • the target image generation module 230 includes:
  • the third processing sub-module is configured such that the hardware abstraction layer processes the image to be processed according to the target hardware processing policy and the target software processing policy according to the policy execution order to obtain the target image.
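A target processing strategy of this shape might be modeled as a small record; the field names below are assumptions made here, not terms from the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TargetProcessingStrategy:
    hardware_strategy: Callable   # executed by the hardware layer
    software_strategy: Callable   # executed by the hardware abstraction layer
    execution_order: List[str]    # policy execution sequence, e.g. ["hardware", "software"]

    def run(self, image):
        # Apply the two strategies in the recorded execution order.
        steps = {"hardware": self.hardware_strategy,
                 "software": self.software_strategy}
        for name in self.execution_order:
            image = steps[name](image)
        return image
```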
  • the first sending module 210 includes:
  • the second sending submodule is configured to, in response to a trigger operation on a target function control in the interface of the third-party application, call the interface provided by the target camera capability interface module to send the image processing request to the framework layer.
  • the camera capability interface module includes Camera X and a camera software development kit
  • the Camera X is set on the framework layer
  • the camera software development kit is set on the application layer.
  • device 200 also includes:
  • Inquiry module configured to, in the case that the target camera capability interface module is Camera X, the third-party application calls the interface provided by Camera X to send a query request to the framework layer, and the query request is used for query The processing capability of the framework layer for the image processing request;
  • the response module is configured such that the third-party application receives the query result from the framework layer through the interface provided by the Camera X, and responds to the query result that the framework layer can process the image processing request, Call the interface provided by the Camera X to send an image processing request to the framework layer.
  • the hardware abstraction layer includes a first hardware abstraction layer and a second hardware abstraction layer
  • the second sending module 220 includes: a third sending submodule, configured such that the framework layer sends the image processing request to the first hardware abstraction layer.
  • the target image generation module includes: a target image generation sub-module, configured such that the first hardware abstraction layer obtains a target image processing policy corresponding to the image processing request, generates a second image processing request according to the image processing request and the target image processing policy, and sends it to the second hardware abstraction layer; the second hardware abstraction layer acquires an image to be processed according to the second image processing request; the second hardware abstraction layer sends the image to be processed to the first hardware abstraction layer; and the first hardware abstraction layer processes the image to be processed according to the target processing strategy to obtain a target image.
  • device 200 also includes:
  • the image acquisition module is configured such that the second hardware abstraction layer calls the camera through the camera driver to acquire the original image.
  • the image-to-be-processed determining module is configured such that the second hardware abstraction layer receives the original image from the camera, and determines the original image as the image to be processed.
  • the present disclosure also provides a computer-readable storage medium on which computer program instructions are stored, and when the program instructions are executed by a processor, the steps of the image processing method provided in the present disclosure are implemented.
  • the present disclosure also provides an electronic device.
  • the operating system of the electronic device includes a camera capability interface module, a framework layer, a hardware abstraction layer, and an application layer provided with third-party applications.
  • the electronic device includes:
  • the processor is used to execute the computer program in the memory, so as to realize the following steps through the camera capability interface module, the framework layer, the hardware abstraction layer and the application layer provided with third-party applications:
  • the third-party application calls an interface provided by the target camera capability interface module to send an image processing request to the framework layer;
  • the framework layer sends the image processing request to the hardware abstraction layer
  • the hardware abstraction layer processes the image to be processed according to the target processing strategy corresponding to the image processing request, obtains the target image, and sends the target image to the third-party application, and the target processing strategy is a native camera application Supported processing strategies.
  • the image processing request carries a target task identifier
  • the target task identifier is an identifier corresponding to a shooting function supported by a native camera application
  • the processor of the electronic device is also used to execute a computer program in the memory, so as to implement the following steps through the camera capability interface module, framework layer, hardware abstraction layer, and application layer provided with third-party applications:
  • the hardware abstraction layer processes the image to be processed according to the target processing policy corresponding to the target task identifier
  • the processor of the electronic device is also used to execute the computer program in the memory, so as to implement the following steps through the camera capability interface module, the framework layer, the hardware abstraction layer, and the application layer provided with third-party applications:
  • the framework layer encapsulates the image processing request to obtain a first image processing request
  • the framework layer sends the first image processing request to the hardware abstraction layer
  • the hardware abstraction layer processes the image to be processed according to the target processing policy corresponding to the first image processing request.
  • the target processing strategy includes a target hardware processing strategy, a target software processing strategy, and a strategy execution sequence.
  • the processor of the electronic device is also used to execute the computer program in the memory, so as to implement the following steps through the camera capability interface module, framework layer, hardware abstraction layer, and application layer provided with third-party applications:
  • the hardware abstraction layer processes the image to be processed according to the target hardware processing policy and the target software processing policy according to the policy execution sequence to obtain the target image.
  • the processor of the electronic device is also used to execute the computer program in the memory, so as to implement the following steps through the camera capability interface module, the framework layer, the hardware abstraction layer, and the application layer provided with third-party applications:
  • the camera capability interface module includes Camera X and a camera software development kit
  • the Camera X is set on the framework layer
  • the camera software development kit is set on the application layer.
  • the processor of the electronic device is also used to execute the computer program in the memory, so as to implement the following steps through the camera capability interface module, the framework layer, the hardware abstraction layer, and the application layer provided with third-party applications:
  • the third-party application calls the interface provided by the Camera X to send a query request to the framework layer, and the query request is used to query the framework layer for all The processing capability of the above image processing request;
  • the third-party application receives the query result from the framework layer through the interface provided by the Camera X, and in response to the query result that the framework layer can process the image processing request, calls the interface provided by the Camera X to send an image processing request to the framework layer.
  • the hardware abstraction layer includes a first hardware abstraction layer and a second hardware abstraction layer
  • the processor of the electronic device is also used to execute the computer program in the memory, so as to implement the following steps through the camera capability interface module, framework layer, hardware abstraction layer, and application layer provided with third-party applications:
  • the framework layer sends the image processing request to the first hardware abstraction layer
  • the first hardware abstraction layer obtains the target image processing policy corresponding to the image processing request, generates a second image processing request according to the image processing request and the target image processing policy, and sends it to the second hardware abstraction layer;
  • the second hardware abstraction layer acquires an image to be processed according to the second image processing request
  • the second hardware abstraction layer sends the image to be processed to the first hardware abstraction layer
  • the first hardware abstraction layer processes the image to be processed according to the target processing strategy to obtain a target image.
  • the processor of the electronic device is also used to execute the computer program in the memory, so as to implement the following steps through the camera capability interface module, the framework layer, the hardware abstraction layer, and the application layer provided with third-party applications:
  • the second hardware abstraction layer calls the camera to collect the original image through the camera driver
  • the second hardware abstraction layer receives the original image from the camera, and determines the original image as the image to be processed.
  • Fig. 7 is a block diagram of an electronic device according to an exemplary embodiment.
  • the electronic device 300 may be a terminal device such as a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or a server.
  • electronic device 300 may include one or more of the following components: processing component 302, memory 304, power component 306, multimedia component 308, audio component 310, input/output (I/O) interface 312, sensor component 314, and communication component 316.
  • the processing component 302 generally controls the overall operations of the electronic device 300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • the processing component 302 may include one or more processors 320 to execute instructions to complete all or part of the steps of the above-mentioned image processing method.
  • processing component 302 may include one or more modules that facilitate interaction between processing component 302 and other components.
  • processing component 302 may include a multimedia module to facilitate interaction between multimedia component 308 and processing component 302 .
  • the memory 304 is configured to store various types of data to support operations at the electronic device 300 . Examples of such data include instructions for any application or method operating on the electronic device 300, contact data, phonebook data, messages, pictures, videos, and the like.
  • the memory 304 can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • the power component 306 provides power to various components of the electronic device 300 .
  • Power components 306 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for electronic device 300 .
  • the multimedia component 308 includes a screen that provides an output interface between the electronic device 300 and the user.
  • the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user.
  • the touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may not only sense a boundary of a touch or a swipe action, but also detect duration and pressure associated with the touch or swipe operation.
  • the multimedia component 308 includes a front camera and/or a rear camera. When the electronic device 300 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
  • the audio component 310 is configured to output and/or input audio signals.
  • the audio component 310 includes a microphone (MIC), which is configured to receive external audio signals when the electronic device 300 is in operation modes, such as call mode, recording mode and voice recognition mode. Received audio signals may be further stored in memory 304 or sent via communication component 316 .
  • the audio component 310 also includes a speaker for outputting audio signals.
  • the I/O interface 312 provides an interface between the processing component 302 and a peripheral interface module, which may be a keyboard, a click wheel, a button, and the like. These buttons may include, but are not limited to: a home button, volume buttons, start button, and lock button.
  • Sensor assembly 314 includes one or more sensors for providing various aspects of status assessment for electronic device 300 .
  • the sensor component 314 can detect the open/closed state of the electronic device 300 and the relative positioning of components, such as the display and keypad of the electronic device 300; the sensor component 314 can also detect a change in position of the electronic device 300 or a component thereof, the presence or absence of user contact with the electronic device 300, the orientation or acceleration/deceleration of the electronic device 300, and temperature changes of the electronic device 300.
  • the sensor assembly 314 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact.
  • Sensor assembly 314 may also include an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor component 314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
  • the communication component 316 is configured to facilitate wired or wireless communication between the electronic device 300 and other devices.
  • the electronic device 300 can access a wireless network based on communication standards, such as WiFi, 2G or 3G, or a combination thereof.
  • the communication component 316 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
  • communication component 316 also includes a near field communication (NFC) module to facilitate short-range communication.
  • the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
  • electronic device 300 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the image processing method described above.
  • non-transitory computer-readable storage medium including instructions, such as the memory 304 including instructions, which can be executed by the processor 320 of the electronic device 300 to implement the above image processing method.
  • the non-transitory computer readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
  • the present disclosure also proposes a computer program, including computer-readable code, which, when the computer-readable code is run on a computing processing device, causes the computing processing device to execute the aforementioned image processing method.
  • FIG. 8 is a schematic structural diagram of a computing processing device provided by an embodiment of the present disclosure.
  • the computing processing device typically includes a processor 1110 and a computer program product or computer readable medium in the form of memory 1130 .
  • Memory 1130 may be electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the memory 1130 has a storage space 1150 for program code 1151 for performing any method steps in the methods described above.
  • the storage space 1150 for program codes may include respective program codes 1151 for respectively implementing various steps in the above methods. These program codes can be read from or written into one or more computer program products.
  • These computer program products comprise program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks.
  • Such a computer program product is typically a portable or fixed storage unit as shown in FIG. 9 .
  • the storage unit may have storage segments, storage spaces, etc. arranged similarly to the storage 1130 in the server of FIG. 8 .
  • the program code can, for example, be compressed in a suitable form.
  • the storage unit includes computer readable code 1151', i.e. code readable by, for example, a processor such as 1110, which when executed by the server causes the server to perform the various steps in the methods described above.
  • first and second are used for descriptive purposes only, and cannot be interpreted as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features.
  • the features defined as “first” and “second” may explicitly or implicitly include at least one of these features.
  • “plurality” means at least two, such as two, three, etc., unless otherwise specifically defined.
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate or transmit a program for use in or in conjunction with an instruction execution system, device or device.
  • computer-readable media include the following: electrical connection with one or more wires (electronic device), portable computer disk case (magnetic device), random access memory (RAM), Read Only Memory (ROM), Erasable and Editable Read Only Memory (EPROM or Flash Memory), Fiber Optic Devices, and Portable Compact Disc Read Only Memory (CDROM).
  • the computer-readable medium may even be paper or another suitable medium on which the program is printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting or otherwise processing it as necessary, and then stored in computer memory.
  • various parts of the present disclosure may be implemented in hardware, software, firmware or a combination thereof.
  • various steps or methods may be implemented by software or firmware stored in memory and executed by a suitable instruction execution system.
  • if implemented in hardware, as in another embodiment, the steps or methods may be implemented by any one of, or a combination of, the following techniques known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGAs), field programmable gate arrays (FPGAs), and so on.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing module, each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. If the integrated modules are implemented in the form of software function modules and sold or used as independent products, they can also be stored in a computer-readable storage medium.
  • the storage medium mentioned above may be a read-only memory, a magnetic disk or an optical disk, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Television Signal Processing For Recording (AREA)
  • Telephonic Communication Services (AREA)

Abstract

An image processing method and apparatus, an electronic device and a storage medium are provided. The method includes: in response to a first operation on a third-party application, the third-party application calls an interface provided by a target camera capability interface module to send an image processing request to a framework layer (S110); the framework layer sends the image processing request to a hardware abstraction layer (S120); and the hardware abstraction layer processes a to-be-processed image according to a target processing strategy corresponding to the image processing request to obtain a target image, and sends the target image to the third-party application, the target processing strategy being a processing strategy supported by a native camera application (S130).

Description

图像处理方法、装置、电子设备及存储介质 技术领域
本公开涉及电子设备技术领域,尤其涉及一种图像处理方法、装置、电子设备及存储介质。
背景技术
目前手机的图像处理技术较为成熟,结合硬件与算法,大多数手机都能够实现HDR、人像、夜景拍摄等功能,图像处理方法十分多样。
然而,HDR、人像、夜景拍摄等功能对应的算法属于相机底层的硬件/软件算法,只有手机自带的原生相机应用可以使用,第三方应用无法访问。
发明内容
为克服相关技术中存在的问题,本公开提供一种图像处理方法、装置、电子设备及存储介质。
根据本公开实施例的第一方面,提供一种图像处理方法,应用于电子设备,所述电子设备操作系统包括相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层,该方法包括:
响应于对第三方应用的第一操作,所述第三方应用调用目标相机能力接口模块提供的接口向所述框架层发送图像处理请求;
所述框架层将所述图像处理请求发送到所述硬件抽象层;
所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将所述目标图像发送给所述第三方应用,所述目标处理策略为原生相机应用支持的处理策略。
在一些实施方式中,所述图像处理请求携带目标任务标识,所述目标任务标识为原生相机应用支持的拍摄功能对应的标识,所述硬件抽象层根据目标处理策略对待处理图像进行处理,包括:
所述硬件抽象层根据与所述目标任务标识对应的目标处理策略,对所述待处理图像进行处理。
在一些实施方式中,所述框架层将所述图像处理请求发送到所述硬件抽象层,包括:
所述框架层对所述图像处理请求进行封装,得到第一图像处理请求;
所述框架层将所述第一图像处理请求发送到所述硬件抽象层;
所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,包括:
所述硬件抽象层根据与所述第一图像处理请求对应的目标处理策略对所述待处理图像进行处理。
在一些实施方式中,所述目标处理策略包括目标硬件处理策略、目标软件处理策略以及策略执行顺序,所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,包括:
所述硬件抽象层按照所述策略执行顺序,根据所述目标硬件处理策略以及所述目标软件处理策略对所述待处理图像进行处理,得到所述目标图像。
在一些实施方式中,所述响应于对第三方应用的第一操作,包括:
响应于对所述第三方应用的界面中的目标功能控件的触发操作。
在一些实施方式中,所述相机能力接口模块包括Camera X以及相机软件开发工具包,所述Camera X设置于所述框架层,所述相机软件开发工具包设置于所述应用层。
在一些实施方式中,所述方法还包括:
在所述目标相机能力接口模块为Camera X的情况下,所述第三方应用调用所述Camera X提供的接口向所述框架层发送查询请求,所述查询请求用于查询所述框架层对所述图像处理请求的处理能力;
所述第三方应用通过所述Camera X提供的接口接收来自于所述框架层的查询结果,并响应于所述框架层能够对所述图像处理请求进行处理的查询结果,调用所述Camera X提供的接口向所述框架层发送图像处理请求。
在一些实施方式中,所述硬件抽象层包括第一硬件抽象层以及第二硬件抽象层,所述框架层将所述图像处理请求发送到硬件抽象层,包括:
所述框架层将所述图像处理请求发送到所述第一硬件抽象层;
所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,包括:
所述第一硬件抽象层获取所述图像处理请求对应的目标图像处理策略,以及根据所述图像处理请求以及所述目标图像处理策略生成第二图像处理请求,并发送到所述第二 硬件抽象层;
所述第二硬件抽象层根据所述第二图像处理请求获取待处理图像;
所述第二硬件抽象层将所述待处理图像发送到所述第一硬件抽象层;
所述第一硬件抽象层根据所述目标处理策略,对所述待处理图像进行处理,得到目标图像。
在一些实施方式中,所述方法还包括:
所述第二硬件抽象层通过相机驱动调用摄像头采集原始图像;
所述第二硬件抽象层接收来自所述摄像头的所述原始图像,并将所述原始图像确定为所述待处理图像。
根据本公开实施例的第二方面,提供一种图像处理装置,应用于电子设备,所述电子设备操作系统包括相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层,所述装置包括:
第一发送模块,被配置为响应于对第三方应用的第一操作,所述第三方应用调用目标相机能力接口模块提供的接口向所述框架层发送图像处理请求;
第二发送模块,被配置为所述框架层将所述图像处理请求发送到所述硬件抽象层;
目标图像生成模块,被配置为所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将所述目标图像发送给所述第三方应用,所述目标处理策略为原生相机应用支持的处理策略。
在一些实施方式中,所述图像处理请求携带目标任务标识,所述目标任务标识为原生相机应用支持的拍摄功能对应的标识,目标图像生成模块包括:
第一处理子模块,被配置为所述硬件抽象层根据与所述目标任务标识对应的目标处理策略,对所述待处理图像进行处理。
在一些实施方式中,第二发送模块,包括:
第一发送子模块,被配置为所述框架层对所述图像处理请求进行封装,得到第一图像处理请求;所述框架层将所述第一图像处理请求发送到所述硬件抽象层。这种情况下,目标图像生成模块,包括:
第二处理子模块,被配置为所述硬件抽象层根据与所述第一图像处理请求对应的目标处理策略对所述待处理图像进行处理。
在一些实施方式中,所述目标处理策略包括目标硬件处理策略、目标软件处理策略 以及策略执行顺序,目标图像生成模块,包括:
第三处理子模块,被配置为所述硬件抽象层按照所述策略执行顺序,根据所述目标硬件处理策略以及所述目标软件处理策略对所述待处理图像进行处理,得到所述目标图像。
在一些实施方式中,所述第一发送模块,包括:
第二发送子模块,被配置为响应于对所述第三方应用的界面中的目标功能控件的触发操作。
在一些实施方式中,所述相机能力接口模块包括Camera X以及相机软件开发工具包,所述Camera X设置于所述框架层,所述相机软件开发工具包设置于所述应用层。
在一些实施方式中,装置还包括:
查询模块,被配置为在所述目标相机能力接口模块为Camera X的情况下,所述第三方应用调用所述Camera X提供的接口向所述框架层发送查询请求,所述查询请求用于查询所述框架层对所述图像处理请求的处理能力;
响应模块,被配置为所述第三方应用通过所述Camera X提供的接口接收来自于所述框架层的查询结果,并响应于所述框架层能够对所述图像处理请求进行处理的查询结果,调用所述Camera X提供的接口向所述框架层发送图像处理请求。
在一些实施方式中,所述硬件抽象层包括第一硬件抽象层以及第二硬件抽象层,第二发送模块,包括:第三发送子模块,被配置为所述框架层将所述图像处理请求发送到所述第一硬件抽象层。这种情况下,目标图像生成模块,包括:目标图像生成子模块,被配置为所述第一硬件抽象层获取所述图像处理请求对应的目标图像处理策略,以及根据所述图像处理请求以及所述目标图像处理策略生成第二图像处理请求,并发送到所述第二硬件抽象层;所述第二硬件抽象层根据所述第二图像处理请求获取待处理图像;所述第二硬件抽象层将所述待处理图像发送到所述第一硬件抽象层;所述第一硬件抽象层根据所述目标处理策略,对所述待处理图像进行处理,得到目标图像。
在一些实施方式中,装置还包括:
图像采集模块,被配置为所述第二硬件抽象层通过相机驱动调用摄像头采集原始图像。
待处理图像确定模块,被配置为所述第二硬件抽象层接收来自所述摄像头的所述原始图像,并将所述原始图像确定为所述待处理图像。
根据本公开实施例的第三方面,提供一种计算机可读存储介质,其上存储有计算机程序指令,该程序指令被处理器执行时实现本公开第一方面所提供的图像处理方法的步骤。
根据本公开实施例的第四方面，提供一种电子设备，所述电子设备的操作系统包括相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层，所述电子设备包括：存储器，其上存储有计算机程序；处理器，用于执行存储器中的计算机程序，以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤：
响应于对第三方应用的第一操作,所述第三方应用调用目标相机能力接口模块提供的接口向所述框架层发送图像处理请求;
所述框架层将所述图像处理请求发送到所述硬件抽象层;
所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将所述目标图像发送给所述第三方应用,所述目标处理策略为原生相机应用支持的处理策略。
根据本公开实施例的第五方面,提供一种计算机程序,包括计算机可读代码,当所述计算机可读代码在计算处理设备上运行时,导致所述计算处理设备执行本公开第一方面实施例所提出的图像处理方法。
本公开的实施例提供的技术方案可以包括以下有益效果:第三方应用调用所述相机能力接口模块提供的接口向所述框架层发送图像处理请求;然后所述框架层将所述图像处理请求发送到所述硬件抽象层;最后所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将所述目标图像发送给所述第三方应用。可见,通过相机能力接口模块以及框架层可以将图像处理请求发送到硬件抽象层,从而使得硬件抽象层能够调用原生相机支持的处理策略来对第三方应用发起的图像处理请求进行处理,实现了向第三方应用开放相机底层硬件/软件算法的功能。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本公开。
附图说明
本公开上述的和/或附加的方面和优点从下面结合附图对实施例的描述中将变得明显和容易理解,其中:
图1是根据本公开一示例性实施例示出的一种电子设备操作系统的结构示意图。
图2是根据本公开一示例性实施例示出的一种图像处理方法的流程图。
图3是根据本公开一实施例提出的一种第三方应用的界面示意图。
图4是根据本公开一实施例提出的另一种电子设备操作系统的结构示意图。
图5是根据本公开一示例性实施例示出的另一种图像处理方法的流程图。
图6是根据本公开一示例性实施例示出的一种图像处理装置的结构框图。
图7是根据本公开一示例性实施例示出的一种电子设备的框图。
图8为本公开实施例提供了一种计算处理设备的结构示意图。
图9为本公开实施例提供了一种用于便携式或者固定实现根据本发明的方法的程序代码的存储单元的示意图。
具体实施方式
这里将详细地对示例性实施例进行说明,其示例表示在附图中。下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下示例性实施例中所描述的实施方式并不代表与本公开相一致的所有实施方式。相反,它们仅是与如所附权利要求书中所详述的、本公开的一些方面相一致的装置和方法的例子。
请参阅图1,图1是本公开实施例提供的一种电子设备的结构示意图,电子设备包括操作系统,该操作系统可以是安卓系统,如图1所示,该操作系统包括相机能力接口模块、应用层、框架层、硬件抽象层、内核层和硬件层。
其中,应用层设置有第三方应用以及终端的原生相机应用。框架层中包括各种原生应用接口,例如相机应用接口,应用服务(例如相机应用服务)、框架层接口,此外,框架层中还设置有相机能力拓展模块。硬件抽象层包括硬件抽象层接口(例如:HAL3.0),此外,硬件抽象层中还设置有图像处理策略匹配模块以及图像处理策略模块。内核层包括各种驱动(例如屏幕Display驱动、音频Audio驱动等)。硬件层包括各种硬件(例如图像信号处理器ISP+,前端图像传感器sensors等)。
下面对本申请实施例进行详细介绍。
图2是根据一示例性实施例示出的一种图像处理方法的流程图,该方法可以应用于电子设备,电子设备可以包括手机、笔记本、平板电脑、智能穿戴设备等。该方法包括:
S110,响应于对第三方应用的第一操作,第三方应用调用目标相机能力接口模块提供的接口向框架层发送图像处理请求。
本公开实施例中,电子设备可以包括如图1所示的操作系统,操作系统包括相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层。本公开实施例中,第三方应用可以是具有相机拍摄功能的应用。例如,可以是第三方相机应用,也可以是微信、QQ、支付宝等具有相机功能的应用。
因此,在一些实施方式中,第一操作可以是用于触发对三方应用的启动,如点击应用图标启动,或者语音启动。
在另一些实施方式中,第一操作还可以是用于触发第三方应用的相机功能的启动,例如进行拍摄界面。这种情况下,响应于对第三方应用的第一操作,包括步骤:响应于对第三方应用的界面中的目标功能控件的触发操作。
示例性地,在微信应用中,可以对聊天界面中的功能选择区中的拍照控件的触发操作,从而使得第三方应用可以调用目标相机能力接口模块提供的接口向框架层发送图像处理请求。
本公开实施例中,相机能力接口模块可以提供第三方应用与框架层之间进行数据传输的接口,从而通过该接口,第三方应用便能够将图像处理请求发送到框架层。
在一些实施方式中,相机能力接口模块可以是Camera X,Camera X也称为Android Camera X,是一种Google提供的相机能力开放渠道。
本公开实施例中,可以在操作系统中集成Camera X。在一些实施方式中,Camera X可以设置于框架层中。
其中,Camera X提供了多种标准图像处理任务的接口,图像处理请求可以是针对Camera X提供的多种标准图像处理任务中的一种任务对应的处理请求。
在另一些实施方式中,相机能力接口模块还可以是相机软件开发工具包(Camera SDK),同样地,相机软件开发工具包也可以提供第三方应用到框架层的接口,从而通过该接口,第三方应用可以向框架层发送图像处理请求。
在一些实施方式中,相机软件开发工具包可以设置于应用层,即运行在第三方应用中。
S120,框架层将图像处理请求发送到硬件抽象层。
本公开实施例中,框架层在接收到图像处理请求之后,可以将图像处理请求发送到硬件抽象层。
S130，硬件抽象层根据与图像处理请求对应的目标处理策略对待处理图像进行处理，得到目标图像，并将目标图像发送给第三方应用，目标处理策略为原生相机应用支持的处理策略。
本公开实施例中,硬件抽象层在接收到框架层发送的图像处理请求之后,可以先获取与图像处理请求对应的目标处理策略,然后根据目标处理策略对待处理图像进行处理,得到目标图像。接着,硬件抽象层便可以将目标图像发送到第三方应用。
采用上述方法,第三方应用调用相机能力接口模块提供的接口向框架层发送图像处理请求;然后框架层将图像处理请求发送到硬件抽象层;最后硬件抽象层根据与图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将目标图像发送给第三方应用。可见,通过相机能力接口模块以及框架层可以将图像处理请求发送到硬件抽象层,从而使得硬件抽象层能够调用原生相机支持的处理策略来对第三方应用发起的图像处理请求进行处理,实现了向第三方应用开放相机底层硬件/软件算法处理策略的功能。
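The three-step flow above (S110 → S120 → S130) can be sketched as a minimal Python model. This is a hedged illustration only: all class and method names below are invented stand-ins for the layers described in this disclosure, not real Android or CameraX APIs.

```python
# Minimal model of the request flow: app -> framework layer -> HAL.
# Names are illustrative, not actual Android classes.

class HardwareAbstractionLayer:
    def __init__(self, strategies):
        # strategies maps a request's task to a native-camera processing function
        self.strategies = strategies

    def handle(self, request, image):
        # S130: pick the target processing strategy for the request,
        # process the to-be-processed image and return the target image
        strategy = self.strategies[request["task"]]
        return strategy(image)

class FrameworkLayer:
    def __init__(self, hal):
        self.hal = hal

    def forward(self, request, image):
        # S120: the framework layer passes the request down to the HAL
        return self.hal.handle(request, image)

class ThirdPartyApp:
    def __init__(self, framework):
        self.framework = framework

    def on_first_operation(self, image):
        # S110: the first operation triggers an image processing request
        request = {"task": "hdr"}
        # the HAL's result (the target image) comes back to the app
        return self.framework.forward(request, image)

hal = HardwareAbstractionLayer({"hdr": lambda img: f"hdr({img})"})
app = ThirdPartyApp(FrameworkLayer(hal))
print(app.on_first_operation("raw"))  # hdr(raw)
```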
此外,考虑到相机能力接口模块提供的功能接口较多,因此,为了能够准确调用相机能力接口模块提供的接口,以便于准确调用处理策略对图像进行处理,在一些实施方式中,图像处理请求可以携带有目标任务标识,目标任务标识即为第三方应用想要调用的原生相机应用底层硬件/软件算法所支持的拍摄功能对应的标识,这种情况下,硬件抽象层根据目标处理策略对待处理图像进行处理,包括步骤:硬件抽象层根据与目标任务标识对应的目标处理策略,对待处理图像进行处理。
本公开实施例中,可以在图像处理请求中分配一个字段来存储目标任务标识,以便于第三方应用能够准确调用相机能力接口模块提供的功能接口,从而将图像处理请求经框架层发送到硬件抽象层,进而使得硬件抽象层在接收到图像处理请求之后,可以准确调用与目标任务标识对应的目标处理策略对待处理图像进行处理。
在一些实施方式中,继续参考图1,可以通过硬件抽象层中的图像处理策略匹配模块来确定目标任务标识对应的目标处理策略。
示例性地,目标任务标识可以包括HDR功能、美颜功能、虚化功能、超级夜景功能、自动化处理功能等对应的标识。
示例性地，当第三方应用想要实现虚化功能时，第三方应用可以生成虚化功能对应的图像处理请求，该图像处理请求可以携带有虚化功能对应的标识。此时，第三方应用便可以根据虚化功能对应的标识调用相机能力接口模块提供的虚化功能对应的接口，向框架层发送图像处理请求。
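The strategy lookup keyed by the target task identifier can be modeled as a simple dispatch table. The identifiers and pipeline names below are hypothetical, chosen only to mirror the HDR/bokeh/night functions mentioned above.

```python
# Hypothetical dispatch table: target task identifier -> processing strategy.
# Both the identifiers and the strategy names are illustrative.
STRATEGY_TABLE = {
    "TASK_HDR": "hdr_pipeline",
    "TASK_BOKEH": "bokeh_pipeline",
    "TASK_NIGHT": "night_pipeline",
}

def match_strategy(request):
    """Return the strategy matching the task id carried in the request."""
    task_id = request["task_id"]
    if task_id not in STRATEGY_TABLE:
        # requests whose task id the native camera does not support are rejected
        raise ValueError(f"unsupported task id: {task_id}")
    return STRATEGY_TABLE[task_id]

print(match_strategy({"task_id": "TASK_BOKEH"}))  # bokeh_pipeline
```

Reserving one field of the request for the task identifier, as described above, is what lets the HAL resolve the strategy without inspecting the image itself.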
此外，考虑到相机能力接口模块的接口不能直接被硬件抽象层识别，因此，为了使得硬件抽象层能够对接收到的图像处理请求进行处理，在一些实施方式中，框架层将图像处理请求发送到硬件抽象层，包括：框架层对图像处理请求进行封装，得到第一图像处理请求；框架层将第一图像处理请求发送到硬件抽象层。这种情况下，硬件抽象层根据与图像处理请求对应的目标处理策略对待处理图像进行处理，包括：硬件抽象层根据与第一图像处理请求对应的目标处理策略对待处理图像进行处理。
本公开实施例中,可以先经过框架层对图像处理请求进行解析,然后,框架层再按照硬件抽象层能够识别的请求类型对解析得到的内容进行封装,得到第一图像处理请求,接着框架层再将第一图像处理请求发送到硬件抽象层,如此,硬件抽象层便能够对接收到的第一图像处理请求进行解析,从而获取到与第一图像处理请求对应的目标处理策略,并根据目标处理策略对待处理图像进行处理。
在一些实施方式中,继续参考图1,可以通过框架层中的相机能力拓展模块对第一图像处理请求进行解析以及封装。
在一些实施方式中,框架层在对解析得到的内容进行封装,得到第一图像处理请求之后,可以通过安卓原生信息链路,即依次通过相机应用接口、框架层接口以及硬件抽象层接口将第一图像处理请求发送到硬件抽象层。
需要说明的是,虽然在框架层对图像处理请求进行了解析以及封装的过程,得到了第一图像处理请求,但是第一图像处理请求以及图像处理请求所要请求的调用的相机底层硬件/软件算法处理策略是相同的,也即,框架层是通过不同的协议对图像处理请求进行解析封装,而未改变请求的实质内容。例如,目标任务标识、功能执行参数等实质内容未发生改变,因此,硬件抽象层最终调用的目标处理策略也不改变。
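Because the framework layer only re-wraps the request under a protocol the HAL can read, the substantive fields (task identifier, execution parameters) pass through unchanged. A minimal sketch of that invariant, with invented field names:

```python
# Sketch: the framework layer parses the incoming request and re-wraps it in
# a HAL-readable envelope without changing the substantive content.
# "protocol" and "payload" are illustrative field names, not a real format.

def encapsulate_for_hal(request):
    return {
        "protocol": "hal_envelope",   # envelope changes between layers
        "payload": dict(request),     # task id and parameters pass through intact
    }

req = {"task_id": "TASK_NIGHT", "params": {"exposure_frames": 5}}
wrapped = encapsulate_for_hal(req)
print(wrapped["payload"] == req)  # True: the substance is preserved
```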
可以理解的是，应用层可以向用户提供对应应用程序的界面，示例性地，请参阅图3，示出了一种第三方应用的相机功能界面，界面中包括图像预览区域10，以及分布在图像预览区域10附近的任务功能选择控件20，其中包括原本属于原生相机应用的拍照功能控件，例如拍照功能1控件、拍照功能2控件、拍照功能3控件等。由于可以应用本公开实施例提供的操作系统架构以及图像处理方法，因此，同样可以在第三方应用中使用原本属于原生相机应用的任务功能。例如，HDR、美颜、虚化、超级夜景、自动化等功能。
在一些实施方式中，在响应于对第三方应用的第一操作，使得应用层向用户展示如图3所示的界面的情况下，电子设备可以继续响应于对第三方应用的相机功能界面中的目标拍照功能控件的触发操作，生成图像处理请求。
其中,目标拍照功能控件则可以理解为原本属于原生相机应用的拍照功能控件。
由于本公开实施例的方法实现了向第三方应用开放相机底层硬件/软件算法的功能,因此,可以在第三方应用的界面中显示目标拍照功能控件,以便于用户进行触发,以实现用户想要的原生相机的功能。进而,当用户在触发了目标拍照功能控件之后,第三方应用便可以生成图像处理请求。
此外,考虑到对待处理图像完成某种处理任务,可能需要先后经过多种硬件或者软件处理策略的配合,因此,在一些实施方式中,目标处理策略包括目标硬件处理策略、目标软件处理策略以及策略执行顺序,在这种情况下,硬件抽象层根据与图像处理请求对应的目标处理策略对待处理图像进行处理,可以包括步骤:
硬件抽象层按照策略执行顺序,根据目标硬件处理策略以及目标软件处理策略对待处理图像进行处理,得到目标图像。
其中,硬件处理策略可以理解为利用硬件层提供的硬件算法进行处理的策略。这种情况下,硬件抽象层根据目标硬件处理策略对待处理图像进行处理,可以包括步骤:
硬件抽象层将目标硬件处理策略发送到硬件层;硬件层根据目标硬件处理策略对待处理图像进行处理。
在一些实施方式中,硬件抽象层可以将目标处理策略进行封装,生成新的图像处理请求,然后将新的图像处理请求发送到内核层处理,从而内核层可以通过对应的驱动调用硬件层中对应的硬件,使得对应的硬件根据目标硬件算法对待处理图像进行处理。其中,目标硬件算法是目标任务标识对应的硬件算法。
同样地,此处硬件抽象层对目标硬件处理策略进行封装也是根据不同层之间的协议进行封装,而不会改变目标硬件处理策略的内容。
其中,软件处理策略可以理解为利用硬件抽象层提供的软件算法进行处理的策略。在一些实施方式中,硬件抽象层根据目标软件处理策略对待处理图像进行处理,可以是硬件抽象层调用自身存储的目标软件算法对待处理图像进行处理。其中,目标软件算法是目标任务标识对应的软件算法。
在一些实施方式中,继续参考图1,可以通过硬件抽象层中的图像处理策略模块来根据目标软件处理策略对待处理图像进行处理。
需要说明的是，上述过程仅单独针对硬件抽象层根据目标硬件处理策略对待处理图像进行处理的过程，以及单独针对硬件抽象层根据目标软件处理策略对待处理图像进行处理的过程进行了详细描述。
在一些实施方式中，硬件抽象层根据目标硬件处理策略对待处理图像进行处理的过程，以及硬件抽象层根据目标软件处理策略对待处理图像进行处理的过程可以是先后执行或者交替执行的。例如，先对待处理图像经过硬件层的硬件算法进行处理，再对处理后的中间图像经过硬件抽象层的软件算法进行处理，从而得到目标图像。又例如，先对待处理图像经过硬件抽象层的软件算法进行处理，再对处理后的中间图像经过硬件层的硬件算法进行处理，从而得到目标图像。
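The ordered combination of hardware and software strategies can be sketched as a pipeline driven by the execution order carried in the target processing strategy. The stage functions below are placeholders, not real ISP or HAL algorithms.

```python
# Sketch: run the target processing strategy as an ordered pipeline of
# hardware-layer and software (HAL) stages, per its execution order.

def run_strategy(image, strategy):
    stages = {"hw": strategy["hw"], "sw": strategy["sw"]}
    for kind in strategy["order"]:   # e.g. ["hw", "sw"] or ["sw", "hw"]
        image = stages[kind](image)  # each stage yields an intermediate image
    return image                     # final result is the target image

strategy = {
    "hw": lambda img: img + "->isp",     # hardware-layer algorithm (placeholder)
    "sw": lambda img: img + "->beauty",  # HAL software algorithm (placeholder)
    "order": ["hw", "sw"],
}
print(run_strategy("raw", strategy))  # raw->isp->beauty
```

Swapping `order` to `["sw", "hw"]` models the reverse case described above, where the HAL's software algorithm runs before the hardware layer's.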
因此,在一些实施方式中,上述硬件算法或者软件算法针对的待处理图像可以是摄像头采集的原始图像,这种情况下,本公开实施例的方法还可以包括以下步骤:
硬件抽象层通过相机驱动调用摄像头采集原始图像;硬件抽象层接收来自摄像头的原始图像,并将原始图像确定为待处理图像。
需要说明的是,本公开实施例中,硬件抽象层通过相机驱动调用摄像头采集原始图像的过程与硬件层根据目标硬件处理策略对待处理图像进行处理的过程不同。前者是生成原始图像的过程,后者是根据目标硬件处理策略对待处理图像进行处理的过程。
本公开实施例中,可以通过摄像头采集图像,并发送到硬件抽象层作为待处理图像。
此外,在另一些实施方式中,上述硬件算法或者软件算法针对的待处理图像还可以是来自第三方应用发送的网络图像。
可见,本公开实施例中,待处理图像可以是通过相机驱动调用摄像头采集的原始图像,也可以是来自第三方应用发送的网络图像,提高了图像数据的来源多样性。
在另一些实施方式中,上述硬件算法或者软件算法针对的待处理图像也可以是摄像头采集的原始图像先经过硬件算法或者软件算法处理后,得到的中间图像。
此外，考虑到各个平台，例如高通、天玑等平台的芯片已经具有硬件抽象层，对于第三方电子设备厂家，不方便进行修改，因此，为了使得硬件抽象层可以实现根据目标处理策略对待处理图像进行处理的过程，在一些实施方式中，硬件抽象层包括第一硬件抽象层以及第二硬件抽象层，其中，第一硬件抽象层可以理解为电子设备厂商新增的硬件抽象层，第二硬件抽象层可以理解为平台提供的硬件抽象层。这种情况下，框架层将图像处理请求发送到硬件抽象层，包括步骤：框架层将图像处理请求发送到第一硬件抽象层；这种情况下，硬件抽象层根据与图像处理请求对应的目标处理策略对待处理图像进行处理，得到目标图像，包括步骤：第一硬件抽象层获取图像处理请求对应的目标图像处理策略，以及根据图像处理请求以及目标图像处理策略生成第二图像处理请求，并发送到第二硬件抽象层；第二硬件抽象层根据第二图像处理请求获取待处理图像；第二硬件抽象层将待处理图像发送到第一硬件抽象层；第一硬件抽象层根据目标处理策略，对待处理图像进行处理，得到目标图像。
本公开实施例中,为了实现向第三方应用开放原生相机应用的功能,可以通过第一硬件抽象层接收框架层发送的图像处理请求,然后第一硬件抽象层获取图像处理请求对应的目标图像处理策略,接着第一硬件抽象层可以根据图像处理请求以及目标图像处理策略生成第二图像处理请求,并发送到第二硬件抽象层,第二硬件抽象层在接收到第二图像处理请求之后,可以根据第二图像处理请求获取待处理图像,然后第二硬件抽象层便可以将待处理图像发送到第一硬件抽象层,以使第一硬件抽象层根据目标处理策略,对待处理图像进行处理,得到目标图像。
在一些实施方式中,第一硬件抽象层接收框架层发送的图像处理请求可以是前述的第一图像处理请求。
在一些实施方式中,第一硬件抽象层可以根据图像处理请求携带的目标任务标识获取图像处理请求对应的目标图像处理策略。
在一些实施方式中,第二硬件抽象层根据第二图像处理请求获取的待处理图像可以是调用相机驱动通过摄像头采集的原始图像,也可以是来自第三方应用发送的网络图像,还可以是经过硬件层采用目标硬件处理策略进行处理后的图像。
请参考图4,示出了本公开实施例提供的一种电子设备的结构示意图,如图4所示,电子设备操作系统中的硬件抽象层可以包括第一硬件抽象层以及第二硬件抽象层,其中,第一硬件抽象层中设置有第一硬件抽象层接口(例如:HAL3.0)、图像处理策略匹配模块以及图像处理策略模块,其中,图像处理策略模块包括多种原生相机应用底层处理算法,第二硬件抽象层中设置有第二硬件抽象层接口(例如:HAL3.0),此外,在图4中,相机能力接口模块为Camera X。
本公开实施例中，通过设置第一硬件抽象层以及第二硬件抽象层，使用了跨平台设计，在平台更换或者升级的时候，依然能够通过第一硬件抽象层结合任意一种平台提供的第二硬件抽象层实现本公开实施例的图像处理方法，提高了本公开实施例的图像处理方法的适用范围。
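The two-HAL split can be sketched as follows: a vendor-added first HAL matches the strategy, builds a second request, asks the platform's second HAL for the image to process, and then applies the strategy itself. The classes below are illustrative models, not actual vendor or chipset HAL interfaces.

```python
# Sketch of the dual-HAL delegation described above. All names are invented.

class SecondHAL:
    """Platform-provided HAL: supplies the to-be-processed image."""
    def acquire_image(self, request):
        # e.g. captured via the camera driver, or taken from an app-supplied image
        return "raw_frame"

class FirstHAL:
    """Vendor-added HAL: matches the strategy and produces the target image."""
    def __init__(self, second_hal, strategies):
        self.second_hal = second_hal
        self.strategies = strategies

    def handle(self, request):
        strategy = self.strategies[request["task_id"]]
        # build the second image processing request and delegate acquisition
        second_request = {"task_id": request["task_id"], "strategy": strategy}
        image = self.second_hal.acquire_image(second_request)
        return strategy(image)  # apply the target processing strategy

hal = FirstHAL(SecondHAL(), {"TASK_HDR": lambda img: f"hdr({img})"})
print(hal.handle({"task_id": "TASK_HDR"}))  # hdr(raw_frame)
```

Because only `SecondHAL` is platform-specific in this model, swapping chipset platforms leaves `FirstHAL` untouched, which is the cross-platform benefit the paragraph above describes.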
此外,结合前述内容可知,Camera X提供了标准接口,可以通过Camera X提供的标准接口访问框架层,然而,考虑到一些电子设备并不希望向第三方应用开放原生相机底层硬件/软件算法,因此,这种情况下,电子设备的框架层不能够对图像处理请求进行处理,并向硬件抽象层发送处理后的第一图像处理请求。因此,为了使得第三方应用能够向框架层发送第一图像处理请求,在一些实施方式中,本公开实施例的方法还包括步骤:在接口标识对应的目标相机能力接口模块为Camera X的情况下,第三方应用调用Camera X提供的接口向框架层发送查询请求,查询请求用于查询框架层对图像处理请求的处理能力;第三方应用通过Camera X提供的接口接收来自于框架层的查询结果,并响应于框架层能够对图像处理请求进行处理的查询结果,调用Camera X提供的接口向框架层发送图像处理请求。
本公开实施例中,Camera X可以提供查询接口,以使第三方应用可以通过该接口向框架层发送查询请求,以查询框架层是否能够对来自于第三方应用的图像处理请求进行处理,若框架层向第三方应用返回了表征能够对图像处理请求进行处理的查询结果,则第三方应用便可以调用Camera X提供的接口向框架层发送图像处理请求。
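The query-then-request handshake can be sketched as below: the app first asks the framework whether it can handle the request, and only sends the image processing request on a positive result. This is a pure-Python model of the flow, not the actual CameraX Extensions API.

```python
# Sketch of the capability query described above. Names are illustrative.

class Framework:
    def __init__(self, supported):
        self.supported = set(supported)

    def can_handle(self, task_id):
        # answers the query request: can this framework process the request?
        return task_id in self.supported

    def process(self, task_id):
        return f"target_image[{task_id}]"

def request_if_supported(framework, task_id):
    if not framework.can_handle(task_id):  # query step
        return None                        # device does not open this capability
    return framework.process(task_id)      # send the image processing request

fw = Framework(["TASK_HDR"])
print(request_if_supported(fw, "TASK_HDR"))    # target_image[TASK_HDR]
print(request_if_supported(fw, "TASK_BOKEH"))  # None
```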
下面,结合图5,以一个完整的处理流程对本公开实施例的图像处理方法进行详细描述。该流程基于图4所示的操作系统架构实现。具体流程如下:
S201,第三方应用向Camera X下发图像处理请求。
其中,图像处理请求包括目标任务标识。
S202,Camera X向框架层中的相机能力拓展模块下发图像处理请求。
S203,相机能力拓展模块对图像处理请求进行解析以及封装,得到第一图像处理请求。
其中,第一图像处理请求包括目标任务标识。
S204,相机能力拓展模块向第一硬件抽象层发送第一图像处理请求。
S205,第一硬件抽象层从第一图像处理请求中解析得到目标任务标识,并获取与目标任务标识对应的目标处理策略。
其中,获取与目标任务标识对应的目标处理策略可以是第一硬件抽象层中的图像处理策略匹配模块实现的。
S206，第一硬件抽象层对第一图像处理请求以及目标处理策略进行封装，得到第二图像处理请求。
S207,第一硬件抽象层将第二图像处理请求发送到第二硬件抽象层。
其中,第二图像处理请求包括目标硬件处理策略以及目标软件处理策略以及策略执行顺序,本公开实施例中以先执行目标硬件处理策略,再执行目标软件处理策略为例进行说明。
S208,第二硬件抽象层将目标硬件处理策略发送到硬件层。
S209,硬件层根据目标硬件处理策略对待处理图像进行处理,得到中间图像。
S210,硬件层将中间图像返回第一硬件抽象层。
S211,第一硬件抽象层中的图像处理策略模块根据目标软件处理策略对中间图像进行处理,得到目标图像。
其中,第一硬件抽象层中的目标软件处理策略可以是自身存储的,也可以是第二硬件抽象层返回的。
S212,第一硬件抽象层将目标图像返回第三方应用。
需要说明的是,第一硬件抽象层将目标图像返回第三方应用的路径与第三方应用的第一图像处理请求发送到第一硬件抽象层的路径正好相反。也即,中间均需要经过框架层中的相机能力拓展模块以及Camera X。
需要说明的是,本公开实施例提供以上一些具体可实施方式的示例,在互不抵触的前提下,各个实施例示例之间可任意组合,以形成新一种图像处理方法。应当理解的,对于由任意示例所组合形成的新一种图像处理方法,均应落入本申请的保护范围。
另外需要说明的是,在有些作为替换的实现方式中,本公开实施例中部分流程的执行顺序也可以以不同于前述具体实施例中所记载的执行顺序进行执行。例如,两个连续的流程实际上可以并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。
图6是根据一示例性实施例示出的一种图像处理装置的结构框图。参照图6,该装置200应用于电子设备,所述电子设备操作系统包括相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层,该装置200包括:
第一发送模块210,被配置为响应于对第三方应用的第一操作,所述第三方应用调用目标相机能力接口模块提供的接口向所述框架层发送图像处理请求;
第二发送模块220,被配置为所述框架层将所述图像处理请求发送到所述硬件抽象层;
目标图像生成模块230,被配置为所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将所述目标图像发送给所述第三方应用,所述目标处理策略为原生相机应用支持的处理策略。
在一些实施方式中,所述图像处理请求携带目标任务标识,所述目标任务标识为原生相机应用支持的拍摄功能对应的标识,目标图像生成模块230包括:
第一处理子模块,被配置为所述硬件抽象层根据与所述目标任务标识对应的目标处理策略,对所述待处理图像进行处理。
在一些实施方式中,第二发送模块220,包括:
第一发送子模块,被配置为所述框架层对所述图像处理请求进行封装,得到第一图像处理请求;所述框架层将所述第一图像处理请求发送到所述硬件抽象层。这种情况下,目标图像生成模块,包括:
第二处理子模块,被配置为所述硬件抽象层根据与所述第一图像处理请求对应的目标处理策略对所述待处理图像进行处理。
在一些实施方式中,所述目标处理策略包括目标硬件处理策略、目标软件处理策略以及策略执行顺序,目标图像生成模块230,包括:
第三处理子模块,被配置为所述硬件抽象层按照所述策略执行顺序,根据所述目标硬件处理策略以及所述目标软件处理策略对所述待处理图像进行处理,得到所述目标图像。
在一些实施方式中,所述第一发送模块210,包括:
第二发送子模块,被配置为响应于对所述第三方应用的界面中的目标功能控件的触发操作。
在一些实施方式中,所述相机能力接口模块包括Camera X以及相机软件开发工具包,所述Camera X设置于所述框架层,所述相机软件开发工具包设置于所述应用层。
在一些实施方式中,装置200还包括:
查询模块,被配置为在所述目标相机能力接口模块为Camera X的情况下,所述第三方应用调用所述Camera X提供的接口向所述框架层发送查询请求,所述查询请求用于查询所述框架层对所述图像处理请求的处理能力;
响应模块,被配置为所述第三方应用通过所述Camera X提供的接口接收来自于所述框架层的查询结果,并响应于所述框架层能够对所述图像处理请求进行处理的查询结果, 调用所述Camera X提供的接口向所述框架层发送图像处理请求。
在一些实施方式中,所述硬件抽象层包括第一硬件抽象层以及第二硬件抽象层,第二发送模块220,包括:第三发送子模块,被配置为所述框架层将所述图像处理请求发送到所述第一硬件抽象层。这种情况下,目标图像生成模块,包括:目标图像生成子模块,被配置为所述第一硬件抽象层获取所述图像处理请求对应的目标图像处理策略,以及根据所述图像处理请求以及所述目标图像处理策略生成第二图像处理请求,并发送到所述第二硬件抽象层;所述第二硬件抽象层根据所述第二图像处理请求获取待处理图像;所述第二硬件抽象层将所述待处理图像发送到所述第一硬件抽象层;所述第一硬件抽象层根据所述目标处理策略,对所述待处理图像进行处理,得到目标图像。
在一些实施方式中,装置200还包括:
图像采集模块,被配置为所述第二硬件抽象层通过相机驱动调用摄像头采集原始图像。
待处理图像确定模块,被配置为所述第二硬件抽象层接收来自所述摄像头的所述原始图像,并将所述原始图像确定为所述待处理图像。
本公开还提供一种计算机可读存储介质,其上存储有计算机程序指令,该程序指令被处理器执行时实现本公开提供的图像处理方法的步骤。
本公开还提供一种电子设备,电子设备的操作系统包括相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层,电子设备包括:
存储器,其上存储有计算机程序;
处理器,用于执行存储器中的计算机程序,以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤:
响应于对第三方应用的第一操作,所述第三方应用调用目标相机能力接口模块提供的接口向所述框架层发送图像处理请求;
所述框架层将所述图像处理请求发送到所述硬件抽象层;
所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将所述目标图像发送给所述第三方应用,所述目标处理策略为原生相机应用支持的处理策略。
在一些实施方式中，所述图像处理请求携带目标任务标识，所述目标任务标识为原生相机应用支持的拍摄功能对应的标识，电子设备的处理器，还用于执行存储器中的计算机程序，以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤：
所述硬件抽象层根据与所述目标任务标识对应的目标处理策略，对所述待处理图像进行处理。
在一些实施方式中,电子设备的处理器,还用于执行存储器中的计算机程序,以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤:
所述框架层对所述图像处理请求进行封装,得到第一图像处理请求;
所述框架层将所述第一图像处理请求发送到所述硬件抽象层;
所述硬件抽象层根据与所述第一图像处理请求对应的目标处理策略对所述待处理图像进行处理。
在一些实施方式中,所述目标处理策略包括目标硬件处理策略、目标软件处理策略以及策略执行顺序,电子设备的处理器,还用于执行存储器中的计算机程序,以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤:
所述硬件抽象层按照所述策略执行顺序,根据所述目标硬件处理策略以及所述目标软件处理策略对所述待处理图像进行处理,得到所述目标图像。
在一些实施方式中,电子设备的处理器,还用于执行存储器中的计算机程序,以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤:
响应于对所述第三方应用的界面中的目标功能控件的触发操作。
在一些实施方式中,所述相机能力接口模块包括Camera X以及相机软件开发工具包,所述Camera X设置于所述框架层,所述相机软件开发工具包设置于所述应用层。
在一些实施方式中,电子设备的处理器,还用于执行存储器中的计算机程序,以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤:
在所述目标相机能力接口模块为Camera X的情况下,所述第三方应用调用所述Camera X提供的接口向所述框架层发送查询请求,所述查询请求用于查询所述框架层对所述图像处理请求的处理能力;
所述第三方应用通过所述Camera X提供的接口接收来自于所述框架层的查询结果, 并响应于所述框架层能够对所述图像处理请求进行处理的查询结果,调用所述Camera X提供的接口向所述框架层发送图像处理请求。
在一些实施方式中,所述硬件抽象层包括第一硬件抽象层以及第二硬件抽象层,电子设备的处理器,还用于执行存储器中的计算机程序,以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤:
所述框架层将所述图像处理请求发送到所述第一硬件抽象层;
所述第一硬件抽象层获取所述图像处理请求对应的目标图像处理策略,以及根据所述图像处理请求以及所述目标图像处理策略生成第二图像处理请求,并发送到所述第二硬件抽象层;
所述第二硬件抽象层根据所述第二图像处理请求获取待处理图像;
所述第二硬件抽象层将所述待处理图像发送到所述第一硬件抽象层;
所述第一硬件抽象层根据所述目标处理策略,对所述待处理图像进行处理,得到目标图像。
在一些实施方式中,电子设备的处理器,还用于执行存储器中的计算机程序,以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤:
所述第二硬件抽象层通过相机驱动调用摄像头采集原始图像;
所述第二硬件抽象层接收来自所述摄像头的所述原始图像,并将所述原始图像确定为所述待处理图像。
图7是根据一示例性实施例示出的一种电子设备的框图。例如,电子设备300可以是移动电话,计算机,数字广播终端,消息收发设备,游戏控制台,平板设备,医疗设备,健身设备,个人数字助理等终端设备,还可以是服务器。
参照图7,电子设备300可以包括以下一个或多个组件:处理组件302,存储器304,电力组件306,多媒体组件308,音频组件310,输入/输出(I/O)的接口312,传感器组件314,以及通信组件316。
处理组件302通常控制电子设备300的整体操作,诸如与显示,电话呼叫,数据通信,相机操作和记录操作相关联的操作。处理组件302可以包括一个或多个处理器320来执行指令,以完成上述的图像处理方法的全部或部分步骤。此外,处理组件302可以包括一个或多个模块,便于处理组件302和其他组件之间的交互。例如,处理组件302 可以包括多媒体模块,以方便多媒体组件308和处理组件302之间的交互。
存储器304被配置为存储各种类型的数据以支持在电子设备300的操作。这些数据的示例包括用于在电子设备300上操作的任何应用程序或方法的指令,联系人数据,电话簿数据,消息,图片,视频等。存储器304可以由任何类型的易失性或非易失性存储设备或者它们的组合实现,如静态随机存取存储器(SRAM),电可擦除可编程只读存储器(EEPROM),可擦除可编程只读存储器(EPROM),可编程只读存储器(PROM),只读存储器(ROM),磁存储器,快闪存储器,磁盘或光盘。
电力组件306为电子设备300的各种组件提供电力。电力组件306可以包括电源管理系统,一个或多个电源,及其他与为电子设备300生成、管理和分配电力相关联的组件。
多媒体组件308包括在电子设备300和用户之间的提供一个输出接口的屏幕。在一些实施例中,屏幕可以包括液晶显示器(LCD)和触摸面板(TP)。如果屏幕包括触摸面板,屏幕可以被实现为触摸屏,以接收来自用户的输入信号。触摸面板包括一个或多个触摸传感器以感测触摸、滑动和触摸面板上的手势。触摸传感器可以不仅感测触摸或滑动动作的边界,而且还检测与触摸或滑动操作相关的持续时间和压力。在一些实施例中,多媒体组件308包括一个前置摄像头和/或后置摄像头。当电子设备300处于操作模式,如拍摄模式或视频模式时,前置摄像头和/或后置摄像头可以接收外部的多媒体数据。每个前置摄像头和后置摄像头可以是一个固定的光学透镜系统或具有焦距和光学变焦能力。
音频组件310被配置为输出和/或输入音频信号。例如,音频组件310包括一个麦克风(MIC),当电子设备300处于操作模式,如呼叫模式、记录模式和语音识别模式时,麦克风被配置为接收外部音频信号。所接收的音频信号可以被进一步存储在存储器304或经由通信组件316发送。在一些实施例中,音频组件310还包括一个扬声器,用于输出音频信号。
I/O接口312为处理组件302和外围接口模块之间提供接口,上述外围接口模块可以是键盘,点击轮,按钮等。这些按钮可包括但不限于:主页按钮、音量按钮、启动按钮和锁定按钮。
传感器组件314包括一个或多个传感器,用于为电子设备300提供各个方面的状态评估。例如,传感器组件314可以检测到电子设备300的打开/关闭状态,组件的相对定位,例如组件为电子设备300的显示器和小键盘,传感器组件314还可以检测电子设备 300或电子设备300一个组件的位置改变,用户与电子设备300接触的存在或不存在,电子设备300方位或加速/减速和电子设备300的温度变化。传感器组件314可以包括接近传感器,被配置用来在没有任何的物理接触时检测附近物体的存在。传感器组件314还可以包括光传感器,如CMOS或CCD图像传感器,用于在成像应用中使用。在一些实施例中,该传感器组件314还可以包括加速度传感器,陀螺仪传感器,磁传感器,压力传感器或温度传感器。
通信组件316被配置为便于电子设备300和其他设备之间有线或无线方式的通信。电子设备300可以接入基于通信标准的无线网络,如WiFi,2G或3G,或它们的组合。在一个示例性实施例中,通信组件316经由广播信道接收来自外部广播管理系统的广播信号或广播相关信息。在一个示例性实施例中,通信组件316还包括近场通信(NFC)模块,以促进短程通信。例如,在NFC模块可基于射频识别(RFID)技术,红外数据协会(IrDA)技术,超宽带(UWB)技术,蓝牙(BT)技术和其他技术来实现。
在示例性实施例中,电子设备300可以被一个或多个应用专用集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理设备(DSPD)、可编程逻辑器件(PLD)、现场可编程门阵列(FPGA)、控制器、微控制器、微处理器或其他电子元件实现,用于执行上述图像处理方法。
在示例性实施例中,还提供了一种包括指令的非临时性计算机可读存储介质,例如包括指令的存储器304,上述指令可由电子设备300的处理器320执行以完成上述图像处理方法。例如,非临时性计算机可读存储介质可以是ROM、随机存取存储器(RAM)、CD-ROM、磁带、软盘和光数据存储设备等。
在另一示例性实施例中,本公开还提出了一种计算机程序,包括计算机可读代码,当所述计算机可读代码在计算处理设备上运行时,导致所述计算处理设备执行前述的图像处理方法。
图8为本公开实施例提供了一种计算处理设备的结构示意图。该计算处理设备通常包括处理器1110和以存储器1130形式的计算机程序产品或者计算机可读介质。存储器1130可以是诸如闪存、EEPROM（电可擦除可编程只读存储器）、EPROM、硬盘或者ROM之类的电子存储器。存储器1130具有用于执行上述方法中的任何方法步骤的程序代码1151的存储空间1150。例如，用于程序代码的存储空间1150可以包括分别用于实现上面的方法中的各种步骤的各个程序代码1151。这些程序代码可以从一个或者多个计算机程序产品中读出或者写入到这一个或者多个计算机程序产品中。这些计算机程序产品包括诸如硬盘，紧致盘（CD）、存储卡或者软盘之类的程序代码载体。这样的计算机程序产品通常为如图9所示的便携式或者固定存储单元。该存储单元可以具有与图8的服务器中的存储器1130类似布置的存储段、存储空间等。程序代码可以例如以适当形式进行压缩。通常，存储单元包括计算机可读代码1151'，即可以由例如诸如1110之类的处理器读取的代码，这些代码当由服务器运行时，导致该服务器执行上面所描述的方法中的各个步骤。
本领域技术人员在考虑说明书及实践本公开后,将容易想到本公开的其它实施方案。本申请旨在涵盖本公开的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本公开的一般性原理并包括本公开未公开的本技术领域中的公知常识或惯用技术手段。说明书和实施例仅被视为示例性的,本公开的真正范围和精神由下面的权利要求指出。
应当理解的是,本公开并不局限于上面已经描述并在附图中示出的精确结构,并且可以在不脱离其范围进行各种修改和改变。本公开的范围仅由所附的权利要求来限制。
在本说明书的描述中,参考术语“一个实施例”、“一些实施例”、“示例”、“具体示例”、或“一些示例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本公开的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不必须针对的是相同的实施例或示例。而且,描述的具体特征、结构、材料或者特点可以在任一个或多个实施例或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。
此外,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括至少一个该特征。在本公开的描述中,“多个”的含义是至少两个,例如两个,三个等,除非另有明确具体的限定。
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现定制逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本公开的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被 本公开的实施例所属技术领域的技术人员所理解。
在流程图中表示或在此以其他方式描述的逻辑和/或步骤,例如,可以被认为是用于实现逻辑功能的可执行指令的定序列表,可以具体实现在任何计算机可读介质中,以供指令执行系统、装置或设备(如基于计算机的系统、包括处理器的系统或其他可以从指令执行系统、装置或设备取指令并执行指令的系统)使用,或结合这些指令执行系统、装置或设备而使用。就本说明书而言,"计算机可读介质"可以是任何可以包含、存储、通信、传播或传输程序以供指令执行系统、装置或设备或结合这些指令执行系统、装置或设备而使用的装置。计算机可读介质的更具体的示例(非穷尽性列表)包括以下:具有一个或多个布线的电连接部(电子装置),便携式计算机盘盒(磁装置),随机存取存储器(RAM),只读存储器(ROM),可擦除可编辑只读存储器(EPROM或闪速存储器),光纤装置,以及便携式光盘只读存储器(CDROM)。另外,计算机可读介质甚至可以是可在其上打印所述程序的纸或其他合适的介质,因为可以例如通过对纸或其他介质进行光学扫描,接着进行编辑、解译或必要时以其他合适方式进行处理来以电子方式获得所述程序,然后将其存储在计算机存储器中。
应当理解,本公开的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行系统执行的软件或固件来实现。如,如果用硬件来实现和在另一实施方式中一样,可用本领域公知的下列技术中的任一项或他们的组合来实现:具有用于对数据信号实现逻辑功能的逻辑门电路的离散逻辑电路,具有合适的组合逻辑门电路的专用集成电路,可编程门阵列(PGA),现场可编程门阵列(FPGA)等。
本技术领域的普通技术人员可以理解实现上述实施例方法携带的全部或部分步骤是可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,该程序在执行时,包括方法实施例的步骤之一或其组合。
此外,在本公开各个实施例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取存储介质中。
上述提到的存储介质可以是只读存储器，磁盘或光盘等。尽管上面已经示出和描述了本公开的实施例，可以理解的是，上述实施例是示例性的，不能理解为对本公开的限制，本领域的普通技术人员在本公开的范围内可以对上述实施例进行变化、修改、替换和变型。

Claims (13)

  1. 一种图像处理方法,其特征在于,应用于电子设备,所述电子设备的操作系统包括相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层,所述方法包括:
    响应于对第三方应用的第一操作,所述第三方应用调用目标相机能力接口模块提供的接口向所述框架层发送图像处理请求;
    所述框架层将所述图像处理请求发送到所述硬件抽象层;
    所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将所述目标图像发送给所述第三方应用,所述目标处理策略为原生相机应用支持的处理策略。
  2. 根据权利要求1所述的方法,其特征在于,所述图像处理请求携带目标任务标识,所述目标任务标识为原生相机应用支持的拍摄功能对应的标识,所述硬件抽象层根据目标处理策略对待处理图像进行处理,包括:
    所述硬件抽象层根据与所述目标任务标识对应的目标处理策略,对所述待处理图像进行处理。
  3. 根据权利要求1所述的方法,其特征在于,所述框架层将所述图像处理请求发送到所述硬件抽象层,包括:
    所述框架层对所述图像处理请求进行封装,得到第一图像处理请求;
    所述框架层将所述第一图像处理请求发送到所述硬件抽象层;
    所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,包括:
    所述硬件抽象层根据与所述第一图像处理请求对应的目标处理策略对所述待处理图像进行处理。
  4. 根据权利要求1所述的方法,其特征在于,所述目标处理策略包括目标硬件处理策略、目标软件处理策略以及策略执行顺序,所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,包括:
    所述硬件抽象层按照所述策略执行顺序,根据所述目标硬件处理策略以及所述目标软件处理策略对所述待处理图像进行处理,得到所述目标图像。
  5. 根据权利要求1所述的方法,其特征在于,所述响应于对第三方应用的第一操作, 包括:
    响应于对所述第三方应用的界面中的目标功能控件的触发操作。
  6. 根据权利要求1-5任一项所述的方法,其特征在于,所述相机能力接口模块包括Camera X以及相机软件开发工具包,所述Camera X设置于所述框架层,所述相机软件开发工具包设置于所述应用层。
  7. 根据权利要求6所述的方法,其特征在于,所述方法还包括:
    在所述目标相机能力接口模块为Camera X的情况下,所述第三方应用调用所述Camera X提供的接口向所述框架层发送查询请求,所述查询请求用于查询所述框架层对所述图像处理请求的处理能力;
    所述第三方应用通过所述Camera X提供的接口接收来自于所述框架层的查询结果,并响应于所述框架层能够对所述图像处理请求进行处理的查询结果,调用所述Camera X提供的接口向所述框架层发送图像处理请求。
  8. 根据权利要求1-5任一项所述的方法,其特征在于,所述硬件抽象层包括第一硬件抽象层以及第二硬件抽象层,所述框架层将所述图像处理请求发送到硬件抽象层,包括:
    所述框架层将所述图像处理请求发送到所述第一硬件抽象层;
    所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,包括:
    所述第一硬件抽象层获取所述图像处理请求对应的目标图像处理策略,以及根据所述图像处理请求以及所述目标图像处理策略生成第二图像处理请求,并发送到所述第二硬件抽象层;
    所述第二硬件抽象层根据所述第二图像处理请求获取待处理图像;
    所述第二硬件抽象层将所述待处理图像发送到所述第一硬件抽象层;
    所述第一硬件抽象层根据所述目标处理策略,对所述待处理图像进行处理,得到目标图像。
  9. 根据权利要求8所述的方法,其特征在于,所述方法还包括:
    所述第二硬件抽象层通过相机驱动调用摄像头采集原始图像;
    所述第二硬件抽象层接收来自所述摄像头的所述原始图像,并将所述原始图像确定为所述待处理图像。
  10. 一种图像处理装置,其特征在于,应用于电子设备,所述电子设备操作系统包括相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层,所述装置包括:
    第一发送模块,被配置为响应于对第三方应用的第一操作,所述第三方应用调用目标相机能力接口模块提供的接口向所述框架层发送图像处理请求;
    第二发送模块,被配置为所述框架层将所述图像处理请求发送到所述硬件抽象层;
    目标图像生成模块,被配置为所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将所述目标图像发送给所述第三方应用,所述目标处理策略为原生相机应用支持的处理策略。
  11. 一种电子设备,其特征在于,所述电子设备的操作系统包括相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层,所述电子设备包括:
    存储器,其上存储有计算机程序;
    处理器,用于执行所述存储器中的所述计算机程序,以通过相机能力接口模块、框架层、硬件抽象层以及设置有第三方应用的应用层实现以下步骤:
    响应于对第三方应用的第一操作,所述第三方应用调用目标相机能力接口模块提供的接口向所述框架层发送图像处理请求;
    所述框架层将所述图像处理请求发送到所述硬件抽象层;
    所述硬件抽象层根据与所述图像处理请求对应的目标处理策略对待处理图像进行处理,得到目标图像,并将所述目标图像发送给所述第三方应用,所述目标处理策略为原生相机应用支持的处理策略。
  12. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,该程序被处理器执行时实现权利要求1-9中任一项所述方法的步骤。
  13. 一种计算机程序,包括计算机可读代码,当所述计算机可读代码在计算处理设备上运行时,导致所述计算处理设备执行根据权利要求1-9中任一项所述的图像处理方法。
PCT/CN2022/090696 2021-12-28 2022-04-29 图像处理方法、装置、电子设备及存储介质 WO2023123787A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111627422.8 2021-12-28
CN202111627422.8A CN116361022A (zh) 2021-12-28 2021-12-28 图像处理方法、装置、电子设备及存储介质

Publications (1)

Publication Number Publication Date
WO2023123787A1 true WO2023123787A1 (zh) 2023-07-06

Family

ID=86910637

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/090696 WO2023123787A1 (zh) 2021-12-28 2022-04-29 图像处理方法、装置、电子设备及存储介质

Country Status (2)

Country Link
CN (1) CN116361022A (zh)
WO (1) WO2023123787A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070112441A1 (en) * 2002-03-28 2007-05-17 Amir Shahindoust Modular layer for abstracting peripheral hardware characteristics
US20170048461A1 (en) * 2015-08-12 2017-02-16 Samsung Electronics Co., Ltd. Method for processing image and electronic device supporting the same
CN110933275A (zh) * 2019-12-09 2020-03-27 Oppo广东移动通信有限公司 拍照方法及相关设备
CN110995994A (zh) * 2019-12-09 2020-04-10 上海瑾盛通信科技有限公司 图像拍摄方法及相关装置
CN111182223A (zh) * 2020-01-14 2020-05-19 Oppo广东移动通信有限公司 图像处理方法、装置、存储介质及电子设备

Also Published As

Publication number Publication date
CN116361022A (zh) 2023-06-30

Similar Documents

Publication Publication Date Title
JP6072362B2 (ja) アプリケーションプログラムの処理方法、装置、プログラム及び記憶媒体
JP2022523989A (ja) Uiコンポーネントを表示するための方法及び電子デバイス
WO2021063090A1 (zh) 一种建立应用组合的方法与电子设备
JP7268275B2 (ja) 着信があるときに電子デバイス上に映像を提示するための方法、および電子デバイス
JP2023506936A (ja) マルチ画面共働方法およびシステム、ならびに電子デバイス
CN110262692B (zh) 一种触摸屏扫描方法、装置及介质
CN111367456A (zh) 通信终端及多窗口模式下的显示方法
EP3525446A1 (en) Photographing method and terminal
EP3826228A1 (en) Broadcasting and discovering methods, broadcasting and discovering devices and storage medium
US20220343056A1 (en) Method for loading image and electronic device
JP2023534182A (ja) ファイルを開く方法およびデバイス
CN106933111B (zh) 一种控制设备的方法及装置
CN112188034A (zh) 一种图像处理方法、装置、终端设备及介质
US11847305B1 (en) Page display method, electronic device and storage medium
WO2023123787A1 (zh) 图像处理方法、装置、电子设备及存储介质
JP6441385B2 (ja) 情報入力方法、装置、プログラム及び記録媒体
CN109032583B (zh) 数据交互方法及装置
CN113642010A (zh) 一种获取扩展存储设备数据的方法及移动终端
WO2023123786A1 (zh) 视频数据处理方法、电子设备及存储介质
CN113179362B (zh) 电子设备及其图像显示方法
WO2024103894A1 (zh) 一种录屏方法及电子设备
WO2023245374A1 (zh) 视图显示方法、装置、电子设备和存储介质
CN114531493B (zh) 一种请求处理方法、装置、电子设备及存储介质
CN107124505B (zh) 录制方法及装置
CN113392055B (zh) 文件传输方法、文件传输装置及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22913083

Country of ref document: EP

Kind code of ref document: A1