WO2017107424A1 - Image processing method, apparatus, device, and non-volatile computer storage medium - Google Patents

Image processing method, apparatus, device, and non-volatile computer storage medium

Info

Publication number
WO2017107424A1
WO2017107424A1 (PCT/CN2016/088100, CN2016088100W)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal device
video data
application
projected
data stream
Prior art date
Application number
PCT/CN2016/088100
Other languages
English (en)
French (fr)
Inventor
陈聪
宋晔
Original Assignee
百度在线网络技术(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 百度在线网络技术(北京)有限公司
Publication of WO2017107424A1 publication Critical patent/WO2017107424A1/zh

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N21/41 Structure of client; Structure of client peripherals
                • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
                  • H04N21/41422 Specialised client platforms located in transportation means, e.g. personal vehicle
              • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
                • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
                • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
                  • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
                    • H04N21/440263 Reformatting operations involving altering the spatial resolution, e.g. for displaying on a connected PDA
            • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
              • H04N21/85 Assembly of content; Generation of multimedia applications
                • H04N21/854 Content authoring
                  • H04N21/8547 Content authoring involving timestamps for synchronizing content

Definitions

  • the present invention relates to communication technologies, and in particular, to an image processing method, apparatus, device, and non-volatile computer storage medium.
  • As terminal devices integrate more and more functions, an increasing number of corresponding applications (apps) appear in the system function lists of the terminal devices.
  • The application installed on some terminal devices, which is called the interconnection application in the present invention, must be used by means of a related cooperation application on another terminal device connected thereto; otherwise it has no use value. This is especially true of some vehicle interconnection applications.
  • In the prior art, the two related applications, that is, the interconnected application and the cooperation application, each independently output their related service content, and the user can operate in the interconnected application to use the services provided by the cooperation application on the other terminal device connected to the terminal device where the interconnected application is located.
  • aspects of the present invention provide an image processing method, apparatus, device, and non-volatile computer storage medium for improving application development efficiency.
  • An aspect of the present invention provides an image processing method including: acquiring projection image data displayed by a projectable area on a display device of a second terminal device connected to a first terminal device where an interconnection application is located; performing video encoding processing on the projected image data to obtain a projected video data stream; and sending the projected video data stream to the interconnected application, for the interconnected application to output the projected video data stream.
  • the projected video data stream includes:
  • video data frames without DTS (Decoding Time Stamp) and PTS (Presentation Time Stamp); and/or
  • video data transmitted based on RTP (Real-time Transport Protocol).
  • the method further includes:
  • the interconnected application processes the projected video data stream using at least one component to output the projected video data stream.
  • The at least one component comprises a video source component, a video filtering component, a hard decoding component, and a video output component.
  • the first terminal device is an in-vehicle terminal device
  • the second terminal device is a user terminal device.
  • the operating system of the vehicle-mounted terminal device is a Linux operating system, a WinCE operating system, a QNX operating system, or an Android operating system.
  • an image processing apparatus comprising:
  • An acquiring unit configured to acquire projection image data displayed by a projectable area on a display device of the second terminal device connected to the first terminal device where the interconnection application is located;
  • a coding unit configured to perform video coding processing on the projected image data to obtain a projected video data stream; and
  • a sending unit configured to send a projected video data stream to the interconnected application, where the interconnected application outputs the projected video data stream.
  • the projected video data stream includes:
  • video data frames without DTS (Decoding Time Stamp) and PTS (Presentation Time Stamp); and/or
  • video data transmitted based on RTP (Real-time Transport Protocol).
  • the first terminal device is an in-vehicle terminal device
  • the second terminal device is a user terminal device.
  • the operating system of the vehicle-mounted terminal device is a Linux operating system, a WinCE operating system, a QNX operating system, or an Android operating system.
  • an apparatus comprising:
  • one or more processors;
  • a memory; and
  • one or more programs stored in the memory that, when executed by the one or more processors, perform the image processing method described above.
  • A non-volatile computer storage medium stores one or more programs that, when executed by a device, cause the device to perform the image processing method described above.
  • In the embodiment of the present invention, the projection image data displayed by the projectable area on the display device of the second terminal device connected to the first terminal device where the interconnection application is located is obtained, and video encoding processing is then performed on the projected image data to obtain a projected video data stream, so that the projected video data stream can be sent to the interconnected application for the interconnected application to output it. Because the real-time content displayed by the display device of the terminal device where the cooperation application is located is projected into the interconnected application for output, the interconnected application no longer outputs its related service content independently. Therefore, the interconnected application only needs to be developed for the simple service it implements, namely the content output service, rather than for the complex service content implemented by the cooperation application, which can effectively reduce the development time of the interconnected application and thereby improve application development efficiency.
  • In addition, the original image data displayed by the projectable area on the display device of the second terminal device connected to the first terminal device is obtained, and image conversion processing is then performed on the original image data according to preset image conversion parameters to obtain the projected image data, so that the projected image data can be stored in a specified data format for performing video encoding processing on it to obtain the projected video data stream sent to the interconnected application. No manual participation is required, the operation is simple, and the correctness rate is high, which improves the efficiency and reliability of image processing.
  • In this way, the projected image data required for the video encoding process can be obtained automatically, which can effectively improve both the efficiency and the degree of automation of image processing.
  • FIG. 1 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
  • FIG. 2 is a schematic structural diagram of an image processing apparatus according to another embodiment of the present invention.
  • The user terminal device involved in the embodiments of the present invention may include, but is not limited to, a mobile phone, a personal digital assistant (PDA), a wireless handheld device, a tablet computer, a personal computer (PC), an MP3 player, an MP4 player, and a wearable device (e.g., smart glasses, a smart watch, a smart bracelet).
  • The vehicle-mounted terminal device involved in the embodiment of the present invention may also be referred to as a car machine, an abbreviation for an in-vehicle infotainment product installed in a car. The car machine must be capable of realizing information communication between people and the vehicle, and between the vehicle and the outside world (e.g., between cars).
  • The operating system of the in-vehicle terminal device may be, but is not limited to, a Linux operating system, a WinCE operating system, a QNX operating system, or an Android operating system, which is not specifically limited in this embodiment.
  • FIG. 1 is a schematic flowchart of an image processing method according to an embodiment of the present invention, as shown in FIG. 1 .
  • The first terminal device and the second terminal device are connected through at least one of, but not limited to, a Bluetooth connection, a Universal Serial Bus (USB) connection, and a Wireless Fidelity (Wi-Fi) connection, which is not specifically limited in this embodiment.
  • The execution body of the method may be the application on the second terminal device (that is, the cooperation application corresponding to the interconnection application), or a functional unit such as a plug-in or a software development kit (SDK) set in that application, which is not particularly limited in this embodiment.
  • The application may be a native application (nativeApp) installed on the second terminal device, or a web application (webApp) running in the browser on the second terminal device; no special restrictions are made in this embodiment.
  • In this way, the projection image data is subjected to video encoding processing to obtain the projected video data stream, enabling the projected video data stream to be sent to the interconnected application for the interconnected application to output it. Because the real-time content displayed by the display device of the terminal device where the cooperation application is located is projected into the interconnected application for output, the interconnected application no longer outputs its related service content independently. Therefore, the interconnected application only needs to be developed for the simple service it implements, namely the content output service, rather than for the complex service content implemented by the cooperation application, which can effectively reduce the development time of the interconnected application and thereby improve application development efficiency.
  • the first terminal device is preferably an in-vehicle terminal device; and the second terminal device is preferably a user terminal device.
  • the first terminal device may also be a user terminal device; the second terminal device may also be an in-vehicle terminal device.
  • The applications installed on the user terminal device can either run independently, without relying on another terminal device connected to it, or be used by means of a corresponding cooperation application on another terminal device connected to it.
  • After the device connection between the first terminal device and the second terminal device is established, a communication connection between the interconnected application on the first terminal device and the cooperation application on the second terminal device further needs to be established on top of it. In this way, the user can use the cooperation application, or the second terminal device where the cooperation application is located, through the interconnection application.
  • the communication connection may include, but is not limited to, at least one of a Bluetooth connection, a USB connection, and a WI-FI connection, which is not specifically limited in this embodiment.
  • Specifically, the projected video data stream may be sent to the interconnected application through the communication connection between the interconnected application on the first terminal device and the cooperation application on the second terminal device, for the interconnected application to output the projected video data stream.
  • an image intercepting process may be performed on an interface displayed by the display device of the second terminal device to obtain original image data.
  • Then, the original image data may be subjected to a series of conversion processing to obtain projected image data that can be processed by the video encoding hardware of the second terminal device where the cooperation application is located.
  • the obtained projection image data can be subjected to video encoding processing to obtain a video data stream, which is sent to the interconnection application for output.
  • Specifically, the original image data displayed by the projectable area on the display device of the second terminal device connected to the first terminal device may be obtained, and image conversion processing may then be performed on the original image data according to preset image conversion parameters to obtain the projection image data.
  • the projected image data may be stored in a specified data format for performing video encoding processing on the projected image data of the specified data format to obtain a projected video data stream and sent to the interconnected application.
  • the size data of the display device of the second terminal device may be further acquired, and further, the projectable region may be obtained according to the size data of the display device.
  • For example, if the size data of the display device is 720 pixels (width) × 1280 pixels (height), the entire display area of the display device may be taken as the projectable area, that is, a rectangular area whose starting point is (0, 0), with a width of 720 pixels and a height of 1280 pixels.
  • Alternatively, the position data of the virtual buttons and/or the status bar of the second terminal device on the display device may be acquired. For example, when displaying in portrait mode, the virtual buttons are on the lower side of the display area of the display device and the status bar is on the upper side; when displaying in landscape mode, the virtual buttons are on the right side of the display area of the display device and the status bar is not displayed. Then, according to the size data of the display device and the position data of the virtual buttons and/or the status bar on the display device, the display area of the display device other than the areas where the virtual buttons and/or the status bar are located is taken as the projectable area.
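As a rough illustration, the projectable-area computation described above might look like the following sketch; the function name, the (x, y, width, height) rectangle convention, and the specific bar sizes are assumptions for illustration, not values from the patent.

```python
def projectable_area(screen_w, screen_h, status_bar=None, nav_bar=None):
    """Return (x, y, w, h) of the projectable rectangle.

    status_bar / nav_bar are (x, y, w, h) rectangles occupied by the
    status bar and the virtual buttons, or None if not displayed.
    Assumes, as in the patent's examples, that the status bar sits on
    the top edge and the virtual buttons on the bottom edge (portrait)
    or the right edge (landscape) of the screen.
    """
    top, bottom, right = 0, screen_h, screen_w
    if status_bar is not None:              # strip the status bar off the top
        top = status_bar[1] + status_bar[3]
    if nav_bar is not None:
        if nav_bar[2] >= nav_bar[3]:        # wide bar: bottom edge (portrait)
            bottom = nav_bar[1]
        else:                               # tall bar: right edge (landscape)
            right = nav_bar[0]
    return (0, top, right, bottom - top)

# Portrait 720x1280 screen with an assumed 50-px status bar and
# an assumed 96-px virtual-button bar:
area = projectable_area(720, 1280,
                        status_bar=(0, 0, 720, 50),
                        nav_bar=(0, 1184, 720, 96))
```

With no bars passed in, the whole screen is returned, matching the first example in the text.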
  • the image conversion parameters used may include, but are not limited to, at least one of the following parameters:
  • Image scaling parameters. If the projectable area is the display area other than the areas where the virtual buttons and/or the status bar are located in the entire display area of the display device, the image scaling parameters, that is, the horizontal scaling ratio and the vertical scaling ratio, can be determined directly according to the size data of the projectable area and preset size data that is consistent with or close to the size data of the display device of the first terminal device. With the image scaling parameters, the size of the acquired original image data can be converted to the preset size.
  • a plurality of sets of preset size data may be preset, and a set of preset size data that meets the conversion requirement is selected according to the size data of the display device of the first terminal device, and the above operation is performed.
  • Image scaling parameters and image cropping parameters. If the projectable area is the entire display area of the display device, the image scaling parameters, that is, the horizontal scaling ratio and the vertical scaling ratio, are determined according to the size data of the projectable area and preset size data that is slightly larger than the size data of the display device of the first terminal device. The image cropping parameters are then obtained according to the preset size data and the position data of the virtual buttons and/or the status bar of the second terminal device on the display device. With the image cropping parameters, the image data other than the areas where the virtual buttons and/or the status bar are located can be extracted from the scaled image data.
  • a plurality of sets of preset size data may be preset, and a set of preset size data that meets the conversion requirement is selected according to the size data of the display device of the first terminal device, and the above operation is performed.
  • Image flipping parameters. The image flipping parameter is obtained according to the aspect ratio of the acquired original image data and the aspect ratio of the display device of the first terminal device. For example, if the acquired original image data is determined to be portrait image data according to its aspect ratio, and the first terminal device is determined to be a landscape display device according to the aspect ratio of its display device, an image flipping parameter that rotates the image 270° clockwise around a specified point can be obtained. With the image flipping parameter, the scaled image data can be flipped so that the starting point of the scaled image data coincides with the starting point of the flipped image data.
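The derivation of the conversion parameters above can be sketched as follows. The function name and the particular preset size are illustrative assumptions; only the scaling-ratio arithmetic and the 270° portrait-to-landscape flip come from the text.

```python
def conversion_params(src_w, src_h, preset_w, preset_h, dst_w, dst_h):
    """Derive the image conversion parameters described above.

    src_w, src_h     : size of the captured original image data
    preset_w, preset_h: preset size data matched to the target display
    dst_w, dst_h     : display size of the first (e.g. in-vehicle) device

    Returns the horizontal/vertical scaling ratios and a clockwise
    rotation angle for the flip step.
    """
    scale_x = preset_w / src_w          # horizontal scaling ratio
    scale_y = preset_h / src_h          # vertical scaling ratio
    src_portrait = src_h > src_w
    dst_landscape = dst_w > dst_h
    # Per the patent's example: portrait source onto a landscape
    # display rotates 270 degrees clockwise.
    rotate_cw = 270 if (src_portrait and dst_landscape) else 0
    return scale_x, scale_y, rotate_cw

# A 720x1134 portrait capture projected onto an 800x480 landscape head
# unit, with an assumed preset size of 360x567:
sx, sy, rot = conversion_params(720, 1134, 360, 567, 800, 480)
```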
  • In another specific implementation process, the hardware information of the video encoding hardware of the second terminal device may be further acquired, and the specified data format may then be determined according to the hardware information of the video encoding hardware.
  • The hardware information of the video encoding hardware refers to basic attributes of the hardware, such as its name, its type, and the data format of the image data it can process. According to the hardware information of the video encoding hardware, the data format of the image data required by the video encoding hardware when performing video encoding can be determined and used as the specified data format.
  • Specifically, the image data of any interface displayed by the display device of the second terminal device may be acquired in advance to determine its storage format. If the storage format is the specified data format, the subsequently obtained projected image data may be stored directly; if it is not, the subsequently obtained projected image data in that storage format is first converted into projected image data of the specified data format and then stored. The specific conversion rule may be a conversion rule provided in the prior art, or a conversion rule customized by the present invention, which is not specifically limited in this embodiment.
  • In another specific implementation process, the projected image data may be stored in a preset data format, for example, ARGB_8888, where A, R, G, and B represent the transparency (alpha), red, green, and blue components, respectively.
  • the image data of any interface displayed by the display device of the second terminal device may be acquired in advance, according to the image data. Determine its storage format. If the storage format is the preset data format, the subsequently obtained projected image data may be directly stored; if the storage format is not the preset data format, then the subsequent The obtained projected image data of the storage format is first converted into projected image data of the preset data format, and then stored.
  • the conversion rule provided in the prior art may be used, or the conversion rule customized by the present invention may also be used, which is not specifically limited in this embodiment.
  • the projected image data of the preset data format may be stored as the specified data format. Specifically, the projected image data of the preset data format may be first converted into the projected image data of the specified data format, and then stored.
  • The specific conversion rule may be a conversion rule provided in the prior art, or a conversion rule customized by the present invention, which is not particularly limited in this embodiment.
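The format-conversion step can be illustrated with a per-pixel ARGB_8888-to-YUV conversion, since hardware encoders often consume YUV rather than ARGB input. The BT.601 full-range coefficients below are a common choice and an assumption here; the patent itself leaves the conversion rule open.

```python
def argb8888_to_yuv(pixel):
    """Convert one ARGB_8888 pixel (a 32-bit int) to a (Y, U, V) tuple.

    Uses the BT.601 full-range formulas as an illustrative conversion
    rule; a real pipeline would convert whole planes, not single pixels.
    """
    r = (pixel >> 16) & 0xFF
    g = (pixel >> 8) & 0xFF
    b = pixel & 0xFF                 # the alpha byte (bits 24-31) is dropped
    clamp = lambda c: max(0, min(255, round(c)))
    y = clamp(0.299 * r + 0.587 * g + 0.114 * b)
    u = clamp(-0.169 * r - 0.331 * g + 0.500 * b + 128)
    v = clamp(0.500 * r - 0.419 * g - 0.081 * b + 128)
    return y, u, v

# An opaque black pixel maps to zero luma with neutral chroma:
yuv = argb8888_to_yuv(0xFF000000)
```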
  • In this way, the original image data displayed by the projectable area on the display device of the second terminal device connected to the first terminal device is obtained, image conversion processing is then performed on the original image data according to the preset image conversion parameters to obtain the projected image data, and the projected image data can be stored in the specified data format for video encoding processing to obtain the projected video data stream sent to the interconnected application. No manual participation is required, the operation is simple, and the correctness rate is high, which improves the efficiency and reliability of image processing.
  • the obtained projected video data stream may include, but is not limited to:
  • video data frames without DTS (Decoding Time Stamp) and PTS (Presentation Time Stamp); and/or
  • video data transmitted based on RTP (Real-time Transport Protocol).
  • the DTS is used to identify when the projected video data stream read into the memory starts to be sent to the decoder for decoding; the PTS is used to measure when the decoded video frame is displayed.
  • Because the video data frames carry no DTS or PTS, the interconnected application can decode and display them in real time upon receiving the projected video data, which can effectively improve the real-time performance of image processing.
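A minimal sketch of wrapping one encoded frame for RTP transport is shown below, following the RFC 3550 header layout. Note that RTP itself carries a media timestamp field even when the frames have no DTS/PTS; a sender can fill it from a local clock. The payload type (96, dynamic) and the SSRC value are illustrative assumptions.

```python
import struct

def rtp_packet(payload, seq, timestamp, ssrc=0x1234, payload_type=96):
    """Wrap one encoded video frame in a minimal RTP header (RFC 3550).

    Builds the fixed 12-byte header: V=2, no padding, no extension,
    no CSRC list, marker bit set to flag the end of the frame.
    """
    byte0 = 2 << 6                              # version 2, P=0, X=0, CC=0
    byte1 = (1 << 7) | payload_type             # marker=1 | payload type
    header = struct.pack("!BBHII", byte0, byte1,
                         seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc)
    return header + payload

# H.264 Annex-B start code plus an IDR NAL header byte (truncated payload):
pkt = rtp_packet(b"\x00\x00\x00\x01\x65", seq=1, timestamp=3000)
```

A real sender would also fragment NAL units larger than the network MTU (RFC 6184), which this sketch omits.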
  • After receiving the projected video data stream, the interconnected application may further process it using at least one component in order to output the projected video data stream.
  • the at least one component may include, but is not limited to, a video source component, a video filtering component, a hard decoding component, and a video output component, which are not specifically limited in this embodiment.
  • a video source component configured to receive a video frame included in the projected video data, and adjust a transmission speed of the video frame to ensure that the number of video frames transmitted per second satisfies a preset condition.
  • The video filtering component is configured to filter the video frames output by the video source component.
  • The hard decoding component is configured to decode the video frames output by the video filtering component.
  • The video output component is configured to play the decoded video frames output by the hard decoding component.
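The four-component chain above can be sketched as follows. The class and method names are illustrative assumptions, not APIs from the patent, and the decoder here is a stand-in for the platform's hardware decoder.

```python
from collections import deque

class VideoSource:
    """Receives the video frames contained in the projected video data
    stream; a real implementation would also pace delivery so the
    frames-per-second count meets the preset condition."""
    def __init__(self):
        self.queue = deque()
    def push(self, frame):
        self.queue.append(frame)
    def pull(self):
        return self.queue.popleft() if self.queue else None

class VideoFilter:
    """Filters frames output by the source (here: drops empty frames)."""
    def process(self, frame):
        return frame if frame else None

class HardDecoder:
    """Stand-in for the platform's hardware decoding component."""
    def decode(self, frame):
        return ("decoded", frame)

class VideoOutput:
    """Plays decoded frames (here: records them for inspection)."""
    def __init__(self):
        self.rendered = []
    def render(self, decoded):
        self.rendered.append(decoded)

def run_pipeline(frames):
    """Push frames through source -> filter -> decoder -> output."""
    source, filt, decoder = VideoSource(), VideoFilter(), HardDecoder()
    output = VideoOutput()
    for frame in frames:
        source.push(frame)
    while (frame := source.pull()) is not None:
        frame = filt.process(frame)
        if frame is not None:
            output.render(decoder.decode(frame))
    return output.rendered
```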
  • In this embodiment, the projection image data displayed by the projectable area on the display device of the second terminal device connected to the first terminal device is obtained, and video encoding processing is then performed on the projected image data to obtain the projected video data stream, enabling the projected video data stream to be sent to the interconnected application for the interconnected application to output it. Because the real-time content displayed by the display device of the terminal device where the cooperation application is located is projected into the interconnected application for output, the interconnected application no longer outputs its related service content independently. Therefore, the interconnected application only needs to be developed for the simple service it implements, namely the content output service, rather than for the complex service content implemented by the cooperation application, which can effectively reduce the development time of the interconnected application and thereby improve application development efficiency.
  • In addition, the original image data displayed by the projectable area on the display device of the second terminal device connected to the first terminal device is obtained, and image conversion processing is then performed on the original image data according to preset image conversion parameters to obtain the projected image data, so that the projected image data can be stored in a specified data format for performing video encoding processing on it to obtain the projected video data stream sent to the interconnected application. No manual participation is required, the operation is simple, and the correctness rate is high, which improves the efficiency and reliability of image processing.
  • In this way, the projected image data required for the video encoding process can be obtained automatically, which can effectively improve both the efficiency and the degree of automation of image processing.
  • FIG. 2 is a schematic structural diagram of an image processing apparatus according to another embodiment of the present invention, as shown in FIG. 2 .
  • the image processing apparatus of the present embodiment may include an acquisition unit 21, an encoding unit 22, and a transmitting unit 23.
  • The acquiring unit 21 is configured to acquire the projected image data displayed by the projectable area on the display device of the second terminal device connected to the first terminal device; the encoding unit 22 is configured to perform video encoding processing on the projected image data to obtain a projected video data stream; and the transmitting unit 23 is configured to send the projected video data stream to the interconnected application, for the interconnected application to output the projected video data stream.
  • It should be noted that the image processing apparatus of this embodiment may be the application on the second terminal device, that is, the cooperation application corresponding to the interconnection application, or a functional unit such as a plug-in or a software development kit (SDK) set in that application, which is not particularly limited in this embodiment.
  • The application may be a native application (nativeApp) installed on the second terminal device, or a web application (webApp) running in the browser on the second terminal device; no special restrictions are made in this embodiment.
  • the first terminal device is preferably an in-vehicle terminal device; and the second terminal device is preferably a user terminal device.
  • the first terminal device may also be a user terminal device; the second terminal device may also be an in-vehicle terminal device.
  • The applications installed on the user terminal device can either run independently for normal use, without relying on another terminal device connected to them, or rely on a corresponding cooperation application on another terminal device connected to them for normal use.
  • the projected video data stream obtained by the encoding unit 22 may include, but is not limited to:
  • video data frames carrying at least one of a decode time stamp (DTS) and a presentation time stamp (PTS); or
  • video data frames carrying neither a DTS nor a PTS; or
  • video data transmitted over the Real-time Transport Protocol (RTP).
  • the acquiring unit acquires the projected image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides, and the encoding unit then performs video encoding on the projected image data to obtain a projected video data stream, so that the transmitting unit can send the projected video data stream to the interconnection application, for the interconnection application to output.
  • because the real-time content displayed on the display device of the terminal device on which the cooperation application resides is projected into the interconnection application for output, the interconnection application no longer outputs its related service content independently; therefore, the interconnection application only needs to be developed for the simple service it implements, namely the content output service, and does not need to be developed for the complex service content implemented by the cooperation application, which effectively reduces the development time of the interconnection application and thus improves application development efficiency.
  • the original image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides is acquired, and image conversion is then performed on the original image data according to preset image conversion parameters to obtain the projected image data.
  • the projected image data can be stored in a specified data format, for video encoding to be performed on the projected image data in the specified data format to obtain
  • a projected video data stream that is sent to the interconnection application; no manual involvement is required, the operation is simple, and the accuracy is high, thereby improving the efficiency and reliability of image processing.
  • once the original image data is acquired, the projected image data required for video encoding can be obtained automatically, which effectively improves the efficiency of image processing as well as its degree of automation.
  • the disclosed system, apparatus, and method may be implemented in other manners.
  • the apparatus embodiments described above are merely illustrative.
  • the division of the units is merely a logical function division.
  • in actual implementation there may be other division manners; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
  • the integrated unit implemented in the form of a software functional unit can be stored in a computer readable storage medium.
  • the software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, an audio processing engine, a network device, or the like) or a processor to perform some of the steps of the methods described in the embodiments of the present invention.
  • the foregoing storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention provides an image processing method, apparatus, device, and non-volatile computer storage medium. In embodiments of the present invention, projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides is acquired, and video encoding is then performed on the projected image data to obtain a projected video data stream, so that the projected video data stream can be sent to the interconnection application for the interconnection application to output the projected video data stream. Because the real-time content displayed on the display device of the terminal device on which the cooperation application resides is projected into the interconnection application for output, the interconnection application no longer outputs its related service content independently. Therefore, the interconnection application only needs to be developed for the simple service it implements, namely the content output service, and does not need to be developed for the complex service content implemented by the cooperation application, which effectively reduces the development time of the interconnection application and thus improves application development efficiency.

Description

Image processing method, apparatus, device, and non-volatile computer storage medium
This application claims priority to Chinese Patent Application No. 201510994009.3, filed on December 25, 2015 and entitled "Image Processing Method and Apparatus".
Technical Field
The present invention relates to communication technologies, and in particular, to an image processing method, apparatus, device, and non-volatile computer storage medium.
Background
With the development of communication technologies, terminal devices integrate more and more functions, so that the system function list of a terminal device contains more and more corresponding applications (Application, APP). Some applications installed on a terminal device, referred to in the present invention as interconnection applications, can be used normally only if a related cooperation application is installed on another terminal device connected to the device on which they reside; otherwise they have no practical value. This is especially true of some car-machine interconnection applications. Usually, the two related applications, i.e., the interconnection application and the cooperation application, can each independently output their related service content, and a user can operate within the interconnection application so as to operate the cooperation application on the other terminal device connected to the terminal device on which the interconnection application resides, thereby using the services provided by the cooperation application.
However, because the two related applications, i.e., the interconnection application and the cooperation application, each independently output their related service content, each of the two related applications has to be developed separately for the service content implemented by the cooperation application, which reduces application development efficiency.
Summary of the Invention
Aspects of the present invention provide an image processing method, apparatus, device, and non-volatile computer storage medium, so as to improve application development efficiency.
In one aspect of the present invention, an image processing method is provided, including:
acquiring projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides;
performing video encoding on the projected image data to obtain a projected video data stream; and
sending the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
According to the above aspect and any possible implementation, an implementation is further provided in which the projected video data stream includes:
video data frames carrying at least one of a DTS and a PTS; or
video data frames carrying neither a DTS nor a PTS; or
video data transmitted over RTP.
According to the above aspect and any possible implementation, an implementation is further provided in which, after the sending the projected video data stream to the interconnection application, the method further includes:
processing, by the interconnection application using at least one component, the projected video data stream, so as to output the projected video data stream.
According to the above aspect and any possible implementation, an implementation is further provided in which the at least one component includes a video source component, a video filter component, a hardware decoding component, and a video output component.
According to the above aspect and any possible implementation, an implementation is further provided in which
the first terminal device is an in-vehicle terminal device; and
the second terminal device is a user terminal device.
According to the above aspect and any possible implementation, an implementation is further provided in which the operating system of the in-vehicle terminal device is a Linux operating system, a WinCE operating system, a QNX operating system, or an Android operating system.
In another aspect of the present invention, an image processing apparatus is provided, including:
an acquiring unit, configured to acquire projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides;
an encoding unit, configured to perform video encoding on the projected image data to obtain a projected video data stream; and
a transmitting unit, configured to send the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
According to the above aspect and any possible implementation, an implementation is further provided in which the projected video data stream includes:
video data frames carrying at least one of a DTS and a PTS; or
video data frames carrying neither a DTS nor a PTS; or
video data transmitted over RTP.
According to the above aspect and any possible implementation, an implementation is further provided in which
the first terminal device is an in-vehicle terminal device; and
the second terminal device is a user terminal device.
According to the above aspect and any possible implementation, an implementation is further provided in which the operating system of the in-vehicle terminal device is a Linux operating system, a WinCE operating system, a QNX operating system, or an Android operating system.
In another aspect of the present invention, a device is provided, including:
one or more processors; and
a memory; and
one or more programs, the one or more programs being stored in the memory and, when executed by the one or more processors, performing the following operations:
acquiring projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides;
performing video encoding on the projected image data to obtain a projected video data stream; and
sending the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
In another aspect of the present invention, a non-volatile computer storage medium is provided, the non-volatile computer storage medium storing one or more programs that, when executed by a device, cause the device to:
acquire projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides;
perform video encoding on the projected image data to obtain a projected video data stream; and
send the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
It can be seen from the above technical solutions that, in the embodiments of the present invention, projected image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides is acquired, and video encoding is then performed on the projected image data to obtain a projected video data stream, so that the projected video data stream can be sent to the interconnection application for the interconnection application to output the projected video data stream. Because the real-time content displayed on the display device of the terminal device on which the cooperation application resides is projected into the interconnection application for output, the interconnection application no longer outputs its related service content independently. Therefore, the interconnection application only needs to be developed for the simple service it implements, namely the content output service, and does not need to be developed for the complex service content implemented by the cooperation application, which effectively reduces the development time of the interconnection application and thus improves application development efficiency.
In addition, with the technical solutions provided by the present invention, original image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides is acquired, and image conversion is then performed on the original image data according to preset image conversion parameters to obtain the projected image data, so that the projected image data can be stored in a specified data format for video encoding to be performed on the projected image data in the specified data format to obtain a projected video data stream that is sent to the interconnection application. No manual involvement is required, the operation is simple, and the accuracy is high, thereby improving the efficiency and reliability of image processing.
In addition, with the technical solutions provided by the present invention, once the original image data is acquired, the projected image data required for video encoding can be obtained automatically, which effectively improves the efficiency of image processing as well as its degree of automation.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description show some embodiments of the present invention, and a person of ordinary skill in the art may further derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an image processing apparatus according to another embodiment of the present invention.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that the user terminal device involved in the embodiments of the present invention may include, but is not limited to, a mobile phone, a personal digital assistant (Personal Digital Assistant, PDA), a wireless handheld device, a tablet computer (Tablet Computer), a personal computer (Personal Computer, PC), an MP3 player, an MP4 player, a wearable device (for example, smart glasses, a smart watch, or a smart band), and the like. The in-vehicle terminal device involved in the embodiments of the present invention, which may also be called a car machine, is short for the in-vehicle infotainment product installed in a car; functionally, the car machine must enable information communication between people and the car, and between the car and the outside world (car to car). The operating system of the in-vehicle terminal device may be, but is not limited to, a Linux operating system, a WinCE operating system, a QNX operating system, or an Android operating system, which is not specifically limited in this embodiment.
In addition, the term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist. For example, A and/or B may indicate the following three cases: A exists alone, both A and B exist, and B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following associated objects.
FIG. 1 is a schematic flowchart of an image processing method according to an embodiment of the present invention, as shown in FIG. 1.
101. Acquire projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides.
In the present invention, the first terminal device and the second terminal device are connected through, but not limited to, at least one of a Bluetooth connection, a universal serial bus (Universal Serial Bus, USB) connection, and a wireless fidelity (Wireless Fidelity, WI-FI) connection, which is not specifically limited in this embodiment.
102. Perform video encoding on the projected image data to obtain a projected video data stream.
103. Send the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
It should be noted that steps 101 to 103 may be performed by the application located on the second terminal device, i.e., the cooperation application corresponding to the interconnection application, or by a functional unit such as a plug-in or a software development kit (Software Development Kit, SDK) set in the application located on the second terminal device (i.e., the cooperation application corresponding to the interconnection application), which is not specifically limited in this embodiment.
It can be understood that the application may be a native application (nativeApp) installed on the second terminal device, or a web application (webApp) of a browser on the second terminal device, which is not specifically limited in this embodiment.
In this way, projected image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides is acquired, and video encoding is then performed on the projected image data to obtain a projected video data stream, so that the projected video data stream can be sent to the interconnection application for the interconnection application to output the projected video data stream. Because the real-time content displayed on the display device of the terminal device on which the cooperation application resides is projected into the interconnection application for output, the interconnection application no longer outputs its related service content independently. Therefore, the interconnection application only needs to be developed for the simple service it implements, namely the content output service, and does not need to be developed for the complex service content implemented by the cooperation application, which effectively reduces the development time of the interconnection application and thus improves application development efficiency.
Optionally, in a possible implementation of this embodiment, the first terminal device is preferably an in-vehicle terminal device, and the second terminal device is preferably a user terminal device.
Conversely, the first terminal device may also be a user terminal device, and the second terminal device may also be an in-vehicle terminal device. In practical applications, however, the applications installed on a user terminal device can all run on their own, without depending on another connected terminal device, or on a corresponding cooperation application installed on another connected terminal device, in order to be used normally.
In the present invention, after the device connection between the first terminal device and the second terminal device is established, a communication connection between the interconnection application on the first terminal device and the cooperation application on the second terminal device further needs to be established on the basis of the established device connection. Only then can the user operate, through the interconnection application, the cooperation application or the second terminal device on which the cooperation application resides.
It can be understood that the communication connection may include, but is not limited to, at least one of a Bluetooth connection, a USB connection, and a WI-FI connection, which is not specifically limited in this embodiment.
Specifically, the projected video data stream may be sent to the interconnection application through the communication connection between the interconnection application on the first terminal device and the cooperation application on the second terminal device, for the interconnection application to output the projected video data stream.
Optionally, in a possible implementation of this embodiment, in step 101, the interface displayed by the display device of the second terminal device may be captured as an image to obtain original image data. Then, a series of conversion operations may be performed on the original image data according to image processing parameters set by an operator based on the brand and model of the terminal device on which the cooperation application resides, so as to obtain projected image data that the video encoding hardware of the second terminal device on which the cooperation application resides can process. Video encoding may then be performed on the obtained projected image data to obtain a video data stream, which is sent to the interconnection application for output.
However, because there are many types of second terminal devices on which the cooperation application may reside, manually setting different image processing parameters for each type of second terminal device one by one makes the operation complex and error-prone, thereby reducing the efficiency and reliability of image processing. To solve this problem, another possible implementation is proposed.
In this possible implementation, original image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides may be acquired, and image conversion may then be performed on the original image data according to preset image conversion parameters to obtain projected image data. The projected image data may then be stored in a specified data format, so that video encoding can be performed on the projected image data in the specified data format to obtain a projected video data stream to be sent to the interconnection application.
In this implementation, size data of the display device of the second terminal device may further be acquired, and the projectable area may then be obtained according to the size data of the display device.
In a specific implementation process, the entire display area of the display device may be used as the projectable area according to the size data of the display device. For example, for a display of 720 pixels (width) x 1280 pixels (height), the projectable area is a rectangular area with origin (0, 0), a width of 720 pixels, and a height of 1280 pixels.
In another specific implementation process, position data of the virtual keys and/or the status bar of the second terminal device on the display device may be acquired. For example, in portrait display, the virtual keys are at the bottom of the display area and the status bar is at the top; in landscape display, the virtual keys are on the right side of the display area and the status bar is generally not displayed. Then, according to the size data of the display device and the position data of the virtual keys and/or the status bar on the display device, the portion of the entire display area other than the display area occupied by the virtual keys and/or the status bar may be used as the projectable area.
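The projectable-area computation just described (the full display area minus the regions occupied by the status bar and virtual keys) can be sketched as follows. This is a minimal illustration assuming a portrait layout with the status bar at the top and the virtual-key bar at the bottom; the function and parameter names are illustrative, not part of the patent:

```python
def projectable_region(width, height, status_bar_h=0, nav_bar_h=0):
    """Return (x, y, w, h) of the projectable rectangle on the second
    terminal's display: the full display area minus the status bar (top)
    and the virtual-key bar (bottom). Portrait layout is assumed; the bar
    heights stand in for the acquired position data."""
    x, y = 0, status_bar_h
    w = width
    h = height - status_bar_h - nav_bar_h
    return x, y, w, h

# Full screen used as the projectable area (the 720x1280 example):
print(projectable_region(720, 1280))           # (0, 0, 720, 1280)
# Excluding a 50 px status bar and a 128 px virtual-key bar:
print(projectable_region(720, 1280, 50, 128))  # (0, 50, 720, 1102)
```

With zero bar heights this degenerates to the first implementation (entire display area as projectable area).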
In this implementation, the image conversion parameters used may include, but are not limited to, at least one of the following parameters:
an image scaling parameter;
an image cropping parameter; and
an image flipping parameter.
In a specific implementation process, if the projectable area is the portion of the entire display area of the display device other than the display area occupied by the virtual keys and/or the status bar, the image scaling parameter, i.e., the horizontal scaling ratio and the vertical scaling ratio, may be determined directly according to the size data of the projectable area and preset size data that is consistent with or close to the size data of the display device of the first terminal device. With this image scaling parameter, the acquired original image data can be converted to the preset size.
Specifically, several groups of preset size data, i.e., width-height parameter pairs, may be set in advance, and a group of preset size data that satisfies the conversion requirement is selected according to the size data of the display device of the first terminal device to perform the above operation.
In another specific implementation process, if the projectable area is the entire display area of the display device, the image scaling parameter, i.e., the horizontal scaling ratio and the vertical scaling ratio, may be determined according to the size data of the projectable area and preset size data slightly larger than the size data of the display device of the first terminal device. The image cropping parameter is then obtained according to the preset size data and the position data of the virtual keys and/or the status bar of the second terminal device on the display device. With this image cropping parameter, the image data other than the image of the area occupied by the virtual keys and/or the status bar can be extracted from the scaled image data.
Specifically, several groups of preset size data, i.e., width-height parameter pairs, may be set in advance, and a group of preset size data that satisfies the conversion requirement is selected according to the size data of the display device of the first terminal device to perform the above operation.
In another specific implementation process, the image flipping parameter is obtained according to the aspect ratio of the acquired original image data and the aspect ratio of the display device of the first terminal device. For example, it is determined from the aspect ratio of the acquired original image data that the original image data is portrait image data, and it is determined from the aspect ratio of the display device of the first terminal device that the first terminal device is a landscape display device; an image flipping parameter of a 270-degree clockwise rotation about a specified point as the rotation center can therefore be obtained. With this image flipping parameter, the scaled image data can be flipped so that the origin of the scaled image data coincides with the origin of the flipped image data.
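The scaling and flipping parameters described above can be sketched as two small helpers. This is an illustrative sketch: the function names and the simple portrait-versus-landscape rule are assumptions for the example, and the cropping parameter would be derived analogously from the preset size and the bar positions:

```python
def scale_params(src_w, src_h, dst_w, dst_h):
    """Horizontal and vertical scaling ratios mapping the projectable
    region (src) onto a preset target size (dst), e.g. the first
    terminal's display."""
    return dst_w / src_w, dst_h / src_h

def rotation_param(src_w, src_h, dst_w, dst_h):
    """0 degrees if source and target orientations agree; otherwise the
    270-degree clockwise rotation used in the patent's portrait-source,
    landscape-target example."""
    src_portrait = src_h > src_w
    dst_portrait = dst_h > dst_w
    return 0 if src_portrait == dst_portrait else 270

# Portrait 720x1280 phone region projected to a landscape 1280x720 display:
print(scale_params(720, 1280, 1280, 720))    # horizontal and vertical ratios
print(rotation_param(720, 1280, 1280, 720))  # 270
```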
In this implementation, hardware information of the video encoding hardware of the second terminal device may further be acquired, and the specified data format corresponding to the video encoding hardware may then be determined according to the hardware information of the video encoding hardware.
The hardware information of the video encoding hardware refers to basic attributes of the video encoding hardware, such as its hardware name, its hardware type, and the data format of the image data to be processed. According to the hardware information of the video encoding hardware, the data format of the to-be-processed image data required by the video encoding hardware when performing video encoding can be determined and used as the specified data format.
In a specific implementation process, the projected image data may be stored directly in the specified data format, for example, RGB_565, where R denotes the red component, G the green component, and B the blue component, recorded with 5, 6, and 5 bits respectively, i.e., R=5, G=6, B=5, so that one pixel occupies 5+6+5=16 bits.
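The RGB_565 layout just described (5 bits red, 6 bits green, 5 bits blue, 16 bits per pixel) can be illustrated with a small packing helper; the function name is illustrative and the truncation of 8-bit channels to 5/6/5 bits is the standard convention for this format:

```python
def pack_rgb565(r, g, b):
    """Pack 8-bit R, G, B channel values into one 16-bit RGB_565 pixel:
    red in the top 5 bits, green in the middle 6, blue in the low 5."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(pack_rgb565(255, 255, 255)))  # 0xffff (white: all bits set)
print(hex(pack_rgb565(255, 0, 0)))      # 0xf800 (red occupies the top 5 bits)
```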
For example, because different terminal devices store the image data of the interface displayed by their display devices in different formats, the image data of any interface displayed by the display device of the second terminal device may be acquired in advance, and its storage format determined from that image data. If the storage format is the specified data format, subsequently obtained projected image data can be stored directly; if not, subsequently obtained projected image data in that storage format can first be converted into projected image data in the specified data format and then stored. The specific conversion rules may be those provided in the prior art or custom rules defined by the present invention, which is not specifically limited in this embodiment.
In another specific implementation process, the projected image data may be stored in a preset data format, for example, ARGB_8888, where A denotes the transparency component, R the red component, G the green component, and B the blue component, each recorded with 8 bits, i.e., A=8, R=8, G=8, B=8, so that one pixel occupies 8+8+8+8=32 bits.
For example, because different terminal devices store the image data of the interface displayed by their display devices in different formats, the image data of any interface displayed by the display device of the second terminal device may be acquired in advance, and its storage format determined from that image data. If the storage format is the preset data format, subsequently obtained projected image data can be stored directly; if not, subsequently obtained projected image data in that storage format can first be converted into projected image data in the preset data format and then stored. The specific conversion rules may be those provided in the prior art or custom rules defined by the present invention, which is not specifically limited in this embodiment.
Then, the projected image data in the preset data format may be stored in the specified data format. Specifically, the projected image data in the preset data format may first be converted into projected image data in the specified data format and then stored. The specific conversion rules may be those provided in the prior art or custom rules defined by the present invention, which is not specifically limited in this embodiment.
In this way, original image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides is acquired, and image conversion is then performed on the original image data according to preset image conversion parameters to obtain the projected image data, so that the projected image data can be stored in a specified data format for video encoding to be performed on the projected image data in the specified data format to obtain a projected video data stream that is sent to the interconnection application. No manual involvement is required, the operation is simple, and the accuracy is high, thereby improving the efficiency and reliability of image processing.
Optionally, in a possible implementation of this embodiment, the obtained projected video data stream may include, but is not limited to:
video data frames carrying at least one of a decode time stamp (Decode Time Stamp, DTS) and a presentation time stamp (Presentation Time Stamp, PTS); or
video data frames carrying neither a DTS nor a PTS; or
video data transmitted over the real-time transport protocol (Real-time Transport Protocol, RTP).
Here, the DTS identifies when the projected video data stream read into memory starts to be fed into the decoder for decoding, and the PTS measures when a decoded video frame is to be displayed.
In this implementation, if the obtained projected video data stream consists of video data frames carrying neither a DTS nor a PTS, the interconnection application can decode and display the projected video data in real time upon receiving it, which effectively improves the real-time performance of image processing.
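The distinction between the DTS (when a frame enters the decoder) and the PTS (when it is displayed) can be illustrated with the classic reordering case; the frame list below is a hypothetical example, not data from the patent:

```python
# Display order is I, B, P, but the B-frame references the later P-frame,
# so it must be decoded after it: decode (DTS) order is I, P, B while
# presentation (PTS) order stays I, B, P.
frames = [("I", 0, 0), ("P", 1, 2), ("B", 2, 1)]  # (type, dts, pts), illustrative

decode_order = [t for t, dts, _ in sorted(frames, key=lambda f: f[1])]
display_order = [t for t, _, pts in sorted(frames, key=lambda f: f[2])]
print(decode_order)   # ['I', 'P', 'B']
print(display_order)  # ['I', 'B', 'P']
```

A stream without these timestamps skips this scheduling entirely, which is why the no-DTS/no-PTS variant lends itself to immediate decode-and-display.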
Optionally, in a possible implementation of this embodiment, after step 103, the interconnection application may further process the projected video data stream using at least one component, so as to output the projected video data stream.
Specifically, the at least one component may include, but is not limited to, a video source component, a video filter component, a hardware decoding component, and a video output component, which is not specifically limited in this embodiment.
The video source component receives the video frames contained in the projected video data and adjusts the transmission speed of the video frames, so as to ensure that the number of video frames transmitted per second satisfies a preset condition.
The video filter component filters the video frames output by the video source component.
The hardware decoding component decodes the video frames output by the video filter component.
The video output component plays the decoded video frames output by the hardware decoding component.
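The four-component chain described above can be sketched as plain functions composed in order. The implementations are toy stand-ins under stated assumptions: truncation stands in for the source component's rate adjustment, dropping `None` stands in for filtering, and string tagging stands in for hardware decoding:

```python
def video_source(frames, max_frames):
    # Rate-control stand-in: cap how many frames pass through per batch.
    return frames[:max_frames]

def video_filter(frames):
    # Filtering stand-in: drop frames the decoder cannot use.
    return [f for f in frames if f is not None]

def hard_decoder(frames):
    # Decoding stand-in: turn encoded payloads into "decoded" frames.
    return [f"decoded({f})" for f in frames]

def video_output(frames):
    # Playback stand-in: hand decoded frames to the display in order.
    return list(frames)

stream = ["f0", None, "f1", "f2", "f3"]
played = video_output(hard_decoder(video_filter(video_source(stream, 4))))
print(played)  # ['decoded(f0)', 'decoded(f1)', 'decoded(f2)']
```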
In this embodiment, projected image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides is acquired, and video encoding is then performed on the projected image data to obtain a projected video data stream, so that the projected video data stream can be sent to the interconnection application for the interconnection application to output the projected video data stream. Because the real-time content displayed on the display device of the terminal device on which the cooperation application resides is projected into the interconnection application for output, the interconnection application no longer outputs its related service content independently. Therefore, the interconnection application only needs to be developed for the simple service it implements, namely the content output service, and does not need to be developed for the complex service content implemented by the cooperation application, which effectively reduces the development time of the interconnection application and thus improves application development efficiency.
In addition, with the technical solutions provided by the present invention, original image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides is acquired, and image conversion is then performed on the original image data according to preset image conversion parameters to obtain the projected image data, so that the projected image data can be stored in a specified data format for video encoding to be performed on the projected image data in the specified data format to obtain a projected video data stream that is sent to the interconnection application. No manual involvement is required, the operation is simple, and the accuracy is high, thereby improving the efficiency and reliability of image processing.
In addition, with the technical solutions provided by the present invention, once the original image data is acquired, the projected image data required for video encoding can be obtained automatically, which effectively improves the efficiency of image processing as well as its degree of automation.
It should be noted that, for brevity of description, the foregoing method embodiments are all expressed as a series of action combinations. However, a person skilled in the art should understand that the present invention is not limited by the described order of actions, because according to the present invention, some steps may be performed in another order or simultaneously. Furthermore, a person skilled in the art should also understand that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
In the foregoing embodiments, the descriptions of the embodiments have respective emphases. For a part not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
FIG. 2 is a schematic structural diagram of an image processing apparatus according to another embodiment of the present invention, as shown in FIG. 2. The image processing apparatus of this embodiment may include an acquiring unit 21, an encoding unit 22, and a transmitting unit 23. The acquiring unit 21 is configured to acquire projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides; the encoding unit 22 is configured to perform video encoding on the projected image data to obtain a projected video data stream; and the transmitting unit 23 is configured to send the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
It should be noted that the image processing apparatus of this embodiment may be the application located on the second terminal device, i.e., the cooperation application corresponding to the interconnection application, or may be a functional unit such as a plug-in or a software development kit (Software Development Kit, SDK) set in the application located on the second terminal device (i.e., the cooperation application corresponding to the interconnection application), which is not specifically limited in this embodiment.
It can be understood that the application may be a native application (nativeApp) installed on the second terminal device, or a web application (webApp) of a browser on the second terminal device, which is not specifically limited in this embodiment.
Optionally, in a possible implementation of this embodiment, the first terminal device is preferably an in-vehicle terminal device, and the second terminal device is preferably a user terminal device.
Conversely, the first terminal device may also be a user terminal device, and the second terminal device may also be an in-vehicle terminal device. In practical applications, however, the applications installed on a user terminal device can all run on their own, without depending on another connected terminal device, or on a corresponding cooperation application installed on another connected terminal device, in order to be used normally.
Optionally, in a possible implementation of this embodiment, the projected video data stream obtained by the encoding unit 22 may include, but is not limited to:
video data frames carrying at least one of a decode time stamp (Decode Time Stamp, DTS) and a presentation time stamp (Presentation Time Stamp, PTS); or
video data frames carrying neither a DTS nor a PTS; or
video data transmitted over the real-time transport protocol (Real-time Transport Protocol, RTP).
It should be noted that the method in the embodiment corresponding to FIG. 1 may be implemented by the image processing apparatus provided in this embodiment. For detailed descriptions, reference may be made to the related content in the embodiment corresponding to FIG. 1, and details are not described herein again.
In this embodiment, the acquiring unit acquires projected image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides, and the encoding unit then performs video encoding on the projected image data to obtain a projected video data stream, so that the transmitting unit can send the projected video data stream to the interconnection application for the interconnection application to output the projected video data stream. Because the real-time content displayed on the display device of the terminal device on which the cooperation application resides is projected into the interconnection application for output, the interconnection application no longer outputs its related service content independently. Therefore, the interconnection application only needs to be developed for the simple service it implements, namely the content output service, and does not need to be developed for the complex service content implemented by the cooperation application, which effectively reduces the development time of the interconnection application and thus improves application development efficiency.
In addition, with the technical solutions provided by the present invention, original image data displayed in the projectable area on the display device of the second terminal device connected to the first terminal device on which the interconnection application resides is acquired, and image conversion is then performed on the original image data according to preset image conversion parameters to obtain the projected image data, so that the projected image data can be stored in a specified data format for video encoding to be performed on the projected image data in the specified data format to obtain a projected video data stream that is sent to the interconnection application. No manual involvement is required, the operation is simple, and the accuracy is high, thereby improving the efficiency and reliability of image processing.
In addition, with the technical solutions provided by the present invention, once the original image data is acquired, the projected image data required for video encoding can be obtained automatically, which effectively improves the efficiency of image processing as well as its degree of automation.
A person skilled in the art may clearly understand that, for convenience and brevity of description, for the specific working processes of the described system, apparatus, and units, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described herein again.
In the several embodiments provided in the present invention, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the described apparatus embodiments are merely illustrative. For example, the division of the units is merely a logical function division; in actual implementation there may be other division manners: multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, an audio processing engine, a network device, or the like) or a processor (processor) to perform some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the foregoing embodiments are merely intended to describe the technical solutions of the present invention rather than to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that modifications may still be made to the technical solutions described in the foregoing embodiments, or equivalent replacements may be made to some of the technical features therein; such modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (12)

  1. An image processing method, characterized by comprising:
    acquiring projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides;
    performing video encoding on the projected image data to obtain a projected video data stream; and
    sending the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
  2. The method according to claim 1, characterized in that the projected video data stream comprises:
    video data frames carrying at least one of a DTS and a PTS; or
    video data frames carrying neither a DTS nor a PTS; or
    video data transmitted over RTP.
  3. The method according to claim 1 or 2, characterized in that, after the sending the projected video data stream to the interconnection application, the method further comprises:
    processing, by the interconnection application using at least one component, the projected video data stream, so as to output the projected video data stream.
  4. The method according to any one of claims 1 to 3, characterized in that the at least one component comprises a video source component, a video filter component, a hardware decoding component, and a video output component.
  5. The method according to any one of claims 1 to 4, characterized in that
    the first terminal device is an in-vehicle terminal device; and
    the second terminal device is a user terminal device.
  6. The method according to claim 5, characterized in that the operating system of the in-vehicle terminal device is a Linux operating system, a WinCE operating system, a QNX operating system, or an Android operating system.
  7. An image processing apparatus, characterized by comprising:
    an acquiring unit, configured to acquire projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides;
    an encoding unit, configured to perform video encoding on the projected image data to obtain a projected video data stream; and
    a transmitting unit, configured to send the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
  8. The apparatus according to claim 7, characterized in that the projected video data stream comprises:
    video data frames carrying at least one of a DTS and a PTS; or
    video data frames carrying neither a DTS nor a PTS; or
    video data transmitted over RTP.
  9. The apparatus according to claim 7 or 8, characterized in that
    the first terminal device is an in-vehicle terminal device; and
    the second terminal device is a user terminal device.
  10. The apparatus according to claim 9, characterized in that the operating system of the in-vehicle terminal device is a Linux operating system, a WinCE operating system, a QNX operating system, or an Android operating system.
  11. A device, comprising:
    one or more processors;
    a memory; and
    one or more programs, the one or more programs being stored in the memory and, when executed by the one or more processors, performing the following operations:
    acquiring projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides;
    performing video encoding on the projected image data to obtain a projected video data stream; and
    sending the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
  12. A non-volatile computer storage medium, storing one or more programs that, when executed by a device, cause the device to:
    acquire projected image data displayed in a projectable area on a display device of a second terminal device connected to a first terminal device on which an interconnection application resides;
    perform video encoding on the projected image data to obtain a projected video data stream; and
    send the projected video data stream to the interconnection application, for the interconnection application to output the projected video data stream.
PCT/CN2016/088100 2015-12-25 2016-07-01 Image processing method, apparatus, device, and non-volatile computer storage medium WO2017107424A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510994009.3A CN105635774A (zh) 2015-12-25 2015-12-25 Image processing method and apparatus
CN201510994009.3 2015-12-25

Publications (1)

Publication Number Publication Date
WO2017107424A1 true WO2017107424A1 (zh) 2017-06-29

Family

ID=56050198

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/088100 WO2017107424A1 (zh) 2015-12-25 2016-07-01 Image processing method, apparatus, device, and non-volatile computer storage medium

Country Status (2)

Country Link
CN (1) CN105635774A (zh)
WO (1) WO2017107424A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112287263A (zh) * 2020-10-30 2021-01-29 安徽鸿程光电有限公司 Web page display method, apparatus, and system, terminal, and computer-readable storage medium

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
WO2018070763A1 (ko) * 2016-10-14 2018-04-19 삼성전자 주식회사 Method and apparatus for transmitting content
CN111107404A (zh) * 2019-12-27 2020-05-05 珠海全志科技股份有限公司 Method, system, and storage medium for fast playback of interface video between a car and a mobile terminal
CN111225259B (zh) * 2020-01-19 2021-09-14 重庆众鸿科技有限公司 In-vehicle interconnection video display acceleration apparatus

Citations (5)

Publication number Priority date Publication date Assignee Title
WO2005084032A1 (en) * 2004-02-20 2005-09-09 Koninklijke Philips Electronics N.V. Method of video decoding
CN101155299A (zh) * 2006-09-29 2008-04-02 明基电通股份有限公司 Image data updating method and playback system using the same
CN103197910A (zh) * 2013-04-17 2013-07-10 东软集团股份有限公司 Image updating method and apparatus
CN104333762A (zh) * 2014-11-24 2015-02-04 成都瑞博慧窗信息技术有限公司 Video decoding method
CN105611357A (zh) * 2015-12-25 2016-05-25 百度在线网络技术(北京)有限公司 Image processing method and apparatus

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
KR101959347B1 (ko) * 2012-05-25 2019-03-18 삼성전자주식회사 Multi-display method using a plurality of communication terminals, machine-readable storage medium, and communication terminal
CN104156258B (zh) * 2013-05-14 2018-07-06 腾讯科技(深圳)有限公司 Shared operation method for an application, and related device and system
CN104980326A (zh) * 2014-04-03 2015-10-14 联想移动通信软件(武汉)有限公司 Method and apparatus for sharing application content between terminal devices
CN105094732B (zh) * 2015-06-29 2018-07-31 小米科技有限责任公司 Screen display method and apparatus


Cited By (2)

Publication number Priority date Publication date Assignee Title
CN112287263A (zh) * 2020-10-30 2021-01-29 安徽鸿程光电有限公司 Web page display method, apparatus, and system, terminal, and computer-readable storage medium
CN112287263B (zh) * 2020-10-30 2024-04-02 安徽鸿程光电有限公司 Web page display method, apparatus, and system, terminal, and computer-readable storage medium

Also Published As

Publication number Publication date
CN105635774A (zh) 2016-06-01

Similar Documents

Publication Publication Date Title
WO2017107426A1 (zh) Image processing method, apparatus, device, and non-volatile computer storage medium
KR102221023B1 (ko) Electronic device and method for processing an image
WO2017107424A1 (zh) Image processing method, apparatus, device, and non-volatile computer storage medium
EP3039530B1 (en) Method and system for presenting content
US8087053B2 (en) System and method for transmitting an animated figure
CN108924538B (zh) Screen extension method for an AR device
WO2017101355A1 (zh) Image processing method and apparatus
WO2020063246A1 (zh) Point cloud encoding and decoding method and codec
CN105450965B (zh) Video conversion method, apparatus, and system
CN107357585B (zh) Video acquisition method and apparatus, video device, and storage medium
CN108628563A (zh) Display device, display method, and storage medium
CN114786040B (zh) Data communication method and system, electronic device, and storage medium
CN103929640B (zh) Techniques for managing video streaming
CN106605411A (zh) Streaming video data in the graphics domain
TW201346840A (zh) Image registration display method, apparatus, and computer program product
JP2012522285A (ja) System and format for encoding data and three-dimensional rendering
JP2023538825A (ja) Method, apparatus, device, and storage medium for converting a picture into a video
WO2023035973A1 (zh) Video processing method, apparatus, device, and medium
WO2023273905A1 (zh) Information screen-sharing method, transmitting end, receiving end, and computer-readable storage medium
CN114390307A (zh) Image quality enhancement method, apparatus, terminal, and readable storage medium
CN105824658B (zh) Camera module starting method for an electronic device, and electronic device
CN115668273A (zh) Electronic apparatus, control method therefor, and electronic system
KR20220036061A (ko) Electronic apparatus, control method therefor, and electronic system
CN102595162A (zh) Image processing device, image processing method, and program
CN112788193A (zh) Image transmission method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16877237

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16877237

Country of ref document: EP

Kind code of ref document: A1