CN116055789B - Live broadcast picture amplifying method, system, equipment and medium based on android system - Google Patents

Info

Publication number: CN116055789B
Application number: CN202310294528.3A
Authority: CN (China)
Other versions: CN116055789A (Chinese)
Inventors: 程文波, 刘震, 李龙华
Assignee (original and current): Hangzhou Xingxi Technology Co., Ltd.
Legal status: Active (granted)
Classifications

    • H04N21/4312 — Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/2187 — Live feed
    • H04N21/4318 — Content or additional data rendering by altering the content in the rendering process, e.g. blanking, blurring or masking an image region
    • H04N21/4424 — Monitoring of the internal components or processes of the client device, e.g. CPU or memory load
    • Y02D30/70 — Reducing energy consumption in wireless communication networks


Abstract

The application relates to a method, system, device, and medium for enlarging a live broadcast picture based on the Android system. The method includes: starting a first process and registering the surface texture of the live application with that process; monitoring the picture data of the live application through the surface texture controlled by the first process, and generating a texture ID for the picture data; in response to an enlargement instruction for a preset picture, transmitting the texture ID of the preset picture's data to a second process, which performs the enlargement; and transmitting the enlarged picture data back to the first process, which passes it on to the live application. The application solves the problems that a designated local region of a live video cannot be enlarged in real time and that the scope of application is small; it realizes local enlargement of the live picture at the system layer, where it is effective for all live applications on the system.

Description

Live broadcast picture enlarging method, system, device, and medium based on the Android system
Technical Field
The application relates to the field of image data processing, and in particular to a method, system, device, and medium for enlarging a live broadcast picture based on the Android system.
Background
At present, a live broadcast generally requires several cameras: one for the main view and another for local close-ups, with a director's console switching between them so that viewers can see specific details of the live video. Although this approach shows the details, it is costly and complex to operate. To help users view details in live video, one related approach is: during playback, the mobile terminal receives a request from the user to switch the playing mode to full screen; after receiving the request, the video is played full screen. In implementing the present application, the applicant found that this technique has at least the following problems: because the screen of an Android terminal is usually small, playing live video full screen still cannot enlarge a local picture of a designated area in real time, and users with poor eyesight or a long viewing distance cannot clearly see small details in a local area, which degrades the experience of watching live video; furthermore, this approach cannot support third-party applications and relies on dedicated playback software.
At present, no effective solution has been proposed in the related art for the problem that a designated local picture of a live video cannot be enlarged in real time and that the scope of application is small.
Disclosure of Invention
Embodiments of the application provide a live broadcast picture enlarging method, system, device, and medium based on the Android system, which at least solve the problems in the related art that a designated local picture of a live video cannot be enlarged in real time and that the scope of application is small.
In a first aspect, an embodiment of the present application provides a live broadcast picture enlarging method based on the Android system, the method comprising:
starting a first process, and registering the surface texture of the live application with the first process;
controlling the surface texture through the first process to monitor the picture data of the live application, and generating a texture ID for the picture data;
in response to an enlargement instruction for a preset picture, transmitting the texture ID of the preset picture's data to a second process, and performing the enlargement of the preset picture through the second process;
and transmitting the enlarged picture data to the first process, and passing the picture data to the live application through the first process.
In some of these embodiments, registering the surface texture of the live application with the first process includes:
when the first process detects that the live application has opened a camera preview picture, setting the surface texture of the camera preview picture and registering that surface texture with the first process.
In some of these embodiments, after monitoring the picture data of the live application through the surface texture controlled by the first process and generating the texture ID of the picture data, the method includes:
calling the system HAL service through the Android camera service to obtain the camera's picture data, and storing the picture data in the surface texture.
In some embodiments, responding to the enlargement instruction for the preset picture includes:
obtaining a preset enlargement point within the preset picture;
and generating a preview frame for local enlargement centered on the preset enlargement point, where the size and shape of the preview frame can be customized, the shape including a circular frame, a rectangular frame, and a star-shaped frame.
In some embodiments, performing the enlargement of the preset picture through the second process includes:
obtaining, through the second process, the picture data corresponding to the texture ID of the preset picture,
and locally enlarging the picture data based on the preview frame.
In some of these embodiments, locally enlarging the picture data based on the preview frame includes:
sampling the picture data within the preview frame through a fragment shader to obtain a first texture region of the picture data;
and mapping the first texture region to a second texture region, where the second texture region is larger than the first texture region.
In some embodiments, transmitting the enlarged picture data to the first process includes:
transmitting the enlarged picture data to the first process through Android inter-process communication, where Android inter-process communication includes Binder communication, socket communication, shared-memory communication, message-queue communication, and semaphore communication.
In a second aspect, an embodiment of the application provides a live broadcast picture enlarging system based on the Android system, comprising a background service module, a video stream acquisition module, an enlargement management application module, and a process communication module;
the background service module is used to start a first process and register the surface texture of the live application with the first process;
the video stream acquisition module is used to monitor the picture data of the live application through the surface texture controlled by the first process, and to generate a texture ID for the picture data;
the enlargement management application module is used to respond to an enlargement instruction for a preset picture and to perform the enlargement of the preset picture through a second process;
the process communication module is used to transmit the texture ID of the preset picture's data to the second process, to transmit the enlarged picture data to the first process, and to pass the picture data to the live application through the first process.
In a third aspect, an embodiment of the present application provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing a method according to any one of the first aspects when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having stored thereon a computer program which when executed by a processor implements a method as described in the first aspect above.
Compared with the related art, the live broadcast picture enlarging method, system, device, and medium based on the Android system provided by embodiments of the application start a first process and register the surface texture of the live application with it; monitor the picture data of the live application through the surface texture controlled by the first process, and generate a texture ID for the picture data; in response to an enlargement instruction for a preset picture, transmit the texture ID of the preset picture's data to a second process, which performs the enlargement; and transmit the enlarged picture data to the first process, which passes it to the live application. The method solves the problems that a designated local picture of a live video cannot be enlarged in real time and that the scope of application is small, and realizes local enlargement of the live picture at the system layer, unlike the related art, where local enlargement is implemented at the application layer. Local enlargement realized at the system layer is effective for all live applications on the system, whereas local enlargement realized inside a single live application, as in the related art, is effective only for that application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
fig. 1 is a step flowchart of a live view zooming method based on an android system according to an embodiment of the present application;
fig. 2 is an overall flowchart of a live view zooming method based on an android system according to an embodiment of the present application;
fig. 3 is a partially enlarged schematic illustration of a live view according to an embodiment of the present application;
FIG. 4 is a process communication schematic according to an embodiment of the application;
fig. 5 is a block diagram of a live view screen enlarging system based on an android system according to an embodiment of the present application;
fig. 6 is a schematic diagram of an internal structure of an electronic device according to an embodiment of the present application.
Reference numerals in the drawings: 51. background service module; 52. video stream acquisition module; 53. enlargement management application module; 54. process communication module.
Detailed Description
The present application will be described and illustrated with reference to the accompanying drawings and examples in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. All other embodiments, which can be made by a person of ordinary skill in the art based on the embodiments provided by the present application without making any inventive effort, are intended to fall within the scope of the present application.
It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and those of ordinary skill in the art can apply the present application to other similar situations according to these drawings without inventive effort. Moreover, it should be appreciated that while such a development effort might be complex and lengthy, it would nevertheless be a routine undertaking of design, fabrication, or manufacture for those of ordinary skill having the benefit of this disclosure.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by those of ordinary skill in the art that the described embodiments of the application can be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs. The terms "a," "an," "the," and similar referents in the context of the application are not to be construed as limiting the quantity, but rather as singular or plural. The terms "comprising," "including," "having," and any variations thereof, are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to only those steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in connection with the present application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The term "plurality" as used herein means two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., "a and/or B" may mean: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. The terms "first," "second," "third," and the like, as used herein, are merely distinguishing between similar objects and not representing a particular ordering of objects.
Example 1
An embodiment of the application provides a live broadcast picture enlarging method based on the Android system. Fig. 1 is a flowchart of the steps of the method according to an embodiment of the application, and fig. 2 is an overall flowchart of the method. As shown in figs. 1 and 2, the method includes the following steps:
step S102, starting a first process, and registering the surface texture of the live application program to the first process;
specifically, as shown in fig. 2, the live application program is monitored through a first process (a background control service), when the first process monitors that the live application program opens a camera preview screen, a Surface Texture (Surface Texture) of the camera preview screen is set, and the first process invokes the android system camera service to determine whether the Surface Texture in the system is the Surface Texture of the live application program, if so, the Surface Texture is registered to the first process.
It should be noted that, before the live application program performs the live operation, the camera preview screen needs to be opened to perform relevant live settings. Therefore, whether the live broadcast application program opens the camera preview picture or not is monitored through the first process, surface textures are set for picture amplification processing, and picture amplification of the live broadcast application program can be achieved at a system layer. That is, all live applications installed in the system can set the surface texture for picture magnification based on the first process listening to the camera preview picture, rather than just validating a single live application.
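The registration step above can be sketched as a simplified plain-Java model. On a real device the registry would hold android.graphics.SurfaceTexture objects inside a system service; here an integer texture ID stands in for the GL texture backing each surface, and all class and method names are illustrative, not taken from the patent:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Simplified model of the first process (background control service):
// it tracks which live apps have opened a camera preview and which
// texture ID was allocated for each app's surface texture.
class BackgroundControlService {
    private final Map<String, Integer> registeredTextures = new ConcurrentHashMap<>();
    private final AtomicInteger nextTextureId = new AtomicInteger(1);

    // Called when the service observes a live app opening its camera preview:
    // allocate a texture ID once and register the app's surface texture under it.
    public int onCameraPreviewOpened(String appPackage) {
        return registeredTextures.computeIfAbsent(
                appPackage, pkg -> nextTextureId.getAndIncrement());
    }

    public boolean isRegistered(String appPackage) {
        return registeredTextures.containsKey(appPackage);
    }
}
```

Because the registry lives in one system-level service rather than inside any single app, the same mechanism covers every live application installed on the device, which is the point the passage above makes.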
Step S104: control the surface texture through the first process to monitor the picture data of the live application, and generate a texture ID for the picture data;
Specifically, as shown in fig. 2, the surface texture is controlled by the first process to monitor the picture data of the live application and to generate the texture ID of the picture data; the system HAL service is called through the Android camera service to obtain the camera's picture data, which is stored in the surface texture.
Step S106: in response to an enlargement instruction for a preset picture, transmit the texture ID of the preset picture's data to a second process, and perform the enlargement of the preset picture through the second process;
It should be noted that the preset picture is a live picture of the live application, and the range of the preset picture is the range of that live picture; the enlargement instruction may be an instruction to enlarge the preset picture at a certain moment (an image frame of the live picture) or over a certain period (the video stream of the live picture).
Specifically, a preset enlargement point within the preset picture is obtained, and a preview frame for local enlargement is generated centered on that point. The size and shape of the preview frame can be customized (including a circular, rectangular, or star-shaped frame), and the local enlargement factor within the preview frame can also be customized. As shown in fig. 2, based on the texture ID of the preset picture, the picture data corresponding to the texture ID is acquired by the second process (the enlargement management application), and the picture data is locally enlarged based on the preview frame.
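As a concrete illustration of the preview frame described above, the following hypothetical sketch models the circular variant: a frame centered on the preset enlargement point, with a customizable radius and enlargement factor. The class and field names are assumptions for illustration, not the patent's implementation:

```java
// Circular preview frame for local enlargement, centered on the
// preset enlargement point; size (radius) and zoom factor are
// user-customizable, matching the customization described in the text.
class CircularPreviewFrame {
    final float centerX, centerY;  // preset enlargement point (texture coords)
    final float radius;            // frame size, customizable
    final float zoom;              // local enlargement factor, customizable

    CircularPreviewFrame(float centerX, float centerY, float radius, float zoom) {
        if (radius <= 0 || zoom < 1) throw new IllegalArgumentException();
        this.centerX = centerX;
        this.centerY = centerY;
        this.radius = radius;
        this.zoom = zoom;
    }

    // A point belongs to the frame when it lies inside the circle.
    boolean contains(float x, float y) {
        float dx = x - centerX, dy = y - centerY;
        return dx * dx + dy * dy <= radius * radius;
    }
}
```

A rectangular or star-shaped frame would replace only the `contains` test; the rest of the enlargement pipeline is unchanged.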
Preferably, the picture data within the preview frame is sampled by a fragment shader to obtain a first texture region of the picture data; the first texture region is then mapped to a second texture region, where the second texture region is larger than the first texture region.
Step S108: transmit the enlarged picture data to the first process, and pass the picture data to the live application through the first process.
Through steps S102 to S108 of this embodiment, the problems that a designated local picture of a live video cannot be enlarged in real time and that the scope of application is small are solved, and local enlargement of the live picture is realized at the system layer, unlike the related art, where local enlargement is implemented at the application layer. Local enlargement realized at the system layer is effective for all live applications on the system, whereas local enlargement realized inside a single live application in the related art is effective only for that application.
In some of these embodiments, step S102 registers the surface texture (SurfaceTexture) of the live application with the first process, where the surface texture is used to capture the picture data of the live application on the Android system.
Specifically, SurfaceTexture is a class introduced in Android 3.0 (API level 11). It captures picture data of a live application as an image stream, which may come from a camera preview or from decoded video. In the embodiment of the application, the first process (the background control service) registers surface textures into third-party live applications (such as Douyin, Douyu, Kuaishou, and other third-party live applications) to capture their picture data on the Android system.
In some embodiments, a preset enlargement point within the preset picture is obtained, and a preview frame for the picture data is generated centered on that point;
Specifically, fig. 3 is a schematic view of local enlargement of a live picture according to an embodiment of the application. As shown in fig. 3, the preset enlargement point within the preset picture is determined according to the position the user clicks, and may also be determined according to enlargement parameters set via a human-machine interface. The enlargement point serves as the origin coordinate, and a circular preview frame for enlargement is generated around it; the shape of the preview frame can be customized and includes, but is not limited to, a circular frame, a rectangular frame, and a star-shaped frame. Enlarging the picture within the preview frame makes it convenient for the user to watch parts of the live picture whose local details would otherwise be unclear.
In some embodiments, the picture data within the preview frame is sampled by a fragment shader to obtain a first texture region of the picture data; the first texture region is mapped to a second texture region, where the second texture region is larger than the first texture region.
Specifically, the picture data within the preview frame is sampled by a fragment shader to obtain a first texture region of the picture data, and the original texture coordinates of each point in the first texture region are then mapped, through the enlargement processing, onto offset texture coordinates, where the offset texture coordinates are the coordinates of the corresponding points in a second texture region, i.e., the region obtained by enlarging the first texture region.
Preferably, as shown in fig. 3, the circular frame is the preview frame for local enlargement. For a point in the circular frame (the hollow dot), its coordinates are mapped onto the coordinates of the solid dot; that is, the original texture coordinates of all points in the circular frame are mapped onto offset texture coordinates, and the first texture region is enlarged into the second texture region, completing the local enlargement based on the circular frame. The local picture is enlarged efficiently through the open graphics library (OpenGL), so the user can see details in the live video more clearly.
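The coordinate mapping that the fragment shader performs can be sketched on the CPU side as follows. For a fragment inside the circular preview frame, the shader samples the texture closer to the center by the zoom factor, so a small source region (the first texture region) is stretched over the whole frame (the second texture region). The exact formula and names are assumptions; the patent does not give the shader source:

```java
// CPU-side sketch of the per-fragment texture-coordinate remap.
// In GLSL this would run per fragment; here it is a pure function
// so the mapping can be inspected directly.
class MagnifierMapping {
    // Map an on-screen texture coordinate (x, y) to the source
    // coordinate to sample, given the frame center (cx, cy),
    // frame radius, and enlargement factor.
    static float[] sourceCoord(float x, float y,
                               float cx, float cy,
                               float radius, float zoom) {
        float dx = x - cx, dy = y - cy;
        if (dx * dx + dy * dy <= radius * radius) {
            // Inside the preview frame: sample nearer the center,
            // which enlarges the sampled region on screen.
            return new float[] { cx + dx / zoom, cy + dy / zoom };
        }
        return new float[] { x, y }; // outside the frame: unchanged
    }
}
```

With zoom = 2, a point halfway to the frame edge samples a quarter of the way out, so the first texture region has half the linear extent of the second, matching the "second region larger than the first" relation in the text.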
In some of these embodiments, step S108, transmitting the enlarged data to the first process (the background control service), includes:
transmitting the enlarged data to the first process through Android inter-process communication, where Android inter-process communication includes Binder communication, socket communication, shared-memory communication, message-queue communication, and semaphore communication.
Specifically, fig. 4 is a schematic diagram of process communication according to an embodiment of the application. As shown in fig. 4, the live application and the first process (the background control service) communicate with each other, and the surface texture of the live application is registered in and managed by the first process. The first process acquires the corresponding picture data based on the texture ID and transmits it to the second process (the enlargement management application) through process communication; the second process transmits the enlarged data back to the first process through Binder communication, which proceeds as follows: the data is sent into a kernel address space through a first system interface (copy_from_user); a mapping between the kernel address space and the first process's address space is established through a second system interface (mmap); and the first process reads the enlarged data from the kernel address space through that mapping. The first process then transmits the enlarged data to the live application.
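The single-copy handoff described above can be illustrated with a toy plain-Java model: the sender copies data once into a "kernel" buffer (standing in for copy_from_user), and the receiver reads through a view of that same buffer (standing in for mmap) without a second copy. This is only an analogy for the Binder mechanism, not real IPC, and all names are illustrative:

```java
import java.nio.ByteBuffer;

// Toy model of the Binder-style single-copy transfer of enlarged
// picture data from the second process to the first process.
class BinderChannelModel {
    private ByteBuffer kernelSpace; // stands in for the kernel address space

    // Sender side: one copy from the sender's memory into "kernel space"
    // (analogous to copy_from_user).
    void copyFromUser(byte[] enlargedPicture) {
        kernelSpace = ByteBuffer.allocate(enlargedPicture.length);
        kernelSpace.put(enlargedPicture).flip();
    }

    // Receiver side: a read-only view onto the same buffer (analogous to
    // the mmap mapping), so the data is read without another copy.
    ByteBuffer mmap() {
        return kernelSpace.asReadOnlyBuffer();
    }
}
```

The design point is that only one copy happens on the send path; the receiver's "mapping" shares the same backing storage, which is why Binder is cheaper than socket or pipe transports that copy twice.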
It should be noted that the steps illustrated in the above flows or flowcharts may be performed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps may be performed in an order different from the one illustrated here.
Example 2
An embodiment of the application provides a live broadcast picture enlarging system based on the Android system. Fig. 5 is a structural block diagram of the system according to an embodiment of the application. As shown in fig. 5, the system comprises a background service module 51, a video stream acquisition module 52, an enlargement management application module 53, and a process communication module 54;
the background service module 51 is configured to start a first process, and register a surface texture of the live application program to the first process;
a video stream obtaining module 52, configured to monitor the picture data of the live application program by controlling the surface texture through the first process, and generate a texture ID of the picture data;
an amplification management application module 53, configured to execute amplification processing of a preset picture through a second process in response to an amplification instruction for the preset picture;
the process communication module 54 is configured to transmit the texture ID of the picture data of the preset picture to the second process, transmit the amplified picture data back to the first process, and transmit it to the live broadcast application program through the first process.
The background service module 51, the video stream acquisition module 52, the amplification management application module 53 and the process communication module 54 of this embodiment solve the problems that live broadcast video cannot amplify a designated local picture in real time and that existing solutions have a narrow application range: local amplification of the live broadcast picture is realized at the system layer, unlike prior-art solutions, which implement it at the application layer. Local amplification realized at the system layer is effective for all live broadcast applications in the system, whereas local amplification realized by a single live broadcast application in the prior art is effective only for that application.
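As a rough sketch of how the modules divide the work, the first process can be modeled as a registry that maps texture IDs to frame data, and the second process as a consumer that looks a frame up by ID and returns an amplified version. All names and the one-dimensional nearest-neighbour stretch below are illustrative simplifications, not the patent's actual implementation:

```java
import java.util.HashMap;
import java.util.Map;

// "First process" (background service): owns the registered frames and
// hands out texture IDs for them.
class BackgroundService {
    private final Map<Integer, int[]> frames = new HashMap<>();
    private int nextId = 0;

    int registerFrame(int[] pixels) { // monitor picture data, assign a texture ID
        frames.put(nextId, pixels);
        return nextId++;
    }

    int[] lookup(int textureId) {
        return frames.get(textureId);
    }
}

// "Second process" (amplification management application): fetches the
// frame by texture ID and returns an amplified copy.
class MagnifierApp {
    int[] magnify(BackgroundService service, int textureId, int factor) {
        int[] src = service.lookup(textureId);
        int[] dst = new int[src.length * factor];
        for (int i = 0; i < dst.length; i++) {
            dst[i] = src[i / factor]; // nearest-neighbour stretch
        }
        return dst;
    }
}
```

In the real system the two sides live in separate processes and exchange the texture ID and frame data over IPC; here they are plain objects so the ID-based handoff is visible in isolation.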
The above units may be functional units or program units, and may be implemented by software or hardware. Units implemented in hardware may all be located in the same processor, or may be distributed across different processors in any combination.
Example 3
The present embodiment also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
It should be noted that for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments and alternative implementations; they are not repeated herein.
In addition, in combination with the live broadcast picture amplifying method based on the android system in the above embodiments, an embodiment of the application further provides a storage medium. The storage medium has a computer program stored thereon; when executed by a processor, the computer program implements any one of the live broadcast picture amplifying methods based on the android system in the above embodiments.
Example 4
In one embodiment, a computer device is provided, which may be a terminal. The computer device includes a processor, a memory, a network interface, a display screen and an input device connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. When executed by the processor, the computer program realizes the live broadcast picture amplifying method based on the android system. The display screen of the computer device may be a liquid crystal display screen or an electronic ink display screen; the input device of the computer device may be a touch layer covering the display screen, keys, a track ball or a touch pad arranged on the housing of the computer device, or an external keyboard, touch pad or mouse.
In one embodiment, fig. 6 is a schematic diagram of the internal structure of an electronic device according to an embodiment of the present application. As shown in fig. 6, an electronic device, which may be a server, is provided. The electronic device includes a processor, a network interface, an internal memory and a non-volatile memory connected by an internal bus, where the non-volatile memory stores an operating system, a computer program and a database. The processor provides computing and control capability; the network interface communicates with an external terminal through a network connection; the internal memory provides an environment for the operation of the operating system and the computer program; the database stores data; and when the computer program is executed by the processor, the live broadcast picture amplifying method based on the android system is realized.
It will be appreciated by those skilled in the art that the structure shown in fig. 6 is merely a block diagram of a portion of the structure associated with the present application and does not limit the electronic device to which the present application is applied; a particular electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium which, when executed, may comprise the steps of the method embodiments described above. Any reference to memory, storage, database or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include Read-Only Memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM) or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM) and Direct Rambus Dynamic RAM (DRDRAM).
It should be understood by those skilled in the art that the technical features of the above embodiments may be combined in any manner. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
The above examples illustrate only a few embodiments of the application; they are described in detail but are not to be construed as limiting the scope of the application. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the application, all of which fall within the scope of the application. Accordingly, the scope of protection of the present application shall be determined by the appended claims.

Claims (9)

1. A live broadcast picture amplifying method based on the android system, characterized by comprising the following steps:
starting a first process, setting surface textures of a camera preview picture and registering the surface textures to the first process when the first process monitors that a live broadcast application program opens the camera preview picture;
the first process controls the surface texture to monitor the picture data of the live broadcast application program and generates a texture ID of the picture data;
transmitting texture IDs of picture data of a preset picture to an amplification management application in response to an amplification instruction of the preset picture, and executing amplification processing of the preset picture through the amplification management application;
and transmitting the amplified picture data to the first process, and transmitting the picture data to the live broadcast application program through the first process.
2. The method according to claim 1, wherein after the first process controls the surface texture to monitor the picture data of the live broadcast application program and generates the texture ID of the picture data, the method further comprises:
and calling a system HAL service through an android system camera service to acquire picture data of a camera, and storing the picture data in the surface texture.
3. The method of claim 1, wherein responding to the zoom-in instruction for the preset screen comprises:
acquiring a preset amplifying point in a preset picture;
and generating a preview frame for local amplification by taking the preset amplification point as the central point, wherein the size and the shape of the preview frame can be custom-set, and the shape of the preview frame comprises a round frame, a rectangular frame and a star frame.
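Centering a fixed-size preview frame on the amplification point reduces to a small clamping computation; below is a minimal sketch for a rectangular frame in pixel coordinates (the class and parameter names are hypothetical, chosen for illustration):

```java
// Given an amplification point and a preview-frame size, derive the
// frame's top-left corner, clamped so the frame stays inside the picture.
class PreviewBox {
    static int[] topLeft(int centerX, int centerY, int boxW, int boxH,
                         int frameW, int frameH) {
        // Center the box on the point, then clamp to the picture bounds.
        int x = Math.max(0, Math.min(centerX - boxW / 2, frameW - boxW));
        int y = Math.max(0, Math.min(centerY - boxH / 2, frameH - boxH));
        return new int[]{x, y};
    }
}
```

The clamping matters near the picture edges: without it, a point close to a border would place part of the preview frame outside the frame data being sampled.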
4. The method according to claim 3, wherein performing, by the magnification management application, magnification processing of the preset screen includes:
acquiring picture data corresponding to the texture ID through the magnification management application based on the texture ID of the picture data of the preset picture;
and carrying out local amplification on the picture data based on the preview frame.
5. The method of claim 4, wherein locally zooming in on the picture data based on the preview box comprises:
sampling the picture data in the preview frame through a fragment shader to obtain a first texture region of the picture data;
mapping the first texture region to a second texture region, wherein the second texture region is larger than the first texture region.
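Mapping a small first texture region onto a larger second region is, at its simplest, a nearest-neighbour stretch. The CPU-side sketch below mirrors what the fragment shader's sampling achieves on the GPU; it is a simplification, since a real shader samples normalized texture coordinates, often with linear filtering:

```java
// Stretch the source region (first texture region) into a destination
// region `factor` times larger in each dimension (second texture region)
// by nearest-neighbour lookup.
class RegionScaler {
    static int[][] scale(int[][] src, int factor) {
        int h = src.length, w = src[0].length;
        int[][] dst = new int[h * factor][w * factor];
        for (int y = 0; y < h * factor; y++) {
            for (int x = 0; x < w * factor; x++) {
                // Each destination pixel reads the nearest source pixel.
                dst[y][x] = src[y / factor][x / factor];
            }
        }
        return dst;
    }
}
```
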
6. The method of claim 1, wherein transferring the enlarged picture data to the first process comprises:
and transmitting the amplified picture data to the first process through android inter-process communication, wherein the android inter-process communication comprises Binder communication, socket communication, shared memory communication, message queue communication and semaphore communication.
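Of the listed mechanisms, the socket option can be sketched with a plain loopback socket: one thread plays the second process sending the amplified frame, and the main thread plays the first process receiving it. The `java.net` loopback socket is a stand-in here; on Android a `LocalSocket` over a Unix domain socket would be more typical:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.net.ServerSocket;
import java.net.Socket;

// "Second process" connects over loopback and writes the amplified frame;
// "first process" accepts the connection and reads the bytes back.
class SocketIpcSketch {
    static byte[] sendToFirstProcess(byte[] amplifiedFrame) {
        try (ServerSocket firstProcess = new ServerSocket(0)) {
            Thread secondProcess = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", firstProcess.getLocalPort())) {
                    s.getOutputStream().write(amplifiedFrame);
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            secondProcess.start();
            try (Socket conn = firstProcess.accept()) {
                // Read exactly the frame length sent by the other side.
                return conn.getInputStream().readNBytes(amplifiedFrame.length);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```
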
7. The live broadcast picture amplifying system based on the android system is characterized by comprising a background service module, a video stream acquisition module, an amplifying management application module and a process communication module;
the background service module is used for starting a first process, setting the surface texture of a camera preview picture and registering the surface texture to the first process when the first process monitors that a live broadcast application program opens the camera preview picture;
the video stream obtaining module is used for controlling the surface texture through the first process to monitor the picture data of the live broadcast application program and generating a texture ID of the picture data;
the amplifying management application module is used for responding to an amplifying instruction of a preset picture and executing amplifying processing of the preset picture through the amplifying management application;
the process communication module is used for transmitting the texture ID of the picture data of the preset picture to an amplification management application, and transmitting the picture data after the amplification processing to the first process, and transmitting the picture data to the live broadcast application program through the first process.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 6 when executing the computer program.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any one of claims 1 to 6.
CN202310294528.3A 2023-03-24 2023-03-24 Live broadcast picture amplifying method, system, equipment and medium based on android system Active CN116055789B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310294528.3A CN116055789B (en) 2023-03-24 2023-03-24 Live broadcast picture amplifying method, system, equipment and medium based on android system

Publications (2)

Publication Number Publication Date
CN116055789A CN116055789A (en) 2023-05-02
CN116055789B true CN116055789B (en) 2023-08-11

Family

ID=86116662


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6559846B1 (en) * 2000-07-07 2003-05-06 Microsoft Corporation System and process for viewing panoramic video
WO2007105979A1 (en) * 2006-03-14 2007-09-20 Siemens Aktiengesellschaft Handling a request in an automation system
CN106792092A (en) * 2016-12-19 2017-05-31 广州虎牙信息科技有限公司 Live video flow point mirror display control method and its corresponding device
CN110784704A (en) * 2019-11-11 2020-02-11 四川航天神坤科技有限公司 Display method and device of monitoring video and electronic equipment
CN111010587A (en) * 2019-12-24 2020-04-14 网易(杭州)网络有限公司 Live broadcast control method, device and system
CN113791888A (en) * 2021-11-17 2021-12-14 北京鲸鲮信息系统技术有限公司 Linux application process management method and device
CN114040251A (en) * 2021-09-17 2022-02-11 北京旷视科技有限公司 Audio and video playing method, system, storage medium and computer program product
CN115168069A (en) * 2022-06-16 2022-10-11 杭州马兰头医学科技有限公司 Method, system, device and medium for magnifying glass to penetrate window based on Hook
CN115567754A (en) * 2022-09-29 2023-01-03 深圳市锐明技术股份有限公司 Video playing method, device, equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application of Streaming Media Technology in Distance Teaching; Wu Chengjia; Journal of Xiamen Radio & Television University (Issue 01); full text *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant