CN108111914B - Video playing area identification method and device and media playing equipment - Google Patents
- Publication number: CN108111914B
- Application number: CN201611050873.9A
- Authority: CN (China)
- Prior art keywords: video playing, target pixel, pixel lines, determining, pixels
- Prior art date: 2016-11-24
- Legal status: Active (assumed status, not a legal conclusion)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4782—Web browsing, e.g. WebTV
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/443—OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
- H04N21/4438—Window management, e.g. event handling following interaction with the user interface
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/485—End-user interface for client configuration
- H04N21/4858—End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a video playing area identification method, a video playing area identification device, and a media playing device. The method comprises the following steps: intercepting an image of a video playing page; extracting color feature data of each pixel in the image; counting, on each pixel line of the image, the number of pixels that have the same color feature data; selecting, as target pixel lines, two pixel lines on which the number of pixels with the same color feature data exceeds a preset value; and determining a video playing area in the video playing page according to the two target pixel lines. The method can accurately determine the specific playing area used when the multimedia playing device plays a video, requires little computation, determines the area quickly, and facilitates subsequent video analysis.
Description
Technical Field
The invention relates to the technical field of video image processing, and in particular to a video playing area identification method, a video playing area identification device, and a media playing device.
Background
When a video is played on a multimedia playing device, the played video often needs to be analyzed, for example to determine and analyze the first frame, the frame rate, the playing speed, and the like. Such analysis needs to capture the video content played in the video playing area in order to obtain an accurate result. The inventors found that the prior art provides no method for accurately determining the video playing area, which brings inconvenience to video analysis.
Disclosure of Invention
In view of this, the present invention provides a video playing area identification method, which can accurately determine a video playing area on a multimedia playing device.
The technical solution provided by the invention is as follows:
The video playing area identification method provided by the embodiment of the invention is applied to media playing equipment and comprises the following steps:
intercepting an image of a video playing page;
extracting color feature data of each pixel in the image;
counting, on each pixel line of the image, the number of pixels that have the same color feature data;
selecting, as target pixel lines, two pixel lines on which the number of pixels with the same color feature data exceeds a preset value;
and determining a video playing area in the video playing page according to the two target pixel lines.
The video playing area recognition device provided by the embodiment of the invention is applied to media playing equipment, and comprises:
the image intercepting module is used for intercepting images of the video playing page;
the color feature extraction module is used for extracting color feature data of each pixel in the image;
the calculation module is used for counting, on each pixel line of the image, the number of pixels with the same color feature data;
the target pixel line determining module is used for selecting, as target pixel lines, two pixel lines on which the number of pixels with the same color feature data exceeds a preset value;
and the video playing area determining module is used for determining a video playing area in the video playing page according to the two target pixel lines.
The media playing device provided by the embodiment of the invention comprises:
a memory;
a processor; and
a video playing area identification device comprising one or more modules stored in the memory and executed by the processor, the video playing area identification device comprising:
the image intercepting module is used for intercepting images of the video playing page;
the color feature extraction module is used for extracting color feature data of each pixel in the image;
the calculation module is used for counting, on each pixel line of the image, the number of pixels with the same color feature data;
the target pixel line determining module is used for selecting, as target pixel lines, two pixel lines on which the number of pixels with the same color feature data exceeds a preset value;
and the video playing area determining module is used for determining a video playing area in the video playing page according to the two target pixel lines.
Compared with the prior art, in the video playing area identification method, the video playing area identification device and the media playing device provided by the embodiments of the invention, the target pixel lines are determined according to the color features of the pixels on the pixel lines of the intercepted image of the video playing page, and the position of the specific video playing area is determined from the target pixel lines. The specific video playing area of the multimedia playing device can therefore be determined accurately and quickly with little computation, which facilitates subsequent video analysis.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed for the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a block diagram of a media playing device according to an embodiment of the present invention.
Fig. 2 is a flowchart of a video playing area identification method according to an embodiment of the present invention.
Fig. 3a is a sub-flowchart of step S104 in Fig. 2.
Fig. 3b is a schematic diagram of the method shown in Fig. 3a.
Fig. 4a is another sub-flowchart of step S104.
Fig. 4b is a schematic diagram of the method shown in Fig. 4a.
Fig. 5 is a sub-flowchart of step S105 in Fig. 2.
Fig. 6 is another sub-flowchart of step S105 in Fig. 2.
Fig. 7a is another sub-flowchart of step S105 in Fig. 2.
Fig. 7b is an illustrative diagram of the method shown in Fig. 7a.
Fig. 8 is a schematic functional block diagram of a video playing area recognition apparatus according to an embodiment of the present invention.
Reference numerals: 100-media playing device; 110-video playing area identification device; 111-memory; 112-memory controller; 113-processor; 114-peripheral interface; 115-input/output unit; 116-display unit; 1101-image intercepting module; 1102-color feature extraction module; 1103-calculation module; 1104-target pixel line determination module; 1105-video playing area determination module.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
When the multimedia playing device plays a video, there is often a need to analyze the played video, for example to identify the first frame of the video or to determine the playing speed, and such analysis requires that the specific video playing area be determined first.
Referring to fig. 1, fig. 1 is a block diagram of a media playing device 100 according to an embodiment of the present invention. The media playing device 100 includes a video playing area identification device 110, a memory 111, a memory controller 112, a processor 113, a peripheral interface 114, an input/output unit 115, and a display unit 116.
In this embodiment, the media playing device 100 may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a personal digital assistant, a mobile internet device, or any other device that can be used to play media content such as video.
The memory 111, the memory controller 112, the processor 113, the peripheral interface 114, the input/output unit 115, and the display unit 116 are electrically connected to each other, directly or indirectly, to implement data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The video playing area identification device 110 includes at least one software functional module that can be stored in the memory 111 in the form of software or firmware, or built into the operating system (OS) of the media playing device 100. The memory 111 stores application programs downloaded and installed by the media playing device 100. The processor 113 is used for executing executable modules stored in the memory 111, such as the software functional modules and computer programs included in the video playing area identification device 110.
The memory 111 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory 111 is used for storing a program, and the processor 113 executes the program after receiving an execution instruction. Access to the memory 111 by the processor 113 and possibly other components may be under the control of the memory controller 112.
The processor 113 may be an integrated circuit chip having signal processing capabilities. The processor 113 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The processor can implement or perform the various methods, steps and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The peripheral interface 114 couples various input/output devices to the processor 113 and to the memory 111. In some embodiments, the peripheral interface 114, the processor 113, and the memory controller 112 may be implemented in a single chip. In other embodiments, they may be implemented as separate chips.
The input/output unit 115 is used for a user to provide input data, so as to realize interaction between the user and the media playing device 100. The input/output unit 115 may be, but is not limited to, a mouse, a keyboard, and the like.
The display unit 116 provides an interactive interface (e.g., a user operation interface) between the media playback apparatus 100 and a user or is used to display image data. In this embodiment, the display unit 116 may be a liquid crystal display or a touch display. In the case of a touch display, the display can be a capacitive touch screen or a resistive touch screen, which supports single-point and multi-point touch operations. Supporting single-point and multi-point touch operations means that the touch display can sense touch operations generated from one or more positions on the touch display, and the sensed touch operations are sent to the processor 113 for calculation and processing.
Referring to fig. 2, an embodiment of the present application provides a video playing area identification method, which is applied to a multimedia playing device, and a related flow of the method can be implemented by the processor, where the method includes:
step S101, intercepting images of video playing pages.
Specifically, the multimedia playing device 100 may load a player in a browser to play a video, or may start video playing software to play the video. For example, when a mobile terminal such as a mobile phone is used to play a video, a video playing page may be opened in a browser, and a suitable player may be loaded on the video playing page to play the video. The mobile terminal generally plays videos in two modes, namely horizontal-screen playing and vertical-screen playing. In the vertical-screen display mode, the video display area of the mobile terminal is generally a portion of the entire display unit 116. The mobile terminal can also receive a user operation and switch the video from vertical-screen playing to horizontal-screen playing. In the case of horizontal-screen display, the video display area will generally fill the entire display unit 116, i.e., the video is displayed full screen.
In this embodiment of the application, the video playing page refers to a display state of the display unit 116 when displaying a video, and the captured image may be an image of the entire display unit 116 or an image of a preset area on the display unit 116. The action of capturing the image can be triggered when the loading of the video playing page is completed, or can be triggered when a screenshot command sent by a user is received.
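As a purely illustrative sketch of step S101 (not part of the patent; Pillow is assumed, and the function name and the screenshot source are hypothetical), the image of the playing page could be obtained roughly as follows:

```python
# Sketch only: obtain an RGB image of the video playing page, either from a saved
# screenshot file (e.g. one pulled from a mobile device) or by grabbing the local
# screen. Pillow is assumed; all names are illustrative.
from PIL import Image, ImageGrab

def intercept_playing_page(screenshot_path=None):
    """Return an RGB image of the video playing page."""
    if screenshot_path is not None:
        img = Image.open(screenshot_path)   # previously captured screenshot
    else:
        img = ImageGrab.grab()              # capture the local display
    return img.convert("RGB")
```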
Step S102, extracting color feature data of each pixel in the image.
After the image of the video playing page is captured, the image may be analyzed to extract color feature data of each pixel in the image, such as the RGB data of each pixel; alternatively, the color feature data of the pixels may be determined according to a color description file stored inside the multimedia playing device 100.
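For illustration only (numpy is assumed; this is not the patent's own code), the per-pixel RGB color feature data can then be read out as an array:

```python
import numpy as np

def extract_color_features(img):
    """Return an H x W x 3 array whose [y, x] entry is the RGB triple of that pixel."""
    return np.asarray(img.convert("RGB"))
```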
Step S103, counting, on each pixel line of the image, the number of pixels with the same color feature data.
Specifically, a plane coordinate system may be established on the display unit 116 of the multimedia playing device 100 to determine the specific position of each pixel on the display unit 116. The plane coordinate system generally uses the lower left corner or the upper left corner of the display unit 116 as the origin of coordinates, the horizontal direction as the abscissa axis, and the vertical direction as the ordinate axis. Other forms of planar coordinate systems may of course be established. The pixel line in the embodiment of the present application may be a line segment where each row of pixels is located in a direction parallel to the abscissa axis. The number of pixels with the same color feature data on each pixel line can be determined by comparing the color feature data between different pixels on each pixel line.
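A minimal sketch of step S103, under the assumption that "the number of pixels with the same color feature data" on a line means the count of the most frequent color on that line (function names are illustrative):

```python
from collections import Counter

def same_color_counts_per_line(rgb):
    """For each pixel line (row), count how many pixels share the most common color."""
    counts = []
    for row in rgb:                                   # row: iterable of RGB triples
        most_common = Counter(map(tuple, row)).most_common(1)
        counts.append(most_common[0][1])
    return counts
```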
Step S104, selecting, as target pixel lines, two pixel lines on which the number of pixels with the same color feature data exceeds a preset value.
In the embodiment of the present application, the number of target pixel lines is two, but the number of pixel lines in which the number of pixels having the same color feature data exceeds a preset value may be more than two. At this time, two pixel lines from among a plurality of such pixel lines need to be selected as target pixel lines.
Specifically, as shown in fig. 3a, step S104 may include step S1041 and step S1042. Step S1041 and step S1042 will be specifically described below.
Step S1041, taking a preset pixel line in the image as a center, sequentially calculating whether the number of pixels with the same color feature data in each pixel line above and below the preset pixel line exceeds the preset value.
In the process of determining the target pixel lines, a preset pixel line may be chosen as the starting point of the calculation. The preset pixel line may be regarded as a visual center, i.e., the center of the user's line of sight, and its position may be determined according to actual conditions; for example, the pixel line at one quarter of the height of the display range of the display unit 116 may be selected as the center, or another suitable pixel line may be selected. After the preset pixel line is determined, whether the number of pixels with the same color feature data in each pixel line exceeds the preset value is calculated upwards and downwards, respectively, taking the preset pixel line as the starting point.
Step S1042, taking, as the target pixel lines, the two pixel lines found above and below the preset pixel line on which the number of pixels with the same color feature data exceeds the preset value.
When calculating towards both sides from the preset pixel line, the number of pixels with the same color feature data on a pixel line at the boundary of the video playing area exceeds the preset value. If the chosen preset pixel line is located between the upper and lower boundaries of the video playing area, two target pixel lines respectively located on either side of the preset pixel line can be determined. Moreover, as long as the preset pixel line is located between the upper and lower boundaries of the video playing area, exactly two target pixel lines meeting the condition will be obtained.
Referring to fig. 3b, the shaded portion in the figure is the video playing area, and line segment A is the chosen preset pixel line; whether each pixel line meets the preset condition is calculated outward from A towards both sides.
In addition, when the target pixel lines are calculated with the preset pixel line as the center, the first pixel line meeting the condition found above the preset pixel line and the first one found below it can be used as the target pixel lines. Therefore, as long as the preset pixel line can be chosen between the upper and lower boundaries of the video playing area, two pixel lines meeting the condition will be obtained and used as the target pixel lines.
In addition, after the preset pixel line is determined in the coordinate system of the display unit 116, the calculation from the preset pixel line towards both sides may first be limited to a preset region centered on the preset pixel line. If two pixel lines meeting the condition are not found within the preset region, the pixel lines outside the boundary of the preset region may then be checked in turn against the preset condition.
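The center-outward search of steps S1041 and S1042 could look roughly like the sketch below, assuming `counts` holds the per-line same-color pixel counts computed above and that row 0 is the top of the image (names are illustrative):

```python
def find_target_lines_from_center(counts, preset_line, preset_value):
    """Scan upward and downward from the preset pixel line and return the first
    line on each side whose same-color pixel count exceeds the preset value."""
    upper = next((r for r in range(preset_line, -1, -1)
                  if counts[r] > preset_value), None)
    lower = next((r for r in range(preset_line + 1, len(counts))
                  if counts[r] > preset_value), None)
    return upper, lower
```

If the preset line lies between the upper and lower boundaries of the playing area, `upper` and `lower` are exactly the two target pixel lines described above.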
In another embodiment, as shown in fig. 4a, step S104 may further include sub-step S1043, sub-step S1044, and sub-step S1045, which are further described below with reference to sub-step S1043, sub-step S1044, and sub-step S1045.
Step S1043, determining whether the number of pixel lines in the image, of which the number of pixels with the same color feature data exceeds a preset value, is greater than 2.
Step S1044, if the number of such pixel lines is greater than 2, calculating the distance between every two pixel lines on which the number of pixels with the same color feature data exceeds the preset value.
Step S1045, selecting, as the target pixel lines, the two pixel lines whose distance equals a preset distance value.
In the process of determining the target pixel lines, the pixel lines on which the number of pixels with the same color feature data exceeds the preset value need to be found first. Since there may be more than two pixel lines satisfying this condition in the video playing page, their number needs to be determined. If the number of pixel lines satisfying the condition is greater than two, the distance between every two of these pixel lines is calculated; since the pixel lines are parallel to each other, each distance can be compared with a preset distance value. The preset distance value may be predetermined according to the display parameters, resolution, and the like of the display unit 116 of the multimedia playing terminal, and it is in fact the distance between the upper and lower boundaries of the video playing area, that is, the length of the left and right boundaries of the video playing area. If the distance between two pixel lines equals the preset distance value, the two pixel lines are the upper and lower boundaries of the video playing area and can be used as the target pixel lines.
Referring to fig. 4b, if three pixel lines B, C and D satisfy the condition, the pairwise distances between them can be calculated to determine the final target pixel lines. If the distance between B and C is found to equal the preset distance value, B and C are determined to be the target pixel lines.
Since there may be several pixel lines on which the number of pixels with the same color feature data exceeds the preset value, there may also be several pairs of pixel lines whose distance equals the preset distance value. Steps S1043 to S1045 may be performed after steps S1041 and S1042, when the target pixel lines cannot be determined by those steps, or they may be performed independently.
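For the distance-based variant of steps S1043 to S1045, a sketch (assuming `candidate_lines` is the list of line indices whose same-color pixel count exceeds the preset value, and that the preset distance equals the known height of the playing area):

```python
from itertools import combinations

def select_target_lines_by_distance(candidate_lines, preset_distance):
    """Return the first pair of candidate lines whose spacing matches the preset distance."""
    for top, bottom in combinations(sorted(candidate_lines), 2):
        if bottom - top == preset_distance:
            return top, bottom
    return None
```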
Step S105, determining a video playing area in the video playing page according to the two target pixel lines.
After the target pixel lines are determined, i.e. the upper and lower boundaries of the video playing area are determined, a specific video playing area can be determined according to the two target pixel lines.
Specifically, as shown in fig. 5, step S105 may include a sub-step S1051, as follows.
Step S1051, a region surrounded by four pixel end points of the two target pixel lines connected end to end is used as the video playing region.
The two target pixel lines are parallel line segments; connecting their end points end to end forms a rectangle, and this rectangle is the video playing area.
In another embodiment, as shown in fig. 6, step S105 may further include:
step S1052, acquiring a width of the image.
Step S1053, determining the heights of the two target pixel lines in the image.
Step S1054, determining a rectangular region between the two target pixel lines according to the height of the target pixel line and the width of the image, and determining the rectangular region as the video playing region.
The positions of the target pixel lines are determined in the coordinate system based on the display unit 116, so the ordinate of each target pixel line can be determined. The area between the target pixel lines contains the video playing area. On many media playing terminals, even when the video playing area is only a part of the whole video playing page, its width is the same as the width of the display unit 116, that is, the width of the video playing area equals the width of the image. In this case, the width of the image gives the width of the video playing area, and the heights of the target pixel lines give the heights of the upper and lower boundaries of the video playing area. Combining the width of the image with these heights, the specific coordinates of the four vertices of the video playing area, and therefore its specific position, can be determined.
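A sketch of steps S1052 to S1054 under the full-width assumption discussed above (names are illustrative; coordinates use the image's pixel grid with the origin at the top left):

```python
def full_width_playing_area(image_width, top_line, bottom_line):
    """Four vertices of the playing area when it spans the whole image width."""
    return [(0, top_line), (image_width - 1, top_line),
            (image_width - 1, bottom_line), (0, bottom_line)]
```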
In another embodiment, as shown in fig. 7a, step S105 may further include:
step S1055, respectively determining end points of pixels having the same color feature data in the two target pixel lines.
Step S1056, a region formed by connecting end points of pixels with the same color feature data in the two target pixel lines end to end is used as the video playing region.
Determining the target pixel lines determines the upper and lower boundaries of the video playing area, but the left and right boundaries still need to be determined to fix the specific position of the video playing area. As previously described, on some mobile terminals the left and right boundaries of the video playing area may coincide with the boundaries of the video playing page, but the video playing area may also be only a small portion of the entire video playing page. In this case, the end points of the pixels with the same color feature data on each target pixel line, which are the end points of a segment of that target pixel line, may be determined; these end points are the points where the left and right boundaries of the video playing area intersect the target pixel line. Once the end points are determined, the intersections of the left and right boundaries with the target pixel lines are known, and the specific position of the video playing area can be determined.
Referring to fig. 7b, after B and C are determined to be the target pixel lines, the segment composed of pixels with the same color feature on target pixel line B is E, and the segment composed of pixels with the same color feature on target pixel line C is F. The area enclosed by connecting E and F end to end may then be used as the video playing area.
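Steps S1055 and S1056 can be sketched as finding, on each target pixel line, the longest run of identically colored pixels and taking its two end points as the intersections with the left and right boundaries. This is only one possible reading of the step; the run-based heuristic and the names below are assumptions:

```python
def same_color_segment(rgb, line_index):
    """Return (left, right) column indices of the longest run of identically
    colored pixels on the given pixel line."""
    row = [tuple(px) for px in rgb[line_index]]
    best_start, best_end = 0, 0
    start = 0
    for col in range(1, len(row) + 1):
        if col == len(row) or row[col] != row[start]:
            if (col - 1) - start > best_end - best_start:
                best_start, best_end = start, col - 1
            start = col
    return best_start, best_end
```

Connecting the end points of the segments found on the two target pixel lines end to end then yields the video playing area, as in step S1056.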
In summary, the target pixel lines are determined according to the color features of the pixels on the pixel lines of the captured image of the video playing page, and the position of the specific video playing area is then determined from the target pixel lines. In this way, the specific playing area used by the multimedia playing device when playing a video can be determined accurately and quickly with little computation, which facilitates subsequent video analysis.
As shown in fig. 8, the video playing area identifying device 110 further provided in the embodiment of the present application includes an image capturing module 1101, a color feature extracting module 1102, a calculating module 1103, a target pixel line determining module 1104, and a video playing area determining module 1105.
An image capture module 1101 configured to capture an image of the video playback page. The method for capturing the image by the image capture module 1101 is described in detail in step S101.
A color feature extraction module 1102, configured to extract color feature data of each pixel in the image. The method for extracting the color feature data by the color feature extraction module 1102 is described in detail in step S102.
A calculating module 1103, configured to count, on each pixel line of the image, the number of pixels with the same color feature data. The counting method used by the calculating module 1103 is described in detail in step S103 above.
And a target pixel line determining module 1104, configured to select two pixel lines with the same color feature data and the number of pixels exceeding a preset value as a target pixel line. The step of determining the target pixel line by the target pixel line determining module 1104 is described above in step S104 and its sub-steps.
A video playing area determining module 1105, configured to determine a video playing area in the video playing page according to the two target pixel lines. The method for determining the video playing area by the video playing area determining module 1105 is described in detail in the above step S105 and its sub-steps.
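To make the cooperation of these modules concrete, the following self-contained sketch mirrors the five modules in a single Python class. It is not the patent's implementation: Pillow and numpy are assumed, all names are hypothetical, the preset pixel line is placed at one quarter of the image height as suggested above, and only the full-width variant of the playing area is produced.

```python
from collections import Counter

import numpy as np
from PIL import Image


class VideoPlayingAreaIdentifier:
    def __init__(self, preset_count, preset_center_ratio=0.25):
        self.preset_count = preset_count                  # threshold on same-color pixels per line
        self.preset_center_ratio = preset_center_ratio    # preset pixel line, as a fraction of image height

    def intercept_image(self, screenshot_path):
        """Image intercepting module: load a screenshot of the video playing page."""
        return np.asarray(Image.open(screenshot_path).convert("RGB"))

    def extract_features(self, rgb):
        """Color feature extraction module: the RGB triple of each pixel."""
        return [[tuple(px) for px in row] for row in rgb]

    def count_same_color(self, features):
        """Calculation module: per pixel line, the count of its most common color."""
        return [Counter(row).most_common(1)[0][1] for row in features]

    def find_target_lines(self, counts):
        """Target pixel line determining module: scan outward from the preset line."""
        center = int(len(counts) * self.preset_center_ratio)
        top = next((r for r in range(center, -1, -1) if counts[r] > self.preset_count), None)
        bottom = next((r for r in range(center + 1, len(counts)) if counts[r] > self.preset_count), None)
        return top, bottom

    def playing_area(self, features, top, bottom):
        """Video playing area determining module: full-width rectangle between the target lines."""
        width = len(features[0])
        return (0, top, width, bottom)                    # (left, top, right, bottom)

    def identify(self, screenshot_path):
        rgb = self.intercept_image(screenshot_path)
        features = self.extract_features(rgb)
        counts = self.count_same_color(features)
        top, bottom = self.find_target_lines(counts)
        if top is None or bottom is None:
            return None
        return self.playing_area(features, top, bottom)
```

For example, `VideoPlayingAreaIdentifier(preset_count=800).identify("playing_page.png")` would return the rectangle of the playing area, or `None` if no boundary lines exceed the threshold (the file name and threshold are placeholders).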
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (12)
1. A video playing area identification method applied to media playing equipment, characterized by comprising the following steps:
intercepting an image of a video playing page;
extracting color feature data of each pixel in the image;
calculating the number of pixels of the image with the same color characteristic data on each pixel line; the pixel line is a line segment where each row of pixels is located in the direction parallel to the abscissa axis of a preset plane coordinate system;
selecting two pixel lines with the same color characteristic data and the number of pixels exceeding a preset value as target pixel lines;
determining a video playing area in the video playing page according to the two target pixel lines;
the step of selecting two pixel lines with the same color feature data and the number of pixels exceeding a preset value as a target pixel line comprises the following steps:
sequentially calculating whether the number of pixels with the same color characteristic data in each pixel line at the upper side and the lower side of the preset pixel line exceeds the preset value or not by taking one preset pixel line in the image as a center;
taking two pixel lines with the same color characteristic data, which are obtained by respectively calculating from the upper side and the lower side of the preset pixel line, exceeding the preset value as the target pixel lines;
the step of selecting two pixel lines with the same color feature data and the number of pixels exceeding a preset value as a target pixel line comprises the following steps:
judging whether the number of pixel lines in the image on which the number of pixels with the same color characteristic data exceeds a preset value is greater than 2;
if the number of such pixel lines is greater than 2, respectively calculating the distance between every two pixel lines on which the number of pixels with the same color characteristic data exceeds the preset value;
selecting two pixel lines corresponding to a distance value which is the same as a preset distance value as the target pixel lines; the preset distance value is the distance between the upper boundary and the lower boundary of the video playing area;
the method further comprises the following steps:
determining a left boundary and a right boundary of a video playing area, so as to determine the video playing area according to the left boundary and the right boundary.
2. The method for identifying video playing areas according to claim 1, wherein the step of determining the video playing areas in the video playing page according to the two selected target pixel lines comprises:
and taking a region formed by the end-to-end connection of four pixel end points of the two target pixel lines as the video playing region.
3. The method for identifying video playing areas according to claim 1, wherein the step of determining the video playing areas in the video playing page according to the two selected target pixel lines comprises:
acquiring the width of the image;
determining the height of the two target pixel lines in the image;
and determining a rectangular area between the two target pixel lines by using the height of the target pixel line and the width of the image, and determining the rectangular area as the video playing area.
4. The method for identifying video playing areas according to claim 1, wherein the step of determining the video playing areas in the video playing page according to the two selected target pixel lines comprises:
respectively determining the end points of pixels with the same color characteristic data in the two target pixel lines;
and taking an area formed by connecting end points of pixels with the same color characteristic data in the two target pixel lines end to end as the video playing area.
5. A video playing area recognition device applied to media playing equipment, characterized by comprising:
the image intercepting module is used for intercepting images of the video playing page;
the color feature extraction module is used for extracting color feature data of each pixel in the image;
the calculation module is used for calculating the number of pixels of the image with the same color characteristic data on each pixel line; the pixel line is a line segment where each row of pixels is located in the direction parallel to the abscissa axis of a preset plane coordinate system;
the target pixel line determining module is used for selecting two pixel lines with the same color characteristic data and the number of pixels exceeding a preset value as a target pixel line;
the video playing area determining module is used for determining a video playing area in the video playing page according to the two target pixel lines;
the method for selecting two pixel lines with the same color characteristic data and the number of pixels exceeding a preset value as the target pixel line by the target pixel line determining module comprises the following steps:
sequentially calculating whether the number of pixels with the same color characteristic data in each pixel line at the upper side and the lower side of the preset pixel line exceeds the preset value or not by taking one preset pixel line in the image as a center;
taking two pixel lines with the same color characteristic data, which are obtained by respectively calculating from the upper side and the lower side of the preset pixel line, exceeding the preset value as the target pixel lines;
the method for selecting two pixel lines with the same color characteristic data and the number of pixels exceeding a preset value as the target pixel line by the target pixel line determining module comprises the following steps:
judging whether the number of pixel lines in the image on which the number of pixels with the same color characteristic data exceeds a preset value is greater than 2;
if the number of such pixel lines is greater than 2, respectively calculating the distance between every two pixel lines on which the number of pixels with the same color characteristic data exceeds the preset value;
selecting two pixel lines corresponding to a distance value which is the same as a preset distance value as the target pixel lines; the preset distance value is the distance between the upper boundary and the lower boundary of the video playing area;
the video playing area recognition device further comprises:
means for determining a left boundary and a right boundary of a video playback area;
the video playing area determining module further determines the video playing area according to the left boundary and the right boundary.
6. The apparatus according to claim 5, wherein the method for determining the video playing area in the video playing page by the video playing area determining module according to the two selected target pixel lines comprises:
and taking a region formed by the end-to-end connection of four pixel end points of the two target pixel lines as the video playing region.
7. The apparatus according to claim 5, wherein the method for determining the video playing area in the video playing page by the video playing area determining module according to the two selected target pixel lines comprises:
acquiring the width of the image;
determining the height of the two target pixel lines in the image;
and determining a rectangular area between the two target pixel lines by using the height of the target pixel line and the width of the image, and determining the rectangular area as the video playing area.
8. The apparatus according to claim 5, wherein the method for determining the video playing area in the video playing page by the video playing area determining module according to the two selected target pixel lines comprises:
respectively determining the end points of pixels with the same color characteristic data in the two target pixel lines;
and taking an area formed by connecting end points of pixels with the same color characteristic data in the two target pixel lines end to end as the video playing area.
9. A media playback apparatus, comprising:
a memory;
a processor; and
a video playing area identification device comprising one or more modules stored in the memory and executed by the processor, the video playing area identification device comprising:
the image intercepting module is used for intercepting images of the video playing page;
the color feature extraction module is used for extracting color feature data of each pixel in the image;
the calculation module is used for calculating the number of pixels of the image with the same color characteristic data on each pixel line; the pixel line is a line segment where each row of pixels is located in the direction parallel to the abscissa axis of a preset plane coordinate system;
the target pixel line determining module is used for selecting two pixel lines with the same color characteristic data and the number of pixels exceeding a preset value as a target pixel line;
the video playing area determining module is used for determining a video playing area in the video playing page according to the two target pixel lines;
the method for selecting two pixel lines with the same color characteristic data and the number of pixels exceeding a preset value as the target pixel line by the target pixel line determining module comprises the following steps:
sequentially calculating whether the number of pixels with the same color characteristic data in each pixel line at the upper side and the lower side of the preset pixel line exceeds the preset value or not by taking one preset pixel line in the image as a center;
taking two pixel lines with the same color characteristic data, which are obtained by respectively calculating from the upper side and the lower side of the preset pixel line, exceeding the preset value as the target pixel lines;
the method for selecting two pixel lines with the same color characteristic data and the number of pixels exceeding a preset value as the target pixel line by the target pixel line determining module comprises the following steps:
judging whether the number of pixel lines in the image on which the number of pixels with the same color characteristic data exceeds a preset value is greater than 2;
if the number of such pixel lines is greater than 2, respectively calculating the distance between every two pixel lines on which the number of pixels with the same color characteristic data exceeds the preset value;
selecting two pixel lines corresponding to a distance value which is the same as a preset distance value as the target pixel lines; the preset distance value is the distance between the upper boundary and the lower boundary of the video playing area;
the media playing device further comprises:
means for determining a left boundary and a right boundary of a video playback area;
the video playing area determining module further determines the video playing area according to the left boundary and the right boundary.
10. The media playing device of claim 9, wherein the method for determining the video playing area in the video playing page according to the selected two target pixel lines by the video playing area determining module comprises:
and taking a region formed by the end-to-end connection of four pixel end points of the two target pixel lines as the video playing region.
11. The media playing device of claim 9, wherein the method for determining the video playing area in the video playing page according to the selected two target pixel lines by the video playing area determining module comprises:
acquiring the width of the image;
determining the height of the two target pixel lines in the image;
and determining a rectangular area between the two target pixel lines by using the height of the target pixel line and the width of the image, and determining the rectangular area as the video playing area.
12. The media playing device of claim 9, wherein the method for determining the video playing area in the video playing page according to the selected two target pixel lines by the video playing area determining module comprises:
respectively determining the end points of pixels with the same color characteristic data in the two target pixel lines;
and taking an area formed by connecting end points of pixels with the same color characteristic data in the two target pixel lines end to end as the video playing area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611050873.9A CN108111914B (en) | 2016-11-24 | 2016-11-24 | Video playing area identification method and device and media playing equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611050873.9A CN108111914B (en) | 2016-11-24 | 2016-11-24 | Video playing area identification method and device and media playing equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108111914A CN108111914A (en) | 2018-06-01 |
CN108111914B true CN108111914B (en) | 2020-11-03 |
Family
ID=62204965
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611050873.9A Active CN108111914B (en) | 2016-11-24 | 2016-11-24 | Video playing area identification method and device and media playing equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108111914B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109286801B (en) * | 2018-09-20 | 2020-04-14 | 深圳市酷开网络科技有限公司 | Image color display method, storage medium and smart television |
CN111131812A (en) * | 2019-12-31 | 2020-05-08 | 北京奇艺世纪科技有限公司 | Broadcast time testing method and device and computer readable storage medium |
CN116506695B (en) * | 2023-06-26 | 2023-09-08 | 北京搜狐新动力信息技术有限公司 | Video stream playing method and device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040008886A1 (en) * | 2002-07-02 | 2004-01-15 | Yuri Boykov | Using graph cuts for editing photographs |
CN101931736A (en) * | 2009-06-18 | 2010-12-29 | 言炬 | Processing method and system of video picture black borders |
CN102005038B (en) * | 2009-08-31 | 2014-10-15 | 鸿富锦精密工业(深圳)有限公司 | Image edge positioning method |
CN102622595A (en) * | 2011-01-28 | 2012-08-01 | 北京千橡网景科技发展有限公司 | Method and equipment used for positioning picture contained in image |
CN102254302B (en) * | 2011-06-07 | 2013-01-02 | 盛乐信息技术(上海)有限公司 | Picture trimming system and method thereof |
CN104574403B (en) * | 2015-01-12 | 2017-09-22 | 飞天诚信科技股份有限公司 | A kind of intelligent method of cutting out |
- 2016-11-24: application CN201611050873.9A filed in CN (now patent CN108111914B, active)
Also Published As
Publication number | Publication date |
---|---|
CN108111914A (en) | 2018-06-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108347657B (en) | Method and device for displaying bullet screen information | |
CN108111914B (en) | Video playing area identification method and device and media playing equipment | |
US8624928B2 (en) | System and method for magnifying a webpage in an electronic device | |
US9841886B2 (en) | Display control apparatus and control method thereof | |
JP5391951B2 (en) | Face detection result analysis system, face detection result analysis apparatus, and computer program | |
JP2021177399A (en) | Information processor, control method, and program | |
US10594940B1 (en) | Reduction of temporal and spatial jitter in high-precision motion quantification systems | |
US9501681B1 (en) | Decoding visual codes | |
JP2016194833A (en) | Commodity display position determination processing method, commodity display position determination processing program, and commodity display position determination processing apparatus | |
EP3408752B1 (en) | Object management and visualization using a computing device | |
JP5776312B2 (en) | Image analysis apparatus, image analysis method, image analysis program, and recording medium | |
KR101982258B1 (en) | Method for detecting object and object detecting apparatus | |
US9019223B2 (en) | Touch input layout configuration | |
WO2017024954A1 (en) | Method and device for image display | |
CN111131812A (en) | Broadcast time testing method and device and computer readable storage medium | |
CN108763491B (en) | Picture processing method and device and terminal equipment | |
CN106354409A (en) | Information display method and device and terminal | |
JP7567166B2 (en) | Image processing device, image processing method, and program | |
US9826163B2 (en) | Image processing apparatus, control method, and recording medium | |
CN107133022B (en) | Control display method and device in terminal equipment | |
CN113961526A (en) | Method and device for detecting screen shot picture | |
US10459576B2 (en) | Display apparatus and input method thereof | |
US10706315B2 (en) | Image processing device, image processing method, and computer program product | |
CN111290676B (en) | Method, device and equipment for intercepting picture of designated area in client | |
US9384527B2 (en) | Electronic device and image displaying method |
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- TA01: Transfer of patent application right. Effective date of registration: 2020-05-26. Applicant after: Alibaba (China) Co.,Ltd., 310052 room 508, floor 5, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province. Applicant before: GUANGZHOU UCWEB COMPUTER TECHNOLOGY Co.,Ltd., 510627 Guangdong city of Guangzhou province Whampoa Tianhe District Road No. 163 Xiping Yun Lu Yun Ping B radio 14 floor tower square.
- GR01: Patent grant