CN109241304B - Picture processing method, device and equipment - Google Patents

Picture processing method, device and equipment Download PDF

Info

Publication number
CN109241304B
Authority
CN
China
Prior art keywords
filling
picture
offset
parameter
mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810935679.1A
Other languages
Chinese (zh)
Other versions
CN109241304A
Inventor
孙龙飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd, Guangzhou Shirui Electronics Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority to CN201810935679.1A priority Critical patent/CN109241304B/en
Publication of CN109241304A publication Critical patent/CN109241304A/en
Application granted granted Critical
Publication of CN109241304B publication Critical patent/CN109241304B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text

Abstract

The invention discloses a picture processing method, a picture processing device and picture processing equipment. The method comprises the following steps: acquiring a filling picture to be filled into a filling area and a filling mode; determining a first filling parameter corresponding to a preset filling mode based on the filling picture; obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relation between the filling mode and the preset filling mode; and filling the filling picture into the filling area according to the preset filling mode based on the second filling parameter. The invention solves the technical problem that the picture processing method in the prior art, which fills the filling picture into the filling area according to different filling modes, has a complex processing process and a large workload.

Description

Picture processing method, device and equipment
Technical Field
The invention relates to the field of image processing, in particular to a picture processing method, a picture processing device and picture processing equipment.
Background
For PPT pictures, PPT picture filling includes 8 tiling modes (top left alignment, bottom left alignment, top right alignment, bottom right alignment, left alignment, top alignment, bottom alignment and right alignment). At present, the implementation process of filling a PPT picture in the existing rendering mode is shown in fig. 1: the specific tiling mode is obtained, the filling parameter of the PPT picture is obtained accordingly, support for that specific tiling mode is added at the rendering end, and rendering and display are performed according to the filling parameter.
However, because there are many types of tiling modes, the rendering end needs to add support for all tiling modes in order to implement PPT picture filling, so the processing process is complex and the workload is large.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a picture processing method, a picture processing device and picture processing equipment, which are used to at least solve the technical problem that the picture processing method in the prior art, which fills a filling picture into a filling area according to different filling modes, has a complex processing process and a large workload.
According to an aspect of the embodiments of the present invention, there is provided an image processing method, including: acquiring a filling picture filled into the filling area and a filling mode; determining a first filling parameter corresponding to a preset filling mode based on the filling picture; obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relation between the filling mode and the preset filling mode; and filling the filling picture into the filling area according to a preset filling mode based on the second filling parameter.
Further, the filling mode includes one of the following: top left alignment, bottom left alignment, top right alignment, bottom right alignment, left alignment, top alignment, bottom alignment, and right alignment.
Further, obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode includes: acquiring the size of a filling picture and the size of a filling area; obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area and the offset relation; and obtaining a second filling parameter based on the first filling parameter and the offset distance.
Further, the offset relationship includes: a vertical offset and/or a horizontal offset, the offset distance comprising: an offset distance in a vertical direction and/or an offset distance in a horizontal direction, wherein the vertical offset comprises: a first vertical offset or a second vertical offset, the horizontal offset comprising: a first horizontal offset or a second horizontal offset.
Further, in a case that the offset relationship includes the first vertical offset, obtaining, based on the size of the filler picture, the size of the filler region, and the offset relationship, a corresponding offset distance includes: obtaining the ratio of the height of the filling area to the height of the filling picture to obtain a first quantity value of the filling picture in the vertical direction; and acquiring the product of the fractional part value of the first quantity value and the height of the filling picture to obtain a first offset distance in the vertical direction.
Further, in a case that the offset relationship includes the first horizontal offset, obtaining, based on the size of the filler picture, the size of the filler region, and the offset relationship, a corresponding offset distance includes: obtaining the ratio of the width of the filling area to the width of the filling picture to obtain a second numerical value of the filling picture in the horizontal direction; and acquiring the product of the fractional part value of the second numerical value and the width of the filling picture to obtain a first offset distance in the horizontal direction.
Further, in a case that the offset relationship includes the second vertical offset, obtaining, based on the size of the filler picture, the size of the filler region, and the offset relationship, a corresponding offset distance includes: obtaining the ratio of the height of the filling area to a first preset value to obtain a first ratio; obtaining the ratio of the first ratio to the height of the filling picture to obtain a third quantity value of the filling picture in the vertical direction; and obtaining a second offset distance in the vertical direction based on the fractional part value of the third quantity value, the height of the filling picture and a second preset value.
Further, obtaining a second offset distance in the vertical direction based on the fractional part value of the third quantity value, the height of the filled picture and a second preset value, includes: under the condition that the fractional part of the third numerical value is greater than or equal to a second preset value, obtaining the difference between the fractional part of the third numerical value and the second preset value to obtain a first difference value, and obtaining the product of the first difference value and the height of the filled picture to obtain a second offset distance in the vertical direction; and under the condition that the decimal part of the third quantity value is smaller than a second preset value, acquiring the sum of the numerical value of the decimal part of the third quantity value and the second preset value to obtain a first sum value, and acquiring the product of the first sum value and the height of the filled picture to obtain a second offset distance in the vertical direction.
Further, in a case that the offset relationship includes the second horizontal offset, obtaining, based on the size of the filler picture, the size of the filler region, and the offset relationship, a corresponding offset distance includes: obtaining the ratio of the width of the filling area to a first preset value to obtain a second ratio; acquiring the ratio of the second ratio to the width of the filling picture to obtain a fourth quantity value of the filling picture in the horizontal direction; and obtaining a second offset distance in the horizontal direction based on the fractional part value of the fourth quantity value, the width of the filling picture and a second preset value.
Further, obtaining a second offset distance in the horizontal direction based on the fractional part value of the fourth quantity value, the width of the filled picture and a second preset value, includes: under the condition that the fractional part of the fourth numerical value is greater than or equal to a second preset value, obtaining the difference between the fractional part of the fourth numerical value and the second preset value to obtain a second difference value, and obtaining the product of the second difference value and the width of the filled picture to obtain a second offset distance in the horizontal direction; and under the condition that the decimal part of the fourth numerical value is smaller than the second preset value, acquiring the sum of the numerical value of the decimal part of the fourth numerical value and the second preset value to obtain a second sum value, and acquiring the product of the second sum value and the width of the filled picture to obtain a second offset distance in the horizontal direction.
Further, where the filling pattern includes a bottom left alignment, the offset relationship includes a first vertical offset; in the case where the filling pattern includes an upper right alignment, the offset relationship includes a first horizontal offset; in the case where the filling pattern includes a lower right alignment, the offset relationship includes a first vertical offset and a first horizontal offset; in the case where the filling pattern comprises left alignment, the offset relationship comprises a second vertical offset; in the case where the filling manner includes the upper alignment, the offset relationship includes a second horizontal offset; in the case where the filling pattern includes right alignment, the offset relationship includes a second vertical offset and a first horizontal offset; where the filling pattern includes a down-alignment, the offset relationship includes a first vertical offset and a second horizontal offset.
Further, obtaining a second filling parameter based on the first filling parameter and the offset distance includes: and adjusting parameters in the first filling parameters based on the offset distance in the vertical direction and/or the offset distance in the horizontal direction to obtain second filling parameters.
According to another aspect of the embodiments of the present invention, there is also provided an image processing method, including: displaying the filling picture filled into the filling area and the filling mode; and displaying the filled filling area, wherein the filled filling area is obtained by filling the filling picture into the filling area according to a preset filling mode based on a second filling parameter corresponding to the filling mode, and the second filling parameter is obtained based on a first filling parameter corresponding to the preset filling mode and an offset relation between the filling mode and the preset filling mode.
According to another aspect of the embodiments of the present invention, there is also provided an image processing apparatus, including: the acquisition module is used for acquiring the filling picture filled into the filling area and the filling mode corresponding to the filling picture; the determining module is used for determining a first filling parameter corresponding to a preset filling mode based on the filling picture; the processing module is used for obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relation between the filling mode and the preset filling mode; and the filling module is used for filling the filling picture into the filling area according to a preset filling mode based on the second filling parameter.
According to another aspect of the embodiments of the present invention, there is also provided an image processing apparatus, including: the first display module is used for displaying the filling picture filled into the filling area and the filling mode; and the second display module is used for displaying the filled filling area, wherein the filled filling area is obtained by filling the filling picture into the filling area according to the preset filling mode based on a second filling parameter corresponding to the filling mode, and the second filling parameter is obtained based on a first filling parameter corresponding to the preset filling mode and the offset relation between the filling mode and the preset filling mode.
According to another aspect of the embodiments of the present invention, there is also provided a picture processing apparatus including: the display screen is used for displaying the filling pictures filled into the filling area; an input device for inputting a filling mode; and the processor is connected with the display screen and the input device and used for determining a first filling parameter corresponding to the preset filling mode based on the filling picture, obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relation between the filling mode and the preset filling mode, and filling the filling picture into the filling area according to the preset filling mode based on the second filling parameter.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium, where the storage medium includes a stored program, and when the program runs, the apparatus on which the storage medium is located is controlled to execute the above-mentioned picture processing method.
According to another aspect of the embodiments of the present invention, there is also provided a processor, where the processor is configured to execute a program, where the program executes the above-mentioned picture processing method.
In the embodiment of the invention, after the filling picture to be filled into the filling area and the filling mode are obtained, the first filling parameter corresponding to the preset filling mode is determined based on the filling picture, the second filling parameter corresponding to the filling mode is obtained based on the first filling parameter and the offset relation between the filling mode and the preset filling mode, and the filling picture is then filled into the filling area according to the preset filling mode based on the second filling parameter. In this way, when the rendering end supports the upper left alignment mode by default, other alignment modes can be converted into offsets relative to the upper left alignment for rendering, and the same rendering effect is achieved. This reduces the processing complexity, the cost and the workload, achieves the technical effect of rapidly supporting picture analysis, and solves the technical problem that the picture processing method in the prior art, which fills the filling picture into the filling area according to different filling modes, has a complex processing process and a large workload.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow chart of a picture processing method according to the prior art;
FIG. 2 is a flow chart of a method of picture processing according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a fill picture according to an embodiment of the present invention;
FIG. 4 is a diagram of an upper left aligned rendering effect according to an embodiment of the invention;
FIG. 5 is a diagram of a bottom left aligned rendering effect according to an embodiment of the invention;
FIG. 6 is a diagram of an upper right aligned rendering effect according to an embodiment of the invention;
FIG. 7 is a schematic illustration of a left-aligned rendering effect according to an embodiment of the invention;
FIG. 8 is a flow chart of another picture processing method according to an embodiment of the invention;
fig. 9 is a schematic structural diagram of a picture processing apparatus according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of another picture processing apparatus according to an embodiment of the present invention; and
fig. 11 is a schematic structural diagram of a picture processing device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
In accordance with an embodiment of the present invention, there is provided an embodiment of a picture processing method, it should be noted that the steps shown in the flowchart of the figure may be executed in a computer system such as a set of computer executable instructions, and that while a logical order is shown in the flowchart, in some cases, the steps shown or described may be executed in an order different from that here.
The picture processing method provided in this embodiment may be executed by a picture processing device, where the picture processing device may be implemented in a software and/or hardware manner, and may be formed by two or more physical entities or by one physical entity. The picture processing device can be a computer, a mobile phone, a tablet, an intelligent interactive tablet or the like. In this embodiment, the picture processing device is described by taking an intelligent interactive tablet as an example, where the intelligent interactive tablet may be an integrated device that controls the content displayed on a display panel and implements human-computer interaction operations through a touch technology, and that integrates one or more functions of a projector, an electronic whiteboard, a projection screen, a sound system, a television, a video conference terminal, and the like.
In an embodiment, the smart interactive tablet establishes a data connection with at least one external device. Among these, external devices include, but are not limited to: mobile phones, notebook computers, USB flash disks, tablet computers, desktop computers, and the like. The embodiment of the communication mode of the data connection between the external device and the intelligent interactive tablet is not limited, and the communication mode can be a USB connection mode, an Internet mode, a local area network mode, a Bluetooth mode, a Wi-Fi mode or a ZigBee mode and the like.
Further, when the smart interactive tablet performs data interaction with at least one external device, the external device sends screen projection data to the smart interactive tablet, and the smart interactive tablet displays the screen projection content of the screen projection data; the external device thus serves as a screen projection client.
Optionally, the screen projection client and/or the intelligent interactive tablet are installed with screen projection application software, and the screen projection application software may be installed in the screen projection client and/or the intelligent interactive tablet in advance, or may be downloaded from a third-party device or a server and installed for use when the screen projection client and/or the intelligent interactive tablet starts a screen projection application. The third-party device is not limited in this embodiment. Specifically, the screen projection application software is used for acquiring the content displayed by the screen projection client, using the content as screen projection data, and instructing the intelligent interactive tablet to display the content. In this embodiment, the case where the screen projection client and the intelligent interactive tablet are both installed with the screen projection application software is taken as an example. The screen projection application software of the screen projection client is used for acquiring screen projection data and directly or indirectly sending the screen projection data to the intelligent interactive tablet. If sent indirectly, the screen projection data can be sent to the intelligent interactive tablet through a transfer device, and the transfer device can be a wireless screen transmission device or other equipment with a data transfer/processing function. The screen projection application software of the intelligent interactive tablet is used for receiving the screen projection data and converting the screen projection data into corresponding content, so that the intelligent interactive tablet can conveniently display the content. It should be noted that the resolution of the display screen of the screen projection client and that of the display screen of the intelligent interactive tablet are different, and the screen projection data is obtained based on the resolution of the screen projection client; therefore, in order to display the screen projection data on the display screen of the intelligent interactive tablet, the screen projection application software needs to determine a screen mapping relationship according to the resolutions of the two display screens, and then convert the screen projection data according to the screen mapping relationship to obtain the screen projection content. In this embodiment, the display contents of the screen projection content and the screen projection data are substantially the same, and only the resolution is different.
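As an illustration of the resolution-based screen mapping relationship described above, a minimal sketch is given below; it assumes a simple proportional mapping between the two resolutions, and the function and parameter names are illustrative rather than taken from the patent.

    def map_point(x, y, client_res, panel_res):
        """Map a point from the screen projection client's display resolution to the
        intelligent interactive tablet's display resolution by proportional scaling,
        one possible form of the screen mapping relationship described above."""
        client_w, client_h = client_res   # client display width/height in pixels
        panel_w, panel_h = panel_res      # tablet display width/height in pixels
        return x * panel_w / client_w, y * panel_h / client_h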
Further, the screen projection data is in picture format, and may be screen capture data obtained by capturing the display content of the screen projection client, or a presentation (Microsoft Office PowerPoint, PPT) picture. In this embodiment, the screen projection data is described by taking a PPT picture as an example.
In an embodiment, the display screen of the smart interactive tablet is a touch screen, and the touch screen may include: a capacitive screen, an electromagnetic screen, an infrared screen, or the like. Generally, the touch screen may receive a touch operation input by a user through a finger or an input device. Wherein the input device includes but is not limited to: a stylus, an infrared pen, and/or a capacitive pen, etc.
Fig. 2 is a flowchart of a picture processing method according to an embodiment of the present invention. As shown in fig. 2, the method includes the following steps:
step S202, a filling picture filled into the filling area and a filling mode are obtained.
Optionally, the filling manner may include one of: top left alignment, bottom left alignment, top right alignment, bottom right alignment, left alignment, top alignment, bottom alignment, and right alignment.
Specifically, the filling picture may be a PPT picture, as shown in fig. 3, and is used to be filled into the filling area according to different tiling modes, the filling picture may be sent to the intelligent interactive tablet by the screen-projection client, and the filling picture in the screen-projection client may be a picture stored in the screen-projection client, or a picture input by the user in real time or downloaded from the network. The filling area can be an area selected by a user on the intelligent interaction tablet, and can be any area on a display screen of the intelligent interaction tablet, the user can select the starting position of the filling area through operations such as clicking, double-clicking or long-pressing, and the end position of the filling area is determined through operations such as dragging or sliding, so that the filling area is obtained. In order to fill the filling picture into the filling area, it is necessary to ensure that the size of the filling area is larger than that of the filling picture, and the larger the size of the filling area is, the larger the number of filling pictures that need to be filled is. The filling mode can be a tiling mode selected by a user on the intelligent interactive flat panel and can be any one of 8 tiling modes, the 8 tiling modes are displayed on a display screen of the intelligent interactive flat panel, and the user selects any one of the tiling modes through operations such as clicking, double-clicking or long-pressing.
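For instance, the number of copies of the filling picture needed to cover the filling area when tiling could be estimated as follows; this is a simple illustration and not a formula stated in the patent, and it assumes both sizes are given in the same units.

    import math

    def tiles_needed(picW, picH, boundW, boundH):
        """Copies of the fill picture needed to cover the fill area when tiling."""
        return math.ceil(boundW / picW) * math.ceil(boundH / picH)

    print(tiles_needed(100, 80, 450, 300))  # ceil(4.5) * ceil(3.75) = 5 * 4 = 20 copies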
Further, the intelligent interactive tablet may acquire the filling picture first and then acquire the filling mode selected by the user, or the intelligent interactive tablet may acquire the filling mode selected by the user first and then acquire the filling picture. The sending time of the filling picture can be set according to the actual situation.
Step S204, based on the filling picture, determining a first filling parameter corresponding to a preset filling mode.
Specifically, the preset filling manner may be a tiling manner supported by default at the rendering end on the smart interactive tablet, for example the upper left alignment manner. The first filling parameter may be the generic picture-fill tiling xml, whose node is expressed as: <Viewport>0.1,0.2,0.3,0.4</Viewport>, wherein the first two parameters determine the horizontal and vertical offset of the filled picture and the last two parameters determine the size of the filled picture, i.e. the width and height of the filled picture. After the size of the filling picture is determined, a parameter value for each of the first filling parameters may be obtained, thereby obtaining the first filling parameters.
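A minimal sketch of how such a Viewport node could be assembled is given below; the element name and parameter order follow the example above, while the helper name is an assumption for illustration.

    def build_viewport(offset_x, offset_y, pic_w, pic_h):
        """Assemble the tiling xml node described above: the first two values are the
        horizontal and vertical offset of the filled picture, the last two its size."""
        return f"<Viewport>{offset_x},{offset_y},{pic_w},{pic_h}</Viewport>"

    print(build_viewport(0.1, 0.2, 0.3, 0.4))  # <Viewport>0.1,0.2,0.3,0.4</Viewport>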
It should be noted that, through testing, it is found that the first two parameters corresponding to different tiling modes are different, and different alignment effects can be achieved by adjusting the two parameters.
Step S206, based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode, a second filling parameter corresponding to the filling mode is obtained.
Optionally, the offset relationship may include: a vertical offset and/or a horizontal offset, wherein the vertical offset comprises: a first vertical offset or a second vertical offset, the horizontal offset comprising: a first horizontal offset or a second horizontal offset.
Further, where the filling pattern includes a bottom left alignment, the offset relationship includes a first vertical offset; in the case where the filling pattern includes an upper right alignment, the offset relationship includes a first horizontal offset; in the case where the filling pattern includes a lower right alignment, the offset relationship includes a first vertical offset and a first horizontal offset; in the case where the filling pattern comprises left alignment, the offset relationship comprises a second vertical offset; in the case where the filling manner includes the upper alignment, the offset relationship includes a second horizontal offset; in the case where the filling pattern includes right alignment, the offset relationship includes a second vertical offset and a first horizontal offset; where the filling pattern includes a down-alignment, the offset relationship includes a first vertical offset and a second horizontal offset.
In particular, the offset relationship described above is used to characterize the difference between the effects exhibited by different tiling modes. Through research, the upper left alignment indicates that, in the filled area after filling, the left frame of the filling picture at the upper left corner coincides with the left frame of the filling area, and the upper frame of that filling picture coincides with the upper frame of the filling area, as shown in fig. 4. As shown in fig. 5, the lower left alignment indicates that, in the filled area after filling, the left frame of the filling picture at the lower left corner coincides with the left frame of the filling area, and the lower frame of that filling picture coincides with the lower frame of the filling area; compared with the upper left alignment, this is equivalent to the upper left alignment being translated downwards in the vertical direction, so it can be determined that the offset relationship between the lower left alignment and the upper left alignment is the first vertical offset. As shown in fig. 6, the upper right alignment indicates that, in the filled area after filling, the right frame of the filling picture at the upper right corner coincides with the right frame of the filling area, and the upper frame of that filling picture coincides with the upper frame of the filling area; compared with the upper left alignment, this is equivalent to the upper left alignment being translated to the right in the horizontal direction, so it can be determined that the offset relationship between the upper right alignment and the upper left alignment is the first horizontal offset. The lower right alignment indicates that, in the filled area after filling, the right frame of the filling picture at the lower right corner coincides with the right frame of the filling area, and the lower frame of that filling picture coincides with the lower frame of the filling area; compared with the upper left alignment, this is equivalent to the upper left alignment being translated downwards in the vertical direction and to the right in the horizontal direction, so it can be determined that the offset relationship between the lower right alignment and the upper left alignment includes a first vertical offset and a first horizontal offset.
As shown in fig. 7, the left alignment indicates that, in the filled area after filling, the horizontal middle line of the vertically centred filling picture on the left coincides with the horizontal middle line of the filling area, and the left frame of that filling picture coincides with the left frame of the filling area; compared with the upper left alignment, this is equivalent to the upper left alignment being translated downwards in the vertical direction, so it can be determined that the offset relationship between the left alignment and the upper left alignment is the second vertical offset. The upper alignment indicates that, in the filled area after filling, the vertical middle line of the horizontally centred filling picture at the top coincides with the vertical middle line of the filling area, and the upper frame of that filling picture coincides with the upper frame of the filling area; compared with the upper left alignment, this is equivalent to the upper left alignment being translated to the right in the horizontal direction, so it can be determined that the offset relationship between the upper alignment and the upper left alignment is the second horizontal offset. The lower alignment indicates that, in the filled area after filling, the vertical middle line of the horizontally centred filling picture at the bottom coincides with the vertical middle line of the filling area, and the lower frame of that filling picture coincides with the lower frame of the filling area; compared with the upper left alignment, this is equivalent to the upper left alignment being translated to the right in the horizontal direction and downwards in the vertical direction, so it can be determined that the offset relationship between the lower alignment and the upper left alignment includes a first vertical offset and a second horizontal offset. The right alignment indicates that, in the filled area after filling, the horizontal middle line of the vertically centred filling picture on the right coincides with the horizontal middle line of the filling area, and the right frame of that filling picture coincides with the right frame of the filling area; compared with the upper left alignment, this is equivalent to the upper left alignment being translated to the right in the horizontal direction and downwards in the vertical direction, so it can be determined that the offset relationship between the right alignment and the upper left alignment includes a second vertical offset and a first horizontal offset. After the offset relationship is determined, the first two parameters of the upper left alignment mode can be adjusted according to the offset relationship between the filling mode selected by the user and the upper left alignment mode, so as to achieve a correct rendering effect.
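The correspondence above between tiling modes and offset relationships can be summarised as follows; the mode names and the v1/v2/h1/h2 labels are illustrative identifiers, not terms defined by the patent.

    # Offset relationship of each tiling mode relative to the default upper-left alignment:
    # "v1"/"v2" = first/second vertical offset, "h1"/"h2" = first/second horizontal offset.
    OFFSET_RELATION = {
        "top_left":     (),
        "bottom_left":  ("v1",),
        "top_right":    ("h1",),
        "bottom_right": ("v1", "h1"),
        "left":         ("v2",),
        "top":          ("h2",),
        "right":        ("v2", "h1"),
        "bottom":       ("v1", "h2"),
    }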
The specific implementation of the above steps is as steps S2062 to S2066:
in step S2062, the size of the fill picture and the size of the fill area are obtained.
Specifically, the size of the filler picture may be the width and height of the filler picture, picW and picH, respectively. The size of the fill area may be the width and height of the fill area, boundW and boundH, respectively.
Step S2064, obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area and the offset relationship.
Optionally, the offset distance may include: an offset distance in a vertical direction and/or an offset distance in a horizontal direction.
Specifically, because the rendering effects of different tiling modes are different, and because the default upper left alignment is performed when no alignment mode support is performed during rendering, the offset distance of the rendering effect of the different tiling modes relative to the rendering effect of the upper left alignment can be determined according to the offset relationship between the different tiling modes, so that the parameters in the xml obtained by the upper left alignment mode are adjusted according to the offset distance, and a new xml is obtained, so that the rendering effect of filling according to the tiling mode selected by the user is realized.
For the lower left alignment, the lower right alignment and the lower alignment, the distance of downward translation in the vertical direction relative to the upper left alignment is the same, namely the first offset distance in the vertical direction. The specific implementation is as in steps S212 to S214:
in step S212, a ratio of the height of the filling area to the height of the filling picture is obtained, so as to obtain a first quantity value of the filling picture in the vertical direction.
Specifically, the first quantity value may be the number of filling pictures required for the filling pictures to cover the entire filling area in the vertical direction, which needs to be accurate to the decimal part, and the calculation formula is as follows: picCount = boundH / picH.
Step S214, obtain the product of the fractional part value of the first quantity value and the height of the filled picture, to obtain the first offset distance in the vertical direction.
Specifically, since the rendering effect after shifting by an integer number of filling-picture heights is the same as that before shifting, only the first offset distance in the vertical direction needs to be determined according to the fractional part value fractPart of picCount, and the calculation formula is as follows: lengthH = fractPart * picH.
For the upper right alignment, the lower right alignment and the right alignment, the distance of rightward translation in the horizontal direction relative to the upper left alignment is the same, namely the first offset distance in the horizontal direction. Specifically, the implementation is as in steps S222 to S224:
in step S222, a ratio of the width of the filling area to the width of the filling picture is obtained, so as to obtain a second numerical value of the filling picture in the horizontal direction.
Specifically, the second numerical value may be the number of filling pictures required for the filling pictures to cover the entire filling area in the horizontal direction, which needs to be accurate to the decimal part, and the calculation formula is as follows: picCount = boundW / picW.
Step S224, a product of the fractional part value of the second numerical value and the width of the filled picture is obtained to obtain a first offset distance in the horizontal direction.
Specifically, since the rendering effect after shifting by an integer number of filling-picture widths is the same as that before shifting, only the first offset distance in the horizontal direction needs to be determined according to the fractional part value fractPart of picCount, and the calculation formula is as follows: lengthW = fractPart * picW.
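A sketch of the first offset distances of steps S212 to S224 is given below; it assumes the picture and area sizes are positive numbers in the same units, and the variable names simply mirror the formulas above.

    import math

    def first_vertical_offset(picH, boundH):
        """Steps S212-S214: lengthH = fractional part of (boundH / picH), times picH."""
        picCount = boundH / picH                     # pictures needed vertically
        fractPart = picCount - math.floor(picCount)  # fractional part of picCount
        return fractPart * picH

    def first_horizontal_offset(picW, boundW):
        """Steps S222-S224: lengthW = fractional part of (boundW / picW), times picW."""
        picCount = boundW / picW                     # pictures needed horizontally
        fractPart = picCount - math.floor(picCount)  # fractional part of picCount
        return fractPart * picW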
For the left alignment and the right alignment, the distances translated downwards in the vertical direction are the same relative to the upper left alignment mode, and are both the second offset distance in the vertical direction and smaller than the first offset distance in the vertical direction. The specific implementation is as in steps S232 to S236:
step S232, a ratio of the height of the filling area to the first preset value is obtained, so as to obtain a first ratio.
Specifically, for the left alignment and the right alignment, the horizontal middle line of the centred filling picture coincides with the horizontal middle line of the filling area; the first preset value may be 2, the first ratio may be half of the height of the filling area, and the calculation formula is as follows: boundH / 2.
In step S234, the ratio of the first ratio to the height of the filling picture is obtained, so as to obtain a third quantity value of the filling picture in the vertical direction.
Specifically, the third quantity value may be the number of filling pictures required for the filling pictures to cover half of the filling area in the vertical direction, which needs to be accurate to the decimal part, and the calculation formula is as follows: picCount = (boundH / 2) / picH.
In step S236, a second offset distance in the vertical direction is obtained based on the fractional part value of the third quantity value, the height of the filled picture, and the second preset value.
Specifically, the second preset value may be 0.5. Since the rendering effect after shifting by an integer number of filling-picture heights is the same as that before shifting, it is only necessary to determine the second offset distance in the vertical direction according to the fractional part value fractPart of picCount. The specific implementation is as in steps S2362 to S2364:
step S2362, when the fractional value of the third numerical value is greater than or equal to the second preset value, obtaining a difference between the fractional value of the third numerical value and the second preset value to obtain a first difference value, and obtaining a product of the first difference value and a height of the filled picture to obtain a second offset distance in the vertical direction.
Specifically, when the second offset distance in the vertical direction is determined, two cases need to be distinguished; when fractPart is greater than or equal to 0.5, the calculation formula is as follows: lengthH = (fractPart - 0.5) * picH.
Step S2364, in a case that the fractional part of the third numerical value is smaller than the second preset value, obtaining a sum of the fractional part of the third numerical value and the second preset value to obtain a first sum, and obtaining a product of the first sum and a height of the filled picture to obtain a second offset distance in the vertical direction.
Specifically, when the second offset distance in the vertical direction is determined, two cases need to be distinguished; when fractPart < 0.5, the calculation formula is as follows: lengthH = (fractPart + 0.5) * picH.
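A sketch of steps S232 to S236 with the case distinction of steps S2362 and S2364 is given below, taking the first preset value as 2 and the second preset value as 0.5 as stated above; the function name is illustrative.

    import math

    def second_vertical_offset(picH, boundH):
        """Steps S232-S236: vertical offset of a vertically centred (left/right aligned)
        tiling relative to the upper-left alignment."""
        picCount = (boundH / 2) / picH               # pictures needed to cover half the height
        fractPart = picCount - math.floor(picCount)
        if fractPart >= 0.5:                         # step S2362
            return (fractPart - 0.5) * picH
        return (fractPart + 0.5) * picH              # step S2364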
For the upper alignment and the lower alignment, the distance of rightward translation in the horizontal direction relative to the upper left alignment is the same, namely the second offset distance in the horizontal direction, which is smaller than the first offset distance in the horizontal direction. Specifically, the implementation is as in steps S242 to S246:
step S242, a ratio of the width of the filling area to the first preset value is obtained to obtain a second ratio.
Specifically, for the upper alignment and the lower alignment, the vertical middle line of the centred filling picture coincides with the vertical middle line of the filling area; the second ratio may be half of the width of the filling area, and the calculation formula is as follows: boundW / 2.
In step S244, a ratio of the second ratio to the width of the filling picture is obtained, so as to obtain a fourth quantity value of the filling picture in the horizontal direction.
Specifically, the fourth quantity value may be the number of filling pictures required for the filling pictures to cover half of the filling area in the horizontal direction, which needs to be accurate to the decimal part, and the calculation formula is as follows: picCount = (boundW / 2) / picW.
In step S246, a second offset distance in the horizontal direction is obtained based on the fractional part value of the fourth quantity value, the width of the filled picture, and the second preset value.
Specifically, since the rendering effect after shifting by an integer number of filling-picture widths is the same as that before shifting, it is only necessary to determine the second offset distance in the horizontal direction according to the fractional part value fractPart of picCount. The specific implementation is as in steps S2462 to S2464:
step S2462, when the fractional part of the fourth quantity value is greater than or equal to the second preset value, obtaining a difference between the fractional part of the fourth quantity value and the second preset value to obtain a second difference value, and obtaining a product of the second difference value and the width of the filled picture to obtain a second offset distance in the horizontal direction.
Specifically, when the second offset distance in the horizontal direction is determined, two cases need to be distinguished; when fractPart is greater than or equal to 0.5, the calculation formula is as follows: lengthW = (fractPart - 0.5) * picW.
Step S2464, in a case that the fractional part of the fourth numerical value is smaller than the second preset value, obtaining a sum of the fractional part of the fourth numerical value and the second preset value to obtain a second sum, and obtaining a product of the second sum and the width of the filled picture to obtain a second offset distance in the horizontal direction.
Specifically, when the second offset distance in the horizontal direction is determined, two cases need to be distinguished; when fractPart < 0.5, the calculation formula is as follows: lengthW = (fractPart + 0.5) * picW.
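The horizontal case of steps S242 to S246 mirrors the vertical one; a corresponding sketch, under the same assumptions, is given below.

    import math

    def second_horizontal_offset(picW, boundW):
        """Steps S242-S246: horizontal offset of a horizontally centred (top/bottom aligned)
        tiling relative to the upper-left alignment."""
        picCount = (boundW / 2) / picW               # pictures needed to cover half the width
        fractPart = picCount - math.floor(picCount)
        if fractPart >= 0.5:                         # step S2462
            return (fractPart - 0.5) * picW
        return (fractPart + 0.5) * picW              # step S2464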
It should be noted that, when the tiling mode selected by the user is top-left justified, the offset distance may be determined to be 0.
Step S2066, a second filling parameter is obtained based on the first filling parameter and the offset distance.
Specifically, the first parameter in the first filling parameters determines the offset in the horizontal direction, and the second parameter determines the offset in the vertical direction; the first parameter may be adjusted according to the first offset distance or the second offset distance in the horizontal direction, and the second parameter may be adjusted according to the first offset distance or the second offset distance in the vertical direction. The specific implementation is as step S252:
step S252, adjusting parameters in the first filling parameters based on the offset distance in the vertical direction and/or the offset distance in the horizontal direction, to obtain second filling parameters.
Specifically, when the offset relationship is the first vertical offset, the second parameter may be adjusted according to the first offset distance in the vertical direction, that is, the sum of the parameter value of the second parameter and the first offset distance in the vertical direction is calculated, so as to obtain the adjusted parameter value of the second parameter; when the offset relationship is the second vertical offset, the second parameter may be adjusted according to the second offset distance in the vertical direction, that is, the sum of the parameter value of the second parameter and the second offset distance in the vertical direction is calculated, so as to obtain the adjusted parameter value of the second parameter. When the offset relationship is the first horizontal offset, the first parameter may be adjusted according to the first offset distance in the horizontal direction, that is, the sum of the parameter value of the first parameter and the first offset distance in the horizontal direction is calculated, so as to obtain the parameter value of the adjusted first parameter; when the offset relationship is the second horizontal offset, the first parameter may be adjusted according to the second offset distance in the horizontal direction, that is, the sum of the parameter value of the first parameter and the second offset distance in the horizontal direction is calculated, so as to obtain the adjusted parameter value of the first parameter. After the first parameter and/or the second parameter in the first filling parameter are/is adjusted, the second filling parameter is obtained according to the adjusted parameter value and the unadjusted parameter value, that is, the filling parameter when the tiling mode selected by the user is converted into the upper left alignment mode is determined as the completed filling parameter.
Further, for the upper left alignment, the parameter values of the first parameter and the second parameter can be guaranteed to be unchanged; for left-down alignment, the parameter value of the second parameter can be adjusted only based on the first offset distance in the vertical direction, and the parameter value of the first parameter is ensured to be unchanged; for the upper right alignment, the parameter value of the first parameter may be adjusted only based on the first offset distance in the horizontal direction, and the parameter value of the second parameter is ensured to be unchanged; for the right-down alignment, the parameter value of the first parameter may be adjusted based on the first offset distance in the horizontal direction, and the parameter value of the second parameter may be adjusted based on the first offset distance in the vertical direction; for left alignment, the parameter value of the second parameter may be adjusted only based on the second offset distance in the vertical direction, and the parameter value of the first parameter is ensured to be unchanged; for the upper alignment, the parameter value of the first parameter may be adjusted only based on the second offset distance in the horizontal direction, and the parameter value of the second parameter is ensured to be unchanged; for right alignment, a parameter value of a first parameter may be adjusted based on a first offset distance in a horizontal direction, and a parameter value of a second parameter may be adjusted based on a second offset distance in a vertical direction; for lower alignment, the parameter value of the first parameter may be adjusted based on the second offset distance in the horizontal direction, and the parameter value of the second parameter may be adjusted based on the first offset distance in the vertical direction.
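A sketch of step S252 combined with the per-mode choices listed above is given below; it assumes the offset distances have already been expressed in the same units as the Viewport parameters (the unit conversion is not spelled out here), and the mode names and function name are illustrative.

    def second_fill_parameters(first_params, mode, h1, h2, v1, v2):
        """Adjust the upper-left-aligned fill parameters: h1/h2 are the first/second
        horizontal offset distances, v1/v2 the first/second vertical offset distances;
        the last two parameters (picture size) are left unchanged."""
        p1, p2, w, h = first_params
        dx, dy = {
            "top_left":     (0,  0),
            "bottom_left":  (0,  v1),
            "top_right":    (h1, 0),
            "bottom_right": (h1, v1),
            "left":         (0,  v2),
            "top":          (h2, 0),
            "right":        (h1, v2),
            "bottom":       (h2, v1),
        }[mode]
        return (p1 + dx, p2 + dy, w, h)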
Step S208, based on the second filling parameter, filling the filling picture into the filling area according to a preset filling manner.
Specifically, after the completed filling parameters are obtained, rendering may be performed according to the upper left alignment mode based on the completed filling parameters, so that the rendered filling area is the same as one rendered according to the tiling mode selected by the user, that is, a correct rendering effect is obtained.
In the embodiment of the invention, after the filling picture to be filled into the filling area and the filling mode are obtained, the first filling parameter corresponding to the preset filling mode is determined based on the filling picture, the second filling parameter corresponding to the filling mode is obtained based on the first filling parameter and the offset relation between the filling mode and the preset filling mode, and the filling picture is then filled into the filling area according to the preset filling mode based on the second filling parameter. In this way, when the rendering end supports the upper left alignment mode by default, other alignment modes can be converted into offsets relative to the upper left alignment for rendering, and the same rendering effect is achieved. This reduces the processing complexity, the cost and the workload, achieves the technical effect of rapidly supporting picture analysis, and solves the technical problem that the picture processing method in the prior art, which fills the filling picture into the filling area according to different filling modes, has a complex processing process and a large workload.
Example 2
There is also provided, in accordance with an embodiment of the present invention, an embodiment of a method for picture processing, it being noted that the steps illustrated in the flowchart of the figure may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowchart, in some cases, the steps illustrated or described may be performed in an order different than that presented herein.
Fig. 8 is a flowchart of another picture processing method according to an embodiment of the present invention. As shown in fig. 8, the method includes the following steps:
step S802, displaying the filling picture filled in the filling area and the filling mode.
Specifically, the filling picture and the filling mode may be displayed by a display screen of the smart interactive tablet.
Step S804, displaying a filled filling area, where the filled filling area is obtained by filling the filling picture into the filling area according to a preset filling manner based on a second filling parameter corresponding to the filling manner, and the second filling parameter is obtained based on a first filling parameter corresponding to the preset filling manner and an offset relationship between the filling manner and the preset filling manner.
Specifically, the processing may be performed by a processor of the intelligent interactive tablet, a first filling parameter corresponding to a preset filling manner is determined based on the filling picture, a second filling parameter corresponding to the filling manner is obtained based on the first filling parameter and an offset relationship between the filling manner and the preset filling manner, the filling picture is filled into the filling region according to the preset filling manner based on the second filling parameter, a filled filling region is obtained, the filling region is displayed on a display screen of the intelligent interactive tablet, and the same rendering effect of the filling according to the filling manner can be obtained.
In the embodiment of the invention, after the filling picture to be filled into the filling area and the filling mode are obtained, the filling picture and the filling mode are displayed; the first filling parameter corresponding to the preset filling mode is determined based on the filling picture, the second filling parameter corresponding to the filling mode is obtained based on the first filling parameter and the offset relation between the filling mode and the preset filling mode, and the filled filling area is displayed after the filling picture is filled into the filling area according to the preset filling mode based on the second filling parameter. In this way, when the rendering end supports the upper left alignment mode by default, other alignment modes can be converted into offsets relative to the upper left alignment for rendering, and the same rendering effect is achieved. This reduces the processing complexity, the cost and the workload, achieves the technical effect of rapidly supporting picture analysis, and solves the technical problem that the picture processing method in the prior art, which fills the filling picture into the filling area according to different filling modes, has a complex processing process and a large workload.
Example 3
According to an embodiment of the present invention, an embodiment of a picture processing apparatus is provided, where the picture processing apparatus provided in this embodiment may be integrated in a picture processing device, the picture processing device may be formed by two or more physical entities or may be formed by one physical entity, and the picture processing device may be a computer, a mobile phone, a tablet, a projector, an intelligent interactive tablet, or the like.
Fig. 9 is a schematic structural diagram of a picture processing apparatus according to an embodiment of the present invention. As shown in fig. 9, the apparatus includes: an acquisition module 92, a determination module 94, a processing module 96, and a filling module 98.
The obtaining module 92 is configured to obtain a filling picture filled into the filling area and a filling mode corresponding to the filling picture; the determining module 94 is configured to determine a first filling parameter corresponding to a preset filling mode based on the filling picture; the processing module 96 is configured to obtain a second filling parameter corresponding to the filling manner based on the first filling parameter and the offset relationship between the filling manner and the preset filling manner; the filling module 98 is configured to fill the filling picture into the filling area according to a preset filling manner based on the second filling parameter.
On the basis of the above embodiment, the processing module includes: an acquisition unit configured to acquire a size of a fill picture and a size of a fill area; the first processing unit is used for obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area and the offset relation; and the second processing unit is used for obtaining a second filling parameter based on the first filling parameter and the offset distance.
On the basis of the above-described embodiment, in a case where the offset relationship includes the first vertical offset, the first processing unit includes: the first obtaining subunit is configured to obtain the ratio of the height of the filling area to the height of the filling picture, so as to obtain a first quantity value of the filling picture in the vertical direction; and the second obtaining subunit is configured to obtain the product of the fractional part of the first quantity value and the height of the filling picture, so as to obtain a first offset distance in the vertical direction.
On the basis of the above-described embodiment, in a case where the offset relationship includes the first horizontal offset, the first processing unit includes: the third obtaining subunit is configured to obtain the ratio of the width of the filling area to the width of the filling picture, so as to obtain a second quantity value of the filling picture in the horizontal direction; and the fourth obtaining subunit is configured to obtain the product of the fractional part of the second quantity value and the width of the filling picture, so as to obtain a first offset distance in the horizontal direction.
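The two subunits above implement one arithmetic rule, sketched below in Python for illustration: the first offset distance in a direction is the fractional part of (filling-area extent / filling-picture extent) multiplied by the picture extent, applied with heights for the vertical case and widths for the horizontal case. The function name is illustrative and not taken from the patent.

```python
import math

def first_offset(area_extent: float, pic_extent: float) -> float:
    """Fractional part of (area / picture) times the picture extent."""
    count = area_extent / pic_extent          # first / second quantity value
    frac = count - math.floor(count)          # fractional part
    return frac * pic_extent                  # first offset distance

# Vertical case (heights): 250 / 80 = 3.125, so the offset is 0.125 * 80 = 10.
# The horizontal case works the same way with widths instead of heights.
assert abs(first_offset(250, 80) - 10.0) < 1e-9
```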
On the basis of the above-described embodiment, in a case where the offset relationship includes the second vertical offset, the first processing unit includes: the fifth obtaining subunit is configured to obtain the ratio of the height of the filling area to the first preset value, so as to obtain a first ratio; the sixth obtaining subunit is configured to obtain the ratio of the first ratio to the height of the filling picture, so as to obtain a third quantity value of the filling picture in the vertical direction; and the first processing subunit is configured to obtain a second offset distance in the vertical direction based on the fractional part of the third quantity value, the height of the filling picture and a second preset value.
On the basis of the foregoing embodiment, the first processing subunit is configured to, when the fractional part of the third quantity value is greater than or equal to the second preset value, obtain the difference between the fractional part of the third quantity value and the second preset value to obtain a first difference value, and obtain the product of the first difference value and the height of the filling picture to obtain a second offset distance in the vertical direction; and, when the fractional part of the third quantity value is smaller than the second preset value, obtain the sum of the fractional part of the third quantity value and the second preset value to obtain a first sum value, and obtain the product of the first sum value and the height of the filling picture to obtain the second offset distance in the vertical direction.
On the basis of the above-described embodiment, in a case where the offset relationship includes the second horizontal offset, the first processing unit includes: the seventh obtaining subunit is configured to obtain the ratio of the width of the filling area to the first preset value, so as to obtain a second ratio; the eighth obtaining subunit is configured to obtain the ratio of the second ratio to the width of the filling picture, so as to obtain a fourth quantity value of the filling picture in the horizontal direction; and the second processing subunit is configured to obtain a second offset distance in the horizontal direction based on the fractional part of the fourth quantity value, the width of the filling picture and a second preset value.
On the basis of the foregoing embodiment, the second processing subunit is configured to, when the fractional part of the fourth quantity value is greater than or equal to the second preset value, obtain the difference between the fractional part of the fourth quantity value and the second preset value to obtain a second difference value, and obtain the product of the second difference value and the width of the filling picture to obtain a second offset distance in the horizontal direction; and, when the fractional part of the fourth quantity value is smaller than the second preset value, obtain the sum of the fractional part of the fourth quantity value and the second preset value to obtain a second sum value, and obtain the product of the second sum value and the width of the filling picture to obtain the second offset distance in the horizontal direction.
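A corresponding sketch of the second offset distances follows. The first preset value and the second preset value are not fixed numerically in this passage, so the values 2 and 0.5 used below are assumptions chosen so that the rule behaves as a centring offset; the names are likewise illustrative.

```python
import math

FIRST_PRESET = 2.0     # assumed: halve the extent of the filling area
SECOND_PRESET = 0.5    # assumed: threshold compared against the fractional part

def second_offset(area_extent: float, pic_extent: float) -> float:
    ratio = area_extent / FIRST_PRESET        # first ratio (vertical) / second ratio (horizontal)
    count = ratio / pic_extent                # third / fourth quantity value
    frac = count - math.floor(count)
    if frac >= SECOND_PRESET:
        return (frac - SECOND_PRESET) * pic_extent
    return (frac + SECOND_PRESET) * pic_extent

# Vertical use (left / right alignment) passes heights; horizontal use
# (top / bottom alignment) passes widths.
dy = second_offset(250, 80)   # (0.5625 - 0.5) * 80 = 5.0
dx = second_offset(400, 90)   # (0.2222... + 0.5) * 90 = 65.0
```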
On the basis of the above embodiment, the second processing unit includes: and the adjusting subunit is used for adjusting the parameters in the first filling parameters based on the offset distance in the vertical direction and/or the offset distance in the horizontal direction to obtain second filling parameters.
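The adjusting subunit then only needs to shift the first two entries of the first filling parameter (which, as recited in the claims, carry the horizontal and vertical offset conditions) while keeping the width and height. A minimal sketch with an assumed sign convention and illustrative names:

```python
def apply_offsets(first_param, dx: float = 0.0, dy: float = 0.0):
    """Adjust only the offset entries; the tile width and height are kept."""
    off_x, off_y, tile_w, tile_h = first_param
    return (off_x + dx, off_y + dy, tile_w, tile_h)

# A bottom-right alignment supplies both distances; a bottom-left alignment
# would pass dy only, leaving dx at its default of 0.
second_param = apply_offsets((0.0, 0.0, 100.0, 80.0), dx=40.0, dy=10.0)
```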
In the embodiment of the invention, after the filling picture to be filled into the filling area and the filling mode are acquired, the first filling parameter corresponding to the preset filling mode is determined based on the filling picture, the second filling parameter corresponding to the filling mode is obtained based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode, and the filling picture is then filled into the filling area according to the preset filling mode based on the second filling parameter. In this way, when the rendering end supports only the upper-left alignment mode by default, the other alignment modes can be converted into offsets relative to upper-left alignment for rendering, which achieves the same rendering effect, reduces the processing complexity, the cost and the workload, and achieves the technical effect of quickly supporting picture analysis. This solves the technical problem of the prior-art picture processing method, in which filling the filling picture into the filling area separately for each filling mode makes the processing complex and the workload large.
Example 4
According to an embodiment of the present invention, an embodiment of a picture processing apparatus is further provided, where the picture processing apparatus provided in this embodiment may be integrated in a picture processing device, the picture processing device may be formed by two or more physical entities or may be formed by one physical entity, and the picture processing device may be a computer, a mobile phone, a tablet, a projector, an intelligent interactive tablet, or the like.
Fig. 10 is a schematic structural diagram of another picture processing apparatus according to an embodiment of the present invention. As shown in fig. 10, the apparatus includes: a first display module 102 and a second display module 104.
The first display module 102 is configured to display a filling picture filled into the filling area and a filling manner; the second display module 104 is configured to display a filled filling area, where the filled filling area is obtained by filling a filling picture into the filling area according to a preset filling manner based on a second filling parameter corresponding to the filling manner, and the second filling parameter is obtained based on a first filling parameter corresponding to the preset filling manner and an offset relationship between the filling manner and the preset filling manner.
In the embodiment of the invention, after the filling picture to be filled into the filling area and the filling mode are acquired, both are displayed; the first filling parameter corresponding to the preset filling mode is determined based on the filling picture, the second filling parameter corresponding to the filling mode is obtained based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode, and the filled filling area is displayed after the filling picture is filled into the filling area according to the preset filling mode based on the second filling parameter. In this way, when the rendering end supports only the upper-left alignment mode by default, the other alignment modes can be converted into offsets relative to upper-left alignment for rendering, which achieves the same rendering effect, reduces the processing complexity, the cost and the workload, and achieves the technical effect of quickly supporting picture analysis. This solves the technical problem of the prior-art picture processing method, in which filling the filling picture into the filling area separately for each filling mode makes the processing complex and the workload large.
Example 5
According to an embodiment of the present invention, there is provided an embodiment of a picture processing device.
Fig. 11 is a schematic structural diagram of a picture processing device according to an embodiment of the present invention. As shown in fig. 11, the picture processing device includes: a display screen 112, an input device 114, and a processor 116. The number of processors 116 in the picture processing device may be one or more, and one processor 116 is taken as an example in fig. 11. The display screen 112, the input device 114 and the processor 116 of the picture processing device may be connected by a bus or other means, and fig. 11 takes connection by a bus as an example. In an embodiment, the picture processing device may be a computer, a mobile phone, a tablet, a projector, an intelligent interactive tablet, or the like. In this embodiment, the picture processing device being an intelligent interactive tablet is taken as an example for description.
The display screen 112 is a touch-enabled display screen, which may be a capacitive screen, an electromagnetic screen, or an infrared screen. In general, the display screen 112 is used for displaying data according to instructions of the processor 116, and is also used for receiving touch operations applied to the display screen 112 and sending corresponding signals to the processor 116 or other devices. Optionally, when the display screen 112 is an infrared screen, it further includes an infrared touch frame disposed around the display screen 112, which may also be configured to receive an infrared signal and send it to the processor 116 or other devices.
The input device 114 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the picture processing apparatus, and may also be a camera for acquiring images and a sound pickup apparatus for acquiring audio data. It should be noted that the specific composition of the input device 114 can be set according to actual conditions.
The processor 116 executes the software programs, instructions and modules stored in the memory so as to perform the various functional applications and data processing of the device, that is, to implement the above-described picture processing method. The memory, as a computer-readable storage medium, is used for storing software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the picture processing method according to any embodiment of the present invention (e.g., the obtaining module 92, the determining module 94, the processing module 96, and the filling module 98 in the first picture processing apparatus; and the first display module 102 and the second display module 104 in the second picture processing apparatus). The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to use of the device, and the like. Further, the memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory may further include memory located remotely from the processor 116, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Specifically, in this embodiment, when the processor 116 executes one or more programs stored in the memory, the following operations are implemented: instructing the display screen 112 to display the filling picture filled into the filling area; acquiring the filling mode input by the input device 114; determining a first filling parameter corresponding to a preset filling mode based on the filling picture, obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relation between the filling mode and the preset filling mode, and filling the filling picture into the filling area according to the preset filling mode based on the second filling parameter.
On the basis of the foregoing embodiment, when the processor 116 obtains the second filling parameter corresponding to the filling manner based on the first filling parameter and the offset relationship between the filling manner and the preset filling manner, the method specifically includes: acquiring the size of a filling picture and the size of a filling area; obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area and the offset relation; and obtaining a second filling parameter based on the first filling parameter and the offset distance.
On the basis of the foregoing embodiment, when the offset relationship includes the first vertical offset, the processor 116, when obtaining the corresponding offset distance based on the size of the filled picture, the size of the filled area, and the offset relationship, specifically includes: obtaining the ratio of the height of the filling area to the height of the filling picture to obtain a first quantity value of the filling picture in the vertical direction; and acquiring the product of the fractional part value of the first quantity value and the height of the filling picture to obtain a first offset distance in the vertical direction.
On the basis of the foregoing embodiment, when the offset relationship includes the first horizontal offset, the processor 116, when obtaining the corresponding offset distance based on the size of the filling picture, the size of the filling area, and the offset relationship, specifically includes: obtaining the ratio of the width of the filling area to the width of the filling picture to obtain a second numerical value of the filling picture in the horizontal direction; and acquiring the product of the fractional part value of the second numerical value and the width of the filling picture to obtain a first offset distance in the horizontal direction.
On the basis of the foregoing embodiment, when the offset relationship includes the second vertical offset, the processor 116, when obtaining the corresponding offset distance based on the size of the filling picture, the size of the filling area, and the offset relationship, specifically includes: obtaining the ratio of the height of the filling area to a first preset value to obtain a first ratio; obtaining the ratio of the first ratio to the height of the filling picture to obtain a third quantity value of the filling picture in the vertical direction; and obtaining a second offset distance in the vertical direction based on the fractional part of the third quantity value, the height of the filling picture and a second preset value.
On the basis of the foregoing embodiment, when the processor 116 obtains the second offset distance in the vertical direction based on the fractional part of the third quantity value, the height of the filling picture, and the second preset value, the method specifically includes: under the condition that the fractional part of the third quantity value is greater than or equal to the second preset value, obtaining the difference between the fractional part of the third quantity value and the second preset value to obtain a first difference value, and obtaining the product of the first difference value and the height of the filling picture to obtain the second offset distance in the vertical direction; and under the condition that the fractional part of the third quantity value is smaller than the second preset value, obtaining the sum of the fractional part of the third quantity value and the second preset value to obtain a first sum value, and obtaining the product of the first sum value and the height of the filling picture to obtain the second offset distance in the vertical direction.
On the basis of the foregoing embodiment, when the offset relationship includes the second horizontal offset, the processor 116, when obtaining the corresponding offset distance based on the size of the filling picture, the size of the filling area, and the offset relationship, specifically includes: obtaining the ratio of the width of the filling area to a first preset value to obtain a second ratio; obtaining the ratio of the second ratio to the width of the filling picture to obtain a fourth quantity value of the filling picture in the horizontal direction; and obtaining a second offset distance in the horizontal direction based on the fractional part of the fourth quantity value, the width of the filling picture and a second preset value.
On the basis of the foregoing embodiment, when the processor 116 obtains the second offset distance in the horizontal direction based on the fractional part of the fourth quantity value, the width of the filling picture and the second preset value, the method specifically includes: under the condition that the fractional part of the fourth quantity value is greater than or equal to the second preset value, obtaining the difference between the fractional part of the fourth quantity value and the second preset value to obtain a second difference value, and obtaining the product of the second difference value and the width of the filling picture to obtain the second offset distance in the horizontal direction; and under the condition that the fractional part of the fourth quantity value is smaller than the second preset value, obtaining the sum of the fractional part of the fourth quantity value and the second preset value to obtain a second sum value, and obtaining the product of the second sum value and the width of the filling picture to obtain the second offset distance in the horizontal direction.
On the basis of the foregoing embodiment, when the processor 116 obtains the second filling parameter based on the first filling parameter and the offset distance, the method specifically includes: and adjusting parameters in the first filling parameters based on the offset distance in the vertical direction and/or the offset distance in the horizontal direction to obtain second filling parameters.
In the embodiment of the invention, after the filling picture to be filled into the filling area and the filling mode are acquired, the first filling parameter corresponding to the preset filling mode is determined based on the filling picture, the second filling parameter corresponding to the filling mode is obtained based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode, and the filling picture is then filled into the filling area according to the preset filling mode based on the second filling parameter. In this way, when the rendering end supports only the upper-left alignment mode by default, the other alignment modes can be converted into offsets relative to upper-left alignment for rendering, which achieves the same rendering effect, reduces the processing complexity, the cost and the workload, and achieves the technical effect of quickly supporting picture analysis. This solves the technical problem of the prior-art picture processing method, in which filling the filling picture into the filling area separately for each filling mode makes the processing complex and the workload large.
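For illustration, the sketch below ties the pieces together by dispatching from an alignment mode to the offset distances applied to the preset top-left parameter. The mode-to-offset mapping follows claim 10 below; the preset values 2 and 0.5 and all function names are assumptions, as in the earlier sketches, and this is not the patent's code.

```python
import math

def _first(area: float, pic: float) -> float:          # flush-to-far-edge rule
    return (area / pic % 1) * pic

def _second(area: float, pic: float) -> float:         # centring rule (assumed presets 2 and 0.5)
    frac = (area / 2) / pic % 1
    return (frac - 0.5) * pic if frac >= 0.5 else (frac + 0.5) * pic

def offsets_for_mode(mode, area_w, area_h, pic_w, pic_h):
    """Return (dx, dy) to add to the top-left filling parameter."""
    dx, dy = 0.0, 0.0
    if mode in ("bottom-left", "bottom-right", "bottom"):
        dy = _first(area_h, pic_h)
    if mode in ("top-right", "bottom-right", "right"):
        dx = _first(area_w, pic_w)
    if mode in ("left", "right"):
        dy = _second(area_h, pic_h)
    if mode in ("top", "bottom"):
        dx = _second(area_w, pic_w)
    return dx, dy

# "top-left" keeps (0, 0); e.g. a right alignment is flush with the right edge
# and vertically centred.
dx, dy = offsets_for_mode("right", 400, 250, 90, 80)
```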
Example 6
According to an embodiment of the present invention, an embodiment of a storage medium is provided, where the storage medium includes a stored program, and when the program runs, a device on which the storage medium is located is controlled to execute the above-mentioned picture processing method.
Example 7
According to an embodiment of the present invention, an embodiment of a processor is provided, where the processor is configured to execute a program, and the program executes the above-mentioned picture processing method.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and these modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (17)

1. A picture processing method, comprising:
acquiring a filling picture filled into the filling area and a filling mode;
determining a first filling parameter corresponding to a preset filling mode based on the filling picture;
obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relation between the filling mode and the preset filling mode;
filling the filling picture into the filling area according to the preset filling mode based on the second filling parameter;
obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode, including: acquiring the size of the filling picture and the size of the filling area; obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area and the offset relation; and obtaining the second filling parameter based on the first filling parameter and the offset distance, wherein the first two parameters in the first filling parameter are used for determining the horizontal and vertical offset conditions of the filling picture, and the last two parameters are used for determining the width and height of the filling picture.
2. The method of claim 1, wherein the filling mode comprises one of: top left alignment, bottom left alignment, top right alignment, bottom right alignment, left alignment, top alignment, bottom alignment, and right alignment.
3. The method of claim 1, wherein the offset relationship comprises: a vertical offset and/or a horizontal offset, the offset distance comprising: an offset distance in a vertical direction and/or an offset distance in a horizontal direction, wherein the vertical offset comprises: a first vertical offset or a second vertical offset, the horizontal offset comprising: a first horizontal offset or a second horizontal offset.
4. The method of claim 3, wherein, in the case that the offset relationship comprises the first vertical offset, obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area, and the offset relationship comprises:
obtaining the ratio of the height of the filling area to the height of the filling picture to obtain a first quantity value of the filling picture in the vertical direction;
and obtaining the product of the fractional part value of the first quantity value and the height of the filling picture to obtain a first offset distance in the vertical direction.
5. The method according to claim 3, wherein, in the case that the offset relationship comprises the first horizontal offset, obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area, and the offset relationship comprises:
obtaining the ratio of the width of the filling area to the width of the filling picture to obtain a second numerical value of the filling picture in the horizontal direction;
and acquiring the product of the fractional part value of the second numerical value and the width of the filling picture to obtain a first offset distance in the horizontal direction.
6. The method of claim 3, wherein, in the case that the offset relationship comprises the second vertical offset, obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area, and the offset relationship comprises:
obtaining the ratio of the height of the filling area to a first preset value to obtain a first ratio;
obtaining the ratio of the first ratio to the height of the filling picture to obtain a third quantity value of the filling picture in the vertical direction;
and obtaining a second offset distance in the vertical direction based on the fractional part of the third quantity value, the height of the filling picture and a second preset value.
7. The method of claim 6, wherein obtaining the second offset distance in the vertical direction based on the fractional part of the third quantity value, the height of the filling picture and the second preset value comprises:
under the condition that the fractional part of the third quantity value is greater than or equal to the second preset value, obtaining the difference between the fractional part of the third quantity value and the second preset value to obtain a first difference value, and obtaining the product of the first difference value and the height of the filling picture to obtain a second offset distance in the vertical direction;
and under the condition that the fractional part of the third quantity value is smaller than the second preset value, obtaining the sum of the fractional part of the third quantity value and the second preset value to obtain a first sum value, and obtaining the product of the first sum value and the height of the filling picture to obtain the second offset distance in the vertical direction.
8. The method according to claim 3, wherein, in the case that the offset relationship comprises the second horizontal offset, obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area, and the offset relationship comprises:
obtaining the ratio of the width of the filling area to a first preset value to obtain a second ratio;
obtaining the ratio of the second ratio to the width of the filling picture to obtain a fourth quantity value of the filling picture in the horizontal direction;
and obtaining a second offset distance in the horizontal direction based on the fractional part of the fourth quantity value, the width of the filling picture and a second preset value.
9. The method of claim 8, wherein obtaining the second offset distance in the horizontal direction based on the fractional part of the fourth quantity value, the width of the filling picture and the second preset value comprises:
under the condition that the fractional part of the fourth quantity value is greater than or equal to the second preset value, obtaining the difference between the fractional part of the fourth quantity value and the second preset value to obtain a second difference value, and obtaining the product of the second difference value and the width of the filling picture to obtain a second offset distance in the horizontal direction;
and under the condition that the fractional part of the fourth quantity value is smaller than the second preset value, obtaining the sum of the fractional part of the fourth quantity value and the second preset value to obtain a second sum value, and obtaining the product of the second sum value and the width of the filling picture to obtain the second offset distance in the horizontal direction.
10. The method of claim 3,
in the case that the filling mode comprises a bottom left alignment, the offset relationship comprises the first vertical offset;
in the case that the filling mode comprises a top right alignment, the offset relationship comprises the first horizontal offset;
in the case that the filling mode comprises a bottom right alignment, the offset relationship comprises the first vertical offset and the first horizontal offset;
in the case that the filling mode comprises a left alignment, the offset relationship comprises the second vertical offset;
in the case that the filling mode comprises a top alignment, the offset relationship comprises the second horizontal offset;
in the case that the filling mode comprises a right alignment, the offset relationship comprises the second vertical offset and the first horizontal offset;
and in the case that the filling mode comprises a bottom alignment, the offset relationship comprises the first vertical offset and the second horizontal offset.
11. The method of claim 3, wherein deriving the second filling parameter based on the first filling parameter and the offset distance comprises:
and adjusting parameters in the first filling parameters based on the offset distance in the vertical direction and/or the offset distance in the horizontal direction to obtain the second filling parameters.
12. A picture processing method, comprising:
displaying the filling picture filled into the filling area and the filling mode;
displaying a filled filling area, wherein the filled filling area is obtained by filling the filling picture into the filling area according to a preset filling mode based on a second filling parameter corresponding to the filling mode, and the second filling parameter is obtained based on a first filling parameter corresponding to the preset filling mode and an offset relation between the filling mode and the preset filling mode;
obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode, including: acquiring the size of the filling picture and the size of the filling area; obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area and the offset relation; and obtaining the second filling parameter based on the first filling parameter and the offset distance, wherein the first two parameters in the first filling parameter are used for determining the horizontal and vertical offset conditions of the filling picture, and the last two parameters are used for determining the width and height of the filling picture.
13. A picture processing apparatus, comprising:
the acquisition module is used for acquiring the filling picture filled into the filling area and the filling mode corresponding to the filling picture;
the determining module is used for determining a first filling parameter corresponding to a preset filling mode based on the filling picture;
the processing module is used for obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relation between the filling mode and the preset filling mode;
the filling module is used for filling the filling picture into the filling area according to the preset filling mode based on the second filling parameter;
obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode, including: acquiring the size of the filling picture and the size of the filling area; obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area and the offset relation; and obtaining the second filling parameter based on the first filling parameter and the offset distance, wherein the first two parameters in the first filling parameter are used for determining the horizontal and vertical offset conditions of the filling picture, and the last two parameters are used for determining the width and height of the filling picture.
14. A picture processing apparatus, comprising:
the first display module is used for displaying the filling picture filled into the filling area and the filling mode;
the second display module is used for displaying a filled filling area, wherein the filled filling area is obtained by filling the filling picture into the filling area according to a preset filling mode based on a second filling parameter corresponding to the filling mode, and the second filling parameter is obtained based on a first filling parameter corresponding to the preset filling mode and an offset relation between the filling mode and the preset filling mode;
obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode, including: acquiring the size of the filling picture and the size of the filling area; obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area and the offset relation; and obtaining the second filling parameter based on the first filling parameter and the offset distance, wherein the first two parameters in the first filling parameter are used for determining the horizontal and vertical offset conditions of the filling picture, and the last two parameters are used for determining the width and height of the filling picture.
15. A picture processing device, comprising:
the display screen is used for displaying the filling pictures filled into the filling area;
an input device for inputting a filling mode;
the processor is connected with the display screen and the input device and used for determining a first filling parameter corresponding to a preset filling mode based on the filling picture, obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relation between the filling mode and the preset filling mode, and filling the filling picture into the filling area according to the preset filling mode based on the second filling parameter;
obtaining a second filling parameter corresponding to the filling mode based on the first filling parameter and the offset relationship between the filling mode and the preset filling mode, including: acquiring the size of the filling picture and the size of the filling area; obtaining a corresponding offset distance based on the size of the filling picture, the size of the filling area and the offset relation; and obtaining the second filling parameter based on the first filling parameter and the offset distance, wherein the first two parameters in the first filling parameter are used for determining the horizontal and vertical offset conditions of the filling picture, and the last two parameters are used for determining the width and height of the filling picture.
16. A storage medium, characterized in that the storage medium comprises a stored program, wherein when the program runs, a device where the storage medium is located is controlled to execute the picture processing method according to any one of claims 1 to 12.
17. A processor, configured to execute a program, wherein the program executes the picture processing method according to any one of claims 1 to 12.
CN201810935679.1A 2018-08-16 2018-08-16 Picture processing method, device and equipment Active CN109241304B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810935679.1A CN109241304B (en) 2018-08-16 2018-08-16 Picture processing method, device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810935679.1A CN109241304B (en) 2018-08-16 2018-08-16 Picture processing method, device and equipment

Publications (2)

Publication Number Publication Date
CN109241304A CN109241304A (en) 2019-01-18
CN109241304B true CN109241304B (en) 2021-12-03

Family

ID=65071311

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810935679.1A Active CN109241304B (en) 2018-08-16 2018-08-16 Picture processing method, device and equipment

Country Status (1)

Country Link
CN (1) CN109241304B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113312569B (en) * 2021-05-17 2023-10-03 浪潮金融信息技术有限公司 Method, system and medium for pseudo-randomly displaying webpage background

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101408989A (en) * 2008-10-17 2009-04-15 北大方正集团有限公司 Method and apparatus for filling primitive base on graphics set split joint
CN104424295A (en) * 2013-09-02 2015-03-18 联想(北京)有限公司 Information processing method and electronic equipment
CN104424174A (en) * 2013-09-11 2015-03-18 北京大学 Document processing system and document processing method
CN104881844A (en) * 2015-06-29 2015-09-02 北京金山安全软件有限公司 Picture combination method and device and terminal equipment
US20170287181A1 (en) * 2013-04-23 2017-10-05 Adobe Systems Incorporated Fast high-fidelity flood-filling on vector artwork
CN107248187A (en) * 2017-05-22 2017-10-13 武汉地大信息工程股份有限公司 A kind of method of quick three-dimensional model texture cutting restructuring
CN107943363A (en) * 2017-11-21 2018-04-20 广州视睿电子科技有限公司 The collocation method and device of background image, interactive intelligent tablet computer and storage medium
CN108376416A (en) * 2018-01-16 2018-08-07 天津大学 A kind of image generating method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Survey on Image Processing Under Image Restoration; J. Narmadha et al.; 《2017 IEEE International Conference on Electrical, Instrumentation and Communication Engineering (ICEICE)》; 20170430; pp. 59-62 *
Research and implementation of a real-time image affine transformation system (实时图像仿射变换系统的研究与实现); Wang Jinhui et al.; 《自动控制与检测》(Automatic Control and Detection); 20120224; full text *

Also Published As

Publication number Publication date
CN109241304A (en) 2019-01-18

Similar Documents

Publication Publication Date Title
CN109831662B (en) Real-time picture projection method and device of AR (augmented reality) glasses screen, controller and medium
US9723123B2 (en) Multi-screen control method and device supporting multiple window applications
CN111866423B (en) Screen recording method for electronic terminal and corresponding equipment
US9898241B2 (en) Information sharing system, image processing apparatus, and image processing method
CN110750197B (en) File sharing method, device and system, corresponding equipment and storage medium
US20130219295A1 (en) Multimedia system and associated methods
EP2879044B1 (en) Information processing apparatus, program, information processing system, and information processing method
CN106708452B (en) Information sharing method and terminal
AU2011345468A1 (en) Three dimensional (3D) display terminal apparatus and operating method thereof
CN108958569B (en) Control method, device, system and terminal of smart television and smart television
EP3021572A1 (en) Display apparatus and control method thereof
CN111414225A (en) Three-dimensional model remote display method, first terminal, electronic device and storage medium
CN112073798B (en) Data transmission method and equipment
CN108259923B (en) Video live broadcast method, system and equipment
EP2645622B1 (en) Image processing apparatus and image processing system
US20170229102A1 (en) Techniques for descriptor overlay superimposed on an asset
US9509733B2 (en) Program, communication apparatus and control method
WO2022242497A1 (en) Video photographing method and apparatus, electronic device, and storage medium
CN111352560B (en) Screen splitting method and device, electronic equipment and computer readable storage medium
CN109241304B (en) Picture processing method, device and equipment
CN108399058B (en) Signal display control method and device
CN104461219A (en) Devices and method for processing information
CN114007022A (en) Video source switching method and device and LED display screen control system
CN111045770A (en) Method, first terminal, device and readable storage medium for remote exhibition
CN113596561A (en) Video stream playing method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant