CN114863008A - Image processing method, image processing device, electronic equipment and storage medium - Google Patents

Image processing method, image processing device, electronic equipment and storage medium

Info

Publication number
CN114863008A
Authority
CN
China
Prior art keywords
image
gray
area
target
mask layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210625319.8A
Other languages
Chinese (zh)
Other versions
CN114863008B (en)
Inventor
闫晓林
吕鸿瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xintang Sichuang Educational Technology Co Ltd
Original Assignee
Beijing Xintang Sichuang Educational Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xintang Sichuang Educational Technology Co Ltd filed Critical Beijing Xintang Sichuang Educational Technology Co Ltd
Priority to CN202210625319.8A priority Critical patent/CN114863008B/en
Publication of CN114863008A publication Critical patent/CN114863008A/en
Application granted granted Critical
Publication of CN114863008B publication Critical patent/CN114863008B/en
Priority to PCT/CN2023/097058 priority patent/WO2023232014A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/10 - Geometric effects
    • G06T 15/20 - Perspective computation
    • G06T 15/205 - Image-based rendering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 - Scrolling or panning
    • G06F 3/04855 - Interaction with scrollbars

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides an image processing method and apparatus, an electronic device, and a storage medium. The method includes: receiving a trigger operation performed by a user on a scroll bar in an image scrolling list; in response to an end instruction of the trigger operation, determining a target color image to be displayed in the image scrolling list; when the target color image satisfies a preset graying display condition, rendering the target color image based on a preset graying shader to obtain a target gray image; and displaying the target gray image in the display area of the image scrolling list. With this scheme, the target color image that needs to be displayed in gray is rendered and grayed by the graying shader, so a corresponding gray image does not need to be added for each color image in the resource package, which effectively reduces the size of the resource package and saves storage resources.

Description

Image processing method, image processing device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
In some application scenarios, it is often necessary to display the same image in alternating color and gray states, for example, when a button is not available, a gray button image is displayed, and when a button is available, a color button image is displayed.
In the related art, for a scenario that needs to alternately display an image in color and in gray, a gray counterpart is usually added for each color image, so that the resource package of the scenario contains two sets of images with the same content but different colors; when graying or lighting up is needed, the appropriate image resource is selected from the resource package and swapped in for display.
However, in the above scheme the resource package contains two sets of image resources whose content is identical and which differ only in color; the duplicated resources enlarge the resource package and occupy more storage resources.
Disclosure of Invention
In order to solve, or at least partially solve, the above technical problems, embodiments of the present disclosure provide an image processing method and apparatus, an electronic device, and a storage medium.
According to an aspect of the present disclosure, there is provided an image processing method including:
receiving a triggering operation of a user on a scroll bar in an image scroll list;
in response to an ending instruction of the trigger operation, determining a target color image to be displayed in the image scrolling list;
when the target color image satisfies a preset graying display condition, rendering the target color image based on a preset graying shader to obtain a target gray image;
displaying the target gray image within a display area of the image scrolling list.
According to another aspect of the present disclosure, there is provided an image processing apparatus including:
the receiving module is used for receiving the triggering operation of a user on a scroll bar in an image scroll list;
the determining module is used for determining, in response to an end instruction of the trigger operation, a target color image to be displayed in the image scrolling list;
the rendering module is used for rendering the target color image based on a preset graying shader to obtain a target gray image when it is determined that the target color image satisfies a preset graying display condition;
and the display module is used for displaying the target gray image in a display area of the image scrolling list.
According to another aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing a program,
wherein the program comprises instructions which, when executed by the processor, cause the processor to carry out the image processing method according to the preceding aspect.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the image processing method according to the preceding aspect.
According to another aspect of the present disclosure, a computer program product is provided, comprising a computer program, wherein the computer program, when executed by a processor, implements the image processing method of the preceding aspect.
According to one or more technical solutions provided in the embodiments of the present disclosure, when a target color image to be displayed satisfies a preset graying display condition, a preset graying shader is used to render the target color image to obtain a target gray image, which is then displayed in the display area of the image scrolling list. Because the target color image that needs to be displayed in gray is rendered and grayed by the graying shader, a corresponding gray image does not need to be added for each color image in the resource package, which effectively reduces the size of the resource package and saves storage resources.
Drawings
Further details, features and advantages of the disclosure are disclosed in the following description of exemplary embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 shows a flow diagram of an image processing method according to an exemplary embodiment of the present disclosure;
FIG. 2 is a diagram illustrating an example of a display effect in which a target color image includes three images and all satisfy a graying display condition in an exemplary embodiment of the present disclosure;
FIG. 3 shows a flow chart of an image processing method according to another exemplary embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating an effect of a transparent mask layer covering a display area according to an exemplary embodiment of the disclosure;
fig. 5 shows a schematic block diagram of an image processing apparatus according to an exemplary embodiment of the present disclosure;
FIG. 6 illustrates a block diagram of an exemplary electronic device that can be used to implement embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description. It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It should be noted that the modifiers "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting; those skilled in the art will recognize that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
An image processing method, an apparatus, an electronic device, and a storage medium provided by the present disclosure are described below with reference to the accompanying drawings.
At present, in application scenarios where a gray image and a color image need to be displayed alternately, two approaches are generally adopted. The first is to add a corresponding gray image for each color image, that is, two sets of images with identical content are placed in the resource package, one set in color and the other in gray, and when an image needs to be grayed or lit up, the image of the corresponding color is selected from the resource package for display; however, this causes the resource package to contain duplicated resources, which enlarges the package, occupies more storage resources, and increases the workload of developers who must swap images for display. The second is to directly modify the parameters of the RGB channels of the image so that the whole image turns gray; this approach gives a poor result, since the image darkens overall and the visual effect is poor.
To solve the above problems, the present disclosure provides an image processing method in which, when a target color image to be displayed satisfies a preset graying display condition, a preset graying shader is used to render the target color image to obtain a target gray image, and the target gray image is displayed in the display area of an image scrolling list. Because the target color image that needs to be displayed in gray is rendered and grayed by the graying shader, a corresponding gray image does not need to be added for each color image in the resource package, which effectively reduces the size of the resource package and saves storage resources. Moreover, since the gray image is obtained by rendering with the preset graying shader, the development work of swapping images is avoided, reducing the workload of developers. In addition, the graying approach provided by the present disclosure does not change the RGB channel parameters of the original color image; the image can be grayed without affecting the quality of the original image, which improves the visual effect of the gray image.
Fig. 1 shows a flowchart of an image processing method according to an exemplary embodiment of the present disclosure, which may be performed by an image processing apparatus provided in an embodiment of the present disclosure, wherein the apparatus may be implemented in software and/or hardware, and may be generally integrated in an electronic device, including but not limited to a computer, a mobile phone, a server, and the like.
As shown in fig. 1, the image processing method may include the steps of:
step 101, receiving a trigger operation of a user on a scroll bar in an image scroll list.
The image scrolling list may be any image scrolling list shown on a display. For example, it may be the image scrolling list used when image resources in a resource package are displayed in the application program to which the user is currently logged in: when the user views the resource package, each image in the resource package is displayed in the form of an image scrolling list, where the resource package includes a plurality of color images.
It can be understood that the contents displayed in the scroll list can be switched by operating the scroll bar in the scroll list. In the embodiment of the present disclosure, the user may perform a trigger operation on the scroll bar in the image scroll list to switch the image displayed in the display area of the image scroll list.
The scroll bar includes a scroll slider, a blank area, and arrows at its two ends. The trigger operation on the scroll bar includes, but is not limited to, dragging the scroll slider or clicking the blank area or the arrows at the two ends of the scroll bar, and the user can perform the trigger operation with a mouse, a stylus, a finger, or the like.
Step 102, in response to an end instruction of the trigger operation, determining a target color image to be displayed in the image scrolling list.
In the embodiment of the present disclosure, after the user performs a trigger operation on the scroll bar in the image scrolling list, the electronic device receives the trigger operation and monitors for an end instruction of the trigger operation. For example, when it is detected that a finger touches the scroll slider, the tracking attribute of the scroll bar changes to true; when the finger moves on the screen, the dragging attribute of the scroll bar changes to true; and when the finger leaves the screen, both the tracking attribute and the dragging attribute change to false, at which point an end instruction of the trigger operation is considered to have been detected.
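As a simple illustration of this end-of-operation detection (the class and attribute names below are hypothetical, not taken from the disclosure), the two flags can be tracked and checked as follows:

```python
from dataclasses import dataclass

@dataclass
class ScrollBarState:
    """Hypothetical scroll-bar state mirroring the 'tracking' and
    'dragging' attributes described above."""
    tracking: bool = False   # a finger is touching the scroll slider
    dragging: bool = False   # the finger is moving on the screen

def on_touch_down(state: ScrollBarState) -> None:
    state.tracking = True    # finger touches the slider

def on_touch_move(state: ScrollBarState) -> None:
    state.dragging = True    # finger moves on the screen

def on_touch_up(state: ScrollBarState) -> None:
    state.tracking = False   # finger leaves the screen
    state.dragging = False

def end_instruction_detected(state: ScrollBarState) -> bool:
    """Both attributes back to False means the trigger operation has ended."""
    return not state.tracking and not state.dragging
```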
Then, when the end instruction of the trigger operation is detected, the target color image to be displayed in the image scrolling list is determined in response to that instruction.
Wherein the determined target color image comprises at least one image. It can be understood that when a plurality of images are included in the target color image, the plurality of images may not be completely displayed in the display area of the image scroll list due to the size limitation of the display area of the image scroll list. For example, if the target color image includes three images, the display area may only be able to display the second image completely, only the bottom portion of the content may be displayed for the first image, and only the top portion of the content may be displayed for the third image.
It can be understood that, when the scheme of the present disclosure is adopted, only one set of color images needs to be added to the resource package, and therefore, the determined image to be displayed is a color image, which is called a target color image.
It should be noted that, in the embodiment of the present disclosure, a currently common manner of determining target content displayed in a scroll list according to a trigger operation on a scroll bar may be adopted to determine a target color image to be displayed in an image scroll list, which is not described in detail herein.
Step 103, when it is determined that the target color image satisfies the preset graying display condition, rendering the target color image based on a preset graying shader to obtain a target gray image.
The graying shader is preset and carries the modified gray parameters; it can be used to render a color image into a gray image without affecting the image quality of the color image.
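Purely as an illustration (the disclosure does not give the shader code, and the luminance weights and NumPy formulation below are assumptions), the per-pixel computation that such a graying shader performs at render time can be sketched as follows; in practice it would run in a fragment shader on the GPU, reading the original color texture without modifying it:

```python
import numpy as np

def gray_render(rgba: np.ndarray) -> np.ndarray:
    """Render a color image (H x W x 4, float values in [0, 1]) to gray.

    Illustrative sketch only: each RGB value is replaced by a single
    luminance value at render time, while the source texture and its
    alpha channel are left untouched.
    """
    # ITU-R BT.601 luminance weights, a common choice for graying shaders.
    weights = np.array([0.299, 0.587, 0.114])
    luminance = rgba[..., :3] @ weights                 # H x W
    gray = np.repeat(luminance[..., None], 3, axis=-1)  # H x W x 3
    # Keep the original alpha so the transparency of the sprite is preserved.
    return np.concatenate([gray, rgba[..., 3:4]], axis=-1)
```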
In the embodiment of the present disclosure, after the target color image to be displayed is determined, it may further be determined whether the target color image satisfies the preset graying display condition, and a target color image that satisfies the graying display condition is rendered and grayed based on the preset graying shader to obtain a target gray image.
The graying display condition can be preset according to the application scenario and actual requirements.
For example, in a virtual classroom scene built with Unity3D, a corresponding color image may be set for each course: a course the user has already studied is lit up, that is, its color image is displayed, while a gray image is displayed for a course the user has not yet studied. For this scene, the graying display condition may therefore be set as the user not yet having studied the corresponding course. After the target color image is determined, it can be judged whether the user has studied the course corresponding to the target color image: if so, the target color image is determined not to satisfy the graying display condition; if not, it is determined to satisfy the graying display condition, and the target color image is rendered based on the graying shader to obtain the corresponding target gray image.
It should be noted that, in the embodiment of the present disclosure, when the target color image includes multiple images, whether each image satisfies the preset graying display condition may be determined separately; the images that satisfy the graying display condition are rendered and grayed based on the graying shader to obtain the corresponding target gray images, while the images that do not satisfy it keep their original colors and are not grayed.
Step 104, displaying the target gray image in the display area of the image scrolling list.
In the embodiment of the present disclosure, after the target gray image is obtained, the target gray image may be displayed in the display area of the image scrolling list.
It can be understood that if the target color image does not satisfy the graying display condition, the target color image itself is displayed in the display area of the image scrolling list. When the target color image includes multiple images, some of them may satisfy the graying display condition while others do not; only the images that satisfy the condition are rendered and grayed by the graying shader, so that when displaying, grayed images are shown for those that satisfy the condition and the original color images are shown for those that do not. Moreover, when the target color image includes multiple images, not every image may be displayed completely in the display area. For example, fig. 2 shows the display effect when the target color image includes three images that all satisfy the graying display condition: only the middle gray image (image B) can be displayed completely in the display area of the image scrolling list, while only part of the content of the previous image (image A) and the next image (image C) can be displayed.
In the image processing method of the embodiment of the present disclosure, a trigger operation performed by a user on the scroll bar in an image scrolling list is received; in response to an end instruction of the trigger operation, a target color image to be displayed in the image scrolling list is determined; when it is determined that the target color image satisfies the preset graying display condition, the target color image is rendered based on the preset graying shader to obtain a target gray image; and the target gray image is then displayed in the display area of the image scrolling list. With this scheme, the target color image that needs to be displayed in gray is rendered and grayed by the graying shader, so a corresponding gray image does not need to be added for each color image in the resource package, which effectively reduces the size of the resource package and saves storage resources.
In some application scenarios, for example a virtual game scene, items such as props, task cards, and medals that the user has already collected are displayed in the user's backpack: the images of items the user has collected are lit up, that is, color images are displayed, while the images of items the user has not collected are grayed, that is, gray images are displayed. Thus, in an optional embodiment of the present disclosure, the method further includes:
acquiring backpack information of the user in the current virtual scene, wherein the backpack information includes color images of items that the user has collected;
determining that the target color image satisfies the graying display condition when the target color image is not included in those color images.
The current virtual scene includes, but is not limited to, a virtual classroom scene, a virtual game scene, and so on, and the types and contents of the items may differ depending on the current virtual scene. For example, if the current virtual scene is a virtual classroom scene, the items may be the courses, knowledge cards, and the like contained in the virtual classroom scene; if the current virtual scene is a virtual game scene, the items may be level cards, props, and the like in the virtual game scene.
In the embodiment of the present disclosure, on the premise of obtaining the user's authorization, the backpack information of the user in the current virtual scene can be acquired according to the user's login information; the backpack information includes the color images of items the user has collected. The target color image to be displayed is matched against the color images in the backpack information to judge whether those color images include the target color image. The matching may be performed by image recognition, by comparing image identifiers, or in a similar way; the present disclosure does not limit the specific matching method. If the matching result shows that the color images do not include the target color image, it is determined that the item corresponding to the target color image has not been collected by the user, and the target color image is determined to satisfy the preset graying display condition.
Further, in an optional implementation of the present disclosure, if the color images include the target color image, it may be determined that the item corresponding to the target color image has already been collected by the user, the target color image is determined not to satisfy the preset graying display condition, and the target color image is then displayed in the display area of the image scrolling list.
In the embodiment of the present disclosure, the backpack information of the user in the current virtual scene is acquired, where the backpack information includes the color images of items the user has collected. When those color images do not include the target color image, the target color image is determined to satisfy the graying display condition and is displayed in gray, that is, the corresponding target gray image is displayed; when they do include the target color image, the target color image is determined not to satisfy the graying display condition and is displayed as is in the display area of the image scrolling list. In this way, whether the target color image satisfies the graying display condition is judged according to the item collection state in the virtual scene, and an image of the corresponding color is displayed accordingly.
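A minimal sketch of this judgment, under the assumption that the backpack information and the images are matched by identifier (the disclosure leaves the matching method open; the names below are illustrative only):

```python
def satisfies_graying_condition(target_image_id: str,
                                backpack_image_ids: set) -> bool:
    """The target color image is grayed exactly when the backpack
    (items already collected by the user) does not contain it."""
    return target_image_id not in backpack_image_ids

# Usage: items 'medal_01' and 'prop_03' have been collected, so the image
# for 'card_07' would be rendered with the graying shader.
collected = {"medal_01", "prop_03"}
assert satisfies_graying_condition("card_07", collected)
assert not satisfies_graying_condition("medal_01", collected)
```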
When multiple images are to be shown in the display area of the image scrolling list, not every image may be displayed completely. For example, in the display effect shown in fig. 2, only image B is displayed completely, only the bottom part of image A is shown, and only the top part of image C is shown, which is not attractive and affects the user's visual experience. To solve this problem, the present disclosure further provides an image processing method in which the images that cannot be displayed completely are masked so that they are invisible to the user; the user can then only see the images that are completely displayed in the display area of the image scrolling list, which improves the image display effect and the user's visual experience. This scheme is described in detail below with reference to fig. 3.
As shown in fig. 3, on the basis of the foregoing embodiment, the image processing method may further include the steps of:
step 201, determining a complete image which can be completely displayed in the display area from the target gray image.
For example, an image that can be completely displayed in the display area in the target gray image may be determined according to the arrangement order of the images included in the target gray image, the size of each image, and the size of the display area of the image scroll list, and the image that can be completely displayed may be referred to as a complete image.
For example, the target color image to be displayed, and which of its images can be displayed completely in the display area, may be determined according to the position information of the complete image currently shown in the display area of the image scrolling list when the trigger operation on the scroll bar is received, the offset of the scroll slider when the end instruction of the trigger operation is received, the size information of each image, and so on; determining which color images can be displayed completely likewise determines the complete image in the target gray image that can be completely displayed in the display area.
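As an illustrative sketch only (the disclosure does not prescribe the layout arithmetic; a vertical list with uniform spacing is assumed here), the complete images can be found by checking which image extents lie entirely inside the display area:

```python
def fully_visible_indices(scroll_offset: float,
                          image_heights: list,
                          spacing: float,
                          viewport_height: float) -> list:
    """Return the indices of images whose full extent lies inside the
    display area of a vertical scrolling list.

    scroll_offset is how far the list content has been scrolled upward,
    in the same units as the image heights.
    """
    visible = []
    y = -scroll_offset                      # top edge of the first image
    for i, h in enumerate(image_heights):
        if y >= 0 and y + h <= viewport_height:
            visible.append(i)               # image i is a "complete image"
        y += h + spacing
    return visible

# Example: three 200-px images with 20-px spacing in a 300-px viewport,
# scrolled by 150 px; only the middle image (index 1) is complete.
print(fully_visible_indices(150, [200, 200, 200], 20, 300))  # [1]
```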
Step 202, a transparent mask layer is obtained, wherein the transparent mask layer comprises a visualization area and a non-visualization area.
The transparent mask layer can be preset; it includes a visualization area and a non-visualization area, and the coordinates, length, and width of the visualization area and the non-visualization area can be flexibly adjusted as needed.
Step 203, covering the display area with the transparent mask layer, wherein the visualization area covers the complete image in the display area, and the non-visualization area covers other areas of the display area.
In the embodiment of the present disclosure, after the transparent mask layer is obtained, the display area of the image scrolling list may be covered with it: the visualization area of the transparent mask layer covers the complete image in the display area, and the non-visualization area covers the parts of the display area outside the area where the complete image is located, so that the complete image in the display area is visible to the user while the other areas are not.
For example, the coordinates and size of the visualization area of the transparent mask layer can be flexibly adjusted according to the display position of the complete image in the display area, so that the visualization area matches the area where the complete image is displayed and covers only the complete image, keeping the complete image visible. Likewise, the size and position of the non-visualization area can be flexibly adjusted according to the display position of the complete image and the position of the display area, so that the non-visualization area matches the parts of the display area outside the area where the complete image is located and covers those parts, making the images displayed there invisible.
In the image processing method described above, the complete image that can be completely displayed in the display area is determined from the target gray image, a transparent mask layer including a visualization area and a non-visualization area is obtained, and the display area is then covered with the transparent mask layer, with the visualization area covering the complete image in the display area and the non-visualization area covering the rest of the display area. With this scheme, the incompletely displayed images in the display area are masked and made invisible, and only the completely displayed image remains visible to the user, which avoids showing incomplete, partial images in the image scrolling list, improves the image display effect, and enhances the user's visual experience.
In the embodiment of the present disclosure, a corresponding transparent mask layer may be preset for each positional relationship between the complete image and the images that cannot be completely displayed in the target gray image, so that the display area of the image scrolling list is covered with the corresponding transparent mask layer, which reduces the processing complexity of the covering step. Thus, in an optional embodiment of the present disclosure, obtaining the transparent mask layer may include:
if the target gray image includes the complete image and a partial image of the image following the complete image, obtaining a first transparent mask layer, where the first transparent mask layer includes a first visualization area and a first non-visualization area, and the distance between the two adjacent edges of the first visualization area and the first non-visualization area is smaller than a preset distance threshold;
if the target gray image includes a partial image of the image preceding the complete image, the complete image, and a partial image of the image following the complete image, obtaining a second transparent mask layer, where the second transparent mask layer includes a second non-visualization area, a second visualization area, and a third non-visualization area, the distance between the two adjacent edges of the second non-visualization area and the second visualization area is smaller than the preset distance threshold, and the distance between the two adjacent edges of the second visualization area and the third non-visualization area is smaller than the preset distance threshold;
and if the target gray image includes a partial image of the image preceding the complete image and the complete image, obtaining a third transparent mask layer, where the third transparent mask layer includes a fourth non-visualization area and a third visualization area, and the distance between the two adjacent edges of the third visualization area and the fourth non-visualization area is smaller than the preset distance threshold.
The preset distance threshold may be set according to actual requirements. For example, it may be set according to the interval between two adjacent images in the image scrolling list and should be no greater than that interval, so as to avoid a situation where a visualization area cannot completely cover the complete image or a non-visualization area cannot completely cover a partial image.
It can be understood that, when the image scroll list is a list in the vertical direction, the interval between two adjacent images refers to the distance between the bottom edge of the previous image and the top edge of the next image; when the image scroll list is a list in the horizontal direction, the interval between two adjacent images refers to the distance between the right side of the previous image and the left side of the next image.
In the embodiment of the present disclosure, when the transparent mask layer needs to be obtained, the positional relationship between the complete image that can be completely displayed in the display area of the image scrolling list and the partial images that cannot be completely displayed may be determined from the target gray image, and a suitable transparent mask layer is then selected according to that positional relationship to cover the display area. Specifically, if the target gray image includes the complete image and a partial image of the next image, that is, the complete image comes before the partial image, the first transparent mask layer is obtained; it includes a first visualization area and a first non-visualization area whose two adjacent edges are less than the preset distance threshold apart. If the target gray image includes a partial image of the previous image, the complete image, and a partial image of the next image, that is, partial images lie on both sides of the complete image, the second transparent mask layer is obtained; it includes a second non-visualization area, a second visualization area, and a third non-visualization area, where the two adjacent edges of the second non-visualization area and the second visualization area are less than the preset distance threshold apart, as are the two adjacent edges of the second visualization area and the third non-visualization area. If the target gray image includes a partial image of the previous image and the complete image, that is, the partial image comes before the complete image, the third transparent mask layer is obtained; it includes a fourth non-visualization area and a third visualization area whose two adjacent edges are less than the preset distance threshold apart. Here, "before" and "after" are determined by the order in which the images appear in the image scrolling list: the image that appears earlier is in front, and the image that appears later is behind.
In the embodiment of the present disclosure, different transparent mask layers are selected to cover the display area of the image scrolling list according to the different positional relationships between the complete image and the partial images contained in the target gray image, so that a suitable transparent mask layer is chosen for the covering, which improves the flexibility of the method. In addition, because the transparent mask layer corresponding to the positional relationship between the complete image and the partial images is selected to cover the display area, only the coordinates and sizes of the visualization and non-visualization areas need slight adjustment during covering, and their relative arrangement does not need to change, which reduces the processing complexity of the covering step.
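The selection of the preset mask layer from the positional relationship can be sketched as follows (the enum and function names are illustrative assumptions, not taken from the disclosure):

```python
from enum import Enum, auto
from typing import Optional

class MaskLayer(Enum):
    FIRST = auto()    # visualization area first, non-visualization area after
    SECOND = auto()   # non-visualization / visualization / non-visualization
    THIRD = auto()    # non-visualization area first, visualization area after

def select_mask_layer(has_partial_before: bool,
                      has_partial_after: bool) -> Optional[MaskLayer]:
    """Choose the preset transparent mask layer from whether a partial
    image precedes and/or follows the complete image."""
    if has_partial_before and has_partial_after:
        return MaskLayer.SECOND
    if has_partial_after:
        return MaskLayer.FIRST
    if has_partial_before:
        return MaskLayer.THIRD
    return None   # everything fits in the display area: no masking needed
```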
In an optional implementation of the present disclosure, when the obtained transparent mask layer is used to cover the display area of the image scrolling list, first position information of the display area and second position information of the complete image within the display area may be obtained first; third position information corresponding to the parts of the display area other than the complete image is determined from the first position information and the second position information; the visualization area of the transparent mask layer is then displayed over the complete image based on the second position information, and the non-visualization area of the transparent mask layer is displayed over the other areas based on the third position information.
Wherein the position information may be represented in the form of two-dimensional coordinates.
In the embodiment of the present disclosure, the second position information of the area in the display area that should remain visible and the third position information of the other areas that should not be visible are obtained, the visualization area of the transparent mask layer is displayed over the complete image based on the second position information, and the non-visualization area is displayed over the other areas based on the third position information. In this way, the complete image in the display area can be shown accurately while the partial images in the display area are covered and made invisible, which improves the effect of the transparent mask.
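A sketch of deriving the third position information (the strips to be covered by the non-visualization area) from the first and second position information, assuming axis-aligned rectangles in a vertical list (the Rect type and function name are illustrative only):

```python
from typing import NamedTuple

class Rect(NamedTuple):
    x: float
    y: float
    width: float
    height: float

def non_visualization_rects(display: Rect, complete: Rect) -> list:
    """Split the display area into the strips above and below the complete
    image; these strips are covered by the non-visualization area, while
    `complete` itself is covered by the visualization area."""
    rects = []
    top_height = complete.y - display.y
    if top_height > 0:
        rects.append(Rect(display.x, display.y, display.width, top_height))
    bottom_y = complete.y + complete.height
    bottom_height = (display.y + display.height) - bottom_y
    if bottom_height > 0:
        rects.append(Rect(display.x, bottom_y, display.width, bottom_height))
    return rects
```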
It can be understood that when the target gray image includes a partial image that cannot be completely displayed, that partial image needs to be masked so that it is invisible to the user, which improves the display effect; when the target gray image contains only a complete image that can be displayed in full within the display area, masking the display area with the transparent mask layer is pointless. Therefore, in an optional embodiment of the present disclosure, before the transparent mask layer is obtained to cover the display area, it may be determined whether the target gray image includes a residual image that cannot be completely displayed in the display area, and the step of obtaining the transparent mask layer and covering the display area of the image scrolling list described above is performed only when it does.
For example, whether the target gray image includes a residual image that cannot be completely displayed in the display area may be determined from the size of the display area of the image scrolling list and the total size of the images in the target gray image. Taking a vertical image scrolling list as an example, the height of the display area, the number of images in the target gray image, and the height of each image may be obtained; the total height required to display the target gray image is determined from the height of each image and the preset interval between two adjacent images, and if the total height is greater than the height of the display area, it is determined that the target gray image includes a residual image that cannot be completely displayed in the display area.
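A minimal sketch of this check for a vertical list, assuming a uniform preset interval between adjacent images:

```python
def has_residual_image(image_heights: list,
                       interval: float,
                       display_height: float) -> bool:
    """True when the images plus the preset interval between adjacent
    images cannot all fit into the display area, i.e. some image can only
    be partially displayed and masking is worthwhile."""
    if not image_heights:
        return False
    total_height = sum(image_heights) + interval * (len(image_heights) - 1)
    return total_height > display_height
```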
In the embodiment of the present disclosure, before the transparent mask layer is obtained to cover the display area, it is first confirmed that the target gray image includes a residual image that cannot be completely displayed in the display area, and only then is the transparent mask layer obtained to cover the display area. The masking operation is therefore performed only when the target gray image contains an image that cannot be fully displayed, and is skipped when the target gray image can be fully displayed in the display area, which avoids unnecessary masking operations and helps save processing resources.
It should be noted that, in the embodiment of the present disclosure, when color images are displayed in the display area of the image scrolling list, any incompletely displayed color image may also be masked using the above scheme, so that only the completely displayed color image is visible to the user.
Fig. 4 is a schematic diagram of the effect of covering the display area with a transparent mask layer according to an exemplary embodiment of the present disclosure. In the display effect shown in fig. 2, image A and image C show only partial content, which is unattractive; with the scheme of the present disclosure, a transparent mask layer can be obtained to cover the partial content of image A and image C so that the user sees only the completely displayed image B. The covered display effect is shown in fig. 4, where the dashed boxes indicate the non-visualization areas of the transparent mask layer; it should be understood that in practice the dashed lines do not exist, and fig. 4 uses them only to mark the covered non-visualization areas rather than actual display content. As can be seen from fig. 4, image A and image C, which show only partial content, are invisible to the user, which improves the image display effect and the aesthetics.
The exemplary embodiment of the present disclosure also provides an image processing apparatus. Fig. 5 shows a schematic block diagram of an image processing apparatus according to an exemplary embodiment of the present disclosure, and as shown in fig. 5, the image processing apparatus 50 includes: a receiving module 501, a determining module 502, a rendering module 503, and a presentation module 504.
The receiving module 501 is configured to receive a trigger operation of a user on a scroll bar in an image scroll list;
a determining module 502, configured to determine, in response to an end instruction of the trigger operation, a target color image to be displayed in the image scrolling list;
a rendering module 503, configured to, when it is determined that the target color image satisfies a preset graying display condition, render the target color image based on a preset graying shader to obtain a target gray image;
a display module 504, configured to display the target gray image in a display area of the image scrolling list.
Optionally, the image processing apparatus 50 further includes:
the first acquisition module is used for acquiring backpack information of the user in the current virtual scene, wherein the backpack information includes color images of items that the user has collected;
and the judging module is used for determining that the target color image satisfies the graying display condition when the target color image is not included in those color images.
Optionally, the determining module is further configured to:
determining that the target color image does not satisfy the graying display condition in a case where the target color image is included in the color image;
the display module 504 is further configured to:
and displaying the target color image in a display area of the image scrolling list.
Optionally, the image processing apparatus 50 further includes:
the first processing module is used for determining a complete image which can be completely displayed in the display area from the target gray image;
the second obtaining module is used for obtaining a transparent mask layer, and the transparent mask layer comprises a visual area and a non-visual area;
a second processing module, configured to cover the display area with the transparent mask layer, where the visualized area covers the complete image in the display area, and the non-visualized area covers other areas of the display area.
Optionally, the second processing module is further configured to:
acquiring first position information of the display area and second position information of the complete image in the display area;
determining third position information corresponding to other areas except the complete image in the display area according to the first position information and the second position information;
displaying the visualization area of the transparent mask layer over the complete image based on the second position information;
displaying the non-visualization area of the transparent mask layer over the other area based on the third position information.
Optionally, the second obtaining module is further configured to:
if the target gray image includes the complete image and a partial image of the image following the complete image, obtaining a first transparent mask layer, where the first transparent mask layer includes a first visualization area and a first non-visualization area, and the distance between the two adjacent edges of the first visualization area and the first non-visualization area is smaller than a preset distance threshold;
if the target gray image includes a partial image of the image preceding the complete image, the complete image, and a partial image of the image following the complete image, obtaining a second transparent mask layer, where the second transparent mask layer includes a second non-visualization area, a second visualization area, and a third non-visualization area, the distance between the two adjacent edges of the second non-visualization area and the second visualization area is smaller than the preset distance threshold, and the distance between the two adjacent edges of the second visualization area and the third non-visualization area is smaller than the preset distance threshold;
and if the target gray image includes a partial image of the image preceding the complete image and the complete image, obtaining a third transparent mask layer, where the third transparent mask layer includes a fourth non-visualization area and a third visualization area, and the distance between the two adjacent edges of the third visualization area and the fourth non-visualization area is smaller than the preset distance threshold.
Optionally, the image processing apparatus 50 further includes:
and the third processing module is used for determining that the target gray image comprises a residual image which cannot be completely displayed in the display area.
The image processing apparatus provided in the embodiments of the present disclosure can perform any image processing method applicable to an electronic device provided in the embodiments of the present disclosure, and has the corresponding functional modules and the beneficial effects of the method it performs. For details not described in the apparatus embodiments of the present disclosure, reference may be made to the description of any method embodiment of the present disclosure.
An exemplary embodiment of the present disclosure also provides an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor. The memory stores a computer program executable by the at least one processor, and the computer program, when executed by the at least one processor, causes the electronic device to perform the image processing method according to the embodiments of the present disclosure.
The disclosed exemplary embodiments also provide a non-transitory computer readable storage medium storing a computer program, wherein the computer program, when executed by a processor of a computer, is for causing the computer to perform an image processing method according to an embodiment of the present disclosure.
The exemplary embodiments of the present disclosure also provide a computer program product comprising a computer program, wherein the computer program, when being executed by a processor of a computer, is adapted to cause the computer to carry out an image processing method according to an embodiment of the present disclosure.
Referring to fig. 6, a block diagram of the structure of an electronic device 1100, which may be a server or a client of the present disclosure and is an example of a hardware device that may be applied to aspects of the present disclosure, will now be described. The term electronic device is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other suitable computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing devices, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant as examples only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the electronic device 1100 includes a computing unit 1101, which can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 1102 or a computer program loaded from a storage unit 1108 into a Random Access Memory (RAM) 1103. The RAM 1103 can also store various programs and data necessary for the operation of the device 1100. The computing unit 1101, the ROM 1102, and the RAM 1103 are connected to each other by a bus 1104. An input/output (I/O) interface 1105 is also connected to the bus 1104.
A number of components in electronic device 1100 connect to I/O interface 1105, including: an input unit 1106, an output unit 1107, a storage unit 1108, and a communication unit 1109. The input unit 1106 may be any type of device capable of inputting information to the electronic device 1100, and the input unit 1106 may receive input numeric or character information and generate key signal inputs related to user settings and/or function controls of the electronic device. Output unit 1107 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, a video/audio output terminal, a vibrator, and/or a printer. Storage unit 1108 may include, but is not limited to, a magnetic or optical disk. The communication unit 1109 allows the electronic device 1100 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunications networks, and may include, but is not limited to, modems, network cards, infrared communication devices, wireless communication transceivers and/or chipsets, such as bluetooth (TM) devices, WiFi devices, WiMax devices, cellular communication devices, and/or the like.
The computing unit 1101 can be a variety of general purpose and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 1101 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and the like. The calculation unit 1101 performs the respective methods and processes described above. For example, in some embodiments, the image processing method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as storage unit 1108. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 1100 via the ROM 1102 and/or the communication unit 1109. In some embodiments, the computing unit 1101 may be configured to perform the image processing method by any other suitable means (e.g., by means of firmware).
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package that runs partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
As used in this disclosure, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic disks, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

Claims (10)

1. An image processing method, wherein the method comprises:
receiving a trigger operation performed by a user on a scroll bar in an image scroll list;
in response to an end instruction of the trigger operation, determining a target color image to be displayed in the image scroll list;
in a case where the target color image satisfies a preset graying-out display condition, rendering the target color image based on a preset graying-out shader to obtain a target gray image; and
displaying the target gray image within a display area of the image scroll list.
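For illustration only, a minimal Python sketch of the flow in claim 1. The claim does not fix a grayscale formula, so the graying-out shader is modelled here as a CPU-side Rec. 601 luminance conversion, and the helper names (gray_shader, on_scroll_end, display) are hypothetical, not terms of the disclosure.

```python
from typing import Callable

import numpy as np


def gray_shader(color_image: np.ndarray) -> np.ndarray:
    """Assumed graying-out shader: Rec. 601 luminance copied back to all channels."""
    weights = np.array([0.299, 0.587, 0.114], dtype=np.float32)
    luma = color_image[..., :3].astype(np.float32) @ weights
    return np.repeat(luma.astype(color_image.dtype)[..., None], 3, axis=-1)


def on_scroll_end(
    target_color_image: np.ndarray,
    satisfies_graying_condition: bool,
    display: Callable[[np.ndarray], None],
) -> None:
    """End-of-trigger handling: gray out the target image when the condition holds."""
    if satisfies_graying_condition:
        target_gray_image = gray_shader(target_color_image)  # render via the gray shader
        display(target_gray_image)                            # show in the list's display area
    else:
        display(target_color_image)                           # otherwise show the color image (cf. claim 3)


# Example: a random 2x2 color image, grayed out before display.
on_scroll_end(
    np.random.randint(0, 256, size=(2, 2, 3), dtype=np.uint8),
    satisfies_graying_condition=True,
    display=lambda img: print(img.shape),
)
```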
2. The image processing method according to claim 1, wherein determining that the target color image satisfies the preset graying-out display condition comprises:
acquiring backpack information of the user in a current virtual scene, wherein the backpack information comprises color images of items that the user has received; and
determining that the target color image satisfies the graying-out display condition in a case where the target color image is not included in the color images.
3. The image processing method according to claim 2, wherein the method further comprises:
determining that the target color image does not satisfy the graying-out display condition in a case where the target color image is included in the color images; and
displaying the target color image in the display area of the image scroll list.
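As a sketch of the test in claims 2 and 3, the graying-out condition reduces to a membership check against the backpack information. BackpackInfo and the id-based comparison are the editor's assumptions about how the color images would be keyed.

```python
from dataclasses import dataclass, field
from typing import Set


@dataclass
class BackpackInfo:
    """Backpack information for the current virtual scene: the color images
    (keyed by id here) of the items the user has already received."""
    received_image_ids: Set[str] = field(default_factory=set)


def satisfies_graying_condition(target_image_id: str, backpack: BackpackInfo) -> bool:
    """Gray out the target image when it is not among the received images;
    otherwise the color image is displayed unchanged."""
    return target_image_id not in backpack.received_image_ids


# Example: only "sword" has been received, so "shield" would be rendered gray.
backpack = BackpackInfo(received_image_ids={"sword"})
assert satisfies_graying_condition("shield", backpack)
assert not satisfies_graying_condition("sword", backpack)
```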
4. The image processing method according to any one of claims 1 to 3, wherein the method further comprises:
determining, from the target gray image, a complete image that can be completely displayed in the display area;
obtaining a transparent mask layer, wherein the transparent mask layer comprises a visualization area and a non-visualization area; and
covering the display area with the transparent mask layer, wherein the visualization area covers the complete image within the display area, and the non-visualization area covers the other areas of the display area.
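A sketch of the first step of claim 4: picking, from the images visible in the display area, the one that is completely displayed. Rectangles are assumed to be (x, y, width, height) tuples in display coordinates; the function names are illustrative, not from the disclosure.

```python
from typing import List, Optional, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height) in display coordinates


def contains(outer: Rect, inner: Rect) -> bool:
    """True when `inner` lies entirely inside `outer`."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return ox <= ix and oy <= iy and ix + iw <= ox + ow and iy + ih <= oy + oh


def find_complete_image(display_area: Rect, image_rects: List[Rect]) -> Optional[Rect]:
    """Return the rect of the image that can be completely displayed, if any."""
    for rect in image_rects:
        if contains(display_area, rect):
            return rect
    return None  # every visible image is clipped


# Example: a 300 px tall window showing three stacked 200 px images.
window: Rect = (0, 0, 240, 300)
images: List[Rect] = [(0, -140, 240, 200), (0, 60, 240, 200), (0, 260, 240, 200)]
print(find_complete_image(window, images))  # (0, 60, 240, 200)
```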
5. The image processing method according to claim 4, wherein covering the display area with the transparent mask layer comprises:
acquiring first position information of the display area and second position information of the complete image in the display area;
determining, according to the first position information and the second position information, third position information corresponding to the areas of the display area other than the complete image;
displaying the visualization area of the transparent mask layer over the complete image based on the second position information; and
displaying the non-visualization area of the transparent mask layer over the other areas based on the third position information.
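A sketch of the position bookkeeping in claim 5 for a vertically scrolling list; the vertical orientation, the tuple representation, and all names are the editor's assumptions. The first position information is the display area, the second is the complete image, and the third is derived as the strips of the display area above and below the complete image; the visualization area is then aligned with the second position and the non-visualization area with the third.

```python
from dataclasses import dataclass
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)


@dataclass
class MaskPlacement:
    visualization_area: Rect             # transparent region over the complete image
    non_visualization_areas: List[Rect]  # covering regions over the remaining strips


def place_mask(display_area: Rect, complete_image: Rect) -> MaskPlacement:
    """Derive the third position information and place the mask regions."""
    dx, dy, dw, dh = display_area   # first position information
    _, iy, _, ih = complete_image   # second position information
    other_areas: List[Rect] = []    # third position information
    if iy > dy:                     # strip above the complete image
        other_areas.append((dx, dy, dw, iy - dy))
    if iy + ih < dy + dh:           # strip below the complete image
        other_areas.append((dx, iy + ih, dw, (dy + dh) - (iy + ih)))
    return MaskPlacement(visualization_area=complete_image,
                         non_visualization_areas=other_areas)


# Example: a 300 px tall display area whose complete image starts 60 px down.
print(place_mask((0, 0, 240, 300), (0, 60, 240, 200)).non_visualization_areas)
# [(0, 0, 240, 60), (0, 260, 240, 40)]
```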
6. The image processing method according to claim 4, wherein obtaining the transparent mask layer comprises:
if the target gray image comprises the complete image and a partial image of the image following the complete image, acquiring a first transparent mask layer, wherein the first transparent mask layer comprises a first visualization area and a first non-visualization area, and the distance between the adjacent edges of the first visualization area and the first non-visualization area is smaller than a preset distance threshold;
if the target gray image comprises a partial image of the image preceding the complete image, the complete image, and a partial image of the image following the complete image, acquiring a second transparent mask layer, wherein the second transparent mask layer comprises a second non-visualization area, a second visualization area and a third non-visualization area, the distance between the adjacent edges of the second non-visualization area and the second visualization area is smaller than the preset distance threshold, and the distance between the adjacent edges of the second visualization area and the third non-visualization area is smaller than the preset distance threshold; and
if the target gray image comprises a partial image of the image preceding the complete image and the complete image, acquiring a third transparent mask layer, wherein the third transparent mask layer comprises a fourth non-visualization area and a third visualization area, and the distance between the adjacent edges of the third visualization area and the fourth non-visualization area is smaller than the preset distance threshold.
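Claim 6 distinguishes three layouts of the target gray image and selects a matching transparent mask layer for each. A sketch of that selection follows, with the three variants reduced to an enum; whether a preceding or following image is partially visible is assumed to be known from the scroll position, and all identifiers are illustrative.

```python
from enum import Enum, auto


class MaskLayerVariant(Enum):
    FIRST = auto()   # complete image + partial following image: non-visualization area below
    SECOND = auto()  # partial preceding + complete + partial following: areas above and below
    THIRD = auto()   # partial preceding image + complete image: non-visualization area above


def select_mask_layer(partial_preceding: bool, partial_following: bool) -> MaskLayerVariant:
    """Pick the transparent mask layer matching what is visible alongside the complete image."""
    if partial_preceding and partial_following:
        return MaskLayerVariant.SECOND
    if partial_preceding:
        return MaskLayerVariant.THIRD
    return MaskLayerVariant.FIRST


assert select_mask_layer(False, True) is MaskLayerVariant.FIRST
assert select_mask_layer(True, True) is MaskLayerVariant.SECOND
assert select_mask_layer(True, False) is MaskLayerVariant.THIRD
```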
7. The image processing method according to claim 4, wherein the method further comprises:
determining that the target gray image comprises a residual image that cannot be completely displayed within the display area.
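For claim 7, the residual images are simply the visible images that fail the containment test sketched above; a short sketch under the same assumed rectangle representation and illustrative names.

```python
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)


def contains(outer: Rect, inner: Rect) -> bool:
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return ox <= ix and oy <= iy and ix + iw <= ox + ow and iy + ih <= oy + oh


def residual_images(display_area: Rect, image_rects: List[Rect]) -> List[Rect]:
    """Images that appear in the display area but cannot be completely displayed."""
    def overlaps(rect: Rect) -> bool:
        _, dy, _, dh = display_area
        _, iy, _, ih = rect
        return iy < dy + dh and iy + ih > dy   # vertical overlap with the window
    return [r for r in image_rects if overlaps(r) and not contains(display_area, r)]


window: Rect = (0, 0, 240, 300)
images: List[Rect] = [(0, -140, 240, 200), (0, 60, 240, 200), (0, 260, 240, 200)]
print(residual_images(window, images))  # [(0, -140, 240, 200), (0, 260, 240, 200)]
```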
8. An image processing apparatus, wherein the apparatus comprises:
a receiving module, configured to receive a trigger operation performed by a user on a scroll bar in an image scroll list;
a determining module, configured to determine, in response to an end instruction of the trigger operation, a target color image to be displayed in the image scroll list;
a rendering module, configured to render the target color image based on a preset graying-out shader to obtain a target gray image, in a case where it is determined that the target color image satisfies a preset graying-out display condition; and
a display module, configured to display the target gray image in a display area of the image scroll list.
9. An electronic device, comprising:
a processor; and
a memory for storing a program,
wherein the program comprises instructions which, when executed by the processor, cause the processor to carry out the image processing method according to any one of claims 1 to 7.
10. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the image processing method according to any one of claims 1 to 7.
CN202210625319.8A 2022-06-02 2022-06-02 Image processing method, image processing device, electronic equipment and storage medium Active CN114863008B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210625319.8A CN114863008B (en) 2022-06-02 2022-06-02 Image processing method, image processing device, electronic equipment and storage medium
PCT/CN2023/097058 WO2023232014A1 (en) 2022-06-02 2023-05-30 Image processing method and apparatus, and electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210625319.8A CN114863008B (en) 2022-06-02 2022-06-02 Image processing method, image processing device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114863008A true CN114863008A (en) 2022-08-05
CN114863008B CN114863008B (en) 2023-03-10

Family

ID=82623884

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210625319.8A Active CN114863008B (en) 2022-06-02 2022-06-02 Image processing method, image processing device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN114863008B (en)
WO (1) WO2023232014A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023232014A1 (en) * 2022-06-02 2023-12-07 北京新唐思创教育科技有限公司 Image processing method and apparatus, and electronic device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110134308A (en) * 2019-05-17 2019-08-16 深圳前海微众银行股份有限公司 Method for exhibiting data, device, equipment and computer readable storage medium
CN110717005A (en) * 2019-10-10 2020-01-21 支付宝(杭州)信息技术有限公司 Thermodynamic diagram texture generation method, device and equipment
CN111858986A (en) * 2020-06-28 2020-10-30 深圳市维度统计咨询股份有限公司 Picture browsing method, system and storage medium
CN111967702A (en) * 2019-05-20 2020-11-20 阿里巴巴集团控股有限公司 Data processing method and system
CN112184595A (en) * 2020-10-23 2021-01-05 青岛海信移动通信技术股份有限公司 Mobile terminal and image display method thereof
US20210193018A1 (en) * 2019-12-18 2021-06-24 Chongqing Boe Optoelectronics Technology Co., Ltd. Image processing method and display device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2675171B1 (en) * 2012-06-11 2018-01-24 BlackBerry Limited Transparency information in image or video format not natively supporting transparency
US9800852B1 (en) * 2016-09-07 2017-10-24 Essential Products, Inc. Color reconstruction
CN114399437A (en) * 2021-12-31 2022-04-26 上海米哈游璃月科技有限公司 Image processing method and device, electronic equipment and storage medium
CN114863008B (en) * 2022-06-02 2023-03-10 北京新唐思创教育科技有限公司 Image processing method, image processing device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2023232014A1 (en) 2023-12-07
CN114863008B (en) 2023-03-10

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant