CN113672134B - Media information editing method, device, computer readable medium and electronic equipment - Google Patents


Info

Publication number
CN113672134B
Authority
CN
China
Prior art keywords
media information
association
editing
target
text
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110875251.4A
Other languages
Chinese (zh)
Other versions
CN113672134A (en)
Inventor
姜伟
王宁
张爽
郎勇
程龙
刘恺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sogou Technology Development Co Ltd
Original Assignee
Beijing Sogou Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sogou Technology Development Co Ltd
Priority to CN202110875251.4A
Publication of CN113672134A
Application granted
Publication of CN113672134B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/10 Text processing
    • G06F 40/166 Editing, e.g. inserting or deleting

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application provide a media information editing method and apparatus, a computer-readable medium, and an electronic device. The method includes the following steps: acquiring reference data for media information editing, and extracting reference media information from the reference data; acquiring the basic media information already edited in a media information editing area, and determining a target association relationship between the reference media information and the basic media information; determining, according to the target association relationship, a target editing mode for editing the reference media information into the media information editing area; and editing the reference media information into the media information editing area according to the target editing mode to obtain new basic media information. The technical solutions of the embodiments of the present application can improve the efficiency of media information editing.

Description

Media information editing method, device, computer readable medium and electronic equipment
Technical Field
The present application relates to the field of computer and media data processing technologies, and in particular to a media information editing method and apparatus, a computer-readable medium, and an electronic device.
Background
In an application scenario of media information editing, for example text editing, a section of text is usually selected from elsewhere and simply copied into an input box. However, this editing mode can only paste the text mechanically: it offers no richer editing options, the flexibility of text editing is low, it is difficult to produce a suitable text, and the efficiency of text editing is therefore low. Based on this, how to improve the efficiency of media information editing is a technical problem to be solved.
Disclosure of Invention
Embodiments of the present application provide a media information editing method and apparatus, a computer program product or computer program, a computer-readable medium, and an electronic device, so that the efficiency of media information editing can be improved at least to some extent.
Other features and advantages of the application will be apparent from the following detailed description, or may be learned by the practice of the application.
According to an aspect of an embodiment of the present application, there is provided a media information editing method including: acquiring reference data for editing media information, and extracting reference media information from the reference data; acquiring edited basic media information in a media information editing area, and determining a target association relationship between the reference media information and the basic media information; determining a target editing mode for editing the reference media information into the media information editing area according to the target association relation; and editing the reference media information into the media information editing area according to the target editing mode so as to obtain new basic media information.
According to an aspect of an embodiment of the present application, there is provided a media information editing apparatus including: a first acquisition unit configured to acquire reference data for media information editing and extract reference media information from the reference data; a second acquisition unit configured to acquire the base media information that has been edited in the media information editing area, and determine a target association relationship between the reference media information and the base media information; the first determining unit is used for determining a target editing mode for editing the reference media information into the media information editing area according to the target association relation; and the editing unit is used for editing the reference media information into the media information editing area according to the target editing mode so as to obtain new basic media information.
In some embodiments of the present application, based on the foregoing solution, the first obtaining unit is configured to: selecting reference data for editing media information in an interface; and acquiring the reference data when a drag event of dragging the reference data to a media information editing area in the interface is detected.
In some embodiments of the application, based on the foregoing solution, the first obtaining unit is further configured to: identifying a data type of the reference data; extracting the reference media information in the reference data by a media information extraction model corresponding to the data type.
In some embodiments of the present application, based on the foregoing, the data type of the reference data includes any one or a combination of any plurality of text data, image data, audio data, video data, and web page data.
In some embodiments of the present application, based on the foregoing solution, the second obtaining unit includes: a second determining unit, configured to obtain at least one association relationship between the reference media information and the base media information; and determining an association relationship as the target association relationship in the at least one association relationship.
In some embodiments of the application, based on the foregoing, the second determining unit is configured to: displaying the association relationship between the reference media information and the basic media information through at least one first control, wherein each first control corresponds to one association relationship between the reference media information and the basic media information; when a trigger event aiming at a target first control in the at least one first control is detected, determining an association relation corresponding to the target first control as the target association relation, wherein the target first control is any control in the at least one first control.
In some embodiments of the application, based on the foregoing, the second determining unit is configured to: determining the association degree of the reference media information and the basic media information on various association relations, wherein the association degree is used for representing the association strength of the reference media information and the basic media information on corresponding association relations; and determining an association relationship as the target association relationship in the at least one association relationship based on the association degrees of the reference media information and the basic media information on various association relationships.
In some embodiments of the application, based on the foregoing, the reference media information comprises reference text, the base media information comprises base text, and the second determining unit is configured to: carrying out semantic analysis on the reference text and the basic text through a semantic analysis model to obtain semantic features of the reference text and the basic text; and determining the association degree of the reference text and the basic text on various association relations semantically through semantic features of the reference text and the basic text.
In some embodiments of the application, based on the foregoing, the second determining unit is configured to: determining the association relation of which the association degree exceeds a preset threshold value as an association relation to be selected, and displaying the association relation to be selected through at least one second control, wherein each second control corresponds to one association relation to be selected; when a trigger event aiming at a target second control in the at least one second control is detected, determining a to-be-selected association relationship corresponding to the target second control as the target association relationship, wherein the target second control is any control in the at least one second control.
In some embodiments of the application, based on the foregoing, the second determining unit is configured to: and determining the association relation with the highest association degree as the target association relation.
In some embodiments of the present application, based on the foregoing aspect, the reference media information includes a reference text, the base media information includes a base text, the target association relationship includes any one of a similar association relationship, an above association relationship, a below association relationship, and an explanatory association relationship, and the first determining unit is configured to: when the target association relationship is the similar association relationship, determine a target editing mode of replacing the base text with the reference text; when the target association relationship is the above association relationship, determine a target editing mode of merging the reference text before the base text; when the target association relationship is the below association relationship, determine a target editing mode of merging the reference text after the base text; and when the target association relationship is the explanatory association relationship, determine a target editing mode of inserting the reference text at an adapted position in the base text.
In some embodiments of the application, based on the foregoing, the apparatus further comprises: and a writing unit for writing the reference media information to the media information editing area when there is no base media information that has been edited in the media information editing area.
According to an aspect of embodiments of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the electronic device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the electronic device performs the media information editing method described in the above embodiment.
According to an aspect of the embodiments of the present application, there is provided a computer-readable medium having stored thereon a computer program which, when executed by a processor, implements a media information editing method as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic apparatus including: one or more processors; and a storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the media information editing method as described in the above embodiments.
In the technical solutions provided in some embodiments of the present application, reference media information is extracted from reference data, and a target association relationship between the reference media information and the base media information already edited in a media information editing area is determined, so that a target editing mode for editing the reference media information into the media information editing area can be determined according to the target association relationship, and the reference media information can then be edited into the media information editing area according to that target editing mode. Extracting the reference media information from the reference data provides a variety of material for media information editing and enriches the editable content; determining, through the target association relationship between the reference media information and the base media information, a target editing mode suited to editing the reference media information into the media information editing area helps ensure the rationality of the edit and improves the efficiency of media information editing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application. It is evident that the drawings in the following description are only some embodiments of the present application and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art. In the drawings:
FIG. 1 shows a schematic diagram of an exemplary hardware environment in which the technical scheme of embodiments of the present application may be applied;
FIG. 2 illustrates a flowchart of a media information editing method according to one embodiment of the present application;
FIG. 3 illustrates a detailed flow diagram of acquiring reference data for editing media information according to one embodiment of the application;
FIG. 4 illustrates an interface illustration of dragging the reference data to a text editing area in the interface according to one embodiment of the application;
FIG. 5 shows a detailed flow diagram of extracting reference media information in the reference data according to one embodiment of the application;
FIG. 6 illustrates a detailed flow diagram of determining a target association between the reference media information and the base media information according to one embodiment of the application;
FIG. 7 shows a detailed flow diagram of determining an association among the at least one association as the target association in accordance with one embodiment of the application;
FIG. 8 illustrates another interface illustration of determining one of the at least one relationship as the target relationship in accordance with one embodiment of the present application;
FIG. 9 shows a detailed flow diagram of determining an association among the at least one association as the target association in accordance with one embodiment of the application;
FIG. 10 is a detailed flow diagram of determining the degree of association of the reference text and the base text on various association relationships according to one embodiment of the application;
FIG. 11 is a detailed flow chart of determining an association among the at least one candidate association as the target association according to one embodiment of the application;
FIG. 12 is another interface illustration of determining an association among the at least one candidate association as the target association according to one embodiment of the application;
FIG. 13 shows an overall flow diagram of a text editing method according to one embodiment of the application;
FIG. 14 shows a block diagram of a media information editing apparatus according to one embodiment of the present application;
Fig. 15 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the application may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
It should be noted that: references herein to "a plurality" means two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., a and/or B may represent: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and in the above-described figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the objects so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in other sequences than those illustrated or otherwise described.
Embodiments of the present application relate to techniques of artificial intelligence, by which fully automated processing of data (e.g., media information data) is achieved. Artificial Intelligence (AI) is the theory, method, technique, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain optimal results. In other words, artificial intelligence is a comprehensive technology of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a way similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning, and decision-making.
FIG. 1 shows a schematic diagram of an exemplary hardware environment in which the technical scheme of embodiments of the present application may be applied.
As shown in fig. 1, an implementation environment of the technical solution of the embodiments of the present application may include a terminal device. For example, the terminal device may be any one of the smart phone 101, the tablet 102, the touch display 103, and the portable computer 104 shown in fig. 1, or, of course, another electronic device with touch or non-touch display functions.
In one embodiment of the present application, a user may implement the technical solution of the embodiment of the present application using a smart phone with a touch or non-touch display function, such as smart phone 101 shown in fig. 1. Specifically, the screen of the smart phone may display an interface, and editing of media information may be performed in the interface displayed on the screen.
Further, as shown in fig. 1, any one of the terminal devices may obtain reference data for media information editing, extract reference media information from the reference data, then obtain the base media information already edited in a media information editing area, determine a target association relationship between the reference media information and the base media information, determine, according to the target association relationship, a target editing mode for editing the reference media information into the media information editing area, and finally edit the reference media information into the media information editing area according to the target editing mode to obtain new base media information.
In this embodiment, through the target association relationship between the reference media information and the base media information, a target editing mode adapted to editing the reference media information into the media information editing area can be determined, which helps ensure the rationality of media information editing and improves the efficiency of media information editing.
It should be noted that the media information mentioned in the present application may include text information, image information, audio information, video information, and any combination thereof.
The implementation details of the technical scheme of the embodiment of the application are described in detail below:
Fig. 2 shows a flowchart of a media information editing method according to an embodiment of the present application, which can be performed by a device having a calculation processing function, such as the terminal device shown in fig. 1. Referring to fig. 2, the method for editing media information at least includes steps 210 to 270, which are described in detail as follows:
in step 210, reference data for editing of media information is acquired, and reference media information is extracted from the reference data.
In the present application, the data type of the reference data may vary: it may be text data, image data, audio data, video data, or web page data, or any combination of these.
In one embodiment of the present application, the acquisition of reference data for editing of media information may be performed in accordance with the steps shown in fig. 3.
Referring to FIG. 3, a detailed flow diagram of acquiring reference data for editing media information is shown, according to one embodiment of the application. Specifically, steps 211 to 212 are included:
In step 211, reference data for editing of media information is selected in an interface.
In step 212, the reference data is acquired when a drag event is detected that drags the reference data to a media information editing area in the interface.
In order to better understand the present embodiment, the following will take media information as text information as an example, and will be described in connection with fig. 4 in the context of a mobile phone interface.
Referring to fig. 4, an interface illustration of dragging the reference data to a text editing area in the interface is shown, according to one embodiment of the application.
As illustrated in fig. 4, the cell phone interface 410, cell phone interface 420, cell phone interface 430, and cell phone interface 440 each include a text editing area 450.
In one case, in the cell phone interface 410, the text data 461 is acquired upon detecting that the selected text data 461 is dragged into the text editing area 450.
In one case, in the cell phone interface 420, the video data 462 is acquired upon detecting that the selected video data 462 is dragged into the text editing area 450.
In one case, in the cell phone interface 430, the audio data 463 is acquired upon detecting a drag of the selected audio data 463 into the text editing area 450.
In one case, in the cell phone interface 440, the image data 464 is acquired upon detecting the dragging of the selected image data 464 into the text editing area 450.
In this embodiment, the reference data is obtained by dragging, which improves the friendliness of interaction with the mobile phone interface during text editing and thereby improves text editing efficiency.
It should be noted that, in this embodiment, the interface may be a mobile phone interface, a PC interface, or a tablet computer interface, and it is understood that the interface mentioned herein may refer to all interfaces having an interaction function.
In one embodiment of the present application, the reference data for media information editing may alternatively be acquired by copying the reference data to the media information editing area in the interface.
Specifically, for example, a text file or an image file may be copied to the media information editing area in the interface so that reference data for media information editing is acquired.
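For illustration only, the acquisition flow of steps 211 to 212 can be sketched as follows in Python. The DragEvent structure, the MediaEditingArea class, and the callback name are hypothetical stand-ins, not part of the claimed method; the embodiment only requires that the reference data be captured once a drag (or copy) into the media information editing area is detected.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class DragEvent:
    """Hypothetical drag event: a payload and the UI region it was dropped on."""
    payload: bytes
    data_type: str          # e.g. "text", "image", "audio", "video", "web"
    target_region: str      # identifier of the region receiving the drop


@dataclass
class MediaEditingArea:
    region_id: str = "media_editing_area"
    pending_reference_data: List[DragEvent] = field(default_factory=list)

    def on_drag_event(self, event: DragEvent) -> Optional[DragEvent]:
        """Acquire the reference data only when it is dropped on this editing area (step 212)."""
        if event.target_region != self.region_id:
            return None                      # drop happened elsewhere; ignore
        self.pending_reference_data.append(event)
        return event                         # reference data acquired


# Usage: simulate dragging a piece of selected text onto the editing area.
area = MediaEditingArea()
acquired = area.on_drag_event(
    DragEvent(payload="selected reference text".encode("utf-8"),
              data_type="text", target_region="media_editing_area"))
print(acquired is not None)   # True: the reference data was captured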
In one embodiment of the application, the extraction of the reference media information in the reference data may be performed in accordance with the steps shown in fig. 5.
Referring to fig. 5, a detailed flow diagram of extracting reference media information from the reference data is shown, according to one embodiment of the application. Specifically, the method comprises the steps 213 to 214:
In step 213, the data type of the reference data is identified.
In step 214, the reference media information is extracted from the reference data by a media information extraction model corresponding to the data type.
In the present application, the media information extraction model may be trained in advance, and is mainly used for extracting the reference media information from the reference data.
Specifically, for example, when the data type is text data, the reference media information may be extracted from the text data through a corresponding media information extraction model.
For example, when the data type is image data, the image information in the image data can be identified through the corresponding media information extraction model, and the reference media information in the image information can be extracted.
Also for example, when the data type is audio data, sound information in the audio data may also be converted into reference media information through a corresponding media information extraction model.
For example, when the data type is video data or web page data, the reference media information may also be extracted from the video data or web page data through the corresponding media information extraction model.
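For illustration only, a minimal Python sketch of the type-dependent extraction in steps 213 to 214 follows. The extractor functions are placeholders standing in for the pre-trained media information extraction models (for example an image recognition model or a speech-to-text model); their names and behavior are assumptions, not the claimed models.

from typing import Callable, Dict

def extract_from_text(data: bytes) -> str:
    # Placeholder for a text extraction model; here it simply decodes the bytes.
    return data.decode("utf-8")

def extract_from_image(data: bytes) -> str:
    # Placeholder for an image recognition model extracting reference information.
    return "<text recognized from image>"

def extract_from_audio(data: bytes) -> str:
    # Placeholder for a model converting sound information into reference text.
    return "<text transcribed from audio>"

# Step 213: identify the data type; step 214: extract with the matching model.
EXTRACTORS: Dict[str, Callable[[bytes], str]] = {
    "text": extract_from_text,
    "image": extract_from_image,
    "audio": extract_from_audio,
}

def extract_reference_media(data: bytes, data_type: str) -> str:
    try:
        extractor = EXTRACTORS[data_type]
    except KeyError:
        raise ValueError(f"unsupported data type: {data_type}")
    return extractor(data)

print(extract_reference_media("reference text".encode("utf-8"), "text"))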
With continued reference to fig. 2, in step 230, the base media information that has been edited in the media information editing area is acquired, and a target association relationship between the reference media information and the base media information is determined.
In the present application, the base media information may refer to media information that has already been edited in the media information editing area. For example, in the cell phone interface 410 shown in fig. 4, the base media information (i.e., the base text) in the text editing area includes "Xiaoming, I heard that a traffic accident occurred on South Beijing Road yesterday evening."
In one embodiment of the present application, determining the target association relationship between the reference media information and the base media information may be performed according to the steps shown in fig. 6.
Referring to fig. 6, a detailed flowchart of determining a target association between the reference media information and the base media information is shown, according to one embodiment of the application. Specifically, the method comprises the steps 231 to 232:
In step 231, at least one association between the reference media information and the base media information is obtained.
In step 232, an association is determined as the target association among the at least one association.
In one embodiment of the present application, in the case where the reference media information includes a reference text and the base media information includes a base text, at least one association relationship may exist between the reference text and the base text, for example a similar association relationship, an above or below (contextual) association relationship, or an explanatory association relationship. The explanatory association relationship may mean that the reference text serves as an explanation of the base text; for example, the reference text "the highest mountain in the world" and the base text "Mount Qomolangma" may have an explanatory association relationship.
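For illustration only, the association relationships discussed in this application can be modeled as a small enumeration; the identifier names below are assumptions.

from enum import Enum, auto

class Association(Enum):
    SIMILAR = auto()       # reference text restates the base text
    ABOVE = auto()         # reference text reads as preceding context of the base text
    BELOW = auto()         # reference text reads as following context of the base text
    EXPLANATORY = auto()   # reference text explains a term in the base text

# e.g. "the highest mountain in the world" explaining the base text "Mount Qomolangma"
example = Association.EXPLANATORY
print(example.name)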
In one embodiment of step 232 shown in fig. 6, determining an association relationship among the at least one association relationship as the target association relationship may be performed according to the steps shown in fig. 7.
Referring to fig. 7, a detailed flowchart of determining an association among the at least one association as the target association according to one embodiment of the present application is shown. Specifically, the method includes steps 2321 to 2322:
In step 2321, an association between the reference media information and the base media information is displayed through at least one first control, where each first control corresponds to an association between the reference media information and the base media information.
In step 2322, when a trigger event for a target first control in the at least one first control is detected, determining an association relationship corresponding to the target first control as the target association relationship, where the target first control is any one control in the at least one first control.
In order to better understand the present embodiment, the following will take media information as text information as an example, and will be described in conjunction with fig. 8 in the context of a PC interface.
Referring to fig. 8, another interface illustration of determining one of the at least one association as the target association according to one embodiment of the present application is shown.
As illustrated in fig. 8, a text editing area 804 is included in the PC interface 801, and after the reference data 806 is dragged to the text editing area 804, the reference text 803 in the reference data is extracted, and at this time, an association relationship between the reference text 803 and the base text 802 is shown through at least one first control 805.
For example, different association relationships between the reference text 803 and the base text 802 may be displayed through the controls "first editing mode", "second editing mode", "third editing mode", and "fourth editing mode", and the user may click one of the controls to select the corresponding association relationship between the reference text 803 and the base text 802 as the target association relationship; for example, clicking the control "second editing mode" triggers an event for that control. Further, when a trigger event for the control "second editing mode" is detected, the association relationship corresponding to the control "second editing mode" is determined as the target association relationship.
In the application, on the one hand, the various association relationships between the reference media information and the base media information provide more selectable editing modes for subsequent media information editing, and the user can choose among the option controls corresponding to these association relationships, which improves the friendliness of interaction with the PC interface during media information editing and thus improves media information editing efficiency. On the other hand, because the target association relationship corresponds to the target first control selected by the user according to the actual situation, the accuracy of subsequent media information editing can be improved.
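For illustration only, a minimal Python sketch of steps 2321 to 2322 follows; the console output stands in for rendering the first controls in the interface, and the function names and the simulated click are assumptions.

from typing import Dict, List

def show_first_controls(associations: List[str]) -> Dict[str, str]:
    """Step 2321: one first control per candidate association relationship."""
    controls = {f"control_{i + 1}": assoc for i, assoc in enumerate(associations)}
    for control_id, assoc in controls.items():
        print(f"[{control_id}] {assoc}")     # stands in for rendering a UI control
    return controls

def on_control_triggered(controls: Dict[str, str], control_id: str) -> str:
    """Step 2322: the association bound to the triggered control becomes the target."""
    return controls[control_id]

controls = show_first_controls(
    ["similar association", "above association",
     "below association", "explanatory association"])
target_association = on_control_triggered(controls, "control_4")  # simulated click
print("target association:", target_association)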
In another embodiment of step 232 shown in fig. 6, determining an association relationship among the at least one association relationship as the target association relationship may be performed according to the steps shown in fig. 9.
Referring to fig. 9, a detailed flowchart of determining an association among the at least one association as the target association according to an embodiment of the present application is shown. Specifically, steps 2323 to 2324 are included:
In step 2323, a degree of association of the reference media information and the base media information on various associations is determined, the degree of association being used to characterize a strength of association of the reference media information and the base media information on respective associations.
In step 2324, an association relationship is determined as the target association relationship among the at least one association relationship based on the association degrees of the reference media information and the base media information on the various association relationships.
In one embodiment of step 2323 as shown in fig. 9, determining the degree of association of the reference media information and the base media information on various association relationships may be performed in accordance with the steps shown in fig. 10.
Referring to fig. 10, taking text information as an example of the media information, i.e., in the case where the reference media information includes a reference text and the base media information includes a base text, a detailed flowchart for determining the degree of association of the reference text and the base text on various association relationships according to one embodiment of the present application is shown. Specifically, the method includes steps 23231 to 23232:
in step 23231, semantic analysis is performed on the reference text and the basic text through a semantic analysis model, so as to obtain semantic features of the reference text and the basic text.
In step 23232, the degree of association of the reference text and the base text on various association relationships is semantically determined by semantic features of the reference text and the base text.
In the application, the association relationship between the reference text and the base text can be established at the semantic level. Based on this, the semantic analysis model can output text feature vectors for the reference text and the base text, where the text feature vectors represent the semantic features of the two texts. The degree of association of the reference text and the base text on the various association relationships can then be calculated from the mathematical relationship between the text feature vector of the reference text and the text feature vector of the base text.
For example, the similarity association degree of the reference text and the base text on the similar association relationship is calculated. In the application, calculating this similarity association degree can in practice be done by calculating the vector distance between the text feature vector of the reference text and the text feature vector of the base text: a smaller vector distance indicates a relatively high similarity association degree, and a larger vector distance indicates a relatively low one.
Also for example, the above-association degree of the reference text and the base text on the above association relationship is calculated.
Also for example, the below-association degree of the reference text and the base text on the below association relationship is calculated.
Also for example, the explanatory association degree of the reference text and the base text on the explanatory association relationship is calculated.
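For illustration only, the sketch below shows the similarity branch of this calculation: a vector distance between toy feature vectors is mapped to a similarity association degree so that a smaller distance yields a higher degree. The fixed vectors and the distance-to-degree mapping are assumptions, since the embodiment does not prescribe a particular semantic analysis model or distance measure.

import math
from typing import Sequence

def euclidean_distance(u: Sequence[float], v: Sequence[float]) -> float:
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def similarity_degree(u: Sequence[float], v: Sequence[float]) -> float:
    """Map vector distance to an association degree in (0, 1]:
    smaller distance -> higher similarity association degree."""
    return 1.0 / (1.0 + euclidean_distance(u, v))

# Toy feature vectors standing in for the semantic analysis model's output.
reference_vec = [0.9, 0.1, 0.3]
base_vec = [0.8, 0.2, 0.3]
print(f"similarity association degree: {similarity_degree(reference_vec, base_vec):.3f}")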
In one embodiment of step 2324 shown in fig. 9, determining, as the target association, one of the at least one association based on the association degrees of the reference media information and the base media information on various association relations may be performed according to the steps shown in fig. 11.
Referring to fig. 11, a detailed flowchart of determining an association relationship among the at least one candidate association relationship as the target association relationship according to an embodiment of the present application is shown. Specifically, the method comprises the steps 23241 to 23242:
In step 23241, determining the association relationship with the association degree exceeding a predetermined threshold as a to-be-selected association relationship, and displaying the to-be-selected association relationship through at least one second control, wherein each second control corresponds to one to-be-selected association relationship.
In step 23242, when a trigger event for a target second control in the at least one second control is detected, determining a to-be-selected association relationship corresponding to the target second control as the target association relationship, where the target second control is any one control in the at least one second control.
In the application, on the one hand, the fact that the association degree of a to-be-selected association relationship exceeds the predetermined threshold indicates, to a certain extent, that the reference media information and the base media information are likely to actually have that association relationship; displaying the to-be-selected association relationships through the second controls for the user to choose from improves the friendliness of interaction with the interface during media information editing and thus improves media information editing efficiency. On the other hand, determining the target association relationship from the screened association relationships through an event that triggers a second control can further improve the accuracy of subsequent media information editing.
The predetermined threshold value may be set according to the actual situation.
In another embodiment of step 2324 shown in fig. 9, when one of the at least one association relationship is determined as the target association relationship based on the association degrees of the reference media information and the base media information on the various association relationships, the association relationship with the highest association degree may be determined as the target association relationship.
In the application, the association relationship with the highest association degree is determined as the target association relationship, namely, in the subsequent media information editing process, the media information can be edited according to the media information editing mode corresponding to the association relationship with the highest association degree, and it can be understood that the embodiment can also improve the accuracy of editing the subsequent media information.
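For illustration only, both selection strategies (filtering by the predetermined threshold and taking the highest association degree) reduce to simple operations over the per-relationship association degrees, as in the following sketch with assumed scores.

ASSOCIATION_DEGREES = {        # assumed scores produced by the semantic analysis
    "similar": 0.31,
    "above": 0.12,
    "below": 0.58,
    "explanatory": 0.74,
}
THRESHOLD = 0.5                # the predetermined threshold, set per the actual situation

# Step 23241: keep only relationships whose degree exceeds the threshold
# (these become the to-be-selected relationships shown via second controls).
candidates = {k: v for k, v in ASSOCIATION_DEGREES.items() if v > THRESHOLD}
print("to-be-selected associations:", candidates)

# Alternative embodiment: take the relationship with the highest degree directly.
target = max(ASSOCIATION_DEGREES, key=ASSOCIATION_DEGREES.get)
print("target association (highest degree):", target)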
With continued reference to fig. 2, in step 250, a target editing mode for editing the reference media information into the media information editing area is determined according to the target association relationship.
In an embodiment of the present application, taking media information as text information as an example, that is, in a case where the reference media information includes reference text and the base media information includes base text, the target association relationship may include any one of a similar association relationship, an above association relationship, a below association relationship, and an explanatory association relationship.
Specifically, in this embodiment, when the target association relationship is a similar association relationship, a target editing manner of replacing the basic text with the reference text may be determined.
In this embodiment, when the target association relationship is the above association relationship, a target editing mode of merging the reference text before the base text may be determined.
In this embodiment, when the target association relationship is the below association relationship, a target editing mode of merging the reference text after the base text may be determined.
In this embodiment, when the target association relationship is the explanatory association relationship, a target editing mode of inserting the reference text at an adapted position in the base text may be determined.
In the application, the target editing mode is selected, through the target association relationship, from a plurality of editing modes for editing the reference media information into the media information editing area, so that an appropriate media information editing mode can be determined and the efficiency of media information editing can be improved.
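For illustration only, the mapping from target association relationship to target editing mode (step 250) and its application (step 270) can be sketched as follows for text; the "adapted position" heuristic for the explanatory case is a placeholder, as the embodiment only requires inserting the reference text at an adapted position in the base text.

def apply_edit(base_text: str, reference_text: str, association: str) -> str:
    """Return the new base text produced by the editing mode for the association."""
    if association == "similar":      # target mode: replace the base text
        return reference_text
    if association == "above":        # target mode: merge the reference text before the base text
        return reference_text + base_text
    if association == "below":        # target mode: merge the reference text after the base text
        return base_text + reference_text
    if association == "explanatory":  # target mode: insert at an adapted position
        # Placeholder position: before the final sentence terminator; a real
        # implementation would locate the term the reference text explains.
        return base_text.rstrip(".") + ", " + reference_text.rstrip(".") + "."
    raise ValueError(f"unknown association: {association}")

print(apply_edit("Base sentence.", "Reference sentence.", "explanatory"))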
With continued reference to fig. 2, in step 270, the reference media information is edited into the media information editing area in the target editing manner to obtain new base media information.
In the present application, after determining a target editing mode for editing the reference media information into the media information editing area, the reference media information may be edited into the media information editing area in accordance with the target editing mode.
In order to better understand how the media information editing is performed by those skilled in the art, the following will take the media information as text information as an example, and will be described in connection with fig. 12 in the context of a mobile phone interface.
Referring to fig. 12, another interface illustration of determining an association among the at least one candidate association as the target association according to an embodiment of the present application is shown.
As shown in fig. 12, in the mobile phone interface 1210, the user selects a piece of news text data 1212 and drags it to the text editing area 1211. At this time, a plurality of controls pop up in the mobile phone interface 1220, where each control may correspond to one to-be-selected association relationship between the reference text and the base text. After the user clicks the "explanatory insert editing" control, the reference text "On the evening of June 30, a man committed a hit-and-run" is inserted into the base text "Xiaoming, I heard that a traffic accident occurred on South Beijing Road yesterday evening" according to the target editing mode of inserting the reference text at the adapted position in the base text, and a new base text "Xiaoming, I heard that a hit-and-run traffic accident involving a man occurred on South Beijing Road yesterday evening." is obtained.
Therefore, the media information editing scheme provided by the embodiment provides multiple media information editing modes for the user, and determines the media information editing mode by triggering the control event, so that the interactive friendliness of the user and the editing interface is improved to a great extent, the efficiency of media information editing is further improved, and meanwhile, the accuracy of media information editing can be improved by selecting one adaptive target editing mode from the media information editing modes corresponding to the multiple controls.
In one embodiment of the present application, the reference media information is directly written to the media information editing area when there is no base media information that has been edited in the media information editing area.
In order to better understand the present application, a method for editing media information will be described below with reference to fig. 13, taking the media information as text information as an example.
Referring to fig. 13, an overall flowchart of a text editing method according to an embodiment of the present application is shown, specifically including steps 1301 to 1306:
step 1301, the reference data is selected.
Step 1302, drag the reference data to the text editing area, and extract the reference text in the reference data.
In step 1303, it is determined whether the text editing area already contains base text; if not, step 1304 is performed, and if yes, step 1305 is performed.
In step 1304, the reference text is directly written into a text editing area.
In step 1305, a control window for selecting a text editing mode is popped up, and a target editing mode is determined by clicking the control.
And step 1306, editing the reference text into the text editing area according to the target editing mode.
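For illustration only, the overall flow of steps 1301 to 1306 can be summarized in one self-contained Python sketch; the fixed editing mode chosen in step 1305 stands in for the user's selection via the control window and is an assumption.

def text_editing_flow(reference_data: str, editing_area_text: str) -> str:
    """Steps 1301-1306: from dragged reference data to updated editing-area text."""
    reference_text = reference_data.strip()          # 1302: extract reference text

    if not editing_area_text:                        # 1303: is there base text?
        return reference_text                        # 1304: write directly

    # 1305: normally a control window would let the user pick the editing mode;
    # here a fixed "below" (append) mode stands in for that choice.
    chosen_mode = "below"

    # 1306: edit the reference text into the editing area per the chosen mode.
    if chosen_mode == "below":
        return editing_area_text + " " + reference_text
    return editing_area_text

print(text_editing_flow("  appended reference text  ", ""))            # no base text
print(text_editing_flow("  appended reference text  ", "Base text."))  # base text exists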
In the technical solutions provided in some embodiments of the present application, reference media information is extracted from reference data, and a target association relationship between the reference media information and the base media information already edited in a media information editing area is determined, so that a target editing mode for editing the reference media information into the media information editing area can be determined according to the target association relationship, and the reference media information can then be edited into the media information editing area according to that target editing mode. Extracting the reference media information from the reference data provides a variety of material for media information editing and enriches the editable content; determining, through the target association relationship between the reference media information and the base media information, a target editing mode suited to editing the reference media information into the media information editing area helps ensure the rationality of the edit and improves the efficiency of media information editing.
The following describes an embodiment of the apparatus of the present application, which can be used to perform the media information editing method in the above embodiment of the present application. For details not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the media information editing method described above.
Fig. 14 shows a block diagram of a media information editing apparatus according to an embodiment of the present application.
Referring to fig. 14, a media information editing apparatus 1400 according to an embodiment of the present application includes: a first acquisition unit 1401, a second acquisition unit 1402, a first determination unit 1403, and an editing unit 1404.
Wherein, the first obtaining unit 1401 is used for obtaining reference data for editing media information and extracting reference media information from the reference data; a second acquisition unit 1402 configured to acquire base media information that has been edited in a media information editing area, and determine a target association relationship between the reference media information and the base media information; a first determining unit 1403 configured to determine a target editing manner of editing the reference media information into the media information editing area according to the target association relationship; an editing unit 1404, configured to edit the reference media information into the media information editing area according to the target editing mode, so as to obtain new basic media information.
In some embodiments of the present application, based on the foregoing scheme, the first acquiring unit 1401 is configured to: selecting reference data for editing media information in an interface; and acquiring the reference data when a drag event of dragging the reference data to a media information editing area in the interface is detected.
In some embodiments of the present application, based on the foregoing scheme, the first acquiring unit 1401 is further configured to: identifying a data type of the reference data; extracting the reference media information in the reference data by a media information extraction model corresponding to the data type.
In some embodiments of the present application, based on the foregoing, the data type of the reference data includes any one or a combination of any plurality of text data, image data, audio data, video data, and web page data.
In some embodiments of the present application, based on the foregoing scheme, the second obtaining unit 1402 includes: a second determining unit, configured to obtain at least one association relationship between the reference media information and the base media information; and determining an association relationship as the target association relationship in the at least one association relationship.
In some embodiments of the application, based on the foregoing, the second determining unit is configured to: displaying the association relationship between the reference media information and the basic media information through at least one first control, wherein each first control corresponds to one association relationship between the reference media information and the basic media information; when a trigger event aiming at a target first control in the at least one first control is detected, determining an association relation corresponding to the target first control as the target association relation, wherein the target first control is any control in the at least one first control.
In some embodiments of the application, based on the foregoing, the second determining unit is configured to: determining the association degree of the reference media information and the basic media information on various association relations, wherein the association degree is used for representing the association strength of the reference media information and the basic media information on corresponding association relations; and determining an association relationship as the target association relationship in the at least one association relationship based on the association degrees of the reference media information and the basic media information on various association relationships.
In some embodiments of the application, based on the foregoing, the reference media information comprises reference text, the base media information comprises base text, and the second determining unit is configured to: carrying out semantic analysis on the reference text and the basic text through a semantic analysis model to obtain semantic features of the reference text and the basic text; and determining the association degree of the reference text and the basic text on various association relations semantically through semantic features of the reference text and the basic text.
In some embodiments of the application, based on the foregoing, the second determining unit is configured to: determining the association relation of which the association degree exceeds a preset threshold value as an association relation to be selected, and displaying the association relation to be selected through at least one second control, wherein each second control corresponds to one association relation to be selected; when a trigger event aiming at a target second control in the at least one second control is detected, determining a to-be-selected association relationship corresponding to the target second control as the target association relationship, wherein the target second control is any control in the at least one second control.
In some embodiments of the application, based on the foregoing, the second determining unit is configured to: and determining the association relation with the highest association degree as the target association relation.
In some embodiments of the present application, based on the foregoing aspect, the reference media information includes a reference text, the base media information includes a base text, the target association relationship includes any one of a similar association relationship, an above association relationship, a below association relationship, and an explanatory association relationship, and the first determining unit 1403 is configured to: when the target association relationship is the similar association relationship, determine a target editing mode of replacing the base text with the reference text; when the target association relationship is the above association relationship, determine a target editing mode of merging the reference text before the base text; when the target association relationship is the below association relationship, determine a target editing mode of merging the reference text after the base text; and when the target association relationship is the explanatory association relationship, determine a target editing mode of inserting the reference text at an adapted position in the base text.
In some embodiments of the application, based on the foregoing, the apparatus further comprises a writing unit configured to write the reference media information into the media information editing area when no edited basic media information exists in the media information editing area.
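The fallback path can be sketched as a simple conditional; the dictionary standing in for the editing area is an illustrative assumption, not an interface defined by this application:

    # If nothing has been edited in the editing area yet, the reference media
    # information simply becomes the initial basic media information.
    def write_reference(editing_area, reference_media):
        if not editing_area.get("basic_media"):
            editing_area["basic_media"] = reference_media
        return editing_area

    print(write_reference({}, "Dragged-in reference paragraph"))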
Fig. 15 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
It should be noted that the computer system 1500 of the electronic device shown in Fig. 15 is only an example, and should not impose any limitation on the functions and application scope of the embodiments of the present application.
As shown in Fig. 15, the computer system 1500 includes a Central Processing Unit (CPU) 1501, which can perform various appropriate actions and processes, such as performing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 1502 or a program loaded from a storage portion 1508 into a Random Access Memory (RAM) 1503. In the RAM 1503, various programs and data required for the operation of the system are also stored. The CPU 1501, the ROM 1502, and the RAM 1503 are connected to each other through a bus 1504. An Input/Output (I/O) interface 1505 is also connected to the bus 1504.
The following components are connected to the I/O interface 1505: an input section 1506 including a keyboard, a mouse, and the like; an output section 1507 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 1508 including a hard disk and the like; and a communication section 1509 including a network interface card such as a Local Area Network (LAN) card, a modem, or the like. The communication section 1509 performs communication processing via a network such as the Internet. A drive 1510 is also connected to the I/O interface 1505 as needed. A removable medium 1511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 1510 as needed so that a computer program read therefrom is installed into the storage section 1508 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the methods shown in the flowcharts. In such an embodiment, the computer program can be downloaded and installed from a network via the communication section 1509, and/or installed from the removable medium 1511. When the computer program is executed by the Central Processing Unit (CPU) 1501, it performs the various functions defined in the system of the present application.
It should be noted that the computer-readable medium shown in the embodiments of the present application may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A computer-readable signal medium may also be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in the flowcharts or block diagrams may represent a module, a segment, or a portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustrations, and combinations of blocks in the block diagrams or flowchart illustrations, can be implemented by special-purpose hardware-based systems which perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by software or by hardware, and the described units may also be provided in a processor. In some cases, the names of these units do not constitute a limitation on the units themselves.
As another aspect, the present application also provides a computer program product or a computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs the media information editing method described in the above embodiments.
As another aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiments, or may exist separately without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the media information editing method described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of the embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a portable hard disk, or the like) or on a network, and which includes several instructions to cause a computing device (which may be a personal computer, a server, a touch terminal, a network device, or the like) to perform the method according to the embodiments of the present application.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (14)

1. A method of editing media information, the method comprising:
Acquiring reference data for editing media information, and extracting reference media information from the reference data;
Acquiring edited basic media information in a media information editing area, and determining a target association relationship between the reference media information and the basic media information;
Determining a target editing mode for editing the reference media information into the media information editing area according to the target association relation;
Editing the reference media information into the media information editing area according to the target editing mode so as to obtain new basic media information;
wherein the acquiring reference data for editing media information comprises:
Selecting reference data for editing media information in an interface;
and acquiring the reference data when a drag event of dragging the reference data to a media information editing area in the interface is detected.
2. The method of claim 1, wherein the extracting reference media information from the reference data comprises:
Identifying a data type of the reference data;
extracting the reference media information in the reference data by a media information extraction model corresponding to the data type.
3. The method of claim 2, wherein the data type of the reference data comprises any one of, or a combination of any two or more of, text data, image data, audio data, video data, and web page data.
4. The method of claim 1, wherein the determining the target association between the reference media information and the base media information comprises:
Acquiring at least one association relationship between the reference media information and the basic media information;
and determining one association relationship among the at least one association relationship as the target association relationship.
5. The method of claim 4, wherein the determining one association relationship among the at least one association relationship as the target association relationship comprises:
Displaying the association relationships between the reference media information and the basic media information through at least one first control, wherein each first control corresponds to one association relationship between the reference media information and the basic media information;
when a trigger event for a target first control among the at least one first control is detected, determining the association relationship corresponding to the target first control as the target association relationship, wherein the target first control is any control among the at least one first control.
6. The method of claim 4, wherein the determining one association relationship among the at least one association relationship as the target association relationship comprises:
Determining the association degree of the reference media information and the basic media information over various association relationships, wherein the association degree represents the association strength of the reference media information and the basic media information over the corresponding association relationship;
and determining, based on the association degrees of the reference media information and the basic media information over the various association relationships, one association relationship among the at least one association relationship as the target association relationship.
7. The method of claim 6, wherein the reference media information comprises reference text, the basic media information comprises basic text, and the determining the association degree of the reference media information and the basic media information over various association relationships comprises:
Performing semantic analysis on the reference text and the basic text through a semantic analysis model to obtain semantic features of the reference text and the basic text;
and determining, from the semantic features of the reference text and the basic text, the degree to which the reference text and the basic text are semantically associated over various association relationships.
8. The method of claim 6, wherein the determining, based on the association degrees of the reference media information and the basic media information over the various association relationships, one association relationship among the at least one association relationship as the target association relationship comprises:
Determining each association relationship whose association degree exceeds a preset threshold as a candidate association relationship, and displaying the candidate association relationships through at least one second control, wherein each second control corresponds to one candidate association relationship;
when a trigger event for a target second control among the at least one second control is detected, determining the candidate association relationship corresponding to the target second control as the target association relationship, wherein the target second control is any control among the at least one second control.
9. The method of claim 6, wherein the determining, based on the association degrees of the reference media information and the basic media information over the various association relationships, one association relationship among the at least one association relationship as the target association relationship comprises:
Determining the association relationship with the highest association degree as the target association relationship.
10. The method of claim 1, wherein the reference media information includes reference text, the basic media information includes basic text, the target association relationship includes any one of a similar association relationship, an above association relationship, a below association relationship, and an explanatory association relationship, and the determining a target editing mode of editing the reference media information into the media information editing area according to the target association relationship comprises:
When the target association relationship is the similar association relationship, determining a target editing mode of replacing the basic text with the reference text;
when the target association relationship is the above association relationship, determining a target editing mode of merging the reference text before the basic text;
when the target association relationship is the below association relationship, determining a target editing mode of merging the reference text after the basic text;
and when the target association relationship is the explanatory association relationship, determining a target editing mode of inserting the reference text at an adaptation position in the basic text.
11. The method according to claim 1, wherein the method further comprises:
Writing the reference media information into the media information editing area when no edited basic media information exists in the media information editing area.
12. A media information editing apparatus, the apparatus comprising:
a first acquisition unit configured to acquire reference data for media information editing and extract reference media information from the reference data;
wherein the first acquisition unit is configured to: select reference data for editing media information in an interface; and acquire the reference data when a drag event of dragging the reference data to a media information editing area in the interface is detected;
a second acquisition unit configured to acquire basic media information that has been edited in the media information editing area, and determine a target association relationship between the reference media information and the basic media information;
a first determining unit configured to determine a target editing mode of editing the reference media information into the media information editing area according to the target association relationship; and
an editing unit configured to edit the reference media information into the media information editing area according to the target editing mode so as to obtain new basic media information.
13. A computer-readable storage medium having stored therein at least one piece of program code, the program code being loaded and executed by a processor to implement the operations performed by the media information editing method according to any one of claims 1 to 11.
14. An electronic device comprising one or more processors and one or more memories, the one or more memories having stored therein at least one piece of program code that is loaded and executed by the one or more processors to perform the operations performed by the media information editing method according to any one of claims 1 to 11.
CN202110875251.4A 2021-07-30 2021-07-30 Media information editing method, device, computer readable medium and electronic equipment Active CN113672134B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110875251.4A CN113672134B (en) 2021-07-30 2021-07-30 Media information editing method, device, computer readable medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110875251.4A CN113672134B (en) 2021-07-30 2021-07-30 Media information editing method, device, computer readable medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113672134A CN113672134A (en) 2021-11-19
CN113672134B true CN113672134B (en) 2024-06-04

Family

ID=78540925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110875251.4A Active CN113672134B (en) 2021-07-30 2021-07-30 Media information editing method, device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113672134B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109119079A (en) * 2018-07-25 2019-01-01 天津字节跳动科技有限公司 voice input processing method and device
CN109523609A (en) * 2018-10-16 2019-03-26 华为技术有限公司 A kind of method and terminal of Edition Contains
CN109740128A (en) * 2018-04-18 2019-05-10 北京字节跳动网络技术有限公司 A kind of text editing householder method, device and equipment
CN112148869A (en) * 2020-09-30 2020-12-29 北京知道未来信息技术有限公司 Text reference information generation method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DK201670539A1 (en) * 2016-03-14 2017-10-02 Apple Inc Dictation that allows editing

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109740128A (en) * 2018-04-18 2019-05-10 北京字节跳动网络技术有限公司 A kind of text editing householder method, device and equipment
CN109119079A (en) * 2018-07-25 2019-01-01 天津字节跳动科技有限公司 voice input processing method and device
CN109523609A (en) * 2018-10-16 2019-03-26 华为技术有限公司 A kind of method and terminal of Edition Contains
CN112148869A (en) * 2020-09-30 2020-12-29 北京知道未来信息技术有限公司 Text reference information generation method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Albl-Mikasa, M. et al. Professional translations of non-native English: 'before and after' texts from the European Parliament's Editing Unit. Translator, 2018, Vol. 23, No. 4, full text. *
Practical Reflections on Integrating Big Data + AI Intelligence with Text Editing; 黎彩秀; 新闻潮; 2020-02-15 (02); full text *

Also Published As

Publication number Publication date
CN113672134A (en) 2021-11-19

Similar Documents

Publication Publication Date Title
CN108595583B (en) Dynamic graph page data crawling method, device, terminal and storage medium
US10762678B2 (en) Representing an immersive content feed using extended reality based on relevancy
CN108804299A (en) Application exception processing method and processing device
US20230237255A1 (en) Form generation method, apparatus, and device, and medium
US20130290944A1 (en) Method and apparatus for recommending product features in a software application in real time
CN107832440A (en) A kind of data digging method, device, server and computer-readable recording medium
CN108197105B (en) Natural language processing method, device, storage medium and electronic equipment
US10331800B2 (en) Search results modulator
CN110096605B (en) Image processing method and device, electronic device and storage medium
CN114969443A (en) Quantum computation visual debugging method and system, computer equipment and storage medium
CN112685534B (en) Method and apparatus for generating context information of authored content during authoring process
CN113672134B (en) Media information editing method, device, computer readable medium and electronic equipment
CN106959945B (en) Method and device for generating short titles for news based on artificial intelligence
CN112947984B (en) Application program development method and device
CN114757299A (en) Text similarity judgment method and device and storage medium
CN111310484B (en) Automatic training method and platform of machine translation model, electronic device and storage medium
CN112288835A (en) Image text extraction method and device and electronic equipment
CN113515280A (en) Page code generation method and device
CN114063848B (en) Information editing method, device, computer readable medium and electronic equipment
CN112631525A (en) Storage and display method, device, equipment and medium
CN111161737A (en) Data processing method and device, electronic equipment and storage medium
CN112579036B (en) Method, system, equipment and storage medium for realizing report designer of voice input
CN114721915B (en) Point burying method and device
CN113706209B (en) Operation data processing method and related device
CN115857906B (en) Method, system, electronic device and medium for generating low-code chart

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant