CN113485609B - Electronic book sharing method, device, storage medium and apparatus


Info

Publication number: CN113485609B
Application number: CN202110815925.1A
Authority: CN (China)
Other languages: Chinese (zh)
Other versions: CN113485609A
Prior art keywords: information, audio, background image, background, book sharing
Inventor: 陈松
Assignee: China Mobile Communications Group Co Ltd; MIGU Digital Media Co Ltd; MIGU Culture Technology Co Ltd
Application filed by China Mobile Communications Group Co Ltd, MIGU Digital Media Co Ltd and MIGU Culture Technology Co Ltd
Legal status: Active (application granted)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/16 Sound input; Sound output
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an electronic book sharing method, device, storage medium and apparatus. The method includes the following steps: when a book sharing instruction is received, generating background audio according to current position information and current air temperature information; generating a background image according to current time information and current weather information; determining book sharing information according to the reading audio, the background audio and the background image; and sending the book sharing information to a receiving terminal. Compared with the existing mode of sharing electronic books with other users through fixed images and text, the invention generates the background audio and the background image when the book sharing instruction is received and determines the book sharing information from the reading audio, the background audio and the background image, so that image-and-text sharing information is replaced by sharing information carrying reading audio, background audio and a background image, which enriches the displayed content and improves the user experience.

Description

Electronic book sharing method, device, storage medium and apparatus
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method, an apparatus, a storage medium, and a device for sharing electronic books.
Background
Currently, when a user reads an electronic book through a mobile phone APP, the electronic book can be shared with other users in an image-and-text sharing mode. However, this sharing mode can only display fixed images and text; the displayed content is monotonous, and the user experience is poor.
The foregoing is provided merely for the purpose of facilitating understanding of the technical solutions of the present invention and is not intended to represent an admission that the foregoing is prior art.
Disclosure of Invention
The invention mainly aims to provide an electronic book sharing method, device, storage medium and apparatus, and aims to solve the technical problem that, in the prior art, electronic book sharing can only display fixed images and text, so that the displayed content is monotonous and the user experience is poor.
In order to achieve the above object, the present invention provides an electronic book sharing method, which includes the following steps:
when a book sharing instruction is received, generating background audio according to the current position information and the current air temperature information;
generating a background image according to the current time information and the current weather information;
determining book sharing information according to the reading audio, the background audio and the background image;
and sending the book sharing information to a receiving terminal.
Optionally, when receiving the book sharing instruction, the step of generating the background audio according to the current position information and the current air temperature information includes:
when a book sharing instruction is received, determining a target pitch frequency through a preset pitch model according to current air temperature information;
acquiring a pitch frequency range of each musical instrument, and determining a set of musical instruments according to the target pitch frequency and the pitch frequency range;
searching characteristic instrument information corresponding to the current position information, and selecting a target instrument from the instrument set according to the characteristic instrument information;
background audio is generated from the target musical instrument and the target pitch frequency.
Optionally, the step of generating the background image according to the current time information and the current weather information includes:
acquiring a user shooting image, and searching an initial background image corresponding to the user shooting image;
generating the weather scene image according to the current weather information, and carrying out image aggregation on the weather scene image and the initial background image to obtain candidate background images;
acquiring sunrise time information corresponding to the current position information, and generating illumination shadow information according to the current time information and the sunrise time information;
And performing shadow adjustment on the candidate background images according to the illumination shadow information to obtain background images.
Optionally, the step of obtaining sunrise time information corresponding to the current position information and generating illumination shadow information according to the current time information and the sunrise time information includes:
acquiring sunrise time information corresponding to a current position, and determining an illumination score through a preset illumination model according to the current time information and the sunrise time information;
and determining a shadow score according to the illumination score through a preset shadow model, and taking the shadow score as illumination shadow information.
Optionally, the step of performing shadow adjustment on the candidate background image according to the illumination shadow information to obtain a background image includes:
generating a first background image according to the candidate background image and a first image frame, and setting a shadow mask on the first background image;
generating a second background image according to the candidate background image, a second image frame and the illumination shadow information, and setting the first background image on an upper image layer of the second background image;
and determining a target shade angle according to the current time information and the sunrise time information, and adjusting the position of the shadow shade according to the target shade angle to obtain a background image.
Optionally, before the step of determining the book sharing information according to the speakable audio, the background audio and the background image, the method further includes:
acquiring the inclination angle of current equipment, and determining the book placement angle according to the inclination angle of the current equipment;
determining books to be shared according to the book sharing instruction, and acquiring book pictures of the books to be shared;
rotating the book pictures according to the book placing angle to obtain book pictures to be displayed;
the determining book sharing information according to the speakable audio, the background audio and the background image includes:
and determining book sharing information according to the reading audio, the background image and the book pictures to be displayed.
Optionally, before the step of determining the book sharing information according to the speakable audio, the background audio and the background image, the method further includes:
acquiring user information and generating virtual character animation according to the user information;
the determining book sharing information according to the speakable audio, the background audio and the background image includes:
and determining book sharing information according to the reading audio, the background image and the virtual character animation.
In addition, in order to achieve the above objective, the present invention also provides an electronic book sharing device, which includes a memory, a processor, and an electronic book sharing program stored in the memory and capable of running on the processor, where the electronic book sharing program is configured to implement the electronic book sharing method as described above.
In addition, in order to achieve the above object, the present invention further provides a storage medium, on which an electronic book sharing program is stored, which when executed by a processor, implements the electronic book sharing method as described above.
In addition, in order to achieve the above object, the present invention also provides an electronic book sharing apparatus, which includes: a background audio generation module, a background image generation module, a sharing information generation module and a sharing information sending module;
the background audio generation module is used for generating background audio according to the current position information and the current air temperature information when receiving a book sharing instruction;
the background image generation module is used for generating a background image according to the current time information and the current weather information;
The sharing information generation module is used for determining book sharing information according to the reading audio, the background audio and the background image;
and the sharing information sending module is used for sending the book sharing information to a receiving terminal.
When a book sharing instruction is received, background audio is generated according to current position information and current air temperature information, a background image is generated according to current time information and current weather information, book sharing information is determined according to the reading audio, the background audio and the background image, and the book sharing information is sent to a receiving terminal. Compared with the existing mode of sharing electronic books with other users through fixed images and text, the invention generates the background audio and the background image when the book sharing instruction is received and determines the book sharing information from the reading audio, the background audio and the background image, so that image-and-text sharing information is replaced by sharing information carrying reading audio, background audio and a background image, which enriches the displayed content and improves the user experience.
Drawings
Fig. 1 is a schematic structural diagram of an electronic book sharing device in a hardware running environment according to an embodiment of the present invention;
FIG. 2 is a flowchart of a first embodiment of an electronic book sharing method according to the present invention;
FIG. 3 is a flowchart of a second embodiment of an electronic book sharing method according to the present invention;
FIG. 4 is a schematic diagram illustrating a mask position adjustment when the sun illumination angle is from 0° to 45° according to an embodiment of the electronic book sharing method of the present invention;
FIG. 5 is a schematic diagram illustrating a mask position adjustment when the sun illumination angle is from 45° to 90° according to an embodiment of the electronic book sharing method of the present invention;
FIG. 6 is a schematic diagram illustrating a mask position adjustment when the sun illumination angle is from 90° to 135° according to an embodiment of the electronic book sharing method of the present invention;
FIG. 7 is a schematic diagram illustrating a mask position adjustment when the sun illumination angle is from 135° to 180° according to an embodiment of the electronic book sharing method of the present invention;
FIG. 8 is a flowchart of a third embodiment of an electronic book sharing method according to the present invention;
fig. 9 is a block diagram of a first embodiment of an electronic book sharing apparatus according to the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic diagram of an electronic book sharing device in a hardware running environment according to an embodiment of the present invention.
As shown in fig. 1, the electronic book sharing device may include: a processor 1001, such as a central processing unit (Central Processing Unit, CPU), a communication bus 1002, a user interface 1003, a network interface 1004 and a memory 1005. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display (Display), and may optionally also include a standard wired interface and a wireless interface; in the present invention, the wired interface of the user interface 1003 may be a USB interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The memory 1005 may be a high-speed random access memory (Random Access Memory, RAM) or a non-volatile memory (Non-Volatile Memory, NVM), such as a disk memory. The memory 1005 may optionally also be a storage device separate from the processor 1001.
It will be appreciated by those skilled in the art that the structure shown in fig. 1 does not constitute a limitation of the electronic book sharing apparatus, and may include more or fewer components than shown, or may combine certain components, or may have a different arrangement of components.
As shown in fig. 1, the memory 1005, which is considered to be a computer storage medium, may include an operating system, a network communication module, a user interface module, and an electronic book sharing program.
In the electronic book sharing device shown in fig. 1, the network interface 1004 is mainly used for connecting to a background server, and performing data communication with the background server; the user interface 1003 is mainly used for connecting user equipment; the electronic book sharing device invokes the electronic book sharing program stored in the memory 1005 through the processor 1001, and executes the electronic book sharing method provided by the embodiment of the invention.
Based on the above hardware structure, an embodiment of the electronic book sharing method is provided.
Referring to fig. 2, fig. 2 is a flowchart illustrating a first embodiment of an electronic book sharing method according to the present invention.
Step S10: and when receiving a book sharing instruction, generating background audio according to the current position information and the current air temperature information.
It should be understood that the main body of the method of this embodiment may be an electronic book sharing device with functions of data processing, network communication and program running, such as a mobile phone and a computer, or other electronic devices capable of implementing the same or similar functions, which is not limited in this embodiment. In this embodiment and other embodiments, a mobile phone will be described as an example.
It should be noted that, the user may initiate the book sharing instruction through the user interaction interface of the electronic book sharing device. The book sharing instruction may include sharing text content, sharing chapter information, user information, and the like.
The current location information may be location information of a location where the electronic book sharing device is located at a current time. The current air temperature information may be air temperature information of a location where the electronic book sharing device is located at the current time.
The electronic book sharing device can collect current position information through a position sensor, and collect current air temperature information through a temperature sensor. Wherein, the position sensor and the temperature sensor may be pre-installed on the electronic book sharing device.
It may be understood that generating the background audio according to the current position information and the current air temperature information may be to use the current position information and the current air temperature information as audio reference information, and search a preset background audio table for the background audio corresponding to the audio reference information. The preset background audio table contains a corresponding relation between audio reference information and background audio, and the corresponding relation between the audio reference information and the background audio can be preset.
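As a minimal sketch of this table-lookup approach (the key structure, the temperature bucketing and the resource names below are illustrative assumptions rather than details taken from the patent):

```swift
/// Hypothetical audio reference key: a coarse region plus a temperature bucket.
struct AudioReference: Hashable {
    let region: String          // derived from the current position information
    let temperatureBucket: Int  // current air temperature rounded to the nearest 10 degrees
}

/// A preset background audio table mapping audio reference information to an audio resource.
let presetBackgroundAudioTable: [AudioReference: String] = [
    AudioReference(region: "CN-Beijing", temperatureBucket: 20): "guzheng_spring.mp3",
    AudioReference(region: "IT-Rome",    temperatureBucket: 30): "violin_summer.mp3"
]

/// Looks up the background audio corresponding to the current position and air temperature.
func backgroundAudio(region: String, airTemperature: Double) -> String? {
    let bucket = Int((airTemperature / 10).rounded()) * 10
    return presetBackgroundAudioTable[AudioReference(region: region, temperatureBucket: bucket)]
}
```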
Further, in order to generate background audio more in line with the location of the user and the current temperature, the personalized sharing requirement of the user is satisfied, the step S10 includes:
When a book sharing instruction is received, determining a target pitch frequency through a preset pitch model according to the current air temperature information, acquiring the pitch frequency range of each instrument, determining an instrument set according to the target pitch frequency and the pitch frequency ranges, searching for characteristic instrument information corresponding to the current position information, selecting a target instrument from the instrument set according to the characteristic instrument information, and generating background audio according to the target instrument and the target pitch frequency.
Step S20: and generating a background image according to the current time information and the current weather information.
It should be noted that the current weather information may be the weather information of the location where the electronic book sharing device is located at the current moment, for example, sunny, rainy, snowy or foggy weather. The current time information may be the time information of the location where the electronic book sharing device is located at the current moment.
It is understood that the electronic book sharing device may crawl weather information and time information of a location from the internet.
It should be understood that generating the background image according to the current time information and the current weather information may be to use the current time information and the current weather information as background reference information, and search a preset background image table for a background image corresponding to the background reference information. The preset background image table contains the corresponding relation between the background reference information and the background image, and the corresponding relation between the background reference information and the background image can be preset by a user.
Further, in order to enrich the content of the background image presentation, the step S20 includes:
acquiring a user shooting image, searching an initial background image corresponding to the user shooting image, generating a weather scene image according to current weather information, carrying out image aggregation on the weather scene image and the initial background image to obtain a candidate background image, acquiring sunrise time information corresponding to current position information, generating illumination shadow information according to the current time information and the sunrise time information, and carrying out shadow adjustment on the candidate background image according to the illumination shadow information to obtain the background image.
Step S30: and determining book sharing information according to the reading audio, the background audio and the background image.
It should be noted that, the speakable audio may be generated by the electronic book sharing device or may be manually input by the user, which is not limited in this example.
In a specific implementation, for example, the electronic book sharing device starts a ReplayKit recording function, the shared text content selected by the user is transmitted to the TTS reading engine, the ReplayKit module captures the reading audio stream data, and the reading audio stream data is converted into reading audio through AVFoundation.
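The following is a minimal sketch of producing the reading audio. It uses AVSpeechSynthesizer's buffer-writing API as a stand-in for the ReplayKit capture described above, which is an assumption for illustration rather than the exact flow of this implementation:

```swift
import AVFoundation

/// Renders the shared text content to a reading-audio file via TTS.
/// In a real app the synthesizer must be retained until synthesis completes.
func renderReadingAudio(from sharedText: String, to outputURL: URL) {
    let synthesizer = AVSpeechSynthesizer()
    let utterance = AVSpeechUtterance(string: sharedText)
    var audioFile: AVAudioFile?

    synthesizer.write(utterance) { buffer in
        // Buffers with zero frames mark the end of synthesis.
        guard let pcmBuffer = buffer as? AVAudioPCMBuffer, pcmBuffer.frameLength > 0 else { return }
        do {
            if audioFile == nil {
                // Create the output file lazily, using the synthesizer's own audio format.
                audioFile = try AVAudioFile(forWriting: outputURL, settings: pcmBuffer.format.settings)
            }
            try audioFile?.write(from: pcmBuffer)
        } catch {
            print("Failed to write reading audio: \(error)")
        }
    }
}
```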
It should be understood that determining the book sharing information according to the speakable audio, the background audio, and the background image may be performing audio synthesis on the speakable audio and the background audio to obtain a target audio, and determining the book sharing information according to the target audio and the background image.
It should be noted that, the book sharing information may be directly the target audio and the background image, or may be an information link. Wherein the information links are used to represent the storage locations of the target audio and background images in the server.
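A small sketch of the two forms the book sharing information can take, either the media itself or an information link recording where the target audio and background image are stored on the server (the type and field names are illustrative assumptions):

```swift
import Foundation

/// Book sharing information: either the media payload itself, or an information link.
enum BookSharingInfo {
    /// The synthesized target audio and the background image carried directly.
    case payload(targetAudio: Data, backgroundImage: Data)
    /// An information link representing the storage location on the server.
    case link(URL)
}
```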
Step S40: and sending the book sharing information to a receiving terminal.
The receiving terminal may be a terminal that receives the book sharing information. For example, if the user a wants to share the electronic book to the user B through the electronic book sharing device, the terminal corresponding to the user B is a receiving terminal.
It should be understood that, the sending of the book sharing information to the receiving terminal may be generating a sharing link according to the book sharing information, and sending the sharing link to the receiving terminal.
It can be understood that after clicking the sharing link, the user of the receiving terminal will display the background image and play the target audio.
In the first embodiment, it is disclosed that when a book sharing instruction is received, background audio is generated according to current position information and current air temperature information, background images are generated according to current time information and current weather information, book sharing information is determined according to the reading audio, the background audio and the background images, and the book sharing information is sent to a receiving terminal; compared with the existing mode of sharing the electronic books to other users through image-text sharing; according to the method and the device for sharing the books, when the books share instruction is received, the background audio and the background image are generated, and the books share information is determined according to the reading audio, the background audio and the background image, so that the books share information of the pictures and texts can be replaced by the books share information with the reading audio, the background audio and the background image, the display content is enriched, and the user experience is improved.
Referring to fig. 3, fig. 3 is a flowchart illustrating a second embodiment of the electronic book sharing method according to the present invention, and the second embodiment of the electronic book sharing method according to the present invention is proposed based on the first embodiment shown in fig. 2.
In a second embodiment, the step S10 includes:
step S101: and when a book sharing instruction is received, determining a target tone frequency through a preset tone model according to the current air temperature information.
It should be noted that the current air temperature information may be a current air temperature of a location where the user is located.
The preset pitch model may be preset on the electronic book sharing device. For example, a pitch formula may be set in advance as the preset pitch model. In the pitch formula, p is the target pitch frequency, N_max is the highest temperature or the lowest temperature, and C is the current air temperature.
The highest temperature may be a highest temperature value that a human body can receive, and the lowest temperature may be a lowest temperature value that a human body can receive. Both the maximum temperature and the minimum temperature may be preset.
It should be understood that determining the target pitch frequency by the preset pitch model based on the current air temperature information may be to extract the current air temperature from the current air temperature information and calculate the target pitch frequency by a pitch formula based on the current air temperature.
In a specific implementation, for example, when N_max is 40° (or -40°) and the current air temperature is 20°, the target pitch frequency p is calculated to be 2575 Hz.
Step S102: a pitch frequency range of each instrument is acquired, and a set of instruments is determined based on the target pitch frequency and the pitch frequency range.
It will be appreciated that the pitch frequency ranges of the different instruments are different, and that in order to determine which instrument the target pitch frequency falls within the corresponding pitch frequency range, it is necessary to obtain the pitch frequency range of each instrument.
It will be appreciated that the pitch frequency range of each instrument may be obtained by searching a preset instrument pitch table for the pitch frequency range corresponding to that instrument. The preset instrument pitch table includes the correspondence between instruments and pitch frequency ranges, and this correspondence may be preset or may be automatically crawled from the Internet by the electronic book sharing device.
In a specific implementation, for example, the pitch frequency range corresponding to the erhu is 293-1318 Hz, the range corresponding to the pipa (lute) is 110 Hz-1.2 kHz, the range corresponding to the piano is 80 Hz-8 kHz, the range corresponding to the violin is 173 Hz-3.1 kHz, the range corresponding to the viola is 123 Hz-2.6 kHz, the range corresponding to the cello is 61 Hz-2.6 kHz, the range corresponding to the clarinet is 146 Hz-2.6 kHz, and the range corresponding to the oboe is 220 Hz-2.6 kHz.
It should be noted that the instrument set may be a set of instrument components whose pitch frequency range includes the target pitch frequency.
In a specific implementation, for example, the target pitch frequency is 2575 Hz, which falls within the pitch frequency ranges of the piano (80 Hz-8 kHz), the violin (173 Hz-3.1 kHz), the viola (123 Hz-2.6 kHz), the cello (61 Hz-2.6 kHz), the clarinet (146 Hz-2.6 kHz) and the oboe (220 Hz-2.6 kHz); at this time, the following instrument set can be generated: {piano, violin, viola, cello, clarinet, oboe}.
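A minimal sketch of this set-building step, using the example pitch frequency ranges listed above (the table type and the function name are illustrative assumptions):

```swift
/// Preset instrument pitch table: instrument name -> pitch frequency range in Hz.
let presetInstrumentPitchTable: [String: ClosedRange<Double>] = [
    "erhu": 293...1318, "pipa": 110...1200, "piano": 80...8000,
    "violin": 173...3100, "viola": 123...2600, "cello": 61...2600,
    "clarinet": 146...2600, "oboe": 220...2600
]

/// Keeps every instrument whose pitch frequency range contains the target pitch frequency.
func instrumentSet(for targetPitchFrequency: Double) -> Set<String> {
    Set(presetInstrumentPitchTable
        .filter { $0.value.contains(targetPitchFrequency) }
        .map { $0.key })
}

// instrumentSet(for: 2575) -> {piano, violin, viola, cello, clarinet, oboe}
```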
Step S103: and searching the characteristic musical instrument information corresponding to the current position information, and selecting a target musical instrument from the musical instrument set according to the characteristic musical instrument information.
It should be noted that the current location information may be the position where the user is currently located. The characteristic musical instrument information may be an instrument representative of the user's location. For example, the characteristic instrument corresponding to China is the pipa (lute), and the characteristic instrument corresponding to Europe is the violin.
It should be understood that the searching for the characteristic instrument information corresponding to the current position information may be searching for the characteristic instrument information corresponding to the current position information in a preset instrument table. The preset musical instrument table includes a correspondence between position information and characteristic musical instrument information, and the correspondence between position information and characteristic musical instrument information may be preset, or may be automatically crawled from the internet by the electronic book sharing device.
It will be appreciated that selecting a target instrument from the instrument set based on the characteristic instrument information may be to match the characteristic instrument information with instrument information in the instrument set, and take the successfully matched instrument as the target instrument.
In a specific implementation, for example, when the current location information is Italy, the characteristic instrument information corresponding to the current location information is the violin, and the instrument set is {piano, violin, viola, cello, clarinet, oboe}; the violin is therefore selected from the instrument set as the target instrument.
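A minimal sketch of selecting the target instrument by matching the characteristic instrument of the current position against the instrument set (the region-to-instrument table is an illustrative assumption):

```swift
/// Preset instrument table: position information -> characteristic instrument information.
let presetCharacteristicInstrumentTable: [String: String] = [
    "China": "pipa", "Italy": "violin", "Europe": "violin"
]

/// Returns the instrument from the set that matches the characteristic instrument, if any.
func targetInstrument(currentPosition: String, instrumentSet: Set<String>) -> String? {
    guard let characteristic = presetCharacteristicInstrumentTable[currentPosition] else { return nil }
    return instrumentSet.contains(characteristic) ? characteristic : nil
}

// targetInstrument(currentPosition: "Italy",
//                  instrumentSet: ["piano", "violin", "viola", "cello", "clarinet", "oboe"]) -> "violin"
```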
Step S104: background audio is generated from the target musical instrument and the target pitch frequency.
It should be appreciated that generating the background audio from the target instrument and the target pitch frequency may be adjusting the tone of the target instrument to the target pitch frequency to obtain the background audio.
According to the second embodiment, the target pitch frequency is determined through the preset pitch model according to the current air temperature information, the pitch frequency range of each instrument is obtained, the instrument set is determined according to the target pitch frequency and the pitch frequency ranges, the characteristic instrument information corresponding to the current position information is searched, the target instrument is selected from the instrument set according to the characteristic instrument information, and the background audio is generated according to the target instrument and the target pitch frequency, so that background audio better matching the user's location and the current temperature can be generated, and the personalized sharing requirement of the user is met.
In a second embodiment, the step S20 includes:
step S201: and acquiring a user shooting image, and searching an initial background image corresponding to the user shooting image.
The user shot image may be an image shot by the user in real time, or may be an image of the user in the image library, which is not limited in this embodiment.
It should be understood that searching for the initial background image corresponding to the user captured image may be performed by matching the user captured image with the virtual scene image, and selecting the initial background image from the virtual scene image according to the matching result. The virtual scene image may be preset, or may be crawled from the internet by the electronic book sharing device, which is not limited in this embodiment.
It is understood that selecting the initial background image from the virtual scene images according to the matching result may be to use the successfully matched virtual scene image as the initial background image. The virtual scene image can be images of scenic spots, parks, libraries and the like.
Step S202: and generating the weather scene image according to the current weather information, and carrying out image aggregation on the weather scene image and the initial background image to obtain candidate background images.
It should be noted that the current weather information may be the weather information of the location where the electronic book sharing device is located at the current moment, for example, sunny, rainy, snowy or foggy weather.
It should be appreciated that generating the weather scene image according to the current weather information may be searching for a weather scene image corresponding to the current weather information in a preset weather scene image table. The weather scene image table includes a corresponding relationship between weather information and a weather scene image, and the corresponding relationship between weather information and the weather scene image may be preset, or may be crawled from the internet by the electronic book sharing device. For example, the weather scene image corresponding to a sunny day is a sunny weather scene image, and the weather scene image corresponding to a rainy day is a hazy scene image.
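A minimal sketch of the image aggregation step, overlaying the weather scene image on the initial background image to obtain the candidate background image (the renderer choice and the fixed overlay transparency are illustrative assumptions):

```swift
import UIKit

/// Draws the weather scene image over the initial background image.
func candidateBackgroundImage(initialBackground: UIImage, weatherScene: UIImage) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: initialBackground.size)
    return renderer.image { _ in
        let rect = CGRect(origin: .zero, size: initialBackground.size)
        initialBackground.draw(in: rect)
        // Overlay the weather scene (e.g. rain, snow or fog) with partial transparency.
        weatherScene.draw(in: rect, blendMode: .normal, alpha: 0.7)
    }
}
```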
Step S203: and acquiring sunrise time information corresponding to the current position information, and generating illumination shadow information according to the current time information and the sunrise time information.
The illumination shadow information may describe, starting from the position of the light source, the illumination on each object and the occlusion relationships between the objects.
It should be understood that acquiring the sunrise time information corresponding to the current position information may be searching for the sunrise time information corresponding to the current position information. In this embodiment and other embodiments, the sunrise time is denoted by N, and the sunset time is denoted by N+12.
It may be appreciated that generating the illumination shadow information according to the current time information and the sunrise time information may be determining the illumination shadow information according to the current time information and the sunrise time information through a preset illumination shadow model. The preset illumination shadow model is used for generating illumination shadow information.
Further, in order to ensure the reliability of the illumination shade information, the step S203 includes:
acquiring sunrise time information corresponding to a current position, and determining an illumination score through a preset illumination model according to the current time information and the sunrise time information;
and determining a shadow score according to the illumination score through a preset shadow model, and taking the shadow score as illumination shadow information.
It should be noted that the preset illumination model may be preset on the electronic book sharing device. For example, an illumination formula may be set in advance as the preset illumination model. In the illumination formula, l is the illumination score, T is the current time, and N is the sunrise time.
It should be understood that determining the illumination score through the preset illumination model according to the current time information and the sunrise time information may be extracting the current time from the current time information, extracting the sunrise time from the sunrise time information, and calculating the illumination score through the illumination formula according to the current time and the sunrise time.
It should be noted that the preset shadow model may be preset on the electronic book sharing device. For example, a shadow formula may be set in advance as the preset shadow model. The shadow formula is as follows:
shadow = l - ambientLight
where shadow is the shadow score, l is the illumination score, and ambientLight is an ambient light score, which is a constant value in the range [0, 0.5].
It is understood that determining the shadow score from the illumination score by the preset shadow model may be calculating the shadow score from the illumination score by a shadow formula.
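A minimal sketch of the two scores is given below. The patent's illumination formula is not reproduced in this text, so the sinusoidal illumination model is an assumption that merely rises from sunrise (N) toward noon and falls back toward sunset (N+12); the shadow formula is the one given above, shadow = l - ambientLight:

```swift
import Foundation

/// Assumed illumination model: 0 at sunrise and sunset, 1 at solar noon.
func illuminationScore(currentTime t: Double, sunriseTime n: Double) -> Double {
    let x = (t - n) / 12.0
    guard x >= 0, x <= 1 else { return 0 }   // outside daylight hours
    return sin(Double.pi * x)
}

/// Shadow score per the given formula; ambientLight is a constant in [0, 0.5].
func shadowScore(illumination l: Double, ambientLight: Double = 0.3) -> Double {
    return l - ambientLight
}
```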
Step S204: and performing shadow adjustment on the candidate background images according to the illumination shadow information to obtain background images.
It should be understood that, the candidate background image is subjected to shading adjustment according to the illumination shading information, and the background image may be obtained by setting the transparency of the candidate background image according to the illumination shading information.
In a specific implementation, for example, the alpha transparency of the candidate background image may be set to a shadow value to obtain the background image.
Further, in order to be able to obtain a better lighting shadow effect. The step S204 includes:
generating a first background image according to the candidate background image and a first image frame, and setting a shadow mask on the first background image;
Generating a second background image according to the candidate background image, a second image frame and the illumination shadow information, and setting the first background image on an upper image layer of the second background image;
and determining a target shade angle according to the current time information and the sunrise time information, and adjusting the position of the shadow shade according to the target shade angle to obtain a background image.
It will be appreciated that generating the first background image from the candidate background image and the first image frame may be to put a background picture into the first image frame to generate the first background image.
It should be noted that the shadow mask may be a Mask. The Mask is attached to a layer so that it covers that layer's image and lets the image of the layer below show through.
It should be appreciated that generating the second background image from the candidate background image, the second image frame, and the illumination shading information may be to put the background picture into the second image frame and set the transparency of the processed background picture to obtain the second background image.
It may be appreciated that determining the target mask angle according to the current time information and the sunrise time information, and adjusting the position of the shadow mask according to the target mask angle, may be determining the sun illumination angle according to the current time information and the sunrise time information and taking the sun illumination angle as the target mask angle. The sun illumination angle changes from 0° to 180° as the time goes from N to N+12, where N represents the sunrise time and N+12 represents the sunset time.
For ease of understanding, the description is provided with reference to fig. 4-7, but the present solution is not limited thereto. Fig. 4 is a schematic view of Mask position adjustment when the sun illumination angle is from 0 ° to 45 °, and when the Mask angle is 0 °, the first background image is completely covered by the Mask, the first background image is not displayed, and only the display content (shown in the left image) of the second background image is displayed. When the Mask angle gradually becomes larger, the first background image and the second background image are displayed together, the display content of the first background image gradually increases, and the display content of the second background image gradually decreases (shown in the right image).
Fig. 5 is a schematic view of the mask position adjustment when the sun illumination angle is from 45 ° to 90 °. When the Mask angle gradually becomes larger, the first background image and the second background image are displayed together, the display content of the first background image gradually increases, and the display content of the second background image gradually decreases (shown in the left image); when the Mask angle becomes 90 °, the Mask disappears and the first background image is fully shown without any shadow effect (shown on the right).
Fig. 6 is a schematic view of the mask position adjustment when the sun illumination angle is from 90 ° to 135 °. When the Mask angle becomes 90 degrees, the Mask disappears, and the first background image is completely displayed, and no shadow effect exists (shown in the left graph); when the Mask angle is gradually reduced, the display content of the first background image is gradually reduced, and the display content of the second background image is gradually increased (shown in the right image).
Fig. 7 is a schematic view of the mask position adjustment when the sun illumination angle is from 135 ° to 180 °. When the Mask angle is gradually reduced, the display content of the first background image is gradually reduced, and the display content of the second background image is gradually increased (shown in the left graph); when the angle of the Mask is changed to 180 degrees, the first background image is completely covered by the Mask, and only the display content (shown in the right image) of the second background image is displayed.
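A minimal sketch of the target mask angle computation implied above, assuming the linear mapping of the time interval [N, N+12] onto the angle range [0°, 180°]:

```swift
/// Sun illumination angle used as the target mask angle, in degrees.
func targetMaskAngle(currentTime t: Double, sunriseTime n: Double) -> Double {
    // Clamp to daylight hours; outside [N, N+12] the angle is pinned to 0° or 180°.
    let daylightFraction = min(max((t - n) / 12.0, 0), 1)
    return 180.0 * daylightFraction
}

// Example: with sunrise N = 6 and current time T = 9, the mask angle is 45°,
// matching the transition illustrated in Fig. 4 and Fig. 5.
```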
According to the second embodiment, the user shooting image is obtained, the initial background image corresponding to the user shooting image is searched, the weather scene image is generated according to the current weather information, the weather scene image and the initial background image are subjected to image aggregation, the candidate background image is obtained, sunrise time information corresponding to the current position information is obtained, illumination shadow information is generated according to the current time information and the sunrise time information, shadow adjustment is carried out on the candidate background image according to the illumination shadow information, and the background image is obtained, so that the background image with the weather scene image and the illumination shadow information can be generated, the display content of the background image is richer, and the user experience is improved.
Referring to fig. 8, fig. 8 is a flowchart illustrating a third embodiment of the electronic book sharing method according to the present invention, and the third embodiment of the electronic book sharing method according to the present invention is proposed based on the first embodiment shown in fig. 2.
In a third embodiment, before the step S30, the method further includes:
step S210: and acquiring the inclination angle of the current equipment, and determining the book placement angle according to the inclination angle of the current equipment.
It should be noted that, the current device inclination angle may be an inclination angle of the electronic book sharing device at the current time.
It should be understood that the current device inclination angle may be obtained by a preset sensor. Wherein the preset sensor may be a gyro sensor.
It can be appreciated that, because of shaking of the device, the current device inclination angle cannot directly represent the book placement angle. Therefore, parameter correction is required.
It should be understood that determining the book placement angle according to the current device inclination angle may be calculating the book placement angle from the current device inclination angle through a preset angle formula, in which a is the book placement angle, S is the current device inclination angle, and N is a constant value.
Step S220: and determining books to be shared according to the book sharing instruction, and acquiring book pictures of the books to be shared.
It should be noted that, the book picture may be a book cover picture, or may be a picture most representative in a book, which is not limited in this embodiment.
It can be understood that the obtaining of the book pictures of the books to be shared may be by crawling the book pictures of the books to be shared from the internet through a preset script, or may be preset by the user. The preset script may be a crawler script.
Step S230: and rotating the book pictures according to the book placing angle to obtain the book pictures to be displayed.
It should be understood that rotating the book pictures according to the book placement angle may be rotating the book pictures so that the book pictures reach the book placement angle.
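A minimal sketch of rotating the book picture to the book placement angle (the renderer-based rotation is an illustrative choice; the placement-angle correction formula itself is not reproduced here):

```swift
import UIKit

/// Rotates the book picture around its centre to obtain the book picture to be displayed.
func bookPictureToDisplay(_ bookPicture: UIImage, placementAngleDegrees: CGFloat) -> UIImage {
    let radians = placementAngleDegrees * .pi / 180
    let renderer = UIGraphicsImageRenderer(size: bookPicture.size)
    return renderer.image { context in
        let cg = context.cgContext
        cg.translateBy(x: bookPicture.size.width / 2, y: bookPicture.size.height / 2)
        cg.rotate(by: radians)
        bookPicture.draw(in: CGRect(x: -bookPicture.size.width / 2,
                                    y: -bookPicture.size.height / 2,
                                    width: bookPicture.size.width,
                                    height: bookPicture.size.height))
    }
}
```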
The step S30 includes:
step S30': and determining book sharing information according to the reading audio, the background image and the book pictures to be displayed.
It should be understood that determining the book sharing information according to the speakable audio, the background image, and the book picture to be displayed may be performing audio synthesis on the speakable audio and the background audio to obtain the target audio, and determining the book sharing information according to the target audio, the background image, and the book picture to be displayed.
It should be noted that the book sharing information may be directly target audio, background image, and book picture to be displayed, or may be information link. The information link is used for representing storage positions of target audio, background images and to-be-displayed book pictures in the server.
According to the third embodiment, the book to be shared is determined according to the book sharing instruction by acquiring the inclination angle of the current equipment, determining the book placing angle according to the inclination angle of the current equipment, acquiring the book picture of the book to be shared, rotating the book picture according to the book placing angle, acquiring the book picture to be displayed, and determining the book sharing information according to the reading audio, the background image and the book picture to be displayed, so that the book picture to be displayed can be generated based on the inclination angle of the current equipment, and the book picture to be displayed is additionally added to the book sharing information, so that the book sharing information is richer, and the book sharing information is more suitable for actual scenes.
In a third embodiment, before the step S30, the method further includes:
acquiring user information and generating virtual character animation according to the user information;
the determining book sharing information according to the speakable audio, the background audio and the background image includes:
and determining book sharing information according to the reading audio, the background image and the virtual character animation.
The user information may be a user ID, user basic information, or the like. The virtual character animation may be a virtual character gif animation.
It should be appreciated that generating the virtual character animation from the user information may be searching a preset virtual character animation table for a virtual character animation corresponding to the user information. The preset virtual character animation table comprises the corresponding relation between the user information and the virtual character animation, and the corresponding relation between the user information and the virtual character animation can be preset.
In a specific implementation, for example, a gif animation of a virtual character is configured for each user in advance, and the gif animation of the corresponding virtual character is cyclically played when audio is played according to the user ID.
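A minimal sketch of the per-user virtual character animation lookup described above (the table contents and types are illustrative assumptions):

```swift
/// Preset virtual character animation table: user ID -> virtual character gif animation.
let presetVirtualCharacterAnimationTable: [String: String] = [
    "user_1001": "avatar_reader_cat.gif",
    "user_1002": "avatar_reader_owl.gif"
]

/// The gif returned here is played in a loop while the shared audio plays.
func virtualCharacterAnimation(forUserID userID: String) -> String? {
    presetVirtualCharacterAnimationTable[userID]
}
```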
In the third embodiment, the book sharing information is determined according to the read-aloud audio, the background image and the virtual character animation by acquiring the user information and generating the virtual character animation according to the user information, so that the virtual character animation can be additionally added into the book sharing information, and the book sharing information is richer.
In addition, an embodiment of the present invention further provides a storage medium, where an electronic book sharing program is stored, where the electronic book sharing program implements the electronic book sharing method described above when executed by a processor.
In addition, referring to fig. 9, an embodiment of the present invention further provides an electronic book sharing apparatus, which includes: a background audio generation module 10, a background image generation module 20, a sharing information generation module 30 and a sharing information sending module 40;
The background audio generation module 10 is configured to generate background audio according to the current position information and the current air temperature information when receiving a book sharing instruction.
It should be noted that, the user may initiate the book sharing instruction through the user interaction interface of the electronic book sharing device. The book sharing instruction may include sharing text content, sharing chapter information, user information, and the like.
The current location information may be location information of a location where the electronic book sharing device is located at a current time. The current air temperature information may be air temperature information of a location where the electronic book sharing device is located at the current time.
The electronic book sharing device can collect current position information through a position sensor, and collect current air temperature information through a temperature sensor. Wherein, the position sensor and the temperature sensor may be pre-installed on the electronic book sharing device.
It may be understood that generating the background audio according to the current position information and the current air temperature information may be to use the current position information and the current air temperature information as audio reference information, and search a preset background audio table for the background audio corresponding to the audio reference information. The preset background audio table contains a corresponding relation between audio reference information and background audio, and the corresponding relation between the audio reference information and the background audio can be preset.
Further, in order to generate background audio better matching the user's location and the current temperature and meet the personalized sharing requirement of the user, the background audio generation module 10 is further configured to: when a book sharing instruction is received, determine a target pitch frequency through a preset pitch model according to the current air temperature information, acquire the pitch frequency range of each instrument, determine an instrument set according to the target pitch frequency and the pitch frequency ranges, search for characteristic instrument information corresponding to the current position information, select a target instrument from the instrument set according to the characteristic instrument information, and generate the background audio according to the target instrument and the target pitch frequency.
The background image generating module 20 is configured to generate a background image according to the current time information and the current weather information.
It should be noted that the current weather information may be the weather information of the location where the electronic book sharing device is located at the current moment, for example, sunny, rainy, snowy or foggy weather. The current time information may be the time information of the location where the electronic book sharing device is located at the current moment.
It is understood that the electronic book sharing device may crawl weather information and time information of a location from the internet.
It should be understood that generating the background image according to the current time information and the current weather information may be to use the current time information and the current weather information as background reference information, and search a preset background image table for a background image corresponding to the background reference information. The preset background image table contains the corresponding relation between the background reference information and the background image, and the corresponding relation between the background reference information and the background image can be preset by a user.
Further, in order to enrich the content of the background image and improve the user experience, the background image generating module 20 is further configured to obtain a user shot image, search an initial background image corresponding to the user shot image, generate a weather scene image according to current weather information, perform image aggregation on the weather scene image and the initial background image, obtain a candidate background image, obtain sunrise time information corresponding to current position information, generate illumination shadow information according to the current time information and the sunrise time information, and perform shadow adjustment on the candidate background image according to the illumination shadow information to obtain the background image.
The sharing information generating module 30 is configured to determine book sharing information according to the speakable audio, the background audio and the background image.
It should be noted that, the speakable audio may be generated by the electronic book sharing device or may be manually input by the user, which is not limited in this example.
In a specific implementation, for example, the electronic book sharing device starts a ReplayKit recording function, the shared text content selected by the user is transmitted to the TTS reading engine, the ReplayKit module captures the reading audio stream data, and the reading audio stream data is converted into reading audio through AVFoundation.
It should be understood that determining the book sharing information according to the speakable audio, the background audio, and the background image may be performing audio synthesis on the speakable audio and the background audio to obtain a target audio, and determining the book sharing information according to the target audio and the background image.
It should be noted that the book sharing information may directly be the target audio and the background image, or may be an information link, where the information link indicates the storage locations of the target audio and the background image on the server.
The sharing information sending module 40 is configured to send the book sharing information to a receiving terminal.
The receiving terminal may be the terminal that receives the book sharing information. For example, if user A wants to share an electronic book with user B through the electronic book sharing device, the terminal corresponding to user B is the receiving terminal.
It should be understood that sending the book sharing information to the receiving terminal may be: generating a sharing link according to the book sharing information, and sending the sharing link to the receiving terminal.
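One possible (assumed) realization of such a sharing link: the target audio and the background image are stored server-side under a generated identifier, and only that identifier is embedded in the link. The URL format and the in-memory storage stand-in are illustrative assumptions.

```python
import uuid

# Stand-in for server-side storage of the shared content; in practice this would be
# a database or object store. The URL format is an illustrative assumption.
SHARED_CONTENT: dict[str, tuple[str, str]] = {}

def create_sharing_link(target_audio_path: str, background_image_path: str,
                        base_url: str = "https://example.com/share/") -> str:
    share_id = uuid.uuid4().hex
    SHARED_CONTENT[share_id] = (target_audio_path, background_image_path)
    return base_url + share_id

def resolve_sharing_link(share_id: str) -> tuple[str, str]:
    # The receiving terminal uses the identifier to fetch the target audio and background image.
    return SHARED_CONTENT[share_id]
```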
It can be understood that after the user of the receiving terminal clicks the sharing link, the receiving terminal displays the background image and plays the target audio.
In this embodiment, when a book sharing instruction is received, background audio is generated according to the current position information and the current air temperature information, a background image is generated according to the current time information and the current weather information, book sharing information is determined according to the reading audio, the background audio and the background image, and the book sharing information is sent to a receiving terminal. Compared with the existing way of sharing electronic books with other users through image-and-text sharing, this embodiment generates the background audio and the background image when the book sharing instruction is received and determines the book sharing information according to the reading audio, the background audio and the background image, so that image-and-text sharing information can be replaced by book sharing information carrying reading audio, background audio and a background image, which enriches the displayed content and improves the user experience.
Other embodiments or specific implementation manners of the electronic book sharing device according to the present invention may refer to the above method embodiments, and are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or system that comprises that element.
The foregoing embodiment numbers of the present invention are merely for description and do not represent the advantages or disadvantages of the embodiments. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The terms first, second, third, etc. do not denote any order; they are used merely as names to distinguish elements.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware alone, but in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. read-only memory (ROM)/random access memory (RAM), magnetic disk, or optical disk) and including instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods according to the embodiments of the present invention.
The foregoing description covers only the preferred embodiments of the present invention and is not intended to limit the scope of the invention; any equivalent structures or equivalent processes derived from the disclosure herein, whether employed directly or indirectly in other related technical fields, fall likewise within the scope of protection of the present invention.

Claims (9)

1. An electronic book sharing method, characterized by comprising the following steps:
when a book sharing instruction is received, generating background audio according to the current position information and the current air temperature information;
generating a background image according to the current time information and the current weather information;
determining book sharing information according to the reading audio, the background audio and the background image;
the book sharing information is sent to a receiving terminal;
when receiving a book sharing instruction, the step of generating background audio according to the current position information and the current air temperature information comprises the following steps:
when a book sharing instruction is received, determining a target pitch frequency through a preset pitch model according to the current air temperature information;
acquiring a pitch frequency range of each musical instrument, and determining an instrument set according to the target pitch frequency and the pitch frequency ranges;
searching characteristic instrument information corresponding to the current position information, and selecting a target instrument from the instrument set according to the characteristic instrument information;
and generating background audio according to the target musical instrument and the target pitch frequency.
2. The electronic book sharing method of claim 1, wherein the step of generating the background image according to the current time information and the current weather information comprises:
acquiring a user shooting image, and searching an initial background image corresponding to the user shooting image;
generating a weather scene image according to the current weather information, and performing image aggregation on the weather scene image and the initial background image to obtain a candidate background image;
acquiring sunrise time information corresponding to the current position information, and generating illumination shadow information according to the current time information and the sunrise time information;
and performing shadow adjustment on the candidate background image according to the illumination shadow information to obtain the background image.
3. The electronic book sharing method of claim 2, wherein the step of obtaining sunrise time information corresponding to the current location information and generating illumination shadow information according to the current time information and the sunrise time information comprises the steps of:
acquiring sunrise time information corresponding to the current position information, and determining an illumination score through a preset illumination model according to the current time information and the sunrise time information;
and determining a shadow score according to the illumination score through a preset shadow model, and taking the shadow score as illumination shadow information.
4. The electronic book sharing method of claim 2, wherein the step of performing shadow adjustment on the candidate background image according to the illumination shadow information to obtain the background image comprises:
generating a first background image according to the candidate background image and a first image frame, and setting a shadow mask on the first background image;
generating a second background image according to the candidate background image, a second image frame and the illumination shadow information, and setting the first background image on an upper image layer of the second background image;
and determining a target mask angle according to the current time information and the sunrise time information, and adjusting the position of the shadow mask according to the target mask angle to obtain the background image.
5. The electronic book sharing method of any one of claims 1 to 4, wherein, before the step of determining book sharing information according to the reading audio, the background audio and the background image, the method further comprises:
acquiring an inclination angle of the current device, and determining a book placement angle according to the inclination angle of the current device;
determining books to be shared according to the book sharing instruction, and acquiring book pictures of the books to be shared;
rotating the book pictures according to the book placement angle to obtain book pictures to be displayed;
the determining book sharing information according to the reading audio, the background audio and the background image includes:
and determining book sharing information according to the reading audio, the background image and the book pictures to be displayed.
6. The electronic book sharing method of any one of claims 1 to 4, wherein, before the step of determining book sharing information according to the reading audio, the background audio and the background image, the method further comprises:
acquiring user information and generating virtual character animation according to the user information;
the determining book sharing information according to the reading audio, the background audio and the background image includes:
and determining book sharing information according to the reading audio, the background image and the virtual character animation.
7. An electronic book sharing apparatus, characterized in that the electronic book sharing apparatus comprises: a memory, a processor, and an electronic book sharing program stored on the memory and executable on the processor, wherein the electronic book sharing program, when executed by the processor, implements the electronic book sharing method according to any one of claims 1 to 6.
8. A storage medium, wherein an electronic book sharing program is stored on the storage medium, and the electronic book sharing program when executed by a processor implements the electronic book sharing method according to any one of claims 1 to 6.
9. An electronic book sharing apparatus, comprising: the system comprises a background audio generation module, a background image generation module, a sharing information generation module and a sharing information sending module;
the background audio generation module is used for generating background audio according to the current position information and the current air temperature information when receiving a book sharing instruction;
the background image generation module is used for generating a background image according to the current time information and the current weather information;
the sharing information generation module is used for determining book sharing information according to the reading audio, the background audio and the background image;
the sharing information sending module is used for sending the book sharing information to a receiving terminal;
the background audio generation module is further used for: when a book sharing instruction is received, determining a target pitch frequency through a preset pitch model according to the current air temperature information; acquiring a pitch frequency range of each musical instrument, and determining an instrument set according to the target pitch frequency and the pitch frequency ranges; searching characteristic instrument information corresponding to the current position information, and selecting a target instrument from the instrument set according to the characteristic instrument information; and generating background audio according to the target musical instrument and the target pitch frequency.
CN202110815925.1A 2021-07-19 2021-07-19 Electronic book sharing method, device, storage medium and apparatus Active CN113485609B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110815925.1A CN113485609B (en) 2021-07-19 2021-07-19 Electronic book sharing method, device, storage medium and apparatus


Publications (2)

Publication Number Publication Date
CN113485609A CN113485609A (en) 2021-10-08
CN113485609B true CN113485609B (en) 2024-04-09

Family

ID=77941417

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110815925.1A Active CN113485609B (en) 2021-07-19 2021-07-19 Electronic book sharing method, device, storage medium and apparatus

Country Status (1)

Country Link
CN (1) CN113485609B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114626468B (en) * 2022-03-17 2024-02-09 小米汽车科技有限公司 Method, device, electronic equipment and storage medium for generating shadow in image

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8045761B2 (en) * 2006-05-30 2011-10-25 Intelliview Technologies Inc. Detection of environmental conditions in a sequence of images
US20170010773A1 (en) * 2012-07-27 2017-01-12 Felice Curcelli A collaborative system and method for publishing multi-media interactive content
US20140282205A1 (en) * 2013-03-14 2014-09-18 Eugene Teplitsky Apparatus, system and method for electronic book reading
US20170091831A1 (en) * 2015-09-25 2017-03-30 Bookgrabbr, Inc. Automated generation of content-limited previews for electronic media in a sharing platform

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9569549B1 (en) * 2010-05-25 2017-02-14 Amazon Technologies, Inc. Location based recommendation and tagging of media content items
JP2013143028A (en) * 2012-01-11 2013-07-22 Sharp Corp Electronic book terminal, web system, information sharing method, and program, realizing information sharing
CN105474157A (en) * 2013-05-09 2016-04-06 亚马逊技术股份有限公司 Mobile device interfaces
CN103679204A (en) * 2013-12-23 2014-03-26 上海安琪艾可网络科技有限公司 Image identification and creation application system and method based on intelligent mobile device platform
CN103957240A (en) * 2014-04-09 2014-07-30 广州市久邦数码科技有限公司 Weather system capable of shooting pictures in real time and conducting sharing and implementation method of weather system capable of shooting pictures in real time and conducting sharing
CN106708894A (en) * 2015-11-17 2017-05-24 腾讯科技(深圳)有限公司 Method and device of configuring background music for electronic book
CN105975581A (en) * 2016-05-05 2016-09-28 腾讯科技(北京)有限公司 Media information display method, client and server
CN106844677A (en) * 2017-01-24 2017-06-13 宇龙计算机通信科技(深圳)有限公司 A kind of method and device of Information Sharing
CN109145142A (en) * 2018-07-24 2019-01-04 维沃移动通信有限公司 A kind of management method and terminal of the sharing information of picture
CN110572431A (en) * 2019-07-31 2019-12-13 华为技术有限公司 Card sharing method, device and system
CN111796754A (en) * 2020-06-30 2020-10-20 上海连尚网络科技有限公司 Method and device for providing electronic books
CN112130788A (en) * 2020-08-05 2020-12-25 华为技术有限公司 Content sharing method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on the visual integration of micro-shared image-text messages on the WeChat platform; Yu Lulu; Packaging Engineering (No. 22); pp. 123-127 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant