CN108876873B - Image generation method, device, equipment and storage medium

Info

Publication number
CN108876873B
Authority
CN
China
Prior art keywords
target object, texture, image, texture data, data
Prior art date
Legal status
Active
Application number
CN201810648905.8A
Other languages
Chinese (zh)
Other versions
CN108876873A (en)
Inventor
冯伟
Current Assignee
Shanghai Wingtech Electronic Technology Co Ltd
Original Assignee
Shanghai Wingtech Electronic Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Wingtech Electronic Technology Co Ltd filed Critical Shanghai Wingtech Electronic Technology Co Ltd
Priority to CN201810648905.8A priority Critical patent/CN108876873B/en
Publication of CN108876873A publication Critical patent/CN108876873A/en
Application granted granted Critical
Publication of CN108876873B publication Critical patent/CN108876873B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 - 2D [Two Dimensional] image generation
    • G06T 11/40 - Filling a planar surface by adding surface attributes, e.g. colour or texture
    • G06T 11/001 - Texturing; Colouring; Generation of texture or colour

Abstract

Embodiments of the invention disclose an image generation method, device, equipment, and storage medium, wherein the method comprises the following steps: determining texture features of a target object; processing the texture features to obtain texture data of the target object; and adding the texture data of the target object, or index information corresponding to the texture data, to the image of the target object to obtain a composite image of the target object. With the technical scheme provided by the embodiments of the invention, a composite image capable of presenting the texture effect of an object to the user can be obtained, improving the user's experience of and interactivity with the image.

Description

Image generation method, device, equipment and storage medium
Technical Field
Embodiments of the invention relate to the technical field of image processing, and in particular to an image generation method, device, equipment, and storage medium.
Background
An image is a planar medium composed of graphics and other elements; it is a representation of an objective object and contains information about the object being described. With the development of science and technology, images have become an indispensable medium for acquiring information in people's lives.
Currently, there are many image formats. By processing mode, they can be divided into bitmaps and vector graphics; for example, the Tagged Image File Format (TIFF) is a bitmap image format, and Scalable Vector Graphics (SVG) is a vector graphics image format. However, most image formats focus only on the display effect: the display form is single, the amount of information is small, and the user experience is poor.
Disclosure of Invention
Embodiments of the invention provide an image generation method, device, equipment, and storage medium, which can present the texture effect of an object to the user and improve the user's experience of and interactivity with the image.
In a first aspect, an embodiment of the present invention provides an image generation method, including:
determining texture features of a target object;
processing the texture features to obtain texture data of the target object; and
adding the texture data of the target object, or index information corresponding to the texture data of the target object, to the image of the target object to obtain a composite image of the target object.
In a second aspect, an embodiment of the present invention further provides an image generating apparatus, including:
a texture feature module, configured to determine texture features of the target object;
a texture data module, configured to process the texture features to obtain texture data of the target object;
and a composite image module, configured to add the texture data of the target object, or index information corresponding to the texture data of the target object, to the image of the target object to obtain a composite image of the target object.
Further, the texture data module comprises a processing unit configured to:
determine the intensity information of the ultrasonic signal corresponding to the target object according to the concave-convex information in the texture features;
and/or determine the frequency information of the ultrasonic signal corresponding to the target object according to the flatness or softness in the texture features.
Further, the apparatus further includes an audio module, where the audio module is specifically configured to:
acquire preset audio data of the target object, and add the preset audio data, or index information corresponding to the preset audio data, to the image of the target object to obtain a composite image of the target object.
In a third aspect, an embodiment of the present invention further provides an apparatus, where the apparatus includes:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the image generation method described above.
In a fourth aspect, the present invention further provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the image generation method described above.
According to the method and device, texture data of the target object is obtained by processing the texture features of the target object, and the texture data, or the index information corresponding to it, is added to the image of the target object to obtain a composite image of the target object. Because the texture data is embedded in the image, the composite image carries a texture effect: the user can feel the texture information of the image when using the composite image, which improves the user's experience of and interactivity with the image and enriches the user's sensory experience of the image.
Drawings
FIG. 1 is a flowchart illustrating an image generating method according to a first embodiment of the present invention;
FIG. 2 is a flowchart of an image generation method according to a second embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an image generating apparatus according to a third embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a device in a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of an image generation method in a first embodiment of the present invention. This embodiment is applicable to image generation. The method may be executed by an image generation apparatus, which may be implemented in software and/or hardware; for example, the apparatus may be configured in a device. The method specifically comprises the following steps:
Step 110: determining the texture features of the target object.
Texture is a visual feature that reflects the arrangement of a slowly varying or periodically varying structure on the surface of an object; it reflects the homogeneity phenomenon in an image.
Optionally, the texture features include surface roughness information, flatness, softness, and the like of the target object. For example, if the target object is a coin, the obtained texture features are the shapes of the patterns and characters on the coin and the corresponding concave-convex information.
In this embodiment, an image processing algorithm such as the Local Binary Pattern (LBP) method or the gray-level co-occurrence matrix may be used to analyze an image of the target object to obtain the texture features; alternatively, special equipment such as a sensor measuring probe or a micro-distance measuring device may be used to scan the surface of the target object to obtain its texture features, which are then applied to the image corresponding to the target object. The image of the target object may specifically be a picture or a video in a conventional format that contains only image information (such as color values, gray values, and image resolution).
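To make the LBP step concrete, the following is a minimal sketch in C (illustrative, not code from the patent) that computes the 8-neighbour LBP code of every interior pixel of a grayscale image; the row-major uint8_t buffer layout is an assumption made for the example.

    #include <stdint.h>

    /* Compute the 8-neighbour LBP code for every interior pixel of a
     * row-major 8-bit grayscale image; border pixels are left untouched. */
    void lbp_8neighbour(const uint8_t *img, uint8_t *codes, int w, int h)
    {
        /* Offsets of the 8 neighbours, clockwise from the top-left corner. */
        static const int dx[8] = { -1,  0,  1, 1, 1, 0, -1, -1 };
        static const int dy[8] = { -1, -1, -1, 0, 1, 1,  1,  0 };

        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                uint8_t center = img[y * w + x];
                uint8_t code = 0;
                for (int k = 0; k < 8; k++) {
                    /* Set bit k when the neighbour is at least as bright
                     * as the centre pixel. */
                    if (img[(y + dy[k]) * w + (x + dx[k])] >= center)
                        code |= (uint8_t)(1u << k);
                }
                codes[y * w + x] = code;
            }
        }
    }

The resulting per-pixel codes can then serve as the texture features from which the texture data is derived in the next step.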
Step 120: processing the texture features to obtain texture data of the target object.
The texture data may include parameters of the ultrasonic signal corresponding to the target object, such as power information, intensity information, and/or frequency information. The texture data enables the user to feel the texture effect of the target object. When the texture data is the power of the ultrasonic signal, the texture effect may be presented as follows: ultrasonic generators output ultrasonic signals of different powers according to the texture data, and the user's fingers sense these signals, giving the user the sensation of touching a real object. The power of the ultrasonic signal may correspond to the sharpness of the texture effect; for example, the sharper the texture effect, the greater the power of the output ultrasonic signal.
Optionally, processing the texture features to obtain the texture data of the target object includes: obtaining the texture data of the target object according to the texture features and the position information of the texture features in the image of the target object.
That is, the texture data of each pixel in the image of the target object is determined from the coordinates of that pixel in the image corresponding to the target object and the texture feature corresponding to that pixel, thereby determining the texture data of the whole image. For example, the format of the texture data may be typedef struct { int x; int y; int vol; int freq; } Wenli;, where x is the abscissa of a pixel, y is the ordinate of the pixel, vol is the ultrasonic intensity value corresponding to the pixel, and freq is the ultrasonic frequency corresponding to the pixel. If a pixel in the image of the target object has no texture feature, its ultrasonic intensity value and ultrasonic frequency are set to 0.
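As an illustration of this per-pixel record format, the following sketch fills one Wenli record per pixel. The relief and softness input maps, and the direct copies used as feature-to-signal mappings, are assumptions made for the example (the embodiment fixes no concrete formulas); pixels whose feature maps are zero receive vol = freq = 0, matching the rule above.

    #include <stdlib.h>

    typedef struct {
        int x;    /* abscissa (column) of the pixel           */
        int y;    /* ordinate (row) of the pixel              */
        int vol;  /* ultrasonic intensity value for the pixel */
        int freq; /* ultrasonic frequency for the pixel       */
    } Wenli;

    /* Build one texture record per pixel from two feature maps; pixels
     * whose features are zero end up with vol = freq = 0. */
    Wenli *build_texture_data(const int *relief,   /* concave-convex level per pixel */
                              const int *softness, /* flatness/softness per pixel    */
                              int w, int h)
    {
        Wenli *data = malloc((size_t)w * (size_t)h * sizeof *data);
        if (!data)
            return NULL;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                Wenli *p = &data[y * w + x];
                p->x = x;
                p->y = y;
                /* Direct copies stand in for the feature-to-signal mapping. */
                p->vol  = relief[y * w + x];
                p->freq = softness[y * w + x];
            }
        }
        return data;
    }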
Step 130: adding the texture data of the target object, or index information corresponding to the texture data of the target object, to the image of the target object to obtain a composite image of the target object.
Specifically, the texture data of the target object, or the index information corresponding to it, may be converted into binary data, and this binary data appended to the binary data of the image of the target object to obtain a composite image stored in binary form. The index information corresponding to the texture data is the file path under which the actual texture data is stored. When the amount of texture data to be added is large, the texture data may be stored in a separate file, with only the generated index information saved into the composite image. When the texture effect corresponding to the texture data of the composite image needs to be displayed, the texture data is retrieved via the index information, which reduces the file size of the composite image and improves its display responsiveness.
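The binary-append step can be sketched as follows; the payload may be either the raw texture data or an index path string. The "WNLI" chunk marker and the length field are illustrative assumptions, since the embodiment does not specify a container layout.

    #include <stdint.h>
    #include <stdio.h>

    /* Append a texture payload (raw texture data or an index path string)
     * after the ordinary image bytes. Marker and length are written in
     * native byte order for simplicity. */
    int append_texture_chunk(const char *image_path,
                             const void *payload, uint32_t payload_len)
    {
        FILE *f = fopen(image_path, "ab"); /* append after the image bytes */
        if (!f)
            return -1;
        fwrite("WNLI", 1, 4, f);           /* hypothetical chunk marker    */
        fwrite(&payload_len, sizeof payload_len, 1, f);
        fwrite(payload, 1, payload_len, f);
        fclose(f);
        return 0;
    }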
Optionally, adding the texture data of the target object to the image of the target object to obtain a composite image of the target object includes: taking the position information associated with the texture data as the position at which the texture data is added in the image of the target object, and adding the texture data at that position to obtain the composite image. The index information corresponding to the texture data of the target object may likewise be added at that position. Because the texture data is added at the position associated with it, a correspondence is established between the texture effect and the position in the composite image: the corresponding texture effect can be displayed at the corresponding position when the composite image is used, improving the accuracy of the texture effect displayed by the composite image.
Optionally, the composite image is displayed on a terminal that supports touch input, and touching the image on the screen triggers display of the corresponding texture effect. When displayed, the composite image may be parsed into image data and texture data; the texture data presents the texture effect to the user through a dedicated texture generation device, which may be a set of ultrasonic generators configured on the terminal.
According to the technical scheme of this embodiment, texture data of the target object is obtained by processing the texture features of the target object, and the texture data, or the index information corresponding to it, is added at the position in the image of the target object corresponding to the texture features, yielding a composite image of the target object. Because the texture data is embedded in the image, the composite image carries a texture effect: the user can feel the texture information of the image when using the composite image, which improves the user's experience of and interactivity with the image and enriches the user's sensory experience of the image.
Example two
Fig. 2 is a flowchart of an image generation method according to a second embodiment of the present invention. The present embodiment further optimizes the image generation method on the basis of the above-described embodiments. Correspondingly, the method of this embodiment may specifically include:
Step 210: determining the texture features of the target object.
Optionally, the texture features include surface roughness information, flatness, softness, and the like of the target object. The texture features may be obtained by analyzing the image of the target object, or by scanning the real target object.
Step 220: processing the texture features to obtain texture data of the target object.
The texture data may include ultrasonic signal strength information and/or frequency information corresponding to the target object.
Specifically, processing the texture features to obtain the texture data of the target object may include: determining the intensity information of the ultrasonic signal corresponding to the target object according to the concave-convex information in the texture features; and/or determining the frequency information of the ultrasonic signal corresponding to the target object according to the flatness or softness in the texture features. That is, the concave-convex information in the texture features is expressed by the intensity of the ultrasonic signal: the more convex a position of the target object, the greater the intensity; the more concave, the lower the intensity. The flatness or softness is expressed by the frequency of the ultrasonic signal: the higher the flatness or softness, the higher the frequency, and vice versa.
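These two mapping rules can be sketched as follows; the value ranges and the linear scaling are assumptions chosen only to illustrate "more convex, higher intensity" and "flatter or softer, higher frequency".

    /* Map the concave-convex (relief) level of a position to ultrasonic
     * intensity: more convex -> stronger, more concave -> weaker.
     * relief is assumed to lie in [-100, 100]. */
    int relief_to_intensity(int relief)
    {
        int vol = 50 + relief / 2;      /* 50 is a hypothetical mid-range */
        if (vol < 0)   vol = 0;
        if (vol > 100) vol = 100;
        return vol;
    }

    /* Map flatness/softness (assumed in [0, 100]) to ultrasonic frequency:
     * the flatter or softer the surface, the higher the frequency. */
    int softness_to_frequency(int softness)
    {
        const int f_min = 20000, f_max = 60000; /* hypothetical Hz range */
        return f_min + (f_max - f_min) * softness / 100;
    }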
Optionally, processing the texture features to obtain the texture data of the target object includes: obtaining the texture data of the target object according to the texture features and the position information of the texture features in the image of the target object.
Step 230: adding the texture data of the target object, or index information corresponding to the texture data of the target object, to the image of the target object to obtain a composite image of the target object.
Specifically, the position information associated with the texture data is taken as the position at which the texture data is added in the image of the target object, and the texture data, or the index information corresponding to it, is added at that position to obtain the composite image of the target object.
The image of the target object may specifically be a picture or a video in a conventional format that contains only image information (such as color values, gray values, and image resolution).
Step 240: acquiring preset audio data of the target object, and adding the preset audio data, or index information corresponding to the preset audio data, to the image of the target object to obtain a composite image of the target object.
The preset audio data may be a sound characteristic of the target object: the sound emitted when the target object is struck, or the natural sound of the scene in which the target object is located. For example, if the target object is a stream or a lake, the audio data may be the sound of the flowing water itself or the splash produced when the water is touched; if the target object is a bird, the audio data may be birdsong; if the target object is a glass bottle, the audio data may be the crisp sound the bottle makes when struck.
Optionally, after the preset audio data of the target object is acquired, the method further includes: taking the position information associated with the preset audio data as the position at which the preset audio data is added in the image of the target object, and adding the preset audio data, or the index information corresponding to it, at that position to obtain the composite image of the target object. For example, if the composite image is obtained by adding the index information corresponding to the preset audio data, the format of the texture data may be typedef struct { int x; int y; int vol; char audio_path[PATH_LEN]; } Wenli;, where x is the abscissa of a pixel, y is the ordinate of the pixel, vol is the ultrasonic intensity value corresponding to the pixel, and audio_path is the index information of the preset audio data corresponding to the pixel, i.e., the storage path of the file containing the preset audio data.
In addition, the texture data and the preset audio data of the target object may be added to the image of the target object simultaneously. Illustratively, the data format may be typedef struct { int x; int y; int vol; int freq; char audio_path[PATH_LEN]; } Wenli;, where x is the abscissa of a pixel, y is the ordinate of the pixel, vol is the ultrasonic intensity value corresponding to the pixel, freq is the ultrasonic frequency corresponding to the pixel, and audio_path is the storage path of the preset audio data file. In this case, the format contains both the texture data and the preset audio data.
Optionally, the composite image is displayed on a terminal that supports touch input; when the user touches the image on the screen, display of the corresponding texture effect and playback of the audio data are triggered. When displayed, the composite image may be parsed into image data, texture data, and audio data. The texture data presents the texture effect to the user through a dedicated texture generation device, which may be a set of ultrasonic generators configured on the terminal; the audio data is played through an audio device of the terminal, such as a speaker or earphones.
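This display-side dispatch can be sketched as follows, reusing the combined Wenli record format from step 240. The two driver functions are printf stubs standing in for the terminal's ultrasonic generators and audio device, whose real APIs the embodiment does not specify.

    #include <stdio.h>

    #define PATH_LEN 256

    /* Combined per-pixel record, as described in step 240 (illustrative). */
    typedef struct {
        int  x, y;                 /* pixel coordinates                     */
        int  vol;                  /* ultrasonic intensity for the pixel    */
        int  freq;                 /* ultrasonic frequency for the pixel    */
        char audio_path[PATH_LEN]; /* storage path of the preset audio file */
    } Wenli;

    /* printf stubs standing in for device-specific drivers (hypothetical). */
    static void emit_ultrasound(int vol, int freq)
    {
        printf("ultrasound: intensity=%d frequency=%d\n", vol, freq);
    }

    static void play_audio_file(const char *path)
    {
        printf("audio: playing %s\n", path);
    }

    /* When the user touches pixel (x, y), present the texture effect and
     * play the associated audio, as described above. */
    void on_touch(const Wenli *records, int width, int x, int y)
    {
        const Wenli *p = &records[y * width + x];
        if (p->vol > 0)
            emit_ultrasound(p->vol, p->freq); /* via the ultrasonic generators */
        if (p->audio_path[0] != '\0')
            play_audio_file(p->audio_path);   /* via speaker or earphones      */
    }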
According to this technical scheme, texture data of the target object is obtained by processing the texture features of the target object, preset audio data of the target object is acquired, and the texture data and/or the preset audio data are added to the image of the target object to obtain a composite image of the target object. The scheme yields a composite image capable of presenting both the texture effect and the audio effect of an object to the user, improving the user's experience of and interactivity with the image and enriching the user's sensory experience of the image.
EXAMPLE III
Fig. 3 is a schematic structural diagram of an image generating apparatus in a third embodiment of the present invention, where the apparatus may include:
a texture feature module 310, configured to determine a texture feature of the target object;
a texture data module 320, configured to process the texture features to obtain texture data of the target object;
a composite image module 330, configured to add texture data of the target object or index information corresponding to the texture data of the target object to the image of the target object to obtain a composite image of the target object.
Optionally, the texture data module 320 includes a processing unit, and the processing unit is configured to:
determine the intensity information of the ultrasonic signal corresponding to the target object according to the concave-convex information in the texture features;
and/or determine the frequency information of the ultrasonic signal corresponding to the target object according to the flatness or softness in the texture features.
Optionally, the texture data module 320 further comprises a location unit, the location unit is configured to:
obtain texture data of the target object according to the texture features and the position information of the texture features in the image of the target object, wherein the texture data includes the intensity information and/or the frequency information of the ultrasonic signal.
Optionally, the composite image module 330 is specifically configured to:
take the position information associated with the texture data as the position at which the texture data is added in the image of the target object, and add the texture data at that position to obtain a composite image of the target object.
Optionally, the apparatus further includes an audio module, where the audio module is specifically configured to:
acquire preset audio data of the target object, and add the preset audio data, or index information corresponding to the preset audio data, to the image of the target object to obtain a composite image of the target object.
The image generation apparatus provided by the embodiments of the invention can execute the image generation method provided by any embodiment of the invention, and has the functional modules and beneficial effects corresponding to the executed method.
Example four
Fig. 4 is a schematic structural diagram of a device according to a fourth embodiment of the present invention. FIG. 4 illustrates a block diagram of an exemplary device 412 suitable for use in implementing embodiments of the present invention. The device 412 shown in FIG. 4 is only an example and should not impose any limitation on the functionality or scope of use of embodiments of the present invention.
As shown in FIG. 4, device 412 is in the form of a general purpose computing device. The components of device 412 may include, but are not limited to: one or more processors 416, a system memory 428, and a bus 418 that couples the various system components including the system memory 428 and the processors 416.
Bus 418 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Device 412 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by device 412 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 428 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 430 and/or cache memory 432. The device 412 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 434 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, commonly referred to as a "hard drive"). Although not shown in FIG. 4, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 418 by one or more data media interfaces. Memory 428 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 440 having a set (at least one) of program modules 442 may be stored, for instance, in memory 428, such program modules 442 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which or some combination of which may comprise an implementation of a network environment. Program modules 442 generally perform the functions and/or methodologies of embodiments of the present invention as described herein.
The device 412 may also communicate with one or more external devices 414 (e.g., keyboard, pointing device, display 424, etc.), with one or more devices that enable a user to interact with the device 412, and/or with any devices (e.g., network card, modem, etc.) that enable the device 412 to communicate with one or more other computing devices. Such communication may occur through input/output (I/O) interfaces 422. Also, the device 412 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) through the network adapter 420. As shown, network adapter 420 communicates with the other modules of device 412 over bus 418. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the device 412, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, to name a few.
The processor 416 executes various functional applications and data processing by executing programs stored in the system memory 428.
The programs stored in the system memory 428 may be further specifically configured to cause the processor 416 to:
determining the texture features of the target object;
processing the texture features to obtain texture data of the target object;
and adding the texture data of the target object, or index information corresponding to the texture data of the target object, to the image of the target object to obtain a composite image of the target object.
The programs stored in the system memory 428 may be further specifically configured to cause the processor 416 to:
determining the intensity information of the ultrasonic signal corresponding to the target object according to the concave-convex information in the texture features;
and/or determining the frequency information of the ultrasonic signal corresponding to the target object according to the flatness or softness in the texture features.
In an alternative approach, the programs stored in the system memory 428 may also be used to cause the processor 416 to:
and obtaining texture data of the target object according to the texture features and the position information of the texture features in the image of the target object, wherein the texture data comprises intensity information and/or frequency information of the ultrasonic signals.
In an alternative approach, the programs stored in the system memory 428 may also be used to cause the processor 416 to:
taking the position information associated with the texture data as an adding position of the texture data in the image of the target object;
and adding the texture data to the adding position to obtain a composite image of the target object.
In an alternative approach, the programs stored in the system memory 428 may also be used to cause the processor 416 to:
and acquiring preset audio data of the target object, and adding the preset audio data or index information corresponding to the preset audio data to the image of the target object to obtain a composite image of the target object.
EXAMPLE five
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the image generation method provided in the embodiments of the present invention.
The program may specifically be adapted to cause a processor to perform the following operations:
determining the texture features of the target object;
processing the texture features to obtain texture data of the target object;
and adding the texture data of the target object, or index information corresponding to the texture data of the target object, to the image of the target object to obtain a composite image of the target object.
In an alternative, the program further causes the processor to:
determining the intensity information of the ultrasonic signal corresponding to the target object according to the concave-convex information in the texture features;
and/or determining the frequency information of the ultrasonic signal corresponding to the target object according to the flatness or softness in the texture features.
In an alternative, the program further causes the processor to:
and obtaining texture data of the target object according to the texture features and the position information of the texture features in the image of the target object, wherein the texture data comprises intensity information and/or frequency information of the ultrasonic signals.
In an alternative, the program further causes the processor to:
taking the position information associated with the texture data as the adding position of the texture data in the image of the target object;
and adding the texture data to the adding position to obtain a composite image of the target object.
In an alternative form, the program further causes the processor to:
and acquiring preset audio data of the target object, and adding the preset audio data or index information corresponding to the preset audio data to the image of the target object to obtain a composite image of the target object.
Computer storage media for embodiments of the present invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (9)

1. An image generation method, comprising:
determining texture features of a target object;
processing the texture features to obtain texture data of the target object;
adding the texture data of the target object, or index information corresponding to the texture data of the target object, to the image of the target object to obtain a composite image of the target object;
the processing the texture features to obtain the texture data of the target object includes:
determining the intensity information of the ultrasonic signal corresponding to the target object according to the concave-convex information in the texture features;
and/or determining the frequency information of the ultrasonic signal corresponding to the target object according to the flatness or softness in the texture features.
2. The method of claim 1, wherein processing the texture features to obtain texture data of the target object comprises:
and obtaining texture data of the target object according to the texture features and the position information of the texture features in the image of the target object, wherein the texture data comprises the intensity information and/or the frequency information of the ultrasonic signals.
3. The method of claim 2, wherein adding texture data of the target object to the image of the target object to obtain a composite image of the target object comprises:
taking the position information associated with the texture data as the adding position of the texture data in the image of the target object;
and adding the texture data to the adding position to obtain a composite image of the target object.
4. The method of claim 1, further comprising:
and acquiring preset audio data of the target object, and adding the preset audio data or index information corresponding to the preset audio data to the image of the target object to obtain a composite image of the target object.
5. An image generation apparatus, comprising:
the texture feature module is used for determining the texture feature of the target object;
the texture data module is used for processing the texture features to obtain texture data of the target object;
a composite image module, configured to add texture data of the target object or index information corresponding to the texture data of the target object to the image of the target object to obtain a composite image of the target object;
the texture data module is specifically configured to determine intensity information of the ultrasonic signal corresponding to the target object according to concave-convex information in the texture feature;
and/or determining the frequency information of the ultrasonic signal corresponding to the target object according to the flatness or softness in the texture features.
6. The apparatus of claim 5, wherein the texture data module comprises a location unit to:
and obtaining texture data of the target object according to the texture features and the position information of the texture features in the image of the target object, wherein the texture data comprises the intensity information and/or the frequency information of the ultrasonic signals.
7. The apparatus of claim 6, wherein the composite image module is specifically configured to:
taking the position information associated with the texture data as the adding position of the texture data in the image of the target object;
and adding the texture data to the adding position to obtain a composite image of the target object.
8. An apparatus, characterized in that the apparatus comprises:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the image generation method of any one of claims 1-4.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the image generation method of any one of claims 1 to 4.
CN201810648905.8A 2018-06-22 2018-06-22 Image generation method, device, equipment and storage medium Active CN108876873B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810648905.8A CN108876873B (en) 2018-06-22 2018-06-22 Image generation method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810648905.8A CN108876873B (en) 2018-06-22 2018-06-22 Image generation method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN108876873A CN108876873A (en) 2018-11-23
CN108876873B (en) 2022-07-19

Family

ID=64340778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810648905.8A Active CN108876873B (en) 2018-06-22 2018-06-22 Image generation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN108876873B (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4130114B2 (en) * 2002-10-09 2008-08-06 株式会社日立メディコ Ultrasonic imaging apparatus and ultrasonic signal processing method
US20140184596A1 (en) * 2012-12-28 2014-07-03 Microsoft Corporation Image based rendering
US9536288B2 (en) * 2013-03-15 2017-01-03 Samsung Electronics Co., Ltd. Creating details in an image with adaptive frequency lifting
EP2863363A1 (en) * 2013-09-30 2015-04-22 Samsung Medison Co., Ltd. Method and apparatus for generating three-dimensional image of target object
CN106780701B (en) * 2016-11-23 2020-03-13 深圳大学 Non-uniform texture image synthesis control method, device, storage medium and equipment

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10283487A (en) * 1997-04-04 1998-10-23 Fujitsu F I P Kk Multiple texture mapping device and method therefor and storage medium storing program for multiple texture mapping
US6483521B1 (en) * 1998-02-02 2002-11-19 Matsushita Electric Industrial Co., Ltd. Image composition method, image composition apparatus, and data recording media
JP2012058773A (en) * 2010-09-03 2012-03-22 Toshiba Corp Image processing apparatus
CN104991950A (en) * 2015-07-16 2015-10-21 百度在线网络技术(北京)有限公司 Picture generating method, display method and corresponding devices
CN106919257A (en) * 2017-02-28 2017-07-04 南京信息工程大学 Based on image luminance information power haptic interaction texture power reproducting method
CN107369188A (en) * 2017-07-12 2017-11-21 北京奇虎科技有限公司 The synthetic method and device of image

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Texture Synthesis Approach Using Cooperative Features; Chin-Chen Chang et al.; IEEE; 2013-08-08; pp. 1-3 *
Texture synthesis by non-parametric sampling; Efros A et al.; International Conference on Computer Vision; 1999-02-15; vol. 5 (no. 43); pp. 1033-1038 *
A New Texture Description Method and Its Application: the Multi-scale Patch Feature Method; Xu Qi; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2011-12-15; pp. 1-135 *
Research and Application of Image Inpainting Based on Texture Synthesis and Image Segmentation Based on Fractals; Liu Yang; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2010-09-15; pp. 1-114 *

Also Published As

Publication number Publication date
CN108876873A (en) 2018-11-23

Similar Documents

Publication Title
KR101458939B1 (en) Augmented reality system
CN109300179B (en) Animation production method, device, terminal and medium
CN111325699B (en) Image restoration method and training method of image restoration model
CN110069191B (en) Terminal-based image dragging deformation implementation method and device
CN110162604B (en) Statement generation method, device, equipment and storage medium
WO2022007565A1 (en) Image processing method and apparatus for augmented reality, electronic device and storage medium
CN111882634A (en) Image rendering method, device and equipment and storage medium
CN108763350B (en) Text data processing method and device, storage medium and terminal
CN111209377A (en) Text processing method, device, equipment and medium based on deep learning
CN111325220B (en) Image generation method, device, equipment and storage medium
US11475549B1 (en) High dynamic range image generation from tone mapped standard dynamic range images
CN113934297A (en) Interaction method and device based on augmented reality, electronic equipment and medium
CN108876873B (en) Image generation method, device, equipment and storage medium
CN111815748A (en) Animation processing method and device, storage medium and electronic equipment
CN109857244B (en) Gesture recognition method and device, terminal equipment, storage medium and VR glasses
CN112132859A (en) Sticker generation method, apparatus, medium, and electronic device
CN107452046B (en) Texture processing method, device and equipment of three-dimensional city model and readable medium
CN112528707A (en) Image processing method, device, equipment and storage medium
CN113361490B (en) Image generation method, network training method, image generation device, network training device, computer equipment and storage medium
CN111008934B (en) Scene construction method, device, equipment and storage medium
CN114154520A (en) Training method of machine translation model, machine translation method, device and equipment
EP3141291A1 (en) Methods and apparatus of composing an image of a textured material distorted when rubbing a touch surface
CN111127620A (en) Method and device for generating hemispherical domain sampling mode and computer storage medium
CN111026342A (en) Print preview picture generation method, device, equipment and storage medium
CN111354070A (en) Three-dimensional graph generation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant