CN116612261A - Information processing method, device, terminal and storage medium - Google Patents

Information processing method, device, terminal and storage medium Download PDF

Info

Publication number
CN116612261A
Authority
CN
China
Prior art keywords
image
information processing
label
processing method
dimensional model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310643905.XA
Other languages
Chinese (zh)
Inventor
李闪磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Youzhuju Network Technology Co Ltd
Original Assignee
Beijing Youzhuju Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Youzhuju Network Technology Co Ltd filed Critical Beijing Youzhuju Network Technology Co Ltd
Priority to CN202310643905.XA priority Critical patent/CN116612261A/en
Publication of CN116612261A publication Critical patent/CN116612261A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/08 Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The disclosure provides an information processing method and apparatus, a terminal, and a storage medium. The information processing method comprises the following steps: displaying a drawn first image in a first area of an image drawing page in response to a first preset operation in the image drawing page; displaying a second image in a second area of the image drawing page based on the first image, wherein the first image is a planar image and the second image is a three-dimensional model corresponding to the first image; and, in response to the generation or movement of a first label on the first image, determining a location of a second label on the second image corresponding to the first label and displaying the second label on the second image. The information processing method enables a user to observe the positions of the labels on the first image and the second image in real time, avoids the second label on the second image being occluded, and improves the quality of the second image.

Description

Information processing method, device, terminal and storage medium
Technical Field
The present disclosure relates to the field of information technologies, and in particular, to an information processing method and apparatus, a terminal, and a storage medium.
Background
In three-dimensional models, such as those used in Virtual Reality (VR), it is often necessary to identify functional rooms by name in the form of labels, and this semantic information is generated when the floor plan is drawn. Fig. 1 shows an exemplary floor plan, and Fig. 2 shows the three-dimensional model corresponding to the floor plan of Fig. 1. The positions of the room-name labels in the floor plan therefore ultimately determine their positions in the three-dimensional model. However, when the floor plan is drawn, dragging a room-name label with the mouse yields only two-dimensional coordinates in the plane, which correspond to planar coordinates in a top view of the three-dimensional model; displaying the label in the three-dimensional model requires three-dimensional coordinates, so one dimension is missing. If the third (height) coordinate of the label is set from the storey height or some other empirical value, then, when model quality is poor, the label coordinate generated from the floor plan can easily fall inside the triangle mesh of the three-dimensional model, because the model contains many noisy triangles. In terms of visual presentation, the label then appears lost no matter from which viewing angle the model is observed.
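The missing-dimension problem described above can be illustrated with a short sketch; the function name, the 0.5 height factor, and the coordinates are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical illustration of the dimension gap: dragging a label on the
# floor plan yields only (x, y); a display height z must be guessed, e.g.
# from the storey height, before the label can be placed in the 3D model.

def lift_label_to_3d(x, y, storey_height, height_factor=0.5):
    """Lift a 2D floor-plan label position to 3D by assigning an
    empirical height value (the guessed third dimension)."""
    z = storey_height * height_factor
    return (x, y, z)

# A label at plan position (3.2, 4.5) in a 2.8 m-high room:
print(lift_label_to_3d(3.2, 4.5, 2.8))  # (3.2, 4.5, 1.4)
```

If the guessed z lands inside a noisy region of the triangle mesh, the label is occluded from every viewpoint, which is exactly the failure mode the disclosed method is designed to catch.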
Disclosure of Invention
In order to solve the existing problems, the present disclosure provides an information processing method and apparatus, a terminal, and a storage medium.
The present disclosure adopts the following technical solutions.
An embodiment of the present disclosure provides an information processing method, including: displaying a drawn first image in a first area of an image drawing page in response to a first preset operation in the image drawing page; displaying a second image in a second area of the image drawing page based on the first image, wherein the first image is a planar image, and the second image is a three-dimensional model corresponding to the first image; in response to the generation or movement of a first label on the first image, a location of a second label on the second image corresponding to the first label is determined and the second label is displayed on the second image.
Another embodiment of the present disclosure provides an information processing apparatus including: a first display module configured to display a drawn first image in a first area of an image drawing page in response to a first preset operation in the image drawing page; a second display module configured to display a second image in a second area of the image drawing page based on the first image, wherein the first image is a planar image, and the second image is a three-dimensional model corresponding to the first image; a tag presentation module configured to determine a location of a second tag on the second image corresponding to the first tag in response to generation or movement of the first tag on the first image, and present the second tag on the second image.
In some embodiments, the present disclosure provides a terminal comprising: at least one memory and at least one processor; the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the information processing method.
In some embodiments, the present disclosure provides a storage medium for storing program code for performing the above-described information processing method.
According to the method and apparatus of the present disclosure, the second image is displayed in the second area of the image drawing page based on the first image, and the position of the second label corresponding to the first label on the second image is determined in response to the generation or movement of the first label on the first image. The user can thus observe the label positions on the first image and the second image in real time, which avoids the second label on the second image being occluded and improves the quality of the second image.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Fig. 1 shows an exemplary floor plan.
Fig. 2 shows a three-dimensional model corresponding to the floor plan of fig. 1.
Fig. 3 is a flowchart of an information processing method of an embodiment of the present disclosure.
Fig. 4 is a schematic diagram of an image rendering page of an embodiment of the present disclosure.
Fig. 5 is a partial block diagram of an information processing apparatus according to an embodiment of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device of an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in different orders and/or in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., including, but not limited to. The term "based on" is based at least in part on. The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments. Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "an" in this disclosure are intended to be illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be construed as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
In three-dimensional models, such as those used in Virtual Reality (VR), it is often necessary to identify functional rooms by name in the form of labels, and this semantic information is generated when the floor plan is drawn. Fig. 1 shows an exemplary floor plan, and Fig. 2 shows the three-dimensional model corresponding to the floor plan of Fig. 1. The positions of the room-name labels in the floor plan therefore ultimately determine their positions in the three-dimensional model. However, when the floor plan is drawn, dragging a room-name label with the mouse yields only two-dimensional coordinates in the plane, which correspond to planar coordinates in a top view of the three-dimensional model; displaying the label in the three-dimensional model requires three-dimensional coordinates, so one dimension is missing. If the third (height) coordinate of the label is set from the storey height or some other empirical value, then, when model quality is poor, the label coordinate generated from the floor plan can easily fall inside the triangle mesh of the three-dimensional model, because the model contains many noisy triangles. In terms of visual presentation, the label then appears lost no matter from which viewing angle the model is observed.
For example, for a floor plan and its corresponding three-dimensional model, this greatly degrades the presentation of the three-dimensional model on the client side: labels in the model appear to be lost, so the function of each room is unclear. It also increases the workload of the operator who draws the floor plan, because the corresponding positions of the labels in the three-dimensional model cannot be seen in real time while drawing. After a project is submitted, a reviewer checks the completeness of the floor plan and of the three-dimensional model's labels. The reviewer then finds that labels of the three-dimensional model appear to be missing (in fact the labels are only occluded by the model and cannot be seen from any viewing angle) and returns the project for revision, which lowers the operator's first-pass review rate and increases the operator's workload.
Fig. 3 provides a flowchart of an information processing method of an embodiment of the present disclosure. The information processing method of the present disclosure may include step S101: displaying a drawn first image in a first area of an image drawing page in response to a first preset operation in the image drawing page. Fig. 4 shows a schematic diagram of an image drawing page of an embodiment of the present disclosure, wherein the image displayed in the first area is the first image. In some embodiments, the image drawing page comprises an image drawing page of a floor-plan drawing application. In some embodiments, the first preset operation may include an operation of drawing a floor plan, for example, an operation of drawing a wall, a door, or the like. In some embodiments, the drawn first image may be displayed in the first area of the image drawing page through the drawing operations of a drawing operator.
In some embodiments, the method of the present disclosure may further include step S102 of displaying a second image in a second area of the image drawing page based on the first image. In some embodiments, by drawing and displaying a first image in a first region, a second image may be automatically generated and displayed in a second region of the image drawing page. In some embodiments, the first image is a planar image, such as a planar floor plan. In some embodiments, the second image is a three-dimensional model corresponding to the first image, e.g., a three-dimensional model corresponding to a planar floor plan. In some embodiments, the corresponding three-dimensional model may be generated from the planar floor plan by an automated modeling method, although the disclosure is not limited thereto, and other suitable methods may be employed, which are not described in detail herein for the sake of simplicity.
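As one hypothetical illustration of how a three-dimensional model could be derived from a planar floor plan (the disclosure does not specify the modeling method, and the data layout below is an assumption), each 2D wall segment can be extruded vertically to the storey height:

```python
# Illustrative sketch only: extrude a 2D wall segment from the floor plan
# into a vertical quad of the 3D model. A full automated modeling pipeline
# would also handle doors, windows, floors and ceilings.

def extrude_wall(p1, p2, height):
    """Turn a 2D wall segment (p1 -> p2) into the four corners of a
    vertical quad, listed floor edge first, then ceiling edge."""
    (x1, y1), (x2, y2) = p1, p2
    return [
        (x1, y1, 0.0), (x2, y2, 0.0),        # floor edge
        (x2, y2, height), (x1, y1, height),  # ceiling edge
    ]

# A 4 m wall extruded to a 2.8 m storey height:
print(extrude_wall((0.0, 0.0), (4.0, 0.0), 2.8))
```

Applying this to every wall segment of the floor plan yields the kind of top-view-consistent model shown in the second area of Fig. 4.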
In some embodiments, the method of the present disclosure may further include step S103, in response to the generation or movement of the first label on the first image, determining a position of a second label corresponding to the first label on the second image, and displaying the second label on the second image. In some embodiments, as shown in fig. 4, by drawing or moving a label on a first image (e.g., "bedroom"), the location of a second label on a second image corresponding to the first label (e.g., label "bedroom" on the second image) may be automatically determined and presented on the corresponding location of the second image.
Taking the first image as a floor plan as an example, while drawing the floor plan the operator can observe the three-dimensional model displayed in the second area. Meanwhile, to solve the problem of the label being occluded, when the drawing operator moves the first label on the first image, the position of the corresponding second label in the second image can be calculated in real time and displayed in the second image, so that the operator can observe whether the second label is occluded; this avoids a review rejection caused by submitting work in which the second label is blocked. In some embodiments, the height value of the second label in the second image may be given, for example, by a default based on the storey height. In this way, the present disclosure achieves linkage between the first label on the first image and the second label on the second image.
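The real-time linkage just described might be sketched as follows; the class name, attributes, and default height are illustrative assumptions, not the disclosed implementation:

```python
# Hypothetical sketch of label linkage: dragging the label on the floor
# plan updates the model label's (x, y) while its height keeps the
# default value derived from the storey height.

class LinkedLabel:
    def __init__(self, name, x, y, default_height=1.4):
        self.name = name
        self.plan_pos = (x, y)                    # position on the floor plan
        self.model_pos = (x, y, default_height)   # position in the 3D model

    def move_on_plan(self, x, y):
        """Moving the first label recomputes the second label in real time."""
        self.plan_pos = (x, y)
        z = self.model_pos[2]  # keep the current display height
        self.model_pos = (x, y, z)

label = LinkedLabel("bedroom", 3.0, 4.0)
label.move_on_plan(2.0, 5.0)
print(label.model_pos)  # (2.0, 5.0, 1.4)
```

In a real client, the recomputed `model_pos` would immediately be redrawn in the second area so the operator can see whether the label is occluded.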
In some embodiments, the first preset operation includes an operation of drawing or modifying the first image. For example, as described above, the first preset operation may include an operation of drawing the first image (e.g., a floor plan), such as an operation of drawing a wall, a door, or the like.
In some embodiments, the first region and the second region do not overlap. As shown in fig. 4, the first region and the second region do not overlap each other. The dotted line in fig. 4 indicates that the first area and the second area need not have an explicit dividing line; for convenience of viewing, the dotted line in fig. 4 may move, for example left and right, depending on the sizes of the first image and the second image. For example, when the user enlarges the second image, the dotted line in fig. 4 may move leftward; conversely, when the user enlarges the first image, the dotted line may move rightward.
In some embodiments, the first image comprises a floor plan and the second image comprises a three-dimensional model corresponding to the floor plan. Thus, when a user draws or moves a room-name label in the floor plan, the client can calculate the position of that label in the three-dimensional model in real time, display it in the model, and let the user observe whether the label is occluded by the three-dimensional model.
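Checking whether a model label is occluded can be done, for example, by casting a ray from the viewpoint toward the label and testing it against the model's triangles. The sketch below uses the standard Möller–Trumbore ray/triangle intersection test; it is an illustrative assumption, not the patent's implementation:

```python
def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Möller–Trumbore ray/triangle intersection.
    Returns the distance t along the (unit) ray, or None if no hit."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:            # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)
    return t if t > eps else None

def label_is_blocked(viewpoint, label_pos, triangles):
    """The label is blocked if some triangle lies between the viewpoint
    and the label along the line of sight."""
    direction = tuple(l - c for l, c in zip(label_pos, viewpoint))
    dist = sum(d * d for d in direction) ** 0.5
    direction = tuple(d / dist for d in direction)
    for v0, v1, v2 in triangles:
        t = ray_hits_triangle(viewpoint, direction, v0, v1, v2)
        if t is not None and t < dist:
            return True
    return False
```

A client could run such a test against the model mesh for the current camera (or for a set of representative viewpoints) each time a label moves, and warn the operator when the label is occluded.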
In some embodiments, displaying the second image in the second region of the image drawing page includes: displaying the second image in the second region in a 3D mode or a virtual reality mode. As shown in fig. 4, the user may select in which mode to display the second image, thereby obtaining a more comprehensive viewing angle on the second image.
In some embodiments, the information processing method of the present disclosure further includes: in response to movement of the second label on the second image, determining a position of the first label on the first image corresponding to the second label, and updating the position of the first label on the first image. The present disclosure thus achieves linkage between the first label on the first image and the second label on the second image: when the first label on the first image is moved, the second label on the second image moves correspondingly, and when the second label on the second image is moved, the first label on the first image also moves correspondingly. Therefore, to keep the second label on the second image from being occluded, the user may either move the second label directly or move the first label, which greatly facilitates operation.
For example, in the three-dimensional model, a room-name label may be dragged to a position not occluded by the model; the corresponding position of that label in the floor plan can then be calculated in real time, and the position of the room-name label in the floor plan updated at the same time. The drawing operator can thus observe, in real time, the actual positions of the labels both in the floor plan and in the three-dimensional model, avoiding repeated revisions after a reviewer returns the project because a label is occluded by the model.
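The reverse mapping described here, from the three-dimensional model back to the floor plan, amounts to a top-view projection that discards the height coordinate; a minimal illustrative sketch (names are assumptions):

```python
def model_to_plan(pos3d):
    """Project a 3D model label position back onto the floor plan by
    discarding the height coordinate (a top-view projection)."""
    x, y, _z = pos3d
    return (x, y)

# A label dragged to (2.0, 5.0, 1.9) in the model maps to plan position:
print(model_to_plan((2.0, 5.0, 1.9)))  # (2.0, 5.0)
```

Because the floor plan corresponds to a top view of the model, this projection loses no information that the plan can represent; only the label's display height stays with the model side.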
In some embodiments, the information processing method of the present disclosure further includes: in response to a change in the display orientation of the first image, a corresponding change in the display orientation of the second image is made. In some embodiments, the information processing method of the present disclosure further includes: in response to a change in the display orientation of the second image, the display orientation of the first image is changed accordingly. For example, by rotating the first image, the second image is also rotated accordingly; by rotating the second image, the first image is rotated accordingly.
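One hypothetical way to realize this orientation linkage is for both views to read a single shared rotation state, so rotating either image rotates the other; the class below is an illustrative assumption, not the disclosed implementation:

```python
# Illustrative sketch: both the floor-plan view and the 3D-model view
# render from one shared yaw angle, so a rotation applied through either
# view is immediately reflected in the other.

class SharedOrientation:
    def __init__(self):
        self.yaw_degrees = 0.0

    def rotate(self, delta):
        """Apply a rotation from either view; both views re-render from it."""
        self.yaw_degrees = (self.yaw_degrees + delta) % 360.0

views = SharedOrientation()
views.rotate(90.0)   # the user rotates the floor plan...
views.rotate(-30.0)  # ...or the 3D model; both now show the same heading
print(views.yaw_degrees)  # 60.0
```

Keeping one source of truth for the orientation avoids the two views drifting apart, which mirrors the label linkage design elsewhere in the method.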
Through the linkage of the first label on the first image and the second label on the second image, the user can conveniently observe whether the second label is occluded while drawing the first image and adjust its position in time, avoiding review rejections caused by an occluded second label.
Embodiments of the present disclosure also provide an information processing apparatus 400, and fig. 5 is a partial block diagram of the information processing apparatus 400 of an embodiment of the present disclosure. The information processing apparatus 400 includes a first display module 401, a second display module 402, and a tag presentation module 403. In some embodiments, the first display module 401 is configured to display the drawn first image in a first area of the image drawing page in response to a first preset operation in the image drawing page. In some embodiments, the second display module 402 is configured to display a second image in a second area of the image drawing page based on the first image, wherein the first image is a planar image and the second image is a three-dimensional model corresponding to the first image. In some embodiments, the tag presentation module 403 is configured to determine a position of a second label corresponding to the first label on the second image in response to the generation or movement of the first label on the first image, and present the second label on the second image.
It should be understood that the descriptions regarding the information processing method also apply to the information processing apparatus 400 herein, and for the sake of simplicity, detailed descriptions thereof will not be provided herein.
In some embodiments, the first preset operation includes an operation of drawing or modifying the first image. In some embodiments, the first region and the second region do not overlap. In some embodiments, the first image comprises a floor plan and the second image comprises a three-dimensional model corresponding to the floor plan. In some embodiments, displaying the second image in the second region of the image drawing page includes: displaying the second image in the second region in a 3D mode or a virtual reality mode. In some embodiments, the tag presentation module is further configured to: in response to movement of the second label on the second image, determine a position of the first label on the first image corresponding to the second label, and update the position of the first label on the first image. In some embodiments, the second display module is further configured to: in response to a change in the display orientation of the first image, correspondingly change the display orientation of the second image. In some embodiments, the first display module is further configured to: in response to a change in the display orientation of the second image, correspondingly change the display orientation of the first image.
In addition, the present disclosure also provides a terminal, including: at least one memory and at least one processor; the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the information processing method.
Further, the present disclosure also provides a computer storage medium storing a program code for executing the above-described information processing method.
The information processing method and apparatus of the present disclosure are described above based on the embodiments and application. In addition, the present disclosure also provides a terminal and a storage medium, which are described below.
Referring now to fig. 6, a schematic diagram of an electronic device (e.g., a terminal device or server) 500 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 6, the electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the electronic device 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 shows an electronic device 500 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or from the storage means 508, or from the ROM 502. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (Hypertext Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods of the present disclosure described above.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a System on a Chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided an information processing method including: displaying a drawn first image in a first area of an image drawing page in response to a first preset operation in the image drawing page; displaying a second image in a second area of the image drawing page based on the first image, wherein the first image is a planar image, and the second image is a three-dimensional model corresponding to the first image; and, in response to generation or movement of a first label on the first image, determining a location of a second label on the second image corresponding to the first label, and displaying the second label on the second image.
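The label-synchronisation step summarised above can be sketched as follows. This is a minimal illustration assuming the plan and the model share a common origin and scale, and that labels sit at a fixed height on the model; the names `PlanTransform` and `planLabelToModel` are hypothetical and not taken from the disclosure.

```typescript
// Illustrative sketch: mapping a label placed on the 2-D plan (first image)
// to a position on the 3-D model (second image). The coordinate convention
// (plan pixels -> metres, labels anchored at a fixed height) is an assumption
// for illustration only.

interface Point2D { x: number; y: number; }
interface Point3D { x: number; y: number; z: number; }

interface PlanTransform {
  originX: number;      // plan-pixel x that maps to world x = 0
  originY: number;      // plan-pixel y that maps to world y = 0
  metresPerPixel: number;
  labelHeight: number;  // height (m) at which labels sit on the 3-D model
}

// Convert a first-label position on the plan to the corresponding
// second-label position on the three-dimensional model.
function planLabelToModel(p: Point2D, t: PlanTransform): Point3D {
  return {
    x: (p.x - t.originX) * t.metresPerPixel,
    y: (p.y - t.originY) * t.metresPerPixel,
    z: t.labelHeight,
  };
}
```

With such a mapping, a label created or dragged on the plan can be re-projected onto the model on every change event, which is the behaviour the embodiment describes.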
According to one or more embodiments of the present disclosure, the first preset operation includes an operation of drawing or modifying the first image.
According to one or more embodiments of the present disclosure, the first region and the second region do not overlap.
According to one or more embodiments of the present disclosure, the first image includes a floor plan, and the second image includes a three-dimensional model diagram corresponding to the floor plan.
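One plausible way to derive a three-dimensional model from a floor plan is to extrude each drawn wall segment to a fixed height. The disclosure does not specify a modelling algorithm, so the sketch below, with its `Wall2D` and `extrudeWall` names, is an illustrative assumption only.

```typescript
// Illustrative sketch: each wall segment drawn on the floor plan is
// extruded vertically to a fixed ceiling height, yielding one quad
// (four 3-D corners) per wall.

interface Wall2D { x1: number; y1: number; x2: number; y2: number; }
type Vec3 = [number, number, number];

function extrudeWall(w: Wall2D, height: number): Vec3[] {
  // Bottom edge lies in the plan's z = 0 plane; the top edge is the
  // same segment lifted by `height`.
  return [
    [w.x1, w.y1, 0],
    [w.x2, w.y2, 0],
    [w.x2, w.y2, height],
    [w.x1, w.y1, height],
  ];
}
```

Applying this to every wall of the plan produces a simple mesh that a 3D or virtual-reality renderer can display in the second area.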
According to one or more embodiments of the present disclosure, displaying the second image in the second region of the image drawing page includes: the second image is displayed in the second region in a 3D mode or a virtual reality mode.
According to one or more embodiments of the present disclosure, the information processing method further includes: in response to movement of the second label on the second image, determining a location of the first label on the first image corresponding to the second label, and updating the location of the first label on the first image.
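The reverse synchronisation can be sketched as the inverse mapping: the label's height on the model is discarded and the origin/scale transform is inverted to recover a plan position. The transform fields here are illustrative assumptions, not part of the disclosure.

```typescript
// Illustrative sketch: when the second label is moved on the 3-D model,
// project its world position back onto the plan (first image) by dropping
// the height component and inverting the assumed scale/origin mapping.

interface Point2D { x: number; y: number; }
interface Point3D { x: number; y: number; z: number; }
interface PlanTransform { originX: number; originY: number; metresPerPixel: number; }

function modelLabelToPlan(p: Point3D, t: PlanTransform): Point2D {
  return {
    x: p.x / t.metresPerPixel + t.originX,
    y: p.y / t.metresPerPixel + t.originY,
  };
}
```

Because the plan-to-model and model-to-plan transforms are inverses, moving either label leaves the pair consistent, which is the bidirectional behaviour this embodiment adds.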
According to one or more embodiments of the present disclosure, the information processing method further includes: in response to a change in the display orientation of the first image, correspondingly changing the display orientation of the second image; or, in response to a change in the display orientation of the second image, correspondingly changing the display orientation of the first image.
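One way to keep the two display orientations in step is a single shared heading that both views observe, so rotating either image updates the other through the same state. The listener wiring below is an illustrative assumption; the disclosure does not describe a concrete implementation.

```typescript
// Illustrative sketch: a shared heading (in degrees) drives the display
// orientation of both the plan view and the model view. Rotating either
// view calls setHeading(), which normalises the angle and notifies every
// registered view so the other one follows.

class OrientationSync {
  private headingDeg = 0;
  private listeners: Array<(deg: number) => void> = [];

  onChange(fn: (deg: number) => void): void {
    this.listeners.push(fn);
  }

  // Called when either the first image or the second image is rotated.
  setHeading(deg: number): void {
    // Normalise into [0, 360), handling negative rotations.
    this.headingDeg = ((deg % 360) + 360) % 360;
    this.listeners.forEach(fn => fn(this.headingDeg));
  }

  get heading(): number {
    return this.headingDeg;
  }
}
```

Routing both views through one state object makes the "either direction" behaviour symmetric by construction, rather than requiring two separate update paths.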
According to one or more embodiments of the present disclosure, there is provided an information processing apparatus including: a first display module configured to display a drawn first image in a first area of an image drawing page in response to a first preset operation in the image drawing page; a second display module configured to display a second image in a second area of the image drawing page based on the first image, wherein the first image is a planar image, and the second image is a three-dimensional model corresponding to the first image; and a label presentation module configured to, in response to generation or movement of a first label on the first image, determine a location of a second label on the second image corresponding to the first label and present the second label on the second image.
According to one or more embodiments of the present disclosure, there is provided a terminal including: at least one memory and at least one processor; wherein the at least one memory is configured to store program code, and the at least one processor is configured to invoke the program code stored by the at least one memory to perform any of the methods described above.
According to one or more embodiments of the present disclosure, there is provided a storage medium for storing program code for performing the above-described method.
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the specific combinations of features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (10)

1. An information processing method, characterized in that the information processing method comprises:
displaying a drawn first image in a first area of an image drawing page in response to a first preset operation in the image drawing page;
displaying a second image in a second area of the image drawing page based on the first image, wherein the first image is a planar image, and the second image is a three-dimensional model corresponding to the first image;
in response to generation or movement of a first label on the first image, determining a location of a second label on the second image corresponding to the first label, and displaying the second label on the second image.
2. The information processing method according to claim 1, wherein the first preset operation includes an operation of drawing or modifying the first image.
3. The information processing method according to claim 1, wherein the first area and the second area do not overlap.
4. The information processing method according to claim 1, wherein the first image includes a floor plan, and the second image includes a three-dimensional model diagram corresponding to the floor plan.
5. The information processing method according to claim 4, wherein displaying the second image in the second area of the image drawing page includes: the second image is displayed in the second region in a 3D mode or a virtual reality mode.
6. The information processing method according to claim 1, characterized by further comprising:
in response to movement of the second label on the second image, determining a location of the first label on the first image corresponding to the second label, and updating the location of the first label on the first image.
7. The information processing method according to claim 1, characterized by further comprising:
in response to a change in the display orientation of the first image, correspondingly changing the display orientation of the second image; or
in response to a change in the display orientation of the second image, correspondingly changing the display orientation of the first image.
8. An information processing apparatus, characterized in that the information processing apparatus comprises:
a first display module configured to display a drawn first image in a first area of an image drawing page in response to a first preset operation in the image drawing page;
a second display module configured to display a second image in a second area of the image drawing page based on the first image, wherein the first image is a planar image, and the second image is a three-dimensional model corresponding to the first image;
a label presentation module configured to, in response to generation or movement of a first label on the first image, determine a location of a second label on the second image corresponding to the first label and present the second label on the second image.
9. A terminal, comprising:
at least one memory and at least one processor;
wherein the at least one memory is configured to store program code, and the at least one processor is configured to invoke the program code stored in the at least one memory to perform the information processing method of any of claims 1 to 7.
10. A storage medium storing program code for executing the information processing method according to any one of claims 1 to 7.
CN202310643905.XA 2023-06-01 2023-06-01 Information processing method, device, terminal and storage medium Pending CN116612261A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310643905.XA CN116612261A (en) 2023-06-01 2023-06-01 Information processing method, device, terminal and storage medium


Publications (1)

Publication Number Publication Date
CN116612261A (en) 2023-08-18

Family

ID=87679843

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310643905.XA Pending CN116612261A (en) 2023-06-01 2023-06-01 Information processing method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN116612261A (en)

Similar Documents

Publication Publication Date Title
CN111291244B (en) House source information display method, device, terminal and storage medium
CN110795196A (en) Window display method, device, terminal and storage medium
CN113589926B (en) Virtual interface operation method, head-mounted display device and computer readable medium
CN113129366B (en) Monocular SLAM initialization method and device and electronic equipment
CN113506356B (en) Method and device for drawing area map, readable medium and electronic equipment
US20230199262A1 (en) Information display method and device, and terminal and storage medium
CN113253874B (en) Control method and device of display device, terminal and storage medium
CN110619615A (en) Method and apparatus for processing image
CN116612261A (en) Information processing method, device, terminal and storage medium
CN111290812B (en) Display method, device, terminal and storage medium of application control
CN113703704A (en) Interface display method, head-mounted display device and computer readable medium
CN110807164A (en) Automatic image area adjusting method and device, electronic equipment and computer readable storage medium
CN113835791B (en) Method and apparatus for presenting hierarchical relationships of view components
CN116880726B (en) Icon interaction method and device for 3D space, electronic equipment and medium
CN111563214B (en) Reference line processing method and device
CN114911564B (en) Page movement processing method, device, equipment and storage medium
CN115033324B (en) Method and device for displaying diagrams in three-dimensional space page, electronic equipment and medium
CN112395826B (en) Text special effect processing method and device
CN114327188B (en) Form layout method, form layout device, electronic equipment and computer readable medium
CN114357348B (en) Display method and device and electronic equipment
CN111489286B (en) Picture processing method, device, equipment and medium
CN111311665B (en) Video processing method and device and electronic equipment
CN118069487A (en) Page display method, device, equipment and storage medium
CN118034551A (en) Display method and device and electronic equipment
CN116808589A (en) Motion control method and device, readable medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination