CN116594609A - Visual programming method, visual programming device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN116594609A
CN116594609A (application CN202310521623.2A)
Authority
CN
China
Prior art keywords
programming
code block
programming object
user
scene area
Legal status
Pending
Application number
CN202310521623.2A
Other languages
Chinese (zh)
Inventor
王宇航
陈向东
曾顺超
贾强强
Current Assignee
Beijing Siming Qichuang Technology Co ltd
Original Assignee
Beijing Siming Qichuang Technology Co ltd
Application filed by Beijing Siming Qichuang Technology Co ltd
Priority to CN202310521623.2A
Publication of CN116594609A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00 Arrangements for software engineering
    • G06F8/30 Creation or generation of source code
    • G06F8/34 Graphical or visual programming
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00 Arrangements for software engineering
    • G06F8/40 Transformation of program code
    • G06F8/41 Compilation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a visual programming method, a visual programming device, an electronic device and a computer readable storage medium, and relates to the technical field of computers. The visual programming method comprises the following steps: responding to an operation of a user loading a programming interface, and displaying the programming interface, wherein the programming interface comprises a scene area, and a 3D programming object is displayed on the scene area; and responding to a modification operation of the user modifying an attribute of the 3D programming object, and modifying the display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modification operation. Because a 3D programming object is provided, the user can control it through programming to perform different actions, so that the spatial thinking ability of the user can be effectively trained. In addition, compared with existing 2D programming objects, the 3D programming object can perform many actions that a 2D programming object cannot, which further improves the diversity and applicability of programming products.

Description

Visual programming method, visual programming device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of computer technology, and in particular, to a visual programming method, device, electronic apparatus, and computer readable storage medium.
Background
There are many children's programming products on the market. These products usually provide a 2D (two-dimensional) scene, and after the user finishes programming, the user can run the edited program to control objects in the 2D scene to perform corresponding actions. However, existing children's programming products are not very effective at training children's spatial thinking ability, and objects in a 2D scene can perform only a limited range of actions, so the form and content of such programming products are monotonous and it is difficult for them to meet users' growing needs.
Disclosure of Invention
The application provides a visual programming method, a visual programming device, an electronic device and a computer readable storage medium, which are used to solve the problems that the prior art is poor at training children's three-dimensional spatial thinking ability and that a 2D scene can neither present the stereoscopic appearance of a three-dimensional model nor produce a perspective effect, so that 2D programming products have difficulty meeting users' growing needs.
In a first aspect, the present application provides a visual programming method, comprising: responding to an operation of a user loading a programming interface, and displaying the programming interface, wherein the programming interface comprises a scene area, and a 3D (three-dimensional) programming object is displayed on the scene area; and responding to a modification operation of the user modifying an attribute of the 3D programming object, and modifying the display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modification operation.
In the embodiment of the application, a 3D programming object is provided, and the user can control the 3D programming object to execute different actions through programming, so that the spatial thinking ability of the user can be effectively trained. In addition, compared with existing 2D programming objects, the 3D programming object can execute many actions that a 2D programming object cannot, which enriches the form and content of programming products and better meets the actual use requirements of users.
With reference to the foregoing technical solution of the first aspect, in some possible implementation manners, the attribute of the 3D programming object includes three-dimensional coordinates, a size ratio, and a three-dimensional rotation degree.
In the embodiment of the application, the three-dimensional coordinates, the size proportion and the three-dimensional rotation degree of the 3D programming object can be modified, so that the display state of the 3D programming object in the scene area can be more flexibly adjusted, the diversity and the flexibility of the 3D programming object are improved, and the use experience of a user is further improved.
With reference to the foregoing technical solution of the first aspect, in some possible implementation manners, the programming interface includes a modification control, where the modification control is used to modify an attribute of the 3D programming object.
In the embodiment of the application, the modification control is arranged on the programming interface, so that a user can modify the attribute of the 3D programming object through the modification control, the operation of modifying the attribute of the 3D programming object is simplified, and the user can intuitively feel the change of the attribute such as the position of the 3D programming object in the space in the three-dimensional scene, and the use experience of the user can be improved.
With reference to the foregoing technical solution of the first aspect, in some possible implementation manners, before the modifying operation that responds to a user to modify an attribute of the 3D programming object, modifies a display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modifying operation, the method further includes: and responding to a selection operation of selecting the 3D programming object by a user, determining the 3D programming object from a preset 3D programming object library, and displaying the 3D programming object in the scene area based on the preset initial attribute of the 3D programming object.
In the embodiment of the application, the 3D programming object library is arranged, so that a user can select different 3D programming objects according to actual use requirements, the diversity of programming products is enriched, and the use experience of the user is further improved.
With reference to the foregoing technical solution of the first aspect, in some possible implementation manners, after the modifying operation that responds to a user to modify an attribute of the 3D programming object, modifies a display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modifying operation, the method further includes: responding to configuration operation of a user configuration code block group, and adding the code block group corresponding to the configuration operation in a code block configuration area, wherein the code block group comprises at least one code block; and responding to the operation of the user for operating the code block group, and controlling the 3D programming object to execute the action corresponding to the code block group in the scene area.
With reference to the foregoing technical solution provided in the first aspect, in some possible implementation manners, the action includes at least one of bouncing, physical collision, moving, running.
In the embodiment of the application, the 3D programming object can execute actions such as bouncing, physical collision, moving and running. Compared with the prior art, in which programming methods based on 2D programming objects can only support straight-line actions such as moving forward, backward, left and right, the programming method based on a 3D programming object provided by the application offers richer programming content and can better meet the use requirements of users.
With reference to the foregoing technical solution of the first aspect, in some possible implementation manners, when the code block group includes making the 3D programming object perform a sounding operation, the controlling the 3D programming object to perform an action corresponding to the code block group in the scene area includes: and controlling the 3D programming object to execute an action corresponding to the sounding operation, and playing the voice corresponding to the sounding operation, wherein the playing volume and the sound channel of the voice are determined through the coordinates of the 3D programming object in the scene area.
In the embodiment of the application, the playing volume and the channel of the voice are determined by the coordinates of the 3D programming object in the scene area, so that the played voice is more attached to the 3D scene, and the use experience of a user is improved.
With reference to the foregoing technical solution provided in the first aspect, in some possible implementation manners, the programming interface includes a plurality of preset code blocks, and the adding, in a code block configuration area, a code block group corresponding to a configuration operation in response to a configuration operation of a user configuration code block group includes: responding to the configuration operation that a user drags each code block required by the code block group to the code block configuration area and splices with other code blocks in the code block group, controlling each code block required by the code block group to move according to a track dragged by the user and splice with other code blocks in the code block group, wherein each code block in the code block group is one of a plurality of preset code blocks.
In the embodiment of the application, the user only needs to drag each code block required by the code block group to the code block configuration area to splice with other code blocks in the code block group, thus completing the configurable code block group and simplifying the operation of configuring the code block group by the user.
With reference to the foregoing technical solution provided in the first aspect, in some possible implementation manners, the code block configuration area and the scene area overlap, the scene area includes a position layer, and the position layer is used to display the position of a code block group in the code block configuration area on the scene area; after the code block group corresponding to the configuration operation is added in the code block configuration area, the method further includes: hiding the code block group in the code block configuration area.
In the embodiment of the application, the code blocks in the code block configuration area are hidden, so that the image of the scene area can be completely displayed to the user, and the viewing experience of the user is improved. Meanwhile, the position of each code block in the code block configuration area on the scene area is displayed by using the position layer, so that a user can accurately find the position of the hidden code block group, and the code block group can be conveniently adjusted subsequently.
In a second aspect, the present application provides a visual programming apparatus comprising: the display module is used for responding to the operation of loading a programming interface by a user, and displaying the programming interface, wherein the programming interface comprises a scene area; and the attribute modification module is used for responding to the modification operation of modifying the attribute of the 3D programming object by a user under the condition that the 3D programming object is displayed on the scene area, and modifying the display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modification operation.
In a third aspect, an embodiment of the present application further provides a programming system, including: a server and the visual programming device of the second aspect, wherein the visual programming device is in communication connection with the server.
In a fourth aspect, an embodiment of the present application further provides an electronic device, including: the device comprises a memory and a processor, wherein the memory is connected with the processor; the memory is used for storing programs; the processor is configured to invoke the program stored in the memory to perform the method as provided by the embodiments of the first aspect and/or any of the possible implementation manners in combination with the embodiments of the first aspect.
In a fifth aspect, embodiments of the present application further provide a computer readable storage medium having stored thereon a computer program which, when executed by a computer, performs a method as provided by the embodiments of the first aspect and/or any of the possible implementations in combination with the embodiments of the first aspect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a visual programming method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a programming interface according to an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating a 3D programming object library according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a modification control, shown in accordance with an embodiment of the present application;
FIG. 5 is a schematic diagram of a code block set according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a scene area including a position layer according to an embodiment of the application;
FIG. 7 is a block diagram of a visual programming device according to an embodiment of the present application;
fig. 8 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, relational terms such as "first," "second," and the like may be used solely to distinguish one entity or action from another entity or action in the description of the application without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The technical scheme of the present application will be described in detail with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a flow chart of a visual programming method according to an embodiment of the application, and the steps included in the visual programming method will be described with reference to fig. 1.
S100: and responding to the operation of loading the programming interface by the user, and displaying the programming interface.
The programming interface includes a scene area, as shown in fig. 2, and the scene area is a 3D scene.
A specific way to display the scene area in the programming interface may be as follows: the scene and the resources selected by the user (e.g., 3D programming objects) are loaded into memory, and after loading is completed, rendering instructions are sent to the GPU (Graphics Processing Unit). The GPU uses a preset shader to compute the correspondence between the graphics vertices and the materials, and projects the 3D content onto the 2D screen by orthographic projection, thereby rendering and displaying the 3D graphics.
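A minimal sketch of this load-and-render flow is given below. The patent does not name a rendering library or platform; the use of a browser canvas and the three.js library here, and all identifiers, are assumptions made only for illustration.

// Sketch of the load-then-render flow described above (assumed web client + three.js).
import * as THREE from "three";

function renderSceneArea(canvas: HTMLCanvasElement): void {
  const scene = new THREE.Scene();                      // scene area loaded into memory

  // Orthographic projection of the 3D content onto the 2D screen, as described above.
  const aspect = canvas.clientWidth / canvas.clientHeight;
  const viewSize = 10;
  const camera = new THREE.OrthographicCamera(
    (-viewSize * aspect) / 2, (viewSize * aspect) / 2,
    viewSize / 2, -viewSize / 2, 0.1, 1000);
  camera.position.set(10, 10, 10);
  camera.lookAt(0, 0, 0);

  // A placeholder 3D programming object; the renderer's built-in shaders resolve the
  // correspondence between vertices and the material.
  const programmingObject = new THREE.Mesh(
    new THREE.BoxGeometry(1, 1, 1),
    new THREE.MeshStandardMaterial({ color: 0x44aa88 }));
  scene.add(programmingObject);
  scene.add(new THREE.DirectionalLight(0xffffff, 1));

  // Rendering instructions are handed to the GPU.
  const renderer = new THREE.WebGLRenderer({ canvas });
  renderer.setSize(canvas.clientWidth, canvas.clientHeight);
  renderer.render(scene, camera);
}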
The 3D programming object may be any type of 3D model, such as a character model, an animal model, a plant model, a vehicle model, etc., and the specific type of 3D programming object is not limited herein.
The 3D programming object may be preset, that is, when the scene area is loaded, the 3D programming object and the scene area are loaded together, and at this time, the preset 3D programming object is displayed in the loaded scene area. Or, the 3D programming object may be added by the user according to the actual use requirement, and at this time, the scene area may be loaded first, and after the scene area is loaded, the 3D programming object selected by the user is added in the scene area in response to the operation of adding the 3D programming object by the user.
Because the required 3D programming objects may be different in different application scenarios, in order to improve the application scope of the present solution, after responding to the operation of loading the programming interface by the user, the visual programming method further includes: in response to a selection operation of a user selecting a 3D programming object, the 3D programming object is determined from a preset 3D programming object library, and the 3D programming object is displayed in a scene area based on a preset initial attribute of the 3D programming object. The 3D programming object library comprises a plurality of 3D programming objects.
When the number of 3D programming objects included in the 3D programming object library is large, the 3D programming objects in the library may optionally be classified so that the user can quickly find the 3D programming object to be used. For ease of understanding, please refer to fig. 3.
The left portion of the 3D programming object library shown in fig. 3 illustrates the types of 3D programming objects (e.g., animals, people, sports, food, music, weapons, FOR TEST, etc. shown in fig. 3), and the right portion of the 3D programming object library shown in fig. 3 is used to display all 3D programming objects (e.g., dirt Cube, lore, tree 1, bus 3, ball 3, etc. shown in fig. 3) included in the selected types of 3D programming objects. The 3D programming object library shown in fig. 3 is merely one embodiment provided by the present application and should not be taken as limiting the present application.
Accordingly, in response to a selection operation of selecting a 3D programming object by a user, a specific process of determining the 3D programming object from a preset 3D programming object library may be: and in response to the operation of selecting the target 3D programming object type by the user, displaying all 3D programming objects included in the target 3D programming object type. In response to a selection operation of a user selecting a 3D programming object, the 3D programming object is determined from the target 3D programming object type, and the 3D programming object is displayed in the scene area based on a preset initial attribute of the 3D programming object.
The classification of the 3D programming objects may be set according to actual requirements, which is not limited herein.
Optionally, when the 3D programming object required by the user does not exist in the 3D programming object library, the 3D programming object uploaded by the user may be added to the 3D programming object library in response to the operation of uploading the 3D programming object by the user. Alternatively, the 3D programming object downloaded by the user may be added to the 3D programming object library in response to an operation of downloading the 3D programming object by the user. Wherein the 3D programming object downloaded by the user may be uploaded by other users (e.g., uploaded by a teacher, developer, etc.).
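As a hedged illustration only (the patent does not define a data model for the library), the object-library behaviour described above could be sketched in TypeScript as follows; all type and field names are assumptions.

// Sketch of a 3D programming object library: objects grouped by type, user uploads,
// and selection with preset initial attributes.
interface ObjectTemplate {
  name: string;                  // e.g. "Dirt Cube", "Bus 3" in fig. 3
  type: string;                  // e.g. "Animals", "People", "Food"
  meshUrl: string;
  defaultAttributes: {
    position: [number, number, number];
    scale: number;
    rotation: [number, number, number];
  };
}

class ProgrammingObjectLibrary {
  private templates: ObjectTemplate[] = [];

  // Right-hand panel of fig. 3: all objects belonging to the selected type.
  listByType(type: string): ObjectTemplate[] {
    return this.templates.filter(t => t.type === type);
  }

  // A user (or teacher/developer) uploads an object missing from the library.
  addUploaded(template: ObjectTemplate): void {
    this.templates.push(template);
  }

  // Selection operation: the chosen template is later placed in the scene area
  // using its preset initial attributes.
  select(name: string): ObjectTemplate | undefined {
    return this.templates.find(t => t.name === name);
  }
}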
S200: in the case where the 3D programming object is displayed on the scene area, in response to a modification operation by which the user modifies the attribute of the 3D programming object, the display state of the 3D programming object in the scene area is modified to be consistent with the attribute corresponding to the modification operation.
The attributes of the 3D programming object may include physical attributes of the 3D programming object itself and location attributes characterizing the location of itself.
In one embodiment, the attributes of the 3D programming object may include three-dimensional coordinates, size scale, three-dimensional rotation.
Optionally, the attributes of the 3D programming object may further include at least one of a physical collision attribute, a physical gravity attribute, whether the object has a light-emitting characteristic, a light-emitting color, and a model skin.
A 3D object is composed of a mesh of 3D vertices and a 2D UV map. A UV map is a planar representation of the surface of the 3D model; by projecting the UV map onto the vertex coordinates in the mesh's 3D coordinate system, the 3D object is given a skinned appearance.
The process of creating a UV map is called UV unwrapping. U and V refer to the horizontal and vertical axes of the 2D space, because X, Y and Z are already used for the 3D space. Once the polygonal mesh (the mesh of 3D vertices) has been created, the next step is to unwrap it into a UV map. Bringing the polygonal mesh to life and making it look realistic requires adding texture to it. Since textures themselves are 2D, texturing the polygonal mesh means converting the 3D mesh into 2D information through UV mapping and then wrapping the 2D texture (the UV map) around the 3D mesh to generate a 3D model with the desired appearance.
Alternatively, a plurality of UV maps and 3D meshes may be provided, and one 3D mesh may correspond to a plurality of UV maps. The user may select different 3D meshes (i.e., models) and may select the desired UV map on the property panel of the corresponding model. The target 3D model is then obtained from the selected UV map and 3D mesh.
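A minimal sketch of the "one mesh, several UV maps" skin selection described above, with purely illustrative types:

// Sketch: pick the UV map (skin) the user chose on the model's property panel.
interface MeshAsset { id: string; vertices: Float32Array }
interface UVMapAsset { id: string; meshId: string; textureUrl: string }

function resolveSkin(mesh: MeshAsset, uvMaps: UVMapAsset[], chosenUvId: string): UVMapAsset | undefined {
  const candidates = uvMaps.filter(uv => uv.meshId === mesh.id);  // maps usable with this mesh
  return candidates.find(uv => uv.id === chosenUvId) ?? candidates[0];
}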
It will be appreciated that the three-dimensional coordinates characterize a specific location of the 3D programming object in the scene area, and that the specific location of the 3D programming object in the scene area may be adjusted by adjusting the three-dimensional coordinates.
The size scale characterizes the size of the 3D programming object relative to the scene area, and the 3D programming object can be enlarged or reduced by adjusting the size scale.
The three-dimensional rotation degree characterizes the rotation angle of each coordinate axis of the 3D programming object compared with a preset reference coordinate system, and the position and the posture of the 3D programming object, such as the orientation of the 3D programming object, can be adjusted by adjusting the three-dimensional rotation degree.
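A minimal sketch of such an attribute set, and of applying it to a rendered object, is shown below; it continues the three.js assumption from the earlier sketch, and all names are illustrative rather than taken from the patent.

// Sketch of the attributes described above and how modifying them updates the
// display state of the 3D programming object in the scene area.
import * as THREE from "three";

interface ProgrammingObjectAttributes {
  position: { x: number; y: number; z: number };   // three-dimensional coordinates
  scale: number;                                    // size ratio relative to the scene area
  rotation: { x: number; y: number; z: number };    // rotation (degrees) about each axis
  hasCollision?: boolean;                           // physical collision attribute
  gravity?: number;                                 // physical gravity attribute
  emitsLight?: boolean;                             // light-emitting characteristic
  lightColor?: string;
  skinId?: string;                                  // selected UV map / model skin
}

function applyAttributes(object3d: THREE.Object3D, attrs: ProgrammingObjectAttributes): void {
  object3d.position.set(attrs.position.x, attrs.position.y, attrs.position.z);
  object3d.scale.setScalar(attrs.scale);
  object3d.rotation.set(
    THREE.MathUtils.degToRad(attrs.rotation.x),
    THREE.MathUtils.degToRad(attrs.rotation.y),
    THREE.MathUtils.degToRad(attrs.rotation.z));
}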
The physical collision properties include whether the 3D programming object has a collision volume and a corresponding collision model (the collision model of the 3D programming object may be calculated by a bounding box, which may be an AABB bounding box, an OBB bounding box, etc.). Whether the 3D programming object collides with other 3D programming objects or objects in the scene area can be controlled by adjusting the physical collision attribute.
When the 3D programming object collides with other 3D programming objects or objects in the scene area, the collision mode and the collision point are calculated through corresponding bounding boxes (collision models) of the 3D programming object and the other 3D programming object. Wherein, the collision between different objects is detected by using the bounding box as a conventional technical means in the art, and for brevity, will not be described herein.
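A minimal sketch of the conventional axis-aligned bounding box (AABB) overlap test mentioned above; the types are illustrative.

// Sketch: two AABBs intersect only if their intervals overlap on all three axes.
interface AABB { min: [number, number, number]; max: [number, number, number] }

function aabbIntersects(a: AABB, b: AABB): boolean {
  for (let axis = 0; axis < 3; axis++) {
    if (a.max[axis] < b.min[axis] || b.max[axis] < a.min[axis]) return false;
  }
  return true;
}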
The physical gravity attribute characterizes the falling speed of the 3D programming object, and the higher the physical gravity attribute is, the faster the falling speed of the 3D programming object is.
By adjusting whether the 3D programming object has a light emitting characteristic, the 3D programming object may be controlled to emit light or not. Wherein, when the 3D programming object emits light, a light emitting color of the 3D programming object may be set. The specific type of the light emission color may be set according to actual demands, and is not limited here.
To facilitate modifying properties of the 3D programming object, in one embodiment, the programming interface further includes a modification control, wherein the modification control is configured to modify properties of the 3D programming object.
Alternatively, only one modification control may be included in the programming interface, at which point all properties of the 3D programming object are modified by the one modification control. For example, the modification control may be an attribute configuration table including all attributes of the 3D programming object, and the user implements modification of the display state of the 3D programming object in the scene area by modifying different attributes in the attribute configuration table.
Optionally, a plurality of modification controls may be included in the programming interface, and accordingly, each modification control corresponds to an attribute of the 3D programming object.
For example, the plurality of modification controls may include a modification control for modifying three-dimensional coordinates, a modification control for modifying a size scale, a modification control for modifying a three-dimensional rotation, a modification control for modifying a physical impact attribute, a modification control for modifying a physical gravity attribute, a modification control for modifying whether a lighting characteristic is present, a modification control for modifying a lighting color.
Alternatively, the modification control for modifying the three-dimensional coordinates may be a form including three coordinate values of "x", "y" and "z", and the user modifies the three-dimensional coordinates by filling in the numerical values. Alternatively, the user may drag the 3D programming object directly to modify the three-dimensional coordinates of the 3D programming object. The examples herein are for ease of understanding only and should not be construed as limiting the application.
The modification control for modifying the size ratio may be a scroll bar (comprising a sliding region and a slider), the position of the slider in the sliding region characterizing the value of the size ratio, wherein the two end points of the sliding region respectively characterize the maximum and minimum value of the size ratio. Alternatively, the modification control for modifying the size ratio may be a form, with the user modifying the size ratio by filling in a numerical value. The examples herein are for ease of understanding only and should not be construed as limiting the application.
The modification control for modifying the three-dimensional rotation degree may be a form including three rotation degrees of "x-axis rotation degree", "y-axis rotation degree", "z-axis rotation degree", and the user modifies the three-dimensional rotation degree by filling in a numerical value. Alternatively, the modification control for modifying the three-dimensional rotation degree may be a sphere, and when the sphere rotates, the 3D programming object correspondingly rotates with the same rotation angle. At this time, the user adjusts the three-dimensional rotation degree of the 3D programming object by rotating the sphere (for example, dragging the sphere). The examples herein are for ease of understanding only and should not be construed as limiting the application.
The modification control for modifying the physical collision property may be a checkbox, and the user determines whether the 3D programming object has a collision property by checking or de-checking. When a 3D programming object has collision properties, the 3D programming object may collide with other 3D objects having collision properties. The examples herein are for ease of understanding only and should not be construed as limiting the application.
The modification control for modifying the physical gravity property may be a form, the user modifying the physical gravity property by filling in a number. The examples herein are for ease of understanding only and should not be construed as limiting the application.
The modification control for modifying whether the 3D programming object has a lighting characteristic may be a checkbox, and the user determines whether the 3D programming object lights by checkpointing or de-checkpointing. The examples herein are for ease of understanding only and should not be construed as limiting the application.
The modification control for modifying the luminescent color may be a pigment disk comprising a plurality of colors, and the user may implement the modification of the luminescent color of the 3D programming object by selecting different colors in the pigment disk. It is understood that the light emitting color of the 3D programming object is only validated when the 3D programming object has the light emitting characteristic of emitting light. The examples herein are for ease of understanding only and should not be construed as limiting the application.
One particular implementation of a modification control is shown in FIG. 4 for ease of understanding. The modification control 1 is a modification control for modifying three-dimensional coordinates; modification control 2 is a modification control for modifying the size scale; the modification control 3 is a modification control for modifying the three-dimensional rotation degree; the modification control 4 is a modification control for modifying the physical gravity attribute; the modification controls 5 include a modification control for modifying whether or not there is a lighting characteristic and a modification control for modifying a lighting color.
Wherein, the modification control can be displayed at the bottom of the programming interface or at the bottom of the scene area; or may be displayed to the left and/or right of the programming interface, without limitation herein as to the particular display location of the modification control.
In one embodiment, when a plurality of 3D programming objects exist in a scene area, a user can select a 3D programming object to be modified by clicking or the like, and at this time, a modification control is used for modifying the attribute of the selected 3D programming object. The manner of modifying the attribute of the selected 3D programming object by the modification control is consistent with the manner of modifying the attribute of the 3D programming object by the modification control described above, and is not described herein for brevity.
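As a hedged sketch of how such modification controls might be wired to the currently selected 3D programming object (the patent does not prescribe an implementation; the HTML elements, callback shape and three.js usage below are assumptions), a size-ratio slider and a coordinate form could be bound like this:

// Sketch: UI controls write the user's input back to the selected object's transform.
import * as THREE from "three";

function bindScaleSlider(slider: HTMLInputElement, selected: () => THREE.Object3D): void {
  slider.addEventListener("input", () => {
    const ratio = Number(slider.value);        // slider position = size ratio
    selected().scale.setScalar(ratio);         // display state follows the attribute
  });
}

function bindCoordinateForm(
  inputs: { x: HTMLInputElement; y: HTMLInputElement; z: HTMLInputElement },
  selected: () => THREE.Object3D
): void {
  const apply = () => selected().position.set(
    Number(inputs.x.value), Number(inputs.y.value), Number(inputs.z.value));
  inputs.x.addEventListener("change", apply);
  inputs.y.addEventListener("change", apply);
  inputs.z.addEventListener("change", apply);
}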
After S200, the visual programming method further includes S300, S400.
S300: and responding to the configuration operation of the user configuration code block group, and adding the code block group corresponding to the configuration operation in a code block configuration area. Wherein the code block group comprises at least one code block.
In one embodiment, the programming interface includes a plurality of preset code blocks, and the specific implementation manner of S300 may be: responding to the configuration operation that each code block required by the code block group is dragged to a code block configuration area by a user and spliced with other code blocks in the code block group, controlling each code block required by the code block group to move according to the track dragged by the user and spliced with other code blocks in the code block group, wherein each code block in the code block group is one of a plurality of preset code blocks.
For easy understanding, referring to fig. 5, taking an example of 3 code blocks of the code block group including the first code block, the second code block, and the third code block as an example, a specific implementation manner of S300 may be: first, the first code block is moved to the code block configuration area in response to a user operation to move the first code block to the code block configuration area. And then, in response to the operation that the user moves the second code block to the code block configuration area and performs splicing with the first code block, moving the second code block to the code block configuration area to splice with the first code block. And finally, responding to the operation that a user moves the third code block to the code block configuration area and performs splicing with the first code block and the second code block, moving the third code block to the code block configuration area, and performing splicing of the third code block with the first code block and the second code block, wherein the specific splicing sequence of the first code block, the second code block and the third code block can be set according to actual conditions, and the sequence of the user moving the code blocks does not represent the connection sequence among the code blocks. The examples herein are for ease of understanding only and should not be construed as limiting the application.
The programming interface may further include a preset code block area, where the plurality of preset code blocks are displayed. The user can move the code blocks in the preset code block area into the code block configuration area in a dragging manner.
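A minimal sketch of the drag-and-splice step, assuming a simple list-based data model for code block groups (the names below are illustrative, not the patent's):

// Sketch: when a dragged code block is dropped in the configuration area, it is
// spliced into the group after the block it was dropped onto.
interface CodeBlock { id: string; kind: string; params?: Record<string, unknown> }
interface CodeBlockGroup { blocks: CodeBlock[]; anchor: { x: number; y: number } }

function spliceBlock(group: CodeBlockGroup, dragged: CodeBlock, dropAfterId?: string): void {
  // The visual movement along the drag track is handled by the UI layer; here only
  // the logical splice is performed.
  const index = dropAfterId
    ? group.blocks.findIndex(b => b.id === dropAfterId)
    : group.blocks.length - 1;
  group.blocks.splice(index + 1, 0, dragged);
}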
S400: in response to a user running operation of the code block set, the 3D programming object is controlled to perform an action corresponding to the code block set in the scene area.
Wherein the action performed by the 3D programming object includes at least one of bouncing, physical collision, moving, running.
In response to the user's operation of running the code block group, a specific way of controlling the 3D programming object to perform the action corresponding to the code block group in the scene area may be to compile the code block group into executable code. The CPU runs the executable code corresponding to each code block in sequence, following the order of the code blocks in the group, to generate control instructions, and directs the GPU to perform the corresponding rendering, so that the visual effect in the scene area changes and the 3D programming object performs the actions corresponding to the code block group in the scene area.
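A minimal sketch of this compile-then-run step, reusing the CodeBlockGroup and ProgrammingObjectAttributes types from the sketches above; the block kinds and the action representation are assumptions for illustration.

// Sketch: each code block compiles to an action; actions run sequentially in block
// order, and each step triggers a redraw so the scene area reflects the change.
type Action = (target: ProgrammingObjectAttributes) => void;

function compile(group: CodeBlockGroup): Action[] {
  return group.blocks.map(block => {
    switch (block.kind) {
      case "move":
        return (t: ProgrammingObjectAttributes) => { t.position.x += Number(block.params?.dx ?? 0); };
      case "bounce":
        return (t: ProgrammingObjectAttributes) => { t.position.y += Number(block.params?.height ?? 1); };
      default:
        return () => { /* unknown block kind: no-op in this sketch */ };
    }
  });
}

function run(
  group: CodeBlockGroup,
  target: ProgrammingObjectAttributes,
  redraw: (t: ProgrammingObjectAttributes) => void
): void {
  for (const action of compile(group)) {
    action(target);      // control instruction generated from the block
    redraw(target);      // GPU re-renders, so the visual effect in the scene area changes
  }
}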
In one embodiment, when the code block group includes a sounding operation to be performed by the 3D programming object, a specific way of controlling the 3D programming object to perform the action corresponding to the code block group in the scene area may be: controlling the 3D programming object to perform the action corresponding to the sounding operation and playing the voice corresponding to the sounding operation. The playing volume and channel of the voice are determined by the coordinates of the 3D programming object in the scene area.
A reference coordinate point can be set in the scene area, and the further the 3D programming object is from the reference coordinate point, the smaller the playing volume of the voice. The channel is determined by the relative position of the 3D programming object and the reference coordinate point.
Specifically, the scene camera is used as a reference, and the camera position is taken as the position at which the user receives the sound, while the position of the 3D programming object is the sound source position. The volume is controlled by the distance from the sound source to the receiving position: the farther the distance, the quieter the sound; the closer the distance, the louder the sound. Taking the receiving position as the sound reference point, the channel balance is determined by whether the sound source lies to the left or right of the reference point. The more the sound source lies to the left of the reference point, the more volume is allocated to the left channel and the less to the right channel; conversely, the more the sound source lies to the right, the less volume is allocated to the left channel and the more to the right channel. This achieves a 3D sound playback effect.
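A minimal sketch of the distance and direction rule described above; the linear falloff curve, the 50-unit range and the use of the lateral x offset for panning are assumptions, not values taken from the patent.

// Sketch: volume falls off with distance from the camera (listening position);
// pan follows the source's left/right offset (-1 = fully left, +1 = fully right).
function computeVolumeAndPan(
  source: { x: number; y: number; z: number },
  listener: { x: number; y: number; z: number },
  maxDistance = 50
): { volume: number; pan: number } {
  const dx = source.x - listener.x;
  const dy = source.y - listener.y;
  const dz = source.z - listener.z;
  const distance = Math.sqrt(dx * dx + dy * dy + dz * dz);

  const volume = Math.max(0, 1 - distance / maxDistance);       // farther = quieter
  const pan = Math.max(-1, Math.min(1, dx / maxDistance));      // left/right channel split
  return { volume, pan };
}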
In order to improve the viewing experience of the user when the code block configuration area and the scene area overlap, in one embodiment, after S300, the visual programming method may further include: the code blocks in the code block configuration area are hidden. By hiding the code blocks in the code block configuration area, the image of the scene area can be completely displayed to the user, and the viewing experience of the user is improved.
Optionally, after the code blocks are hidden, the hidden code blocks may be displayed again in response to the user clicking the code block hiding area, so that the code blocks can be adjusted again.
Alternatively, a hidden code block may be used as the first code block of a code block group; in that case, all code blocks below it are not displayed on other clients. Whether a code block group is hidden can therefore be controlled according to actual requirements.
For example, the teacher end may use a hidden code block as the first code block of a code block group, and after the student end downloads the code block group, the group is not displayed. The examples herein are for ease of understanding only and should not be construed as limiting the application.
In order to facilitate subsequent adjustment of the code block groups in the code block configuration area by the user, in one embodiment a position layer is also included in the scene area of the programming interface, as shown in fig. 6.
The position layer is used to display, on the scene area, the position of each code block in the code block configuration area. Because these positions are displayed on the position layer, the user can accurately find the position of each code block.
The code blocks shown in fig. 6 with broken lines are hidden code blocks, and are invisible in practical applications.
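A minimal sketch of hiding a code block group while keeping its marker on the position layer, so the user can still locate and reopen it as in fig. 6 (names are assumptions):

// Sketch: hiding a group stops drawing it over the scene area but records a marker
// on the position layer; clicking the marker reveals the group again.
interface PositionMarker { groupId: string; x: number; y: number }
interface HideableGroup { id: string; hidden: boolean; anchor: { x: number; y: number } }

function hideGroup(group: HideableGroup, positionLayer: PositionMarker[]): void {
  group.hidden = true;
  positionLayer.push({ groupId: group.id, x: group.anchor.x, y: group.anchor.y });
}

function revealGroup(group: HideableGroup): void {
  group.hidden = false;      // the group can now be adjusted again
}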
Based on the same technical concept, the application also provides a visual programming device, as shown in fig. 7, which comprises a display module and an attribute modification module.
And the display module is used for responding to the operation of loading the programming interface by a user, and displaying the programming interface, wherein the programming interface comprises a scene area, and 3D programming objects are displayed on the scene area.
And the attribute modification module is used for responding to a modification operation of modifying the attribute of the 3D programming object by a user and modifying the display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modification operation.
In one embodiment, the attributes of the 3D programming object include three-dimensional coordinates, size scale, three-dimensional rotation.
In one embodiment, the programming interface includes a modification control, wherein the modification control is configured to modify an attribute of the 3D programming object.
The visual programming device further comprises a selection module, wherein the selection module is used for responding to the selection operation of selecting the 3D programming object by a user before the display state of the 3D programming object in the scene area is modified to be consistent with the attribute corresponding to the modification operation in response to the modification operation of the attribute of the 3D programming object by the user, determining the 3D programming object from a preset 3D programming object library, and displaying the 3D programming object in the scene area based on the preset initial attribute of the 3D programming object.
The visual programming device further comprises a processing module, wherein the processing module is used for responding to the configuration operation of a user configuration code block group after the display state of the 3D programming object in the scene area is modified to be consistent with the attribute corresponding to the modification operation in response to the modification operation of the attribute of the 3D programming object, and adding the code block group corresponding to the configuration operation in a code block configuration area, wherein the code block group comprises at least one code block.
And the processing module is also used for responding to the operation of the user for operating the code block group and controlling the 3D programming object to execute the action corresponding to the code block group in the scene area.
In one embodiment, the action includes at least one of bouncing, physical collision, moving, running.
And the processing module is specifically used for controlling the 3D programming object to execute actions corresponding to the sounding operation and playing the voice corresponding to the sounding operation when the code block group comprises the sounding operation, wherein the playing volume and the sound channel of the voice are determined through the coordinates of the 3D programming object in the scene area.
And the processing module is specifically used for responding to the configuration operation that a user drags each code block required by the code block group to the code block configuration area and splices with other code blocks in the code block group when the programming interface comprises a plurality of preset code blocks, controlling each code block required by the code block group to move according to the track dragged by the user and splice with other code blocks in the code block group, wherein each code block in the code block group is one of the plurality of preset code blocks.
The processing module is further configured to conceal the code blocks in the code block configuration area when the code block configuration area coincides with the scene area, where the scene area includes a position layer for displaying a position of each code block in the code block configuration area on the scene area.
The implementation principle and the technical effects of the visual programming device 100 provided in the embodiment of the present application are the same as those of the embodiment of the visual programming method, and for the sake of brevity, reference may be made to the corresponding contents of the embodiment of the visual programming method.
Based on the same inventive concept, the application also provides a programming system, which comprises a server and a visual programming device.
Wherein the visual programming device is in communication connection with the server. The specific implementation and principles of the visual programming device have been described above and are not repeated here for brevity.
In one embodiment, the visual programming means may comprise a plurality of visual programming means, each visual programming means being communicatively coupled to a server.
Alternatively, the authority may be set for each visual programming device, and the authority of different visual programming devices may be different.
For example, a visual programming device may serve as a student end or a teacher end. A visual programming device used as the teacher end can modify an existing scene area or create a new scene area. The teacher end can then upload the scene area to the server, and the student end can access the server to download the scene area.
When uploading the scene area, the teacher end can associate teaching content (such as the code blocks to be used in teaching and after-class exercises) with the scene area, and upload the scene area together with the associated teaching content to the server. When the student end downloads the scene area, the teaching content associated with it is downloaded as well.
Optionally, the teacher end may also establish a code block group, and use a hidden code block as the first code block of the code block group. In this case, when the student side downloads the teaching content including the code block group, the code block group is not displayed at the student side.
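As a purely illustrative sketch (the patent defines no data format), the kind of payload the teacher end might upload and the student end might download could look like this, with any code block group that starts with a hidden code block filtered out on the student side:

// Sketch: a scene area bundled with associated teaching content and code block groups.
interface LessonBundle {
  sceneAreaId: string;
  teachingContent: { lessonCodeBlocks: string[]; exercises: string[] };
  codeBlockGroups: { blocks: { kind: string }[]; startsWithHiddenBlock: boolean }[];
}

function visibleGroupsForStudent(bundle: LessonBundle) {
  return bundle.codeBlockGroups.filter(g => !g.startsWithHiddenBlock);
}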
Fig. 8 is a schematic diagram of an electronic device 200 according to an embodiment of the application. The electronic device 200 includes: transceiver 210, memory 220, communication bus 230, processor 240.
The transceiver 210, the memory 220, and the processor 240 are electrically connected directly or indirectly to each other to realize data transmission or interaction. For example, the components may be electrically coupled to each other via one or more communication buses 230 or signal lines. Wherein the transceiver 210 is configured to transmit and receive data. The memory 220 is used for storing a computer program, such as the software functional modules shown in fig. 7, i.e. the visual programming means 100. Wherein the visualization programming means 100 comprise at least one software functional module which may be stored in the memory 220 in the form of software or firmware (firmware) or cured in an Operating System (OS) of the electronic device 200. The processor 240 is configured to execute executable modules stored in the memory 220, such as software functional modules or computer programs included in the visual programming device 100. At this time, the processor 240 is configured to respond to an operation of loading a programming interface by a user, and display the programming interface, where the programming interface includes a scene area, and a 3D programming object is displayed on the scene area; and responding to a modification operation of modifying the attribute of the 3D programming object by a user, and modifying the display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modification operation.
The Memory 220 may be, but is not limited to, a random access Memory (Random Access Memory, RAM), a Read Only Memory (ROM), a programmable Read Only Memory (Programmable Read-Only Memory, PROM), an erasable Read Only Memory (Erasable Programmable Read-Only Memory, EPROM), an electrically erasable Read Only Memory (Electric Erasable Programmable Read-Only Memory, EEPROM), etc.
The processor 240 may be an integrated circuit chip with signal processing capabilities. The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; but also digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components. The disclosed methods, steps, and logic blocks in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor 240 may be any conventional processor or the like.
The electronic device 200 includes, but is not limited to, a personal computer, a server, and the like.
The embodiment of the present application further provides a computer readable storage medium (hereinafter referred to as a storage medium) having a computer program stored thereon, where the computer program, when executed by a computer such as the electronic device 200 described above, performs the visual programming method described above. The computer-readable storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The above description is only of the preferred embodiments of the present application and is not intended to limit the present application, but various modifications and variations can be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (13)

1. A method of visual programming, comprising:
responding to the operation of loading a programming interface by a user, and displaying the programming interface, wherein the programming interface comprises a scene area;
In the case that the 3D programming object is displayed on the scene area, responding to a modification operation of modifying the attribute of the 3D programming object by a user, and modifying the display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modification operation.
2. The method of claim 1, wherein the attributes of the 3D programming object include three-dimensional coordinates, size scale, three-dimensional rotation.
3. The method of claim 1, wherein the programming interface includes a modification control, wherein the modification control is used to modify a property of the 3D programming object.
4. The method of claim 1, wherein prior to the modifying operation responsive to a user modifying the property of the 3D programming object, modifying the display state of the 3D programming object in the scene area to be consistent with the property corresponding to the modifying operation, the method further comprises:
and responding to a selection operation of selecting the 3D programming object by a user, determining the 3D programming object from a preset 3D programming object library, and displaying the 3D programming object in the scene area based on the preset initial attribute of the 3D programming object.
5. The method of claim 1, wherein, after the modifying of the display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modification operation in response to the modification operation by which the user modifies the attribute of the 3D programming object, the method further comprises:
in response to a configuration operation by which the user configures a code block group, adding the code block group corresponding to the configuration operation to a code block configuration area, wherein the code block group comprises at least one code block;
and, in response to an operation by which the user runs the code block group, controlling the 3D programming object to execute, in the scene area, an action corresponding to the code block group.
6. The method of claim 5, wherein the action comprises at least one of bouncing, physical collision, moving, and running.
7. The method of claim 6, wherein, when the code block group comprises causing the 3D programming object to perform a sound-producing operation, the controlling the 3D programming object to execute, in the scene area, the action corresponding to the code block group comprises:
controlling the 3D programming object to execute an action corresponding to the sound-producing operation, and playing audio corresponding to the sound-producing operation, wherein a playback volume and a sound channel of the audio are determined by coordinates of the 3D programming object in the scene area.
8. The method of claim 5, wherein the programming interface comprises a plurality of preset code blocks, and the adding, in response to the configuration operation by which the user configures the code block group, the code block group corresponding to the configuration operation to the code block configuration area comprises:
in response to a configuration operation by which the user drags each code block required by the code block group to the code block configuration area and splices it with the other code blocks in the code block group, controlling each code block required by the code block group to be spliced with the other code blocks in the code block group, wherein each code block in the code block group is one of the plurality of preset code blocks.
9. The method of claim 5, wherein the code block configuration area coincides with the scene area, the scene area comprises a position layer for displaying, on the scene area, a position of the code block group in the code block configuration area, and, after the code block group corresponding to the configuration operation is added to the code block configuration area, the method further comprises:
hiding the code block group in the code block configuration area.
10. A visual programming device, comprising:
a display module, configured to display a programming interface in response to a user operation of loading the programming interface, wherein the programming interface comprises a scene area and a 3D programming object is displayed in the scene area;
and an attribute modification module, configured to, in a case that the 3D programming object is displayed in the scene area, modify, in response to a modification operation by which a user modifies an attribute of the 3D programming object, a display state of the 3D programming object in the scene area to be consistent with the attribute corresponding to the modification operation.
11. A programming system, comprising:
a server and the visual programming device of claim 10, wherein the visual programming device is communicatively connected to the server.
12. An electronic device, comprising: a memory and a processor, wherein the memory is connected to the processor;
the memory is configured to store a program;
and the processor is configured to invoke the program stored in the memory to perform the method according to any one of claims 1-9.
13. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when run by a computer, performs the method according to any one of claims 1-9.
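For readers who prefer code to claim language, the sketch below illustrates, again with purely hypothetical names (CodeBlock, CodeBlockGroup, runBlockGroup, computeAudio), how a spliced code block group might drive the actions of a 3D programming object, and how a playback volume and channel balance could be derived from the object's coordinates in the scene area in the spirit of claims 5 to 8. The attenuation and panning formulas are arbitrary placeholders; the claims only require that volume and channel be determined from the coordinates, not any particular mapping.

```typescript
// Illustrative sketch only; all names and formulas are hypothetical assumptions.

type Action = "bounce" | "collide" | "move" | "run" | "playSound";

interface CodeBlock {
  action: Action;
  params?: Record<string, unknown>;
}

// A code block group is an ordered splice of at least one code block.
type CodeBlockGroup = CodeBlock[];

interface ObjectState {
  id: string;
  position: { x: number; y: number; z: number };
}

// Derive playback volume and left/right channel balance from the object's
// coordinates: farther from the (assumed) listener at the origin is quieter,
// and the x coordinate pans the sound between the left and right channels.
function computeAudio(position: { x: number; y: number; z: number }) {
  const distance = Math.hypot(position.x, position.y, position.z);
  const volume = 1 / (1 + distance);                      // simple distance attenuation
  const pan = Math.max(-1, Math.min(1, position.x / 10)); // -1 = left, +1 = right
  return { volume, pan };
}

// Execute each block in the group against the 3D programming object.
function runBlockGroup(
  group: CodeBlockGroup,
  obj: ObjectState,
  perform: (obj: ObjectState, block: CodeBlock) => void,
  playSound: (volume: number, pan: number) => void,
): void {
  for (const block of group) {
    if (block.action === "playSound") {
      const { volume, pan } = computeAudio(obj.position);
      playSound(volume, pan);
    }
    perform(obj, block);
  }
}

// Example: a group that moves the object and then plays a sound.
const group: CodeBlockGroup = [
  { action: "move", params: { dx: 1 } },
  { action: "playSound" },
];
runBlockGroup(
  group,
  { id: "cube-1", position: { x: 3, y: 0, z: -5 } },
  (o, b) => console.log(`object ${o.id} performs ${b.action}`),
  (volume, pan) => console.log(`volume ${volume.toFixed(2)}, pan ${pan.toFixed(2)}`),
);
```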
CN202310521623.2A 2023-05-10 2023-05-10 Visual programming method, visual programming device, electronic equipment and computer readable storage medium Pending CN116594609A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310521623.2A CN116594609A (en) 2023-05-10 2023-05-10 Visual programming method, visual programming device, electronic equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN116594609A (en) 2023-08-15

Family

ID=87593062

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310521623.2A Pending CN116594609A (en) 2023-05-10 2023-05-10 Visual programming method, visual programming device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN116594609A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111240673A (en) * 2020-01-08 2020-06-05 腾讯科技(深圳)有限公司 Interactive graphic work generation method, device, terminal and storage medium
CN112506502A (en) * 2020-12-16 2021-03-16 深圳市优必选科技股份有限公司 Visual programming method, device, equipment and storage medium based on human-computer interaction
CN112612463A (en) * 2020-12-30 2021-04-06 深圳市大富网络技术有限公司 Graphical programming control method, system and device
US10997217B1 (en) * 2019-11-10 2021-05-04 Tableau Software, Inc. Systems and methods for visualizing object models of database tables

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination