WO2020063132A1 - AR-based interactive programming system and method, medium and smart device - Google Patents


Info

Publication number: WO2020063132A1
Application number: PCT/CN2019/099902
Authority: WIPO (PCT)
Prior art keywords: virtual, real object, image, real, program instruction
Other languages: English (en), Chinese (zh)
Inventors: 王乐添, 李斌, 陈焕
Original assignee and applicant: 上海葡萄纬度科技有限公司

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00: Arrangements for software engineering
    • G06F 8/30: Creation or generation of source code
    • G06F 8/34: Graphical or visual programming

Definitions

  • the present invention relates to the field of toys, and in particular, to an AR-based interactive programming system, method, medium, and smart device.
  • Patent documents CN1267228A and CN1223158A respectively disclose programmable toys, which belong to the earlier generation of programming toys. Such programming toys remain purely offline and cannot be augmented by virtual means.
  • patent document CN1223158A is an important research and development foundation for subsequent AR programming toys.
  • Patent document CN1223158A provides users with a series of action buttons; by pressing different buttons in succession, the movements of the toy can be programmed. Its priority date is 1997, when home PCs were not yet widespread and programming skills were concentrated mainly in colleges and universities, among adults. Providing a programming toy for children in the 1990s made it a remarkable toy for its time, and its focus on children's toys provides a strong basis for further improvement.
  • Patent document CN101789194A discloses a building-block type learning and programming device, which includes a plurality of different types of real building blocks, each type containing corresponding instruction information; after the blocks are connected to a single-chip microcomputer, the microcomputer outputs control signals according to the instruction information in the blocks, presenting sound, light, electrical and other effects.
  • the interactivity of the programming effect provided by the patent document CN101789194A is still insufficient.
  • Patent document CN105396295A discloses a children-oriented spatially programmed robot toy, which uses a plurality of instruction labels arranged sequentially, each instruction label carrying several control instructions; the robot body reads the control instructions on each instruction label in sequence and then executes the corresponding actions in turn.
  • Meanwhile, on smart devices, technologies such as anti-addiction software and the use of acceleration sensors to recognize and remind users of their sitting posture have been valued and developed. A programming process such as that of patent document CN105396295A departs from the technical solution of the smart device and is therefore far from the technical field of AR programming toys.
  • AR refers to Augmented Reality.
  • Patent document CN106730899A discloses a toy control method and system, which adopts AR technology to realize synchronous movement between physical toys and virtual toys.
  • The interaction process between the physical toy and the virtual toy in patent document CN106730899A belongs to synchronous control: only the virtualized control of the physical toy and the physicalized control of the virtual toy are implemented, and the physical toy and the virtual toy do not interact in the same space.
  • Patent document CN105931289A is a more typical technical solution for making physical toys interact with virtual toys in the same space. It discloses a system and method for real models to occlude virtual objects, for example realizing the effect of a virtual dinosaur, displayed on a display component, being blocked by a real jungle landscape board, so that visually the virtual dinosaur can shuttle back and forth behind the real jungle landscape board. Conversely, the effect of a virtual object blocking a solid object can also be achieved accordingly.
  • AR programming toys are mainly related to "programming" and "AR".
  • non-AR programming toys appeared earlier, and then combined with the development of AR.
  • Over an R&D process spanning more than 20 years, it is necessary to consider how to combine AR and programming to improve the interaction effect of programming, so that this interaction provides more companionship in children's learning and growth.
  • the object of the present invention is to provide an AR-based interactive programming system, method, medium, and smart device.
  • An AR-based interactive programming system includes:
  • AR module superimposes and presents images of real objects and virtual objects;
  • the AR-based interactive programming system further includes:
  • a programming module obtaining a program instruction set, wherein the program instruction set includes one or more program instruction units;
  • Execution module instructs a real object to perform an action according to the program instruction set.
  • it further comprises:
  • Matching module presents the relative relationship between the image of the real object and the virtual object.
  • the programming module includes:
  • a programming unit acquisition module acquiring a plurality of program instruction units according to the first operation input information
  • Timing relationship acquisition module acquiring timing relationships executed between the plurality of program instruction units according to the second operation input information
  • Instruction set generation module Generates the program instruction set according to the multiple program instruction units and timing relationships.
  • the matching module includes any one or more of the following modules:
  • Matching size module presents the relative size relationship between the image of the real object and the virtual object, and makes their sizes match through size interaction; wherein the size interaction refers to: prompting the size difference between the image of the real object and the virtual object, or instructing the real object to move to change the size of its image;
  • Matching orientation module presents the relative orientation relationship between the image of the real object and the virtual object, and makes their orientations match through orientation interaction; wherein the orientation interaction refers to: prompting the orientation difference between the image of the real object and the virtual object, or instructing the real object to rotate to change the orientation of its image.
  • the matching module further includes:
  • Obstacle elimination module determines whether there is a conflict between the virtual object and a real obstacle; if there is no conflict, confirms that there is no conflict; if there is a conflict, then:
  • the virtual object includes a virtual venue
  • the virtual venue is presented in any of the following ways:
  • it further comprises:
  • Virtual interaction response module presents a virtual interaction response of the virtual object to the image of the real object according to the action performed by the real object.
  • it further comprises:
  • Real interaction response module instructs the real object to perform a real interaction response according to the virtual interaction response.
  • it further comprises:
  • Programming interface module provides a visual programming interface; wherein, the programming interface module includes any one or more of the following modules:
  • -Overlay presentation module makes the images of the real object and the virtual object superimpose and present in the visual programming interface;
  • -Operation presentation module makes a graphical program instruction unit present, in the visual programming interface, a visual effect corresponding to the user's operation;
  • -Execution presentation module executes the program instruction set step by step, unit by unit, with the real object acting step by step, presented in the visual programming interface;
  • -Omit presentation module, according to the user's designation of a program instruction unit, instructs the real object to directly respond with the corresponding action once the designated program instruction unit is executed, and presents it in the visual programming interface.
  • the programming interface module further includes any one or more of the following modules:
  • a first interface switching module triggering switching from the operation presentation module to the execution presentation module according to the third operation input information;
  • a second interface switching module triggering switching from the execution presentation module to the operation presentation module according to a virtual interaction response of the virtual object to the image of the real object.
  • An AR-based interactive programming method provided according to the present invention includes:
  • AR step superimposing and presenting images of real objects and virtual objects
  • the AR-based interactive programming method further includes:
  • Programming step obtaining a program instruction set, wherein the program instruction set includes one or more program instruction units;
  • Execution step According to the program instruction set, instruct a real object to perform an action.
  • it further comprises:
  • Matching step present the relative relationship between the image of the real object and the virtual object.
  • the programming step includes:
  • Step of acquiring a programming unit acquiring a plurality of program instruction units according to the first operation input information
  • Step of acquiring timing relationships acquiring timing relationships executed between the plurality of program instruction units according to the second operation input information
  • Instruction set generation step generating the program instruction set according to the multiple program instruction units and a timing relationship.
  • the matching step includes any one or more of the following steps:
  • Step of matching size presenting the relative size relationship between the image of the real object and the virtual object, and matching their sizes through size interaction; wherein the size interaction refers to: prompting the size difference between the image of the real object and the virtual object, or instructing the real object to move to change the size of its image;
  • Step of matching orientation presenting the relative orientation relationship between the image of the real object and the virtual object, and matching their orientations through orientation interaction; wherein the orientation interaction refers to: prompting the orientation difference between the image of the real object and the virtual object, or instructing the real object to rotate to change the orientation of its image.
  • the matching step further includes:
  • Obstacle removal step determines whether there is a conflict between the virtual object and a real obstacle; if there is no conflict, confirms that there is no conflict; if there is a conflict, then:
  • the virtual object includes a virtual venue
  • the virtual venue is presented in any of the following ways:
  • it further comprises:
  • Virtual interaction response step presenting a virtual interaction response of the virtual object to the image of the real object according to the action performed by the real object.
  • it further comprises:
  • Real interaction response step instructing the real object to perform a real interaction response according to the virtual interaction response.
  • it further comprises:
  • Programming interface steps Provide a visual programming interface; wherein the programming interface steps include any one or more of the following steps:
  • -Overlay presentation step making the image of the real object and the virtual object overlay and present in the visual programming interface
  • -Operation presentation step making a graphical program instruction unit present, in the visual programming interface, a visual effect corresponding to the user's operation;
  • the programming interface step further includes any one or more of the following steps:
  • First interface switching step triggering switching from the operation presentation step to the execution presentation step according to the third operation input information;
  • the second interface switching step triggering switching from the execution presentation step to the operation presentation step according to a virtual interaction response of the virtual object to the image of the real object.
  • a computer-readable storage medium storing a computer program, and the computer program implements the steps of the foregoing method when executed by a processor.
  • An intelligent device provided according to the present invention includes the above-mentioned AR-based interactive programming system, or the above-mentioned computer-readable storage medium storing a computer program.
  • the present invention has the following beneficial effects:
  • The invention improves on the programming toys in the prior art, involving the interaction between the image of the real object and the virtual object, between the real object and the smart device, and between the virtual object and the user during the programming process, fully combined with AR technology, making programming toys suitable for the intelligent companionship of children.
  • FIG. 1 is a schematic diagram of an adjustment interface of a relative relationship between a size of an image of a real object and a size of a virtual object.
  • FIG. 2 is a schematic diagram of an adjustment interface for the relative relationship between the orientation of an image of a real object and a virtual object.
  • FIG. 3 is a schematic diagram of an interface in which an image of a real object and a grid road in a virtual object are superimposed.
  • Figure 4 is a schematic diagram of a visual programming interface.
  • FIG. 5 is a schematic diagram of an interface for executing a first program instruction in a visual programming interface.
  • FIG. 6 is a schematic diagram of an interface for executing the second program instruction in the visual programming interface.
  • FIG. 7 is a flowchart of method steps in a specific scenario embodiment.
  • FIG. 8 is a schematic diagram of a structural framework in a specific scenario embodiment.
  • the figure shows:
  • Companionship is an indispensable emotional link in the growth process of children, including the companionship of parents and of toys. More importantly, through toys, parents and children can have the opportunity to play and learn together, achieving a more advanced intelligent companionship, and programming toys are one of the best carriers for this. To this end, the inventors carried out technical improvements under this vision of intelligent companionship, and prepared the textual and graphic expressions with the assistance of patent engineers, agents, and attorneys, in order to obtain patent rights and enable the subsequent development of more products that accompany children at home. The technical solutions and specific application scenarios of the present application are described below.
  • An AR-based interactive programming system includes:
  • AR module superimposes and presents the image of the real object and the virtual object; the image of the real object is obtained by capturing the real picture with a camera device, such as the camera of a smartphone.
  • The real picture includes the entire picture collected by the camera device, containing real objects such as stairs, sofas, building blocks, robots, carpets, murals, food, pets, and people.
  • The graphics corresponding to the recognition target within the entire picture are selected as the image of the real object.
  • A robot that can perform actions is used as the recognition target, and the image of the robot constitutes the image of the real object; in other preferred examples, people or animals can be used as recognition targets, as long as they can understand instructions and perform the corresponding actions after receiving them.
  • the instruction is preferably a voice instruction, and a sub-optimal instruction may also be a graphic instruction, an odor instruction, or the like, which is suitable for a situation in which a person with a sensory impairment or an animal needs perceptual assistance.
  • Programming interface module provides a visual programming interface; the visual programming interface is mainly composed of one or more sub-interfaces, and multiple sub-interfaces can be switched or displayed side by side; when displayed side by side, each sub-interface visually corresponds to a display area in the visual programming interface. At least one sub-interface serves as a presentation space in which the image of the real object and the virtual object are presented simultaneously and superimposed; alternatively, only the image of the real object and the virtual object are presented.
  • The sub-interface can also simultaneously present the real picture in which the real object is located, including carpets and other things around the real object.
  • the virtual object can cover a part of the real picture in which the real object is located, and the image of the real object can cover the virtual object.
  • Those skilled in the art can refer to the occlusion technique between the virtual dinosaur and the real jungle landscape board in patent document CN105931289A; the overlay is implemented in a similar superimposed manner and will not be repeated here.
  • Programming module obtaining a program instruction set, wherein the program instruction set includes one or more program instruction units; wherein the type of the program instruction unit may be a logic type instruction or an execution type instruction; the execution type The instruction instructs the real object to perform an action.
  • Execution module instructs a real object to perform an action according to the program instruction set; wherein the action may be various motions, such as translation, rotation, beating, and deformation; the action may also be sound, light, and electrical effects, For example, vocalization, light emission, color change, temperature change, phase change.
  • The action may be any action achievable by that type of real object, such as laughing, bending, picking up items, or turning on household appliances.
  • Matching module presents the relative relationship between the image of the real object and the virtual object. Since the image of the real object and the virtual object need to be superimposed in the same space, and the image of the real object and the virtual object are required to interact, it is necessary to match the two with parameters such as size.
  • For example, the real object is a robot, the virtual object is a grid, and the robot needs to move between different grids. If the robot is too far from the camera, it must move a long distance to get from one grid to the next, which is unsuitable for the game effect; therefore, before the virtual interaction response, the real interaction response, and the execution of actions, it is necessary to present the relative relationship between the image of the real object and the virtual object and adjust it to match.
  • Virtual interaction response module presents a virtual interaction response of the virtual object to the image of the real object according to the action performed by the real object; for example, the virtual interaction response may be eating a virtual cake, lighting up a grid, or rescuing a virtual animal. In this way, the similarities and differences between the programming result and the expected result can be indicated.
  • Real interaction response module instructs the real object to perform a real interaction response according to the virtual interaction response; the real interaction response may be that the instructed robot turns a circle in place, shakes, changes expression, or emits different sounds.
  • the invention also provides a smart device, which includes the AR-based interactive programming system.
  • the smart device may be a smart phone, a tablet computer, a smart watch, smart glasses, a projection device, a VR helmet, and other devices.
  • a smart phone as an example, the image of a real object is captured by a camera of the smart phone.
  • The screen of the smartphone superimposes and displays the image of the real object and the virtual object, presents the programming interface module, and receives the user's operations on the visual programming interface through the touch screen.
  • The short-range wireless communication module of the smartphone or the Wi-Fi network is used to send instructions to the real object; a voice control instruction can be played through the smartphone's sound module, and a light control instruction can be issued through the smartphone's light.
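As a non-authoritative sketch of the instruction channels just described, the following Python fragment routes a command to whichever channel the smart device and the real object share; the `Command` type, channel names, and frame format are illustrative assumptions, not details from the patent.

```python
from dataclasses import dataclass

@dataclass
class Command:
    action: str          # e.g. "forward", "turn" (assumed vocabulary)
    argument: float = 0  # e.g. distance in grids or angle in degrees

class InstructionDispatcher:
    """Routes a command to whichever channel the real object supports."""

    def __init__(self, channels):
        # channels: mapping of channel name -> send function
        self.channels = channels

    def send(self, command, channel="wifi"):
        if channel not in self.channels:
            raise ValueError(f"unsupported channel: {channel}")
        return self.channels[channel](command)

# Example: a Wi-Fi channel stub that encodes commands as text frames.
def wifi_send(cmd):
    return f"{cmd.action}:{cmd.argument}"

dispatcher = InstructionDispatcher({"wifi": wifi_send})
frame = dispatcher.send(Command("forward", 1))
print(frame)  # forward:1
```

A voice or light channel would follow the same shape: another entry in the `channels` mapping whose send function plays audio or flashes the light instead of returning a frame.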
  • the programming module includes:
  • Programming unit acquisition module acquires multiple program instruction units according to the first operation input information; graphical program instruction units are displayed in the visual programming interface, for example shaped like jigsaw-puzzle pieces. The user performs the first operation input on the program instruction units to obtain a plurality of them, wherein the first operation input information includes input generated by clicking, voice-controlled selection, gesture control, gaze selection, and peripheral selection.
  • Peripherals can be a mouse, an electronic pen, or other external devices.
  • Timing relationship acquisition module acquires the timing relationship of execution among the multiple program instruction units according to the second operation input information.
  • The second operation input information is used to set the execution timing among the obtained program instruction units, such as single execution or cyclic execution; the timing relationship is determined mainly by the second operation input information and/or the logic of the program instruction units themselves.
  • The second operation input information may be the user dragging a program instruction unit after selecting it on the touch screen, or setting an execution sequence number for each program instruction unit.
  • The first operation input information and the second operation input information may be generated by the same user operation, for example, the user dragging a graphical program instruction unit from one area of the visual editing interface to another.
  • Instruction set generation module Generates the program instruction set according to the multiple program instruction units and timing relationships.
  • the visual effect of the program instruction set on the visual editing interface may be the orderly arrangement of multiple graphical program instruction units at different positions, and the code in the program instruction set is preferably not presented visually.
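The programming-module pipeline above can be sketched minimally: program instruction units selected by the first operation input are combined, according to a timing relationship (execution order plus optional cyclic execution) set by the second operation input, into one flat program instruction set. The function and data names below are assumptions for illustration, not terms from the patent.

```python
def generate_instruction_set(units, order, repeat=1):
    """units: dict of unit id -> instruction; order: ids in execution order.

    repeat > 1 models cyclic execution of the whole sequence.
    """
    sequence = [units[unit_id] for unit_id in order]
    return sequence * repeat

# Units acquired via the first operation input (assumed instruction names).
units = {"a": "forward", "b": "turn90", "c": "forward"}
# Timing relationship from the second operation input: order plus repetition.
program = generate_instruction_set(units, ["a", "b", "c"], repeat=2)
print(program)
```

Visually, only the ordered graphical units would be shown; the generated set itself, as the text notes, is preferably not presented as code.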
  • the matching module includes any one or more of the following modules:
  • Matching size module presents the relative size relationship between the image of the real object and the virtual object, and makes their sizes match through size interaction; wherein the size interaction refers to: prompting the size difference between the image of the real object and the virtual object, or instructing the real object to move to change the size of its image. The virtual object includes a size comparison object whose shape matches the image of the real object; for example, both are circular, as shown in FIG. 1, or circular and square respectively. If the size comparison object exactly envelops, substantially includes, or contains the image of the real object, the two sizes are considered matched; otherwise, they are considered not matched. In this way, the relative size relationship is realized and the size difference is reflected.
  • When the size difference between the image of the real object and the virtual object is large and does not match, the real object can be instructed to move to match the size relationship: if the image of the real object is too small, the real object is instructed to approach the camera; if the image of the real object is too large, the real object is instructed to move away from the camera; or, if it is recognized that the real object has not fully entered the picture, the real object can be instructed to move entirely into the camera's field of view.
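A minimal sketch of this size interaction, under the simplifying assumption (not from the patent) that both the size comparison object and the real object's image are reduced to circles of known radius:

```python
def size_interaction(image_radius, target_radius, tolerance=0.15):
    """Return a prompt for the real object based on the size difference."""
    ratio = image_radius / target_radius
    if ratio < 1 - tolerance:
        return "approach"   # image too small: real object is too far away
    if ratio > 1 + tolerance:
        return "retreat"    # image too large: real object is too close
    return "matched"        # comparison object envelops the image

print(size_interaction(40, 100))   # approach
print(size_interaction(100, 100))  # matched
```

The `tolerance` band stands in for the "basically includes or contains" envelope test described above.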
  • Matching orientation module presents the relative orientation relationship between the image of the real object and the virtual object, and makes their orientations match through orientation interaction; wherein the orientation interaction refers to: prompting the orientation difference between the image of the real object and the virtual object, or instructing the real object to rotate to change the orientation of its image; the virtual object includes an orientation comparison object.
  • The orientation of the real object can be identified and indicated by icons such as arrows, as shown in FIG. 2; for example, the orientation of a toy car can be set to the forward direction of the toy car. The visual programming interface presents both the orientation comparison object and the orientation icon of the real object at the same time, so that the user can intuitively recognize the difference in orientation between the two.
  • The user can then operate to change the orientation of the virtual object, or instruct the real object to rotate, to reduce or eliminate the difference in orientation.
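The orientation interaction can be sketched similarly; here headings are modelled as angles in degrees, and all names are illustrative assumptions:

```python
def orientation_interaction(image_heading_deg, target_heading_deg, tol=5.0):
    """Return ('matched', 0) or ('rotate', signed minimal rotation)."""
    # Wrap the difference into (-180, 180] so the rotation is minimal.
    diff = (target_heading_deg - image_heading_deg + 180) % 360 - 180
    if abs(diff) <= tol:
        return ("matched", 0)
    return ("rotate", diff)  # positive: rotate counter-clockwise by diff

print(orientation_interaction(90, 90))   # ('matched', 0)
print(orientation_interaction(0, 270))   # ('rotate', -90)
```

Rotating by `-90` rather than `+270` mirrors the goal of reducing the orientation difference with the smallest correction.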
  • Obstacle elimination module determines whether the virtual object conflicts with a real obstacle.
  • Since virtual objects are functional for programming, conflicts between real obstacles and virtual objects may occur. For example, if the image of a robot (a real object) needs to move along a virtual road, the virtual road cannot overlap with a real obstacle; as another example, the extension of a virtual road may be blocked by a wall or a sofa. If there is no conflict, it is confirmed that there is no conflict; if there is a conflict, then: a conflict is prompted; the real object is instructed to move so that the virtual object no longer conflicts with the real obstacle; or, the virtual object is updated so that it no longer conflicts with the real obstacle.
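One way the obstacle-elimination check could be modelled, as a sketch under the assumption (not specified by the patent) that the virtual road is a set of grid cells and real obstacles are axis-aligned rectangles:

```python
def cell_rect(cell, size):
    """Bounding rectangle (x0, y0, x1, y1) of a grid cell."""
    x, y = cell
    return (x * size, y * size, (x + 1) * size, (y + 1) * size)

def overlaps(a, b):
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def conflicting_cells(road_cells, obstacles, cell_size=10):
    """Return the road cells that collide with any real obstacle."""
    return [c for c in road_cells
            if any(overlaps(cell_rect(c, cell_size), o) for o in obstacles)]

road = [(0, 0), (1, 0), (2, 0)]
sofa = (15, -5, 40, 5)                   # overlaps cells (1,0) and (2,0)
print(conflicting_cells(road, [sofa]))   # [(1, 0), (2, 0)]
```

An empty result corresponds to "confirm that there is no conflict"; a non-empty one triggers the prompt, a move instruction, or an update of the virtual object.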
  • the virtual object includes a virtual venue.
  • The virtual venue may be, for example, a grid, road, bridge, watercourse, stadium, city, grassland, mountain or river, or track.
  • The virtual venue is presented in any of the following ways: based on preset parameters; based on the relative relationship between the image of the real object and the virtual object; or, after the image of the real object is matched with the virtual object, based on the real object.
  • Presenting according to preset parameters means generating and presenting the virtual venue according to the preset parameters, without considering the relative relationship between the image of the real object and the virtual object.
  • In the presentation based on the relative relationship between the image of the real object and the virtual object, the virtual object is presented correspondingly to the image of the real object; for example, if the image of the real object is large, the grid in the virtual venue also covers a correspondingly larger area; if the image of the real object is small, the grid in the virtual venue is also smaller.
  • In this way, the motion parameters of the real object are maintained; for example, the step distance of the robot (the real object) is constant, and each step is almost identical, moving exactly from one grid to the next.
  • In the presentation based on the real object, the position of the real object is recognized in real time, and the real object's actions are adapted to the virtual venue; for example, the real object is instructed to move so that it moves exactly from one grid to another.
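A sketch of venue generation based on the real object's image, in which the grid cell size is derived from the robot image so that the robot's constant step distance spans exactly one cell; the `scale` factor is an assumed illustrative value:

```python
def generate_grid(robot_image_width, rows, cols, scale=1.2):
    """Return a grid of cell origins sized so one step covers one cell."""
    cell = robot_image_width * scale   # one constant step = one cell
    return [[(c * cell, r * cell) for c in range(cols)] for r in range(rows)]

grid = generate_grid(robot_image_width=50, rows=2, cols=3)
print(grid[0])  # [(0.0, 0.0), (60.0, 0.0), (120.0, 0.0)]
```

If the robot's image shrinks (the robot is farther away), the cells shrink with it, preserving the constant step distance described above.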
  • the programming interface module includes any one or more of the following modules:
  • Overlay presentation module Make the image of the real object and the virtual object overlay and present in the visual programming interface; for example, as shown in FIG. 1.
  • Operation presentation module makes a graphical program instruction unit present, in the visual programming interface, a visual effect corresponding to the user's operation; for example, when the user drags a program instruction unit, the unit is highlighted while being dragged and moves to the end position of the drag track.
  • The execution presentation module executes the program instruction set step by step, unit by unit, while the real object acts step by step, displayed synchronously in the visual programming interface, as shown in FIG. 5 and FIG. 6.
  • Omit presentation module, according to the user's designation of a program instruction unit, instructs the real object to directly respond with the corresponding action once the designated program instruction unit is executed, and presents it in the visual programming interface. For example, with five consecutive program instruction units that each turn 90 degrees, if the user clicks to debug and execute the fifth one, the real object turns only 90 degrees rather than 360 + 90 degrees, saving time in program proofreading.
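The time saving of the omit presentation behaviour can be illustrated with a sketch in which the prefix of the program is simulated only virtually, and the real object receives just the designated instruction (all names are assumptions for illustration):

```python
def debug_execute(program, index):
    """Simulate everything before `index` virtually; send only unit `index`
    to the real object. Instructions here are turn angles in degrees."""
    virtual_heading = 0
    for turn in program[:index]:         # prefix: no physical robot motion
        virtual_heading = (virtual_heading + turn) % 360
    robot_turn = program[index]          # only this is sent to the robot
    return virtual_heading, robot_turn

program = [90, 90, 90, 90, 90]           # five consecutive 90-degree turns
virtual, real = debug_execute(program, 4)
print(virtual, real)  # 0 90
```

The simulated heading after four turns is 360 degrees, i.e. 0 after wrapping, so the robot physically turns only the designated 90 degrees instead of 450.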
  • the programming interface module further includes any one or more of the following modules:
  • First interface switching module triggers switching from the operation presentation module to the execution presentation module according to the third operation input information; for example, switching from FIG. 4 to the interface shown in FIG. 5.
  • The second interface switching module triggers switching from the execution presentation module to the operation presentation module according to a virtual interaction response of the virtual object to the image of the real object; for example, switching from FIG. 5 to the interface shown in FIG. 4.
  • the real object includes a robot
  • FIG. 1 shows an image 100 of the real object.
  • The robot has a spherical structure with wheels installed at the bottom, and can move on its own, including translation and rotation.
  • the user turns on the mobile phone, and continuously shoots the robot with the mobile phone.
  • FIG. 1 is a schematic diagram showing a realistic picture on the screen of the mobile phone.
  • The relative distance between the robot and the mobile phone needs to be adjusted, that is, the distance relationship between the virtual object, in particular the virtual field, and the robot.
  • The virtual field is not shown in FIG. 1 but is shown in FIG. 3, because the virtual field needs to be generated according to the orientation of the robot.
  • When the size comparison object 201 in FIG. 1 is similar in size to the robot image and basically envelops it, the distance between the robot and the mobile phone is considered appropriate.
  • An arrow icon 202 is displayed on the mobile phone interface, and the user rotates the arrow icon so that it is aligned with the forward direction of the robot.
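The two adjustments just described — making the robot image fill the size comparison object 201, and aligning the arrow icon 202 with the robot's forward direction — can be sketched as simple numeric checks. The function names, tolerances, and bounding-box representation are assumptions for illustration, not taken from the disclosure.

```python
def distance_ok(robot_box, envelope_box, tolerance=0.15):
    """Size interaction: the robot image (width, height in pixels)
    should roughly fill the on-screen comparison envelope 201."""
    rw, rh = robot_box
    ew, eh = envelope_box
    return abs(rw / ew - 1.0) <= tolerance and abs(rh / eh - 1.0) <= tolerance

def orientation_prompt(arrow_deg, robot_heading_deg, tolerance=5.0):
    """Orientation interaction: signed difference between the arrow
    icon 202 and the robot's forward direction; 0.0 means aligned."""
    diff = (robot_heading_deg - arrow_deg + 180.0) % 360.0 - 180.0
    return diff if abs(diff) > tolerance else 0.0
```

A nonzero return from `orientation_prompt` would drive the on-screen prompt telling the user (or the robot) which way, and how far, to rotate.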
  • In FIG. 3, a virtual venue 203 has been generated.
  • The user wants the robot to follow the grid road in the virtual field, advancing one square at a time.
  • This requires the user to build a program instruction set through programming and use it to instruct the robot to move along the grid road.
  • FIG. 4 shows the visual programming interface.
  • The left area of the visual programming interface holds the graphical program instruction units available for selection.
  • The right area of the visual programming interface holds the program instruction units that the user has selected and ordered; that is, the six program instruction units in the right area constitute the program instruction set. Moving according to this program instruction set, the robot will travel along the grid road from one end to the other.
  • The program instruction set executes its program instruction units step by step, the real object acts step by step accordingly, and the visual programming interface displays both synchronously.
  • In the first step, a solid dot is marked on the graphic of the current program instruction unit, indicating that it is being executed; accordingly, the robot advances one square, from the first grid square of the grid road to the second.
  • In the second step, the solid dot is marked on the graphic of the next program instruction unit, indicating that it is being executed; accordingly, the robot advances one square, from the second grid square to the third.
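The synchronized step-by-step presentation can be sketched as a small interpreter loop. `Robot` and `UI` here are minimal illustrative stand-ins, not the actual classes of the system.

```python
def run_instruction_set(instruction_set, robot, ui):
    """Execute the program instruction set one unit at a time, keeping
    the robot and the visual programming interface in sync: the unit
    being executed carries the solid dot while the robot moves."""
    for index, unit in enumerate(instruction_set):
        ui.mark_solid_dot(index)        # highlight the current unit
        robot.perform(unit)             # e.g. advance one grid square
        ui.clear_solid_dot(index)

# Minimal stand-ins so the loop can be exercised:
class Robot:
    def __init__(self):
        self.square = 1                 # start on the first grid square
    def perform(self, unit):
        if unit == "forward":
            self.square += 1

class UI:
    def __init__(self):
        self.marks = []
    def mark_solid_dot(self, index):
        self.marks.append(index)
    def clear_solid_dot(self, index):
        pass
```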
  • The following describes the AR-based interactive programming method provided by the present invention.
  • Those skilled in the art can implement the AR-based interactive programming system by referring to the steps and flow of the AR-based interactive programming method; that is, the AR-based interactive programming method can be understood as a preferred example of the AR-based interactive programming system.
  • An AR-based interactive programming method provided according to the present invention includes:
  • AR step: superimposing and presenting an image of a real object and a virtual object.
  • The AR-based interactive programming method further includes:
  • Programming step: obtaining a program instruction set, wherein the program instruction set includes one or more program instruction units;
  • Execution step: instructing the real object to perform an action according to the program instruction set.
  • It further comprises:
  • Matching step: presenting the relative relationship between the image of the real object and the virtual object.
  • The programming step includes:
  • Programming unit acquisition step: acquiring a plurality of program instruction units according to first operation input information;
  • Timing relationship acquisition step: acquiring the timing relationships of execution among the plurality of program instruction units according to second operation input information;
  • Instruction set generation step: generating the program instruction set according to the plurality of program instruction units and the timing relationships.
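Under the assumption that the second operation input yields an execution order over unit identifiers, the three sub-steps reduce to an ordered selection. All names below are illustrative, not taken from the disclosure.

```python
def generate_instruction_set(units, timing):
    """Instruction set generation step: `units` maps a unit id to its
    instruction (gathered from the first operation input); `timing` is
    the execution order of those ids (gathered from the second
    operation input). Returns the ordered program instruction set."""
    return [units[uid] for uid in timing]

# Three selected units, ordered so the robot goes forward twice,
# then turns left:
units = {"a": "forward", "b": "turn_left", "c": "forward"}
program = generate_instruction_set(units, ["a", "c", "b"])
# -> ["forward", "forward", "turn_left"]
```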
  • The matching step includes any one or more of the following steps:
  • Size matching step: presenting the relative size relationship between the image of the real object and the virtual object, and matching their sizes through size interaction; wherein size interaction refers to prompting the size difference between the image of the real object and the virtual object, or instructing the real object to move so as to change the size of its image;
  • Orientation matching step: presenting the relative orientation relationship between the image of the real object and the virtual object, and matching their orientations through orientation interaction; wherein orientation interaction refers to prompting the orientation difference between the image of the real object and the virtual object, or instructing the real object to rotate so as to change the orientation of its image.
  • The matching step further includes:
  • Obstacle removal step: determining whether there is a conflict between the virtual object and an actual obstacle; if there is no conflict, confirming that there is no conflict; if there is a conflict, then:
  • The virtual object includes a virtual venue.
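If the virtual venue and a detected real obstacle are both approximated by axis-aligned rectangles on the ground plane — an assumption made here purely for illustration — the conflict test in the obstacle removal step is a standard overlap check:

```python
def boxes_conflict(venue, obstacle):
    """Axis-aligned overlap test between the virtual venue and a
    detected real obstacle, each given as (x_min, y_min, x_max, y_max)
    in the same ground-plane coordinates. True means they conflict."""
    vx0, vy0, vx1, vy1 = venue
    ox0, oy0, ox1, oy1 = obstacle
    return vx0 < ox1 and ox0 < vx1 and vy0 < oy1 and oy0 < vy1
```

When the check returns True, the system could prompt the user to clear the obstacle or regenerate the venue elsewhere.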
  • The virtual venue is presented in any of the following ways:
  • It further comprises:
  • Virtual interaction response step: presenting a virtual interaction response of the virtual object to the image of the real object according to the action performed by the real object.
  • It further comprises:
  • Real interaction response step: instructing the real object to perform a real interaction response according to the virtual interaction response.
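The chain from robot action, to virtual interaction response, to real interaction response can be sketched as below. The goal-and-celebration behaviour is a hypothetical example of such a chain, not prescribed by the disclosure, and all names are illustrative.

```python
def on_robot_action(action, virtual_venue, robot):
    """One link of the interaction chain: the robot's action produces
    a virtual interaction response, which may in turn trigger a real
    interaction response on the robot."""
    # Virtual interaction response step: the virtual venue reacts to
    # the robot's new position in the image.
    response = virtual_venue.react(action)
    # Real interaction response step: the virtual response is mapped
    # back onto the real object.
    if response == "reached_goal":
        robot.celebrate()               # e.g. spin in place and light up
    elif response == "hit_virtual_wall":
        robot.back_up()
    return response

# Minimal stand-ins so the chain can be exercised:
class Venue:
    def __init__(self, goal):
        self.goal, self.pos = goal, 0
    def react(self, action):
        if action == "forward":
            self.pos += 1
        return "reached_goal" if self.pos == self.goal else "none"

class Bot:
    def __init__(self):
        self.celebrated = False
    def celebrate(self):
        self.celebrated = True
    def back_up(self):
        pass
```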
  • It further comprises:
  • Programming interface step: providing a visual programming interface; wherein the programming interface step includes any one or more of the following steps:
  • Overlay presentation step: superimposing and presenting the image of the real object and the virtual object in the visual programming interface;
  • Operation presentation step: rendering the graphical program instruction units in the visual programming interface with visual effects that follow the user's operations;
  • The programming interface step further includes any one or more of the following steps:
  • First interface switching step: triggering switching from the operation presentation step to the execution presentation step according to third operation input information;
  • Second interface switching step: triggering switching from the execution presentation step back to the operation presentation step according to a virtual interaction response of the virtual object to the image of the real object.
  • The present invention also provides a computer-readable storage medium storing a computer program, characterized in that when the computer program is executed by a processor, the steps of the AR-based interactive programming method are implemented. Examples include chips, memories, and optical discs, and in particular servers that support app stores.
  • A smart device provided according to the present invention includes the computer-readable storage medium storing the computer program.
  • The system and device provided by the present invention, and each of their modules, can be implemented not only as pure computer-readable program code but also, by logically programming the method steps, in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, and embedded microcontrollers realizing the same functions. Therefore, the system and device provided by the present invention, and their various modules, can be regarded as hardware components; the modules they include for implementing various programs can be regarded as structures within the hardware components; and the modules for implementing various functions can be regarded both as software programs implementing the method and as structures within the hardware components.

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Disclosed are an AR-based interactive programming system and method, and a medium and smart device. The method comprises: presenting an image of a real object and a virtual object in a superimposed manner; acquiring a program instruction set, the program instruction set comprising one or more program instruction units; and instructing the real object to perform an action according to the program instruction set. The present invention improves on prior-art programming toys by involving, in the programming process, interaction between the image of the real object and the virtual object, interaction between the real object and a smart device, and interaction between the virtual object and the user, and is fully combined with AR technology, making programming toys suitable to serve as intelligent companions for children.
PCT/CN2019/099902 2018-09-30 2019-08-09 AR-based interactive programming system and method, and medium and smart device WO2020063132A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811160524.1A CN109240682B (zh) 2018-09-30 2018-09-30 AR-based interactive programming system, method, medium and smart device
CN201811160524.1 2018-09-30

Publications (1)

Publication Number Publication Date
WO2020063132A1 true WO2020063132A1 (fr) 2020-04-02

Family

ID=65054336

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/099902 WO2020063132A1 (fr) 2019-08-09 AR-based interactive programming system and method, and medium and smart device

Country Status (2)

Country Link
CN (1) CN109240682B (fr)
WO (1) WO2020063132A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111610997A (zh) * 2020-05-26 2020-09-01 北京市商汤科技开发有限公司 Method for generating AR scene content, display method, display system and apparatus
CN111610998A (zh) * 2020-05-26 2020-09-01 北京市商汤科技开发有限公司 Method for generating AR scene content, display method, apparatus and storage medium
CN112732247A (zh) * 2021-01-13 2021-04-30 王亚刚 Event occurrence method and event occurrence system based on virtual reality technology
CN113849166A (zh) * 2021-11-29 2021-12-28 广东青藤环境科技有限公司 Building-block zero-code development platform for smart water environments

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109240682B (zh) 2018-09-30 2021-11-30 上海葡萄纬度科技有限公司 AR-based interactive programming system, method, medium and smart device
CN110533780B (zh) 2019-08-28 2023-02-24 深圳市商汤科技有限公司 Image processing method and apparatus, device and storage medium
CN111552238A (zh) * 2020-04-17 2020-08-18 达闼科技(北京)有限公司 Robot control method and apparatus, computing device and computer storage medium
CN112882570A (zh) * 2021-01-28 2021-06-01 深圳点猫科技有限公司 Children's programming implementation method, apparatus and device based on VR technology

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120239785A1 (en) * 2011-03-14 2012-09-20 Pazos Carlos M D System and apparatus for using multichannel file delivery over unidirectional transport ("flute") protocol for delivering different classes of files in a broadcast network
CN103998107A (zh) * 2011-05-23 2014-08-20 乐高公司 Toy construction system for augmented reality
CN105396295A (zh) * 2015-11-17 2016-03-16 卢军 Child-oriented spatial programming robot toy
CN108182062A (zh) * 2017-12-12 2018-06-19 上海葡萄纬度科技有限公司 Reverse programming method and system
CN108230201A (zh) * 2017-12-12 2018-06-29 清华大学 Combined interactive system
CN108292816A (zh) * 2016-03-31 2018-07-17 深圳贝尔创意科教有限公司 Modular assembly system
CN109240682A (zh) * 2018-09-30 2019-01-18 上海葡萄纬度科技有限公司 AR-based interactive programming system, method, medium and smart device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
HK1216278A (zh) * 2016-04-27 2016-10-28 Kam Ming Lau An education system using virtual robots



Also Published As

Publication number Publication date
CN109240682A (zh) 2019-01-18
CN109240682B (zh) 2021-11-30

Similar Documents

Publication Publication Date Title
WO2020063132A1 (fr) AR-based interactive programming system and method, and medium and smart device
JP7403452B2 (ja) Interactive video game system
JP6431233B1 (ja) Video distribution system for distributing videos containing messages from viewing users
US20180373413A1 (en) Information processing method and apparatus, and program for executing the information processing method on computer
CN105027190B (zh) Extramissive spatial imaging digital eyeglasses for virtual or augmediated vision
CN108027653 (zh) Haptic interaction in virtual environments
CN104731343A (zh) Mobile-terminal-based virtual reality human-computer interaction children's education experience system
JP2017536715A (ja) Expression of physical interaction in three-dimensional space
CN102414641A (zh) Changing the view perspective within a display environment
KR20020089460A (ko) Animation generation program
CN106873767A (zh) Operation control method and apparatus for virtual reality applications
TWI831074B (zh) Information processing method, apparatus, device, medium and program product in a virtual scene
CN115337634A (zh) VR system and method applied to meal-themed games
KR20190059068A (ko) Mission-based puzzle assembly system and method for a grid map using augmented reality
WO2020114395A1 (fr) Virtual image control method, terminal device and storage medium
JP6554139B2 (ja) Information processing method, apparatus, and program for causing a computer to execute the information processing method
CN109344504A (zh) VR-based computer assembly method and system
CN113546420B (zh) Virtual object control method and apparatus, storage medium and electronic device
JP7098575B2 (ja) Video distribution system for live-distributing videos containing animation of character objects generated based on an actor's movements
JP6498832B1 (ja) Video distribution system for distributing videos containing messages from viewing users
US20240112424A1 (en) Authoring systems and methods for enabling bidirectional binding of augmented reality with toys in real-time
JP2018190397A (ja) Information processing method, apparatus, and program for causing a computer to execute the information processing method
Sun et al. The implementation of a live interactive augmented reality game creative system
Wan et al. Interactive shadow play animation system
JP6431242B1 (ja) Video distribution system for distributing videos containing messages from viewing users

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19865893

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19865893

Country of ref document: EP

Kind code of ref document: A1