WO2020029556A1 - Method and apparatus for plane adaptation, and computer-readable storage medium - Google Patents

Method and apparatus for plane adaptation, and computer-readable storage medium

Info

Publication number
WO2020029556A1
Authority
WO
WIPO (PCT)
Prior art keywords
plane
virtual object
target
terminal screen
adaptive
Prior art date
Application number
PCT/CN2019/073080
Other languages
English (en)
Chinese (zh)
Inventor
刘昂
陈怡
Original Assignee
北京微播视界科技有限公司
Priority date
Filing date
Publication date
Application filed by 北京微播视界科技有限公司
Publication of WO2020029556A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • The present disclosure relates to the field of information technology, and in particular to an adaptive plane method, apparatus, and computer-readable storage medium.
  • Augmented Reality (AR) is a technology that calculates the position and angle of the camera image in real time and superimposes corresponding images, videos, and virtual objects.
  • The goal of this technology is to overlay the virtual world on the real world on the screen and allow the two to interact.
  • A common implementation of augmented reality technology is to place a virtual object in a real scene, that is, to superimpose a real environment and a virtual object on the same screen or in the same space in real time. After superimposition, the virtual object moves along a predetermined motion trajectory, or is controlled through controls to perform a predetermined action.
  • virtual objects are usually placed on a plane in a real scene, such as on a desktop or a wall.
  • the placed virtual objects can be controlled to move between multiple planes.
  • Different planes have different orientations, for example when moving from a desktop to a wall.
  • In that case the virtual object either floats in mid-air or keeps its desktop posture (perpendicular to the desktop).
  • As shown in FIG. 1, a virtual cylindrical object was originally placed on a horizontal plane and then moved from the horizontal plane to a vertical plane. After moving off the horizontal plane, the virtual cylindrical object still keeps its previous posture and size, so, as can be seen from the figure, it is not correctly placed on the vertical plane, which degrades the display effect.
  • The technical problem addressed by the present disclosure is to provide an adaptive plane method to at least partially solve the problem of how to improve the display effect of a virtual object on a terminal.
  • An adaptive plane device, an adaptive plane hardware device, a computer-readable storage medium, and an adaptive plane terminal are also provided.
  • An adaptive plane method includes:
  • the virtual object is displayed on a terminal screen, and the displayed virtual object is adapted to the target plane.
  • the step of adjusting the placement posture of the virtual object according to the direction includes:
  • the method further includes:
  • a plane is selected from the identified planes as the target plane.
  • the step of selecting a plane from the identified planes as the target plane includes:
  • the selected plane is used as the target plane.
  • the method further includes:
  • the step of adjusting the placement posture of the virtual object according to the direction further includes:
  • the step of determining a target position according to the target display position and the target plane includes:
  • An intersection of the line and the target plane is used as the target position.
  • the line is perpendicular to a plane on which the terminal screen is located.
  • the step of determining a target display position of the virtual object on a terminal screen includes:
  • the input target display position is received and used as the target display position.
  • the step of determining a target plane of the virtual object in the real scene includes:
  • a plane is selected from the identified planes as the target plane.
  • the step of selecting a plane from the identified planes as the target plane includes:
  • the selected plane is used as the target plane.
  • the step of determining a target position of a virtual object in a real scene includes:
  • the step of adjusting the placement posture of the virtual object according to the direction further includes:
  • the step of determining the target position according to the target display position and the target plane includes:
  • An intersection of the line and the target plane is used as the target position.
  • the line is perpendicular to a plane on which the terminal screen is located.
  • the step of determining a target display position of the virtual object on a terminal screen includes:
  • the input target display position is received and used as the target display position.
  • An adaptive plane device includes:
  • a plane direction determining module configured to determine a direction of a target plane of a virtual object in a real scene
  • An attitude adjustment module is configured to adjust the placement posture of the virtual object according to the direction, and display the virtual object on a terminal screen, and the displayed virtual object is adapted to the target plane.
  • attitude adjustment module is specifically configured to:
  • the device further includes:
  • a control movement module configured to control the virtual object to move on an initial plane
  • a position determination module is configured to, if it is determined that the position of the virtual object exceeds the initial plane, trigger an operation of determining a direction of a target plane of the virtual object in a real scene.
  • the device further includes:
  • a plane identification module is configured to identify a plurality of planes included in the real scene; and select one plane from the identified planes as the target plane.
  • plane identification module is specifically configured to:
  • the device further includes:
  • a target position determining module configured to determine a target display position of the virtual object on a terminal screen; determine a target position according to the target display position and the target plane;
  • the attitude adjustment module is specifically configured to:
  • target position determination module is specifically configured to:
  • the line is perpendicular to a plane on which the terminal screen is located.
  • target position determination module is specifically configured to:
  • receive the input target display position.
  • An adaptive plane hardware device includes:
  • a memory for storing non-transitory computer-readable instructions; and
  • a processor configured to run the computer-readable instructions, such that the processor, when executing the instructions, implements the steps described in any one of the foregoing technical solutions of the adaptive plane method.
  • A computer-readable storage medium is configured to store non-transitory computer-readable instructions which, when executed by a computer, cause the computer to execute the steps described in any one of the foregoing technical solutions of the adaptive plane method.
  • An adaptive plane terminal includes any of the foregoing adaptive plane devices.
  • Embodiments of the present disclosure provide an adaptive plane method, an adaptive plane device, an adaptive plane hardware device, a computer-readable storage medium, and an adaptive plane terminal.
  • The adaptive plane method includes: determining a direction of a target plane of a virtual object in a real scene; and adjusting a placement posture of the virtual object according to the direction and displaying the virtual object on a terminal screen, where the displayed virtual object is adapted to the target plane.
  • The embodiment of the present disclosure first determines the direction of a target plane of a virtual object in a real scene, then adjusts the posture of the virtual object according to the direction and displays the virtual object on a terminal screen, where the displayed virtual object is adapted to the target plane. This avoids the virtual object floating above the plane or taking an incorrect posture on the plane when moving, and improves the display effect of the terminal.
  • FIG. 1 is a schematic diagram of the posture of a virtual object controlled to reach a target plane according to the prior art;
  • FIG. 2a is a schematic flowchart of an adaptive plane method according to an embodiment of the present disclosure;
  • FIG. 2b is a schematic diagram of controlling the placement posture of a virtual object on a target plane in an adaptive plane method according to an embodiment of the present disclosure;
  • FIG. 2c is a schematic flowchart of an adaptive plane method according to another embodiment of the present disclosure;
  • FIG. 2d is a schematic flowchart of an adaptive plane method according to another embodiment of the present disclosure;
  • FIG. 2e is a schematic diagram of a plane selection state in the adaptive plane method of the embodiment shown in FIG. 2a;
  • FIG. 2f is a schematic diagram of a selected plane state in the adaptive plane method of the embodiment shown in FIG. 2a;
  • FIG. 3a is a schematic structural diagram of an adaptive plane device according to an embodiment of the present disclosure;
  • FIG. 3b is a schematic structural diagram of an adaptive plane device according to another embodiment of the present disclosure;
  • FIG. 4 is a schematic structural diagram of an adaptive plane hardware device according to an embodiment of the present disclosure;
  • FIG. 5 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present disclosure;
  • FIG. 6 is a schematic structural diagram of an adaptive plane terminal according to an embodiment of the present disclosure.
  • The adaptive plane method mainly includes the following steps S1 to S2, wherein:
  • Step S1: Determine the direction of the target plane of the virtual object in the real scene.
  • This embodiment may be performed by an adaptive plane device, an adaptive plane hardware device, or an adaptive plane terminal provided by embodiments of the present disclosure.
  • the virtual object may be a three-dimensional model of the real object in the scene.
  • the target plane is a plane to which a virtual object is to be moved in a real scene
  • the plane is a surface of an entity located in the real scene, and may be, for example, but not limited to, a desktop or a wall.
  • the direction of the target plane can be determined according to the coordinate axis of any direction in the three-dimensional space where the virtual object is located. For example, if the x-axis of the three-dimensional coordinates is used as a reference, the orientation of the target plane can be determined by calculating the angle between the target plane and the x-axis. For example, if the angle is 0, it is determined that the target plane is parallel to the x-axis. If the included angle is an angle greater than 0 and less than 90 degrees, it is determined that the target plane is inclined by a preset angle with respect to the x axis, and if the included angle is 90 degrees, it is determined that the target plane is perpendicular to the x axis.
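The angle-based classification described above can be sketched in code. The following is a minimal illustration, assuming the plane is represented by its unit normal and using the x-axis as the reference; the function name and representation are hypothetical, since the disclosure does not prescribe an implementation.

```python
import numpy as np

def plane_direction(normal, axis=(1.0, 0.0, 0.0)):
    """Classify a plane's orientation relative to a reference axis.

    The angle between a plane and an axis equals 90 degrees minus the
    angle between the plane's normal and that axis.
    """
    n = np.asarray(normal, dtype=float)
    a = np.asarray(axis, dtype=float)
    n /= np.linalg.norm(n)
    a /= np.linalg.norm(a)
    # Angle between the plane itself and the axis, in degrees.
    angle = 90.0 - np.degrees(np.arccos(abs(np.dot(n, a))))
    if np.isclose(angle, 0.0):
        return "parallel"       # plane is parallel to the axis
    if np.isclose(angle, 90.0):
        return "perpendicular"  # plane is perpendicular to the axis
    return "inclined"           # inclined at an angle in (0, 90) degrees
```

For example, a horizontal plane with normal (0, 0, 1) is parallel to the x-axis, while a wall-like plane with normal (1, 0, 0) is perpendicular to it.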
  • Step S2: Adjust the posture of the virtual object according to the direction and display the virtual object on the terminal screen, where the displayed virtual object is adapted to the target plane.
  • the terminal may be, but is not limited to, a mobile terminal (for example, an iPhone, a smart phone, a tablet computer, etc.), or a fixed terminal (for example, a desktop computer).
  • The virtual object can be controlled to rotate and/or zoom, and the virtual object can be displayed on the terminal screen with the displayed virtual object adapted to the target plane.
  • Take FIG. 1 as an example. As can be seen from FIG. 1, the virtual cylindrical object was originally placed vertically on a horizontal plane and then moved to the target plane. Since the target plane is vertical, the virtual cylindrical object can be placed vertically on the vertical plane after being rotated by 90 degrees. The final placement of the virtual cylindrical object is shown in FIG. 2b.
  • The displayed virtual object adapts to the target plane, which avoids the virtual object floating above the plane or taking an incorrect posture when moved, and improves the display effect of the terminal.
  • step S2 may include:
  • In its initial position, the virtual cylindrical object's z-axis is perpendicular to the horizontal plane; the object is then moved to the target plane, as shown in FIG. 1.
  • Since the target plane is vertical, the plane direction has changed and the object's z-axis is no longer perpendicular to the target plane. The virtual cylindrical object therefore needs to be rotated so that its z-axis is perpendicular to the target plane and it stands vertically on the target plane; the final placement of the virtual cylindrical object is shown in FIG. 2b.
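The rotation that makes the object's z-axis perpendicular to the target plane again can be computed, for example, with Rodrigues' rotation formula. The sketch below is one possible implementation under that assumption; the disclosure does not name a specific rotation method.

```python
import numpy as np

def align_rotation(current_up, plane_normal):
    """Rotation matrix turning `current_up` (the object's z-axis) onto
    `plane_normal`, so the object stands perpendicular to the target
    plane (Rodrigues' rotation formula)."""
    a = np.asarray(current_up, dtype=float)
    b = np.asarray(plane_normal, dtype=float)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    v = np.cross(a, b)   # rotation axis (unnormalized)
    c = np.dot(a, b)     # cosine of the rotation angle
    if np.isclose(c, 1.0):
        return np.eye(3)  # already aligned
    if np.isclose(c, -1.0):
        # Opposite directions: rotate 180 degrees about any axis
        # orthogonal to `a`.
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.isclose(np.linalg.norm(axis), 0.0):
            axis = np.cross(a, [0.0, 1.0, 0.0])
        axis /= np.linalg.norm(axis)
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        return np.eye(3) + 2.0 * K @ K
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K / (1.0 + c)
```

Rotating the cylinder of FIG. 1 from a horizontal plane (up vector (0, 0, 1)) to a vertical plane with normal (1, 0, 0) yields exactly the 90-degree rotation described above.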
  • the method in the embodiment of the present disclosure further includes:
  • the initial plane can be selected by the user and is the plane on which the virtual object is initially placed.
  • The virtual object can be controlled to move on the initial plane through the terminal screen or by moving the terminal.
  • The virtual object is always located in the center of the terminal screen, so moving the terminal is equivalent to moving the virtual model.
  • The edge contour position of the initial plane is recorded in advance, and it is then determined whether the position of the virtual object has completely moved beyond the initial plane. This can be implemented with existing plane-edge-contour recognition methods, for example using feature points or texture. Once the position of the virtual object is determined to exceed the initial plane, the direction of the target plane is determined.
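One way to implement the check that the virtual object has moved beyond the recorded edge contour is a standard ray-casting point-in-polygon test in the plane's own 2D coordinates. This sketch is illustrative only; the disclosure leaves the contour recognition itself to existing techniques such as feature points or texture.

```python
def inside_contour(point, contour):
    """Ray-casting point-in-polygon test.

    `contour` is the pre-recorded edge contour of the initial plane as a
    list of (x, y) vertices in the plane's 2D coordinates; `point` is
    the virtual object's position projected into those coordinates.
    Returns False once the object has left the initial plane.
    """
    x, y = point
    inside = False
    n = len(contour)
    for i in range(n):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]
        # Count edges whose crossing with the horizontal ray from
        # `point` lies to the right of the point; an odd count means
        # the point is inside the contour.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

When this test turns False, the operation of determining the target plane's direction would be triggered.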
  • This embodiment determines whether the position of the virtual object exceeds the initial plane; if it does, the operation of determining the direction of the target plane of the virtual object in the real scene is triggered. The placement posture of the virtual object is then adjusted according to that direction and the virtual object is displayed on the terminal screen, adapted to the target plane. This avoids the virtual object floating above the plane or taking an incorrect posture on the plane when moving, and improves the display effect of the terminal.
  • the method in this embodiment may further include a step of determining a target plane:
  • a real scene may include one or more planes.
  • Corresponding algorithms are needed to identify the planes contained in the real scene. This step can be implemented using existing technologies, for example simultaneous localization and mapping (SLAM) algorithms, which are not detailed here.
  • this step can be implemented in the following two ways:
  • a plane is automatically selected as the target plane, that is, a plane is automatically selected as the target plane from the identified planes.
  • In the second way, the user selects the target plane: the identified planes are displayed on the terminal screen and the user selects one of them, for example by clicking, double-clicking, or another preset action; the plane selected by the user is used as the target plane.
  • Planes 1 to 3 recognized in the real scene are displayed in sequence on the terminal screen. If the user wants to display the virtual object on plane 1, the user only needs to click or double-click plane 1 on the terminal screen to complete the selection operation.
  • When plane 1 is selected, it is displayed on the terminal screen according to its placement position in the real scene, as shown in FIG. 2f. The process of determining the initial plane is similar to that of the target plane and is not repeated here.
  • the method in the embodiment of the present disclosure further includes:
  • Step S17: Determine the target display position of the virtual object on the terminal screen.
  • the terminal may be, but is not limited to, a mobile terminal (for example, an iPhone, a smart phone, a tablet computer, etc.), or a fixed terminal (for example, a desktop computer).
  • the target display position is the display position of the virtual object on the terminal screen.
  • step S2 specifically includes:
  • Control the virtual object to move to the target position, adjust the placement posture of the virtual object according to the direction, and display the virtual object on the terminal screen, where the displayed virtual object adapts to the target plane.
  • The target display position can be obtained in the following two ways. The first way: a trigger response generated on the terminal screen is detected, and the position where the trigger response is generated is used as the target display position.
  • the trigger response is a response generated by a trigger operation acting on the terminal screen, and may be, but is not limited to, a click response, a double-click response, or a detected preset gesture action generated for the terminal screen.
  • the location where the trigger response is generated is a point on the corresponding plane of the terminal screen, which can be specifically determined by a sensor arranged on the terminal screen.
  • If the user wants to change the display position of the virtual object on the terminal screen, the user performs an operation on the terminal screen, for example clicking, double-clicking, or making a preset gesture, to determine the display position of the virtual object.
  • a trigger response is generated.
  • The generation position of the trigger response is the display position to which the user wants to move the virtual object, but this display position is not yet the target position of the virtual object in the real scene.
  • The trigger response is therefore used to determine the target position of the virtual object in the real scene, so that the display position of the virtual object on the terminal screen can be accurately located.
  • the second way is to receive the input target display position.
  • The user can input the target display position through the terminal. Since the user's trigger operation on the terminal screen usually covers an area rather than a single point, it is difficult to locate a point precisely with a trigger operation, whereas an input target display position can be located exactly to a point. Compared with a trigger operation on the terminal screen, this embodiment can therefore locate the position of the virtual object more accurately and further improve the display effect of the terminal.
  • Step S18 may include: obtaining a line passing through the point at which the target display position is located, and using the intersection of the line and the target plane as the target position.
  • the line may be a straight line, a ray, or a line segment.
  • the line is perpendicular to the plane where the terminal screen is located.
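The construction above amounts to a standard line-plane intersection: cast a line from the target display position along the screen's normal and take its intersection with the target plane. A minimal sketch with illustrative argument names (the disclosure describes only the geometry, not an API):

```python
import numpy as np

def intersect_target_plane(display_point, screen_normal,
                           plane_point, plane_normal):
    """Intersection of the line through `display_point` with direction
    `screen_normal` (perpendicular to the terminal screen) and the
    target plane given by a point on it and its normal.

    Returns the target position, or None if the line is parallel to
    the target plane and never meets it.
    """
    p0 = np.asarray(display_point, dtype=float)
    d = np.asarray(screen_normal, dtype=float)
    q = np.asarray(plane_point, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    denom = np.dot(n, d)
    if np.isclose(denom, 0.0):
        return None  # line parallel to the target plane
    t = np.dot(n, q - p0) / denom
    return p0 + t * d
```

The None branch corresponds to a degenerate case where the target plane is parallel to the screen normal; in practice a different target plane or display position would then be chosen.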
  • the following is a device embodiment of the present disclosure.
  • the device embodiment of the present disclosure can be used to perform the steps implemented by the method embodiments of the present disclosure.
  • Only parts related to the embodiments of the present disclosure are shown; for specific technical details that are not disclosed, reference is made to the method embodiments of the present disclosure.
  • an embodiment of the present disclosure provides an adaptive plane device.
  • the apparatus may perform the steps in the foregoing embodiment of the adaptive plane method.
  • The device mainly includes a plane direction determination module 21 and an attitude adjustment module 22. The plane direction determination module 21 is configured to determine the direction of a target plane of a virtual object in a real scene; the attitude adjustment module 22 is configured to adjust the posture of the virtual object according to the direction and display the virtual object on the terminal screen, where the displayed virtual object is adapted to the target plane.
  • the virtual object may be a three-dimensional model of the real object in the scene.
  • the target plane is a plane to which a virtual object is to be moved in a real scene
  • the plane is a surface of an entity located in the real scene, and may be, for example, but not limited to, a desktop or a wall.
  • the direction of the target plane can be determined according to the coordinate axis of any direction in the three-dimensional space where the virtual object is located. For example, if the x-axis of the three-dimensional coordinates is used as a reference, the orientation of the target plane can be determined by calculating the angle between the target plane and the x-axis. For example, if the angle is 0, it is determined that the target plane is parallel to the x-axis. If the included angle is an angle greater than 0 and less than 90 degrees, it is determined that the target plane is inclined by a preset angle with respect to the x axis, and if the included angle is 90 degrees, it is determined that the target plane is perpendicular to the x axis.
  • the terminal may be, but is not limited to, a mobile terminal (for example, an iPhone, a smart phone, a tablet computer, etc.), or a fixed terminal (for example, a desktop computer).
  • The attitude adjustment module 22 may control the rotation and/or scaling of the virtual object and display the virtual object on the terminal screen, where the displayed virtual object adapts to the target plane.
  • As can be seen from FIG. 1, the virtual cylindrical object was originally placed vertically on a horizontal plane and then moved to the target plane. Since the target plane is vertical, the virtual cylindrical object can be placed vertically on the vertical plane after being rotated by 90 degrees. The final placement of the virtual cylindrical object is shown in FIG. 2b.
  • In this embodiment, the plane direction determination module 21 determines the direction of the target plane of the virtual object in the real scene; the attitude adjustment module 22 then adjusts the posture of the virtual object according to the direction and displays the virtual object on the terminal screen, where the displayed virtual object is adapted to the target plane. This prevents the virtual object from floating above the plane or taking an incorrect posture on the plane when moving, and improves the display effect of the terminal.
  • the attitude adjustment module 22 is specifically configured to:
  • In its initial position, the virtual cylindrical object's z-axis is perpendicular to the horizontal plane; the object is then moved to the target plane, as shown in FIG. 1.
  • Since the target plane is vertical, the plane direction has changed and the object's z-axis is no longer perpendicular to the target plane. The virtual cylindrical object therefore needs to be rotated so that its z-axis is perpendicular to the target plane and it stands vertically on the target plane; the final placement of the virtual cylindrical object is shown in FIG. 2b.
  • The device further includes a control movement module 23 and a position determination module 24. The control movement module 23 is configured to control the movement of the virtual object on the initial plane; the position determination module 24 is configured to trigger, if it is determined that the position of the virtual object exceeds the initial plane, an operation of determining the direction of the target plane of the virtual object in the real scene.
  • the initial plane can be selected by the user and is the plane on which the virtual object is initially placed.
  • The control movement module 23 can control the movement of the virtual object on the initial plane through the terminal screen or by moving the terminal.
  • The virtual object is always located in the center of the terminal screen, so moving the terminal is equivalent to moving the virtual model.
  • The position determination module 24 may record the edge contour position of the initial plane in advance and then determine whether the position of the virtual object has completely moved beyond the initial plane. This can be implemented with existing plane-edge-contour recognition methods, for example using feature points or texture. Once the position of the virtual object is determined to exceed the initial plane, the direction of the target plane is determined.
  • the device further includes: a plane identification module; the plane identification module is used to identify a plane included in the real scene; and a plane is selected from the identified planes as a target plane.
  • a real scene may include one or more planes.
  • Corresponding algorithms are required to identify the planes contained in the real scene. This step can be implemented using existing technologies, for example simultaneous localization and mapping (SLAM) algorithms, which are not detailed here.
  • the plane identification module can be implemented in the following two ways:
  • a plane is automatically selected as the target plane, that is, a plane is automatically selected as the target plane from the identified planes.
  • In the second way, the user selects the target plane: the identified planes are displayed on the terminal screen and the user selects one of them, for example by clicking, double-clicking, or another preset action; the plane selected by the user is used as the target plane.
  • The plane recognition module displays planes 1 to 3 recognized in the real scene in sequence on the terminal screen. If the user wants to display the virtual object on plane 1, the user only needs to click or double-click plane 1 on the terminal screen to complete the selection. When plane 1 is selected, it is displayed on the terminal screen according to its placement position in the real scene, as shown in FIG. 2f.
  • the process of determining the initial plane is similar to that of the target plane, and is not repeated here.
  • The device further includes a target position determination module configured to determine a target display position of the virtual object on the terminal screen and to determine the target position according to the target display position and the target plane.
  • Correspondingly, the posture adjustment module 22 is specifically configured to control the virtual object to move to the target position, adjust the placement posture of the virtual object according to the direction, and display the virtual object on the terminal screen, where the displayed virtual object adapts to the target plane.
  • the terminal may be, but is not limited to, a mobile terminal (for example, an iPhone, a smart phone, a tablet computer, etc.), or a fixed terminal (for example, a desktop computer).
  • the target display position is the display position of the virtual object on the terminal screen.
  • the attitude adjustment module 22 is specifically configured to control the virtual object to move to the target position, adjust the placement posture of the virtual object according to the direction, display the virtual object on the terminal screen, and the displayed virtual object adapts to the target plane.
  • the target position determination module is specifically configured to detect a trigger response generated on the screen of the terminal, and use the generation position of the trigger response as a target display position.
  • The trigger response is a response generated by a trigger operation acting on the terminal screen, and may be, but is not limited to, a click response, a double-click response, or a detected preset gesture action performed on the terminal screen.
  • the location where the trigger response is generated is a point on the corresponding plane of the terminal screen, which can be specifically determined by a sensor arranged on the terminal screen.
  • If the user wants to change the display position of the virtual object on the terminal screen, the user performs an operation on the terminal screen, for example clicking, double-clicking, or making a preset gesture, to determine the display position of the virtual object.
  • a trigger response is generated.
  • The generation position of the trigger response is the display position to which the user wants to move the virtual object, but this display position is not yet the target position of the virtual object in the real scene.
  • The trigger response is therefore used to determine the target position of the virtual object in the real scene, so that the display position of the virtual object on the terminal screen can be accurately located.
  • the target position determination module is specifically configured to receive an input target display position.
  • the user can input the target display position through the terminal. Since the user's trigger operation on the terminal screen usually covers an area rather than a single point, it is difficult to locate a precise point, whereas an input target display position can be located exactly at a point. Compared with a trigger operation on the terminal screen, this embodiment can therefore locate the position of the virtual object more accurately and further improve the display effect of the terminal.
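The disclosure does not say how a touched area would be collapsed to one point when a trigger operation is used; a simple illustrative choice (an assumption, not from the patent) is the centroid of the sampled contact points, which also shows why a fingertip is inherently less precise than a typed coordinate:

```python
def touch_centroid(contact_points):
    """Collapse a touched region (sampled contact points, in screen
    pixels) to a single screen point: the centroid of the samples."""
    n = len(contact_points)
    return (sum(x for x, _ in contact_points) / n,
            sum(y for _, y in contact_points) / n)

# A fingertip covers several pixels; the centroid picks one point.
point = touch_centroid([(10.0, 10.0), (14.0, 10.0), (12.0, 14.0)])
# point == (12.0, 34/3)
```

Any jitter in the sampled contacts moves the centroid, whereas an explicitly input coordinate has no such spread.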
  • the target position determination module is specifically configured to: obtain a line passing through a point at which the target display position is located; and use the intersection of the line and the target plane as the target position.
  • the line is perpendicular to the plane where the terminal screen is located.
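The two bullets above describe a standard line–plane intersection: cast a line through the target display position, perpendicular to the screen, and take where it meets the target plane. A minimal sketch of that computation (coordinate values and helper names are illustrative, not from the patent):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def intersect_line_plane(p, d, q, n, eps=1e-9):
    """Intersection of the line through point p with direction d and the
    plane through point q with normal n; None if the line is parallel."""
    denom = dot(d, n)
    if abs(denom) < eps:
        return None
    t = dot(tuple(qi - pi for qi, pi in zip(q, p)), n) / denom
    return tuple(pi + t * di for pi, di in zip(p, d))

# The tap point unprojects to (0, 0, 0); the line runs along the screen
# normal (0, 0, 1); the target plane passes through (0, 0, 5) with
# normal (0, 0, 1).
hit = intersect_line_plane((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                           (0.0, 0.0, 5.0), (0.0, 0.0, 1.0))
# hit == (0.0, 0.0, 5.0) -- the target position in the real scene
```

The parallel-line guard matters in practice: when the screen normal lies in the target plane, no unique target position exists and the placement should be rejected.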
  • FIG. 4 is a block diagram illustrating an adaptive plane hardware device according to an embodiment of the present disclosure.
  • an adaptive plane hardware device 30 according to an embodiment of the present disclosure includes a memory 31 and a processor 32.
  • the memory 31 is configured to store non-transitory computer-readable instructions.
  • the memory 31 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory.
  • the volatile memory may include, for example, a random access memory (RAM) and/or a cache memory.
  • the non-volatile memory may include, for example, a read-only memory (ROM), a hard disk, a flash memory, and the like.
  • the processor 32 may be a central processing unit (CPU) or another form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the hardware device 30 of the adaptive plane to perform desired functions.
  • the processor 32 is configured to run the computer-readable instructions stored in the memory 31, so that the hardware device 30 of the adaptive plane performs all or part of the steps of the adaptive plane method of the foregoing embodiments of the present disclosure.
  • this embodiment may also include well-known structures such as a communication bus and an interface; these well-known structures should also fall within the protection scope of the present disclosure.
  • FIG. 5 is a schematic diagram illustrating a computer-readable storage medium according to an embodiment of the present disclosure.
  • a computer-readable storage medium 40 according to an embodiment of the present disclosure stores non-transitory computer-readable instructions 41 thereon.
  • when the non-transitory computer-readable instructions 41 are executed by a processor, all or part of the steps of the adaptive plane method of the foregoing embodiments of the present disclosure are performed.
  • the computer-readable storage medium 40 includes, but is not limited to, optical storage media (for example, CD-ROM and DVD), magneto-optical storage media (for example, MO), magnetic storage media (for example, magnetic tape or removable hard disk), rewritable non-volatile memory media (for example, memory cards), and media with built-in ROM (for example, ROM cartridges).
  • FIG. 6 is a schematic diagram illustrating a hardware structure of a terminal according to an embodiment of the present disclosure. As shown in FIG. 6, the adaptive plane terminal 50 includes the adaptive plane device of the foregoing embodiment.
  • the terminal may be implemented in various forms, and the terminal in the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), navigation devices, on-board terminals, on-board display terminals, and on-board electronic rear-view mirrors, and fixed terminals such as digital TVs, desktop computers, and the like.
  • the terminal may further include other components.
  • the adaptive plane terminal 50 may include a power supply unit 51, a wireless communication unit 52, an A/V (audio/video) input unit 53, a user input unit 54, a sensing unit 55, an interface unit 56, a controller 57, an output unit 58, a memory 59, and so on.
  • FIG. 6 illustrates a terminal having various components, but it should be understood that implementing all of the illustrated components is not required; more or fewer components may be implemented instead.
  • the wireless communication unit 52 allows radio communication between the terminal 50 and a wireless communication system or network.
  • the A / V input unit 53 is used to receive audio or video signals.
  • the user input unit 54 may generate key input data according to a command input by the user to control various operations of the terminal.
  • the sensing unit 55 detects the current state of the terminal 50, the position of the terminal 50, the presence or absence of a user's touch input to the terminal 50, the orientation of the terminal 50, the acceleration or deceleration movement and direction of the terminal 50, and the like, and generates commands or signals for controlling the operation of the terminal 50.
  • the interface unit 56 functions as an interface through which at least one external device can be connected to the terminal 50.
  • the output unit 58 is configured to provide an output signal in a visual, audio, and/or tactile manner.
  • the memory 59 may store software programs and the like for processing and control operations performed by the controller 57, or may temporarily store data that has been output or is to be output.
  • the memory 59 may include at least one type of storage medium.
  • the terminal 50 may cooperate with a network storage device that performs a storage function of the memory 59 through a network connection.
  • the controller 57 generally controls the overall operation of the terminal.
  • the controller 57 may include a multimedia module for reproducing or playing back multimedia data.
  • the controller 57 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images.
  • the power supply unit 51 receives external power or internal power under the control of the controller 57 and provides appropriate power required to operate each element and component.
  • Various embodiments of the adaptive plane approach proposed by the present disclosure may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof.
  • various embodiments of the adaptive plane method proposed by the present disclosure can be implemented by using application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, or electronic units designed to perform the functions described herein.
  • Various embodiments of the adaptive plane method proposed by the present disclosure may be implemented in the controller 57.
  • various embodiments of the adaptive plane method proposed by the present disclosure may be implemented with separate software modules that allow performing at least one function or operation.
  • the software codes may be implemented by a software application (or program) written in any suitable programming language, and the software codes may be stored in the memory 59 and executed by the controller 57.
  • an "or" used in an enumeration of items beginning with "at least one of" indicates a disjunctive enumeration, such that, for example, an enumeration of "at least one of A, B, or C" means A or B or C, or AB or AC or BC, or ABC (i.e., A and B and C).
  • the word "exemplary” does not mean that the described example is preferred or better than other examples.
  • each component or each step can be disassembled and/or recombined.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a plane adaptation method, a plane adaptation device, a plane adaptation hardware device, and a computer-readable storage medium. The plane adaptation method comprises the steps of: determining the direction of a virtual object in a target plane in a real scene (S1); and adjusting, according to the direction, a placement posture of the virtual object, and displaying the virtual object on a terminal screen, the displayed virtual object adapting to the target plane (S2). With this method, the direction of a virtual object in a target plane in a real scene is first determined, the placement posture of the virtual object is then adjusted according to the direction, and the virtual object is displayed on a terminal screen, with the displayed virtual object adapting to the target plane; this avoids situations in which a virtual object hangs above a plane, or is in an incorrect posture in a plane during movement, thereby improving the display effect of a terminal.
PCT/CN2019/073080 2018-08-09 2019-01-25 Procédé et dispositif d'adaptation de plan, et support de stockage lisible par ordinateur WO2020029556A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810900637.4 2018-08-09
CN201810900637.4A CN110827412A (zh) 2018-08-09 2018-08-09 自适应平面的方法、装置和计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2020029556A1 true WO2020029556A1 (fr) 2020-02-13

Family

ID=69413913

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/073080 WO2020029556A1 (fr) 2018-08-09 2019-01-25 Procédé et dispositif d'adaptation de plan, et support de stockage lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN110827412A (fr)
WO (1) WO2020029556A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023065949A1 (fr) * 2021-10-20 2023-04-27 腾讯科技(深圳)有限公司 Procédé et appareil de commande d'objet dans une scène virtuelle, dispositif terminal, support de stockage lisible par ordinateur et produit programme informatique

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118142157A (zh) * 2022-12-07 2024-06-07 腾讯科技(深圳)有限公司 虚拟物体的放置方法、装置、设备、介质及产品

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11272892A (ja) * 1998-03-24 1999-10-08 Rekusaa Research:Kk 三次元空間表示システムにおける物体移動配置装置及び方法
CN102722908A (zh) * 2012-05-25 2012-10-10 任伟峰 一种在三维虚拟现实场景中的物体空间摆位方法及装置
CN105825499A (zh) * 2016-03-09 2016-08-03 京东方科技集团股份有限公司 基准平面的确定方法和确定系统
JP2017084323A (ja) * 2015-10-22 2017-05-18 キヤノン株式会社 情報処理装置、方法及びプログラム
CN108052253A (zh) * 2017-12-28 2018-05-18 灵图互动(武汉)科技有限公司 一种虚拟现实展示内容制作方法

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9020825B1 (en) * 2012-09-25 2015-04-28 Rawles Llc Voice gestures
JP2014191718A (ja) * 2013-03-28 2014-10-06 Sony Corp 表示制御装置、表示制御方法および記録媒体
CN107665505B (zh) * 2016-07-29 2021-04-06 成都理想境界科技有限公司 基于平面检测实现增强现实的方法及装置
CN107665506B (zh) * 2016-07-29 2021-06-01 成都理想境界科技有限公司 实现增强现实的方法及系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11272892A (ja) * 1998-03-24 1999-10-08 Rekusaa Research:Kk 三次元空間表示システムにおける物体移動配置装置及び方法
CN102722908A (zh) * 2012-05-25 2012-10-10 任伟峰 一种在三维虚拟现实场景中的物体空间摆位方法及装置
JP2017084323A (ja) * 2015-10-22 2017-05-18 キヤノン株式会社 情報処理装置、方法及びプログラム
CN105825499A (zh) * 2016-03-09 2016-08-03 京东方科技集团股份有限公司 基准平面的确定方法和确定系统
CN108052253A (zh) * 2017-12-28 2018-05-18 灵图互动(武汉)科技有限公司 一种虚拟现实展示内容制作方法

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023065949A1 (fr) * 2021-10-20 2023-04-27 腾讯科技(深圳)有限公司 Procédé et appareil de commande d'objet dans une scène virtuelle, dispositif terminal, support de stockage lisible par ordinateur et produit programme informatique

Also Published As

Publication number Publication date
CN110827412A (zh) 2020-02-21

Similar Documents

Publication Publication Date Title
US11017580B2 (en) Face image processing based on key point detection
US11443453B2 (en) Method and device for detecting planes and/or quadtrees for use as a virtual substrate
WO2020019663A1 (fr) Procédé et appareil de génération d'effet spécial basé sur le visage, et dispositif électronique associé
WO2019141100A1 (fr) Procédé et dispositif pour afficher un objet supplémentaire, dispositif informatique et support de stockage
US9304591B2 (en) Gesture control
WO2020019666A1 (fr) Procédé de suivi de visages multiples pour effet spécial facial, appareil et dispositif électronique
US8769409B2 (en) Systems and methods for improving object detection
US20120036485A1 (en) Motion Driven User Interface
WO2020029554A1 (fr) Procédé et dispositif d'interaction d'animation de modèle multiplan de réalité augmentée, appareil et support d'informations
US10191612B2 (en) Three-dimensional virtualization
US10649616B2 (en) Volumetric multi-selection interface for selecting multiple objects in 3D space
US20110261048A1 (en) Electronic device and method for displaying three dimensional image
WO2020019664A1 (fr) Procédé et appareil de génération d'image déformée sur la base du visage humain
US10649615B2 (en) Control interface for a three-dimensional graphical object
US10606360B2 (en) Three-dimensional tilt and pan navigation using a single gesture
WO2016179912A1 (fr) Procédé et appareil de commande de programme d'application, et terminal mobile
WO2020029556A1 (fr) Procédé et dispositif d'adaptation de plan, et support de stockage lisible par ordinateur
KR101949493B1 (ko) 멀티미디어 콘텐츠의 재생을 제어하기 위한 방법 및 시스템
US11755119B2 (en) Scene controlling method, device and electronic equipment
WO2020029555A1 (fr) Procédé et dispositif de commutation en continu parmi des plans et support d'informations lisible par ordinateur
US20160224134A1 (en) Display apparatus and control method thereof
CN110825280A (zh) 控制虚拟物体位置移动的方法、装置和计算机可读存储介质
US9898183B1 (en) Motions for object rendering and selection
CN110827413A (zh) 控制虚拟物体形态改变的方法、装置和计算机可读存储介质
US20190156792A1 (en) Method and system for adjusting display content and head-mounted display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19847553

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.05.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19847553

Country of ref document: EP

Kind code of ref document: A1