CN111798573A - Electronic fence boundary position determination method and device and VR equipment

Info

Publication number
CN111798573A
Authority
CN
China
Prior art keywords: electronic fence, determining, boundary, boundary position, fence
Prior art date
Legal status
Granted
Application number
CN202010931912.6A
Other languages
Chinese (zh)
Other versions
CN111798573B (en)
Inventor
周延献
Current Assignee
Nanjing Qiyuan Technology Co.,Ltd.
Original Assignee
Nanjing Iqiyi Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Iqiyi Intelligent Technology Co Ltd
Priority to CN202010931912.6A
Publication of CN111798573A
Application granted
Publication of CN111798573B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02: Services making use of location information
    • H04W4/021: Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences


Abstract

The invention discloses a method and an apparatus for determining the boundary position of an electronic fence, and a VR device. The method is applied to a VR device that includes a camera device and a controller; the camera device is used to collect scene image information, and the controller is used to interact with a display interface while the scene image information is displayed. The method comprises the following steps: determining a reference plane according to the scene image information, wherein the reference plane is the plane at which the goodness of fit between the display interface and the ground plane reaches a preset value; when the controller points to the boundary of each object in the display interface, taking the intersection point of the ray along which the controller lies and the reference plane as a boundary reference coordinate; and determining the electronic fence boundary position according to each boundary reference coordinate and the fence grid height. With this technical scheme, the boundary position of the electronic fence of the VR device can be accurately determined from the real scene, improving the safety of a user wearing the VR device.

Description

Electronic fence boundary position determination method and device and VR equipment
Technical Field
The invention relates to the field of virtual reality, in particular to a method and a device for determining the boundary position of an electronic fence and VR equipment.
Background
With the rapid development of science and technology, VR devices are being applied more and more widely. Interpersonal interaction through a VR device greatly improves the realism of a scene and the user's sense of immersion.
Existing electronic fences usually have a fixed shape, for example a shape (circle, rectangle, square, etc.) defined in advance rather than drawn according to the real scene in which the VR device is located. A fixed electronic fence cannot form diversified virtual fences according to actual demands, and in a VR usage scene, if an object lies inside the electronic fence, the user's activities within the fence carry potential safety hazards.
Disclosure of Invention
In view of the foregoing problems, an object of the embodiments of the present invention is to provide a method and an apparatus for determining a boundary position of an electronic fence, and a VR device, so as to solve the deficiencies of the prior art.
According to an embodiment of the invention, a method for determining the boundary position of an electronic fence is provided, which is applied to VR equipment, where the VR equipment includes a camera device and a controller, the camera device is used to collect scene image information, and the controller is used to interact with a display interface when the scene image information is displayed; the electronic fence boundary position determining method comprises the following steps:
determining a reference plane according to the scene image information, wherein the reference plane is a corresponding plane when the goodness of fit between the display interface and the ground plane reaches a preset value;
under the condition that the controller points to the boundary of each object in the display interface, taking the intersection point of the ray where the controller is located and the reference plane as a boundary reference coordinate;
and determining the boundary position of the electronic fence according to the reference coordinates of each boundary and the height of the fence grid.
In the method for determining the boundary position of the electronic fence, the method further includes:
and drawing the electronic fence based on the electronic fence boundary position.
In the method for determining the boundary position of the electronic fence, the electronic fence is operated in the VR device in a service mode.
In the method for determining the boundary position of the electronic fence, the method further includes:
calculating a first distance according to the current position information of the user and the vertical direction information of the reference plane;
determining the boundary position of the electronic fence to be calculated pointed by the user according to the current orientation information of the user;
calculating a second distance between the user and the boundary position of the electronic fence according to the current position information of the user and the boundary position of the electronic fence to be calculated;
and sending out first prompt information of reaching the boundary position of the electronic fence under the condition that the first distance exceeds the second distance.
In the method for determining the boundary position of the electronic fence, the method further includes:
determining a transparency based on a difference of the first distance and the second distance;
displaying the electronic fence based on the transparency.
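The transparency step above can be sketched as follows. This is a minimal illustration only: the linear fade and the `fade_range` tuning parameter are assumptions, since the patent does not specify how the difference of the two distances maps to a transparency value.

```python
def fence_alpha(first_distance, second_distance, fade_range=1.0):
    """Map the gap between the first and second distances to an alpha
    value in [0, 1]: fully opaque once the user has reached the boundary
    (gap >= 0), fading out as the user moves away from it.
    fade_range is an assumed tuning parameter, not from the patent."""
    gap = first_distance - second_distance
    alpha = 1.0 + gap / fade_range  # gap == 0 -> 1.0; gap == -fade_range -> 0.0
    return max(0.0, min(1.0, alpha))
```

The fence is then drawn with this alpha, so it becomes more visible the closer the user gets to the boundary.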
In the method for determining the boundary position of the electronic fence, the acquired subsequent scene image information is compared with the current scene image information corresponding to the electronic fence. If the subsequent scene image information has changed, all steps from determining the reference plane according to the scene image information to determining the electronic fence boundary position according to the boundary reference coordinates and the fence height are executed again, and second prompt information indicating that the real environment has changed is issued.
In the method for determining the boundary position of the electronic fence, determining that the subsequent scene image information has changed includes:
identifying the object shape of the ground plane according to the difference result of the subsequent scene image information and the current scene image information corresponding to the electronic fence;
determining whether the object exists in the electronic fence according to the shape and the position of the object;
and if so, determining that the subsequent scene image information is changed.
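As a rough illustration of comparing subsequent scene image information with the current scene image information, the sketch below flags a change when the mean per-pixel difference exceeds a threshold. The function name, the NumPy frame representation, and the threshold value are all assumptions; the patent's actual check goes further, identifying object shapes on the ground plane from the difference result and testing whether they fall inside the fence.

```python
import numpy as np

def scene_changed(current_frame, subsequent_frame, threshold=0.05):
    """Crude sketch of a scene-change check: compare two frames
    (uint8 image arrays of equal shape) and report a change when the
    mean absolute per-pixel difference exceeds an assumed threshold."""
    diff = np.abs(subsequent_frame.astype(float) - current_frame.astype(float)) / 255.0
    return bool(diff.mean() > threshold)
```

In a full implementation the difference region would be segmented into object shapes and tested against the fence boundary polygon before re-running the boundary determination steps.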
According to another embodiment of the invention, an electronic fence boundary position determining apparatus is provided, which is applied to a VR device, where the VR device includes a camera and a controller, the camera is configured to acquire scene image information, and the controller is configured to interact with a display interface when the scene image information is displayed; the electronic fence boundary position determining device comprises:
the first determining module is used for determining a reference plane according to the scene image information, wherein the reference plane is the corresponding plane when the goodness of fit between the display interface and the ground plane reaches a preset value;
the second determining module is used for taking the intersection point of the ray where the controller is located and the reference plane as a boundary reference coordinate under the condition that the controller points to the boundary of each object in the display interface;
the third determining module is used for determining the boundary position of the electronic fence according to the reference coordinates of each boundary and the height of the fence grid;
the device is further used for calculating a first distance according to the current position information of the user and the vertical direction information of the reference plane;
determining the boundary position of the electronic fence to be calculated pointed by the user according to the current orientation information of the user;
calculating a second distance between the user and the boundary position of the electronic fence according to the current position information of the user and the boundary position of the electronic fence to be calculated;
and sending out first prompt information of reaching the boundary position of the electronic fence under the condition that the first distance exceeds the second distance.
According to yet another embodiment of the present invention, there is provided a VR device including a memory for storing a computer program and a processor for executing the computer program to cause the VR device to perform the method for determining fence boundary positions described above.
According to yet another embodiment of the invention, a computer readable storage medium is provided, which stores the computer program for use in the VR device.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
the invention relates to a method and a device for determining the boundary position of an electronic fence and VR equipment.A camera device is used for collecting scene image information, a display interface for displaying the scene image information by the VR equipment is used for determining a reference plane, and the reference plane is arranged according to a ground plane and is used as a datum plane of the electronic fence; and then the controller points to the boundary of each object in the display interface, the intersection point of the boundary of each object and the reference plane is determined as a boundary reference coordinate, and the boundary position of the electronic fence can be determined according to the boundary reference coordinate subsequently, so that the boundary position of the electronic fence is determined according to each object in the scene image information acquired by the camera device in real time, the accuracy of the boundary position of the electronic fence is improved, and the safety of activities in the electronic fence is also improved under the condition that a user wears VR equipment.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings needed in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention, and therefore should not be considered as limiting the scope of the present invention, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a flowchart illustrating a method for determining a boundary position of an electronic fence according to a first embodiment of the present invention;
fig. 2 is a schematic diagram of a framework of an electronic fence and VR application coexistence scheme according to a first embodiment of the present invention;
FIG. 3 is a schematic diagram showing a space of boundary reference coordinates provided by a first embodiment of the present invention;
fig. 4 is a schematic spatial diagram illustrating a boundary position of an electronic fence according to a first embodiment of the present invention;
fig. 5 is a flowchart illustrating an electronic fence boundary position determining method according to a second embodiment of the present invention;
fig. 6 is a flowchart illustrating a method for determining a boundary position of an electronic fence according to a third embodiment of the present invention;
fig. 7 is a flowchart illustrating a method for determining a boundary position of an electronic fence according to a fourth embodiment of the present invention;
fig. 8 is a flowchart illustrating an electronic fence boundary position determining method according to a fifth embodiment of the present invention;
fig. 9 is a schematic structural diagram illustrating an electronic fence boundary position determining apparatus according to a sixth embodiment of the present invention.
Description of the main element symbols:
600 - electronic fence boundary position determining apparatus; 610 - first determining module; 620 - second determining module; 630 - third determining module.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
Fig. 1 is a flowchart illustrating a method for determining a boundary position of an electronic fence according to a first embodiment of the present invention.
The method for determining the boundary position of the electronic fence is applied to VR equipment, the VR equipment comprises a camera device and a controller, the camera device is used for collecting scene image information, and the controller is used for interacting with a display interface under the condition that the scene image information is displayed.
Specifically, the VR device includes at least a camera device, a controller, and a display screen. The camera device may specifically be a camera, which collects scene image information while the user wears the VR device. The display screen displays the scene image information collected by the camera on a display interface. The controller may be a handle; in the VR device, the handle is used to interact with the display interface. The interaction may specifically include adjusting the display interface, such as enlarging it, reducing it, and moving it, where moving the display interface may further include pulling it upward, pressing it downward, and moving it left, right, up, or down. Of course, the interaction between the handle and the display interface includes not only moving the display interface but also interacting with the VR application it displays, such as sending an instruction, receiving an instruction, or touching a tag or an option in the display interface.
The electronic fence boundary position determining method comprises the following steps:
in step S110, a reference plane is determined from the scene image information.
Specifically, when a user wears the VR device, the display screen presents image information on a display interface, which serves as a rendered layer; through the scene image information collected by the camera device, the user can see the external real scene. At this time the VR device provides a relatively large plane at the user's chest position: this plane is the display interface. It is drawn with a 3D graphics drawing tool, may be given a semi-transparent color, and is perpendicular to the Y axis of the world coordinate system.
The user presses downward with the controller so that the goodness of fit between the display interface and the ground plane reaches a preset value; the higher the goodness of fit, the more accurate the determined electronic fence boundary position.
Specifically, the goodness of fit refers to how closely the display interface coincides with the ground plane as the controller presses down to adjust the spatial position of the display interface. It is determined by how far the controller is pressed.
When the goodness of fit reaches the preset value, the plane in which the display interface lies is set as the reference plane, which is subsequently used as the datum plane when determining the electronic fence boundary position.
The reference plane is a plane where the display interface is located under the condition that the goodness of fit between the display interface and the ground plane reaches a preset value.
As shown in fig. 3, the gray plane is the reference plane, whose plane center coordinates are (0, Y, 0), where Y is the value on the Y axis of the world coordinate system, i.e., the distance of the reference plane from the origin of the world coordinate system along Y. The plane normal coordinates of the reference plane are (0, 1, 0).
In step S120, when the controller points to the boundary of each object in the display interface, an intersection point of the ray where the controller is located and the reference plane is used as a boundary reference coordinate.
Specifically, when determining the boundary reference coordinates of the electronic fence, the user controls the direction of the controller so that it points to the boundary of each object in the display interface. Each time the controller points at a boundary point, the ray along which the controller lies intersects the reference plane, and the intersection coordinate is taken as a boundary reference coordinate.
When determining the boundary reference coordinates of the electronic fence, the plurality of boundary reference coordinates on the reference plane are obtained by adjusting the direction of the controller.
Specifically, let n denote the normal vector of the reference plane, v the ray vector along which the handle (controller) lies, and w the vector formed by the ray origin and the center coordinate of the reference plane. The movement amount ratio t can then be expressed by the following equation:

t = (n · w) / (n · v)

The intersection coordinate is obtained by moving the starting point of the ray along v by the ratio t; the resulting end point coordinate is the intersection coordinate.
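The intersection computation above can be sketched as follows. This is a standard ray-plane intersection, with t as the movement amount ratio; NumPy and the function and parameter names are assumptions for illustration.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_center, plane_normal):
    """Intersect the controller ray with the reference plane.
    Returns the intersection point, or None if the ray is parallel to
    the plane or the plane lies behind the controller."""
    n = np.asarray(plane_normal, dtype=float)
    v = np.asarray(direction, dtype=float)
    o = np.asarray(origin, dtype=float)
    w = np.asarray(plane_center, dtype=float) - o  # ray origin -> plane center
    denom = n.dot(v)
    if abs(denom) < 1e-9:   # ray parallel to the plane: no intersection
        return None
    t = n.dot(w) / denom    # movement amount ratio
    if t < 0:               # plane is behind the controller
        return None
    return o + t * v        # boundary reference coordinate

# Example: reference plane centred at (0, Y, 0) with normal (0, 1, 0), Y = -1.5,
# handle at the origin pointing down toward the floor (values are illustrative).
point = ray_plane_intersection(
    origin=(0.0, 0.0, 0.0),
    direction=(0.5, -1.0, 0.5),
    plane_center=(0.0, -1.5, 0.0),
    plane_normal=(0.0, 1.0, 0.0),
)
```

Repeating this for each direction the controller points in yields the set of boundary reference coordinates on the reference plane.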
In step S130, the electronic fence boundary position is determined according to the boundary reference coordinates and the fence height.
Specifically, the fence height can be predetermined; it determines the density of the lines in the electronic fence and can be chosen according to user or scene requirements.
For example, if the fence height is set to Num and the boundary reference coordinates are (x1, Y, z1), (x2, Y, z2), (x3, Y, z3), (x4, Y, z4), ..., then the fence boundary coordinates are likewise a plurality: (x1, Y + Num, z1), (x2, Y + Num, z2), (x3, Y + Num, z3), (x4, Y + Num, z4), ....
Further, a hierarchy of fence grid levels may be provided, representing the height of the fence relative to the reference plane; it may likewise be determined according to user or scene requirements.
For example, if the fence height is set to Num, the boundary reference coordinates are (x1, Y, z1), (x2, Y, z2), (x3, Y, z3), (x4, Y, z4), ..., and the fence has 3 levels, then the first-level fence boundary coordinates are (x1, Y + Num, z1), (x2, Y + Num, z2), (x3, Y + Num, z3), (x4, Y + Num, z4), ...; the second-level fence boundary coordinates are (x1, Y + 2Num, z1), (x2, Y + 2Num, z2), (x3, Y + 2Num, z3), (x4, Y + 2Num, z4), ...; and the third-level fence boundary coordinates are (x1, Y + 3Num, z1), (x2, Y + 3Num, z2), (x3, Y + 3Num, z3), (x4, Y + 3Num, z4), ....
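The lifting of boundary reference coordinates into per-level fence boundary coordinates can be sketched as below. The per-level offset of k × fence_height is an assumption for illustration (the text only specifies Y + Num for the first level), and the names are hypothetical.

```python
def fence_boundary_coordinates(reference_coords, fence_height, levels=1):
    """Lift each boundary reference coordinate (x, Y, z) on the reference
    plane by multiples of the fence grid height, producing one ring of
    fence boundary coordinates per level.
    Assumption: level k sits at Y + k * fence_height."""
    rings = []
    for k in range(1, levels + 1):
        rings.append([(x, y + k * fence_height, z) for (x, y, z) in reference_coords])
    return rings

# Example: four reference coordinates on a reference plane at Y = -1.5
refs = [(1.0, -1.5, 1.0), (-1.0, -1.5, 1.0), (-1.0, -1.5, -1.0), (1.0, -1.5, -1.0)]
rings = fence_boundary_coordinates(refs, fence_height=0.5, levels=3)
```

Each ring can then be rendered as one horizontal band of the fence grid.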
The middle polygon shown in fig. 4 is the electronic fence boundary position determined from the current real scene.
Further, the electronic fence operates in the VR device in a service mode.
Specifically, in some operating systems, such as Android, VR applications are displayed full screen and only a single foreground application is supported; that is, only one VR application may be displayed in the foreground of the VR device at a time. When switching among several VR applications, application and process lifecycle management comes into play: the previous application releases its resources and the newly opened application loads its own. Since the resource load of a VR application is generally large, switching takes a relatively long time.
To reduce resource-loading and switching time, the electronic fence runs in the VR device as a service rather than as a foreground full-screen application. There is therefore no switching between the electronic fence process and other processes, the warning function of the electronic fence is guaranteed to work at all times, the problem of long VR application switching times is solved, and potential safety hazards are avoided.
Specifically, the electronic fence can be implemented in an electronic fence service. The service can be started at device boot and can obtain the pose information (position and orientation) of the relevant camera and the VR device, so that the VR device can draw and display the electronic fence.
Further, the electronic fence may be run in a service mode resident.
As shown in fig. 2, the electronic fence service process can run continuously, independent of the VR application in the foreground; for example, while the electronic fence service process is running, the user can freely switch the foreground VR application among the application A process, the application B process, and so on. This design is highly portable, is independent of the development of any specific VR application, and gives a more efficient and decoupled code structure.
Example 2
Fig. 5 is a flowchart illustrating a method for determining a boundary position of an electronic fence according to a second embodiment of the present invention.
The method for determining the boundary position of the electronic fence is applied to VR equipment, the VR equipment comprises a camera device and a controller, the camera device is used for collecting scene image information, and the controller is used for interacting with a display interface under the condition that the scene image information is displayed.
The electronic fence boundary position determining method comprises the following steps:
in step S210, a reference plane is determined from the scene image information.
This step is the same as step S110, and is not described herein again.
In step S220, when the controller points to the boundary of each object in the display interface, an intersection point of the ray where the controller is located and the reference plane is used as a boundary reference coordinate.
This step is the same as step S120, and is not described herein again.
In step S230, the electronic fence boundary position is determined according to the boundary reference coordinates and the fence height.
This step is the same as step S130, and is not described herein again.
In step S240, the electronic fence is drawn based on the electronic fence boundary position.
Specifically, the electronic fence can be drawn from the defined fence using a third-party 3D graphics drawing tool (OpenGL, canvas, etc.). The drawing result is rendered onto a surface (similar to an image data buffer), the surface is bound to a window, and through the Android WMS window mechanism that window is added above the content of the current VR display window. The content of the VR display window is drawn in layers: the application's rendering occupies one layer, the electronic fence drawing occupies another, and the two are superimposed when displayed. The electronic fence process and the application process can run in parallel; this is an Android mechanism, so implementing electronic fence rendering as a service in the Android system achieves global control (running independently of any particular VR application) while the electronic fence process and the foreground application run simultaneously.
Example 3
Fig. 6 is a flowchart illustrating a method for determining a boundary position of an electronic fence according to a third embodiment of the present invention.
The method for determining the boundary position of the electronic fence is applied to VR equipment, the VR equipment comprises a camera device and a controller, the camera device is used for collecting scene image information, and the controller is used for interacting with a display interface under the condition that the scene image information is displayed.
The electronic fence boundary position determining method comprises the following steps:
in step S310, a reference plane is determined according to the scene image information.
This step is the same as step S110, and is not described herein again.
In step S320, when the controller points to the boundary of each object in the display interface, an intersection point of the ray where the controller is located and the reference plane is used as a boundary reference coordinate.
This step is the same as step S120, and is not described herein again.
In step S330, the electronic fence boundary position is determined according to the boundary reference coordinates and the fence height.
This step is the same as step S130, and is not described herein again.
In step S340, a first distance is calculated according to the current position information of the user and the vertical direction information of the reference plane.
Specifically, the first distance is a distance from a current position of the user to a center point of the reference plane.
The VR device includes devices such as a gyroscope, an accelerometer, and an attitude sensor, which can acquire the user's current position information. The position information may be head position information and/or left-hand position information and/or right-hand position information.
In step S350, the boundary position of the electronic fence to be calculated, which is pointed by the user, is determined according to the current orientation information of the user.
Specifically, the VR device may further include a gyroscope or an accelerometer to collect the user's rotation angle information, from which the user's orientation, i.e., a vector representing the direction the user faces, can be calculated. For example, the direction in which the user's face points may be taken as the user orientation.
The user's current position information and the user's current orientation information may also be acquired through a 6DOF algorithm of the controller.
After the user's orientation is determined, all coordinates pointed to by the user are calculated from the orientation vector and the user's current position information; among these coordinates, the point that intersects the electronic fence boundary coordinates is the electronic fence boundary position to be calculated.
In step S360, a second distance between the user and the boundary position of the electronic fence is calculated according to the current position information of the user and the boundary position of the electronic fence to be calculated.
Specifically, the second distance is a distance between the current location of the user and the boundary of the electronic fence.
In step S370, when the first distance exceeds the second distance, a first prompt message of reaching the boundary position of the electronic fence is issued.
Whether the first distance exceeds the second distance is then judged. If it does, the user is judged to have reached the electronic fence boundary position, where further movement could easily touch surrounding objects and cause a safety problem, so first prompt information stating that the electronic fence boundary position has been reached is issued. If the first distance does not exceed the second distance, the boundary position has not been reached, no prompt signal is sent, and the electronic fence continues to be displayed.
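Steps S340 to S370 can be sketched as follows. The function names and the way the faced boundary point is selected (the fence coordinate most aligned with the user's orientation vector) are assumptions for illustration, not the patent's exact procedure.

```python
import numpy as np

def boundary_alert(user_pos, user_dir, plane_center, fence_ring):
    """Compare the user's distance to the reference-plane centre (first
    distance) with the distance to the fence boundary point the user is
    facing (second distance). Returns True when the first distance
    exceeds the second, i.e. the first prompt should be issued."""
    user_pos = np.asarray(user_pos, dtype=float)
    d1 = np.linalg.norm(user_pos - np.asarray(plane_center, dtype=float))
    # Vectors from the user to every fence boundary coordinate
    dirs = np.asarray(fence_ring, dtype=float) - user_pos
    norms = np.linalg.norm(dirs, axis=1)
    u = np.asarray(user_dir, dtype=float)
    u = u / np.linalg.norm(u)
    # Assumed heuristic: the boundary point "pointed at" by the user is
    # the one whose direction best aligns with the user's orientation.
    target = int(np.argmax(dirs @ u / norms))
    d2 = norms[target]  # second distance
    return bool(d1 > d2)
```

For a user near the edge of the fence and facing it, the first distance (to the plane centre) exceeds the second (to the faced boundary point), so the first prompt information is issued.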
Example 4
Fig. 7 is a flowchart illustrating a method for determining a boundary position of an electronic fence according to a fourth embodiment of the present invention.
The method for determining the boundary position of the electronic fence is applied to VR equipment, the VR equipment comprises a camera device and a controller, the camera device is used for collecting scene image information, and the controller is used for interacting with a display interface under the condition that the scene image information is displayed.
The electronic fence boundary position determining method comprises the following steps:
in step S410, a reference plane is determined from the scene image information.
This step is the same as step S110, and is not described herein again.
In step S420, when the controller points to the boundary of each object in the display interface, an intersection point of the ray where the controller is located and the reference plane is used as a boundary reference coordinate.
This step is the same as step S120, and is not described herein again.
In step S430, the electronic fence boundary position is determined according to the boundary reference coordinates and the fence height.
This step is the same as step S130, and is not described herein again.
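Step S430 (like step S130) turns the boundary reference coordinates on the reference plane plus a fence height into the fence geometry. One way this extrusion could look, assuming ground-plane (x, z) coordinates with y pointing up (the coordinate convention and names are assumptions, not from the patent):

```python
# Illustrative sketch of step S130/S430: extrude the boundary reference
# coordinates upward by the fence height to obtain one wall quad per
# boundary segment of the (closed) fence outline.

def build_fence_walls(boundary_coords, fence_height):
    """Return one quad (four 3D corners, counter-clockwise) per segment."""
    walls = []
    n = len(boundary_coords)
    for i in range(n):
        (x0, z0), (x1, z1) = boundary_coords[i], boundary_coords[(i + 1) % n]
        walls.append([
            (x0, 0.0, z0), (x1, 0.0, z1),                     # bottom edge
            (x1, fence_height, z1), (x0, fence_height, z0),   # top edge
        ])
    return walls
```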
In step S440, a first distance is calculated according to the current position information of the user and the vertical direction information of the reference plane.
This step is the same as step S340, and is not described herein again.
In step S450, the boundary position of the electronic fence to be calculated, which is pointed by the user, is determined according to the current orientation information of the user.
This step is the same as step S350, and is not described herein again.
In step S460, a second distance between the user and the boundary position of the electronic fence is calculated according to the current position information of the user and the boundary position of the electronic fence to be calculated.
This step is the same as step S360, and is not described herein again.
In step S470, when the first distance exceeds the second distance, a first prompt message for reaching the boundary position of the electronic fence is issued.
This step is the same as step S370, and is not described herein again.
In step S480, a transparency is determined based on a difference between the first distance and the second distance.
In step S490, the electronic fence is displayed based on the transparency.
Specifically, a transparency value can be derived from the difference between the first distance and the second distance; the transparency is used to indicate the distance between the user and the electronic fence.
Further, the transparency is positively correlated with the difference.
For example, when the second distance from the user to the electronic fence boundary is greater than or equal to the safe distance (the first distance), the user is relatively safe and the electronic fence need not be rendered in the VR application (it is in a hidden state). When the second distance is less than the first distance, the fence is rendered in the VR application, and as the user is detected moving closer to the fence boundary in real time the fence becomes clearer; that is, when the second distance between the user and the electronic fence boundary is less than the first distance, the electronic fence becomes clearer the closer the user gets to it. When the user fully reaches the fence boundary, the fence is in a fully displayed state and a relevant warning (which may be the first prompt information) is given. Further, when the user fully reaches the fence boundary, the color of the fence may also change to prompt the user.
Further, if the user continues to move after fully reaching the electronic fence boundary and breaks through the fence, the electronic fence can display a transparent hole at the breakthrough position, prompting that the user has left the safe area, is in an unsafe situation, and should immediately return to the original position.
The logic for returning to the safe area inside the electronic fence is the reverse of the logic for leaving it: as the user moves away from the electronic fence boundary, the fence becomes less and less clear, until the second distance between the user and the fence boundary is again greater than or equal to the safe distance (the first distance), at which point the fence is no longer rendered in the VR application (it is hidden).
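The display logic of steps S480 and S490 can be sketched as a mapping from the distance difference to an opacity value (0.0 hidden, 1.0 fully displayed); the linear mapping and the 0..1 alpha range are assumptions, as the patent only requires the value to be positively correlated with the difference:

```python
# Illustrative sketch of steps S480-S490: derive a fence alpha (opacity)
# from the difference between the first (safe) distance and the second
# (user-to-boundary) distance. Larger difference = user closer = clearer.

def fence_alpha(first_distance, second_distance):
    """0.0 = hidden (user at or beyond the safe distance),
    1.0 = fully displayed (user at the fence boundary)."""
    if second_distance >= first_distance:
        return 0.0                                  # safe: fence hidden
    diff = first_distance - second_distance         # positive inside warning zone
    return min(diff / first_distance, 1.0)          # clearer as user approaches
```

A renderer would feed this alpha into the fence material each frame, giving the gradual fade-in/fade-out behavior described above.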
Example 5
Fig. 8 is a flowchart illustrating an electronic fence boundary position determining method according to a fifth embodiment of the present invention.
The method for determining the boundary position of the electronic fence is applied to VR equipment, the VR equipment comprises a camera device and a controller, the camera device is used for collecting scene image information, and the controller is used for interacting with a display interface under the condition that the scene image information is displayed.
The electronic fence boundary position determining method comprises the following steps:
in step S510, a reference plane is determined according to the scene image information.
This step is the same as step S110, and is not described herein again.
In step S520, when the controller points to the boundary of each object in the display interface, an intersection point of the ray where the controller is located and the reference plane is used as a boundary reference coordinate.
This step is the same as step S120, and is not described herein again.
In step S530, the electronic fence boundary position is determined according to the boundary reference coordinates and the fence height.
This step is the same as step S130, and is not described herein again.
In step S540, it is determined whether or not the subsequent scene image has changed.
For safety reasons, the electronic fence process can be set to a continuously running state, independent of the VR application that the user opens.
That is, during use of the VR device, scene image information is collected in real time (for example, periodically or on demand), and the VR device can draw the electronic fence according to the scene image information collected in real time. The subsequent scene image is then compared with the current scene image. If the subsequent scene image has changed, the process returns to step S510, all steps for determining the electronic fence boundary position are executed again, and the process proceeds to step S550 to issue second prompt information indicating that the real environment has changed. If the subsequent scene image has not changed, the process proceeds to step S560 and continues to collect scene image information of the current scene.
It is worth noting that, since image comparison involves two images, the image acquired earlier is uniformly taken as the scene image of the current scene, and the image acquired later is uniformly taken as the subsequent scene image.
In step S550, a second prompt message indicating that the real environment has changed is issued.
In step S560, the acquisition of scene image information is continued.
Further, the case that the subsequent scene image information changes includes:
identifying the object shape of the ground plane according to the difference result of the subsequent scene image information and the current scene image information corresponding to the electronic fence; determining whether the object exists in the electronic fence according to the shape and the position of the object; and if so, determining that the subsequent scene image information is changed.
Specifically, in a real scene, fixed objects do not change; only moving objects change (for example, a puppy suddenly moving into the safe area inside the electronic fence). The subsequent scene image information and the current scene image information can therefore be differenced, filtering out fixed objects and retaining moving objects, and the shape of objects on the ground plane can then be identified through image recognition. Object shapes include convex objects and non-convex objects, where convex objects are the type likely to cause a potential safety hazard. Whether the object exists inside the electronic fence is determined according to the object's shape and position; if the object exists inside the electronic fence, it is determined that the subsequent scene image information has changed.
For example, if the object is a convex object and its position is within the coordinate range of the electronic fence, it is determined that the subsequent scene image information has changed, and the process must return to step S510 and re-execute all subsequent steps: determining a reference plane according to the collected scene image information, wherein the reference plane is the plane corresponding to the display interface when its degree of coincidence with the ground plane reaches a predetermined value; taking the intersection point of the ray where the controller is located and the reference plane as a boundary reference coordinate when the controller points to the boundary of each object in the display interface; and determining the electronic fence boundary position according to each boundary reference coordinate and the fence height.
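The change check of this embodiment can be sketched as frame differencing followed by a test for overlap with the fence area. The threshold value, the mask representation of the fence interior, and all names are assumptions; the patent's convex/non-convex shape classification is omitted from this sketch:

```python
import numpy as np

# Illustrative sketch of Embodiment 5's change detection: difference two
# grayscale frames so fixed objects cancel out, keep pixels that moved,
# and report a change only when the moving region overlaps the fence area.

def scene_changed(current_frame, subsequent_frame, fence_mask, threshold=25):
    """current_frame / subsequent_frame: 2D uint8 arrays (the earlier image
    is the 'current' scene, the later one the 'subsequent' scene);
    fence_mask: boolean array, True inside the electronic fence.
    Returns True if a moving object appears inside the fence."""
    diff = np.abs(subsequent_frame.astype(np.int16)
                  - current_frame.astype(np.int16))
    moving = diff > threshold            # fixed objects are filtered out
    return bool(np.any(moving & fence_mask))
```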
Example 6
Fig. 9 is a schematic structural diagram illustrating an electronic fence boundary position determining apparatus according to a sixth embodiment of the present invention. The device 600 for determining the boundary position of an electronic fence corresponds to the method for determining the boundary position of an electronic fence in embodiment 1, and the method for determining the boundary position of an electronic fence in embodiment 1 is also applicable to the device 600 for determining the boundary position of an electronic fence, and is not described herein again.
The device 600 for determining the boundary position of the electronic fence is applied to VR equipment, the VR equipment comprises a camera and a controller, the camera is used for collecting scene image information, and the controller is used for interacting with a display interface under the condition that the scene image information is displayed.
The electronic fence boundary position determining device 600 includes a first determining module 610, a second determining module 620, and a third determining module 630.
The first determining module 610 is configured to determine a reference plane according to the scene image information, where the reference plane is the plane corresponding to the display interface when its degree of coincidence with the ground plane reaches a predetermined value.
And a second determining module 620, configured to, when the controller points to the boundary of each object in the display interface, use an intersection point of the ray where the controller is located and the reference plane as a boundary reference coordinate.
And a third determining module 630, configured to determine the boundary position of the electronic fence according to the boundary reference coordinates and the height of the fence.
The third determining module 630 is further configured to calculate a first distance according to the current position information of the user and the vertical direction information of the reference plane;
determining the boundary position of the electronic fence to be calculated pointed by the user according to the current orientation information of the user;
calculating a second distance between the user and the boundary position of the electronic fence according to the current position information of the user and the boundary position of the electronic fence to be calculated;
and sending out first prompt information of reaching the boundary position of the electronic fence under the condition that the first distance exceeds the second distance.
Another embodiment of the present invention further provides a VR device, including a memory for storing a computer program and a processor for executing the computer program to make the VR device execute the function of each module in the aforementioned electronic fence boundary position determining method or the aforementioned electronic fence boundary position determining apparatus.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to use of the computer device, and the like. Further, the memory may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The embodiment also provides a computer storage medium for storing the computer program that implements the electronic fence boundary position determination method used in the VR device.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative and, for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, each functional module or unit in each embodiment of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention or a part of the technical solution that contributes to the prior art in essence can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a smart phone, a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention.

Claims (9)

1. The method for determining the boundary position of the electronic fence is applied to VR equipment, the VR equipment comprises a camera device and a controller, the camera device is used for collecting scene image information, and the controller is used for interacting with a display interface under the condition that the scene image information is displayed; the electronic fence boundary position determining method comprises the following steps:
determining a reference plane according to the scene image information, wherein the reference plane is the plane corresponding to the display interface when its degree of coincidence with the ground plane reaches a predetermined value;
under the condition that the controller points to the boundary of each object in the display interface, taking the intersection point of the ray where the controller is located and the reference plane as a boundary reference coordinate;
determining the electronic fence boundary position according to each boundary reference coordinate and the fence height;
further comprising:
calculating a first distance according to the current position information of the user and the vertical direction information of the reference plane;
determining the boundary position of the electronic fence to be calculated pointed by the user according to the current orientation information of the user;
calculating a second distance between the user and the boundary position of the electronic fence according to the current position information of the user and the boundary position of the electronic fence to be calculated;
and sending out first prompt information of reaching the boundary position of the electronic fence under the condition that the first distance exceeds the second distance.
2. The method of determining fence boundary positions as claimed in claim 1, further comprising:
and drawing the electronic fence based on the electronic fence boundary position.
3. The method of claim 2, wherein the electronic fence runs as a service in the VR device.
4. The fence boundary position determination method of claim 1, further comprising:
determining a transparency based on a difference of the first distance and the second distance;
displaying the electronic fence based on the transparency.
5. The fence boundary position determination method of claim 1, further comprising:
comparing the collected subsequent scene image information with the current scene image information corresponding to the electronic fence; and, in the case that the subsequent scene image information has changed, re-executing all steps from determining the reference plane according to the scene image information to determining the electronic fence boundary position according to the boundary reference coordinates and the fence height, and issuing second prompt information indicating that the real environment has changed.
6. The fence boundary position determination method of claim 5, wherein the condition that the subsequent scene image information changes comprises:
identifying the object shape of the ground plane according to the difference result of the subsequent scene image information and the current scene image information corresponding to the electronic fence;
determining whether the object exists in the electronic fence according to the shape and the position of the object;
and if so, determining that the subsequent scene image information is changed.
7. The device for determining the boundary position of the electronic fence is applied to VR equipment, the VR equipment comprises a camera device and a controller, the camera device is used for collecting scene image information, and the controller is used for interacting with a display interface under the condition that the scene image information is displayed; the electronic fence boundary position determining device comprises:
the first determining module is used for determining a reference plane according to the scene image information, wherein the reference plane is a corresponding plane when the coincidence degree of the display interface and the ground plane reaches a preset value;
the second determining module is used for taking the intersection point of the ray where the controller is located and the reference plane as a boundary reference coordinate under the condition that the controller points to the boundary of each object in the display interface;
the third determining module is used for determining the boundary position of the electronic fence according to the reference coordinates of each boundary and the height of the fence grid;
the first distance is calculated according to the current position information of the user and the vertical direction information of the reference plane;
determining the boundary position of the electronic fence to be calculated pointed by the user according to the current orientation information of the user;
calculating a second distance between the user and the boundary position of the electronic fence according to the current position information of the user and the boundary position of the electronic fence to be calculated;
and sending out first prompt information of reaching the boundary position of the electronic fence under the condition that the first distance exceeds the second distance.
8. A VR device comprising a memory for storing a computer program and a processor that executes the computer program to cause the VR device to perform the fence boundary position determination method of any of claims 1 to 6.
9. A computer-readable storage medium storing the computer program for use in the VR device of claim 8.
CN202010931912.6A 2020-09-08 2020-09-08 Electronic fence boundary position determination method and device and VR equipment Active CN111798573B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010931912.6A CN111798573B (en) 2020-09-08 2020-09-08 Electronic fence boundary position determination method and device and VR equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010931912.6A CN111798573B (en) 2020-09-08 2020-09-08 Electronic fence boundary position determination method and device and VR equipment

Publications (2)

Publication Number Publication Date
CN111798573A true CN111798573A (en) 2020-10-20
CN111798573B CN111798573B (en) 2020-12-08

Family

ID=72834152

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010931912.6A Active CN111798573B (en) 2020-09-08 2020-09-08 Electronic fence boundary position determination method and device and VR equipment

Country Status (1)

Country Link
CN (1) CN111798573B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112604306A (en) * 2020-12-29 2021-04-06 严瑞华 Game platform based on VR equipment and use method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017143960A1 (en) * 2016-02-23 2017-08-31 福建蓝帽子互动娱乐科技股份有限公司 Method for realizing electronic fence, and toy and game system
CN110503001A (en) * 2019-07-25 2019-11-26 青岛小鸟看看科技有限公司 A kind of Virtual Reality equipment and its barrier-avoiding method, device
CN111243103A (en) * 2020-01-07 2020-06-05 青岛小鸟看看科技有限公司 Method and device for setting safety area, VR equipment and storage medium



Also Published As

Publication number Publication date
CN111798573B (en) 2020-12-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Room 1103, building C, Xingzhi science and Technology Park, Nanjing Economic and Technological Development Zone, Nanjing, Jiangsu Province 210038

Patentee after: Nanjing Qiyuan Technology Co.,Ltd.

Address before: Room 1103, building C, Xingzhi science and Technology Park, Nanjing Economic and Technological Development Zone, Nanjing, Jiangsu Province 210038

Patentee before: Nanjing iqiyi Intelligent Technology Co.,Ltd.