WO2018196184A1 - Method and system for monitoring a plant - Google Patents

Method and system for monitoring a plant

Info

Publication number
WO2018196184A1
WO2018196184A1 (PCT/CN2017/094087)
Authority
WO
WIPO (PCT)
Prior art keywords
head
camera
plant
image information
user
Prior art date
Application number
PCT/CN2017/094087
Other languages
English (en)
Chinese (zh)
Inventor
贾二东
Original Assignee
深圳前海弘稼科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳前海弘稼科技有限公司 filed Critical 深圳前海弘稼科技有限公司
Publication of WO2018196184A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals

Definitions

  • the invention relates to the field of information processing, in particular to a monitoring method and a monitoring system for a plant.
  • Existing plant monitoring methods obtain images of a plant at a fixed angle through a fixed camera and display them to the user on a mobile phone or computer screen. Since the shooting angle of the camera is fixed, the user's field of view is limited.
  • To address this, the prior art provides the user with an input device, such as the touch screen of a mobile phone or a mouse or keyboard connected to a computer. The user can input control commands to the camera through the input device to remotely change its shooting angle, thereby widening the user's field of view.
  • Even so, existing plant monitoring methods cannot provide the user with an immersive observation experience, which makes it inconvenient for the user to understand the plant's condition, and the user experience is poor.
  • the invention provides a monitoring method and a monitoring system for a plant, which are used for solving the problem that the existing plant monitoring method cannot provide the user with an immersive observation experience.
  • An aspect of an embodiment of the invention provides a method for monitoring a plant, comprising:
  • the monitoring system establishes a first communication connection between a virtual reality (VR) head display and a first camera device;
  • the control information input by the user is detected by the VR head display;
  • the first camera device is controlled according to the control instruction for the first camera device.
  • detecting, by the VR head display, the control information input by the user includes:
  • detecting, by the VR head display, the user's head rotation information and/or eye information.
  • control instruction for the first camera device includes:
  • the method further includes:
  • control information corresponds to a switching instruction of the imaging device
  • the method further includes:
  • Displaying the first image information by using the VR head display includes:
  • displaying, by the VR head display, the first image information with the pest identifier, the pest identifier pointing to the target area.
  • a first camera device configured to acquire first image information of the plant
  • a first communication connection module configured to establish a first communication connection between the VR head display and the first camera device;
  • the VR head display is used to display the first image information, and is also used to detect control information input by a user;
  • a first control module configured to: when determining that the control information corresponds to a control instruction to the first camera device, control the first camera device according to the control instruction for the first camera device.
  • the VR head display is further configured to detect a user's head rotation information, and/or eye information.
  • control instruction for the first camera device includes:
  • the monitoring system further includes a second control module and a second imaging device, where the second imaging device is configured to acquire second image information of the plant;
  • a second communication connection module configured to establish a second communication connection between the VR head display and the second camera device
  • the second control module is configured to switch the first communication connection to a second communication connection when determining that the control information corresponds to a switching instruction of the camera device.
  • the monitoring system further includes:
  • An identification module configured to identify a target area where the pest is located in the first image information
  • the VR head display is also used to:
  • the first image information with a pest identifier is displayed, the pest identifier pointing to the target area.
  • the monitoring system of the present invention can establish a first communication connection between the virtual reality VR head display and the first camera device, obtain the first image information of the plant through the first camera device, and display the first image information through the VR head display. It can further detect, through the VR head display, the control information input by the user, and, if it is determined that the control information corresponds to a control instruction for the first camera device, control the first camera device according to that control instruction.
  • By using the VR head display both as the input terminal for remotely controlling the camera device and as the display device for observing the plant image, the present invention provides the user with an immersive observation experience, making it convenient for the user to understand the plant's condition and improving the user experience.
  • FIG. 1 is a schematic view showing an embodiment of a monitoring method of a plant of the present invention
  • Figure 2 is a schematic view showing another embodiment of the monitoring method of the plant of the present invention.
  • Figure 3 is a schematic view showing another embodiment of the monitoring method of the plant of the present invention.
  • Figure 4 is a schematic view showing another embodiment of the monitoring method of the plant of the present invention.
  • Figure 5 is a schematic view showing an embodiment of a monitoring system of the plant of the present invention.
  • Figure 6 is a schematic illustration of another embodiment of a monitoring system for a plant of the present invention.
  • Embodiments of the present invention provide a monitoring method and monitoring system for a plant for providing an immersive observation experience for a plant.
  • an embodiment of the method for monitoring a plant in the embodiment of the present invention includes:
  • the monitoring system may establish a first communication connection between the virtual reality VR head display and the first camera device.
  • the first communication connection may be a wired communication mode or a wireless communication mode, and may also include wired communication and wireless communication. After the first communication connection is established, communication can be performed between the VR head display and the first camera.
  • A VR head display, i.e., a virtual reality head-mounted display device, encloses the user's visual and auditory perception from the outside world and guides the user to experience a sense of presence in a virtual environment.
  • the first image information of the plant can be acquired by the first camera.
  • the first image information may be sent to the VR head display through the first communication connection, so that the first image information is displayed by the VR head display.
  • the user can input control information through the VR head to the monitoring system, and the monitoring system can detect the control information input by the user through the VR head display.
  • the first imaging device is controlled according to a control instruction to the first imaging device.
  • the monitoring system can store control information in association with the corresponding control commands in advance; the pre-stored control commands include control commands for the camera devices communicatively connected to the VR head display. After the monitoring system detects the control information input by the user through the VR head display, it can determine the control command corresponding to that control information. If the control information is determined to correspond to a control command for the first camera device, the first camera device is controlled according to the determined control command.
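The pre-stored association between control information and control commands can be sketched as a simple lookup table. All names below are hypothetical illustrations, not terms from the patent:

```python
# Hypothetical sketch of the pre-stored association between control
# information detected by the VR head display and camera control commands.
COMMAND_TABLE = {
    "head_turn_right": "rotate_camera_right",
    "head_turn_left": "rotate_camera_left",
    "gaze_dwell": "zoom_to_gaze_target",
    "double_blink": "switch_camera",
}

def resolve_command(control_info):
    """Return the control command for the detected control information,
    or None when the information maps to no pre-stored command."""
    return COMMAND_TABLE.get(control_info)
```

With such a table, detected control information that has no pre-stored counterpart simply resolves to no command, matching the "if it is determined that the control information corresponds to a control instruction" condition in the method.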
  • In the monitoring method provided by the embodiment of the present invention, the VR head display serves both as the input terminal of the remotely controlled camera device and as the display device for observing the plant image, so it can provide the user with an immersive observation experience of the plant, make it convenient for the user to understand the plant's condition, and improve the user experience.
  • the control information input by the user to the VR head display may be head rotation information, such as turning the head left and right or tilting it back and forth;
  • the control information input by the user to the VR head display may also be eye information, such as line-of-sight focus position information or blinking information.
  • the control command of the monitoring system to the first camera device may be rotating the first camera device, or translating the first camera device, or adjusting the image distance of the first camera device.
  • By rotating the head, blinking, and so on, the user can remotely control the first camera device to rotate, pan, or zoom accordingly, thereby adjusting the plant image displayed on the VR head display and enhancing the user's immersive viewing experience of the plant.
  • In the following embodiments, the head rotation information and the eye information are taken as examples of the control information input by the user.
  • The user's head rotation information is detected by the VR head display.
  • another embodiment of a method for monitoring a plant in an embodiment of the present invention includes:
  • the monitoring system may establish a first communication connection between the virtual reality VR head display and the first camera device.
  • the first communication connection may be a wired communication mode or a wireless communication mode, and may also include wired communication and wireless communication.
  • After the first communication connection is established, communication can be performed between the VR head display and the first camera device.
  • After the monitoring system establishes the first communication connection, suppose the camera of the first camera device faces a first shooting angle at this time; the first camera device can then acquire the first image information of the plant at the first shooting angle.
  • the target area where the pest is located in the first image information may be identified.
  • Image recognition technology is relatively mature; by pre-establishing a pest model, image recognition can determine whether a pest is present in the image and locate the region where the pest is found.
  • the first image information may be sent to the VR head display through the first communication connection to display the first image information of the plant at the first shooting angle through the VR head display.
  • the monitoring system may add a pest identifier to the first image information; the pest identifier may point to the target area, for example by circling it, and then the monitoring system may display the first image information with the pest identifier through the VR head display.
  • Displaying the first image with the pest identifier through the VR head display can help the user observe the plant's pest condition. It should be noted that step 203 may be omitted, in which case the first image information displayed in step 204 does not carry the pest identifier.
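As a rough illustration of the pest identifier pointing to the target area, the sketch below draws a rectangular outline around an identified region of a grayscale image stored as a nested list. The function name, marker value, and region format are illustrative assumptions, not from the patent:

```python
def add_pest_identifier(image, region, marker=255):
    """Draw a rectangular outline around the pest's target area.

    image  -- grayscale image as a list of rows (lists of ints)
    region -- (top, left, bottom, right) bounds of the target area, inclusive
    """
    top, left, bottom, right = region
    for col in range(left, right + 1):
        image[top][col] = marker      # top edge of the outline
        image[bottom][col] = marker   # bottom edge of the outline
    for row in range(top, bottom + 1):
        image[row][left] = marker     # left edge of the outline
        image[row][right] = marker    # right edge of the outline
    return image
```

Only the outline pixels are modified, so the plant imagery inside and outside the marked area is left intact for the user to inspect.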
  • the user can rotate the head, for example, turn the head to the right;
  • the VR head display can detect the user's head rotation information and determine that the control information input by the user is "turn the head to the right".
  • Techniques for detecting head rotation information with a VR head display are relatively mature; for example, the VR head display can detect the user's head rotation information through an acceleration sensor. In actual use, the VR head display may also detect the user's head rotation information by other technical means, which are not specifically limited herein.
  • the monitoring system may store the control information in association with the corresponding control commands in advance; the pre-stored control commands include control commands for the camera devices communicatively connected to the VR head display.
  • For example, the control information "turn the head to the right" can be stored in association with the control command "rotate the camera device to the right", and the control information "turn the head to the left" can be stored in association with the control command "rotate the camera device to the left".
  • The control command corresponding to the head rotation information can then be looked up; since the user turned the head to the right, the determined control command is to rotate the first camera device to the right.
  • The first camera device can then be controlled to perform the corresponding rotation; since the determined control command is to rotate the first camera device to the right, the first camera device is controlled to rotate to the right.
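A minimal pan controller for the remote camera might track a yaw angle and clamp it to the device's mechanical range. The class name, step size, and angle limits below are illustrative assumptions, not values from the patent:

```python
class PanController:
    """Hypothetical pan (yaw) controller for a remote camera device."""

    def __init__(self, step=15.0, min_angle=-90.0, max_angle=90.0):
        self.angle = 0.0          # current yaw angle in degrees
        self.step = step          # degrees moved per rotate command
        self.min_angle = min_angle
        self.max_angle = max_angle

    def rotate_right(self):
        """Rotate right by one step, clamped to the mechanical limit."""
        self.angle = min(self.angle + self.step, self.max_angle)
        return self.angle

    def rotate_left(self):
        """Rotate left by one step, clamped to the mechanical limit."""
        self.angle = max(self.angle - self.step, self.min_angle)
        return self.angle
```

Clamping keeps a stream of repeated head-turn commands from driving the camera past its physical range, which is one simple way to keep the remote control robust.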
  • After the rotation, the shooting angle of the camera of the first imaging device changes; assuming the camera now faces a second shooting angle, the first image information acquired by the first imaging device is the first image information of the plant at the second shooting angle.
  • The acquired first image information of the plant at the second shooting angle may be sent to the VR head display through the first communication connection, so as to display it through the VR head display. The target area where a pest is located can still be identified in this first image information, and the first image information at the second shooting angle, carrying the pest identifier, can be displayed through the VR head display.
  • the first image information includes image information of the plants acquired by the first imaging device at different shooting angles.
  • another embodiment of a method for monitoring a plant in an embodiment of the present invention includes:
  • the monitoring system can establish a first communication connection between the virtual reality VR head display and the first camera device. The first communication connection may be wired or wireless, or a combination of both. After the first communication connection is established, communication can be performed between the VR head display and the first camera device.
  • Suppose the image distance of the first imaging device is a first image distance; the first imaging device can then acquire the first image information of the plant at the first image distance.
  • the target region where the pest is located in the first image information may be identified.
  • the image recognition technology has been relatively mature. By pre-establishing the pest model, it is possible to determine whether the pest is included in the image by image recognition, and to determine the region where the pest is located.
  • the first image information may be sent to the VR head display through the first communication connection to display the first image information of the plant under the first image distance through the VR head display.
  • the monitoring system may add a pest identifier to the first image information; the pest identifier may point to the target area, for example by circling it, and then the monitoring system may display, through the VR head display, the first image information with the pest identifier at the first image distance.
  • Displaying the first image with the pest identifier through the VR head display can help the user observe the plant's pest condition. It should be noted that step 303 may be omitted, in which case the first image information displayed in step 304 does not carry the pest identifier.
  • the VR head display detects that the user's line of sight focus continuously aligns with the target position in the first image for more than a preset duration
  • the user can look at a certain position in the first image.
  • the corresponding position of the user's line of sight focus in the first image is referred to as the target position.
  • the VR head display can detect that the user's line of sight focus continuously aligns with the target position in the first image for more than the preset time length, and uses the detection result as the control information input by the user.
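Detecting that the line-of-sight focus stays on the target position for longer than a preset duration can be sketched as checking a stream of timestamped gaze samples against a small tolerance radius. The thresholds and function name are illustrative assumptions, not values from the patent:

```python
import math

def dwell_detected(samples, radius=0.05, min_duration=1.5):
    """Return True if consecutive gaze samples stay within `radius`
    of the first sample's position for at least `min_duration` seconds.

    samples -- time-ordered list of (timestamp_seconds, x, y) gaze fixations
    """
    if not samples:
        return False
    t0, x0, y0 = samples[0]
    for t, x, y in samples[1:]:
        if math.hypot(x - x0, y - y0) > radius:
            return False          # gaze wandered off the target position
        if t - t0 >= min_duration:
            return True           # fixation held long enough
    return False
```

A positive result would then be reported as the control information "line-of-sight focus held on a target position beyond the preset duration".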
  • the monitoring system may store the control information in association with the corresponding control commands in advance; the pre-stored control commands include control commands for the camera devices communicatively connected to the VR head display. Assume the control information "the line-of-sight focus stays on a position in the image for more than the preset duration" is stored in association with the control command "focus on and zoom in on the corresponding position".
  • The control command corresponding to the control information is thus determined to be zooming in to enlarge the scene at the target position; for example, optical zoom can be achieved by moving the lens to change the image distance.
  • The first camera device may then be controlled to perform the corresponding focusing according to the control command, enlarging the target position so that the user can see its scene clearly; for example, the image distance may be increased.
  • Suppose the image distance of the first imaging device is now a second image distance; the first image information acquired by the first imaging device is then the first image information of the plant at the second image distance.
  • The acquired first image information of the plant at the second image distance can be sent to the VR head display through the first communication connection, so as to display it through the VR head display.
  • the first image information includes image information of the plants acquired by the first imaging device under different image distances.
  • The embodiment corresponding to FIG. 2 enables the monitoring system to simulate what the user would see by turning their body while standing at the camera's position.
  • The embodiment corresponding to FIG. 3 enables the monitoring system to simulate the plant as seen when the user walks up to the plant at the target position.
  • the first camera device may alternatively be a camera device capable of translational motion, such as panning up and down or in a horizontal direction;
  • however, this approach places higher requirements on the camera device and correspondingly increases cost.
  • a plurality of imaging devices can be provided in the planting space of the plant, and a plurality of imaging devices can be used in combination to expand the range of plants that the user can observe through the VR head.
  • Referring to FIG. 4, another embodiment of the method for monitoring a plant in an embodiment of the present invention includes:
  • the monitoring system may establish a first communication connection between the virtual reality VR head display and the first camera device.
  • the first communication connection may be a wired communication mode or a wireless communication mode, and may also include wired communication and wireless communication. After the first communication connection is established, communication can be performed between the VR head display and the first camera.
  • the first camera device can acquire the first image information of the plant.
  • the first image information may be sent to the VR head display through the first communication connection to display the first image information of the plant through the VR head.
  • the user can blink for a plurality of times within a preset time period, for example, the user can blink twice in 1 s.
  • the VR head display can detect the blinking information of the user, and the detection result is used as the control information input by the user.
  • It is determined, according to the correspondence between the pre-stored control information and control instructions, that the control instruction is to switch the camera device.
  • the monitoring system may store the control information in association with the corresponding control commands in advance; the pre-stored control commands include control commands for the camera device communicatively connected to the VR head display as well as a switching command for the camera device. Assume the control information "blink twice within 1 s" is stored in association with the control command "switch camera device". After the monitoring system detects that the user blinks twice within 1 s, it can determine that the control command corresponding to the control information is: switch the first communication connection to a second communication connection between the VR head display and a second camera device.
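The "blink twice within 1 s" trigger can be sketched as counting blink timestamps inside a sliding window. The window length and blink count follow the example in the text; the function itself is a hypothetical sketch:

```python
def is_switch_gesture(blink_times, window=1.0, required=2):
    """Return True if at least `required` blinks fall within any
    `window`-second span of the time-ordered blink timestamps."""
    for i in range(len(blink_times) - required + 1):
        # Compare the first and last blink of each candidate group.
        if blink_times[i + required - 1] - blink_times[i] <= window:
            return True
    return False
```

When the gesture is recognized, the monitoring system would look up the associated command, here "switch camera device", and act on it.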
  • A second imaging device may be provided in the planting space of the plant. After the monitoring system determines that the control command is to switch the camera device, it may disconnect the first communication connection according to the control command and establish the second communication connection between the VR head display and the second camera device.
  • the second camera device can acquire the second image information of the plant.
  • the second image information of the acquired plant may be sent to the VR head display through the second communication connection to display the second image information of the plant through the VR head.
  • the first image information refers to image information of the plant acquired by the first imaging device
  • the second image information refers to image information of the plant acquired by the second imaging device. If the plant's planting area is large and includes multiple planting zones, an imaging device can be installed in each zone; by switching the camera device connected to the VR head display, plants in different planting zones can be observed.
  • an embodiment of a monitoring system for a plant in an embodiment of the present invention includes:
  • a first camera device 501 configured to acquire first image information of the plant
  • a first communication connection module 502 configured to establish a first communication connection between the VR head display and the first camera device
  • the VR head display 503 is configured to display first image information, and is also used to detect control information input by the user;
  • the first control module 504 is configured to control the first imaging device according to a control instruction to the first imaging device when determining that the control information corresponds to a control instruction to the first imaging device.
  • the first camera device 501 is disposed in a plant growing area, such as in a planting box or in a greenhouse.
  • the first communication connection module 502 can include Bluetooth modules respectively disposed on the first camera 501 and the VR head display 503 to implement a wireless connection between the first camera 501 and the VR headset 503.
  • the first control module may be disposed on the VR head display 503 or may be disposed on the first camera device 501.
  • the VR head display is also used to detect the user's head rotation information, and/or eye information.
  • the monitoring system further comprises an identification module for identifying the target area where the pest is located in the first image information;
  • the identification module may be disposed on the VR head display 503, or may be disposed on the first camera unit 501.
  • The VR head display 503 is also used to:
  • the first image information with the pest identifier is displayed, and the pest identifier points to the target area.
  • FIG. 6 another embodiment of a monitoring system for a plant in an embodiment of the present invention includes:
  • a first camera device 601 configured to acquire first image information of the plant
  • a first communication connection module 602 configured to establish a first communication connection between the VR head and the first camera
  • the VR head display 603 is configured to display first image information, and is also used to detect control information input by the user;
  • the first control module 604 is configured to: when determining that the control information corresponds to a control instruction to the first camera, control the first camera according to a control instruction to the first camera;
  • a second camera 605, configured to acquire second image information of the plant
  • a second communication connection module 606, configured to establish a second communication connection between the VR head display and the second camera device
  • the second control module 607 is configured to switch the first communication connection to the second communication connection when determining that the control information corresponds to the switching instruction of the camera device;
  • the first camera unit 601 and the second camera unit 605 are both disposed in a plant growing area, such as in a planting box or in a greenhouse.
  • the first control module 604 and the second control module 607 may be disposed on the VR head display 603, or may be disposed on the first imaging device 601 and the second imaging device 605.
  • the disclosed systems, devices, and methods may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of units is only a logical functional division; in actual implementation, there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other form.
  • the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • An integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, can be stored in a computer readable storage medium.
  • The technical solution of the present invention, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium.
  • The software product includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods in the various embodiments of the present invention.
  • The foregoing storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Testing And Monitoring For Control Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephonic Communication Services (AREA)

Abstract

Embodiments of the present invention provide a plant monitoring method and system, to solve the problem that existing plant monitoring methods cannot provide an immersive viewing experience. The monitoring method in the embodiments of the present invention comprises: establishing a first communication connection between a VR head-mounted display and a first camera device; acquiring first image information of plants by means of the first camera device; displaying the first image information by means of the VR head-mounted display; detecting, by means of the VR head-mounted display, control information input by a user; and, if it is determined that the control information corresponds to a control instruction for the first camera device, controlling the first camera device according to the control instruction for the first camera device.
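The loop the abstract describes (connect headset to camera, stream plant images, detect user input, and steer the camera) can be sketched in Python. All class and method names here are illustrative assumptions, not the patent's actual implementation; the head-movement threshold is likewise an invented example.

```python
from dataclasses import dataclass


@dataclass
class ControlInfo:
    """User input detected by the VR headset (e.g. a head rotation)."""
    yaw: float    # degrees of horizontal head rotation
    pitch: float  # degrees of vertical head rotation


class CameraDevice:
    """Hypothetical pan/tilt camera monitoring the plants."""

    def __init__(self) -> None:
        self.yaw = 0.0
        self.pitch = 0.0

    def capture(self) -> bytes:
        # Placeholder for acquiring first image information of the plants.
        return b"<frame>"

    def rotate(self, yaw: float, pitch: float) -> None:
        # A control instruction for the first camera device: pan/tilt.
        self.yaw += yaw
        self.pitch += pitch


class VRHeadset:
    """Hypothetical VR head-mounted display in the monitoring loop."""

    def __init__(self) -> None:
        self.camera: CameraDevice | None = None

    def connect(self, camera: CameraDevice) -> None:
        # Step 1: establish the first communication connection.
        self.camera = camera

    def display(self, frame: bytes) -> None:
        # Step 3: render the first image information to the user (stubbed).
        pass

    def monitor_step(self, control: ControlInfo) -> None:
        # Steps 2-5: acquire an image, display it, detect user input,
        # and, if the input maps to a camera control instruction, apply it.
        frame = self.camera.capture()
        self.display(frame)
        if abs(control.yaw) > 1.0 or abs(control.pitch) > 1.0:
            self.camera.rotate(control.yaw, control.pitch)


headset = VRHeadset()
cam = CameraDevice()
headset.connect(cam)
headset.monitor_step(ControlInfo(yaw=15.0, pitch=-5.0))
print(cam.yaw, cam.pitch)  # camera has followed the user's head movement
```

A movement below the threshold is treated as not corresponding to a camera control instruction, so the camera holds its position; this mirrors the abstract's conditional "if it is determined that the control information corresponds to a control instruction".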
PCT/CN2017/094087 2017-04-28 2017-07-24 Plant monitoring method and system WO2018196184A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710293358.1 2017-04-28
CN201710293358.1A CN106993167A (zh) 2017-04-28 2017-04-28 Plant monitoring method and monitoring system

Publications (1)

Publication Number Publication Date
WO2018196184A1 true WO2018196184A1 (fr) 2018-11-01

Family

ID=59418032

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/094087 WO2018196184A1 (fr) 2017-07-24 Plant monitoring method and system

Country Status (2)

Country Link
CN (1) CN106993167A (fr)
WO (1) WO2018196184A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111031292A (zh) * 2019-12-26 2020-04-17 北京中煤矿山工程有限公司 Real-time coal mine production safety monitoring system based on VR technology

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107295310B (zh) * 2017-07-31 2019-01-08 深圳春沐源控股有限公司 Planting monitoring method and planting monitoring apparatus
CN110309933A (zh) * 2018-03-23 2019-10-08 广州极飞科技有限公司 Plant planting data measurement method, operation route planning method, apparatus, and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016145443A1 (fr) * 2015-03-12 2016-09-15 Daniel Kerzner Virtual enhancement of security monitoring
CN106131483A (zh) * 2016-06-24 2016-11-16 宇龙计算机通信科技(深圳)有限公司 Virtual reality-based inspection method and related device and system
CN205726125U (zh) * 2016-03-30 2016-11-23 重庆邮电大学 Novel robot remote monitoring system
CN106383522A (zh) * 2016-09-22 2017-02-08 华南农业大学 Virtual reality-based real-time field crop information monitoring system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160133328A (ko) * 2015-05-12 2016-11-22 삼성전자주식회사 Remote control method and apparatus using a wearable device
CN105828062A (zh) * 2016-03-23 2016-08-03 常州视线电子科技有限公司 Unmanned aerial vehicle 3D virtual reality photography system
CN106598244A (zh) * 2016-12-12 2017-04-26 大连文森特软件科技有限公司 Plant growth monitoring system based on AR virtual reality technology


Also Published As

Publication number Publication date
CN106993167A (zh) 2017-07-28

Similar Documents

Publication Publication Date Title
US9842433B2 (en) Method, apparatus, and smart wearable device for fusing augmented reality and virtual reality
CN104486543B (zh) 智能终端触控方式控制云台摄像头的系统
EP3029552B1 (fr) Système de réalité virtuelle et procédé de commande de modes de fonctionnement d'un tel système
US10349031B2 (en) Augmented reality based user interfacing
JP6610546B2 (ja) 情報処理装置、情報処理方法、及びプログラム
US10171792B2 (en) Device and method for three-dimensional video communication
WO2019067899A1 (fr) Commande de dispositifs externes à l'aide d'interfaces de réalité
EP2720464B1 (fr) Génération des informations d'image
WO2018104869A1 (fr) Système de téléprésence
JP6822410B2 (ja) 情報処理システム及び情報処理方法
JP7081052B2 (ja) 模擬現実(sr)におけるデバイス共有及び対話性の表示
WO2018196184A1 (fr) Procédé et système de surveillance d'usine
KR20160091316A (ko) 물리적 위치들 간 비디오 대화
US11151804B2 (en) Information processing device, information processing method, and program
AU2017370476A1 (en) Virtual reality-based viewing method, device, and system
JP7239916B2 (ja) 遠隔操作システム、情報処理方法、及びプログラム
CN106327583A (zh) Virtual reality device for realizing panoramic photography and implementation method thereof
US20200035035A1 (en) Imaging system, display apparatus and method of producing mixed-reality images
WO2019028855A1 Virtual display device, intelligent interaction method, and cloud server
EP4222943A1 Multi-sensor camera systems, devices, and methods for providing image pan, tilt, and zoom functionality
US20230215079A1 (en) Method and Device for Tailoring a Synthesized Reality Experience to a Physical Setting
CN114371779B (zh) Gaze depth-guided visual enhancement method
WO2019102680A1 Information processing device, information processing method, and program
CN112558761A (zh) Mobile-oriented remote virtual reality interaction system and interaction method
JP7242448B2 (ja) 仮想現実制御装置、仮想現実制御方法およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17907818

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26.02.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 17907818

Country of ref document: EP

Kind code of ref document: A1