CN113075936A - Underwater robot display method, device and system - Google Patents

Underwater robot display method, device and system

Info

Publication number
CN113075936A (application CN202110628965.5A)
Authority
CN (China)
Prior art keywords
underwater robot, display, preset, target object, detection result
Prior art date
2021-06-07
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110628965.5A
Other languages
Chinese (zh)
Inventor
魏建仓
张增虎
胡蓉贵
侯明波
赵国腾
郭轶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deepinfar Ocean Technology Inc
Original Assignee
Deepinfar Ocean Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2021-06-07
Filing date
2021-06-07
Publication date
2021-07-06
Application filed by Deepinfar Ocean Technology Inc filed Critical Deepinfar Ocean Technology Inc
Priority to CN202110628965.5A
Publication of CN113075936A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/0206 Control of position or course in two dimensions specially adapted to water vehicles

Abstract

The disclosure provides a display method, a display device and a display system for an underwater robot. The display method includes the following steps: receiving a display instruction; determining preset mode information according to the display instruction, and sinking according to the preset mode information; after the sinking is completed, detecting a target area to obtain a detection result; and performing a dynamic display according to the detection result. In this way, an all-round dynamic performance display of the underwater robot can be given, a more intuitive display effect is achieved, and the viewer's interest in the underwater robot is effectively increased.

Description

Underwater robot display method, device and system
Technical Field
The disclosure relates to the technical field of data processing, in particular to a display method, a display device and a display system of an underwater robot.
Background
With the development of technology, robots are used more and more widely. Underwater robots in particular are increasingly common, and observing, photographing and recording video underwater through an underwater robot has become increasingly convenient. At present, however, an underwater robot is usually shown by placing it in a display cabinet or sales counter, or by playing a pre-recorded promotional video. Such a display seriously lacks dynamics and cannot show the performance of the product comprehensively, so it is difficult for consumers to become interested in the underwater robot or to learn more about its functions.
Disclosure of Invention
In view of this, an object of the present disclosure is to provide a method, an apparatus and a system for displaying an underwater robot.
Based on the above purpose, the present disclosure provides a method for displaying an underwater robot, which is applied to an underwater robot, and includes:
receiving a display instruction;
determining preset mode information according to the display instruction, and sinking according to the preset mode information;
detecting a target area to obtain a detection result;
and dynamically displaying according to the detection result.
Optionally, the preset mode information includes: a preset water-entry depth;
the sinking according to the preset mode information specifically includes:
sinking according to the preset water-entry depth.
Optionally, the detecting the target area to obtain a detection result specifically includes:
in response to determining that the sinking has reached the preset water-entry depth, detecting the target area to obtain a detection result.
Optionally, the preset mode information includes: a preset action sequence;
the performing a dynamic display according to the detection result specifically includes:
in response to determining that the detection result is that a target object exists in the target area, rotating to follow the target object;
and in response to determining that the detection result is that no target object exists in the target area, performing a dynamic display according to the preset action sequence.
Optionally, the rotating to follow the target object in response to determining that the detection result is that the target object exists in the target area specifically includes:
acquiring three-dimensional coordinates of the target object according to a predefined coordinate system;
and rotating according to the three-dimensional coordinates so that a target part faces the direction of the three-dimensional coordinates.
Optionally, after rotating the underwater robot according to the movement of the three-dimensional coordinates, the method further includes:
acquiring preset distance information, and moving relative to the target object according to the preset distance information to adjust the distance between the underwater robot and the target object.
Optionally, the performing, in response to determining that the detection result is that the target object does not exist in the target area, a dynamic display according to the preset action sequence specifically includes:
the action sequence includes rotation, roll and pitch, and the actions are performed in turn according to a preset action index;
in response to determining that the current action is the last action in the action sequence, the action index wraps around to point to the first action again.
Optionally, the preset mode information includes: a preset display duration;
the performing a dynamic display according to the detection result further includes:
in response to determining that the battery level is too low or that the duration of the dynamic display exceeds the preset display duration, turning off the propeller and floating back to the water surface.
Based on the same inventive concept, the present disclosure also provides an underwater robot, comprising a memory, a processor and a computer program stored on the memory and operable on the processor, wherein the processor implements the method as described above when executing the program.
Based on the same inventive concept, the present disclosure also provides an underwater robot display system, comprising: a display tank, a setting end and the underwater robot described above;
the display tank is used for containing water and providing a display space for the underwater robot;
the setting end is used for sending the display instruction to the underwater robot.
As can be seen from the above, in the display method, device and system for an underwater robot provided by the disclosure, the water-entry depth, the action sequence and the display duration of the underwater robot are first preset at the setting end. The configured underwater robot is then placed on the water surface of the display tank and sinks to the preset depth range according to the preset water-entry depth. Within that depth range the underwater robot performs a dynamic display according to the actual detection result, and when the duration of the dynamic display exceeds the preset display duration or the battery level is too low, the underwater robot turns off the propeller and floats back to the water surface to wait for collection by the staff.
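The disclosure describes this workflow only in prose. Purely as an illustration, the sketch below expresses the same flow in Python; every identifier (run_display, sink_to, detect_target_area and so on) is hypothetical and stands in for whatever firmware routines an actual implementation would provide.

```python
def run_display(robot, mode):
    """End-to-end display flow as described above (hypothetical sketch).

    `robot` and `mode` (water-entry depth, action sequence, display duration)
    are illustrative objects, not interfaces defined by the disclosure.
    """
    robot.wait_for_start_instruction()           # robot has been configured and placed on the water
    robot.sink_to(mode.entry_depth_m)            # sink with the built-in propeller to the preset depth
    start = robot.now()
    while not robot.battery_too_low() and robot.now() - start < mode.display_duration_s:
        result = robot.detect_target_area()      # look for a person in the target area
        if result.target_present:
            robot.follow(result.target_coordinates)          # tracking state
        else:
            robot.perform_next_action(mode.action_sequence)  # performance state
    robot.stop_propeller()                       # termination state: turn the propeller off
    robot.float_to_surface()                     # float up and wait to be collected by the staff
```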
Drawings
In order to more clearly illustrate the technical solutions of the present disclosure or of the related art, the drawings needed in the description of the embodiments or the related art are briefly introduced below. Obviously, the drawings in the following description only show embodiments of the present disclosure, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flow chart of a method for displaying an underwater robot according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram showing an action state of the underwater robot according to the embodiment of the present disclosure;
FIG. 3 is a schematic three-dimensional coordinate system of an embodiment of the present disclosure;
fig. 4 is a schematic diagram of the display flow of an underwater robot according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of electronic components of an underwater robot according to an embodiment of the present disclosure.
Detailed Description
For the purpose of promoting a better understanding of the objects, aspects and advantages of the present disclosure, reference is made to the following detailed description taken in conjunction with the accompanying drawings.
It is to be noted that technical terms or scientific terms used in the embodiments of the present disclosure should have a general meaning as understood by those having ordinary skill in the art to which the present disclosure belongs, unless otherwise defined. The use of "first," "second," and similar terms in the embodiments of the disclosure is not intended to indicate any order, quantity, or importance, but rather to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect.
As described in the background, in the related art an underwater robot is generally shown by placing it in a display cabinet or by playing a pre-recorded promotional video, which seriously lacks dynamics and therefore fails to show the performance of the underwater robot comprehensively.
To solve the problems in the related art, the present disclosure provides a display method for an underwater robot: the robot sinks to a specified depth according to preset mode information, performs target detection on a preset target area, and determines a specific dynamic display mode according to the target detection result, that is, according to whether a target object exists in the target area.
The display method performs target detection after reaching the preset depth and determines the specific display mode according to the target detection result, which improves the dynamics of the display. It also defines the switching logic between the various actions during the display, namely rotating, rolling, pitching and tracking a person, all within a particular container.
The embodiments of the present disclosure are further illustrated by the following specific examples.
First, an embodiment of the present disclosure provides a display method for an underwater robot. Referring to fig. 1, the display method of the underwater robot includes the following steps:
and step S101, receiving a display instruction.
In some embodiments, the display instruction includes preset mode information and a start instruction.
Before the underwater robot is displayed, it is connected to the setting end through a wireless network, and its mode is configured through the setting end, after which the underwater robot enters an initial state. After receiving the start instruction, the underwater robot switches from the initial state to a preparation state, see fig. 2, and waits for the staff to place it on the water surface. The preset mode information includes a preset water-entry depth, a preset action sequence and a preset display duration. The setting end may be any of various handheld devices with wireless communication capability, in-vehicle devices, wearable devices, computing devices or other processing devices connected to a wireless modem, as well as various forms of user equipment, mobile stations and terminal devices, which are collectively referred to as the setting end for ease of description.
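The disclosure names the states and the contents of the preset mode information only in prose. The following Python sketch is one hypothetical way to organize them; the class and field names are assumptions chosen for illustration.

```python
from dataclasses import dataclass
from enum import Enum, auto


class RobotState(Enum):
    """Display states mentioned above (names are illustrative, not from the disclosure)."""
    INITIAL = auto()      # connected to the setting end, mode configured
    PREPARATION = auto()  # start instruction received, waiting to be placed on the water surface
    WAITING = auto()      # at the preset water-entry depth, scanning the target area
    TRACKING = auto()     # target object present, rotating to follow it
    PERFORMANCE = auto()  # no target object, running the preset action sequence
    TERMINATION = auto()  # propeller off, floating back to the surface


@dataclass
class PresetModeInfo:
    """Settings configured at the setting end and carried by the display instruction."""
    entry_depth_m: float        # preset water-entry depth
    action_sequence: list[str]  # e.g. ["rotation", "roll", "pitch"]
    display_duration_s: float   # preset display duration
```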
Step S102: determining preset mode information according to the display instruction, and sinking according to the preset mode information.
In some embodiments, the preset mode information includes a preset water-entry depth, and the sinking according to the preset mode information specifically includes sinking according to the preset water-entry depth. After receiving the start instruction, the underwater robot sinks under the action of its built-in propeller and keeps a level attitude throughout the sinking process. After reaching the specified depth, the underwater robot checks the water depth again; if the specified depth has been reached, it automatically switches to a waiting state, and in the waiting state it automatically detects the target object in the target area.
In some embodiments, the target area is the area inside the display tank, within which the underwater robot is displayed underwater. In general, the display tank is a cube or a cuboid, and may also be a cylinder; when the display tank is a cube, a cuboid or a cylinder, the target area corresponding to the underwater robot changes accordingly with the shape of the tank.
Step S103: detecting the target area to obtain a detection result.
In some embodiments, the detecting the target area to obtain a detection result specifically includes: in response to determining that the sinking has reached the preset water-entry depth, detecting the target area to obtain a detection result.
In some embodiments, the underwater robot checks the depth with a built-in depth sensor. If the detection result after several checks still shows that the specified depth has not been reached, the underwater robot switches to a termination state, in which it turns off the propeller and floats back to the water surface, as shown in fig. 2. Here, the "picture" and the "target area" in fig. 2 have the same meaning.
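A minimal sketch of the depth check with retries described above is given below; the sensor interface, the number of checks and the tolerance are assumed values chosen for illustration and are not specified by the disclosure.

```python
def reached_preset_depth(depth_sensor, target_depth_m: float,
                         tolerance_m: float = 0.1, max_checks: int = 5) -> bool:
    """Re-read the built-in depth sensor several times and report whether the
    preset water-entry depth has been reached (illustrative values only)."""
    for _ in range(max_checks):
        if abs(depth_sensor.read_depth_m() - target_depth_m) <= tolerance_m:
            return True
    return False


# If this still returns False after all checks, the robot would switch to the
# termination state: turn the propeller off and float back to the water surface.
```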
Step S104: performing a dynamic display according to the detection result.
In some embodiments, the preset mode information includes a preset action sequence, and the performing a dynamic display according to the detection result specifically includes: in response to determining that the detection result is that a target object exists in the target area, rotating to follow the target object; and in response to determining that the detection result is that no target object exists in the target area, performing a dynamic display according to the preset action sequence. The preset action sequence includes rotation, roll and pitch.
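The branch on the detection result described above can be summarized as follows; the robot and detection-result interfaces are hypothetical placeholders.

```python
def dynamic_display_step(robot, detection_result):
    """One display step driven by the detection result (illustrative sketch)."""
    if detection_result.target_present:
        # a target object (a person) is in the target area: rotate to follow it
        robot.follow(detection_result.target_coordinates)
    else:
        # no target object: perform the next action of the preset action sequence
        robot.perform_next_action()
```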
In some embodiments, the preset action sequence may also include a long jump and a high jump, and the long jump may be divided into: a left jump, a right jump, a forward jump and a backward jump.
In some embodiments, when it is determined that the target object exists in the target area, that is, when a person appears in the picture, the rotating to follow the target object specifically includes: acquiring the three-dimensional coordinates of the target object according to a predefined coordinate system; and rotating according to the three-dimensional coordinates so that the target part faces the direction of the three-dimensional coordinates. Referring to fig. 3, the three-dimensional coordinate system includes an X axis, a Y axis and a Z axis. When the display tank is a cube or a cuboid, the origin is at the lower left corner and is marked O; the positive X direction points to the right, the positive Y direction points upward, and the positive Z direction points from the picture toward the viewer's eyes, which gives the three-dimensional coordinate system.
In some embodiments, after rotating according to the movement of the three-dimensional coordinates, the underwater robot further acquires preset distance information and moves relative to the target object according to the preset distance information to adjust the distance between the underwater robot and the target object.
In some embodiments, when the underwater robot is in the tracking state, the underwater robot is rotated according to the movement of the three-dimensional coordinates of the target object. The target object is not fixed but can move, and when its position moves, its three-dimensional coordinates change accordingly. For example, when the three-dimensional coordinates of the target object move 20 cm in the positive X direction, the underwater robot calculates the corresponding angle by which it needs to rotate from the moving distance and rotates by the obtained angle, so that the image acquisition module of the underwater robot can keep the target object in view at all times.
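One simple geometric way to obtain the rotation angle from the three-dimensional coordinates, using the coordinate system of fig. 3 (X to the right, Y up, Z from the picture toward the viewer), is sketched below. It is an illustrative calculation, not the specific computation used by the disclosure, and the positions are made-up example values.

```python
import math


def yaw_to_target(robot_pos, target_pos):
    """Yaw angle about the vertical Y axis that points the robot's camera at the target."""
    dx = target_pos[0] - robot_pos[0]  # X: positive to the right
    dz = target_pos[2] - robot_pos[2]  # Z: positive from the picture toward the viewer
    return math.degrees(math.atan2(dx, dz))


# Example: the target moves 0.2 m (20 cm) in the positive X direction.
robot_pos = (0.5, 0.3, 0.0)
old_yaw = yaw_to_target(robot_pos, (0.5, 0.3, 1.0))   # 0.0 degrees
new_yaw = yaw_to_target(robot_pos, (0.7, 0.3, 1.0))   # about 11.3 degrees
delta = new_yaw - old_yaw   # the robot rotates by this angle to keep the target in view
```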
In some embodiments, in response to determining that the detection result is that no target object exists in the target area, the underwater robot enters a performance state. In the performance state, the underwater robot performs actions according to the preset action sequence, which specifically includes: the action sequence includes rotation, roll and pitch, and the actions are performed in turn according to a preset action index; in response to determining that the current action is the last action in the action sequence, the action index wraps around to point to the first action again. For example, in the performance state, when the underwater robot has just performed the pitch action, the next action to be performed is the rotation. The underwater robot switches to the performance state when the number of detections or the detection duration for the target object exceeds the preset duration.
In some embodiments, the preset action sequence may also include a long jump and a high jump, and the long jump may be divided into a left jump, a right jump, a forward jump and a backward jump. If the action sequence is roll, rotation, long jump, pitch and high jump, then when the performed action has reached the high jump, the next action to be performed is the roll.
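The wrap-around behaviour of the preset action index can be written compactly as follows; the class and its names are illustrative.

```python
class ActionSequence:
    """Cycle through a preset action sequence using a wrap-around action index."""

    def __init__(self, actions):
        self.actions = list(actions)  # e.g. ["roll", "rotation", "long_jump", "pitch", "high_jump"]
        self.index = 0                # preset action index

    def next_action(self) -> str:
        action = self.actions[self.index]
        # after the last action, the index wraps back to the first one
        self.index = (self.index + 1) % len(self.actions)
        return action
```

With the sequence from the paragraph above, the call made after the high jump returns the roll again.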
In some embodiments, target object detection is performed on the target area again after the execution of the action sequence is completed. If no target object can be detected in the target area and the preset display duration has been exceeded or the battery level is low, the underwater robot enters the termination state.
When the underwater robot has reached the preset water-entry depth and no target object is detected in the picture, it switches to the waiting state and detects the target area in that state. When no target object exists in the picture, that is, when no person appears in the picture, the underwater robot performs its action performance according to the preset mode information.
In some embodiments, referring to fig. 4, during the dynamic display the battery level of the underwater robot may become too low due to uncertain factors. In response to determining that the battery level is too low or that the duration of the dynamic display exceeds the preset duration, the underwater robot enters a preset termination state, turns off the propeller and floats back to the water surface.
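The termination condition (battery level too low, or display duration exceeded) might be checked as sketched below; the battery threshold is an assumed figure, since the disclosure only states that the level is too low.

```python
import time


def should_terminate(battery_level: float, display_start_s: float,
                     display_duration_s: float,
                     low_battery_threshold: float = 0.15) -> bool:
    """True when the display should end: battery too low or preset duration exceeded.

    `display_start_s` is a time.monotonic() timestamp taken when the dynamic
    display began; the threshold of 0.15 (15%) is an illustrative assumption.
    """
    too_low = battery_level < low_battery_threshold
    timed_out = time.monotonic() - display_start_s > display_duration_s
    return too_low or timed_out


# When this returns True, the robot enters the preset termination state:
# it turns off the propeller and floats back to the water surface.
```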
According to the display method for an underwater robot described above, the underwater robot can give, in the display tank and according to preset logic instructions, a series of dynamic performances consisting of rotation, roll, pitch and tracking of a person's position. This effectively shows the full-attitude motion capability of the underwater robot, makes the display more vivid and lifelike, enhances the display effect, and increases the viewer's interest in the underwater robot.
It should be noted that the method of the embodiments of the present disclosure may be executed by a single device, such as a computer or a server. The method of the embodiment can also be applied to a distributed scene and completed by the mutual cooperation of a plurality of devices. In such a distributed scenario, one of the devices may only perform one or more steps of the method of the embodiments of the present disclosure, and the devices may interact with each other to complete the method.
It should be noted that the above describes some embodiments of the disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Based on the same inventive concept, corresponding to the method of any embodiment, the present disclosure further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the program, the underwater robot display method according to any embodiment is implemented.
Fig. 5 is a schematic diagram illustrating a more specific hardware structure of an electronic device according to this embodiment, where the electronic device may include: a processor 1010, a memory 1020, an input/output interface 1030, a communication interface 1040, and a bus 1050. Wherein the processor 1010, memory 1020, input/output interface 1030, and communication interface 1040 are communicatively coupled to each other within the device via bus 1050.
The processor 1010 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present disclosure.
The Memory 1020 may be implemented in the form of a ROM (Read Only Memory), a RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 1020 may store an operating system and other application programs, and when the technical solution provided by the embodiments of the present specification is implemented by software or firmware, the relevant program codes are stored in the memory 1020 and called to be executed by the processor 1010.
The input/output interface 1030 is used for connecting an input/output module to input and output information. The i/o module may be configured as a component in a device (not shown) or may be external to the device to provide a corresponding function. The input devices may include a keyboard, a mouse, a touch screen, a microphone, various sensors, etc., and the output devices may include a display, a speaker, a vibrator, an indicator light, etc.
The communication interface 1040 is used for connecting a communication module (not shown in the drawings) to implement communication interaction between the present apparatus and other apparatuses. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, Bluetooth and the like).
Bus 1050 includes a path that transfers information between various components of the device, such as processor 1010, memory 1020, input/output interface 1030, and communication interface 1040.
It should be noted that although the above-mentioned device only shows the processor 1010, the memory 1020, the input/output interface 1030, the communication interface 1040 and the bus 1050, in a specific implementation, the device may also include other components necessary for normal operation. In addition, those skilled in the art will appreciate that the above-described apparatus may also include only those components necessary to implement the embodiments of the present description, and not necessarily all of the components shown in the figures.
The electronic device of the above embodiment is used for implementing the corresponding display method of the underwater robot in any one of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Based on the same inventive concept, corresponding to the method of any of the above embodiments, the present disclosure further provides an underwater robot display system, including: a display tank, a setting end and the underwater robot described above. The display tank is used for containing water and providing a display space for the underwater robot; the setting end is used for sending the display instruction to the underwater robot.
The display tank in this embodiment is a container with a transparent side wall and holds a proper amount of clear water for displaying the underwater robot of the present disclosure, although it is of course not limited to displaying only this underwater robot.
In some embodiments, the display tank may also be a container that is completely transparent or in which two or three of the side walls are transparent; the transparent container may be a glass container or a plastic container.
The display system of the underwater robot of the above embodiment is used for displaying the actions of the underwater robot according to any of the above embodiments to visitors.
Those of ordinary skill in the art will understand that: the discussion of any embodiment above is meant to be exemplary only, and is not intended to intimate that the scope of the disclosure, including the claims, is limited to these examples; within the idea of the present disclosure, also technical features in the above embodiments or in different embodiments may be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the embodiments of the present disclosure as described above, which are not provided in detail for the sake of brevity.
In addition, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown in the provided figures for simplicity of illustration and discussion, and so as not to obscure the embodiments of the disclosure. Furthermore, devices may be shown in block diagram form in order to avoid obscuring embodiments of the present disclosure, and this also takes into account the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform within which the embodiments of the present disclosure are to be implemented (i.e., specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the disclosure, it should be apparent to one skilled in the art that the embodiments of the disclosure can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative instead of restrictive.
While the present disclosure has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of these embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic ram (dram)) may use the discussed embodiments.
The disclosed embodiments are intended to embrace all such alternatives, modifications and variances which fall within the broad scope of the appended claims. Therefore, any omissions, modifications, equivalents, improvements, and the like that may be made within the spirit and principles of the embodiments of the disclosure are intended to be included within the scope of the disclosure.

Claims (10)

1. A display method for an underwater robot, applied to the underwater robot and comprising the following steps:
receiving a display instruction;
determining preset mode information according to the display instruction, and sinking according to the preset mode information;
detecting a target area to obtain a detection result;
and dynamically displaying according to the detection result.
2. The underwater robot display method according to claim 1, wherein the preset mode information comprises: a preset water-entry depth;
the sinking according to the preset mode information specifically comprises:
sinking according to the preset water-entry depth.
3. The underwater robot display method according to claim 1, wherein the detecting a target area to obtain a detection result specifically comprises:
in response to determining that the sinking has reached a preset water-entry depth, detecting the target area to obtain the detection result.
4. The underwater robot display method according to claim 1, wherein the preset mode information comprises: a preset action sequence;
the performing a dynamic display according to the detection result specifically comprises:
in response to determining that the detection result is that a target object exists in the target area, rotating to follow the target object;
and in response to determining that the detection result is that no target object exists in the target area, performing a dynamic display according to the preset action sequence.
5. The underwater robot display method according to claim 4, wherein the rotating to follow the target object in response to determining that the detection result is that the target object exists in the target area specifically comprises:
acquiring three-dimensional coordinates of the target object according to a predefined coordinate system;
and rotating according to the three-dimensional coordinates so that a target part faces the direction of the three-dimensional coordinates.
6. The underwater robot display method according to claim 5, wherein after rotating the underwater robot according to the movement of the three-dimensional coordinates, the method further comprises:
acquiring preset distance information, and moving relative to the target object according to the preset distance information so as to adjust the distance between the underwater robot and the target object.
7. The underwater robot display method according to claim 4, wherein the performing, in response to determining that the detection result is that the target object does not exist in the target area, a dynamic display according to the preset action sequence specifically comprises:
the action sequence comprises rotation, roll and pitch, and the actions are performed in turn according to a preset action index;
in response to determining that the current action is the last action in the action sequence, the action index wraps around to point to the first action again.
8. The underwater robot display method according to claim 1, wherein the preset mode information comprises: a preset display duration;
the performing a dynamic display according to the detection result further comprises:
in response to determining that the battery level is too low or that the duration of the dynamic display exceeds the preset display duration, turning off the propeller and floating back to the water surface.
9. An underwater robot comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the method of any one of claims 1 to 8.
10. A display system for an underwater robot, comprising: a display tank, a setting end and the underwater robot according to claim 9;
the display tank is used for containing water and providing a display space for the underwater robot;
the setting end is used for sending a display instruction to the underwater robot.
CN202110628965.5A 2021-06-07 2021-06-07 Underwater robot display method, device and system Pending CN113075936A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110628965.5A CN113075936A (en) 2021-06-07 2021-06-07 Underwater robot display method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110628965.5A CN113075936A (en) 2021-06-07 2021-06-07 Underwater robot display method, device and system

Publications (1)

Publication Number Publication Date
CN113075936A true CN113075936A (en) 2021-07-06

Family

ID=76617113

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110628965.5A Pending CN113075936A (en) 2021-06-07 2021-06-07 Underwater robot display method, device and system

Country Status (1)

Country Link
CN (1) CN113075936A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050235899A1 (en) * 2002-04-30 2005-10-27 Ikuo Yamamoto Fish-shaped underwater navigating body, control system thereof, and aquarium
CN104881045A (en) * 2015-06-17 2015-09-02 中国科学院自动化研究所 Bionic robot fish three-dimensional tracking method based on embedded visual guidance
CN105383654A (en) * 2015-10-30 2016-03-09 哈尔滨工程大学 Depth control device of autonomous underwater vehicle
CN106530660A (en) * 2016-12-06 2017-03-22 北京臻迪机器人有限公司 Underwater unmanned ship control system
KR20180076754A (en) * 2016-12-28 2018-07-06 (주)아이로 Underwater performance system of robot fish
CN107009371A (en) * 2017-06-14 2017-08-04 上海思依暄机器人科技股份有限公司 A kind of method and device for automatically adjusting machine people's dance movement
CN111098307A (en) * 2019-12-31 2020-05-05 航天信息股份有限公司 Intelligent patrol robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wu Qingxiao et al., "Model-based visual hovering and positioning technology for underwater robots", High Technology Letters *
Xia Wenqing, "Research on a motion control method for coordinated pectoral-fin and caudal-fin propulsion of a carangiform robotic fish", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115599126A (en) * 2022-12-15 2023-01-13 深之蓝海洋科技股份有限公司(Cn) Automatic collision-prevention wireless remote control unmanned submersible and automatic collision-prevention method

Similar Documents

Publication Publication Date Title
EP4198694A1 (en) Positioning and tracking method and platform, head-mounted display system, and computer-readable storage medium
US11276183B2 (en) Relocalization method and apparatus in camera pose tracking process, device, and storage medium
US20220148279A1 (en) Virtual object processing method and apparatus, and storage medium and electronic device
US9912853B2 (en) Switching between cameras of an electronic device
US20210191684A1 (en) Control method, control device, control system, electronic whiteboard, and mobile terminal
CN107204044B (en) Picture display method based on virtual reality and related equipment
CN111062981A (en) Image processing method, device and storage medium
EP1811431A1 (en) Information processing system, and information processing method
US20160148343A1 (en) Information Processing Method and Electronic Device
CN103460705A (en) Real-time depth extraction using stereo correspondence
CN108553895B (en) Method and device for associating user interface element with three-dimensional space model
CN111311756A (en) Augmented reality AR display method and related device
US9886101B2 (en) Information processing method and electronic device
CN113075936A (en) Underwater robot display method, device and system
CN113778252B (en) Anti-false touch method and device applied to flexible display screen, terminal and storage medium
CN104952058B (en) A kind of method and electronic equipment of information processing
CN114116081B (en) Interactive dynamic fluid effect processing method and device and electronic equipment
CN108399638B (en) Augmented reality interaction method and device based on mark and electronic equipment
CN107038746B (en) Information processing method and electronic equipment
CN113721818B (en) Image processing method, device, equipment and computer readable storage medium
CN116721237B (en) House type wall editing method, device, equipment and storage medium
CN115937299B (en) Method for placing virtual object in video and related equipment
CN113014806B (en) Blurred image shooting method and device
US20230326147A1 (en) Helper data for anchors in augmented reality
CN107742275B (en) Information processing method and electronic equipment

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2021-07-06)