WO2020151468A1 - Vehicle remote control system established by primary and secondary wireless devices by means of an Internet of Things connection

Info

Publication number
WO2020151468A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
processor
arm
fault
manipulator
Application number
PCT/CN2020/000015
Other languages
English (en)
Chinese (zh)
Inventor
韩磊
韩宛蕙
Original Assignee
岳秀兰
Application filed by 岳秀兰
Publication of WO2020151468A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/44 Services specially adapted for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/0011 Control of position, course or altitude associated with a remote control arrangement
    • G05D 1/0022 Control of position, course or altitude associated with a remote control arrangement characterised by the communication link
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L 67/125 Protocols specially adapted for proprietary or special-purpose networking environments, involving control of end-device applications over a network

Definitions

  • The invention relates to a primary wireless device and a secondary wireless device connected through the Internet of Things to establish a remote driving system for a manned, remotely driven vehicle, and in particular to the technical field in which a remote operator uses a computer terminal to remotely drive a networked vehicle through the Internet of Things.
  • GPS is used to provide accurate map information, which is currently obtained through manual driving; cameras and radars are used to perceive the environment around the car and to identify lane lines, signal lights, and surrounding obstacles; high-performance processors are used to process the information collected by the GPS, cameras, radars, and the like, and to issue instructions to executive components such as the accelerator, brake, and steering wheel.
  • This kind of automatic driving system is costly and requires the establishment of powerful and accurate map data.
  • The map cannot record buildings and road renovations in time, and the system cannot make judgments when encountering construction or accidents, and cannot recognize traffic police gestures and speech.
  • Machines are good at simple, definite, and repetitive actions, but are not good at handling complex, uncertain, and changeable situations; in the short term, machines cannot reach human levels of perception and judgment and cannot independently respond to special situations encountered during driving.
  • The present invention therefore provides a system in which a remote operator uses a computer terminal to remotely drive Internet-connected vehicles through the Internet of Things.
  • The solution of the remote driving system is as follows: the remote operator 171 controls the robot 170 to drive the vehicle 260 through the following connections. The remote console 169 is connected to the remote control center 298; the remote control center 298 is connected to the wired and wireless LAN 295; the wired and wireless LAN 295 is connected to the switch 291; the switch 291 is connected to the ground network 264; the ground network 264 is connected to the wireless carrier system 262; the wireless carrier system 262 is connected to the communication network access system 278; the communication network access system 278 is connected to the telematics unit 269; the telematics unit 269 is connected to the vehicle bus 276; the vehicle bus 276 is connected to the robot 170; the robot 170 is connected to the first manipulator 182; and the robot 170 is connected to the second manipulator 183.
  • Alternatively, the remote control console 169 is connected to the remote control center 298; the remote control center 298 is connected to the switch 291; the switch 291 is connected to the ground network 264; the ground network 264 is connected to the uplink transmission station 290; the uplink transmission station 290 is connected to the communication satellite 289; the communication satellite 289 is connected to the communication network access system 278; the communication network access system 278 is connected to the telematics unit 269; the telematics unit 269 is connected to the robot 170; and the robot 170 is connected to the steering wheel 235, the gear 400, the brake pedal 401, and the accelerator pedal 402.
  • In the main radar video image transmission line, the vehicle vision system 502 is connected to the processor 280; the processor 280 is connected to the communication network access system 278; the communication network access system 278 is connected to the wireless carrier system 262; the wireless carrier system 262 is connected to the ground network 264; the ground network 264 is connected to the switch 291; the switch 291 is connected to the remote control center 298; the remote control center 298 is connected to the second processor 215; and the second processor 215 is connected to the visual display 255.
  • In the satellite transmission line, the vehicle vision system 502 is connected to the processor 280; the processor 280 is connected to the communication network access system 278; the communication network access system 278 is connected to the communication satellite 289; the communication satellite 289 is connected to the uplink transmitting station 290; the uplink transmitting station 290 is connected to the ground network 264; the ground network 264 is connected to the switch 291; the switch 291 is connected to the remote control center 298; the remote control center 298 is connected to the second processor 215; and the second processor 215 is connected to the visual display 255.
  • The radar 110 and the video acquisition device 120 of the vehicle vision system 502 are fused by the radar video information fusion system 130. The scanning unit 443 scans the image from the radar video information fusion system 130 and transmits it to the compression storage unit 444; the compression storage unit 444 transmits the image to the first judgment unit 445; the first judgment unit 445 transmits the image to the compressed data generating unit 446; and the compressed data generating unit 446 transmits the compressed image to the sending module 447.
  • The sending module 447 sends the compressed image to the communication network access system 278, which transmits it through the wireless carrier system 262 and the ground communication network 264 to the switch 291; the switch 291 transmits it to the remote control center 298; and the remote control center 298 transmits it to the second processor 215, which is connected to the receiving module 263.
  • The second processor 215 transmits the received image to the receiving module 263; the receiving module 263 transmits it to the compressed data scanning unit 449; the compressed data scanning unit 449 transmits it to the compression logic acquisition unit 450; the compression logic acquisition unit 450 transmits it to the decompression reading unit 451; the decompression reading unit 451 transmits it to the second judgment unit 452; the second judgment unit 452 transmits it to the original byte data recovery unit 453; and the original byte data recovery unit 453 transmits the image to the visual display 255.
  • The invention has the beneficial effects of establishing a remote driving system in which a vehicle is remotely driven by a remote operator, adapting to the requirements of all road conditions, leaving no driver in the vehicle, realizing unmanned driving in the true sense, and providing technical support for the development of the sharing economy.
  • Figure 1 is a schematic diagram of a primary wireless device and a secondary wireless device connected through the Internet of Things;
  • FIG. 2 is a circuit diagram of the CAN bus module of the multi-protocol communication network access system of the present invention
  • FIG. 3 is the third part of the circuit diagram of the processor of the multi-protocol communication network access system of the present invention.
  • FIG. 4 is a circuit diagram of the communication interface of the multi-protocol communication network access system of the present invention.
  • Figure 5 is a control principle diagram of the multi-protocol communication network access system of the present invention.
  • FIG. 6 is the first part of the circuit diagram of the processor of the multi-protocol communication network access system of the present invention.
  • Figure 8 is a circuit diagram of the RS232 signal communication chip of the multi-protocol communication network access system of the present invention.
  • Figure 11 is the third part of the RS485 signal communication circuit diagram of the multi-protocol communication network access system of the present invention.
  • FIG. 12 is a circuit diagram of the Ethernet module of the multi-protocol communication network access system of the present invention.
  • FIG. 13 is a block diagram of the radar video composite data detection and processing system of the present invention.
  • FIG. 14 is a system structure diagram of data compression provided by Embodiment 5 of the present invention.
  • Figure 16 is a diagram of the coordinate relationship between the environment coordinate system and the pixel coordinate system
  • Figure 17 is a data compression and decompression interface information structure diagram provided by the second embodiment of the present invention.
  • FIG. 18 is a scene diagram of data compression for wireless communication network transmission provided by Embodiment 1 of the present invention.
  • FIG. 19 is a flowchart of a data compression method provided by Embodiment 1 of the present invention.
  • FIG. 20 is a flowchart of a method for data compression and storage provided by Embodiment 2 of the present invention.
  • FIG. 21 is a flowchart of a method for data compression with added interface information provided in the second embodiment of the present invention.
  • FIG. 22 is a flowchart of a data decompression method provided by Embodiment 3 of the present invention.
  • FIG. 23 is a flowchart of a method for decompressing and reading data according to Embodiment 4 of the present invention.
  • Fig. 24 is a system connection diagram of the dual-mode driving mode of the second and third embodiments of the present invention.
  • FIG. 25 is a logic diagram of dual-mode driving work switching of the second embodiment of the present invention.
  • FIG. 26 is a logic diagram of dual-mode driving work switching of the third embodiment of the present invention.
  • Figure 27 is a structural diagram of the remote control center of the present invention.
  • Figure 28 is a top view of the vehicle
  • Figure 29 is a schematic diagram of a remote driving system and a robot control system
  • Figure 30 is a schematic diagram of a remote console and an operator
  • Figure 31 is a perspective view of the robot system
  • Figure 32 is a perspective view of a remotely controlled robot in the cab of a car
  • Figure 33 is a perspective view of a robot manipulator arm controlling a car steering wheel
  • Figure 34 and Figure 35 are the robot manipulating arm switch and the robot manipulating arm;
  • Figure 36 shows the use of the distal part of the cannula and instrument, and the switch;
  • Figure 37 is a simplified block diagram of a fully constrained inverse Jacobian master/slave speed controller
  • Figure 38 is a refinement of the simplified master/slave control in Figure 37;
  • Figure 39 is a simplified diagram of a modified master/slave controller
  • Figure 40 is a schematic diagram of a modified part of the controller
  • Figure 41 is an exemplary inverse Jacobian controller of a fully constrained master/slave robot control system
  • Figure 42, Figure 43 and Figure 44 are schematic block diagrams of a reference frame for motion control
  • Figures 45 and 46 are block diagrams of two systems of the end effector reference system and the remote center reference system;
  • Figure 47 is a block diagram of fault response, fault isolation and fault weakening in the robot system
  • Figures 48-52 are flowcharts that provide fault response, fault isolation, and fault weakening methods.
  • The remote operator 171 drives the vehicle 260 through the remote-controlled robot 170: the remote console 169 is connected to the remote control center 298; the remote control center 298 is connected to the wired and wireless LAN 295; the wired and wireless LAN 295 is connected to the switch 291; the switch 291 is connected to the ground network 264; the ground network 264 is connected to the wireless carrier system 262; the wireless carrier system 262 is connected to the communication network access system 278; the communication network access system 278 is connected to the telematics unit 269; and the telematics unit 269 is connected to the vehicle bus 276.
  • FIG. 27 is a schematic diagram of the remote control center 298.
  • Each seat in the terminal group formulates a plan and issues control instructions according to the seat's corresponding authority and work requirements.
  • The control instruction data is transmitted to the network switch, and the network switch transmits the control instruction data to the main server.
  • After processing, the data is transmitted to the graphics splicing controller, which intelligently splices and combines the various data and finally displays the result on the large LCD screen.
  • The PDA controller issues control instructions, which are transmitted to the network switch; the network switch transmits the control instruction data to the main server and the secondary server; after logical processing by the primary and secondary servers, the processed data is transmitted to the graphics splicing controller, which intelligently splices and combines the various data and finally displays the result on the large LCD screen.
  • The large LCD screen can promptly display the data of the various terminals, cameras, and other components, which is convenient for the personnel in each seat of the terminal group to view, so that people can obtain the information and data of the vehicle 260 and then conduct appropriate coordinated operations.
  • The remote control center 298 has a number of data terminals, voice terminals, graphics workstations, and PDA controllers.
  • The display control module of the large-screen display control host has 16 display control modes and handles the selection and switching of the LCD screen's 16 display modes; the large LCD screen is a large-screen display.
  • The remote control center 298 includes: a large-screen LCD display, a large-screen display control host, a network switch, a graphics splicing controller, graphics workstations, a graphics workstation group control host, a server group, and a terminal group. The network switch has one-to-one electrical communication connections with the graphics workstations, the graphics splicing controller, the graphics workstation group control host, the servers, and the terminals. The large-screen LCD display shows the graphics, video, and audio data after splicing by the graphics splicing controller; the graphics splicing controller retrieves graphics, video, or audio from the graphics workstations and completes the combination and splicing work; the graphics workstation group control host controls the storage, movement, display, and deletion of graphics, video, or audio in the graphics workstations; and the network switch realizes the corresponding data communication among the graphics workstations, the graphics splicing controller, the graphics workstation group control host, the servers, and the terminals.
  • The server group is composed of a main server and a secondary server, and the terminal group is composed of data terminals and voice terminals. The main server is used to receive and control the data information of the data terminals, and the secondary server is used to receive and control the voice information of the voice terminals.
  • The large-screen display control host is also connected to a wireless receiver for electronic communication, and the wireless receiver is connected to the PDA controller through wireless communication. The data command information sent by a data terminal passes through the network switch to the main server; after logical operation processing by the main server, the data information and processing results are displayed on the large-screen LCD display and on the LCD display of the data terminal.
  • The voice command information issued by a voice terminal is transmitted through the network switch to the secondary server; after logical operation processing by the secondary server, the voice information and processing results are displayed on the large-screen LCD display and on the LCD display of the voice terminal.
  • The data and voice command information sent by the PDA controller is transmitted through wireless communication to the wireless receiver; the wireless receiver transmits the data and voice information through the large-screen display control host to the graphics splicing controller, and the data, voice information, and processing results are displayed on the large-screen LCD display.
  • The second processor 215 of the remote console 169 is implemented in hardware, software, and firmware. It may be executed as one unit or divided into several sub-units, each of which can in turn be realized by any combination of hardware, software, and firmware. The second processor 215 can cross-connect the control logic and the controller, and can also be distributed as sub-units throughout the vehicle remote driving system 258. The second processor 215 can execute machine-readable instructions from a non-transitory machine-readable medium; the instructions activate the second processor 215 to perform the corresponding actions. The second processor 215 executes the various instructions input by the remote operator 171; in particular, it executes the instructions the remote operator 171 inputs with the left input device 177 and the right input device 178 to activate the respective joints of the first manipulator 182 and the second manipulator 183. The second processor 215 of the remote console 169 is connected to the visual display 255, the left input device 177, the right input device 178, the first foot pedal 214, and the second foot base 233.
  • In the cab of the car, the robot 170 directly controls the steering wheel 235, the accelerator pedal 402, the brake pedal 401, the gear 400, the door 403, the air conditioner 404, and the audio system 405.
  • The car seat 173 includes a seat back 216, an anti-submarining beam 217, an anti-submarining link mechanism 219, a fifth link 218, a sixth link 230, a seventh link 231, and a column 256.
  • The robot 170 is fixed to the car seat 173; a first manipulator 182, a second manipulator 183, a third manipulator 184, and a fourth manipulator 185 are installed on the column 256.
  • The third manipulator 184 and the fourth manipulator 185 can move up and down and can manipulate the attached instruments 300, 301, 302, and 303.
  • The remote operator 171 grasps the left input device 177 with the left hand; the left input device 177 causes the movement of the first manipulator 182, which is connected to the steering wheel 235. The remote operator grasps the right input device 178 with the right hand; the right input device 178 causes the movement of the second manipulator 183, which is connected to the gear 400.
  • The remote operator 171 engages the second foot base 233 with the left foot and the first foot pedal 214 with the right foot. The first foot pedal 214 causes the robot's third manipulator 184 to move, and the third manipulator 184 moves the accelerator pedal 402; the second foot base 233 causes the fourth manipulator 185 to move, and the fourth manipulator 185 is connected to the brake pedal 401.
  • Figure 1 shows that the remote driving system 258 includes a vehicle 260, a wireless carrier system 262, a ground communication network 264, a computer 266, and a call center 265.
  • The vehicle 260 may be a motorcycle, truck, bus, sport utility vehicle (SUV), recreational vehicle (RV), ship, aircraft, or ultra-high-speed pipeline train.
  • The vehicle electronic equipment 263 includes a telematics unit 269, a microphone 270, buttons and control inputs 271, an audio system 272, a visual display 273, a GPS and BDS satellite navigation module 274, and multiple vehicle system modules (VSM) 275.
  • VSM: vehicle system module
  • CAN: Controller Area Network
  • MOST: Media Oriented Systems Transport
  • LIN: Local Interconnect Network
  • LAN: Local Area Network
  • Other connections: Ethernet, or connections conforming to ISO, SAE, and IEEE standards and specifications
  • The telematics unit 269 is installed on the vehicle 260; the wireless carrier system 262 enables the vehicle 260 to perform wireless voice and data communication with the call center 265 through wireless networking.
  • the telematics unit 269 uses radio transmission to establish a communication channel with the wireless carrier system 262.
  • the communication channel includes a voice channel and a data channel, and the communication channel can send and receive voice and data transmissions.
  • the communication network access system 278 includes: a processor, a microwave communication unit, a satellite communication unit and a mobile communication unit; wherein the processor module is adapted to receive data information of the microwave communication unit, the satellite communication unit and the mobile communication unit.
  • The telematics unit 269 uses cellular communication according to the GSM or CDMA standard and includes a standard cellular chipset 279 for voice communication, a wireless modem for data transmission, an electronic processing device 280, multiple digital storage devices 281, and a communication network interface.
  • the modem can be realized by software stored in the telematics unit 269 and executed by the processor 280, or the modem can be a discrete hardware component located inside or outside the telematics unit 269.
  • the modem can use any number of different standards or protocols such as EVDO, CDMA, GPRS and EDGE to operate.
  • the telematics unit 269 can be used to implement wireless networking between the vehicle and other networked devices.
  • The telematics unit 269 can be configured to perform wireless communication according to any of one or more wireless protocols, such as the IEEE 802.11 protocols, WiMAX, or Bluetooth.
  • the telematics unit can be configured with a static IP address or can be set to automatically receive an assigned IP address from another device on the network such as a router or from a network address server.
  • the smart phone 284 is a wireless device among networked devices that communicate with the telematics unit 269.
  • the smart phone 284 includes computer processing capabilities, a transceiver capable of communicating using a short-range wireless protocol, and a visual smart phone display 286 that includes a touch screen graphical user interface.
  • the smart phone 284 is configured to communicate using the traditional Wi-Fi protocol.
  • the smart phone 284 includes the ability to communicate via cellular communication using the wireless carrier system 262, including processing capabilities, a display 286, and the ability to communicate over short-range wireless communication links.
  • The smart phone 282 may use the Wi-Fi Direct protocol to establish a short-range wireless link.
  • The smart phone 282 may also be a device that does not have cellular communication capabilities.
  • The processor 280 is a device capable of processing electronic instructions, and may include a microprocessor, a microcontroller, a main processor, a controller, a vehicle communication processor, or an application-specific integrated circuit (ASIC). It may be a processor dedicated to the telematics unit 269 or shared with other vehicle systems.
  • the processor 280 executes various types of digital storage instructions, such as software or firmware programs stored in the memory 281, so that the telematics unit can provide various types of services, and the processor 280 can execute programs or process data.
  • The telematics unit 269 provides wireless communication services for the vehicle. These services include: the service provided by the satellite navigation module 274; the airbag deployment notification service provided by the combination of the collision sensor interface module and the body control module; diagnostic reporting using the diagnosis module; and infotainment such as music, webpages, movies, TV shows, and video games. When these modules are implemented as VSMs 275 located outside the telematics unit 269, they use the vehicle bus 276 to exchange data and commands with the telematics unit 269.
  • The satellite navigation module 274 can determine the location of the vehicle based on radio signals received from the satellites 285 and provide navigation and other location-related services to the vehicle driver. Navigation information can be presented on the display 273 or announced by voice. The location information can be provided to the call center 265 or to another remote computer system, such as the computer 266. New or updated map data is downloaded from the call center 265 to the satellite navigation module 274 through the telematics unit 269. Satellite navigation includes the GPS satellite navigation system and the Beidou (BDS) satellite navigation system. A communication satellite 289 and an uplink transmitter station 290 are used to implement two-way communication: program content is received by the transmitter station 290, packaged for upload, and then sent to the satellite 289, which broadcasts the programs to users.
  • BDS: Beidou satellite navigation system
  • The two-way communication uses the satellite telephone service of the satellite 289 to relay telephone communication between the vehicle 260 and the station 290.
  • the vehicle 260 includes a vehicle system module (VSM) 275 that is located in the vehicle and receives input from sensors and uses the sensed input to perform diagnosis, monitoring, control, and reporting.
  • VSM 275 is connected to other VSMs and to the telematics unit 269 through the vehicle bus 276, and can be programmed to run vehicle system and subsystem diagnostic tests.
  • One VSM 275 can be an engine control module (ECM), which controls all aspects of engine operation, including fuel injection and ignition timing; another VSM 275 can be a powertrain control module, which regulates the operation of components of the vehicle's powertrain.
  • ECM: engine control module
  • a VSM 275 can be a body control module that manages the electronic components in the vehicle including the electric door locks and headlights of the vehicle.
  • The engine control module is equipped with on-board diagnostics (OBD) features, which receive real-time data from various sensors, including vehicle emission sensors, and provide a standardized series of diagnostic trouble codes (DTCs).
  • OBD: on-board diagnostics
  • the vehicle electronic equipment 263 includes multiple vehicle user interfaces, devices for providing and receiving information, including a microphone 270, buttons 271, an audio system 272, and a visual display 273.
  • The microphone 270 provides audio input to the telematics unit 269, enabling a vehicle occupant to issue voice commands and make hands-free calls through the wireless carrier system 262; it can connect to a vehicle-mounted automatic voice processing unit using human-machine interface (HMI) technology.
  • The buttons 271 allow manual user input into the telematics unit 269 to initiate wireless telephone calls and provide other data, response, or control input. Separate buttons can be used to initiate emergency calls and regular service-assistance calls to the call center 265.
  • the audio system 272 provides audio output to vehicle occupants.
  • the audio system 272 is connected to the vehicle bus 276 and the entertainment bus 277 and can provide AM, FM, satellite radio, CD, DVD, and other multimedia functions.
  • the visual display 273 is a graphic display.
  • the wireless carrier system 262 is a cellular telephone system that includes a cellular tower 287, a mobile switching center (MSC) 288, and any other networking components required to connect the wireless carrier system 262 with the terrestrial network 264.
  • The cell tower 287 includes transmitting and receiving antennas and a base station; base stations of different cell towers are connected to the MSC 288 either directly or through an intermediate base station controller.
  • the cellular system 262 can implement communication technologies, including AMPS analog technologies and CDMA and GSM/GPRS digital technologies.
  • the terrestrial network 264 is a terrestrial telecommunications network that connects wired telephones and connects the wireless carrier system 262 to the call center 265.
  • the terrestrial network 264 includes the Public Switched Telephone Network (PSTN), which is used to provide hard-line telephones, packet-switched data communications, and Internet infrastructure.
  • PSTN: Public Switched Telephone Network
  • the call center 265 is directly connected to the wireless carrier system 262.
  • the computer 266 is a web server accessed by the vehicle through the telematics unit 269 and the wireless carrier 262.
  • The computer 266 uploads diagnostic information from the vehicle 260 through the telematics unit 269; the computer 266 provides Internet connectivity, provides DNS services, and acts as a network address server that uses DHCP or another appropriate protocol to assign an IP address to the vehicle 260.
  • The call center 265 provides system back-end functions to the vehicle electronic equipment 263. These back-end functions include a switch 291, a server 283, a database 292, a remote control center 298, and an automatic voice response system (VRS) 294, all connected together through a wired and wireless LAN 295.
  • VRS: automatic voice response system
  • The switch 291 is a private branch exchange (PBX) switch that routes incoming signals so that voice transmissions are usually sent to the remote control center 298 via a regular telephone or to the automatic voice response system 294 using VoIP.
  • the telephone of the remote control center 298 can also use VoIP, and the VoIP and other data communication through the switch 291 are implemented through a modem connected between the switch 291 and the network 295.
  • the data transmission is transmitted to the server 283 and the database 292 via the modem.
  • The database 292 can store account information, user authentication information, and vehicle identification, and data can also be transmitted via the 802.11x wireless system and GPRS. It is used to connect to the manned call center 265 through the remote control center 298; the call center 265 uses the VRS 294 as an automatic instructor and uses the VRS 294 to connect to the remote control center 298.
  • The vehicle bus 276 is connected to the robot 170; the robot 170 is connected to the first manipulator 182, the second manipulator 183, the third manipulator 184, and the fourth manipulator 185. The first manipulator 182 is connected to the steering wheel 235; the second manipulator 183 is also connected to the steering wheel 235 and can additionally be connected to the gear 400; the third manipulator 184 is connected to the accelerator pedal 402; and the fourth manipulator 185 is connected to the brake pedal 401. The power cord of the robot 170 is connected to the power supply of the vehicle 260.
  • In the backup driving system, the remote console 169 is connected to the remote control center 298; the remote control center 298 to the switch 291; the switch 291 to the ground network 264; the ground network 264 to the uplink transmission station 290; the uplink transmission station 290 to the communication satellite 289; the communication satellite 289 to the communication network access system 278; the communication network access system 278 to the telematics unit 269; and the telematics unit 269 to the robot 170, which is connected to the steering wheel 235, the gear 400, the brake pedal 401, and the accelerator pedal 402. Through this chain, the remote operator 171 drives the vehicle 260.
  • In the main radar video image transmission line, the vehicle vision system 502 is connected to the processor 280.
  • The vehicle 260 includes a vehicle vision system 502 configured to capture images of the 360° area around the vehicle.
  • The first imaging device 500 of the vehicle vision system 502 is a front-view camera installed behind the front windshield, in the vehicle grille, on the front instrument panel, or at a position near the front edge of the vehicle, and is used to capture the image of the forward field of view (FOV) 506 of the vehicle 260.
  • The second imaging device 508 of the vehicle vision system 502 is a rear-view camera installed at the rear of the vehicle and used to capture the vehicle's rearward field of view (FOV) 510.
  • The third imaging device 512 of the vehicle vision system 502 is a side-view camera installed on the left side of the vehicle and used to capture the vehicle's lateral field of view (FOV) 514.
  • The fourth imaging device 504 of the vehicle vision system 502 is a side-view camera installed on the right side of the vehicle to capture the vehicle's lateral field of view (FOV) 519.
  • The fifth imaging device 406 is installed on the first manipulator 182; the sixth imaging device 407 on the second manipulator 183; the seventh imaging device 408 on the third manipulator 184; and the eighth imaging device 409 on the fourth manipulator 185. A ninth imaging device 410 is installed on the column 256. The imaging systems of the first through ninth imaging devices are each composed of a video acquisition device 120 and a radar 110; the radar 110 is a lidar or a millimeter-wave radar.
  • The radar 110 is used to detect targets and collect target data and the environmental coordinates of each target.
  • The radar 110 adopts an FMCW scheme with one transmitter and two receivers, together with 2D-FFT data processing technology.
  • The detected target data includes the target's radial distance, radial velocity, and angle information.
  • The radial distance and angle information are converted, according to the geometric relationship, into the target's horizontal distance and vertical distance information, as sketched below.
  • The horizontal distance and vertical distance information constitute the target's environmental coordinates relative to the video capture device.
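  • A minimal sketch of this geometric conversion, assuming the angle is the azimuth measured from the radar boresight (the patent does not spell out the angle convention):

```python
import math

def radar_to_env(radial_dist_m: float, azimuth_deg: float) -> tuple[float, float]:
    """Convert a radar measurement (radial distance, azimuth angle) into the
    horizontal and vertical distances that form the target's environmental
    coordinates relative to the sensor."""
    az = math.radians(azimuth_deg)
    horizontal = radial_dist_m * math.sin(az)  # lateral offset from boresight
    vertical = radial_dist_m * math.cos(az)    # distance along boresight
    return horizontal, vertical

# Example: a target 50 m away, 10 degrees off boresight
print(radar_to_env(50.0, 10.0))  # approximately (8.68, 49.24)
```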
  • The target data detected by the radar differs from scan to scan; in order to obtain more accurate target information and eliminate false targets as far as possible, data association and target tracking technology must be adopted to integrate the target information detected by the radar over multiple scans.
  • The radar performs data association and adaptive filtering prediction; a video trigger signal is then output, which triggers the camera to perform image acquisition and target extraction, and the targets detected by the radar are converted into environmental coordinate data relative to the camera and transmitted to the radar video information fusion system 130 for information fusion.
  • the video acquisition device 120 is used to collect the image information and pixel coordinates of the target after the radar achieves tracking of the target.
  • The video acquisition device 120 is composed of a camera; after collecting the graphic information, it acquires target characteristic data by processing the image and transmits the target's pixel coordinate data to the radar video information fusion system 130.
  • The radar and the video acquisition equipment connected to the input of the radar video information fusion system 130 are used to fuse the target data and image information of the target. This specifically includes converting the target data collected by the radar 110 from environment coordinates to the pixel coordinates corresponding to the image, time registration between the target positions detected by the radar 110 and the image information or video data collected by the video acquisition device 120, first data association, and decision making; the target fusion result is displayed on the display screen.
  • the detection and processing method of the radar video composite data detection and processing system includes the following steps:
  • the radar detects the target and collects the target data and environmental coordinates of the target.
  • the radar detects the target and processes the echo data to obtain target data.
  • the target data includes the radial distance, radial velocity and angle information of the target.
  • the radar converts the radial distance and angle information into the horizontal distance and the vertical distance of the target according to the geometric relationship through the data feature transformation, and the horizontal distance and the vertical distance of the target constitute the environmental coordinates of the target relative to the video acquisition device.
  • After the radar obtains the target data of the target, it performs the second data association on the radar information.
  • The methods by which the radar performs the second data association on the target data obtained at the current moment include the track bifurcation method, the nearest neighbor method, and the joint probabilistic data association (JPDA) algorithm.
  • JPDA: joint probabilistic data association algorithm
  • The radar judges the number of targets it has detected; if the number of detected targets is less than the preset number threshold, i.e., the targets are few or sparse, the track bifurcation method or the nearest neighbor method is used for data association, since the calculation is simple and real-time; otherwise, the JPDA algorithm is used.
  • The radar performs adaptive filtering prediction on the target data acquired at the current moment; the adaptive filtering prediction can use Kalman filter tracking to perform target tracking and prediction, as sketched below.
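  • A minimal constant-velocity Kalman filter sketch of the tracking-and-prediction step described above; the state layout, noise values, and sample period are illustrative assumptions, not values from the patent:

```python
import numpy as np

# State: [position, velocity]; constant-velocity model with period dt (assumed).
dt = 0.05
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition
H = np.array([[1.0, 0.0]])                # only position is measured
Q = 0.01 * np.eye(2)                      # process noise (assumed)
R = np.array([[0.5]])                     # measurement noise (assumed)

def kalman_step(x, P, z):
    """Predict the target's next state, then correct with the new measurement z."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    y = z - H @ x_pred                    # innovation (residual)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    return x_pred + K @ y, (np.eye(2) - K @ H) @ P_pred

x, P = np.array([[50.0], [0.0]]), np.eye(2)
for z in [49.2, 48.5, 47.7]:              # successive radial distances
    x, P = kalman_step(x, P, np.array([[z]]))
print(x.ravel())                          # filtered position and velocity
```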
  • the video acquisition device collects the image information and pixel coordinates of the target.
  • the video capture device captures the image information of the target.
  • The video acquisition device performs image processing on the image information to obtain target feature data, and transmits the target feature data and pixel coordinate data to the radar video information fusion system.
  • In step S3, the radar video information fusion system fuses the target data and image information of the target; information fusion includes coordinate transformation, time registration, data decision-making, and the first data association.
  • The radar video information fusion system converts the target data obtained by the radar from environment coordinates to the pixel coordinates corresponding to the video information. Specifically, in the environment coordinate system Ow-XwYwZw, the origin Ow is the intersection of the vertical through the video acquisition device with the ground (it can also be set at any position, generally according to the actual situation); the Yw axis points horizontally forward along the direction of the video captured by the video capture device; the Zw axis points upward, perpendicular to the horizontal plane; and the Xw axis lies in the horizontal plane, perpendicular to the Yw axis.
  • In the pixel coordinate system Oo-UV, the U axis and the V axis form the imaging plane; the imaging plane is perpendicular to the Yw axis of the environment coordinate system; the upper-left corner of the imaging plane is the coordinate origin Oo; and the unit of the pixel coordinate system is the pixel.
  • The video capture device is installed H meters above the ground; the relationship between the environmental coordinates and the pixel coordinates is the pinhole projection shown in formula (1):

    u = ax * (xw / yw) + u0,    v = az * ((H - zw) / yw) + v0    (1)
  • where u is the U-axis coordinate of the target in the pixel coordinate system;
  • v is the V-axis coordinate of the target in the pixel coordinate system;
  • ax and az are the equivalent focal lengths of the video capture device along the Xw and Zw axes;
  • u0 and v0 are the coordinates of the pixel center of the image information;
  • and xw, yw, zw are the environmental coordinate values of points within the physical range illuminated by the camera.
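  • A minimal sketch of the environment-to-pixel conversion using the reconstructed formula (1); the equivalent focal lengths, pixel center, and mounting height are illustrative assumptions:

```python
def env_to_pixel(xw: float, yw: float, zw: float,
                 ax: float = 800.0, az: float = 800.0,  # equivalent focal lengths (assumed)
                 u0: float = 640.0, v0: float = 360.0,  # pixel center (assumed)
                 H: float = 1.5) -> tuple[float, float]:
    """Project a point from the environment coordinate system Ow-XwYwZw into
    the pixel coordinate system Oo-UV per formula (1)."""
    if yw <= 0:
        raise ValueError("point must lie in front of the camera (yw > 0)")
    u = ax * xw / yw + u0
    v = az * (H - zw) / yw + v0
    return u, v

# Example: a target 20 m ahead, 2 m to the left, at ground level
print(env_to_pixel(-2.0, 20.0, 0.0))  # (560.0, 420.0)
```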
  • the radar video information fusion system performs time registration on the radar target data and the image information of the video acquisition device.
  • The data refresh frequencies of the radar and the video camera are different; the radar detection information and the extracted video target information must be registered in time to ensure synchronization of the paired data and to exploit the complementary advantages of radar and video.
  • The data refresh frequency of the radar is faster than that of the camera, and a time registration algorithm based on the least-squares criterion can be used. Specifically, for two different types of sensors C and R, the sampling period of sensor C is τ, the sampling period of sensor R is T, and the proportional coefficient between the sampling periods is an integer n.
  • The latest target state estimation time from sensor C is recorded as (k-1)τ, and n is the number of times sensor R estimates the target state within one period of sensor C.
  • The idea of least-squares time registration is to fuse the n measurements collected by sensor R into one virtual measurement and use it as the measurement value of sensor R at the current moment; this virtual measurement is then fused with the measurement value of sensor C, eliminating the desynchronization of target state measurements caused by time deviation and removing the influence of time mismatch on the accuracy of multi-sensor information fusion.
  • the acquisition period of the video acquisition device is ⁇
  • the acquisition period of the radar is T
  • the scale factor of the acquisition period is an integer n
  • the latest target state estimation time of the video acquisition device is recorded as (k-1) ⁇
  • n is the number of times that the radar detects the target in one cycle of the video acquisition device
  • the n measured values collected by the radar are merged into a virtual measurement and used as the current measurement value of the radar.
  • Let Sn = [s1, s2, ..., sn]^T be the set of measurements of a given target position detected by the radar from (k-1)τ to kτ, with sn corresponding to the video sample at time kτ; the fused measurement value and its derivative form a column vector, and the virtual measurement of the radar detection data is expressed as the least-squares estimate computed from s1, s2, ..., sn.
  • The measurement noise vector Vn = [v1, v2, ..., vn]^T has zero mean; its covariance matrix is determined by the radar measurement noise (σ²·I for independent, identically distributed noise).
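  • A minimal sketch of least-squares time registration: the n radar samples taken during one video period are fitted to a position-plus-velocity model and evaluated at the video sampling instant to form the virtual measurement. The rates and sample values are illustrative assumptions:

```python
import numpy as np

def fuse_radar_samples(times: np.ndarray, samples: np.ndarray, t_video: float) -> float:
    """Fuse n radar measurements into one virtual measurement at the video
    sampling instant via a least-squares fit of s(t) = u + u_dot * t."""
    A = np.column_stack([np.ones_like(times), times])
    (u, u_dot), *_ = np.linalg.lstsq(A, samples, rcond=None)
    return u + u_dot * t_video

# Example: radar at 50 Hz, video at 10 Hz, so n = 5 radar samples per frame
t = np.array([0.00, 0.02, 0.04, 0.06, 0.08])
s = np.array([50.0, 49.6, 49.1, 48.7, 48.2])   # noisy radial distances
print(fuse_radar_samples(t, s, t_video=0.08))  # virtual measurement near 48.2
```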
  • the measured value of the radar at the current moment and the measured value of the video acquisition device are fused using the nearest neighbor data association method.
  • The radar video information fusion system makes a data decision between the radar target data and the image information of the video acquisition device. Specifically, the radar video information fusion system determines whether the image quality of the image information collected by the video acquisition device at the current moment is greater than a preset threshold; if so, the target number information extracted from the image information is used, and if not, the target number information extracted from the target data collected by the radar is used.
  • the radar video information fusion system performs the first data association between the radar target data and the image information of the video acquisition device.
  • The first data association uses the nearest-neighbor data association method, which specifically includes: first, tracking gates are set up to limit the potential associations.
  • The tracking gate is a subspace in the tracking space.
  • The tracking gate is centered on the target position from video processing or radar detection, and its size should ensure a certain probability of correct matching; larger residuals are therefore eliminated first. If more than one radar-detected target falls within the tracking gate, the one with the smallest residual is taken as the target.
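  • A minimal sketch of nearest-neighbor association with a tracking gate as described above; the pixel coordinates and gate radius are illustrative assumptions:

```python
import numpy as np

def nearest_neighbor_associate(video_pos: np.ndarray,
                               radar_targets: np.ndarray,
                               gate_radius: float):
    """Keep only radar targets whose residual (distance to the video-detected
    position) falls inside the tracking gate, then pick the smallest residual."""
    residuals = np.linalg.norm(radar_targets - video_pos, axis=1)
    in_gate = np.where(residuals <= gate_radius)[0]  # larger residuals eliminated first
    if in_gate.size == 0:
        return None                                  # no plausible match
    return int(in_gate[np.argmin(residuals[in_gate])])

# Example: three radar detections (pixel coordinates), gate radius of 20 px
video_target = np.array([400.0, 300.0])
radar = np.array([[396.0, 305.0], [500.0, 250.0], [412.0, 310.0]])
print(nearest_neighbor_associate(video_target, radar, gate_radius=20.0))  # -> 0
```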
  • The radar video information fusion system displays the target fusion result information on the display screen. The processor 280 is connected to the communication network access system 278.
  • the communication network access system 278 in FIG. 5 includes a processor, a microwave communication unit, a satellite communication unit, and a mobile communication unit; wherein the processor module is adapted to receive data information of the microwave communication unit, the satellite communication unit, and the mobile communication unit.
  • The microwave communication unit includes a directional antenna and a radio frequency unit. The directional antenna sends the received radio frequency signal to the radio frequency unit, and the radio frequency unit conditions the radio frequency signal and sends it to the processor module for demodulation into data information; conversely, after data information is modulated by the processor module, it is sent out through the radio frequency unit and the directional antenna.
  • The satellite communication unit includes a transceiver and a Ka-band modem. The transceiver is connected to a UHF antenna for UHF-band signals; the processor converts received UHF-band signals into Ka-band signals, and the Ka-band modem, connected to a Ka antenna, sends the converted Ka-band signals to the satellite. Conversely, the Ka-band modem receives Ka-band signals sent by the satellite through the connected Ka antenna, the processor converts the received Ka-band signals into UHF-band signals, and the transceiver transmits the converted UHF-band signals through the UHF antenna.
  • the mobile communication unit is a 4G and 5G communication module; the processor is suitable for receiving or sending 4G and 5G signals.
  • The wired communication unit includes a serial communication circuit, a CAN bus module, and an Ethernet module. The processor is adapted to receive data information sent by the serial communication circuit, the CAN bus module, and the Ethernet module and convert it into Ka-band and UHF-band signals; or to extract data information from Ka-band and UHF-band signals and send it out through the serial communication circuit, the CAN bus module, and the Ethernet module.
  • The multi-protocol communication network access system combines wireless and wired communication: the communication signal of the corresponding protocol is sent to the processor, the processor performs the corresponding protocol conversion, and the result is sent out through the corresponding communication method, realizing conversion among multiple protocols.
  • the processor adopts STM32 series single-chip microcomputer.
  • Pins 21, 22, 25, 26, 27, and 28 of the STM32F10XC processor are respectively connected to pins 36, 37, 32, 33, 34, and 35 of the Ethernet module for communication.
  • The CAN bus module uses the SN65HVD230 chip; pins 1 and 4 of the CAN bus module are electrically connected to pins 46 and 45 of the processor. The CAN bus module realizes the cascading of multiple processors, expanding the processor to meet the need for controlled communication among multiple processors.
  • The serial communication circuit includes a communication interface, an RS485 signal communication circuit electrically connected to the processor, and an RS232 signal communication chip. The communication interface is provided with the input terminal of the RS485 signal communication circuit and the input terminal of the RS232 signal communication chip, and these input terminals send data information to the processor; the processor is suitable for converting RS232 signals into RS485 signals.
  • the pins 9 and 10 of the communication interface are electrically connected to pins 30 and 31 of the processor.
  • the 3 and 4 pins of the communication interface are connected to the 6 and 7 pins of the RS485 signal communication circuit, and the 5 and 6 pins of the communication interface are connected to the 7 and 8 pins of the RS232 signal communication chip.
  • the pins 1, 2, and 4 of the RS485 signal communication circuit are respectively connected to the pins 14, 15, and 16 of the processor, and the pins 10 and 9 of the RS232 signal communication chip are respectively connected to the pins 12 and 13 of the processor.
  • The processor module is equipped with an information classification database. The processor module extracts the key content of the data information, compares it against the information classification database, classifies the data according to the comparison result, and transmits it according to the transmission mode corresponding to its classification. During classified transmission, the corresponding communication protocol is loaded into the data information to meet the communication requirements, thereby realizing automatic configuration among multiple protocols (a sketch of this classification-driven dispatch follows).
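  • A minimal sketch of such classification-driven transmission; the categories, keywords, and frame layout are illustrative assumptions, not details given in the patent:

```python
# Hypothetical information classification database: keyword -> (category, transmission mode)
CLASSIFICATION_DB = {
    "steering": ("control", "can"),
    "video":    ("telemetry", "ethernet"),
    "diagnost": ("report", "rs485"),
}

def classify_and_frame(payload: bytes) -> tuple[str, bytes]:
    """Extract key content, compare it against the classification database,
    and load the corresponding protocol tag into the outgoing data."""
    key_content = payload[:16].decode(errors="ignore").lower()
    for keyword, (category, transport) in CLASSIFICATION_DB.items():
        if keyword in key_content:
            return transport, f"{transport}:{category}:".encode() + payload
    return "ethernet", b"ethernet:default:" + payload   # fallback route

transport, frame = classify_and_frame(b"steering angle=12.5")
print(transport, frame)  # can b'can:control:steering angle=12.5'
```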
  • The serial communication circuit also includes a communication indication circuit electrically connected to the communication interface; the communication indication circuit is provided with a first indicator light and a second indicator light. When the RS485 signal communication circuit connected to the communication interface works normally, the first indicator light is green; when the RS232 signal communication chip connected to the communication interface works normally, the second indicator light is green.
  • the multi-protocol communication network access system also includes: a DC-DC step-down circuit; the DC-DC step-down circuit is suitable for powering and stabilizing equipment.
  • the communication network access system 278 is connected to the wireless carrier system 262, the wireless carrier system 262 is connected to the ground network 264, the ground network 264 is connected to the switch 291, the switch 291 is connected to the remote control center 298, and the remote control center 298 is connected to the second processor 215 Connected, the second processor 215 is connected to the visual display 255.
  • The vehicle vision system 502 is connected to the processor 280; the processor 280 to the communication network access system 278; the communication network access system 278 to the communication satellite 289; the communication satellite 289 to the uplink transmitting station 290; the uplink transmitting station 290 to the ground network 264; the ground network 264 to the switch 291; the switch 291 to the remote control center 298; the remote control center 298 to the second processor 215; and the second processor 215 to the visual display 255.
  • The radar 110 and the video acquisition device 120 of the vehicle vision system 502 are fused by the radar video information fusion system 130. The scanning unit 443 scans the image of the radar video information fusion system 130 and transmits it to the compression storage unit 444; the compression storage unit 444 transmits the image to the first judging unit 445; the first judging unit 445 transmits the image to the compressed data generating unit 446; and the compressed data generating unit 446 transmits the compressed image to the sending module 447, which sends it to the communication network access system 278.
  • The communication network access system 278 transmits the image through the wireless carrier system 262 and the ground communication network 264 to the switch 291; the switch 291 transmits it to the remote control center 298; and the remote control center 298 transmits it to the second processor 215.
  • The second processor 215 is connected to the receiving module 263 and transmits the received image to it; the receiving module 263 transmits the image to the compressed data scanning unit 449; the compressed data scanning unit 449 to the compression logic acquisition unit 450; the compression logic acquisition unit 450 to the decompression reading unit 451; the decompression reading unit 451 to the second judgment unit 452; the second judgment unit 452 to the original byte data recovery unit 453; and the original byte data recovery unit 453 transmits the image to the visual display 255.
  • Figures 17-23 are data compression and decompression methods and systems.
  • Figure 17 is a data compression and decompression interface information structure diagram.
  • the interface information can include the compression type and the size of the original byte data.
• the compression type can be defined as follows: 0 indicates no compression, and 1 indicates compressed.
• further compression is performed on the continuously increasing byte data in the original byte data and on the discontinuously identical, discontinuously increasing byte data in the original byte data, which further reduces data redundancy and improves data transmission efficiency; by adding interface information to the compressed data, the receiving module can correctly complete the decompression process.
• the transmitted data is compressed and then transmitted between the wireless node 411 and the wireless node 414 through the wireless channel.
• the wireless node 411 includes a sending module 412 and the wireless node 414 includes a receiving module 413; the data compression function is deployed on the sending module 412, and the data decompression function is deployed on the receiving module 413.
• the data compression method includes the following steps: 415, scan the original byte data, starting from the first byte of the original byte data and proceeding sequentially; according to the scanning results, the redundant components of the original byte data are determined, and the next compression step is chosen according to the characteristics of the redundant data; 416, compress and store the original byte data.
• any one byte of a run of consecutive identical byte data is stored as another byte of data; for example, when the original byte data contains 3 consecutive identical bytes 0x05, 0x05, 0x05, the first logical operation performs a logical "OR" of 0x80 with the run length 0x03,
• yielding the first logical operation value 0x83; the first logical operation value 0x83 is then stored as one byte of data, and the repeated byte 0x05 is stored as another byte of data.
• 417, determine whether the original byte data has been fully scanned; if the scan is complete, go to step 418, and if not, return to step 415 and continue scanning; 418, generate compressed data from the stored byte data.
• in step 416, compression is performed on the consecutive identical byte data in the original byte data, so that byte data that originally occupied 3 bytes occupies only 2 bytes after compression.
• 0x83 and 0x05 are the compressed data.
• step a: scan the original byte data; step b: if there are consecutive identical byte data in the original byte data, perform the first logical operation with the number of consecutive identical bytes to obtain the first logical operation value, store the first logical operation value as one byte of data, and store any one byte of the identical run as another byte of data; step c: determine whether the original byte data has been fully scanned; if so, go to step d, and if not, return to step a; step d: generate compressed data from the stored byte data.
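As an illustrative sketch of steps a-d (not the patent's own code), the consecutive-identical-byte case can be written as follows; the 0x3F cap on the run length is an assumption that keeps the count clear of the 0xC0 tag used later for increasing runs:

```python
def compress_identical_runs(original: bytes) -> bytes:
    """Sketch of steps a-d for runs of consecutive identical bytes only."""
    out = bytearray()
    i = 0
    while i < len(original):                      # step a: scan sequentially
        run = 1
        while (i + run < len(original)
               and original[i + run] == original[i]
               and run < 0x3F):                   # assumed cap; see lead-in
            run += 1
        if run > 1:
            out.append(0x80 | run)                # step b: first logical operation, e.g. 0x80 | 0x03 = 0x83
            out.append(original[i])               # the repeated byte, e.g. 0x05
        else:
            out.append(original[i])               # lone bytes belong to the literal case described below
        i += run                                  # step c: repeat until the scan completes
    return bytes(out)                             # step d: compressed data

print(compress_identical_runs(b"\x05\x05\x05").hex())  # -> "8305"
```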
• according to step 419, the data characteristics in the original byte data are first determined; if, per step 420, it is determined that there is continuously increasing byte data in the original byte data, step 421 can be performed, in which the second logical operation is performed on the number of continuously increasing bytes to obtain the second logical operation value; then step 422 is executed, that is, the second logical operation value is stored as one byte of data and the first byte of the continuously increasing byte data is stored as another byte of data.
• the second logical operation is an "OR" operation using 0xC0 and the number of continuously increasing bytes.
• otherwise, step 424 may be performed, in which the third logical operation is performed on the number of discontinuously identical, discontinuously increasing bytes to obtain the third logical operation value; then step 425 is performed, that is, the third logical operation value is stored as one byte of data, and each discontinuously identical, discontinuously increasing byte is stored in sequence as another byte of data.
• the third logical operation is an "OR" operation using 0x00 and the number of discontinuously identical, discontinuously increasing bytes.
• according to step 426, the stored byte data is obtained and compressed data is generated from the stored byte data.
• finally, interface information is added to the compressed data to generate a compressed data packet.
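Putting the three cases together with the interface information, one possible end-to-end compressor reads as follows; the tag values follow the text (0x80 for identical runs, 0xC0 for increasing runs, 0x00 for the remaining bytes), while the header layout and the 0x3F count cap are assumptions for illustration:

```python
import struct

TAG_IDENTICAL  = 0x80   # 0x80 | n, followed by the repeated byte
TAG_INCREASING = 0xC0   # 0xC0 | n, followed by the first byte of the run
TAG_LITERAL    = 0x00   # 0x00 | n, followed by the n bytes themselves

def compress(original: bytes) -> bytes:
    out = bytearray()
    i, n = 0, len(original)
    while i < n:
        same = 1                                  # length of an identical run starting at i
        while i + same < n and original[i + same] == original[i] and same < 0x3F:
            same += 1
        inc = 1                                   # length of an increasing run starting at i
        while i + inc < n and original[i + inc] == (original[i] + inc) & 0xFF and inc < 0x3F:
            inc += 1
        if same >= 2 and same >= inc:
            out += bytes([TAG_IDENTICAL | same, original[i]]); i += same
        elif inc >= 2:
            out += bytes([TAG_INCREASING | inc, original[i]]); i += inc
        else:                                     # gather bytes that start no run at all
            j = i + 1
            while (j < n and j - i < 0x3F
                   and not (j + 1 < n and (original[j + 1] == original[j]
                                           or original[j + 1] == (original[j] + 1) & 0xFF))):
                j += 1
            out += bytes([TAG_LITERAL | (j - i)]) + original[i:j]; i = j
    # interface information: compression type (1 = compressed) and original size
    return struct.pack(">BI", 1, n) + bytes(out)
```

For example, compress(b"\x05\x05\x05\x01\x02\x03") yields the header followed by 0x83 0x05 (identical run) and 0xC3 0x01 (increasing run).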
  • Figure 23 is a method of data decompression.
• the method of data decompression may include the following steps: 427, scan the compressed data, where the compressed data is the data obtained after the original byte data has been compressed by the data compression method of the first embodiment; 428, perform the compression logic judgment operation on the nth byte of the compressed data to obtain a compression logic judgment value and a compression logic, where n is a natural number greater than or equal to 1; 429, decompress and read the compressed data according to the compression logic judgment value and the compression logic.
• when the compression logic judgment value is equal to the first preset value, it is judged that the original byte data corresponding to the compression logic contains consecutive identical bytes; the first logical number operation is performed on the nth byte of data to obtain the data count i, where i is a natural number greater than or equal to 2, and the (n+1)th byte of the compressed data is read repeatedly i times. 430, judge whether scanning of the compressed data is complete; if so, go to step 431, and if not, return to step 427 and continue scanning. 431, restore the original byte data from the read byte data; in the example of step 429, the original byte data is the read byte data 0x05, 0x05, 0x05.
• on the basis of decompressing and reading the compressed data according to the compression logic judgment value and the compression logic, it is further provided that when the compression logic judgment value is equal to the second preset value, the original byte data corresponding to the compression logic contains continuously increasing bytes, and that when the compression logic judgment value is equal to neither the first preset value nor the second preset value, the original byte data corresponding to the compression logic contains discontinuously identical, discontinuously increasing bytes; the compressed data is decompressed and read accordingly.
• in step 434, it is determined that the original byte data corresponding to the compression logic contains continuously increasing bytes;
• step 435 is then performed to carry out the second logical number operation on the nth byte of the compressed data to obtain the data count j, where j is a natural number greater than or equal to 2; finally, according to step 436, j bytes of data are read in sequence starting from the (n+1)th byte of the compressed data.
• the second preset value is 0xC0; the second logical number operation is an "OR" operation using 0x38 and the nth byte of data.
• in step 438, it can be determined that the original byte data corresponding to the compression logic is discontinuously identical and discontinuously increasing;
• step 439 is then performed to carry out the third logical number operation on the nth byte of the compressed data to obtain the data count k, where k is a natural number greater than or equal to 2; finally, according to step 440, k bytes of data are read in sequence starting from the (n+1)th byte of the compressed data, where the third logical number operation is an "OR" operation using 0x00 and the nth byte of data. Finally, according to step 441, the read byte data is obtained and the original byte data is restored.
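The corresponding receiver-side sketch, under the same assumptions as the compressor above; note that the count is extracted here by masking the low bits of the tag byte, which is one natural reading of the "logical number operations" described in the text:

```python
import struct

def decompress(packet: bytes) -> bytes:
    comp_type, orig_size = struct.unpack(">BI", packet[:5])   # interface information
    data = packet[5:]
    if comp_type == 0:                # compression type 0: data was not compressed
        return data
    out = bytearray()
    n = 0
    while n < len(data):              # 427: scan the compressed data
        tag = data[n]                 # 428: compression logic judgment on the nth byte
        count = tag & 0x3F            # assumed count extraction (low bits of the tag)
        if tag & 0xC0 == 0x80:        # first preset value: consecutive identical bytes
            out += bytes([data[n + 1]]) * count
            n += 2
        elif tag & 0xC0 == 0xC0:      # second preset value: continuously increasing bytes
            first = data[n + 1]
            out += bytes((first + k) & 0xFF for k in range(count))
            n += 2
        else:                         # neither preset value: count literal bytes follow
            out += data[n + 1 : n + 1 + count]
            n += 1 + count
    assert len(out) == orig_size      # 431: the header lets the receiver verify recovery
    return bytes(out)
```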
• Figure 15 shows a data compression system 442 including: the radar 110 (composed of lidar and millimeter-wave radar), the video acquisition device 120, the radar video information fusion system 130, an original byte data scanning unit 443, a compression storage unit 444, a first judging unit 445, a compressed data generating unit 446, and a sending module 447. The original byte data scanning unit 443 is used to scan the original byte data; the compression storage unit 444 is used to compress and store the original byte data; the first judging unit 445 is used to judge whether the original byte data has been fully scanned; the compressed data generating unit 446 is used to generate compressed data from the stored byte data, and the compressed data is transmitted to the sending module 447.
• Figure 16 shows a data decompression system 448 including a receiving module 263, a compressed data scanning unit 449, a compression logic judgment value and compression logic acquisition unit 450, a decompression reading unit 451, a second judgment unit 452, and an original byte data recovery unit 453.
• the compressed data scanning unit 449 is used to scan the compressed data; the compression logic judgment value and compression logic acquisition unit 450 is used to perform the compression logic judgment operation on the nth byte of the compressed data to obtain the compression logic judgment value and the compression logic, where n is a natural number greater than or equal to 1; the decompression reading unit 451 is used to decompress and read the compressed data according to the compression logic judgment value and the compression logic; the second judgment unit 452 is used to judge whether scanning of the compressed data is complete; the original byte data recovery unit 453 is used to recover the original byte data from the read byte data.
• the visual display 255 displays the original byte data recovered by the original byte data recovery unit 453.
  • the remote operator 171 uses the following system to remotely drive the vehicle 260 to walk:
  • Figures 33 to 46 are remote driving systems.
  • the distal end of the first link 139 is connected to the proximal end of the second link 137 at a joint that provides a horizontal pivot axis 138.
• the proximal end of the third link 124 is connected to the distal end of the second link 137 at a rolling joint, so that the third link can rotate or roll about an axis extending along the axes of both the second link and the third link.
• the rotation or rolling at the joint 123 occurs distally of the pivot joint 125.
• the distal end of the fourth link 136 is connected to the instrument holder 121 through a pair of pivot joints 135, 134, which together position the instrument holder 121; translation of the manipulator arm assembly 133 of the robot 170 at the prismatic joint 132 facilitates axial movement of the instrument 126, enabling the instrument holder 131 to be attached to the cannula through which the instrument 126 is slidably inserted.
• the second instrument 126 includes additional degrees of freedom; actuation of these degrees of freedom is driven by motors of the robot manipulator arm assembly 133, and the interface between the second instrument 126 and the manipulator arm assembly 133 can be arranged closer or farther along the kinematic chain of the manipulator arm assembly 133.
• the second instrument 126 includes a rotary joint 130 on the proximal side of the pivot point PP, arranged at a desired location; distally of this point, the second instrument 126 allows the end effector 128 to pivot about the instrument wrist axes 129, 127.
  • the angle ⁇ between the end effector jaws 231 can be controlled independently of the position and orientation of the end effector 128.
• the left hand-held input device 177 and the right hand-held input device 178 can be connected to and separated from the console 169 through wireless communication; the left hand-held input device 177 is connected to the second processor 215, and the right hand-held input device 178 is connected to the second processor 215.
• the remote operator 171 begins remote driving work after the remote console 169 activates the second processor 215; the left hand of the remote operator 171 controls the left hand input device 177, which controls the movement of the arm end 197 through the second processor 215, and the right hand of the remote operator 171 controls the right hand input device 178, which likewise controls the movement of the arm end 197 through the second processor 215; the arm end 197 uses the first contact end 194 and the second contact end 196 of the end effector 193 to contact the steering wheel 235 and grip it tightly.
  • the steering wheel 235 can be rotated by moving the left hand input device 177 and the right hand input device 178 in opposite directions.
• the remote operator 171 uses the second processor 215 software to control the first manipulator 182 and the second manipulator 183 of the robot 170.
• the remote operator 171 determines the forces applied to the first manipulator 182 and the second manipulator 183 of the robot 170 through measurement, model estimation, or a combination of measurement and modeling.
  • the first manipulator 182 and the second manipulator 183 provide tactile feedback to the remote operator 171 through the remote console 169.
  • This tactile feedback can simulate the manual manipulation of the arm end 197 for the remote operator 171 to control the steering wheel 235.
  • the reaction force corresponding to the steering wheel 235 experienced by the first manipulator 182 of the robot 170 can be simulated for the remote operator 171.
  • the first contact end 194 and the second contact end 196 of the end effector 193 pivot relative to each other so as to define a pair of end effector jaws 231.
• the jaws 231 are actuated by squeezing the grip members of the input devices 177 and 178; the robot 170 manipulates the first manipulator 182 and the second manipulator 183 to move the transmission assembly 195 over the upper part of the steering wheel 235, so that the shaft 187 extends and retracts, providing the desired movement of the end effector 193.
• the robot 170 manipulates the first manipulator 182, the second manipulator 183, the third manipulator 184, and the fourth manipulator 185 to act at the steering wheel 235, the gear position 400, the brake pedal 401, and the accelerator pedal 402 during remote driving.
  • the first manipulator 182 and the second manipulator 183 grip the steering wheel 235 to change the direction of the car.
• the first manipulator 182 is connected with an instrument holder 180; the instrument holder 180 supports the instrument 186 and the arm end 197.
• the instrument holder 180 is connected to the first manipulator 182 by a motorized joint.
• the instrument holder 180 includes an instrument holder frame 188, a clamp 189, and an instrument holder bracket 190.
• the clamp 189 is fixed at the distal end of the instrument holder frame 188; the clamp 189 can be connected to and separated from the arm end 197; the instrument holder bracket 190 is connected with the instrument holder frame 188, and the linear translation of the instrument holder bracket 190 along the instrument holder frame 188 is a motorized translational movement controlled by the second processor 215.
  • the instrument 186 includes a transmission assembly 195, an elongated shaft 187 and an end effector 193, and the transmission assembly 195 is connected to the instrument holder bracket 190.
  • the shaft 187 extends distally from the transmission assembly 195.
  • the end effector 193 is provided at the distal end of the shaft 187.
• the shaft 187 defines a longitudinal axis 192 that coincides with the longitudinal axis defined by the arm end 197.
  • the end effector 193 can be extended and retracted from the working space.
  • the remote operator 171 sends instructions through the second processor 215, the three-state switch 202 receives the activation signal, and the remote operator 171 uses the second processor 215 to drive the vehicle remotely.
• the vehicle remote driving system 258 connects the robot 170 to manipulate the arm end 197 of the first manipulator 182 to grip and move away from the steering wheel 235.
• the remote operator 171 uses the second processor 215 and the vehicle remote driving system 258 to connect the robot 170 and manipulate the arm end 197 of the second manipulator 183 to grip and move away from the steering wheel 235.
• the first contact end 194 and the second contact end 196 of the end effector 193 apply force to the steering wheel 235 to rotate it; releasing the three-state switch 202 stops the movement of the arm end 197 while the arm end 197 remains connected to the steering wheel 235.
• the remote operator 171 sends an activation signal for the second direction, the first direction being opposite to the second direction; the tri-state switch 202 receives the second-direction activation signal, and the arm end 197 moves toward the steering wheel 235.
• diagram 151 is a simplified schematic of the master/slave controller 153 connecting the master input device 152 to the slave manipulator 154.
• subscripts can be appended to these vectors to identify specific structures, so that x_m is the position of the master input device in the associated master workspace or coordinate system, and x_s represents the position of the slave in its workspace.
• the velocity vector associated with a position vector is denoted by a dot above the vector or by the word "dot" between the vector and the subscript, such as xdot_m for the master velocity vector; the velocity vector is mathematically defined as the change of the position vector over time.
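In conventional notation, the dot therefore denotes the time derivative of the corresponding position vector:

$$\dot{x}_m = \frac{dx_m}{dt}, \qquad \dot{x}_s = \frac{dx_s}{dt}$$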
• the controller 153 contains an inverse Jacobian velocity controller; given the orientation of the master input device and the speed of the master input device, the controller 153 calculates power commands for transmission to the slave manipulator 154 so as to realize movement of the slave end effector corresponding to the master input device velocity. The controller 153 can also calculate, from the slave position x_s and the slave velocity, the force reflection signal applied to the master input device and from there to the hand of the remote operator 171.
  • the first module 159 contains an inverse Jacobian speed controller, which has an output from a calculation performed using an inverse Jacobian matrix modified according to the virtual slave path 163.
• vectors associated with the virtual slave are usually denoted with a v subscript, so that the virtual slave joint-space velocity qdot_v is integrated to provide q_v, and the inverse motion module 162 is used to process q_v to generate the virtual slave position signal x_v.
• the virtual slave position x_v and the master input command x_m are combined and processed using the forward motion module 161.
• the use of the virtual slave helps provide smooth control and force reflection when approaching a hard limit of the system, when exceeding a soft limit of the system, and the like.
• the structures of the first control module 159, the second control module 160, and the other components of the control diagram 165 and other controllers include data processing hardware, software, and firmware; such structures include reprogrammable software and data embodied in machine-readable code and stored in a tangible medium for use by the second processor 215 of the remote console 169, with the machine-readable code stored in a variety of different configurations, including random access memory, non-volatile memory, write-once memory, magnetic recording media, and optical recording media.
• the second processor 215 includes one or more data processors of the remote console 169, including one or more local data processing circuits of the manipulators and instruments as well as separate, remote processing structures and locations; a module may comprise a single common processor board or multiple individual boards, one or more modules may be spread across multiple boards, and some boards may run some or all of the calculations of another module.
• the software code of the modules may be written as a single integrated body of code, each module may be divided into separate subroutines, or part of the code of one module may be combined with some or all of the code of another module.
  • the data and processing structure includes any of a variety of centralized or distributed data processing and programming architectures.
• the output of the controller, which will often try to solve for a specific manipulator joint configuration vector q, is used to generate commands for these highly configurable slave manipulator mechanisms.
  • Manipulator linkages usually have enough degrees of freedom to occupy a series of joint states for a given end effector state.
  • These structures are sometimes referred to as having excess, extra or redundant degrees of freedom, and these terms usually cover kinematic chains in which the middle link can move without changing the orientation of the end effector.
• the main joint controller of the first module often tries to determine or solve for the virtual joint velocity vector qdot_v, which can be used to drive the joints of the slave manipulator 164 so that the end effector accurately follows the master command x_m.
  • the inverse Jacobian matrix usually does not completely define the joint vector solution.
• the mapping from the Cartesian command xdot to the joint motion qdot is a one-to-many mapping; because the mechanism is redundant, there are mathematically infinitely many solutions, represented by a subspace of the inverse mapping (the null space).
• the controller uses a Jacobian matrix with more columns than rows to reflect this relationship, mapping many joint velocities onto relatively few Cartesian velocities.
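As a minimal numerical sketch of this many-to-one relationship (not the patent's implementation), the pseudoinverse of a wide Jacobian picks one joint-velocity solution, and the null-space projector spans the remaining internal motions; the 7-joint/6-DOF dimensions below are assumptions for illustration:

```python
import numpy as np

# Hypothetical 7-joint arm commanding a 6-DOF Cartesian velocity: J is 6x7,
# so the map qdot -> xdot is many-to-one and its inverse is underdetermined.
rng = np.random.default_rng(0)
J = rng.standard_normal((6, 7))          # Jacobian: more columns (joints) than rows
xdot = np.array([0.01, 0.0, 0.02, 0.0, 0.0, 0.005])  # commanded Cartesian velocity

J_pinv = np.linalg.pinv(J)               # one choice among infinitely many inverses
qdot = J_pinv @ xdot                     # minimum-norm joint velocity solution

N = np.eye(7) - J_pinv @ J               # null-space projector: joint motions in
qdot_null = N @ rng.standard_normal(7)   # this subspace leave the end effector fixed

assert np.allclose(J @ qdot, xdot)              # follows the Cartesian command
assert np.allclose(J @ qdot_null, 0, atol=1e-9) # internal motion only
```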
  • the concept of the remote motion center 298 constrained by software is determined.
  • different modes characterized by system compliance or stiffness can be selectively implemented.
• after calculating the estimated pivot point, different system modes over a range of pivot points/centers can be realized.
  • the estimated pivot point can be compared with the desired pivot point to generate an error output that can be used to drive the pivot of the instrument to a desired position.
• the estimated pivot point can be used for error detection and therefore for safety, because a change in the estimated pivot point position indicates separation from the steering wheel or a malfunctioning sensor, giving the system an opportunity to take corrective action.
  • the processor 157 includes a first controller module 157 and a second controller module 160.
  • the first module 157 can include a main joint controller and an inverse Jacobian master-slave controller.
  • the main joint controller of the first module 157 can be configured to generate the desired manipulator assembly motion in response to input from the main input device 156.
  • the manipulator linkage has a series of alternative configurations for a given end effector orientation in space. Commands for the end effector to assume a given orientation can cause a variety of different joint motions and configurations.
• the second module 160 can be configured to help drive the manipulator assembly to the desired configuration, driving the manipulator toward the preferred configuration during the master-slave movement; to this end, the second module 160 will contain configuration-dependent filters.
  • Both the main joint controller of the first module 157 and the configuration-related filters of the second module 160 can include filters used by the processor 157 to transfer the control authority of the linear combination of joints to the realization of one or more goals or tasks.
  • F(X) can be a filter that controls the joint to i) provide the desired end effector movement, and ii) provide the pivoting movement of the instrument shaft at the hole site.
  • the main joint controller of the first module 157 may include the filter F(X).
• the filter (1 - F^-1 F)(X) can describe configuration-dependent subspace filters that give control actuation authority over linear combinations of joint velocities orthogonal to the goals of the main joint controller.
  • This configuration-related filter can be used by the second module 160 of the controller 157 to achieve the second goal.
  • the two filters can be further subdivided into more filters corresponding to more specific tasks.
  • the filter F(X) can be divided into F1(X) and F2(X), which are used to control the end effector and control the movement of the pivot axis respectively, any of which can be selected as the highest priority task of the processor.
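A common way to realize such a split between a primary filter and a configuration-dependent filter acting in the orthogonal subspace is null-space task-priority control; the following is an illustrative sketch under that assumption, with hypothetical task Jacobians J1 and J2, not the patent's specific filter design:

```python
import numpy as np

def prioritized_joint_velocity(J1, xdot1, J2, xdot2):
    """Two-task priority sketch: task 1 (e.g. end effector motion) has full
    authority; task 2 (e.g. pivot/pose shaping) acts only in task 1's null space."""
    J1_pinv = np.linalg.pinv(J1)
    qdot1 = J1_pinv @ xdot1                      # primary task, F(X)-like authority
    N1 = np.eye(J1.shape[1]) - J1_pinv @ J1      # subspace orthogonal to task 1
    qdot2 = np.linalg.pinv(J2 @ N1) @ (xdot2 - J2 @ qdot1)
    return qdot1 + N1 @ qdot2                    # task 2 cannot disturb task 1

# Usage with hypothetical dimensions: 7 joints, 6-DOF primary, 1-DOF secondary task.
rng = np.random.default_rng(1)
J1, J2 = rng.standard_normal((6, 7)), rng.standard_normal((1, 7))
qdot = prioritized_joint_velocity(J1, rng.standard_normal(6), J2, rng.standard_normal(1))
```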
  • the robot processor and control technology will often utilize the primary joint controller configured for the first controller task, as well as configuration related filters that utilize the underconstrained solution generated by the primary joint controller for the first controller task.
  • the main joint controller will be described with reference to the first module, and the configuration related filters will be described with reference to the second module, which can also include additional functions and additional modules of various priorities.
• the hardware and programming code of the first module and the second module may be fully integrated, partly integrated, or completely separate.
  • the controller 157 can use the functions of two modules at the same time, and can have a variety of different modes, in which one or two modules are used separately or in different ways.
• the first module 157 can be used with little or no influence from the second module 160; when the end effector is not being driven by the robot, for example during system assembly, the second module 160 has the greater effect, and both modules can be active most or all of the time when robot movement is enabled.
• by setting the gain of the first module to zero, by setting x_s to x_s.actual, and by reducing the rank of the matrix in the inverse Jacobian controller so that it cannot exert excessive control, the configuration-dependent filters are given more control authority; this can reduce or eliminate the influence of the first module on the state of the manipulator assembly, thereby changing the mode of the processor 157 to a capture support mode.
  • the first module 157 can contain some form of Jacobian controller with a Jacobian correlation matrix.
  • the second module 160 can receive a signal from the slave manipulator 158 that indicates the orientation or speed of the follower at least in part resulting from the manual articulation of the slave manipulator linkage.
  • the second module 160 can generate a power command suitable for driving the joint of the follower in order to allow manual articulation of the slave linkage while configuring the follower in a desired joint configuration.
• the controller can use the second module 160 to help derive power commands based on a different signal qdot_o.
• this alternative input signal to the second module 160 of the controller 157 can be used to drive the manipulator linkage so as to maintain or move the pivot position of the minimally invasive aperture along the manipulator structure, thereby avoiding collisions between multiple manipulators, increasing the range of motion of the manipulator structure, and avoiding singularities so as to produce the desired posture of the manipulator.
• Fig. 41 is a block diagram 231 in which input from the MTM controller is used to actively control the remote center of motion (RC), arm end 197 (C), and instrument end effector (E) reference frames.
• legend: RC, remote center of motion; C, arm end 197; E, instrument end effector.
• the instrument end effector (E) frame is actively controlled using input from the primary manipulator controller, while a secondary input device is used to control the remote center (RC) and arm end 197 (C) frames, as shown in block diagram 232.
• the secondary input device uses an arbitrary reference frame, not necessarily the target frame (EYE frame).
• the reference frame transformation EYE T REF can be directly measured or calculated from indirect measurements.
  • the signal conditioning unit combines these inputs in an appropriate common system for use by the slave manipulator controller.
• the posture specifications of the remote center reference frame and the arm end 197 reference frame come from one source or a combination of the following sources: (i) the MTM controller designates these frames in the EYE frame, namely EYE T RC and EYE T C; (ii) the secondary device commands these postures in a convenient reference frame, namely REF T RC and REF T C (where EYE T REF can be determined); and (iii) the slave-side controller assigns these postures in the base frame of the slave arm, namely W T RC and W T C.
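Reading a label such as EYE T RC as the homogeneous transform of the RC frame expressed in the EYE frame, the commands of source (ii) can be mapped into the EYE frame by composing transforms; a small sketch under that assumption (the numeric values are placeholders):

```python
import numpy as np

def make_T(R, p):
    """Assemble a 4x4 homogeneous transform from rotation R and translation p."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, p
    return T

# Hypothetical measured/known transforms (identity rotations used only for brevity).
EYE_T_REF = make_T(np.eye(3), np.array([0.1, 0.0, 0.5]))   # reference frame in EYE frame
REF_T_RC  = make_T(np.eye(3), np.array([0.0, 0.2, 0.0]))   # remote center commanded in REF

# Composition brings the secondary device's command into the common EYE frame,
# which is the common system the signal conditioning unit hands the slave controller.
EYE_T_RC = EYE_T_REF @ REF_T_RC
print(EYE_T_RC[:3, 3])   # -> [0.1 0.2 0.5]
```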
• Figures 45 and 46 are schematic block diagrams of the systems 212 and 213.
• the systems 212 and 213 use the second processor 215 of the computer-aided vehicle remote driving system 258 to control the relationship between the instrument end effector 193 reference frame and the remote control center 298 reference frame.
• it is assumed that the reference frame of the arm end 197 and the reference frame of the remote control center 298 coincide.
• the arm end 197 reference frame and the remote control center 298 are physically constrained to move relative to the instrument end effector 193 only along the longitudinal axis of the arm end 197 and the instrument.
  • Two different strategies are adopted to control the relationship between the frame of reference (E system) of the instrument end effector 193 and the frame of reference (RC system) of the remote control center 298.
• one strategy actively controls the relative distance (d) between the two reference frames, whether the E frame is fixed or moving, using input from a force/torque sensor or the three-state switch.
• the block diagram is used to implement the control subsystem for this mode, which can be described as a 'relative posture controller'.
• slv_cart_delta = S * slv_cart_vel * Ts (where Ts is the sampling time of the controller).
• S takes a value in [1, -1, 0], depending on the command used to control the movement of the steering wheel 235, the gear position 400, the brake pedal 401, and the accelerator pedal 402.
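A minimal sketch of this per-sample update, using the text's variable names; the meaning assigned to each S value is an assumption for illustration:

```python
# S selects approach, retreat, or hold, per the sign convention assumed here.
TOWARD, AWAY, HOLD = 1, -1, 0

def slv_cart_delta(S: int, slv_cart_vel: float, Ts: float) -> float:
    """Per-sample change in the relative distance d along the tool axis."""
    return S * slv_cart_vel * Ts

d = 0.05                                   # current separation, metres (illustrative)
d += slv_cart_delta(TOWARD, 0.02, 0.001)   # one 1 ms controller tick at 20 mm/s
```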
• Figure 46 is a general block diagram of controlling the distance to the tip of the arm 197 using a force/torque or pressure sensor.
• the calculated estimate can be used as the input F to command incremental movement, and the signal F can be any other measured or calculated quantity of force based on the user's interaction with the manipulator; the trajectories of the RC frame and the E frame can be controlled independently, where the control inputs governing these trajectories may all come from the main manipulator.
• the block diagram of the control subsystem for this additional strategy can be called an 'independent posture controller', which can be summarized as commanding insertion (I/O) movement so as to allow lateral movement of the remote control center 298 or the arm end 197 relative to the instrument tip E.
• the remote control center 298 or the arm end 197 will need to pivot about the tip while the instrument is driven, to compensate for the movement of E; this allows movement of the RC and the arm end 197 within the cab.
  • Figures 47-52 are methods for providing fault response, fault isolation, and fault weakening of the remote system.
  • the components of the robot 170 cooperate and interact to perform various aspects of fault reaction, fault isolation and fault weakening in the robot 170.
• the first manipulator 182, the second manipulator 183, the third manipulator 184, and the fourth manipulator 185 each include a plurality of nodes; each node controls multiple motors, which drive the joints and linkage mechanisms in the robotic arm to effect the arm's degrees of freedom of movement, and each node also controls multiple brakes for stopping the rotation of the motors.
• the first manipulator 182 has motors 307, 309, 311, and 313, brakes 308, 310, 312, and 314, and nodes 315, 316, and 317; each of the nodes 315 and 316 controls a single motor/brake pair, the node 317 controls two motor/brake pairs, and a sensor processing unit 318 is included to provide motor displacement sensor information to the node 317 for control purposes.
• the second manipulator 183, the third manipulator 184, and the fourth manipulator 185 are configured similarly to the first manipulator 182, with motors, brakes, and nodes.
  • Each robotic arm is operatively coupled to the arm processor.
  • the arm processor 328 is operatively coupled to the node of the first manipulator 182
  • the arm processor 325 is operatively coupled to the node of the second manipulator 183
  • the arm processor 323 is operatively coupled to the node of the third manipulator 184
• the arm processor 321 is operatively coupled to the nodes of the fourth manipulator 185.
• each arm processor also includes a joint position controller for converting the desired joint positions of its operatively coupled robotic arm into current commands that drive the motors in that arm so as to drive its joints to the desired joint positions.
• the system management processor 320 is operatively coupled to the arm processors 328, 325, 323, 321; the system management processor 320 also translates the user's manipulation of the input devices into the desired joint positions of the associated robotic arms.
• although shown as separate units, the arm processors 328, 325, 323, and 321 may also be implemented as part of the system management processor 320 through program code.
  • the arm management processor 319 is operatively coupled to the system management processor 320 and the arm processors 328, 325, 323, 321.
  • the arm management processor 319 initiates, controls, and monitors certain coordinated activities of the arm in order to save the system management processor 320 from having to do so.
  • the arm manager 319 is also implemented as a part of the system management processor 320 through program code.
• each of the processors and nodes is configured to perform the various tasks herein through any combination of hardware, firmware, and software programming; their functions may be executed by one unit or distributed among many subunits, and each subunit may be implemented by any combination of hardware, firmware, and software programming.
• the system management processor 320 may be distributed as subunits throughout the robot 170, such as in the remote console 169 and the base 173 of the robot 170.
  • the system management processor 320, the arm management processor 319, and each arm processor 328, 325, 323, 321 include multiple processors to perform various processor and controller tasks and functions.
  • Each node and sensor processing unit includes a transmitter/receiver (TX/RX) pair to facilitate communication with other nodes of its robotic arm and an arm processor operatively coupled to its robotic arm.
• the TX/RX pairs are connected to the network in a daisy chain.
• when the RX of a node receives an information packet from the TX of a neighboring node, it checks the destination field in the packet to determine whether the packet is addressed to its node; if so, the node processes the packet; if the packet is addressed to another node, the node's TX forwards the received packet to the RX of the neighboring node in the direction opposite to the one it came from.
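The forwarding rule can be sketched as follows; the packet fields and node API here are illustrative assumptions, not the patent's wire format:

```python
from dataclasses import dataclass

@dataclass
class Packet:
    destination: int
    payload: bytes

class Node:
    def __init__(self, node_id, upstream=None, downstream=None):
        self.node_id = node_id
        self.upstream = upstream       # neighbor toward the arm processor
        self.downstream = downstream   # neighbor away from the arm processor

    def rx(self, packet: Packet, came_from: str) -> None:
        if packet.destination == self.node_id:
            self.process(packet)       # packet is addressed to this node
        else:
            # TX forwards to the neighbor opposite from where the packet arrived
            neighbor = self.downstream if came_from == "upstream" else self.upstream
            if neighbor is not None:
                neighbor.rx(packet, came_from)

    def process(self, packet: Packet) -> None:
        print(f"node {self.node_id} handling {packet.payload!r}")

# Chain 315 -> 316 -> 317, mirroring the nodes of the first manipulator 182.
n315, n316, n317 = Node(315), Node(316), Node(317)
n315.downstream, n316.upstream = n316, n315
n316.downstream, n317.upstream = n317, n316
n315.rx(Packet(destination=317, payload=b"joint cmd"), came_from="upstream")
```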
• a fault reaction logic (FRL) circuit is provided in each robotic arm so that fault notifications are transmitted quickly.
• the first manipulator 182 includes an FRL circuit, the FRL line 329, coupled to the arm processor 328 and to each of the nodes 315, 316, and 317 of the manipulator.
• when the arm processor 328 or one of the nodes 315, 316, and 317 detects a fault affecting it, the arm processor or node pulls the FRL line 329 high to quickly transmit a fault notification to the other components coupled to the line 329.
• when the arm processor 328 is to transmit a recovery notification to the nodes of the first manipulator 182, it pulls the FRL line 329 low to quickly transmit the recovery notification to the other components coupled to the line 329.
• a virtual FRL line may be used instead of the line 329, by designating one or more fields in the packet to carry such fault notifications and recovery notifications.
• the method detects a failure in a failed arm among the plurality of robotic arms, where a robotic arm becomes a "failed arm" by virtue of the detected failure.
• the method then puts the failed arm into a safe state, where the "safe state" refers to isolating the arm with the detected malfunction by preventing further movement of the arm.
• the method determines whether the failure should be regarded as a system failure or a local failure, where "system failure" refers to a failure that affects the performance of at least one other of the multiple robotic arms, and "local failure" refers to a malfunction that affects the performance of only the failed arm.
• a local fault causes only the failed arm to be kept in a safe state until the fault is cleared; it is not a type of fault that causes unsafe operation of a non-failed robotic arm.
• if the fault is of a type that causes unsafe operation of a non-failed arm, the method determines that the detected fault is a system fault, in which case all the robotic arms in the system are placed in a safe state.
• the method puts the non-failed arms of the multiple arms into a safe state only when the fault is regarded as a system failure, where a "non-failed arm" refers to a robotic arm among the multiple robotic arms in which no failure has been detected.
  • the method determines whether the detected fault is classified as a recoverable system fault or an unrecoverable system fault.
• if the fault is classified as a recoverable system fault, the method provides the system user with recovery options.
• if the fault is classified as an unrecoverable system fault, the method waits for the system to shut down.
• if the determination in 329 is that the fault will be treated as a local fault, then in 334 the method determines whether the fault is classified as a recoverable local fault or an unrecoverable local fault.
• if the fault is classified as a recoverable local fault, the method provides the system user with both recovery options and weakened operation options.
• if the fault is classified as an unrecoverable local fault, the method provides only a weakened operation option.
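The fault triage described in this passage can be sketched as a small decision routine; the class and method names below are assumptions mirroring the four error classes named later in the text:

```python
from enum import Enum, auto

class FaultClass(Enum):
    RECOVERABLE_LOCAL = auto()
    UNRECOVERABLE_LOCAL = auto()
    RECOVERABLE_SYSTEM = auto()
    UNRECOVERABLE_SYSTEM = auto()

def react_to_fault(fault: FaultClass, failed_arm, all_arms):
    failed_arm.enter_safe_state()                     # always isolate the failed arm first
    if fault in (FaultClass.RECOVERABLE_SYSTEM, FaultClass.UNRECOVERABLE_SYSTEM):
        for arm in all_arms:                          # system fault: safe-state every arm
            arm.enter_safe_state()
        return ["recover"] if fault is FaultClass.RECOVERABLE_SYSTEM else ["wait_for_shutdown"]
    # local fault: non-failed arms keep operating
    if fault is FaultClass.RECOVERABLE_LOCAL:
        return ["recover", "weakened_operation"]
    return ["weakened_operation"]

class Arm:                                            # minimal stub for demonstration
    def enter_safe_state(self): print("arm braked")

arms = [Arm(), Arm()]
print(react_to_fault(FaultClass.RECOVERABLE_LOCAL, arms[0], arms))
```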
  • FIG. 47 is a flowchart of aspects of a method of performing fault reaction, fault isolation, and fault weakening, which are performed by each node 315, 316, and 317 of the multiple robot arms of the robot 170.
  • each node continuously monitors the signals and information in the node to use conventional fault detection methods to detect faults affecting the node.
  • This type of detected fault is referred to herein as a "local fault” because it is limited to nodes.
  • the node also monitors the FRL circuit for fault notifications sent by its arm processor or another node in its robotic arm.
  • This type of detected fault is referred to herein as a "remote fault” because it is not limited to the node.
• the detected fault may be hardware, firmware, software, environmental, or communications related.
• the node in which a fault has been detected is referred to herein as a "failed node", and its robotic arm as a "failed arm".
  • a node in which no fault has been detected is referred to herein as a “non-failure node”, and a mechanical arm in which no fault has been detected is referred to herein as a “non-failure arm”.
• when a failure is detected in 337, the node puts itself into a safe state; this is done by deactivating one or more of the node's controlled motors and by engaging one or more of the node's controlled brakes.
• the node determines whether the detected fault is a local fault or a remote fault; as with 337, the source of the fault determines whether it is regarded as local or remote. If the failure is determined to be a local failure, the node is a failed node, and the failed node continues by executing 343-346 and 341-342 below; if the failure is determined to be a remote failure, the node is a non-failed node.
• the non-failed node continues by performing 340-342 below.
• the failed node transmits a failure notification to the neighboring nodes in the failed robotic arm in the upstream and downstream directions.
• the "downstream" direction refers to the direction in which packets travel away from the node's arm processor, and the "upstream" direction refers to the direction in which packets travel toward the node's arm processor.
• one way for a node to accomplish this is by pulling the FRL line to the high state.
  • the failed node then diagnoses the fault and sends an error message to the system management processor 320.
• the error message preferably includes the failure information: its error code, error class, and origin; each type of error that may occur and affect the node is assigned an error code.
• the error code is classified into an error class; there are at least four error classes: recoverable arm failure, unrecoverable arm failure, recoverable system failure, and unrecoverable system failure.
  • recoverable means that the user is provided with the option to try to recover from the failure.
• "unrecoverable" means that the user is not provided with the option to try to recover from the failure.
  • the origin of the fault includes information about the identity of the node and optional additional information about the source of the fault in the node.
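A minimal sketch of such an error message, under the assumption that it carries exactly the fields named above; the concrete types and transport are illustrative:

```python
from dataclasses import dataclass
from enum import Enum, auto

class ErrorClass(Enum):
    RECOVERABLE_ARM = auto()
    UNRECOVERABLE_ARM = auto()
    RECOVERABLE_SYSTEM = auto()
    UNRECOVERABLE_SYSTEM = auto()

@dataclass
class ErrorMessage:
    error_code: int           # one code per error type that can affect the node
    error_class: ErrorClass   # one of the four classes listed above
    origin_node: int          # identity of the reporting node
    origin_detail: str = ""   # optional extra information about the fault source

msg = ErrorMessage(error_code=0x12, error_class=ErrorClass.RECOVERABLE_ARM, origin_node=317)
```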
• in 345, the failed node determines whether the detected fault is a recoverable local fault; if the determination in 345 is no, then in 346 the failed node remains in its safe state and ignores any recovery notifications it may subsequently receive on the FRL line; if the determination in 345 is yes, the failed node proceeds to 341.
• if the determination in 339 is that the detected fault is to be regarded as a remote fault, then in 340, where a virtual FRL line is used, the non-failed node transmits the received fault notification onward in the direction opposite to the one it came from; in the case of a real FRL line, the non-failed node need take no action to propagate the fault notification.
  • both the failed node and the non-failed node wait for the recovery notification to be received.
• once the recovery notification is received, the node returns itself from the safe state to its normal operating state; this is done by reversing the actions taken in 338 while avoiding sudden changes. The node then returns to the fault detection task of 337.
• each arm processor 321, 323, 325, and 328 is operatively coupled to a robotic arm of the robot 170.
  • each arm processor continuously monitors its own operation while performing its normal operation tasks and pays attention to the failure notifications transmitted by the failed node in the robot arm to which it is operatively coupled.
  • the fault is referred to herein as a "local fault”.
  • a fault is detected by receiving a fault notification from a failed node in the robotic arm to which it is operatively coupled, and the fault is called a "remote fault.”
  • the remote fault is a fault notification sent along the FRL line by the failed node in the robotic arm operatively coupled to the arm processor.
• when a fault has been detected, the arm processor puts its joint position controller into a safe state by locking the joint position controller's output motor current commands to zero; this reinforces the safe state of the corresponding nodes.
  • the arm processor determines whether the detected fault is a local fault or a remote fault.
  • the source of the fault determines whether the fault will be considered a local fault or a remote fault.
• if the fault is determined to be a local fault, the arm processor is regarded as a failed node and continues by executing 353-356 and 351-352 below.
• if the fault is determined to be a remote fault, the arm processor is regarded as a non-failed node and continues by executing 350-352.
  • the arm processor transmits the failure notification downstream to all nodes of the robotic arm to which it is operatively connected. One way for the arm processor to complete this process is by pulling the FRL line to a high state.
  • the arm processor diagnoses the fault and sends an error message to the system management processor 320.
  • the error message includes fault information, error code, error type, and origin.
• each type of error that may occur and affect the arm processor is assigned an error code.
  • the error code is classified as an error type, and there are at least four error types: recoverable processor failure, unrecoverable processor failure, recoverable system failure, and unrecoverable system failure.
  • the origin of the fault includes information on the identity of the arm processor and optional additional information on the source of the fault in the arm processor.
• the arm processor determines whether the detected fault is a recoverable local fault; this determination is made from the error class of the fault. If the determination in 355 is no, then in 356 the joint position controller of the failed arm processor remains in its safe state and the arm processor ignores any recovery notifications it may subsequently receive on the FRL line. If the determination in 355 is yes, the arm processor proceeds to 350.
• if the determination in 349 is that the detected fault is to be regarded as a remote fault, then in 350 the arm processor waits for a recovery notification from the system management processor 320. In 351, once the recovery notification is received, the arm processor transmits it to all nodes in the robotic arm to which it is operatively coupled, for example by pulling its FRL line low. In 352, the arm processor then returns its joint position controller from the safe state to its normal operating state; this is accomplished by releasing the joint position controller's output motor current commands so that they once again reflect the desired joint positions of the robotic arm to which they are operatively coupled, while avoiding sudden changes. The arm processor then returns to the fault detection task of 347.
• FIGS. 47 and 51 are flowcharts of various aspects of methods for performing fault reaction, fault isolation, and fault weakening, executed by the system management processor 320 of the robot 170.
  • the system management processor also waits to receive an error message transmitted from another component of the robot 170 while performing its normal operation tasks.
• if an error message is received in 357, then in 358 the system management processor stops the system for safety purposes, for example by instructing the joint position controllers of all arm processors 328, 325, 323, and 321 in the robot system to lock their respective current values. No new current command input is provided to the robotic arms until the outputs of the joint position controllers are unlocked; this locking of the joint position controller outputs is referred to herein as a "soft lock" of the joint position controllers.
  • the system management processor determines whether the detected fault should be regarded as a system fault or an arm fault.
  • the system management processor completes this step by checking the error type information provided in the error message.
• system faults include all faults classified as recoverable system faults or unrecoverable system faults, because these faults apply to more than just the failed robotic arm.
• arm faults include all faults classified as recoverable local faults or unrecoverable local faults, because these faults apply only to the failed robotic arm.
  • the system management processor provides the remote operator 171 of the robot 170 with the option of accepting the weakened operation of the robot 170.
  • the system management processor also provides the user with an option to recover from the failure.
  • information about the detected failure is also provided by the system management processor to assist the remote operator 171 in determining whether to accept the option.
  • the option and fault information are provided on the visual display 255 of the remote console 169.
• the system management processor waits for the remote operator 171 to select an option provided in 360; once an option is selected, in 362 the system management processor determines whether the selected option is the weakened operation option or the recovery option. If the recovery option was provided and the remote operator 171 selects it, then in 381 the system management processor sends a recovery notification to the arm processor of the failed robotic arm; the arm processor of the failed arm processes the recovery notification, including sending it to all nodes of the failed arm, which then process the recovery notification in turn.
• in 382, the system management processor then releases the soft locks of the joint controllers by unlocking the outputs of the joint controllers of all arm processors, so that the joint controllers again issue motor current commands reflecting the desired joint positions of their operatively coupled robotic arms; the system management processor then returns to its task at 357.
• if the remote operator 171 selects the weakened operation option, then in 363 the system management processor provides the remote operator 171 with an option to recover from the failure; recovery in this case differs from the recovery of 381-382 because no attempt is made to recover the failed arm, and recovery is used only to restore normal operation of the non-failed arms.
• the system management processor waits for the user to select the option provided in 363; once the option is selected by the remote operator 171, in 365 the system management processor sends a message to the arm processor of the failed arm to reinforce the failure.
• reinforcing the failure in this case means that additional steps are taken to completely shut down the operation of the failed robotic arm.
• an example of such a reinforcing measure is to operatively disconnect the joint position controller of the arm processor from the other components of the master/slave control system that generate the desired joint positions of the operatively coupled robotic arm.
• another reinforcing measure is to turn off the power to the failed robotic arm.
• the system management processor then releases the soft locks of the joint controllers by unlocking the outputs of the joint controllers of the arm processors of all non-failed arms, so that the joint controllers again issue motor current commands reflecting the desired joint positions of their operatively coupled robotic arms; the system management processor then returns to its task at 357.
• the system management processor validates the system FRL condition for all nodes in the robot 170; this is done by causing the FRL lines 329, 327, 384, and 385 to be pulled high so that the fault notification is provided simultaneously to the arm processors and to the nodes of the first manipulator 182, the second manipulator 183, the third manipulator 184, and the fourth manipulator 185.
• the system management processor determines whether the system failure is a recoverable system failure; this step is completed by checking the error class in the received error message. If the determination in 369 is no, then in 363 the system management processor takes no further action and waits for the system to be shut down.
• otherwise, the system management processor provides the user with an option to recover from the failure.
• the system management processor waits for the remote operator 171 to select the recovery option; if this option is selected, in 372 the system management processor sends a recovery notification to the arm processors of all the robotic arms of the robot 170.
• upon receiving the request or action from the remote operator 171, the system management processor releases the soft lock of each joint controller so as to return the joint controllers of the arms to their normal operating state; the released joint controllers once again issue motor current commands reflecting the desired joint positions of the robotic arms to which they are operatively coupled, and the system management processor then returns to its task at 357.
• FIGS. 47 and 52 are flowcharts of various aspects of methods for performing fault reaction, fault isolation, and fault weakening, executed by the arm management processor 319, which is operatively coupled to the system management processor 320 and to the arm processors 328, 325, 323, and 321 of the robot 170.
• the arm management processor 319 starts, controls, and monitors certain coordinated activities of the first manipulator 182, the second manipulator 183, the third manipulator 184, and the fourth manipulator 185 of the robot 170.
• the arm manager 319 initiates and monitors brake tests, in which the arm manager 319 communicates with each of the arm processors 328, 325, and 323 so that specific braking sequences with different torque values are applied to the brakes of their respective robotic arms.
• the coordination of this activity is performed by the arm manager 319 in this case, because coding it into each arm processor would be redundant overhead.
  • the maximum torque value calculated by each arm processor is transmitted back to the arm manager 319.
  • the arm manager 319 will notify the failed arm of the transmission failure.
  • the arm manager instructs the arm processor to perform arm activity, monitors the result, and determines whether the result of the activity indicates arm failure.
  • the arm manager monitors the coordinated activities of the robotic arms based on the reports of the respective arm processors of the robotic arms to detect faults in one arm. When the reported measurement exceeds the expected value by a threshold amount, the arm manager determines that a fault has occurred.
  • the detected fault is a fault that is generally not detected by one of the nodes of the robot arm or the arm processor.
  • the arm manager suppresses any further commands of the disabled arm. No further commands will be transmitted from the arm manager to the arm processor of the failed arm until either a recovery notification is received from the system manager or the system is restarted.
  • the arm manager sends a fault notification to the failed arm by pulling the FRL line of the failed arm to a high state. In the case of a virtual FRL line, the arm manager transmits a failure notification in a packet field that is the same as or different from the packet field designated for transmission of the failure notification through one of the nodes of the failed arm or the arm processor.
• the arm manager sends an error message to the system manager with the available details of the failure; each type of failure detected by the arm manager is assigned an error code, the error code is classified into an error class, and the origin of the fault includes the identification of the failed arm and optional additional information on the source of the fault.
  • in 357, the system manager then starts processing the error message.
  • the arm manager determines whether the detected fault is a recoverable fault; this determination is made according to the error class of the fault. If the determination in 378 is no, then in 381 the arm manager continues to suppress any further commands to the failed arm and ignores any recovery notifications subsequently received from the system manager. If the determination in 378 is yes, then in 379 the arm manager waits for a recovery notification from the management processor. In 380, once the recovery notification is received, the arm manager stops suppressing commands to the failed arm, returns to its normal operation mode, and resumes the fault detection task of reference 374.
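The per-arm recovery logic reduces to a small state machine. This sketch assumes illustrative error classes (the text does not enumerate them); the numerals in the comments follow references 374 and 378-381 above.

```python
from enum import Enum, auto


class ArmState(Enum):
    NORMAL = auto()
    SUPPRESSED = auto()  # 379: commands suppressed, awaiting recovery
    DISABLED = auto()    # 381: recovery notifications are ignored


RECOVERABLE_CLASSES = {"transient", "calibration"}  # assumed error classes


def on_fault(error_class):
    # 378: recoverability is decided from the error class of the fault.
    return (ArmState.SUPPRESSED if error_class in RECOVERABLE_CLASSES
            else ArmState.DISABLED)


def on_recovery_notification(state):
    # 380: resume normal operation, and the fault detection task of
    # reference 374, only if the fault was recoverable.
    return ArmState.NORMAL if state is ArmState.SUPPRESSED else state
```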
  • Fig. 24 is a system connection diagram of the dual-mode driving mode of the second and third embodiments of the present invention.
  • a modular design concept is adopted for the unmanned driving controller 525.
  • for unmanned driving control, the unmanned driving controller 525 is connected to the vehicle vision system 502, the positioning and navigation module 274, the planning system module 528, and the vehicle bus 276, and the control system module 529 is connected to the robot 170; Ethernet and CAN bus communication are adopted between the modules.
  • the unmanned driving controller 525 adopts a reconfigurable computing AI chip and is programmed based on the Linux system platform.
  • the control strategy is customized by integrating vehicle parameters and operating characteristics.
  • the specific control strategies include: 1. the unmanned driving controller 525 acts as the control-mode arbitration controller and determines whether the vehicle is currently in unmanned driving mode and whether the conditions for unmanned driving mode are met; 2. when in remote driving mode, the control system module shields the control commands issued by the unmanned driving controller and responds to remote operation commands; 3. when in unmanned driving mode, the unmanned driving controller 525 combines the lidar, millimeter-wave radar, and camera information received by the environment sensing module, the optimized information fused from each sensor, the body attitude information sensed by the gyroscope, and the latitude and longitude information provided by the DGPS positioning and navigation module with the tracking route learned through deep learning, and performs adaptive route planning through simultaneous localization and mapping (SLAM).
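Strategies 1 and 2 amount to mode arbitration plus command shielding. The sketch below uses hypothetical names (ControlArbitration, select_command) to stand in for the unmanned driving controller 525 and the control system module 529.

```python
from enum import Enum, auto


class DriveMode(Enum):
    REMOTE = auto()    # remote control driving mode
    UNMANNED = auto()  # intelligent unmanned driving mode


class ControlArbitration:
    """Illustrative arbitration between remote and autonomous commands."""

    def __init__(self):
        self.mode = DriveMode.REMOTE

    def arbitrate(self, unmanned_conditions_met):
        # Strategy 1: enter (or stay in) unmanned mode only when its
        # conditions are met; otherwise fall back to remote driving.
        self.mode = (DriveMode.UNMANNED if unmanned_conditions_met
                     else DriveMode.REMOTE)
        return self.mode

    def select_command(self, remote_cmd, autonomous_cmd):
        # Strategy 2: in remote driving mode, commands issued by the
        # unmanned driving controller are shielded.
        return remote_cmd if self.mode is DriveMode.REMOTE else autonomous_cmd
```

In this scheme the remote operator's commands always take precedence while the vehicle is in remote driving mode, matching the shielding behavior of strategy 2.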
  • Fig. 25 is a logic diagram of dual-mode driving switching in the second embodiment of the present invention.
  • the remote control driving mode can be switched to the intelligent unmanned driving mode.
  • Emergency switching mode: when the "remote control driving mode" signal is interrupted, request 517 is automatically sent to the unmanned driving controller.
  • the unmanned driving controller judges whether to enter unmanned driving mode; if entry is allowed, it automatically takes over from the remote driving mode and executes 518; if entry is not allowed, it returns feedback signal 519.
  • when the external unmanned driving control switch is reset, or the pre-planned exit-unmanned-driving logic triggers, the system switches back to the normal driving mode and executes control instruction 520.
  • Figure 26 is a logic diagram of the dual-mode driving switching of the third embodiment of the present invention.
  • the remote control driving mode can be switched to the intelligent unmanned driving mode. Normal switching mode: when it is necessary to enter the "intelligent unmanned driving mode" from the "remote control driving mode", the external unmanned driving control switch is pressed to send request 521 to the unmanned driving controller.
  • the unmanned driving controller judges whether to enter unmanned driving mode: if entry is allowed, it executes 522; if entry is not allowed, it returns feedback signal 523. When it is necessary to enter the "remote control driving mode" from "unmanned driving mode", the system switches back to the normal driving mode by resetting the external unmanned driving control switch or by the pre-planned exit-unmanned-driving logic, and executes control instruction 524.
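Both embodiments share the same request/feedback skeleton and differ only in the trigger: loss of the remote-mode signal in Fig. 25 versus a switch press in Fig. 26. A minimal sketch with hypothetical names; the numerals in the comments follow the text above.

```python
class DualModeSwitching:
    """Illustrative switching flow of Figs. 25 and 26."""

    def __init__(self, conditions_met):
        self.conditions_met = conditions_met  # callable: unmanned conditions
        self.mode = "remote"

    def request_unmanned(self):
        # Fig. 26 normal switching: the external switch sends request 521;
        # Fig. 25 emergency switching: loss of the "remote control driving
        # mode" signal automatically sends request 517.
        if self.conditions_met():
            self.mode = "unmanned"  # execute 518 / 522
            return "entered unmanned driving mode"
        return "denied"             # feedback signal 519 / 523

    def exit_unmanned(self):
        # Switch reset or the pre-planned exit logic returns the vehicle to
        # normal driving; control instruction 520 / 524 is executed.
        self.mode = "remote"
        return "control instruction executed"
```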

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a vehicle remote control system established by primary and secondary wireless devices by means of an Internet of Things connection, comprising a primary driving system, a standby driving system, a primary radar and video image transmission line, and a standby radar and video image transmission line. A radar (110) and a video acquisition device (120) of a vehicle vision system (502) are fused by a radar and video information fusion system (130); the vehicle remote driving system for a remote driving operator is established by means of the above systems. In particular, the remote driving operator remotely drives networked vehicles through the Internet of Things using a computer terminal, the requirements of all road conditions are met, no driver is assigned to the vehicle, a true sensation of unmanned driving is achieved, and technical support is provided for the development of the sharing economy.
PCT/CN2020/000015 2019-01-22 2020-01-07 Système de commande à distance de véhicule établi par des dispositifs sans fil primaires et secondaires au moyen d'une connexion internet des objets WO2020151468A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910096218.4A CN111464978A (zh) 2019-01-22 2019-01-22 主次无线设备通过物联网连接建立的车辆远程驾驶体系
CN201910096218.4 2019-01-22

Publications (1)

Publication Number Publication Date
WO2020151468A1 true WO2020151468A1 (fr) 2020-07-30

Family

ID=71679897

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/000015 WO2020151468A1 (fr) 2019-01-22 2020-01-07 Système de commande à distance de véhicule établi par des dispositifs sans fil primaires et secondaires au moyen d'une connexion internet des objets

Country Status (2)

Country Link
CN (1) CN111464978A (fr)
WO (1) WO2020151468A1 (fr)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112364701B (zh) * 2020-10-15 2022-11-04 东风汽车集团有限公司 一种应用于平行驾驶系统的视频图像处理方法和装置
CN112288906B (zh) * 2020-10-27 2022-08-02 北京五一视界数字孪生科技股份有限公司 仿真数据集的获取方法、装置、存储介质和电子设备
CA3197021A1 (fr) * 2020-10-30 2022-05-05 Hongchao RUAN Procede de transmission d'informations, appareil de commande, appareil emetteur-recepteur de signal electromagnetique et dispositif de traitement de signal
CN113040757B (zh) * 2021-03-02 2022-12-20 江西台德智慧科技有限公司 头部姿态监测方法、装置、头部智能穿戴设备及存储介质
CN113112844A (zh) * 2021-03-18 2021-07-13 浙江金乙昌科技股份有限公司 基于5g通信和高精度定位车辆远程控制系统及其控制装置
CN112734927B (zh) * 2021-03-31 2021-06-25 湖北亿咖通科技有限公司 高精地图车道线的简化方法、简化装置及计算机存储介质
CN115958996A (zh) * 2021-04-14 2023-04-14 岳秀兰 由远程驾驶、能源补给和地面航母组成的飞行器运保体系
CN113434764B (zh) * 2021-06-29 2023-10-24 青岛海尔科技有限公司 内容推送方法和装置、存储介质及电子装置
CN113920762A (zh) * 2021-10-08 2022-01-11 湖南湘江智能科技创新中心有限公司 一种基于智能网联环境下应急车辆优先通行的控制方法
CN114578188B (zh) * 2022-05-09 2022-07-08 环球数科集团有限公司 一种基于北斗卫星的电网故障定位方法
CN115240292A (zh) * 2022-06-24 2022-10-25 柳州铁道职业技术学院 一种用于长途客车的运营监控系统、方法、设备及介质
CN115297539B (zh) * 2022-10-10 2023-02-03 联友智连科技有限公司 一种车联网终端时间同步方法、系统及电子设备
CN115701894B (zh) * 2022-12-27 2023-10-24 蔚来汽车科技(安徽)有限公司 通话方法、通话系统及计算机存储介质
CN116052121B (zh) * 2023-01-28 2023-06-27 上海芯算极科技有限公司 一种基于距离预估的多传感目标检测融合方法及装置
CN116170779B (zh) * 2023-04-18 2023-07-25 西安深信科创信息技术有限公司 一种协同感知数据传输方法、装置及系统

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE112006003044T5 (de) * 2005-10-21 2008-10-23 Deere & Company, Moline Vielseitiges Robotersteuermodul
US9446517B2 (en) * 2013-10-17 2016-09-20 Intuitive Surgical Operations, Inc. Fault reaction, fault isolation, and graceful degradation in a robotic system
US9439232B2 (en) * 2014-01-17 2016-09-06 GM Global Technology Operations LLC Managing traditional Wi-Fi and Wi-Fi direct connections using a wireless device
CN106656195A (zh) * 2015-11-04 2017-05-10 北京信威通信技术股份有限公司 数据压缩、解压缩的方法及系统
CN107305372B (zh) * 2016-04-25 2020-06-19 岳秀兰 云计算网络架构的远程监控的电动汽车能源监控和补给网
KR102584758B1 (ko) * 2016-06-09 2023-10-06 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 컴퓨터 보조 원격 조작 수술 시스템 및 방법
CN106101590B (zh) * 2016-06-23 2019-07-19 上海无线电设备研究所 雷达视频复合数据探测与处理系统及探测与处理方法
US10162360B2 (en) * 2016-12-01 2018-12-25 GM Global Technology Operations LLC Vehicle environment imaging systems and methods
CN108995538A (zh) * 2018-08-09 2018-12-14 金龙联合汽车工业(苏州)有限公司 一种电动汽车的无人驾驶系统
CN108811191A (zh) * 2018-09-18 2018-11-13 南昌工程学院 微波信道、移动通信、卫星通信网络接入系统及工作方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105869419A (zh) * 2016-04-25 2016-08-17 张开冰 一种智能行车系统
CN107010152A (zh) * 2016-11-25 2017-08-04 罗轶 公共自行包
WO2018169590A1 (fr) * 2017-03-17 2018-09-20 Autoliv Asp, Inc. Classification asil améliorée par positionnement coopératif
CN108063797A (zh) * 2017-11-17 2018-05-22 南京视莱尔汽车电子有限公司 一种自动驾驶汽车的运行姿态全程监控方法
CN108037760A (zh) * 2017-12-12 2018-05-15 成都育芽科技有限公司 一种无人自动驾驶汽车自主运行控制方法

Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112130136A (zh) * 2020-09-11 2020-12-25 中国重汽集团济南动力有限公司 一种交通目标综合感知系统及方法
CN112130136B (zh) * 2020-09-11 2024-04-12 中国重汽集团济南动力有限公司 一种交通目标综合感知系统及方法
CN112269392A (zh) * 2020-09-16 2021-01-26 广西电网有限责任公司电力科学研究院 一种无人机集群控制的地面工作站系统及其控制方法
CN112269392B (zh) * 2020-09-16 2023-08-22 广西电网有限责任公司电力科学研究院 一种无人机集群控制的地面工作站系统及其控制方法
CN112134942B (zh) * 2020-09-18 2023-12-19 北京市生态环境监测中心 一种重型车远程监控平台数据质量检测装置及方法
CN112134942A (zh) * 2020-09-18 2020-12-25 北京市生态环境监测中心 一种重型车远程监控平台数据质量检测装置及方法
CN112437414B (zh) * 2020-09-30 2023-06-30 北方工业大学 一种远程驾驶车辆控制信息传输与处理方法
CN112437414A (zh) * 2020-09-30 2021-03-02 北方工业大学 一种远程驾驶车辆控制信息传输与处理方法
CN112345531A (zh) * 2020-10-19 2021-02-09 国网安徽省电力有限公司电力科学研究院 一种基于仿生机器鱼的变压器故障检测方法
CN112345531B (zh) * 2020-10-19 2024-04-09 国网安徽省电力有限公司电力科学研究院 一种基于仿生机器鱼的变压器故障检测方法
CN112492029B (zh) * 2020-11-27 2022-12-23 河南汇祥通信设备有限公司 一种综合管廊现场多源综合现场定位系统及方法
CN112492029A (zh) * 2020-11-27 2021-03-12 河南汇祥通信设备有限公司 一种综合管廊现场多源综合现场定位系统及方法
CN114582154B (zh) * 2020-11-30 2023-12-29 丰田自动车株式会社 图像显示装置及方法、非临时性的计算机可读取的介质
CN114582154A (zh) * 2020-11-30 2022-06-03 丰田自动车株式会社 图像显示装置及方法、非临时性的计算机可读取的介质
CN112714282A (zh) * 2020-12-22 2021-04-27 北京百度网讯科技有限公司 远程控制中的图像处理方法、装置、设备和程序产品
CN112822734B (zh) * 2020-12-31 2023-01-31 上海擎昆信息科技有限公司 一种高铁沿线网络接入方法和系统
CN112822734A (zh) * 2020-12-31 2021-05-18 上海擎昆信息科技有限公司 一种高铁沿线网络接入方法和系统
CN112918517B (zh) * 2021-02-01 2023-02-17 中国神华能源股份有限公司神朔铁路分公司 铁路机车驾驶参数设置方法、装置、计算机设备和存储介质
CN112918517A (zh) * 2021-02-01 2021-06-08 中国神华能源股份有限公司神朔铁路分公司 铁路机车驾驶参数设置方法、装置、计算机设备和存储介质
CN113036915A (zh) * 2021-03-04 2021-06-25 国网福建省电力有限公司厦门供电公司 一种基于智能网关的园区供用电设备远程监测及控制方法
CN113036915B (zh) * 2021-03-04 2024-02-27 国网福建省电力有限公司厦门供电公司 一种基于智能网关的园区供用电设备远程监测及控制方法
CN113032003A (zh) * 2021-04-08 2021-06-25 平安国际智慧城市科技股份有限公司 开发文件导出方法、装置、电子设备及计算机存储介质
CN113032003B (zh) * 2021-04-08 2024-04-02 深圳赛安特技术服务有限公司 开发文件导出方法、装置、电子设备及计算机存储介质
CN113112041A (zh) * 2021-04-22 2021-07-13 深圳市阳谷医疗系统有限公司 一种基于空气净化器的云计算医疗系统
CN113253274A (zh) * 2021-04-30 2021-08-13 西南电子技术研究所(中国电子科技集团公司第十研究所) 直升机防撞地表电力线的融合处理方法
CN113253274B (zh) * 2021-04-30 2024-02-06 西南电子技术研究所(中国电子科技集团公司第十研究所) 直升机防撞地表电力线的融合处理方法
CN113254283A (zh) * 2021-05-20 2021-08-13 中国兵器装备集团自动化研究所有限公司 多can口测试系统优化方法、装置、设备和存储介质
CN113359124B (zh) * 2021-05-20 2024-02-23 陕西长岭电子科技有限责任公司 机载悬停指示器
CN113359124A (zh) * 2021-05-20 2021-09-07 陕西长岭电子科技有限责任公司 机载悬停指示器
CN113311835A (zh) * 2021-05-21 2021-08-27 深圳裹动智驾科技有限公司 降低自动驾驶车辆行驶风险的方法
CN113311835B (zh) * 2021-05-21 2023-12-29 深圳安途智行科技有限公司 降低自动驾驶车辆行驶风险的方法
CN113342169A (zh) * 2021-06-10 2021-09-03 中国水利水电第七工程局有限公司 一种基于力反馈的塔式起重机操作虚拟培训系统
CN113715868A (zh) * 2021-06-17 2021-11-30 上海应用技术大学 基于时间空间耦合的远程轨道检测系统
CN113433548A (zh) * 2021-06-24 2021-09-24 中国第一汽车股份有限公司 一种数据监控方法、装置、设备及存储介质
CN113538759B (zh) * 2021-07-08 2023-08-04 深圳创维-Rgb电子有限公司 基于显示设备的门禁管理方法、装置、设备及存储介质
CN113538759A (zh) * 2021-07-08 2021-10-22 深圳创维-Rgb电子有限公司 基于显示设备的门禁管理方法、装置、设备及存储介质
CN113865596A (zh) * 2021-08-31 2021-12-31 广东省威汇智能科技有限公司 一种基于汽车导航领域的贴框设备
CN113741401B (zh) * 2021-09-18 2024-05-14 南昌智能新能源汽车研究院 一种房车总线系统
CN113741401A (zh) * 2021-09-18 2021-12-03 南昌济铃新能源科技有限责任公司 一种房车总线系统
CN113965619B (zh) * 2021-10-21 2024-03-01 宜信普惠信息咨询(北京)有限公司 一种gps设备上线判定方法及装置
CN113965619A (zh) * 2021-10-21 2022-01-21 宜信普惠信息咨询(北京)有限公司 一种gps设备上线判定方法及装置
CN114143086B (zh) * 2021-11-30 2023-09-26 北京天融信网络安全技术有限公司 一种Web应用识别方法、装置、电子设备及存储介质
CN114143086A (zh) * 2021-11-30 2022-03-04 北京天融信网络安全技术有限公司 一种Web应用识别方法、装置、电子设备及存储介质
CN114185348A (zh) * 2021-12-03 2022-03-15 东风悦享科技有限公司 一种履带式巡逻车远程驾驶控制系统及方法
CN114863711A (zh) * 2021-12-13 2022-08-05 广东电网有限责任公司 一种车联网车辆定位方法及系统
CN114863711B (zh) * 2021-12-13 2023-09-29 广东电网有限责任公司 一种车联网车辆定位方法及系统
CN114585054A (zh) * 2022-02-23 2022-06-03 北京小米移动软件有限公司 Wifi连接控制方法、装置及存储介质
CN114585054B (zh) * 2022-02-23 2023-11-14 北京小米移动软件有限公司 Wifi连接控制方法、装置及存储介质
CN114475734A (zh) * 2022-02-28 2022-05-13 中南大学 无线远程机车控制方法及系统
CN114475734B (zh) * 2022-02-28 2023-11-28 中南大学 无线远程机车控制方法及系统
CN114726672B (zh) * 2022-03-08 2023-08-04 广东空天科技研究院 高速飞行器地面站多余度人机交互系统和方法
CN114726672A (zh) * 2022-03-08 2022-07-08 广东空天科技研究院 高速飞行器地面站多余度人机交互系统和方法
CN114756007A (zh) * 2022-04-20 2022-07-15 中国第一汽车股份有限公司 一种测评方法、装置、设备以及存储介质
CN114970608A (zh) * 2022-05-06 2022-08-30 中国科学院自动化研究所 基于眼电信号的人机交互方法及系统
CN114970608B (zh) * 2022-05-06 2023-06-02 中国科学院自动化研究所 基于眼电信号的人机交互方法及系统
WO2023220854A1 (fr) * 2022-05-16 2023-11-23 Oppo广东移动通信有限公司 Procédé de commande d'accès, terminal, puce, support de stockage lisible et produit programme d'ordinateur
CN115032969A (zh) * 2022-06-27 2022-09-09 中国第一汽车股份有限公司 一种车载控制器的以太网测试系统
CN115324443A (zh) * 2022-07-08 2022-11-11 卡斯柯信号有限公司 一种基于在线检测的车门自动对位隔离系统及方法
CN115324443B (zh) * 2022-07-08 2024-04-09 卡斯柯信号有限公司 一种基于在线检测的车门自动对位隔离系统及方法
CN115331190B (zh) * 2022-09-30 2022-12-09 北京闪马智建科技有限公司 一种基于雷视融合的道路隐患识别方法及装置
CN115331190A (zh) * 2022-09-30 2022-11-11 北京闪马智建科技有限公司 一种基于雷视融合的道路隐患识别方法及装置
CN115396755A (zh) * 2022-10-28 2022-11-25 高勘(广州)技术有限公司 电力管廊运维方法、装置、设备及存储介质
CN115410233A (zh) * 2022-11-01 2022-11-29 齐鲁工业大学 一种基于卡尔曼滤波和深度学习的手势姿态估计方法
CN115599737A (zh) * 2022-12-13 2023-01-13 南京芯驰半导体科技有限公司(Cn) 一种异构多核系统、通信方法、芯片、设备及存储介质
CN115599737B (zh) * 2022-12-13 2023-02-28 南京芯驰半导体科技有限公司 一种异构多核系统、通信方法、芯片、设备及存储介质
US11835008B1 (en) 2023-01-12 2023-12-05 Ford Global Technologies, Llc Engine and engine exhaust control system
CN116112529A (zh) * 2023-04-03 2023-05-12 三一汽车起重机械有限公司 多模式信号传输方法及装置、工程机械
CN116112529B (zh) * 2023-04-03 2023-06-09 三一汽车起重机械有限公司 多模式信号传输方法及装置、工程机械
CN117473456B (zh) * 2023-12-28 2024-02-27 北京卓视智通科技有限责任公司 一种雷视智能融合的方法及系统
CN117473456A (zh) * 2023-12-28 2024-01-30 北京卓视智通科技有限责任公司 一种雷视智能融合的方法及系统
CN117681852A (zh) * 2024-02-02 2024-03-12 万向钱潮股份公司 一种基于转角传感器故障的线控制动控制方法
CN117681852B (zh) * 2024-02-02 2024-04-30 万向钱潮股份公司 一种基于转角传感器故障的线控制动控制方法

Also Published As

Publication number Publication date
CN111464978A (zh) 2020-07-28

Similar Documents

Publication Publication Date Title
WO2020151468A1 (fr) Système de commande à distance de véhicule établi par des dispositifs sans fil primaires et secondaires au moyen d'une connexion internet des objets
US11242066B2 (en) Vehicle control apparatus and vehicle control system and vehicle control method thereof
US20190064813A1 (en) Systems and Methods of Controlling an Autonomous Vehicle Using an Enhanced Trajectory Following Configuration
CN110568852A (zh) 一种自动驾驶系统及其控制方法
JP7200946B2 (ja) 情報処理装置、制御システム、情報処理方法及びプログラム
US10929995B2 (en) Method and apparatus for predicting depth completion error-map for high-confidence dense point-cloud
KR20160113039A (ko) 차량 결합 및 차량 결합을 형성하기 위한 그리고 실시하기 위한 방법
US20220182498A1 (en) System making decision based on data communication
WO2020116195A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme, dispositif de commande de corps mobile et corps mobile
CN110716549A (zh) 用于无地图区域巡逻的自主导航机器人系统及其导航方法
WO2022218219A1 (fr) Système de garantie de fonctionnement d'aéronef consistant en une conduite à distance, en une alimentation en énergie et en un support au sol
CN112544061B (zh) 一种数据传输方法以及装置
CN104122891A (zh) 一种城市地下轨道检测的智能机器人巡检系统
CN114348020A (zh) 一种5g远程与自动驾驶安全冗余系统及控制方法
CA3136909A1 (fr) Systemes et methodes de localisation et de cartographie en simultane au moyen de cameras a angles multiples asynchrones
WO2020116194A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme, dispositif de commande de corps mobile et corps mobile
CN109177753A (zh) 车用飞行器以及搭载飞行器的汽车
CN108958248A (zh) 备份系统
CN114240769A (zh) 一种图像处理方法以及装置
US20220222936A1 (en) Outside environment recognition device
US11110970B2 (en) Removable interior for reconfigurable vehicles
Lu et al. Teleoperation technologies for enhancing connected and autonomous vehicles
Bećirbašić et al. Video-processing platform for semi-autonomous driving over 5G networks
US20220094435A1 (en) Visible light communication apparatus, visible light communication method, and visible light communication program
CN112550277B (zh) 车辆和自动泊车系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20744228

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20744228

Country of ref document: EP

Kind code of ref document: A1