CN111542295A - Automatic driving method and system for intelligent wheelchair and computer readable medium - Google Patents


Info

Publication number
CN111542295A
Authority
CN
China
Prior art keywords
wheelchair
user
information
path
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201780098026.6A
Other languages
Chinese (zh)
Inventor
刘伟荣
李家鑫
焦寅
闫励
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Golden Ridge Intelligence Science and Technology Co Ltd
Original Assignee
Sichuan Golden Ridge Intelligence Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Golden Ridge Intelligence Science and Technology Co Ltd
Publication of CN111542295A

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/04 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs motor-driven
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/10 Parts, details or accessories

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method (600), system (100), and computer-readable medium for automatic driving of an intelligent wheelchair. The method (600) comprises obtaining destination information of a user (602) and determining current location information of the wheelchair (604); determining a travel path of the intelligent wheelchair according to the destination information of the user and the current location information of the wheelchair (606); detecting obstacle information in the travel path of the wheelchair (608); and adjusting movement of the wheelchair based on the detected obstacle information (610).

Description

Automatic driving method and system for intelligent wheelchair and computer readable medium

Technical Field
The application relates to the field of automatic driving, in particular to an automatic driving method and system of an intelligent wheelchair.
Background
As a travel tool, the wheelchair provides great convenience for the disabled and for people with limited mobility. Existing wheelchairs are usually propelled manually, so the user must expend considerable effort and time when traveling long distances, and avoiding obstacles encountered along the way is inconvenient.
Disclosure of Invention
One aspect of the present application relates to a method of automatic driving of an intelligent wheelchair. The method includes obtaining destination information; determining a current location of the wheelchair; determining a driving path of the wheelchair according to the current position and the destination information; detecting obstacle information in a travel path of a wheelchair; and adjusting movement of the wheelchair in accordance with the detected obstacle information.
In some embodiments, the destination information is obtained via any one of manual input by the user, voice input, or a system-recommended input.
In some embodiments, determining the travel path of the wheelchair from the current location and the destination information comprises: calculating a plurality of candidate paths according to the current location and the destination information; scoring the plurality of candidate paths; sorting the candidate paths according to their scores and presenting the sorted candidate paths to the user for selection; and determining the travel path of the wheelchair according to the user's selection.
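The scoring step is left abstract in the text. A minimal sketch of score-sort-select, assuming hypothetical path attributes (`distance_m`, `obstacle_count`) and toy weights that the patent does not specify:

```python
# Hypothetical path-scoring scheme: shorter and less obstructed paths
# score higher. The attributes and weights are illustrative assumptions.

def score(path):
    return 1000 - path["distance_m"] - 50 * path["obstacle_count"]

candidates = [
    {"name": "A", "distance_m": 800, "obstacle_count": 2},   # score 100
    {"name": "B", "distance_m": 950, "obstacle_count": 0},   # score 50
    {"name": "C", "distance_m": 700, "obstacle_count": 6},   # score 0
]

# Sort by score and present the ranking to the user; the user's
# selection then fixes the travel path.
ranked = sorted(candidates, key=score, reverse=True)
travel_path = ranked[0]   # e.g., the user accepts the top-ranked path
```

Any monotone scoring function works here; what the claim requires is only that candidates are scored, ranked, shown to the user, and that the user's choice determines the final path.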
In some embodiments, after determining the travel path of the wheelchair, the user is allowed to set the travel speed of the wheelchair.
In some embodiments, the obstacle information includes a type, location, size, and motion state of the obstacle.
In some embodiments, the adjusting movement of the wheelchair includes adjusting the speed of the wheelchair, controlling wheelchair steering, and controlling wheelchair braking.
In some embodiments, the method for automatic driving of a smart wheelchair further comprises prompting the user with a point of interest, and automatically adjusting the travel route when the user confirms interest in it.
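One way to read this route adjustment is that a confirmed point of interest becomes an intermediate waypoint before the original destination. A sketch under that assumption, with all names illustrative:

```python
# Hypothetical reroute: insert a confirmed point of interest as a
# waypoint before the final destination; declining leaves the route alone.

def adjust_route(route, interest_point, user_confirms):
    if not user_confirms:
        return route
    return route[:-1] + [interest_point, route[-1]]

route = [(0, 0), (5, 5)]                                     # position -> destination
rerouted = adjust_route(route, (2, 4), user_confirms=True)   # via the POI
```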
Yet another aspect of the present application relates to a system for controlling the automatic driving of a wheelchair. The system comprises an acquisition module, a processing module, a detection module, and a control module, wherein: the acquisition module is configured to obtain destination information; the processing module is configured to determine the current location of the wheelchair and to determine the travel path of the wheelchair according to the current location and the destination information; the detection module is configured to detect obstacle information in the travel path of the wheelchair; and the control module is configured to adjust the movement of the wheelchair according to the detected obstacle information.
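The four modules can be wired together as in the following sketch. The classes, their trivial bodies, and the return values are placeholders standing in for the claimed responsibilities, not the patent's implementation:

```python
# Placeholder versions of the four claimed modules, wired into one
# drive step: acquire destination -> plan -> detect -> control.

class AcquisitionModule:
    def get_destination(self):
        return (10, 10)                  # e.g., from manual or voice input

class ProcessingModule:
    def plan(self, position, destination):
        return [position, destination]   # trivial two-point "path"

class DetectionModule:
    def obstacles_on(self, path):
        return []                        # no obstacles in this toy run

class ControlModule:
    def adjust(self, obstacles):
        return "avoid" if obstacles else "cruise"

def drive_once(position=(0, 0)):
    acq, proc = AcquisitionModule(), ProcessingModule()
    det, ctl = DetectionModule(), ControlModule()
    path = proc.plan(position, acq.get_destination())
    return ctl.adjust(det.obstacles_on(path))
```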
Another aspect of the application relates to a non-transitory computer-readable medium storing a computer program product. The computer program product may include instructions configured to cause a computing device to perform the method.
The invention has the following technical effects:
First, a reasonable route is planned for the intelligent wheelchair based on the destination information input by the user, combined with automatic positioning technology, improving the user's travel experience.
Second, obstacle information is accurately identified in real time through multiple obstacle-detection means, and a reasonable avoidance scheme is planned based on that information, improving travel safety.
Third, combining multiple input modes improves the efficiency and accuracy of information input, providing convenience for the user.
Fourth, scoring the candidate paths provides the user with a more reasonable travel route.
Fifth, a point-of-interest prompt function is provided, and the path is adjusted according to the user's selection so that points of interest along the route are not missed, improving the user experience.
Additional features of the present application will be set forth in part in the description which follows. Additional features of some aspects of the present application will be apparent to those of ordinary skill in the art in view of the following description and accompanying drawings, or in view of the production or operation of the embodiments. The features of the present disclosure may be realized and attained by practice or use of the methods, instrumentalities and combinations of the various aspects of the particular embodiments described below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments will be briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the application, and that it is also possible for a person skilled in the art to apply the application to other similar scenarios without inventive effort on the basis of these drawings. Unless otherwise apparent from the context of language or otherwise indicated, like reference numerals in the figures refer to like structures and operations.
FIG. 1 is a schematic view of an exemplary intelligent wheelchair autopilot system according to some embodiments of the present application;
FIG. 2 is a diagram illustrating exemplary hardware and/or software components of an exemplary computing device that may implement a processing engine according to some embodiments of the present application;
FIG. 3 is a schematic diagram of the hardware and/or software components of an exemplary mobile device according to some embodiments of the present application;
FIG. 4 is a block diagram of an exemplary processing engine shown in accordance with some embodiments of the present application;
FIG. 5 is a block diagram illustrating an exemplary processing module shown in accordance with some embodiments of the present application;
FIG. 6 is an exemplary flow chart illustrating automated driving of a smart wheelchair according to some embodiments of the present application;
FIG. 7 is an exemplary flow chart illustrating the determination of a travel path of a wheelchair according to some embodiments of the present application; and
FIG. 8 is a schematic view of intelligent wheelchair autopilot according to some embodiments of the present application.
Detailed Description
In the following detailed description, specific details of embodiments are set forth by way of example in order to provide a thorough understanding of the related applications. It will be apparent, however, to one skilled in the art that the present application may be practiced without these specific details. In other instances, well-known methods, procedures, systems, components, and/or circuits have been described at a high-level (without detail) in order to avoid unnecessarily obscuring aspects of the present application. Various modifications to the embodiments of the present application will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present application. Thus, the present application is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should be understood that the terms "system," "engine," "unit," "module," and/or "block" as used herein are one way of distinguishing different components, elements, parts, or assemblies at different levels in ascending hierarchical order. However, these terms may be replaced by other expressions that achieve the same purpose.
Generally, "module," "unit," or "block" as used herein refers to logic embodied in hardware, or to a set of firmware or software instructions. The modules, units, or blocks described herein may be implemented in software and/or hardware and may be stored in any type of non-transitory computer-readable medium or other storage device. In some embodiments, software modules, units, or blocks may be compiled and linked into an executable program. It should be appreciated that software modules may be called from other modules, units, or blocks, or from themselves, and/or may be called in response to detected events or interrupts. Software modules/units/blocks configured for execution on a computing device (e.g., the processing engine 120 shown in FIG. 1) may be provided on a computer-readable medium, such as a compact disc, a digital video disc, a flash drive, a magnetic disk, or any other tangible medium, or as a digital download (and may originally be stored in a compressed or installable format requiring installation, decompression, or decryption prior to execution). The software code may be stored, in part or in whole, on a storage device of the computing device that executes it. The software instructions may be embedded in firmware, such as an EPROM. It should further be understood that hardware modules, units, or blocks may be included in connected logic components, such as gates and flip-flops, and/or in programmable units such as programmable gate arrays or processors. The modules, units, blocks, or computing device functions described herein may be implemented as software modules/units/blocks but may also be represented in hardware or firmware. In general, the modules, units, and blocks described herein refer to logical modules, units, and blocks that may be combined with other modules, units, and blocks, or divided into sub-modules, sub-units, and sub-blocks, regardless of their physical organization or storage.
The description may apply to the system, the engine, or a portion thereof.
It will be understood that when an element, engine, module, or block is referred to as being "on," "connected to," or "coupled to" another element, engine, module, or block, it may communicate directly with that other element, engine, module, or block, or intervening elements, engines, modules, or blocks may be present, unless the context clearly dictates otherwise. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
These and other features of the present application, as well as related structural elements and components of manufacture and methods of operation and function that are economically incorporated, may become more apparent and form a part of the present application upon consideration of the following description with reference to the accompanying drawings. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the application. It should be understood that the drawings are not to scale.
FIG. 1 is a schematic view of an exemplary intelligent wheelchair autopilot system 100 according to some embodiments of the present application. As shown, the smart wheelchair autopilot system 100 can include a smart wheelchair 110, a processing engine 120, a memory 130, one or more terminals 140, and a network 150. In some embodiments, the smart wheelchair 110, the processing engine 120, the memory 130, and/or the terminal 140 may be connected and/or communicate with each other via a wireless connection (e.g., the network 150), a wired connection, or a combination thereof. The connections between the components in the intelligent wheelchair autopilot system 100 can vary. For example only, the smart wheelchair 110 may be connected to the processing engine 120 via a network 150 as shown in fig. 1. As another example, the smart wheelchair 110 may be directly connected to the processing engine 120. As another example, memory 130 may be coupled to processing engine 120 through network 150 as shown in FIG. 1, or directly coupled to processing engine 120.
In some embodiments, the smart wheelchair 110 may transmit the travel data to the processing engine 120, the memory 130, and/or the terminal 140 via the network 150. For example, the travel data may be sent to the processing engine 120 for further processing, or may be stored in the memory 130.
The processing engine 120 may process data and/or information obtained from the smart wheelchair 110, the memory 130, and/or the terminal 140. For example, the processing engine 120 may generate control instructions based on travel data collected by the smart wheelchair 110. In some embodiments, the processing engine 120 may be a single server or a group of servers. The server groups may be centralized or distributed. In some embodiments, the processing engine 120 may be local or remote. For example, the processing engine 120 may access information and/or data from the intelligent wheelchair 110, the memory 130, and/or the terminal 140 via the network 150. As another example, the processing engine 120 may be directly connected to the smart wheelchair 110, the terminal 140, and/or the memory 130 to access information and/or data. In some embodiments, the processing engine 120 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, inter-cloud, multi-cloud, and the like, or a combination thereof. In some embodiments, the processing engine 120 may be implemented by a computing device 200 including one or more components as described in fig. 2.
Memory 130 may store data, instructions, and/or any other information. In some embodiments, the memory 130 may store data obtained from the processing engine 120 and/or the terminal 140. In some embodiments, the memory 130 may store data and/or instructions that the processing engine 120 may execute or use to perform the exemplary methods described herein. In some embodiments, the memory 130 may include mass storage, removable storage, volatile read/write memory, read-only memory (ROM), and the like, or any combination thereof. Exemplary mass storage devices may include magnetic disks, optical disks, solid state drives, and the like. Exemplary removable storage may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. Exemplary volatile read/write memory may include random access memory (RAM). Exemplary RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), and the like. Exemplary ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disk ROM (CD-ROM), digital versatile disk ROM, and the like. In some embodiments, the memory 130 may be implemented on a cloud platform as described elsewhere herein.
In some embodiments, the memory 130 may be connected to the network 150 to communicate with one or more other components (e.g., the processing engine 120, the terminal 140, etc.) in the intelligent wheelchair autopilot system 100. One or more components of the intelligent wheelchair autopilot system 100 may access data or instructions stored in the memory 130 via the network 150. In some embodiments, memory 130 may be part of processing engine 120.
The terminal 140 may be connected to and/or in communication with the smart wheelchair 110, the processing engine 120, and/or the memory 130. For example, the terminal 140 may obtain processed travel data (e.g., travel speed) from the processing engine 120. As another example, the terminal 140 may obtain image data (e.g., street scene images captured by a camera) from the smart wheelchair 110 and transmit the image data to the processing engine 120 for processing. In some embodiments, the terminal 140 may include a mobile device 140-1, a tablet 140-2, a laptop 140-3, and the like, or any combination thereof. For example, the mobile device 140-1 may include a mobile phone, a personal digital assistant (PDA), a gaming device, a navigation device, a point-of-sale (POS) device, a laptop, a tablet, a desktop, and the like, or any combination thereof. In some embodiments, the terminal 140 may include an input device, an output device, and the like. The input device may include alphanumeric and other keys, with input received via a keypad, a touch screen (e.g., with tactile or haptic feedback), voice input, gaze-tracking input, a brain monitoring system, or any other similar input mechanism. Input information received by the input device may be transmitted, for example over a bus, to the processing engine 120 for further processing. Other types of input devices may include cursor control devices such as a mouse, a trackball, or cursor direction keys. The output device may include a display, speakers, a printer, etc., or a combination thereof. In some embodiments, the terminal 140 may be part of the processing engine 120.
The network 150 may include any suitable network that may facilitate the exchange of information and/or data for the intelligent wheelchair autopilot system 100. In some embodiments, one or more components of the intelligent wheelchair autopilot system 100 (e.g., the intelligent wheelchair 110, the processing engine 120, the memory 130, the terminal 140, etc.) may exchange information and/or data with one or more other components of the system via the network 150. For example, the processing engine 120 may obtain data collected by the smart wheelchair 110 via the network 150. As another example, the processing engine 120 may obtain user instructions from the terminal 140 via the network 150. The network 150 may be and/or include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN), etc.), a wired network (e.g., Ethernet), a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, a router, a hub, a switch, a server computer, and/or any combination thereof. For example, the network 150 may include a cable network, a wireline network, a fiber optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, and the like, or any combination thereof. In some embodiments, the network 150 may include one or more network access points. For example, the network 150 may include wired and/or wireless network access points, such as base stations and/or Internet exchange points, through which one or more components of the intelligent wheelchair autopilot system 100 may connect to the network 150 to exchange data and/or information.
The above description is for illustrative purposes only and does not limit the scope of the present application. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other features of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the memory 130 may be a data storage comprising a cloud computing platform, which may be a public cloud, a private cloud, a community cloud, a hybrid cloud, and the like. However, variations and modifications may not depart from the scope of the present application.
Fig. 2 is a schematic diagram of exemplary hardware and/or software components of an exemplary computing device 200 that may implement the processing engine 120, according to some embodiments of the present application. As shown in fig. 2, computing device 200 may include a processor 210, a memory 220, input/output (I/O) ports 230, and communication ports 240.
Processor 210 may execute computer instructions (e.g., program code) and perform the functions of the processing engine 120 in accordance with the techniques described herein. The computer instructions may include, for example, routines, programs, objects, components, data structures, procedures, modules, and functions that perform the particular functions described herein. For example, the processor 210 may process data acquired from the smart wheelchair 110, the terminal 140, the memory 130, and/or any other component of the smart wheelchair autopilot system 100. In some embodiments, the processor 210 may include one or more hardware processors, such as a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
For illustration only, only one processor in computing device 200 is described. It should be noted, however, that the computing device 200 may include multiple processors, and thus operations and/or method steps described herein as being performed by one processor may also be performed by multiple processors, collectively or independently. For example, if in the present application, the processors of computing device 200 perform process A and process B, it should be understood that process A and process B may also be performed jointly or independently by two or more different processors in computing device 200 (e.g., a first processor performs process A and a second processor performs process B; or a first processor and a second processor perform processes A and B together).
The memory 220 may store data/information obtained from the smart wheelchair 110, the terminal 140, the memory 130, and/or any other component of the smart wheelchair autopilot system 100. In some embodiments, memory 220 may include mass storage, removable storage, volatile read/write memory, Read Only Memory (ROM), or the like, or any combination thereof. For example, mass storage may include magnetic disks, optical disks, solid state drives, and so forth. Removable storage may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. Volatile read/write memory can include Random Access Memory (RAM). RAM may include Dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), Static RAM (SRAM), thyristor RAM (T-RAM), zero capacitor RAM (Z-RAM), and the like. The ROM may include Mask ROM (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), compact disk ROM (CD-ROM), and digital versatile disk ROM, among others. In some embodiments, memory 220 may store one or more programs and/or instructions to perform the example methods described herein. For example, memory 220 may store a program that may cause processing engine 120 to determine a regularization term.
The I/O 230 may input and/or output signals, data, information, etc. In some embodiments, the I/O 230 may enable user interaction with the processing engine 120. In some embodiments, the I/O 230 may include an input device and an output device. Examples of input devices may include a keyboard, a mouse, a touch screen, a microphone, etc., or a combination thereof. Examples of output devices may include a display device, speakers, a printer, a projector, etc., or a combination thereof. Examples of display devices may include liquid crystal displays (LCDs), light-emitting diode (LED) based displays, flat panel displays, curved screens, television devices, cathode ray tubes (CRTs), touch screens, and the like, or combinations thereof.
The communication port 240 may be connected to a network (e.g., the network 150) to facilitate data communication. The communication port 240 may establish a connection between the processing engine 120 and the smart wheelchair 110, the terminal 140, and/or the memory 130. The connection may be a wired connection, a wireless connection, any other communication connection that may enable data transmission and/or reception, and/or any combination of such connections. The wired connection may include, for example, an electrical cable, an optical cable, a telephone line, etc., or any combination thereof. The wireless connection may include, for example, a Bluetooth™ connection, a Wi-Fi™ connection, a WiMAX™ connection, a WLAN connection, a ZigBee™ connection, a mobile network connection (e.g., 3G, 4G, 5G, etc.), and the like, or combinations thereof. In some embodiments, the communication port 240 may be (or include) a standardized communication port, such as RS232, RS485, etc. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed according to a proprietary communication protocol.
Fig. 3 is a diagram illustrating exemplary hardware and/or software components of an exemplary mobile device 300 on which the terminal 140 may be implemented, according to some embodiments of the present application. As shown in FIG. 3, the mobile device 300 may include a communication platform 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, I/O 350, memory 360, and storage 390. In some embodiments, the mobile device 300 may also include any other suitable components, including but not limited to a system bus or a controller (not shown). In some embodiments, a mobile operating system 370 (e.g., iOS™, Android™, Windows Phone™, etc.) and one or more application programs 380 may be loaded from the storage 390 into the memory 360 for execution by the CPU 340. The application programs 380 may include a browser or any other suitable mobile application for receiving and presenting information from the processing engine 120. Information flows may be exchanged with the user via the I/O 350 and provided to the processing engine 120 and/or other components of the intelligent wheelchair autopilot system 100 via the network 150.
To implement the various modules, units, and functions described in this application, a computer hardware platform may be used as the hardware platform for one or more of the elements described herein. A computer with user interface elements may be implemented as a personal computer (PC) or any other type of workstation or terminal device. The computer may also act as a server if suitably programmed.
Fig. 4 is a schematic diagram of an exemplary processing engine 120 shown in accordance with some embodiments of the present application. The processing engine 120 may include an acquisition module 402, a processing module 404, a detection module 406, a control module 408, and a storage module 410. At least a portion of the processing engine 120 may be implemented on a computing device as shown in fig. 2 or a mobile device as shown in fig. 3.
The acquisition module 402 may acquire data. In some embodiments, the acquisition module 402 may acquire data related to the smart wheelchair 110 from the smart wheelchair 110, the memory 130, the terminal 140, and/or an external data source (not shown). In some embodiments, the data may include raw data (e.g., travel data), instructions, and the like, or a combination thereof. For example, the travel speed of the smart wheelchair may be generated based on the rolling speed of the wheels of the smart wheelchair. The instructions may be executable by a processor of processing engine 120 to implement the example methods described herein. In some embodiments, the acquired data may be transmitted to the storage module 410 for storage.
The processing module 404 may process the data. The processing module 404 may process information provided by multiple modules of the processing engine 120, including data acquired by the acquisition module 402, data detected by the detection module 406, data obtained from the storage module 410 and/or the memory 130, and so on. For example, the processing module 404 may process GPS positioning data acquired by the acquisition module 402 to determine the real-time location of the smart wheelchair 110. As another example, the processing module 404 may process the destination information obtained by the acquisition module 402 and determine the travel path of the smart wheelchair 110 according to its real-time location. In some embodiments, the user may change the destination in real time, and the processing module 404 may adjust the travel path of the smart wheelchair 110 in real time according to the user's requirement. For example, during automatic driving, when the user expresses interest in something (e.g., a supermarket promotion, a street show, etc.), the original destination information may be modified in real time, and the processing module 404 may adjust the travel route according to the modified destination information. In some embodiments, the detection module 406 may detect data related to an obstacle in the travel path via a detection device (e.g., a camera, radar, a distance sensor, etc.) installed on the smart wheelchair 110, and the processing module 404 may process that data. For example, the smart wheelchair 110 may be fitted with a camera that acquires image data of an obstacle, and the processing module 404 may determine the type of the obstacle according to a preset prediction model.
The detection module 406 may detect data about obstacles. The obstacle-related data includes the type, position, and size of the obstacle. In some embodiments, the obstacle may be stationary, such as a vehicle parked at the roadside. The detection module 406 may obtain picture data of the vehicle through a camera and may detect movement information of the vehicle through radar. Further, the information related to the vehicle may be sent to the storage module 410 for subsequent operations (e.g., as sample information for a training model), and may also be sent to the processing module 404 to generate related operation instructions. For example, the processing module 404 may generate corresponding obstacle avoidance commands to control the movement (e.g., deceleration, steering, etc.) of the smart wheelchair 110 according to the position and size of the vehicle. In some embodiments, the obstacle may be a moving object (e.g., a moving vehicle or a pedestrian). In this case, the detection module 406 may detect the object's speed, and the processing module 404 may predict the object's trajectory based on that speed and determine whether the object will collide with the smart wheelchair 110. If so, the smart wheelchair 110 can automatically adjust its movement (e.g., braking, steering, etc.) to avoid the collision effectively and safely. Further, if the detected moving object is a pedestrian, the smart wheelchair 110 may identify the pedestrian, determine whether the pedestrian is someone the user knows, and, if so, send a prompt to the user. The pedestrian's identity may be recognized from the image information acquired by the detection device, e.g., by recognizing the pedestrian's gait. The prompt may take the form of an on-screen notice, a voice prompt, a vibration alert, etc.
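As a minimal sketch of the collision prediction described above, the snippet below extrapolates both the wheelchair and a moving obstacle under a constant-velocity assumption and flags a predicted near-miss. The `Track` structure, the 5 s horizon, the 0.1 s time step, and the 1 m safety distance are all illustrative assumptions, not values from the disclosure.

```python
import math
from dataclasses import dataclass


@dataclass
class Track:
    """A tracked object: position (m) and velocity (m/s) in the ground plane."""
    x: float
    y: float
    vx: float
    vy: float


def will_collide(wheelchair: Track, obstacle: Track,
                 horizon_s: float = 5.0, safe_dist_m: float = 1.0) -> bool:
    """Linearly extrapolate both tracks and flag a predicted near-miss.

    A constant-velocity sketch; a real system would use the detection
    module's filtered radar/camera tracks instead of raw positions.
    """
    t = 0.0
    while t <= horizon_s:
        dx = (wheelchair.x + wheelchair.vx * t) - (obstacle.x + obstacle.vx * t)
        dy = (wheelchair.y + wheelchair.vy * t) - (obstacle.y + obstacle.vy * t)
        if math.hypot(dx, dy) < safe_dist_m:
            return True
        t += 0.1
    return False
```

For example, a wheelchair moving forward at 1 m/s and a pedestrian 5 m ahead walking toward it at 1 m/s would be flagged, while one walking away would not.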
The control module 408 can receive control instruction information sent by the processing module 404, stored in the storage module 410, or manually input by the user to control the wheelchair. The control instructions may include movement control instructions, function control instructions, and the like. Movement control instructions include deceleration, braking, steering, etc. of the wheelchair. Function control instructions include temperature adjustment instructions, voice/video start instructions, and seat adjustment instructions. In some embodiments, the control module 408 may receive real-time instructions from a user, or predetermined instructions provided by a user, to control one or more operations of the smart wheelchair 110, the acquisition module 402, and/or the processing module 404. For example, when an obstacle is detected, the processing module 404 sends an obstacle avoidance instruction, and the control module may brake and steer the smart wheelchair 110 according to that instruction. As another example, on receiving a seat adjustment instruction from the user, the control module can automatically adjust the seat inclination angle of the smart wheelchair. As a further example, on receiving a voice/video start instruction from the user, the smart wheelchair 110 can automatically start the voice/video call function. In some embodiments, the control module 408 may communicate with one or more other modules of the processing engine 120 to exchange information and/or data.
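The mapping from control instructions to wheelchair actions might be organized as a simple dispatcher like the one below. The command names, the state dictionary, and the step sizes (15° steering increments, 10° seat recline capped at 45°) are hypothetical placeholders for the actuator interfaces a real control module 408 would drive.

```python
from enum import Enum, auto


class Command(Enum):
    """Hypothetical movement and function control instructions."""
    BRAKE = auto()
    STEER_LEFT = auto()
    STEER_RIGHT = auto()
    RECLINE_SEAT = auto()
    START_CALL = auto()


def dispatch(command: Command, state: dict) -> dict:
    """Apply one control instruction to a wheelchair state dictionary."""
    if command is Command.BRAKE:
        state["speed"] = 0.0
    elif command is Command.STEER_LEFT:
        state["heading_deg"] -= 15
    elif command is Command.STEER_RIGHT:
        state["heading_deg"] += 15
    elif command is Command.RECLINE_SEAT:
        # Recline in 10-degree steps, capped at an assumed 45-degree limit.
        state["seat_angle_deg"] = min(state["seat_angle_deg"] + 10, 45)
    elif command is Command.START_CALL:
        state["call_active"] = True
    return state
```

A real implementation would replace the dictionary updates with calls to motor controllers and on-board peripherals.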
The storage module 410 may store image data, sensed data, control parameters, processed image data, and the like, or combinations thereof. In some embodiments, the storage module 410 may store one or more programs and/or instructions that may be executed by a processor of the processing engine 120 to implement the example methods described herein. For example, the storage module 410 may store programs and/or instructions that may be executed by a processor of the processing engine 120 to obtain obstacle information and generate and/or display intermediate results based on the obstacle data.
In some embodiments, one or more of the modules shown in fig. 4 may be implemented in at least a portion of the intelligent wheelchair autopilot system 100 shown in fig. 1. For example, the acquisition module 402, the control module 408, the storage module 410, and/or the processing module 404 may be integrated into a console (not shown). Through the console, a user can set a travel destination, select a travel route, set a travel speed, change a travel route, and the like. In some embodiments, the console may be implemented by the processing engine 120 and/or the terminal 140.
Fig. 5 is a block diagram illustrating an exemplary processing module 404 according to some embodiments of the present application. The processing module 404 may include a candidate path determination unit 502, a path scoring unit 504, a ranking unit 506, and a travel path determination unit 508. The processing module 404 may be implemented on a plurality of components (e.g., the processor 210 of the computing device 200 shown in fig. 2). For example, at least a portion of the processing module 404 may be implemented on a computing device as shown in fig. 2 or a mobile device as shown in fig. 3.
The candidate path determination unit 502 may determine a plurality of candidate paths. In some embodiments, the plurality of paths have the same origin and the same destination. The origin is the current position of the smart wheelchair, which may be obtained by existing positioning technology. The positioning technology used in the present application may include the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the COMPASS navigation system (COMPASS), the Galileo positioning system, the Quasi-Zenith Satellite System (QZSS), wireless fidelity (WiFi) positioning technology, image recognition positioning, etc., or any combination of the above. One or more of the above positioning techniques may be used interchangeably in this application. The destination may be selected or entered by the user. For example, the system may offer the user several recent destinations for selection based on historical travel records. When none of the destinations provided by the system meets the user's need, the user may input a new destination, e.g., by manual input or voice input.
After determining the starting point of the smart wheelchair 110 and the user's destination, the candidate path determination unit 502 may determine a plurality of candidate paths accordingly. In some embodiments, the candidate paths may have different path characteristics, e.g., different lengths, different proportions of trunk roads, whether a specific area is passed, etc. In some embodiments, the path characteristics may be presented to the user together with the corresponding candidate paths as a reference for the user's selection. For the 3 candidate routes shown in fig. 8, route 1 has the shortest length, route 2 has the highest proportion of trunk roads, and route 3 passes through the user's area of interest C.
The path scoring unit 504 may compute a composite score for each of the candidate paths. In some embodiments, the score is determined from the path characteristics. Each path characteristic corresponds to a weight, and the score of a candidate path takes multiple path characteristics into account, e.g., path length, number of traffic lights, number of turns required, degree of road congestion, etc. Each path characteristic may be scored individually, and the composite score of the path is obtained as the weighted sum of the individual scores. For example, if the path characteristics of a candidate route score 10, 8, and 6, with corresponding weights of 0.3, 0.4, and 0.3, the composite score of that candidate route is 10 × 0.3 + 8 × 0.4 + 6 × 0.3 = 8. It should be noted that the score may be presented as a specific number, or in a tiered form such as grades or star ratings.
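The weighted-sum scoring can be sketched as follows. As an illustration (the particular scores and weights are assumptions), three path features scoring 10, 8, and 6 with weights 0.3, 0.4, and 0.3 yield a composite score of 8.0.

```python
def composite_score(feature_scores, weights):
    """Weighted sum of per-feature scores for one candidate path.

    feature_scores and weights are parallel sequences; for the score
    to stay on the same scale as the features, weights should sum to 1.
    """
    if len(feature_scores) != len(weights):
        raise ValueError("one weight per path feature is required")
    return sum(s * w for s, w in zip(feature_scores, weights))
```

Usage: `composite_score([10, 8, 6], [0.3, 0.4, 0.3])` evaluates the weighted sum 3.0 + 3.2 + 1.8.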
The ranking unit 506 may rank the candidate routes according to their composite scores and present the ranking results to the user for selection. The travel path determination unit 508 may determine the final travel path according to the user's selection. In some embodiments, the final travel path of the wheelchair may be any one of the candidate paths, or a travel path customized by the user.
It should be noted that the above description of the processing module 404 is for illustrative purposes only and is not intended to limit the scope of the present application. Those skilled in the art may make many variations or modifications under the teaching of the present application; however, such variations and modifications do not depart from the scope of the present application. For example, the ranking unit 506 may be integrated into the path scoring unit 504 and/or the travel path determination unit 508. As another example, the candidate path determination unit 502 may be omitted or integrated into the path scoring unit 504.
Fig. 6 is a flow diagram of an exemplary process 600 for intelligent wheelchair autopilot according to some embodiments of the present application. In some embodiments, the process 600 may include: obtaining destination information (602), determining a current location of the wheelchair (604), determining a travel path of the wheelchair based on the destination information and the current location (606), detecting obstacle information in the travel path of the wheelchair (608), and adjusting the movement of the wheelchair based on the detected obstacle information (610). In some embodiments, one or more operations of the process 600 shown in fig. 6 may be implemented in the intelligent wheelchair autopilot system 100 shown in fig. 1. For example, the process 600 shown in fig. 6 may be stored in the memory 130 in the form of instructions and invoked and/or executed by the processing engine 120 (e.g., the processor 210 of the computing device 200 shown in fig. 2, or the CPU 340 of the mobile device 300 shown in fig. 3).
In operation 602, destination information of a user may be obtained. Operation 602 may be performed by the acquisition module 402. In some embodiments, the destination information may be obtained from the smart wheelchair 110, the memory 130, the terminal 140, or an external data source. As used herein, destination information may include the name of the destination, the latitude and longitude coordinates of the destination, and the like. The user may directly input the name of the destination, input the longitude and latitude coordinates of the destination, or designate the destination on an electronic map. In some embodiments, the user may be offered a selection of destinations drawn from the historical travel record of recent trips. In some embodiments, when presenting destinations to the user, the probability that the user will select each destination may be predicted by a recommendation model, machine learning, or an artificial neural network, and high-probability destinations may be presented first or in a conspicuous manner.
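A minimal stand-in for the destination-recommendation step is a frequency ranking over the historical travel record; the recommendation model or neural network mentioned above would replace this heuristic in practice.

```python
from collections import Counter


def recommend_destinations(history, top_n=3):
    """Rank past destinations by visit frequency, most frequent first.

    history is a list of destination names from the travel record;
    ties keep first-visited order (Counter preserves insertion order).
    """
    counts = Counter(history)
    return [dest for dest, _ in counts.most_common(top_n)]
```

For a record with three supermarket trips, two park trips, and one clinic trip, the supermarket would be offered first.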
In operation 604, the current location of the wheelchair may be determined. Operation 604 may be performed by the processing module 404. In some embodiments, the current location of the wheelchair may be determined based on one or more of the positioning technologies mentioned in the present application; for example, operation 604 may be implemented based on GPS positioning technology. In some embodiments, the positioning technology may also include the Global Navigation Satellite System (GLONASS), the COMPASS navigation system (COMPASS), the Galileo positioning system, the Quasi-Zenith Satellite System (QZSS), wireless fidelity (WiFi) positioning technology, and the like, or any combination of the above. One or more of the above positioning techniques may be used interchangeably in this application.
In operation 606, the travel path of the wheelchair may be determined based on the destination information and the current location. Operation 606 may be performed by the processing module 404. In some embodiments, the travel path of the wheelchair may be recommended by the system or entered by the user. For example, after obtaining the destination information and the current location of the wheelchair, the system may determine a plurality of candidate paths for the user to select from, and the user may choose any one of them as the travel path of the wheelchair. In some embodiments, the user may instead customize a route as the travel path of the wheelchair. For example, suppose the system determines that the wheelchair's current location is "home" and the destination entered by the user is "supermarket", and recommends three candidate routes characterized, respectively, as the shortest, having the fewest traffic lights, and requiring the fewest turns. If the user also needs to go somewhere else (such as a breakfast shop), none of the three recommended routes may meet the user's requirement; in that case, the user can customize the travel path to fit the actual need. In some embodiments, the user may change the travel path of the wheelchair in real time. For example, when the user receives a prompt about something of interest and confirms interest in it, the system can modify the travel path of the wheelchair in real time to meet the user's real-time requirements.
In operation 608, obstacle information in the travel path of the wheelchair may be detected. Operation 608 may be performed by the detection module 406. In some embodiments, the smart wheelchair 110 may have multiple detection devices (e.g., cameras, radar, distance sensors, etc.) mounted on it. Through these detection devices, the system can monitor the surroundings of the smart wheelchair 110, including obstacle information on the travel route. An obstacle may be stationary, such as a vehicle parked at the roadside, or moving, such as a pedestrian walking on the road. In some embodiments, a moving object may initially not be on the travel route, i.e., it does not interfere with the travel of the wheelchair, but during its subsequent movement it may move into the travel path of the wheelchair and interfere with normal travel; such a moving object is also considered an obstacle.
In some embodiments, operation 608 may obtain information about the obstacle, which may include its type, position, size, motion state, etc. For example, a camera may capture a picture of the obstacle, and an image recognition method (e.g., machine learning, deep learning, or an artificial neural network) may determine the type of the obstacle from the picture. As another example, radar and/or a distance sensor may acquire the position of the obstacle relative to the wheelchair, as well as the obstacle's actual size and speed. The acquired obstacle information can serve as a reference for the system's subsequent obstacle avoidance planning.
In operation 610, the movement of the wheelchair may be adjusted based on the detected obstacle information. Operation 610 may be performed by the control module 408. In some embodiments, adjusting the movement of the wheelchair includes adjusting its direction of movement (left or right) and its speed (acceleration or deceleration). For example, when a vehicle parked at the roadside is detected ahead on the right while the smart wheelchair 110 is traveling, the system may steer the smart wheelchair 110 forward and to the left to avoid the vehicle. As another example, when a pedestrian is detected directly ahead while the smart wheelchair 110 is traveling and the user confirms recognizing the pedestrian, the system may accelerate the smart wheelchair to catch up with the pedestrian. As a further example, when an intersection is detected ahead with a car traveling in another direction while the smart wheelchair 110 is traveling, the system may decelerate the smart wheelchair 110 to prevent a collision with that car.
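The obstacle-to-action mapping of operation 610 might be sketched as a threshold rule like the one below. The bearing convention (negative = left of heading, positive = right) and the 1.5 m / 5 m radii are illustrative assumptions, not values from the disclosure.

```python
def avoidance_action(obstacle_bearing_deg: float, distance_m: float,
                     slow_radius_m: float = 5.0, stop_radius_m: float = 1.5) -> str:
    """Map a detected obstacle to a movement adjustment.

    Bearing is relative to the wheelchair heading: negative = left,
    positive = right. Returns one of "brake", "slow_and_steer_left",
    "slow_and_steer_right", or "continue".
    """
    if distance_m < stop_radius_m:
        return "brake"
    if distance_m < slow_radius_m:
        # Steer away from the obstacle's side while slowing down,
        # as in the parked-vehicle example: obstacle front-right -> go left.
        if obstacle_bearing_deg > 0:
            return "slow_and_steer_left"
        return "slow_and_steer_right"
    return "continue"
```

A vehicle detected 3 m ahead on the right would thus produce `"slow_and_steer_left"`, while anything inside 1.5 m triggers braking.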
It should be noted that the above description is for illustrative purposes only and is not intended to limit the scope of the present application. Many variations or modifications may be made as indicated by the teaching of the present application to those skilled in the art. However, such variations and modifications do not depart from the scope of the present application. For example, process 600 may include an operation for preprocessing obstacle related information prior to 610.
Fig. 7 is a flow chart of an exemplary process 700 for determining a travel path of a wheelchair according to some embodiments of the present application. The process 700 may be performed by the processing module 404. In some embodiments, operation 606 shown in fig. 6 may be performed in accordance with the process 700. In some embodiments, one or more operations of the process 700 shown in fig. 7 may be implemented in the intelligent wheelchair autopilot system 100 shown in fig. 1. For example, the process 700 shown in fig. 7 may be stored in the memory 130 in the form of instructions and invoked and/or executed by the processing engine 120 (e.g., the processor 210 of the computing device 200 shown in fig. 2, or the CPU 340 of the mobile device 300 shown in fig. 3).
In operation 702, a plurality of candidate paths may be calculated based on the current location and the destination information. Operation 702 may be performed by the candidate path determination unit 502. In some embodiments, after obtaining the user's destination information and determining the user's current location, the system may determine a plurality of candidate travel paths for the user to select from, based on the obtained destination and current location. In some embodiments, the candidate paths may have different path characteristics, e.g., different lengths, different proportions of trunk roads, whether a specific area is passed, etc. In some embodiments, the path characteristics may be presented to the user together with the corresponding candidate paths as a reference for the user's selection.
In operation 704, the plurality of candidate paths may be scored. Operation 704 may be performed by the path scoring unit 504. In some embodiments, the score is related to the path characteristics of the candidate path. Each path characteristic corresponds to a weight, and the weights of different path characteristics may be the same. Scoring a candidate path may take multiple path characteristics into account, such as path length, number of traffic lights, number of turns required, degree of road congestion, etc. In some embodiments, the system may score each path characteristic and compute the weighted sum of these scores to obtain the composite score of the path. The score may be a numerical score (e.g., 8.0, 9.5, etc.) or a rating (e.g., 3 stars, 5 stars, etc.).
In operation 706, the candidate paths may be ranked according to their scores, and the ranked candidate paths may be presented to the user for selection. Operation 706 may be performed by the ranking unit 506. In some embodiments, the system may rank the candidate paths from highest to lowest score and present the highest-scoring path first. For example, the highest-scoring candidate path may be labeled as the recommended route, with the other candidate paths labeled as alternative routes. As another example, when presenting the candidate paths to the user, the highest-scoring candidate path may be placed at the top or in the most prominent position. As a further example, when the candidate paths are presented on an electronic map, the highest-scoring candidate path may be highlighted (e.g., drawn in a color or transparency different from the other candidate paths).
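The rank-and-label step could be sketched as follows, assuming each candidate path is represented as a dictionary with a precomputed composite `score` (a hypothetical representation, not the system's actual data model).

```python
def rank_candidates(candidates):
    """Sort candidate paths by composite score, highest first, and
    label the best one "recommended" and the rest "alternative"."""
    ordered = sorted(candidates, key=lambda c: c["score"], reverse=True)
    return [
        {**c, "label": "recommended" if i == 0 else "alternative"}
        for i, c in enumerate(ordered)
    ]
```

The labeled list can then be rendered in order, with the recommended route at the top or highlighted on the map.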
In operation 708, the travel path of the wheelchair may be determined according to the user's selection. Operation 708 may be performed by the travel path determination unit 508. In some embodiments, the user may select any one of the candidate paths as the travel path of the wheelchair; for example, the user may directly select the highest-scoring candidate path. In some embodiments, the user can customize the travel path of the wheelchair according to the actual situation. For example, a user may need to first stop at another location different from the destination, and may set the travel path of the wheelchair through an input device (a manual input device, a voice input device, or another input device with similar functions) to meet that demand.
It should be noted that the above description is for illustrative purposes only and is not intended to limit the scope of the present application. Many variations or modifications may be made as indicated by the teaching of the present application to those skilled in the art. However, such variations and modifications do not depart from the scope of the present application. For example, process 700 may include an operation for filtering candidate routes to remove partially unsuitable candidate routes before 704.
Figure 8 is a schematic view of intelligent wheelchair autopilot according to some embodiments of the present application. As shown in fig. 8, A is the current location of the wheelchair, B is the user's destination, and C is a point of interest to the user. After determining the user's destination B and the current wheelchair location A, the system plans 3 candidate routes (denoted 1, 2, and 3, respectively) for the user to select from. Route 1 has the shortest distance, route 2 has the highest proportion of trunk sections, and route 3 passes through the point of interest C.
Various aspects of the methods outlined above, and/or other steps of those methods, may be implemented by a program. Program portions of the technology may be thought of as "products" or "articles of manufacture" in the form of executable code and/or associated data embodied in or carried by a computer-readable medium. Tangible, non-transitory storage media include the memory or storage of any computer, processor, or similar device, or associated modules, such as various semiconductor memories, tape drives, disk drives, or similar devices capable of providing storage for software at any time.
All or a portion of the software may at times communicate over a network, such as the Internet or another communication network. Such communication enables software to be loaded from one computer device or processor to another, for example, from a management server or host computer of the on-demand service system to the hardware platform of a computing environment, or to another computing environment implementing the system or similar functionality related to providing the information needed for on-demand services. Thus, another medium capable of transferring software elements, such as optical, electrical, or electromagnetic waves propagating through cables, optical fiber, or the air, may also serve as a physical connection between local devices. The physical medium used for the carrier wave, such as an electric, wireless, or optical cable, may likewise be considered a medium carrying the software. As used herein, unless limited to a tangible "storage" medium, terms referring to a computer or machine "readable medium" refer to media that participate in the execution of any instructions by a processor.
Thus, a computer-readable medium may take many forms, including but not limited to a tangible storage medium, a carrier-wave medium, or a physical transmission medium. Stable storage media include optical or magnetic disks and other storage in computers or similar devices capable of implementing the system components described in the figures. Volatile storage media include dynamic memory, such as the main memory of a computer platform. Tangible transmission media include coaxial cables, copper cables, and optical fiber, including the wires that form a bus within a computer system. Carrier-wave transmission media may convey electrical, electromagnetic, acoustic, or light-wave signals, which may be generated by radio-frequency or infrared data communication methods. Common computer-readable media include hard disks, floppy disks, magnetic tape, or any other magnetic medium; CD-ROM, DVD-ROM, or any other optical medium; punch cards or any other physical storage medium containing a pattern of holes; RAM, PROM, EPROM, FLASH-EPROM, or any other memory chip or cartridge; a carrier wave conveying data or instructions, a cable or link transmitting such a carrier wave, or any other program code and/or data readable by a computer. These forms of computer-readable media may carry any type of program code for causing a processor to execute instructions, communicate one or more results, and/or the like.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service such as software as a service (SaaS).
Those skilled in the art will appreciate that various modifications and improvements may be made to the disclosure herein. For example, the different system components described above are implemented by hardware devices, but may also be implemented by software solutions only, e.g., by installing the system on an existing server. Further, the provision of the location information disclosed herein may be implemented via firmware, a firmware/software combination, a firmware/hardware combination, or a hardware/firmware/software combination.
The foregoing describes the present application and/or some other examples. The present application can be modified in various ways in light of the above. The subject matter disclosed herein can be implemented in various forms and examples, and the present application can be applied to a wide variety of applications. All applications, modifications and variations that are claimed in the following claims are within the scope of this application.

Claims (10)

  1. A method for controlling the autopilot of a wheelchair, comprising:
    acquiring destination information;
    determining a current location of the wheelchair;
    determining a driving path of the wheelchair according to the current position and the destination information;
    detecting obstacle information in a travel path of a wheelchair; and
    adjusting movement of the wheelchair based on the detected obstacle information.
  2. The method of claim 1, wherein the destination information is obtained by any one of manual input by a user, voice input, and system-recommended input.
  3. The method of claim 1, wherein determining a travel path for a wheelchair based on the current location and destination information comprises:
    calculating a plurality of candidate paths according to the current position and the destination information;
    scoring the plurality of candidate paths;
    ranking the plurality of candidate paths according to the scores, and presenting the ranked candidate paths to a user for selection;
    and determining the driving path of the wheelchair according to the selection result of the user.
  4. The method of claim 1, wherein after determining the travel path of the wheelchair, allowing the user to set the travel speed of the wheelchair.
  5. The method of claim 1, wherein the obstacle information includes a type, location, size, and motion status of an obstacle.
  6. The method of claim 1, wherein the adjusting movement of the wheelchair comprises adjusting speed of the wheelchair, controlling wheelchair steering, and controlling wheelchair braking.
  7. The method of claim 1, further comprising providing the user with a point of interest, and automatically adjusting the travel path when the user confirms interest in the point of interest.
  8. A system for controlling automatic driving of a wheelchair, characterized by comprising an acquisition module, a processing module, a detection module, and a control module, wherein:
    the acquisition module is configured to acquire destination information;
    the processing module is configured to determine a current position of the wheelchair and to determine a travel path of the wheelchair according to the current position and the destination information;
    the detection module is configured to detect obstacle information in the travel path of the wheelchair; and
    the control module is configured to adjust movement of the wheelchair according to the detected obstacle information.
  9. The system of claim 8, wherein the processing module comprises a candidate path determination unit, a path scoring unit, a ranking unit, and a travel path determination unit, wherein:
    the candidate path determination unit calculates a plurality of candidate paths according to the current position and the destination information;
    the path scoring unit scores the candidate paths;
    the ranking unit ranks the plurality of candidate paths according to the scores and presents the ranked candidate paths to the user for selection; and
    the travel path determination unit determines the travel path of the wheelchair according to the user's selection.
  10. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed, performs a method comprising:
    acquiring destination information of the wheelchair and current position information of the wheelchair, and determining a travel path of the wheelchair according to the destination information and the current position information; and
    detecting obstacle information in the travel path of the wheelchair, and adjusting movement of the wheelchair according to the detected obstacle information.
CN201780098026.6A 2017-12-28 2017-12-28 Automatic driving method and system for intelligent wheelchair and computer readable medium Pending CN111542295A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/119542 WO2019127261A1 (en) 2017-12-28 2017-12-28 Method for automatic driving of smart wheelchair, system and computer readable medium

Publications (1)

Publication Number Publication Date
CN111542295A 2020-08-14

Family

ID=67064360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780098026.6A Pending CN111542295A (en) 2017-12-28 2017-12-28 Automatic driving method and system for intelligent wheelchair and computer readable medium

Country Status (2)

Country Link
CN (1) CN111542295A (en)
WO (1) WO2019127261A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113706790A (en) * 2021-09-28 2021-11-26 平安国际智慧城市科技股份有限公司 Method, system, device, equipment and medium for driving assistance
CN115154080A (en) * 2022-07-07 2022-10-11 广东职业技术学院 Anti-collision system and method for electric wheelchair
CN117277513A (en) * 2023-11-15 2023-12-22 深圳安培时代数字能源科技有限公司 Energy storage device control method, controller and storage medium
CN117277513B (en) * 2023-11-15 2024-02-27 深圳安培时代数字能源科技有限公司 Energy storage device control method, controller and storage medium

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080300777A1 (en) * 2002-07-02 2008-12-04 Linda Fehr Computer-controlled power wheelchair navigation system
DE202012005212U1 (en) * 2012-05-25 2012-07-04 Mitac Europe Ltd. Personal navigation device for generating exercise routes for a user
CN102631265A (en) * 2012-05-11 2012-08-15 重庆大学 Embedded control system of intelligent wheelchair
CN103699124A (en) * 2013-12-04 2014-04-02 北京工业大学 Fuzzy neural network control method for omni-directional intelligent wheelchair to avoid obstacle
CN104083258A (en) * 2014-06-17 2014-10-08 华南理工大学 Intelligent wheel chair control method based on brain-computer interface and automatic driving technology
CN104161629A (en) * 2014-06-27 2014-11-26 西安交通大学苏州研究院 Intelligent wheelchair
KR20150070764A (en) * 2013-12-17 2015-06-25 인제대학교 산학협력단 Navigation apparatus for powered wheelchair and method thereof
CN105094131A (en) * 2015-08-06 2015-11-25 东圳医疗器械(上海)有限公司 Automatic path searching method suitable for electric wheelchair
CN105398389A (en) * 2015-12-23 2016-03-16 安徽安凯汽车股份有限公司 Automobile safe driving auxiliary detection system and method
CN106428000A (en) * 2016-09-07 2017-02-22 清华大学 Vehicle speed control device and method
CN106557474A (en) * 2015-09-24 2017-04-05 北京四维图新科技股份有限公司 Obtain the method and device of POI, database, navigation terminal and automobile on the way
CN106983613A (en) * 2017-04-12 2017-07-28 深圳市元征科技股份有限公司 The control method of intelligent wheel chair and intelligent wheel chair
CN107036610A (en) * 2016-11-18 2017-08-11 四川研宝科技有限公司 Method, terminal and server for pushing data to a user based on a predetermined path
CN206493897U (en) * 2017-02-06 2017-09-15 宿迁学院 Vehicle environment perception system and autonomous driving vehicle

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103110485B (en) * 2013-02-25 2015-07-08 武汉理工大学 Multifunctional intelligent medical guardian wheelchair and wheelchair control method



Also Published As

Publication number Publication date
WO2019127261A1 (en) 2019-07-04

Similar Documents

Publication Publication Date Title
CN109429518B (en) Map image based autonomous traffic prediction
CN110641472B (en) Safety monitoring system for autonomous vehicle based on neural network
US11328219B2 (en) System and method for training a machine learning model deployed on a simulation platform
US20220245950A1 (en) Association and Tracking for Autonomous Devices
US20210276587A1 (en) Systems and Methods for Autonomous Vehicle Systems Simulation
CN110691957B (en) Path planning system and method based on deep convolutional neural network
KR20180074676A (en) Dynamic adjustment of steering ratio of autonomous vehicle
US11762094B2 (en) Systems and methods for object detection and motion prediction by fusing multiple sensor sweeps into a range view representation
WO2022109000A1 (en) Systems and methods for video object segmentation
WO2020142548A1 (en) Autonomous routing system based on object ai and machine learning models
US11507090B2 (en) Systems and methods for vehicle motion control with interactive object annotation
US11670286B2 (en) Training mechanism of verbal harassment detection systems
US20220058314A1 (en) Hardware In Loop Testing and Generation of Latency Profiles for Use in Simulation
CN111542295A (en) Automatic driving method and system for intelligent wheelchair and computer readable medium
CN113424209B (en) Trajectory prediction using deep learning multi-predictor fusion and Bayesian optimization
CN117416344A (en) State estimation of school bus in autonomous driving system
CN116776151A (en) Automatic driving model capable of performing autonomous interaction with outside personnel and training method
US20230059370A1 (en) Gaze and awareness prediction using a neural network model
CN115675528A (en) Automatic driving method and vehicle based on similar scene mining
CN115366920A (en) Decision method and apparatus, device and medium for autonomous driving of a vehicle
CN115115084A (en) Predicting future movement of an agent in an environment using occupancy flow fields
CN114394111A (en) Lane changing method for autonomous vehicle
WO2021056327A1 (en) Systems and methods for analyzing human driving behavior
CN116859724B (en) Automatic driving model for simultaneous decision and prediction of time sequence autoregressive and training method thereof
US20230234617A1 (en) Determining perceptual spatial relevancy of objects and road actors for automated driving

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2020-08-14)