AU2020103161A4 - A novel method for a smart car - artificial intelligence based autonomous steering control system with voice assistance

A novel method for a smart car - artificial intelligence based autonomous steering control system with voice assistance

Info

Publication number
AU2020103161A4
AU2020103161A4
Authority
AU
Australia
Prior art keywords
vehicle
steering
systems
intelligent
smart
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2020103161A
Inventor
Nalini A
Sheeba Percis E
Jayarajan J
Haribabu K
Arumugam K.
Bhuvaneswari S
Kaliappan S
Jenish T
Balaji V
Uvaraja V.C.
Ganesan V.G
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
A Nalini Dr
E Sheeba Percis Dr
K Arumugam Dr
K Haribabu Dr
S Bhuvaneswari Mrs
VC Uvaraja Dr
Original Assignee
A Nalini Dr
E Sheeba Percis Dr
K Arumugam Dr
S Bhuvaneswari Mrs
V C Uvaraja Dr
V G Ganesan Mr
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by A Nalini Dr, E Sheeba Percis Dr, K Arumugam Dr, S Bhuvaneswari Mrs, V C Uvaraja Dr, V G Ganesan Mr filed Critical A Nalini Dr
Priority to AU2020103161A priority Critical patent/AU2020103161A4/en
Assigned to A, NALINI, J, JAYARAJAN, K, HARIBABU, K., ARUMUGAM, S, BHUVANESWARI, S, KALIAPPAN, T, JENISH, V, BALAJI, V.C., UVARAJA, V.G, GANESAN, E, SHEEBA, RAJKAMAL, M.D. reassignment A, NALINI Amend patent request/document other than specification (104) Assignors: A, NALINI, E, SHEEBA, J, JAYARAJAN, K, HARIBABU, K., ARUMUGAM, S, .KALIAPPAN, S, BHUVANESWARI, S, KALIAPPAN, T, JENISH, V, BALAJI, V.C., UVARAJA, V.G, GANESAN
Application granted granted Critical
Publication of AU2020103161A4 publication Critical patent/AU2020103161A4/en
Ceased legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09Taking automatic action to avoid collision, e.g. braking and steering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D6/00Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
    • B62D6/002Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits computing target steering angles for front or rear wheels
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0088Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/027Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising intertial navigation means, e.g. azimuth detector
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096833Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route
    • G08G1/096844Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route where the complete route is dynamically recomputed based on new data
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/164Centralised systems, e.g. external to vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/175Autonomous driving
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/176Camera images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408Radar; Laser, e.g. lidar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2300/00Purposes or special features of road vehicle drive control systems
    • B60Y2300/08Predicting or avoiding probable or impending collision
    • B60Y2300/09Taking automatic action to avoid collision, e.g. braking or steering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2300/00Purposes or special features of road vehicle drive control systems
    • B60Y2300/08Predicting or avoiding probable or impending collision
    • B60Y2300/095Predicting travel path or likelihood of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2400/00Special features of vehicle units
    • B60Y2400/30Sensors
    • B60Y2400/301Sensors for position or displacement
    • B60Y2400/3015Optical cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2400/00Special features of vehicle units
    • B60Y2400/30Sensors
    • B60Y2400/301Sensors for position or displacement
    • B60Y2400/3017Radars
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3407Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9316Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles combined with communication equipment with other vehicles or with base stations

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Game Theory and Decision Science (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)

Abstract

To minimize road danger caused by minor driver failures, the idea of an intelligent steering system was developed. In an emergency, the smart steering system manoeuvres the steering wheel in situations that might otherwise lead to a vehicle accident or a collision with an obstacle on the lane. Intelligent steering systems are an integral part of smart parking, lane management and other collision-prevention systems. Given the rising demand for automatic vehicle systems, demand for intelligent steering systems is expected to increase over the forecast period. Increased purchasing power and the younger generation's preference for automated vehicle systems are expected to be the main drivers of the intelligent steering control system market. Features such as voice inputs and hand gestures make sophisticated, intelligent steering systems extremely appealing to customers. The production of fully self-driving cars, however, is anticipated to hamper demand for smart steering systems by eliminating the steering wheel from the driver's cab. The intelligent steering system market is segmented by technology, steering wheel type, input technology, vehicle type and region. By technology, the smart steering system market can be divided into RADAR-based and LiDAR-based systems. RADAR-based systems likely account for a large proportion of the smart steering system market, as their lower cost makes them preferable in most automobiles. These devices are used by the steering actuator to identify obstacles along the vehicle's path and control the steering wheel of the vehicle to prevent a collision. Fig: Autonomous vehicle using multi-sensor. Fig: AI model for autonomous vehicle including data collection, planning and act.

Description

Fig: Autonomous vehicle using multi-sensor
Fig: AI model for autonomous vehicle including data collection, planning and act
Editorial Note 2020103161: There are 11 pages of Description only.
TITLE OF THE INVENTION
ARTIFICIAL INTELLIGENCE-BASED AUTONOMOUS STEERING CONTROL SYSTEM WITH VOICE ASSIST
APPLICANT
The following specification particularly describes the invention and the manner in which it is to be performed
ARTIFICIAL INTELLIGENCE-BASED AUTONOMOUS STEERING CONTROL SYSTEM WITH VOICE ASSIST
FIELD OF THE INVENTION
Artificial Intelligence (AI) is a machine-intelligence tool that offers massive potential for the intelligent revolution in industry. It enables data/information collection, the identification of alternatives, the selection among options, specific activities, decision-making and intelligent prediction.
On the other hand, the Internet of Things (IoT) is the core element of the Industry 4.0 revolution and involves a worldwide infrastructure of storage, applications, sensing, advanced services and networking technology to capture and process data/information. By combining high-speed, resilient, low-latency networking with AI and IoT technologies, the complementary real-world and digital know-how of Industry 4.0 can be transformed into a truly smart AV. It has been shown that 90 percent of auto accidents are caused by human error, and that the safest drivers are roughly ten times better than the average driver. Automated vehicle safety is critical, and users expect a level of risk roughly 1000 times smaller. The notable advantages of AVs are: (1) improved vehicle safety, (2) a reduction in injuries, (3) a reduction in fuel usage, and (4) release of driver time and new business opportunities. AVs, however, rely on large-scale sensor and system data/information.
The volume of AV data/information (about 1 GB per second of processing) used for Advanced Driver Assistance Systems (ADAS) and entertainment is growing. Hardware and software specifications, using sensors, actuators and software, must also be designed to cope with functions comparable to the superhuman brain aimed at by AI. AV sensors and equipment produce information such as time, date, motion detection, navigation, fuel consumption, voice recognition, vehicle speed, acceleration and deceleration, cumulative miles, voice search, eye and driver monitoring, image recognition, sentiment analytics, gesture control and virtual assistance. The overall estimate for 100,000 vehicles is about 100 terabytes per year.
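To put the cited 1 GB/s processing rate in perspective, a hedged back-of-envelope calculation follows; the 2 hours of driving per day assumed here is an illustrative figure, not from the specification.

```python
# Back-of-envelope estimate of per-vehicle AV data volume.
# The driving-hours figure is an illustrative assumption.
PROCESSING_RATE_GB_PER_S = 1          # ~1 GB/s, as cited in the text
ASSUMED_DRIVING_HOURS_PER_DAY = 2     # assumption for illustration only

def daily_data_gb(rate_gb_s: float, hours: float) -> float:
    """Data processed per vehicle per day at a given sustained rate."""
    return rate_gb_s * hours * 3600

gb_per_day = daily_data_gb(PROCESSING_RATE_GB_PER_S, ASSUMED_DRIVING_HOURS_PER_DAY)
print(f"{gb_per_day:.0f} GB/day per vehicle")  # 7200 GB/day per vehicle
```

Even under these modest assumptions, the raw stream is orders of magnitude larger than the ~100 TB/year fleet figure, which suggests that only a refined fraction of the raw data is ever retained.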
BACKGROUND OF THE INVENTION
These data will increase further as connected vehicles (CVs) become increasingly available. As AVs advance, industrial producers and distributors gain new opportunities, so businesses can use AI to boost their clients' value. For processing this data/information through AI, Machine Learning (ML) algorithms are the most robust approach. ML algorithms help establish behaviour patterns for specific driver profiles and give vehicle owners, through the corresponding application, precisely what they need, both in the vehicle and on their mobile phones. They do so by recording drivers' actions and evaluating their driving history and the situation on the lane. Via various IoT networks, including Local Area Network (LAN), Wide Area Network (WAN), Wireless Sensor Network (WSN) and Personal Area Network (PAN), AI can manage broad AV information, but some additional inputs, such as traffic, emotion and experience, have to be collected through these networks. This considerable data/information requires components that allow it to be collected and exchanged, such as embedded electronic devices, sensors, automobiles, buildings, software and network connectivity. These IoT-enabled AVs use a range of integrated technologies, such as enhanced protection, fuel-consumption reduction and vehicle safety, to provide effective assistance in real time. Industry 4.0 and IoT can become a significant stimulus by reducing system failure, enhancing quality control, increasing efficiency and simultaneously decreasing costs. IoT technology's potential and projections are remarkable.
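A minimal sketch of how ML-style behaviour profiling from trip data might look is given below; the features, thresholds and labels are illustrative assumptions, not the patent's method.

```python
# Hedged sketch: deriving a coarse driver-behaviour profile from trip
# statistics. Thresholds and labels are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TripStats:
    mean_speed_kmh: float
    hard_brakes_per_100km: float

def classify_driver(trips: list[TripStats]) -> str:
    """Aggregate trip statistics into a coarse behaviour profile."""
    avg_speed = sum(t.mean_speed_kmh for t in trips) / len(trips)
    avg_brakes = sum(t.hard_brakes_per_100km for t in trips) / len(trips)
    if avg_speed > 90 or avg_brakes > 5:
        return "aggressive"
    if avg_speed < 40:
        return "cautious"
    return "moderate"

profile = classify_driver([TripStats(95, 6), TripStats(100, 4)])
print(profile)  # aggressive
```

In a real system the thresholds would themselves be learned from the accumulated driving history rather than hand-set.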
A study by Morgan Stanley Research claimed that, with many superior technologies, features and services, at least nine industrial sectors would benefit from AVs: (1) Original Equipment Manufacturers (OEMs), (2) auto dealers, (3) autonomous trucks, (4) chemical engineering, (5) electric utilities, (6) semiconductors, (7) IT hardware and software, (8) telecom and communications, and (9) beverage and restaurant sectors.
Artificial Intelligence: Benefits and Challenges
Industry 4.0 is an industrial revolution trend encompassing technology, automation and data exchange. Present businesses face new competition and demand problems and must undergo a drastic transition toward Industry 4.0. Artificial Intelligence (AI) offers better and more dynamic decision-making for Industry 4.0 and results in increased enterprise efficiency, fewer machine faults, improved quality management, higher productivity and lower costs.
While AI is likely to change the world, it has its limits. AI's key difficulty is learning from experience: human expertise cannot easily be translated into machine learning. Furthermore, its results reflect any inaccuracies in the data, which are difficult to detect and correct.
Artificial Intelligence: History and Approaches
AI is driven by broad data combinations. With iterative processing, it works through data very rapidly using intelligent algorithms that allow the programme to learn features or patterns from the data. AI has become more prevalent since machine vision error recently declined significantly (below 5 percent) relative to human vision error.
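The iterative learning described above can be sketched with the simplest possible example: a perceptron that learns the AND pattern by repeatedly correcting its weights over the data. This is an illustration of iterative learning in general, not the specific algorithms used in AVs.

```python
# Minimal illustration of iterative learning: a perceptron adjusts its
# weights on each pass over the examples until it reproduces AND.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - pred          # correction signal for this sample
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
preds = [1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0 for x, _ in data]
print(preds)  # [0, 0, 0, 1]
```

Each epoch nudges the weights toward the pattern, which is the "learning from data functionality or patterns" the text refers to, in miniature.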
Autonomous Vehicle
An autonomous vehicle (AV) is a vehicle that can guide itself instead of being driven by a human. The AV has grown into a form of driverless vehicle that represents the future of computer-controlled driving. AVs are pursued because of: (1) improved vehicle safety, (2) reduced injuries, (3) reduced fuel consumption, (4) release of driver time and business opportunities, (5) new market opportunities, and (6) decreased emissions and dust particulates. Around 10 million AVs were expected to be on the road by the year 2020, and AVs are expected to generate cumulative revenue of 7 trillion USD by 2050.
Automated Vehicle: Levels and History
Advanced Driver Assistance System (ADAS) vehicles are classified into six tiers of automation. The stages from no automation to fully self-sufficient automobiles are: Level 0 - no automation; the human carries out all the complex driving tasks such as accelerating, slowing down, steering and braking. Level 1 - driver assistance via an acceleration/deceleration system or steering, with driver awareness. Level 2 - partial automation of the car, where both acceleration/deceleration and steering functions are combined. Level 3 - conditional automation of the driving mode, where the automated driving system executes the driving task provided the driver responds to requests to intervene. Level 4 - high automation, which enables all driving tasks to be carried out under certain circumstances even if a human driver does not respond to a request. Level 5 - full automation; the vehicle performs all driving jobs/functions under all situations.
Levels 4 and 5 allow AVs to perform all driving capabilities using several systems and sensors to control a driverless car. The six automated vehicle levels are detailed in the section "AV Objective Sensors" (ADAS Technologies, Sensors and Controls).
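The six tiers above can be captured as a simple lookup table; the short phrasings and the helper function are an illustrative condensation of the levels just described, not normative definitions.

```python
# The six automation tiers described above, as a simple lookup table.
SAE_LEVELS = {
    0: "No automation: the human performs all driving tasks",
    1: "Driver assistance: steering OR acceleration/deceleration support",
    2: "Partial automation: combined steering and acceleration/deceleration",
    3: "Conditional automation: system drives, driver must answer requests",
    4: "High automation: no driver response needed under certain conditions",
    5: "Full automation: vehicle drives itself under all conditions",
}

def driver_must_be_ready(level: int) -> bool:
    """Levels 0-3 still rely on a human being ready to intervene."""
    return level <= 3

print(driver_must_be_ready(3), driver_must_be_ready(4))  # True False
```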
Artificial Intelligence in Autonomous Vehicle
Step 1: Data Collection
AVs are designed to generate a large amount of data about the vehicle and its surroundings using multiple sensors and equipment, including radars, cameras and communication devices. These data cover lanes, road facilities, all other vehicles, parking, traffic information and environmental information similar to that available to a human driver, as well as any other item on or close to the road. These data are then forwarded as refined information to be processed by the AV. This is the AV's first contact with the particular vehicle circumstances and conditions.
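The collection step can be sketched as fusing per-sensor readings into one timestamped environment snapshot; the sensor fields and record layout below are illustrative assumptions, not the patent's data model.

```python
# Hedged sketch of Step 1: merging raw multi-sensor readings into a
# single timestamped record for the AV to process. Field names are
# illustrative assumptions.
def collect_snapshot(timestamp: float, camera: dict, radar: dict, gps: dict) -> dict:
    """Merge per-sensor readings into one environment snapshot."""
    return {
        "t": timestamp,
        "lane_offset_m": camera.get("lane_offset_m"),
        "obstacle_dist_m": radar.get("nearest_obstacle_m"),
        "position": (gps.get("lat"), gps.get("lon")),
    }

snap = collect_snapshot(
    12.5,
    camera={"lane_offset_m": 0.3},
    radar={"nearest_obstacle_m": 42.0},
    gps={"lat": -37.81, "lon": 144.96},
)
print(snap["obstacle_dist_m"])  # 42.0
```

In a real vehicle each sensor streams asynchronously, so the fusion step would also align timestamps before merging.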
Step 2: Path Planning
The massive data from the AV's devices on each route are saved and added to a database called Big Data, built from past driving experiences. An AI agent then uses this big data to make meaningful strategic decisions. The path-planning control strategy of AVs helps self-driving vehicles determine the shortest, most comfortable and most economical routes between points A and B, using previous driving experience that enables the AI agent to make far more precise future decisions. All the static and moving obstacles that a vehicle must detect and bypass make route-seeking difficult. Path planning means finding a geometric path from an initial configuration to a goal configuration such that every configuration and state along the track is feasible (if time is taken into account). The route-planning strategy also includes manoeuvre planning, which decides the best high-level action for the vehicle when moving from one feasible state to the next in real time while respecting the vehicle's kinematic limits, based on its dynamics. In every driving situation, the AV thus knows precisely what to do.
Step 3: Act
The AV can detect road objects and navigate through highways, parking lots, obstacles, traffic lights, bikes, pedestrians, work areas, weather conditions and other vehicles without the intervention of a human driver, reaching the destination safely based on the decisions taken by the AI agent.
AVs also come with AI-based control and functional systems such as steering control, pedal speed, voice and speech recognition, brake-pedal monitoring, visual monitoring, protection systems, gesture controls, fuel consumption and other driver support/monitoring systems. This AV process loop, comprising data collection, route planning and execution, takes place repetitively. The more data loops are generated, the smarter the AI agent becomes, and the more accurate its decisions are, especially in complex driving situations.
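The repetitive collect-plan-act loop described above can be sketched as follows; the perception inputs, thresholds and action names are illustrative assumptions, not the patent's control logic.

```python
# Hedged sketch of the repetitive collect -> plan -> act loop.
# Inputs, thresholds and action names are illustrative assumptions.
def plan_action(obstacle_dist_m: float, lane_offset_m: float) -> str:
    """Pick a high-level action from the latest sensor snapshot."""
    if obstacle_dist_m < 10:
        return "brake"
    if abs(lane_offset_m) > 0.5:
        return "steer_correct"
    return "cruise"

def drive_loop(snapshots):
    """One action per collected snapshot: collect -> plan -> act."""
    return [plan_action(s["obstacle_dist_m"], s["lane_offset_m"]) for s in snapshots]

log = drive_loop([
    {"obstacle_dist_m": 50.0, "lane_offset_m": 0.1},
    {"obstacle_dist_m": 50.0, "lane_offset_m": 0.9},
    {"obstacle_dist_m": 5.0, "lane_offset_m": 0.0},
])
print(log)  # ['cruise', 'steer_correct', 'brake']
```

Each iteration would also feed its snapshot and outcome back into the stored driving history, which is what makes the agent improve over time.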
Autonomous Vehicle Challenges
Road Conditions: Road conditions evolve rapidly and vary from place to place. Smooth, well-marked wide roads are available in some places, while several other places have significantly degraded road conditions with no lane markings. Lanes are not defined; potholed, mountain and tunnel roads do not offer simple, consistent external guidance.
Weather Conditions: The weather is another spoilsport. Conditions may be sunny and clear, or rainy and stormy. AVs should operate in all kinds of weather conditions, with no room at all for malfunction or downtime.
Traffic Conditions: AVs would have to drive on the road under all kinds of traffic
conditions, sharing the road with other AVs and with many human drivers at the same time.
Wherever humans are involved, emotions are involved. Traffic can be extremely
moderate and orderly, but people also violate traffic laws. Unforeseen circumstances can put
an object in the vehicle's path. In dense traffic, even movement of a few centimetres per
minute matters: one cannot wait indefinitely for traffic to clear, or require ideal
preconditions before starting to move. If many such cars wait on the road for the traffic to
clear, this may eventually lead to a traffic deadlock.
Accident Liability: Liability for injuries is the most critical issue for AVs. Who is responsible
for AV accidents? In an AV, the software is the main driving force of the
car and makes all the crucial decisions. Although early designs placed a person
physically behind the steering wheel, Google's latest designs include no
dashboard and no wheel. In such designs, where the car has no controls such as a steering
wheel, a brake pedal or an accelerator pedal, how is the person in the vehicle expected to
intervene in an unfolding incident? Moreover, the whole point of AVs is that the
occupants are in a comfortable role and need not pay close attention to traffic conditions.
By the time their attention is needed, it may already be too late to act.
Radar Interference: AVs use lasers and radar for navigation. The lasers are mounted on the
rooftop while the radar sensors are attached to the vehicle's body.
Integrating Artificial Intelligence with Edge Computing for Autonomous Vehicles
To build an AI model for AVs that uses edge computing, we have to change the conventional
cloud model, in which all data storage and analytics take place in the cloud and which hosts
the control mechanisms and database system. To develop the new model, processing and
training need to be split into two modules that run jointly on the edge and in the cloud. In an
AI-based AV using edge computing, data from the AV are moved to the edge node for
pre-processing and decision-making. Data from the IoT sensors are processed locally at the
edge, while data from the edge nodes are aggregated and transmitted to the cloud for global,
offline, less time-sensitive decisions. Thus time-sensitive decisions, such as identifying
obstacles or avoiding crashes, are made at the edge node in a much shorter time, while route,
traffic and driving-trend data are analysed in the cloud to enhance road safety and the driving
experience. The AI models in the edge node can be dynamic and modified to comply with
policies, applicable regulations and customer requirements. Because data are pre-processed,
filtered and cleaned in the edge node before upload, the amount of data transferred to the
cloud is far lower than the amount produced by the IoT sensors in AVs, so bandwidth and
costs can be considerably reduced.
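A minimal sketch of the edge-side filtering described above. The sample fields and anomaly rules are invented for illustration; the point is only that the uploaded batch is much smaller than the raw sensor stream.

```python
def edge_preprocess(samples, limit=10):
    """Filter and cap raw IoT readings at the edge node before upload.

    Only readings flagged as anomalous (here: overspeed or an obstacle
    detection) are forwarded to the cloud; routine readings stay local.
    """
    anomalous = [s for s in samples
                 if s["speed_kmh"] > 120 or s["obstacle"]]
    return anomalous[:limit]

# Raw stream from the vehicle's sensors (illustrative values).
raw = [{"speed_kmh": 60,  "obstacle": False},
       {"speed_kmh": 130, "obstacle": False},
       {"speed_kmh": 80,  "obstacle": True},
       {"speed_kmh": 90,  "obstacle": False}]

upload = edge_preprocess(raw)  # only 2 of 4 readings leave the edge
```

Time-critical reactions (obstacle avoidance) would be triggered directly at the edge on the full stream; only the filtered summary travels to the cloud for offline analysis, which is where the bandwidth saving comes from.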
Energy and Emissions Implications of Autonomous Vehicles
In the United States, the use of light-duty passenger cars accounts for almost 20% of national
GHG emissions (EPA, 2013a). It is also a large contributor to conventional air pollution, such
as smog and ground-level ozone, and accounts for around 60% of petroleum consumption
(Davis, Diegel and Boundy, 2013). As with safety, congestion and land use, the transition
to AVs could, at least in the long term, have a significant effect on energy usage, GHG
emissions and conventional air pollution in the transport sector. Three factors determine whether AVs improve or worsen energy usage and environmental outcomes: the fuel efficiency of AVs, the carbon intensity and life-cycle emissions profile of AV fuel, and the overall shift in VMT resulting from the use of AVs. For each of these three variables, we address the possible scope and direction of the change AVs might bring.
Policymakers and other actors can use this knowledge to understand how near-term policies
can influence the potential energy and environmental outcomes of AVs.
Parking is a source of significant, steady revenue for many cities. AVs could kill this
source of income by making nearby parking unnecessary. Although tax-revenue-generating
uses can substitute for parking, a shift to AVs could disrupt municipal finances substantially.
Others have noted possible social-equity problems posed by AV innovations, arguing that an
emphasis on AVs would distract us from the transit system (Arieff, 2013). Rather than
improving transport in ways that benefit everyone, AVs could simply reinforce our
individualist, auto-centred society by drawing riders away from public transport. One of the
major attractions of public transport is that one can read or use a smartphone during the
journey; once these practices are possible in private vehicles, fewer people may use public
transit. In turn, this could lower fare income and force public transit to cut service or raise
fares, creating a vicious spiral of declining ridership. AVs are also expected, at least initially,
to be considerably more costly than traditional cars, exacerbating the divide between the rich
and the poor. However, these outcomes are not predetermined, and several policy instruments
are available to address them. Jobs will be lost, too.
WORKING PRINCIPLE
The software recognises traffic signs and stores them, together with Google Maps and GPS
data, in a common database. The GPS software component registers travel signs and
directions. Any vehicle in traffic running the software records newly identified signs and updates the confidence and recognition level for other drivers. Another software component recognises the dividing lines between lanes. It uses three cameras to precisely measure the car's position in the lane and where the lane edges are, and can project the route for the next few seconds even without traffic signs. A further component identifies other vehicles in webcam images using artificial intelligence. The algorithms are implemented in parallel and distributed across machines. I built a semaphore-management software framework to coordinate data processing and control on three separate multi-core computers.
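The semaphore-based coordination mentioned above might look like the following sketch, using Python threads in place of the three separate computers. The worker contents and camera identifiers are illustrative assumptions.

```python
import threading

# A semaphore caps how many perception workers run concurrently,
# while a lock serialises writes to the shared decision state.
slots = threading.Semaphore(2)   # at most 2 workers process at once
state_lock = threading.Lock()
results = []

def perception_worker(camera_id):
    with slots:                      # acquire a processing slot
        detection = f"vehicle-from-camera-{camera_id}"
        with state_lock:             # protect the shared result list
            results.append(detection)

# One worker per camera, standing in for the three computers.
threads = [threading.Thread(target=perception_worker, args=(i,))
           for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The same pattern, with inter-machine primitives instead of in-process ones, would let the three computers hand detections to the shared control framework without corrupting its state.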
Besides, this innovation includes a homemade LIDAR (3D radar) and OpenGL software to
create a 3D world for the car to navigate. Using it, the car can choose to avoid obstacles, and
the 3D radar improves the confidence of decisions in the software framework. I also used
another training technique for the network: a car was driven for about 100 km with an EEG
sensor, and over 1,000 human brain samples were taken as the driver performed actions such
as acceleration, braking, and left or right turns. Every situation was mapped to brain samples.
Control is obtained by tapping into the vehicle's Controller Area Network (CAN) bus.
We represent the steering command as 1/r, where r is the turning radius in metres, so that
our system is independent of the car's geometry. We use 1/r instead of r to avoid the
singularity of straight driving (where the turning radius is infinite); 1/r transitions smoothly
through zero from left turns (negative values) to right turns (positive values). The training
data consist of individual video frames paired with the corresponding steering command
(1/r). Training with human-driver data alone is not enough: the network must also learn how
to recover from errors, or the car will slowly drift out of its lane. The training data are
therefore augmented with additional images showing the vehicle in various shifts from the
centre of the lane and rotations from the lane direction. Images for two particular off-centre
shifts can be taken from the left and right cameras.
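The 1/r encoding and the off-centre label correction described above can be sketched as follows. The correction gain and the sign convention (left negative, right positive) are assumptions for illustration, not values from the patent.

```python
import math

def steering_label(turn_radius_m):
    """Encode steering as 1/r.

    Straight driving maps to 0 instead of an infinite radius, and the
    value passes smoothly through zero between left turns (negative)
    and right turns (positive).
    """
    if math.isinf(turn_radius_m):
        return 0.0
    return 1.0 / turn_radius_m

def augmented_label(center_label, lateral_shift_m, gain=0.2):
    """Hypothetical correction for an off-centre camera view: nudge the
    label so the learned recovery steers back toward the lane centre."""
    return center_label + gain * lateral_shift_m

straight = steering_label(math.inf)          # 0.0, no singularity
gentle_right = steering_label(10.0)          # tighter radius -> larger 1/r
recovery = augmented_label(0.0, 1.0)         # shifted 1 m -> corrective label
```

The key property is that the representation is continuous through straight driving, so the network never has to regress toward an infinite target.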
Additional viewpoints between the cameras are simulated by image transformation from the
nearest camera. A precise perspective shift would require knowledge of the 3D scene, which
we do not have, so we approximate the transformation by assuming that every point below the
horizon is on flat ground and every point above the horizon is infinitely far away. This works
well for flat terrain but introduces distortions for objects that rise above the ground, such as
vehicles, poles, trees and buildings. Fortunately, these distortions do not pose a big
problem for network training. The steering label for each transformed image is adjusted to
one that would steer the car back to the desired position and orientation within two seconds.
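A possible way to compute such a recovery label, assuming a small-angle circular-arc model rather than the patent's exact geometry:

```python
def recovery_curvature(offset_m, speed_mps, horizon_s=2.0):
    """Approximate the 1/r steering label that would close a lateral
    offset within the given time horizon.

    Uses the small-angle arc relation d ~= k * s**2 / 2, where s is the
    distance travelled over the horizon; solving for the curvature k
    gives k = 2 * d / s**2. (A simplified kinematic sketch only.)
    """
    s = speed_mps * horizon_s        # distance covered in the horizon
    return 2.0 * offset_m / (s * s)  # curvature = 1/r

# A car 1 m left of centre at 10 m/s gets a gentle corrective label.
label = recovery_curvature(offset_m=1.0, speed_mps=10.0)
```

Note how the correction shrinks quadratically with speed: at highway speed the same offset needs a much gentler steering input, which is exactly the behaviour the adjusted training labels should teach.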
Editorial Note 2020103161 There is 1 page of Claims only.
We can claim:
(1) The use of data to automate learning,
(2) Developing knowledge skills for current goods,
(3) The adaptation of smart learning algorithms to be programmable by data,
(4) Logical data analysis,
(5) The improvement of data accuracy.
Editorial Note 2020103161 There is 1 page of Drawings only.
Fig: Autonomous vehicle using multiple sensors
Fig: AI model for an autonomous vehicle, including data collection, planning and acting
AU2020103161A 2020-11-01 2020-11-01 An novel method for a smart car - artificial intelligence based autonomous steering control system with voice assistance Ceased AU2020103161A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2020103161A AU2020103161A4 (en) 2020-11-01 2020-11-01 An novel method for a smart car - artificial intelligence based autonomous steering control system with voice assistance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2020103161A AU2020103161A4 (en) 2020-11-01 2020-11-01 An novel method for a smart car - artificial intelligence based autonomous steering control system with voice assistance

Publications (1)

Publication Number Publication Date
AU2020103161A4 true AU2020103161A4 (en) 2021-01-07

Family

ID=74041773

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2020103161A Ceased AU2020103161A4 (en) 2020-11-01 2020-11-01 An novel method for a smart car - artificial intelligence based autonomous steering control system with voice assistance

Country Status (1)

Country Link
AU (1) AU2020103161A4 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113805171A (en) * 2021-10-14 2021-12-17 中国第一汽车股份有限公司 Dangerous target judgment method, device, equipment and storage medium
CN113805171B (en) * 2021-10-14 2023-09-26 中国第一汽车股份有限公司 Dangerous target judging method, dangerous target judging device, dangerous target judging equipment and storage medium

Similar Documents

Publication Publication Date Title
Li et al. Survey on artificial intelligence for vehicles
CN113165652B (en) Verifying predicted trajectories using a mesh-based approach
Van Brummelen et al. Autonomous vehicle perception: The technology of today and tomorrow
Kuru et al. A framework for the synergistic integration of fully autonomous ground vehicles with smart city
CN111252061B (en) Real-time decision-making for autonomous vehicles
Chen et al. Milestones in autonomous driving and intelligent vehicles—Part I: Control, computing system design, communication, HD map, testing, and human behaviors
US20230124864A1 (en) Graph Representation Querying of Machine Learning Models for Traffic or Safety Rules
CN110239562A (en) The real-time perception adjustment based on surrounding vehicles behavior of automatic driving vehicle is adjusted with driving
CN111380534A (en) ST-map-learning-based decision making for autonomous vehicles
Hind Digital navigation and the driving-machine: supervision, calculation, optimization, and recognition
JP2022041923A (en) Vehicle path designation using connected data analysis platform
CN111259712B (en) Representation of compression environment characteristics for vehicle behavior prediction
Chai et al. Autonomous driving changes the future
Cai et al. DiGNet: Learning scalable self-driving policies for generic traffic scenarios with graph neural networks
Hacohen et al. Autonomous driving: A survey of technological gaps using google scholar and web of science trend analysis
Siboo et al. An empirical study of ddpg and ppo-based reinforcement learning algorithms for autonomous driving
Baliyan et al. Role of AI and IoT techniques in autonomous transport vehicles
AU2020103161A4 (en) An novel method for a smart car - artificial intelligence based autonomous steering control system with voice assistance
Ilievski Wisebench: A motion planning benchmarking framework for autonomous vehicles
US20240075950A1 (en) Alternative Driving Models for Autonomous Vehicles
Broggi et al. Moving from analog to digital driving
Ganesan et al. A Comprehensive Review on Deep Learning-Based Motion Planning and End-To-End Learning for Self-Driving Vehicle
Tyagi et al. Autonomous vehicles and intelligent transportation systems—a framework of intelligent vehicles
Sun Cooperative adaptive cruise control performance analysis
Chamola et al. Overtaking mechanisms based on augmented intelligence for autonomous driving: Datasets, methods, and challenges

Legal Events

Date Code Title Description
HB Alteration of name in register

Owner name: V.C., U.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: K, H.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: S, K.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: J, J.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: RAJKAMAL, M.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: E, S.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: K., A.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: S, B.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: V.G, G.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: T, J.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: A, N.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

Owner name: V, B.

Free format text: FORMER NAME(S): K, HARIBABU; E, SHEEBA; A, NALINI; T, JENISH; J, JAYARAJAN; S, BHUVANESWARI; V.G, GANESAN; V.C., UVARAJA; K., ARUMUGAM; V, BALAJI; S, .KALIAPPAN; S, KALIAPPAN

FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry