GB2569774A - Method for virtual testing of real environments with pedestrian interaction and drones - Google Patents

Method for virtual testing of real environments with pedestrian interaction and drones Download PDF

Info

Publication number
GB2569774A
GB2569774A GB1717221.4A GB201717221A GB2569774A GB 2569774 A GB2569774 A GB 2569774A GB 201717221 A GB201717221 A GB 201717221A GB 2569774 A GB2569774 A GB 2569774A
Authority
GB
United Kingdom
Prior art keywords
drone
test person
comprising
search
person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1717221.4A
Other versions
GB201717221D0 (en)
Inventor
Hartmann Michael
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kompetenzzentrum das Virtuelle Fahrzeug Forschungs GmbH
Original Assignee
Kompetenzzentrum das Virtuelle Fahrzeug Forschungs GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kompetenzzentrum das Virtuelle Fahrzeug Forschungs GmbH filed Critical Kompetenzzentrum das Virtuelle Fahrzeug Forschungs GmbH
Priority to GB1717221.4A priority Critical patent/GB2569774A/en
Publication of GB201717221D0 publication Critical patent/GB201717221D0/en
Publication of GB2569774A publication Critical patent/GB2569774A/en
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/0016Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement characterised by the operator's input device
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09Taking automatic action to avoid collision, e.g. braking and steering
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0088Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0004Transmission of traffic-related information to or from an aircraft
    • G08G5/0013Transmission of traffic-related information to or from an aircraft with a ground station
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047Navigation or guidance aids for a single aircraft
    • G08G5/0069Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • G09B9/02Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/04Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
    • G09B9/048Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles a model being viewed and manoeuvred from a remote point
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W24/00Supervisory, monitoring or testing arrangements
    • H04W24/06Testing, supervising or monitoring using simulated traffic
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography

Abstract

A method for testing of self-driving or autonomous vehicles A and their algorithms in situations with pedestrian interactions comprises providing a drone B controlled by a test person C. The control of the drone B comprises applying motion capture techniques and providing a walking platform to the test person C. The drone B comprises a camera, the images being wirelessly transmitted to the test person C, and the movement of the test person in response to the images tracked and used to control the drone B. There may be traffic between the drone and the vehicle. The method may be carried out in real time.

Description

Fig. 3
At least one drawing originally filed was informal and the print reproduced here is taken from a later filed formal copy.
Figures
Fig. 1 (flow chart of the sequential procedure):
1. Wireless data transfer between the stimuli unit of the test person and the drone camera
2. Wireless data transfer between the motion capture system and the walking platform
3. Sending process of the information about the camera perspective of the drone to the stimuli unit
4. Measurement of the reaction of the test person
5. Measurement of the joint states of the person
6. Sending process of the joint states to the drone
7. Conversion of the gesture into flying control
Fig. 2.
Fig. 3
Method for virtual testing of real environments with pedestrian interaction and drones
Background of the invention
Testing of autonomous vehicles for complex and uncertain environments has become one of the biggest challenges in the automotive industry. Automation and computational intelligence will increase the abilities of the vehicle (Okumura et al., 2016). Environment perception and situation understanding will be covered by computer algorithms.
In addition to vehicle dynamics, the environmental states have to be incorporated into the test (Eskandarian, 2012). In order to ensure safety, it is required to test the intelligent vehicle in a reasonable way. It is also necessary to have prediction mechanisms to infer the consequences of decisions correctly. Conventional testing procedures are insufficient to ensure the safety of increasingly complex future assistance functions involving machine perception and cognition (Bengler et al., 2014).
The complexity of tests for autonomous vehicles is much higher compared with conventional test procedures. In addition to vehicle states, information about the environment is incorporated into the decision making process of an autonomous vehicle. This leads to an increase in complexity, also because predictions are required. General requirements of test procedures for autonomous vehicles include:
• Clear and reproducible statements.
• As easy as possible, as complex as necessary.
• Possible and adequate for all environments and situations (Okumura et al., 2016).
• Meaningful metrics (e.g. measures for the safety-risk ratio) and suitable description forms.
• Measures for robustness and redundancy for safety reasons.
• Adequate for testing realistic driving scenarios (Ziegler et al., 2014).
• Comparison to human performance (Aeberhard et al., 2015).
State of the Art
Many different aspects must be taken into account when a person moves. A human is an open, complex, biological system that interacts with its environment. Environmental influences are perceived by the organs of perception, and movement is constrained by the biomechanical prerequisites of the human body. In Chatziastros (2003) the visual perception of drivers is examined for dynamic environments, especially concerning psychological aspects. The Max Planck Institute for Biological Cybernetics (2017) conducts basic research on signal and information processing in the human brain, specifically on the perception of the environment and the resulting actions. Many research projects in the field of cybernetics use virtual reality technologies (Chatziastros, 2003).
Another example is Jovancevic et al. (2006), which focuses on attention and gaze analysis. Mathematical models for a theory of cognitive communication are described in Rupprecht (2014). In Sprague et al. (2007), models for vision and scene understanding are analysed with virtual reality environments.
Testing the safety of dynamic systems can be divided into different strategies (Althoff, 2010). To classify a system as safe, it is necessary to make sure that its trajectories never reach unsafe states.
The validation of technical systems is often done by simulation and experiments. If a trajectory hits an unsafe state during a simulation, the system can be declared unsafe (falsification). As long as a counterexample has not been found, there is no direct way to declare the system safe. There are techniques for exploring the state space to find such a counterexample systematically (Althoff, 2010).
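The falsification strategy described above can be sketched as a random search over simulated trajectories: as soon as one trajectory enters the unsafe set, the system is declared unsafe, while an exhausted search remains inconclusive. The one-dimensional dynamics, the sampled ranges and the unsafe interval below are illustrative assumptions, not taken from Althoff (2010) or from the patent.

```python
import random

def simulate(x0, v, steps=50, dt=0.1):
    """Simulate a trivial 1-D vehicle model: position advances at constant speed v."""
    x, trajectory = x0, [x0]
    for _ in range(steps):
        x += v * dt
        trajectory.append(x)
    return trajectory

def falsify(unsafe, trials=1000, seed=0):
    """Search for a counterexample: a simulated trajectory that reaches an
    unsafe state. Returns the offending (x0, v), or None if no counterexample
    was found -- which does NOT prove the system safe."""
    rng = random.Random(seed)
    for _ in range(trials):
        x0 = rng.uniform(-10.0, 0.0)   # sampled initial position (m)
        v = rng.uniform(0.0, 5.0)      # sampled constant speed (m/s)
        if any(unsafe(x) for x in simulate(x0, v)):
            return (x0, v)             # falsified: unsafe state reached
    return None                        # inconclusive

# Unsafe set: an occupied interval of the road, e.g. where a pedestrian stands.
counterexample = falsify(lambda x: 3.0 <= x <= 4.0)
```

With a far-away unsafe set that no sampled trajectory can reach, the same search returns None, illustrating why falsification alone cannot certify safety.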
In conventional driving tests (e.g. testing vehicle dynamics), internal vehicle states have to be examined at specified manoeuvres. For autonomous driving functions there are no standardized tests, because states of the environment are essential. It is not trivial to determine the external states and conditions that have to be used for tests in order to ensure a clear statement about the safety of the vehicle. Also, due to the diversity of situations, the number of tests needed to demonstrate safety is tremendous. For the reproducibility of real-world tests, some strategies are known. Steering robots are already used in experimental settings. Another strategy is to collect a large amount of data during long-term studies to ensure that the system is tested for all possible situations (Winner et al., 2015). Here, the problem of missing trajectories plays an essential role.
Soft crash targets and passable target robots can be used to model accident scenarios. These crash target robots are already in use because they can be precisely coordinated (Winner et al., 2015).
The decision making process is influenced by the interaction with other road users. Estimating the intentions and predicting the future movement of road users is vital for the motion planning of the ego-vehicle (Eskandarian, 2012).
Description of the Invention
The invention presents a method to virtually test traffic situations with real pedestrian interaction in real environments and dangerous driving maneuvers. The problem with testing safety-critical driving scenarios is that real persons cannot be incorporated. The idea of this invention is to use drones as pedestrians. The drones can be controlled by a test person. The test person sees what the drone sees: the drone has a camera which shows the oncoming vehicle. The flying route can be controlled by the test person, who walks on a non-movable walking platform.
In a first step, there is a wireless data transfer between the stimuli unit of the test person and the drone. For example, a simple Skype connection under good internet conditions is adequate for this purpose. In a second step, there is also a wireless data transfer between the flying drone, the motion capture system and the walking platform. In a third step, the camera perspective of the drone is sent to the stimuli unit of the test person in real time. In a fourth step, the reaction of the test person is measured in real time. In a fifth step, the joint states of the test person are measured. In a sixth step, the joint states are sent to the drone. Experiments have shown that it is sufficient to put a cap with visual markers on the test person and to measure only the head position instead of the whole posture. In a seventh step, the gesture of the test person is converted into flying control actions.
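The seventh step, converting the measured gesture into a flying control, can be sketched as follows: a finite difference of the tracked marker position gives the walking velocity, a gain maps it to a drone velocity command, and the result is clipped to a safe maximum. The `MarkerSample` type, the gain and the speed limit are illustrative assumptions standing in for the output of a real motion capture system and a real flight control interface.

```python
from dataclasses import dataclass

@dataclass
class MarkerSample:
    """One motion-capture sample of the visual-marker cap on the test person."""
    t: float  # timestamp (s)
    x: float  # marker position on the walking platform (m)
    y: float

def gesture_to_flight_command(prev, curr, gain=1.0, v_max=2.0):
    """Convert the displacement of the test person between two marker samples
    into a horizontal velocity command for the drone."""
    dt = curr.t - prev.t
    if dt <= 0:
        return (0.0, 0.0)  # no valid time base: command hover
    vx = gain * (curr.x - prev.x) / dt  # finite-difference walking velocity
    vy = gain * (curr.y - prev.y) / dt
    # Clip to the drone's allowed speed envelope.
    vx = max(-v_max, min(v_max, vx))
    vy = max(-v_max, min(v_max, vy))
    return (vx, vy)

# The person stepped 0.3 m forward in 0.2 s; the drone is commanded to match.
cmd = gesture_to_flight_command(MarkerSample(0.0, 0.0, 0.0),
                                MarkerSample(0.2, 0.3, 0.0))
```

The clipping step reflects the safety constraint that the drone, unlike the person, must never exceed a preset speed in the test area.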
The experiment offers several advantages. Real persons are incorporated in safety-critical situations with cars to increase the safety of autonomous vehicles. The interaction, the real environment and the personality of the test person can be integrated into a safe experiment for testing real-life dangerous situations.
In conventional state-of-the-art testing procedures, crash target robots are used; they follow predefined trajectories and are not suited to react like real persons.
Figure Description
Fig. 1 shows an example flow chart for the sequential procedure.
Fig. 2 shows an example of the testing environment.
Fig. 3 shows the concept of the invention (see the following legend).
Legend of Figure 3:
• Block A: Vehicle driving in the test area with a collision avoidance maneuver.
• Block B: Flying drone acting as a pedestrian; it follows the movements of the test person (Block C).
• Block C: Test person controls the drone and sees what the camera of the drone shows.
• Arrow A: Block B and Block C are connected by visual information and information provided by physical displacements.
• Arrow B: Block A and Block B are visually connected.
• Test area: Area where vehicle and drone make safety critical driving maneuvers.
• Control area: Test person walks on a non-movable walking platform.
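One pass through the test area and control area of Fig. 3 can be sketched as a single real-time loop iteration tying the seven steps of Fig. 1 together. Every interface below (camera link, display, marker tracker, flight command link) is a hypothetical stub standing in for real hardware; only the loop structure follows the description above.

```python
class LoopStubs:
    """Hypothetical stand-ins for the hardware interfaces of Fig. 3."""

    def receive_drone_frame(self):            # steps 1 and 3: drone camera -> stimuli unit
        return "frame"

    def display_to_test_person(self, frame):  # stimulate the test person
        pass

    def read_head_marker(self):               # steps 4 and 5: motion capture of the cap markers
        return (0.1, 0.0)                     # displacement on the walking platform (m)

    def send_flight_command(self, cmd):       # step 6: send the command to the drone
        self.last_cmd = cmd

def run_iteration(io, gain=1.0):
    """One iteration of the closed test loop."""
    frame = io.receive_drone_frame()
    io.display_to_test_person(frame)
    dx, dy = io.read_head_marker()
    cmd = (gain * dx, gain * dy)              # step 7: gesture -> flying control
    io.send_flight_command(cmd)
    return cmd
```

In a real deployment each stub would block on its own wireless link, so the loop rate would be limited by the slowest of the video, tracking and command channels.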
References
• Bengler, K., Dietmayer, K., Färber, B., Maurer, M., Stiller, C., and Winner, H. (2014). Three decades of driver assistance systems: Review and future perspectives. IEEE Intelligent Transportation Systems Magazine, 6(4), 6-22.
• Chatziastros, A. (2003). Visuelle Kontrolle der Lokomotion. Ph.D. thesis, Universitätsbibliothek Gießen.
• Eskandarian, A. (2012). Handbook of intelligent vehicles. Springer London.
• Hartmann, M., Viehweger, M., Stolz, M., and Watzenig, D. (2017). Pedestrian in the Loop: An approach using virtual reality. ICAT 2017 -XXVI International Conference on Information, Communication and Automation Technologies.
• Jovancevic, J., Sullivan, B., and Hayhoe, M. (2006). Control of attention and gaze in complex environments. Journal of Vision, 6(12), 9-9.
• Okumura, B., James, M.R., Kanzawa, Y., Derry, M., Sakai, K., Nishi, T., and Prokhorov, D. (2016). Challenges in perception and decision making for intelligent automotive vehicles: A case study. IEEE Transactions on Intelligent Vehicles, 1(1), 20-32.
• Pearl, J., Glymour, M., and Jewell, N.P. (2016). Causal inference in statistics: a primer. John Wiley & Sons.
• Rupprecht, W. (2014). Einführung in die Theorie der kognitiven Kommunikation: wie Sprache, Information, Energie, Internet, Gehirn und Geist zusammenhängen. Springer-Verlag.
• Sprague, N., Ballard, D., and Robinson, A. (2007). Modeling embodied visual behaviors. ACM Transactions on Applied Perception (TAP), 4(2), 11.
• Winner, H., Hakuli, S., Lotz, F., and Singer, C. (eds.) (2015). Handbuch Fahrerassistenzsysteme. ATZ/MTZ Fachbuch. Springer Vieweg, Wiesbaden, 3 edition, doi: 10.1007/978-3-658-05734-3.
• Ziegler, J., Bender, P., Schreiber, M., Lategahn, H., Strauss, T., Stiller, C., Dang, T., Franke, U., Appenrodt, N., Keller, C.G., et al. (2014). Making bertha drive - An autonomous journey on a historic route. IEEE Intelligent Transportation Systems Magazine, 6(2), 8-20.
Claims

Claims (6)

1. Method for virtual testing of vehicles and their algorithms in situations with pedestrian interaction, comprising: in a first step, the wireless data transfer between the drone camera and the stimuli unit of the test person; in a second step, the wireless data transfer between the motion capture system and walking platform of the test person and the flight control unit of the drone; in a third step, the sending process of information about the camera perspective of the drone to the stimuli unit of the test person; in a fourth step, the measurement of the actions of the test person as a reaction to the real perspective of the drone; in a fifth step, the measurement of the joint states of the person via a motion capture system and walking platform; in a sixth step, the sending process of the information on gestures and actions to the drone; in a seventh step, the conversion of the gesture of the person into a flying control.
2. Method according to claim 1, comprising a traffic situation between the drone and the vehicle.
3. Method according to claims 1 and 2, comprising measuring the gesture of the test person and incorporating the behavior for real-life dangerous situations.
4. Method according to claims 1 to 3, comprising real-time perception of the perspective of the flying drone and stimulation of the perception of the test person.
5. Method according to claims 1 to 4, comprising controlling the flying drone by a test person and controlling the drone to fly as the pedestrian.
6. Method according to claims 1 to 5, comprising increasing the safety of collision avoidance algorithms by incorporating engineers for technical aspects and psychologists for human-related aspects.
Intellectual Property Office
Application No: GB1717221.4 Examiner: Dr Michael Collett
Claims searched: 1-6 Date of search: 5 February 2018
Patents Act 1977: Search Report under Section 17
Documents considered to be relevant:
Category   Relevant to claims   Identity of document and passage or figure of particular relevance
X          1-6                  US2012/0283896 A1 (PERSAUD)
X          1-6                  US2013/0253733 A1 (LEE)
X          1-6                  WO2016/168117 A2 (DANIELS)
X          1-6                  CN106094861 A (ZERO UAV) & US2017/0351253 A1 (YANG)
X          1-6                  US2013/0107027 A1 (MUELLHAEUSER)
Categories:
X: Document indicating lack of novelty or inventive step.
A: Document indicating technological background and/or state of the art.
Y: Document indicating lack of inventive step if combined with one or more other documents of the same category.
P: Document published on or after the declared priority date but before the filing date of this invention.
&: Member of the same patent family.
E: Patent document published on or after, but with priority date earlier than, the filing date of this application.
Field of Search:
Search of GB, EP, WO & US patent documents classified in the following areas of the UKC:
Worldwide search of patent documents classified in the following areas of the IPC:
B60W; G05D; G08G
The following online and other databases have been used in the preparation of this search report:
WPI, Epodoc, Patent Fulltext, INSPEC, NPL, TDB, XPAIP, XPI3E, XPIEE, XPIOP, XPIPCOM, XPMISC
GB1717221.4A 2017-10-20 2017-10-20 Method for virtual testing of real environments with pedestrian interaction and drones Withdrawn GB2569774A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1717221.4A GB2569774A (en) 2017-10-20 2017-10-20 Method for virtual testing of real environments with pedestrian interaction and drones

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1717221.4A GB2569774A (en) 2017-10-20 2017-10-20 Method for virtual testing of real environments with pedestrian interaction and drones

Publications (2)

Publication Number Publication Date
GB201717221D0 GB201717221D0 (en) 2017-12-06
GB2569774A true GB2569774A (en) 2019-07-03

Family

ID=60481719

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1717221.4A Withdrawn GB2569774A (en) 2017-10-20 2017-10-20 Method for virtual testing of real environments with pedestrian interaction and drones

Country Status (1)

Country Link
GB (1) GB2569774A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120283896A1 (en) * 2011-05-04 2012-11-08 Qualcomm Incorporated Gesture recognition via an ad-hoc proximity sensor mesh for remotely controlling objects
US20130107027A1 (en) * 2011-10-27 2013-05-02 Deutsches Zentrum Fuer Luft- Und Raumfahrt E.V. Control and monitoring device for vehicle
US20130253733A1 (en) * 2012-03-26 2013-09-26 Hon Hai Precision Industry Co., Ltd. Computing device and method for controlling unmanned aerial vehicle in flight space
WO2016168117A2 (en) * 2015-04-14 2016-10-20 John James Daniels Wearable electric, multi-sensory, human/machine, human/human interfaces
CN106094861A (en) * 2016-06-02 2016-11-09 零度智控(北京)智能科技有限公司 Unmanned plane, unmanned aerial vehicle (UAV) control method and device
US20170351253A1 (en) * 2016-06-02 2017-12-07 Zerotech (Shenzhen) Intelligence Robot Co., Ltd Method for controlling an unmanned aerial vehicle

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11505134B1 (en) 2021-05-25 2022-11-22 Motional Ad Llc Automated moving platform
GB2607130A (en) * 2021-05-25 2022-11-30 Motional Ad Llc Automated moving platform
CN114296472A (en) * 2021-11-19 2022-04-08 浙江中控技术股份有限公司 Weighing process testing system and method based on unmanned aerial vehicle
CN114296472B (en) * 2021-11-19 2024-02-27 中控技术股份有限公司 Weighing flow test system and method based on unmanned aerial vehicle

Also Published As

Publication number Publication date
GB201717221D0 (en) 2017-12-06

Similar Documents

Publication Publication Date Title
Shia et al. Semiautonomous vehicular control using driver modeling
Unhelkar et al. Human-robot co-navigation using anticipatory indicators of human walking motion
Lawitzky et al. Interactive scene prediction for automotive applications
Basu et al. Trust dynamics in human autonomous vehicle interaction: a review of trust models
Sun et al. An integrated solution for lane level irregular driving detection on highways
Li et al. Multi-sensor soft-computing system for driver drowsiness detection
Lee et al. Continuous car driving intent detection using structural pattern recognition
Lemonnier et al. Discriminating cognitive processes with eye movements in a decision-making driving task.
Tran et al. Vision for driver assistance: Looking at people in a vehicle
GB2569774A (en) Method for virtual testing of real environments with pedestrian interaction and drones
Morignot et al. Arbitration for balancing control between the driver and ADAS systems in an automated vehicle: Survey and approach
Jha et al. Probabilistic estimation of the driver's gaze from head orientation and position
KR20230069940A (en) Methods and systems for testing driver assistance systems
US11580871B2 (en) Assessment system and assessment method
Pacaux-Lemoine et al. Human-Machine Cooperation principles to support driving automation systems design
Amadori et al. HammerDrive: A task-aware driving visual attention model
CN113557524A (en) Method for representing a mobile platform environment
Hartmann et al. “Pedestrian in the Loop”: An approach using augmented reality
Amadori et al. Decision anticipation for driving assistance systems
Graf et al. The Predictive Corridor: A Virtual Augmented Driving Assistance System for Teleoperated Autonomous Vehicles.
Kaur et al. Scenario-based simulation of intelligent driving functions using neural networks
Rock et al. Quantifying realistic behaviour of traffic agents in urban driving simulation based on questionnaires
Pulgarin et al. Drivers' Manoeuvre Prediction for Safe HRI
Gillmeier et al. Combined driver distraction and intention algorithm for maneuver prediction and collision avoidance
Hartmann et al. Pedestrian in the loop: An approach using flying drones

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)