CN111756990A - Image sensor control method, device and system - Google Patents


Info

Publication number
CN111756990A
CN111756990A
Authority
CN
China
Prior art keywords
real
time
image
environment
image sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910253101.2A
Other languages
Chinese (zh)
Other versions
CN111756990B (en)
Inventor
刘挺
王兵
刘洪鑫
黄宇
谷明琴
薛晶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuzhou Online E Commerce Beijing Co ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201910253101.2A priority Critical patent/CN111756990B/en
Publication of CN111756990A publication Critical patent/CN111756990A/en
Application granted granted Critical
Publication of CN111756990B publication Critical patent/CN111756990B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/663Remote control of cameras or camera parts, e.g. by remote control devices for controlling interchangeable camera parts based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection

Abstract

The application discloses an image sensor control method, device and system. In the system, the image sensor comprises a camera component and a direction adjustment component, and the system comprises a computing unit and a control unit. The computing unit detects and determines an environmental object in the real-time environment image acquired by the image sensor, and determines real-time state information of the environmental object in that image. The control unit controls, according to this real-time state information, the generation and transmission of an adjustment instruction for adjusting the direction of the image sensor. The system reduces the equipment cost and the time cost of system operation and improves the effectiveness and efficiency of sensing and identifying environmental objects.

Description

Image sensor control method, device and system
Technical Field
The invention relates to the technical field of environmental object perception, in particular to a method, a device and a system for controlling an image sensor.
Background
In vehicle automatic driving technology, environment perception during automatic driving is one of the key fields of autonomous-driving research. The quality of environmental object perception in a vehicle's automatic driving system determines many aspects of how the technology can be implemented, including the usability of the system and, in particular, driving safety, which is generally considered of great importance. In urban roads and other traffic environments, the system must handle increasingly complex road-condition perception while driving; this complexity is embodied in environments containing more traffic control signs and signals, such as traffic sign lines and traffic lights. Effective identification of these traffic control signs and signals by the automatic driving system is a precondition for ensuring the safety of the vehicle's drivers and passengers and of other traffic participants.
Since an autonomous vehicle is in motion most of the time while driving, the relative position between the vehicle and a target environmental object changes constantly, which requires the environment perception system to be able to sense target objects whose positions keep changing. The prior art senses such continuously moving environmental objects using image recognition technology and related methods: an environment image is captured by an image capturing device, and the environmental object is then recognized in the captured image.
However, limited by hardware design and identification methods, existing environmental object perception technology suffers from complex hardware implementations and correspondingly complex control and calculation processes; because substantial software and hardware resources are occupied during operation, the perception process is inefficient. The perception range is also limited, and the effectiveness of perceiving moving environmental objects needs improvement. A technical problem to be solved by those skilled in the art is therefore to provide a sensor control system that senses moving environmental objects more efficiently with simpler equipment, reduces the equipment cost and time cost of system operation, and improves the effectiveness of sensing and identifying environmental objects.
Disclosure of Invention
The embodiments of the invention provide an image sensor control method, device and system, which can reduce the equipment cost and time cost of system operation and improve the effectiveness and efficiency of sensing and identifying environmental objects.
The invention provides the following scheme:
an image sensor control system is provided for controlling an image sensor,
the image sensor comprises a camera component and a direction adjusting component;
the control system includes:
a computing unit and a control unit;
the computing unit is used for detecting and determining an environmental object in the real-time environmental image according to the real-time environmental image acquired by the image sensor, and determining real-time state information of the environmental object in the real-time environmental image;
and the control unit is used for controlling the generation and the transmission of an adjusting instruction for adjusting the direction of the image sensor according to the real-time state information of the environment object in the real-time environment image.
An image sensor control system in an autonomous vehicle,
the image sensor comprises a camera component and a direction adjusting component;
the control system includes:
the information processing unit is used for detecting and determining a traffic environment object in the real-time environment image according to the real-time environment image acquired by the image sensor and determining real-time state information of the traffic environment object in the real-time environment image;
and the sensor control unit is used for controlling the generation and transmission of an adjustment instruction for adjusting the direction of the image sensor according to the real-time state information of the traffic environment object in the real-time environment image.
A traffic environment object sensing system for autonomous driving, comprising:
a communication unit, a computing unit, a control unit, and an image sensor; wherein the image sensor comprises a camera component and a direction adjustment component;
the communication unit is used for receiving sensing task information from roadside equipment;
the computing unit is used for detecting and determining a traffic environment object in the real-time environment image according to the real-time environment image acquired by the image sensor after the sensing task is started, and determining real-time state information of the traffic environment object in the real-time environment image;
and the control unit is used for controlling the generation and the transmission of an adjusting instruction for adjusting the direction of the image sensor according to the real-time state information of the traffic environment object in the real-time environment image.
An image sensor control method for obtaining real-time environment images by a direction-adjustable image sensor, the method comprising:
detecting and determining environmental object information in the real-time environment image according to the real-time environment image acquired by the image sensor;
tracking a target environment object in the real-time environment image, and determining real-time state information of the target environment object in the real-time environment image;
determining whether to perform a directional adjustment on the image sensor based on the real-time status information of the target environmental object.
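The three steps of the method can be sketched as a minimal detect-track-decide loop. The Python sketch below is illustrative only: the names (`BBox`, `needs_adjustment`, `control_step`), the normalized image coordinates, and the 0.2 dead-zone margin are assumptions for the example, not part of the claimed method.

```python
from dataclasses import dataclass

@dataclass
class BBox:
    x: float  # box center x, normalized to [0, 1] in image coordinates
    y: float  # box center y, normalized to [0, 1]
    w: float  # box width,  normalized
    h: float  # box height, normalized

CENTER_MARGIN = 0.2  # hypothetical dead zone around the image center

def needs_adjustment(box: BBox) -> bool:
    """Decide whether the tracked object has drifted far enough from the
    image center that the sensor direction should be adjusted."""
    return abs(box.x - 0.5) > CENTER_MARGIN or abs(box.y - 0.5) > CENTER_MARGIN

def control_step(frame, detector, tracker) -> bool:
    """One iteration of the claimed loop: detect environmental objects,
    track the target's real-time state, then decide on an adjustment."""
    objects = detector(frame)      # step 1: detect environmental objects
    target = tracker(frame, objects)  # step 2: real-time state of the target
    return needs_adjustment(target)   # step 3: adjust direction or not
```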
An image sensor control apparatus, the image sensor including a camera section, and a direction adjustment section, the apparatus comprising:
an environmental object recognition unit for detecting and determining environmental object information in the real-time environmental image from the real-time environmental image acquired by the image sensor;
the target object tracking unit is used for tracking a target environment object in the real-time environment image and determining real-time state information of the target environment object in the real-time environment image;
and the adjusting control unit is used for determining whether to adjust the direction of the image sensor according to the real-time state information of the target environment object.
A computer system, comprising:
one or more processors; an image sensor;
the image sensor comprises a camera component and a direction adjusting component; and the number of the first and second groups,
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform operations comprising:
detecting and determining environmental object information in the real-time environment image according to the real-time environment image acquired by the image sensor;
tracking a target environment object in the real-time environment image, and determining real-time state information of the target environment object in the real-time environment image;
determining whether to perform a directional adjustment on the image sensor based on the real-time status information of the target environmental object.
According to the specific embodiments provided herein, the present application discloses the following technical effects:
the application provides an image sensor control system, in the system, an image sensor comprises a camera component and a direction adjusting component to replace the existing image sensor which is composed of a plurality of cameras with different FOVs, the image sensor control system comprises a calculating unit and a control unit; the computing unit is used for detecting and determining an environmental object in the real-time environmental image according to the real-time environmental image acquired by the image sensor, and determining real-time state information of the environmental object in the real-time environmental image; and the control unit can control the generation and the transmission of an adjusting instruction for adjusting the direction of the image sensor according to the real-time state information of the environment object. The image sensor with the single camera and the adjustable direction is adopted to track the environmental object in real time, the defect that blind areas are easily generated in the prior art can be overcome while the visual range of the sensor is considered, the adaptability of the system to the environment object under the condition of complex distribution is improved, and the reliability of tracking and identifying the environmental object is improved. Can cover the environmental object of different distances based on single sensor, for the sensor machine control scheme of many cameras, the effectual calculated amount that reduces the response data is favorable to simplifying the system in order to improve the operating efficiency. 
Because the camera is adjusted in a dynamic tracking mode to cover the target object, when the method is applied to automatic driving, the occurrence of blind areas can be effectively sensed in the driving process, and the image sensor can also effectively cover traffic environment objects such as traffic lights and the like when the vehicle changes lanes or turns or meets the situations such as intersections with special traffic lights; in addition, the control system can track and identify the environmental object in real time, so that the map navigation information is not needed to assist, and the control system can also effectively track and identify temporary traffic control changes, such as temporary traffic lights, and provide an effective decision basis for a driving system. The efficiency and the reliability of the system for identifying and tracking the environmental object are improved, and the safety of the automatic driving system is improved. The control system may be applied in the field of autonomous driving, and the environmental objects may include objects in other autonomous driving environments, such as traffic lights. The method can also be applied to other fields needing environment object identification, such as the technical field of robots and the like, so that the identification efficiency and reliability are improved when the environment object related to the related field is tracked and identified.
Furthermore, in the system, the environment object in the real-time environment image is detected and determined, the real-time state information of the environment object in the real-time environment image is determined, and the two tasks are respectively realized by different units, wherein the real-time state information of the environment object in the real-time environment image is determined, so that the task of providing the adjusting basis for tracking the environment object is realized by a lightweight search tracking method, the calling of relevant units for identification and detection is reduced, the multi-task decoupling is realized, the calculation amount of the system is reduced on the whole, the complexity of system design is reduced, and the operating efficiency of the system is improved.
Of course, a product practicing the present application need not achieve all of the above advantages at the same time.
Drawings
To illustrate the embodiments of the present invention and the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. The drawings in the following description are only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic application diagram of an existing identification device provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of an application of a sensor control system provided in an embodiment of the present application;
FIG. 3-1 is a schematic diagram of a first system provided by an embodiment of the present application;
FIG. 3-2 is a schematic diagram of a second system provided by an embodiment of the present application;
FIG. 3-3 is a schematic diagram of a third system provided by an embodiment of the present application;
FIG. 4-1 is a schematic diagram illustrating a location of a target environment object in an environment image according to an embodiment of the present application;
FIG. 4-2 is another schematic diagram of the position of a target environment object in an environment image provided by an embodiment of the present application;
FIG. 5 is a flow chart of a method provided by an embodiment of the present application;
FIG. 6 is a schematic view of an apparatus provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of an image sensor control system in an autonomous vehicle provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a traffic environment object sensing system for autonomous driving provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a computer system provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the invention; all other embodiments derived from them by those of ordinary skill in the art without creative effort fall within the scope of the invention.
In intelligent automatic control systems such as autonomous vehicles and robots, one very important task is sensing and recognizing environmental objects. An automatic driving vehicle must effectively sense and recognize environmental objects in the traffic environment, such as roads, traffic signs, people participating in traffic, and other vehicles, to ensure the basic realizability of automatic driving and the safety of the vehicle, other people, and other equipment. In the prior art, one of the main means by which an automatic driving vehicle senses environmental objects is image recognition: based on environment images captured in real time by an image capturing device, the characteristics of traffic environment objects in the image are analyzed and recognized to determine the related data of those objects, and automatic driving and action decisions are made based on that data. However, due to the limitations of hardware and recognition method design, the effectiveness and efficiency of environmental object recognition in the prior art still need improvement.
Take an identification device for traffic environment objects as an example; fig. 1 is an application schematic diagram of such a device. The device is equipped with a camera group 110 comprising three cameras with different FOVs (Fields of View); in practice, two cameras with different FOVs could be used instead, with a similar implementation principle. In an optical system, the field angle is an angular parameter that represents the visible range of an optical instrument. A camera's field angle is positively correlated with the angular extent of its visible range and negatively correlated with its focal length: as the field angle increases, the camera's viewing angle widens while its focal length and viewing distance decrease. One example of distinguishing camera types by field angle: cameras with an FOV below 40 degrees, called telephoto lenses, are characterized by a small FOV but a long viewing distance; cameras with an FOV of 40-50 degrees include what is considered the standard field angle; and cameras with an FOV above 50 degrees are wide-angle. Of course, in practical applications this division is not absolute.
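The negative correlation between field angle and focal length follows from the pinhole relation FOV = 2·arctan(w / 2f), where w is the sensor width. A small illustrative helper (the function name and the example 36 mm full-frame sensor width are assumptions for the example, not values from the text):

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view from sensor width and focal length:
    FOV = 2 * arctan(w / (2 * f)); a longer focal length gives a smaller FOV."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))
```

On a 36 mm-wide sensor, a 50 mm lens gives roughly a 39.6-degree horizontal FOV, near the "standard" boundary described above, while doubling the focal length roughly halves the angle.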
The camera group 110 in this example comprises three cameras with different field angles whose visible areas cover three regions in front of the road. For convenience of description, fig. 1 represents the visible area of each camera as a different triangular region, with overlaps between them, although in practice the covered areas are not necessarily triangular. Reference 140 denotes an environmental object in the road, which may, for example, be a traffic light. The camera group 110 includes a telephoto camera with a smaller FOV to capture far targets, covering the triangular area 1201 in fig. 1; a standard-FOV camera to capture mid-range targets, covering the triangular area 1202; and a camera with a larger FOV to capture near targets, covering the triangular area 1203. With cameras of different FOVs, the camera group fixed in the vehicle can capture environment images at different distances, so that environmental targets at different distances can be identified from the captured images. However, several aspects of this implementation are worth improving.
First, the camera group fixed in the vehicle needs multiple cameras with different FOVs. Such a multi-sensor composition and its control are comparatively complex, consume more energy in use, and put more pressure on system control and computation, which works against simplifying the system to improve operating efficiency and against saving energy. Second, the camera group in this mode is fixed in the vehicle to shoot forward, so blind areas easily form outside each camera's shooting boundary, for example area 130 in fig. 1. When the target object crosses between the coverage ranges of different cameras, it easily falls into a blind area and is temporarily lost, making the tracking process unstable; when the vehicle changes lanes or turns, the blind areas change in more complicated ways, further increasing this instability. Likewise, at intersections with unusual distributions of traffic environment objects, target objects very easily fall into blind areas. For example, when a traffic light is positioned off to one side, in practice only the shooting range of the mid- or short-range camera may cover it. This reflects the insufficient adaptability of this implementation to target perception and recognition: it reduces the time left for system decisions and may adversely affect the decisions of the driving system.
In addition, when this device identifies and senses environmental objects, it often must be combined with high-precision map information that provides the current position of the vehicle and the positions of fixed traffic lights ahead, so that target detection can start as the vehicle approaches a target light. This implementation cannot adapt to a changing traffic environment, such as when a map is not updated in time after a traffic change or when temporary traffic control is in place, and its application in traffic environments whose road information has not been collected is likewise limited.
In view of the above aspects in which the prior art for sensing environmental objects in a traffic environment needs improvement, an embodiment of the present application provides an image sensor control system applicable to autonomous driving traffic environments that can at least partially overcome the drawbacks of prior implementations.
The image sensor control system provided by the embodiments of the present application may include a direction-adjustable image sensor. For example, when a camera component is used as the image sensor, the system can sense and identify environmental objects in the traffic environment from the captured images. A limited number of direction-adjustable cameras (for example, one) can replace the camera group of multiple cameras used in existing implementations; in practice a single direction-adjustable camera with a small field angle can satisfy the requirements of both acquisition distance and acquisition range, simplifying the sensor composition and its control. Meanwhile, because the direction-adjustable camera can be adjusted toward the target position, the blind areas produced by a camera group are reduced or avoided, which solves the instability in target identification that blind areas cause. An image sensor based on a direction-adjustable camera component can sense environmental objects in a tracking mode: it identifies and tracks a target object in real time from the environment image without depending on object position information from a map application, so it suits more traffic environments, such as temporary traffic control changes (for example, newly added temporary traffic lights), and can effectively identify and track objects such as traffic lights even in traffic environments whose road information has not been collected.
Example one
Please refer to fig. 2, an application schematic diagram of an image sensor system according to an embodiment of the present disclosure. The image sensor 210 may include a camera component and a direction adjustment component, and the direction of the camera component can be changed by the direction adjustment component. For example, as shown in the figure, the camera component may be rotated clockwise or counterclockwise to track an environmental object during movement, such as the environmental object 240 in fig. 2. Of course, in practice different direction control members may be used, for example members adjustable in both horizontal and vertical directions, such as gimbal suspensions and gimbal mounts. During movement, as the direction of the image sensor changes, the visible range covered within its field angle also changes: in fig. 2, when the image sensor 210 moves from position L0 in fig. 2(a) to position L1 in fig. 2(b), its visible range changes from range 2201 to range 2202 to keep the environmental object 240 in view. Because the direction of the camera component is controlled by an adjustable member, the camera component can be a telephoto camera with a small FOV, preserving sensing distance while eliminating blind areas. For example, a camera with an FOV of 18 degrees and a lens of suitable focal length can in practice achieve ideal results for both the adaptable distance range and the tracking and recognition of environmental objects.
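The direction adjustment that keeps the object 240 in view can be quantified with a pinhole model: the pan angle needed to re-center a target follows from its pixel offset and the camera's FOV. The sketch below is an assumption-laden illustration (the function name, the pixel convention, and the 1920-pixel example width are not from the patent; only the 18-degree FOV is mentioned in the text):

```python
import math

def pan_correction_deg(pixel_x: float, image_width: float,
                       fov_deg: float = 18.0) -> float:
    """Pan angle (degrees, positive = right) that re-centers a target
    observed at horizontal pixel coordinate pixel_x, assuming a pinhole
    camera with the given horizontal FOV."""
    # focal length in pixels implied by the horizontal FOV
    f_px = (image_width / 2) / math.tan(math.radians(fov_deg / 2))
    offset_px = pixel_x - image_width / 2
    return math.degrees(math.atan(offset_px / f_px))
```

A target already at the image center needs no correction, while a target at the right edge of an 18-degree camera needs a 9-degree pan, half the FOV, as expected.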
Referring to fig. 3-1, a schematic diagram of an image sensor control system according to an embodiment of the present invention: the controlled image sensor 330 may include a camera component 3301 and a direction adjustment component 3302, and the control system may include a computing unit 310 and a control unit 320. The computing unit detects and determines an environmental object in the real-time environment image acquired by the image sensor and determines real-time state information of that object in the image; the control unit 320 controls, according to this real-time state information, the generation and transmission of an adjustment instruction for adjusting the direction of the image sensor. When it is determined that the image sensor needs a direction adjustment, the control unit 320 generates and transmits the adjustment instruction, and the direction adjustment component 3302 adjusts the direction of the camera component 3301 accordingly. This enables the image sensor to track the target environmental object while the body carrying the system moves, for example while an autonomous vehicle is driving or a robot is moving.
The computing unit 310 may be implemented in different ways. As shown in fig. 3-2, the computing unit 310 may include a detection computing subunit 3101 configured to detect and determine an environmental object in the real-time environment image acquired by the image sensor 330 and to determine real-time state information of that object in the image. The detection computing subunit 3101 may be obtained by machine learning training on sample data and may be implemented, for example, with a convolutional neural network. Trained in this way, it can recognize a target environmental object in an input real-time environment image, for example a traffic sign such as a traffic light in an environment image obtained by the image sensor of an autonomous vehicle. The detection computing subunit can serve as the main core unit for image-based sensing and identification of target objects, and can be designed, according to the practical application, to identify classification information of an environmental object, where the classification information may describe the state or attributes of the object. That is, the detection computing subunit 3101 may detect and determine the environmental object and its classification information in the real-time environment image acquired by the image sensor.
For example, when the detection computing subunit 3101 is applied in an autonomous vehicle system, in addition to recognizing a traffic sign in the environment image obtained by the image sensor of the vehicle, it may also recognize state information such as the position, shape, and color of a traffic signal lamp, so that operations such as controlling the vehicle's traveling route can be carried out according to the traffic signal state information obtained in real time.
The detection computing subunit 3101 implemented in the above manner serves as the main core unit for image-based sensing and recognition of target objects, and generally provides data support for multiple tasks in the system, such as the route decision task. However, the task of adjusting the image sensor to track a target environmental object requires frequent invocation of detection computation in practical applications to adjust the image sensor in real time, which adds to the operating pressure of the system. Moreover, the information generated by the detection computing subunit is generally the information required by the decision system; for the image sensor adjustment task it contains much redundant information. Therefore, in the system provided by the embodiment of the present application, the computing unit 310 may also be implemented as shown in fig. 3-3, where the computing unit 310 includes a detection computing subunit 3101 and a tracking computing subunit 3102, and the task of the computing unit 310 is divided into two parts performed respectively by the two subunits: the detection computing subunit 3101 is configured to detect and determine an environmental object in the real-time environment image acquired by the image sensor, and the tracking computing subunit 3102 is configured to determine the real-time status information of the environmental object in the real-time environment image.
In the implementation shown in fig. 3-3, the frequency at which the detection computing subunit 3101 is called for adjusting the image sensor direction can be reduced: the detection computing subunit 3101 only needs to assist the tracking computing subunit 3102 in a few situations, for example by providing the tracking target when a tracking task is initialized, or by helping to relocate the target object when the tracking target is lost; in most cases the tracking computing subunit 3102 alone determines the real-time status information of the environmental object in the real-time environment image. In a specific implementation, the detection computing subunit 3101 may determine a tracking target for the tracking computing subunit 3102 in one frame of the real-time environment image at a certain time, for example at task initialization or when the target object is relocated. That is, the detection computing subunit 3101 determines, in a target frame of the real-time environment image, the environmental object and the tracking target data corresponding to it; the tracking computing subunit 3102 then searches subsequent frames of the real-time environment image according to the tracking target data to determine the real-time status information of the environmental object. When the tracking computing subunit 3102 loses the tracked environmental object, the detection computing subunit 3101 may re-detect the environmental object in an image frame of the real-time environment image and re-determine its tracking target data, and the tracking computing subunit 3102 searches subsequent frames based on the re-determined tracking target data.
The tracking computing subunit 3102 may be implemented based on a lightweight search tracking method. Unlike the detection computing subunit 3101, which is obtained by machine learning training on sample data, a tracking computing subunit 3102 implemented this way can track a specific target in the image with much less computation. Specifically, the tracking target data corresponds to the environmental object determined by the detection computing subunit 3101 in a target frame of the real-time environment image; for example, the detection computing subunit 3101 may use a screenshot of the environmental object in the target frame as the tracking target data, and the tracking computing subunit 3102 then searches subsequent frames for an image region similar to that screenshot and takes its position information and the like as the real-time status information of the environmental object in the real-time environment image. In practical applications there are other ways to search for a specific target in an image, and these may also be applied to the tracking computing subunit 3102; they are not enumerated here.
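As a concrete illustration of such lightweight search tracking, the sketch below scans a small window of the next frame for the patch most similar to the screenshot of the environmental object, scored by the sum of squared differences. The window radius, the similarity score, and all names are illustrative assumptions, not details fixed by the embodiment.

```python
import numpy as np

def track_template(frame, template, prev_xy, radius=20):
    """Lightweight search tracking sketch: scan a window around the
    previous position for the patch most similar to the template
    (the screenshot of the environmental object), scored by the sum
    of squared differences (lower is more similar)."""
    h, w = template.shape[:2]
    H, W = frame.shape[:2]
    px, py = prev_xy
    tmpl = template.astype(np.float32)
    best_score, best_xy = None, prev_xy
    for y in range(max(0, py - radius), min(H - h, py + radius) + 1):
        for x in range(max(0, px - radius), min(W - w, px + radius) + 1):
            patch = frame[y:y + h, x:x + w].astype(np.float32)
            score = float(np.sum((patch - tmpl) ** 2))
            if best_score is None or score < best_score:
                best_score, best_xy = score, (x, y)
    return best_xy  # real-time position of the environmental object
```

A full tracker would add a similarity threshold below which the target is declared lost, which is the condition under which the detection computing subunit 3101 is called again.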
In implementations of the above computing unit 310, the determined real-time status information may be one or a combination of information about the tracked target environmental object. For example, in one implementation, the real-time status information may be position information of the environment object in the real-time environment image, the control unit 320 may determine whether to adjust the image sensor 330 according to a change in the position information of the environment object in the real-time environment image, and may determine an adjustment parameter according to the position information of the environment object when determining to adjust the image sensor 330, so as to perform adjustment control on the orientation of the image sensor according to the adjustment parameter. In practical applications, particularly in applications of automatic driving of vehicles, the direction in which a target environmental object such as a traffic light moves in an environmental image is generally fixed, and for example, a displacement component is always generated in a horizontal rightward direction. In addition, in one tracking detection process, only the change of the position of the target environmental object in the current frame relative to the reference position (which may be the position of the environmental object in the initial frame of the detection, and is generally located on the vertical centerline of the environmental image) may be considered.
As shown in fig. 4-1, which is a schematic diagram of the position of the target environmental object in the environment image: in the real-time environment image 400 acquired by the image sensor, the tracked/detected environmental object 440 in the current frame is located at L1, the reference position L0 lies on the vertical centerline line_m of the image, and the position information of the environmental object 440 at L1 can be taken as the real-time status information of the environmental object. The control unit 320 may then determine whether to adjust the orientation of the image sensor according to this position information. In another implementation, the adjustment parameter for the image sensor direction may also be determined in combination with the moving speed of the environmental object along a specific spatial dimension of the real-time environment image; for example, when the moving speed is high, the adjustment may start earlier, or be performed with a larger amplitude and a faster rotation speed. That is, the real-time status information may also include the moving speed information of the environmental object along a specific spatial dimension of the real-time environment image. The moving speed information can be computed from the displacement and the tracking detection frequency.
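The displacement-times-frequency relation just mentioned can be written out directly. In this hypothetical helper, positions are pixel coordinates in consecutive tracked frames and `tracking_hz` stands in for the tracking detection frequency; the names are illustrative.

```python
def moving_speed(pos_prev, pos_curr, tracking_hz):
    """Per-axis moving speed (pixels per second) of the environmental
    object: the displacement between two consecutively tracked frames
    multiplied by the tracking detection frequency."""
    dx = pos_curr[0] - pos_prev[0]
    dy = pos_curr[1] - pos_prev[1]
    return (dx * tracking_hz, dy * tracking_hz)
```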
In determining whether to adjust the direction of the image sensor, the control unit 320 may determine the moving direction of the environmental object relative to the image sensor or relative to the body (e.g., an autonomous vehicle) according to the real-time status information of the environmental object, and then, when the tracked environmental object may move out of the coverage of the image sensor, adjust the direction of the image sensor by a certain amplitude according to that relative moving direction. However, this implementation, which reacts only when the environmental object is about to move out of the coverage area, easily loses the tracked target when the object moves too fast relative to the sensor, so another, more stable tracking manner is provided herein. In this other implementation, the control unit 320 may trigger the generation and transmission of an adjustment instruction for adjusting the direction of the image sensor when the environmental object approaches the edge of the environment image to a certain extent. In a specific implementation, as shown in fig. 4-1, a boundary line_m may be preset in the real-time environment image, and when the tracked environmental object moves to position L2, i.e., beyond the boundary line_m, the direction adjustment of the image sensor is started. In this way the area outside the boundary line_m acts as a "buffer area": once the environmental object moves into this area, adjustment of the image sensor begins, so that in most cases the target environmental object remains in the picture, the target is less easily lost during tracking, and the effectiveness of the image sensor in tracking the environmental object is increased.
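A minimal sketch of this buffer-area trigger, assuming horizontal motion and a single preset boundary (line_m in fig. 4-1); the function name and the rightward-motion default are illustrative assumptions.

```python
def in_buffer_area(obj_x, boundary_x, moving_right=True):
    """True once the tracked object has crossed the preset boundary
    into the buffer area, i.e., once direction adjustment of the
    image sensor should start."""
    return obj_x > boundary_x if moving_right else obj_x < boundary_x
```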
Another implementation may be as shown in fig. 4-2: a stable detection region 450 may be determined in the real-time environment image, the relative position of the environmental object 440 and the stable detection region 450 may be determined, and that relative position may be used as the real-time status information of the environmental object. The stable detection region may be determined by whichever subject detects/tracks the environmental object, for example the detection computing subunit 3101 or the tracking computing subunit 3102. The center of the detection region may be located at the center of the real-time environment image. In a specific implementation, when the environmental object is determined to have moved out of the stable detection region, for example to the position L2 shown in fig. 4-2, the subject detecting/tracking the environmental object, for example the tracking computing subunit 3102, may determine the relative distance between the environmental object and the stable detection region, for example the distance to the center of the region or to its nearer edge in the real-time environment image, and use that relative distance as the real-time status information.
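Assuming the stable detection region is an axis-aligned rectangle, the relative distance used as real-time status information might be computed as below; taking it as the distance to the region's nearer edge (zero while the object is still inside) is one of the two options the text mentions, and the rectangle representation is an illustrative simplification.

```python
def relative_distance(obj_x, obj_y, region):
    """Distance of the tracked object from a rectangular stable
    detection region (x0, y0, x1, y1); 0.0 while the object remains
    inside the region."""
    x0, y0, x1, y1 = region
    dx = max(x0 - obj_x, 0.0, obj_x - x1)  # horizontal overshoot
    dy = max(y0 - obj_y, 0.0, obj_y - y1)  # vertical overshoot
    return (dx ** 2 + dy ** 2) ** 0.5
```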
The degree to which the target environmental object deviates from the image sensor at that moment can thus be determined from the relative distance, and from this distance the control unit 320 can determine the adjustment parameters of the adjustment instruction, such as the adjustment direction, the adjustment angle, and the rotation speed. The adjustment direction may be the moving direction of the target environmental object, or the component of that moving direction along a certain dimension, depending on the implementation: for example, when the image sensor supports only horizontal left-right rotation, the adjustment direction may be the horizontal component of the target's moving direction, while when the image sensor supports adjustment in both the horizontal and vertical directions, the adjustment direction in both dimensions can be determined from the moving direction of the target environmental object in the real-time environment image. In addition, the larger the relative distance, the larger the adjustment angle and the faster the rotation speed required, so that the image sensor is quickly adjusted to the appropriate orientation.
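The mapping from relative distance to adjustment parameters could look like the following sketch. The proportional gains are purely illustrative assumptions; the embodiment only requires that a larger distance yields a larger adjustment angle and a faster rotation speed.

```python
def adjustment_params(rel_dx, rel_dy, gain_deg=0.1, speed_gain=2.0):
    """Map the per-axis relative distance (pixels) to adjustment
    parameters: a direction sign per axis, angles proportional to the
    deviation, and a rotation speed that grows with the deviation."""
    angle_h = gain_deg * abs(rel_dx)   # horizontal adjustment angle
    angle_v = gain_deg * abs(rel_dy)   # vertical adjustment angle
    direction = (1 if rel_dx >= 0 else -1, 1 if rel_dy >= 0 else -1)
    speed = speed_gain * max(abs(rel_dx), abs(rel_dy))
    return direction, (angle_h, angle_v), speed
```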
When determining the adjustment parameters of the adjustment instruction, the control unit 320 may take as its target bringing the target environmental object back to a centerline or the midpoint of the environment image, and control the image sensor to perform the direction adjustment according to those parameters. In another implementation, the adjustment may carry the object past the centerline or midpoint by a certain distance in the direction opposite to its movement, in order to reduce the adjustment frequency and improve tracking stability. In a specific implementation, the adjustment amount may include a rotation margin, which may be a preset amount, such as a preset "overshoot" angle, or may be calculated from the actual deviation rate of the environmental object; the direction adjustment component then adjusts the direction of the image sensor according to the adjustment amount carrying the rotation margin. For example, as shown in fig. 4-2, after the orientation of the image sensor is adjusted according to the adjustment amount carrying the rotation margin, the target environmental object may be located at the position L3 in the real-time environment image.
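The rotation-margin ("overshoot") target can be sketched in one dimension as follows. `margin_px` stands in for the preset margin, and the whole helper is a hypothetical illustration of how position L3 in fig. 4-2 ends up past the centerline.

```python
def target_with_margin(center_x, move_dir_x, margin_px=15):
    """Adjustment target: overshoot past the image centerline by a
    rotation margin, in the direction opposite to the object's
    movement (move_dir_x is +1 for rightward motion, -1 for left)."""
    return center_x - move_dir_x * margin_px
```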
In a practical application, the target environmental object may be an object group comprising a plurality of synchronously moving sub-environmental objects; for example, for an image sensor control system applied in an autonomous vehicle, the object group may be a lamp group comprising a plurality of traffic lights, all of which move synchronously relative to the vehicle. To simplify the computation and reduce the tracking load, for an object group composed of multiple such sub-objects, only one sub-object needs to be tracked to determine the relevant parameters for adjusting the image sensor. Specifically, the detection computing subunit 3101 may determine the moving direction of the environmental object from its position in different frames of the real-time environment image. When all sub-objects in the object group are located in the stable detection region, the sub-object closest to the edge of the stable detection region along the moving direction is determined as the target sub-environmental object, and the tracking target data of the target sub-environmental object is determined; the tracking computing subunit 3102 may then search subsequent frames of the real-time environment image according to this tracking target data and determine the real-time status information of the environmental object, and an adjustment instruction and adjustment parameters for adjusting the direction of the image sensor are generated based on the obtained real-time status information.
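Selecting the target sub-object from a synchronously moving group can be sketched as below; the one-dimensional (id, x) representation of sub-objects and the (x0, x1) region bounds are simplifying assumptions for illustration.

```python
def pick_target_sub_object(sub_objects, region, move_dir_x):
    """From a group of synchronously moving sub-objects (e.g. the
    lights of one lamp group), pick the one closest to the stable
    detection region edge that lies in the moving direction; tracking
    only this sub-object minimizes computation for the whole group.
    Sub-objects are (id, x) pairs; region is (x0, x1)."""
    x0, x1 = region
    edge = x1 if move_dir_x > 0 else x0
    return min(sub_objects, key=lambda o: abs(o[1] - edge))
```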
As the relative position of the environmental object and the image sensor keeps changing, the tracked environmental object, such as a traffic light, may eventually exceed the effective visible range of the image sensor. When the image sensor has rotated to its maximum allowable angle and the environmental object is still outside the stable detection region, the control unit 320 may send a reset instruction to reset the image sensor. The reset instruction returns the image sensor to the angle at which an environmental object was last found, i.e., the angle at which the next environmental object is most likely to appear, which saves time when the real-time environment image is detected and searched again and improves recognition efficiency.
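The reset behavior can be summarized by a small hypothetical helper: when the maximum allowable angle is reached and the object is still outside the stable detection region, the sensor returns to the angle at which an object was last found. All names are illustrative.

```python
def maybe_reset(current_angle, max_angle, object_in_region, last_seen_angle):
    """Return the angle the sensor should move to: the last angle at
    which an environmental object was found when the sensor is at its
    maximum allowed rotation and the object is still outside the
    stable detection region; otherwise keep the current angle."""
    if abs(current_angle) >= max_angle and not object_in_region:
        return last_seen_angle
    return current_angle
```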
The image sensor control system provided in the first embodiment of the present application has been described in detail above. In this system the image sensor includes a camera component and a direction adjustment component, replacing existing image sensors composed of multiple camera groups with different FOVs, and the image sensor control system includes a computing unit and a control unit: the computing unit detects and determines an environmental object in the real-time environment image acquired by the image sensor and determines the real-time status information of the environmental object in that image, and the control unit controls the generation and transmission of an adjustment instruction for adjusting the direction of the image sensor according to that real-time status information. By using a single-camera, direction-adjustable image sensor to track environmental objects in real time, the system overcomes the blind areas easily produced by the prior art while preserving the visual range of the sensor, improves adaptability to complexly distributed environmental objects, and improves the reliability of tracking and recognizing environmental objects. A single sensor can cover environmental objects at different distances; compared with multi-camera sensor control schemes, the amount of computation on the sensed data is effectively reduced, which helps simplify the system and improve operating efficiency.
Because the camera is adjusted in a dynamic tracking manner to cover the target object, when the system is applied to automatic driving, sensing blind areas during driving can be effectively avoided, and the image sensor can still effectively cover traffic environment objects such as traffic lights when the vehicle changes lanes or turns, or encounters situations such as intersections with special traffic lights. In addition, since the control system tracks and recognizes environmental objects in real time, no map navigation information is needed for assistance, and temporary traffic control changes, such as temporary traffic lights, can also be effectively tracked and recognized, providing an effective decision basis for the driving system. The efficiency and reliability of recognizing and tracking environmental objects are thus improved, and the safety of the automatic driving system is improved. The control system may be applied in the field of autonomous driving, where the environmental objects may include traffic lights and other objects in the autonomous driving environment. It may also be applied in other fields that require environmental object recognition, such as the technical field of robots, improving recognition efficiency and reliability when tracking and recognizing the environmental objects relevant to those fields.
Furthermore, in the system, detecting and determining the environmental object in the real-time environment image and determining its real-time status information are realized by different units. The task of determining the real-time status information of the environmental object, which provides the adjustment basis for tracking it, is realized by a lightweight search tracking method, reducing calls to the recognition and detection units and decoupling the different tasks. This reduces the overall computation of the system, lowers the complexity of the system design, and improves the operating efficiency of the system.
Example two
The second embodiment of the present application provides an image sensor control method, in which a real-time environment image can be obtained by an image sensor with an adjustable direction. Referring to fig. 5, which is a flowchart of an image sensor control method provided in the second embodiment of the present application, as shown in fig. 5, the method may include the following steps:
s510: detecting and determining environmental object information in the real-time environment image according to the real-time environment image acquired by the image sensor;
the real-time environment information acquired by the image sensor may include a series of frame pictures of environment images acquired in real time, and when the method is applied to an autonomous vehicle, the environment images may be traffic environment images, and for the autonomous vehicle, the environment objects may be traffic environment objects in a traffic environment, such as traffic lights. Firstly, the environment object information in the real-time environment image can be detected and determined according to the real-time environment image acquired by the image sensor, and in the concrete implementation, the real-time environment image acquired by the image sensor can be identified through a pre-trained detection calculation module, and the environment object information in the real-time environment image can be detected and determined. And the detection calculation module performs machine learning training based on the sample data to obtain the target. The detection calculation module can be used as a main core unit for sensing and identifying a target object based on an image, and can be designed to identify classification information of an environmental object according to practical application, wherein the classification information can be information about the state/attribute and the like of the environmental object. Namely, the method can be used for detecting and determining the environmental object and the classification information thereof in the real-time environment image according to the real-time environment image acquired by the image sensor. 
For example, when applied in an autonomous vehicle system, in addition to recognizing a traffic sign in the environment image obtained by the image sensor of the vehicle, the detection calculation module can also recognize state information such as the position, shape, and color of a traffic signal lamp, so that operations such as controlling the vehicle's traveling route can be carried out according to the traffic signal state information obtained in real time.
S520: tracking a target environment object in the real-time environment image, and determining real-time state information of the target environment object in the real-time environment image;
however, the detection calculation is a main core task based on image sensing and target object identification, and usually provides data support for a plurality of tasks in the system, such as route decision and the like, whereas for the task of adjusting and controlling the image sensor to track the target environment object, the detection calculation needs to be frequently called in practical application to adjust the image sensor in real time, which is not beneficial to reducing the operation pressure of the system. On the other hand, the information generated by the detection calculation is usually the information required by the decision system, while for the adjustment task of the image sensor, the information generated by the detection unit usually comprises much redundant information. Therefore, in the method provided by the embodiment of the application, the target environment object in the real-time environment image can be tracked through the tracking calculation module, and the real-time state information of the target environment object in the real-time environment image is determined. The tracking calculation module is realized based on a lightweight search tracking method. In the specific implementation, an environment object in the real-time environment image and tracking target data corresponding to the environment object can be determined in a target frame of the real-time environment image through detection calculation; and then, searching in subsequent frames of the real-time environment image by the tracking calculation module according to the tracking target data to determine the real-time state information of the environment object in the real-time environment image.
The real-time status information of the target environment object in the real-time environment image may include position information of the target environment object in the real-time environment image. When determining the real-time status information of the target environmental object in the real-time environmental image, a stable detection area may be determined in the real-time environmental image, and the relative position of the environmental object and the stable detection area may be determined, with the relative position as the real-time status information of the environmental object.
In the above implementations, the determined real-time status information may be one or a combination of information about the tracked target environmental object. For example, in one implementation, the real-time status information may be position information of the environmental object in the real-time environmental image, whether to adjust the image sensor may be determined according to a change of the position information of the environmental object in the real-time environmental image, and an adjustment parameter may be determined according to the position information of the environmental object when determining to adjust the image sensor, so as to perform adjustment control on the orientation of the image sensor according to the adjustment parameter. In practical applications, particularly in applications of automatic driving of vehicles, the direction in which a target environmental object such as a traffic light moves in an environmental image is generally fixed, and for example, a displacement component is always generated in a horizontal right direction. In addition, in one tracking detection process, only the change of the position of the target environment object in the current frame relative to the reference position may be considered. In another implementation, the adjustment parameter for adjusting the direction of the image sensor may also be determined in combination with the moving speed information of the environmental object in the specific spatial dimension of the real-time environmental image, for example, when the moving speed is fast, the adjustment may be started in advance, or the adjustment may be performed with a relatively large amplitude and a fast rotation speed, that is, the real-time status information may also include the moving speed information of the environmental object in the specific spatial dimension of the real-time environmental image. 
The moving speed information can be obtained by calculating the displacement and the tracking detection frequency.
S530: determining whether to perform a directional adjustment on the image sensor based on the real-time status information of the target environmental object.
When it is determined that the direction of the image sensor needs to be adjusted, the adjustment parameters for the direction adjustment are determined, and the direction of the image sensor is adjusted according to those parameters. When determining whether to adjust the direction of the image sensor, the moving direction of the environmental object relative to the image sensor or relative to the body (such as an autonomous vehicle) can be determined according to the real-time status information of the environmental object; then, when the tracked environmental object may move out of the coverage area of the image sensor, the direction of the image sensor is adjusted by a certain amplitude according to that relative moving direction. In another implementation, a stable detection area may be determined in the real-time environment image, the relative position between the environmental object and the stable detection area may be determined, and that relative position may be used as the real-time status information of the environmental object. For example, when the environmental object is determined to have moved out of the stable detection area, the relative distance of the environmental object with respect to the stable detection area, for example the distance to the center of the area or to its nearer edge in the real-time environment image, may be determined and used as the real-time status information.
The degree to which the target environmental object deviates from the image sensor at that moment can be determined from the relative distance, and from this distance the adjustment parameters of the adjustment instruction, such as the adjustment direction, the adjustment angle, and the rotation speed, can be determined. The adjustment direction may be the moving direction of the target environmental object, or the component of that direction along a certain dimension, depending on the implementation: for example, when the image sensor supports only horizontal left-right rotation, the adjustment direction may be the horizontal component of the target's moving direction, while when the image sensor supports adjustment in both the horizontal and vertical directions, the adjustment direction in both dimensions can be determined from the moving direction of the target environmental object in the real-time environment image. In addition, the larger the relative distance, the larger the adjustment angle and the faster the rotation speed required, so that the image sensor is quickly adjusted to the appropriate orientation.
In a practical application, the target environmental object may be an object group comprising a plurality of synchronously moving sub-environmental objects; for example, for an image sensor control system applied in an autonomous vehicle, the object group may be a lamp group comprising a plurality of traffic lights, all of which move synchronously relative to the vehicle. To simplify the computation and reduce the tracking load, for an object group composed of multiple such sub-objects, only one sub-object needs to be tracked to determine the relevant parameters for adjusting the image sensor. When all sub-objects in the object group are located in the stable detection area, the sub-object closest to the edge of the stable detection area along the moving direction is determined as the target sub-environmental object, and the tracking target data of the target sub-environmental object is determined; subsequent frames of the real-time environment image are then searched according to this tracking target data, the real-time status information of the environmental object is determined, and an adjustment instruction and adjustment parameters for adjusting the direction of the image sensor are generated according to the obtained real-time status information.
As the relative position of the environmental object and the image sensor changes continuously, the tracked environmental object, such as a traffic light, eventually leaves the effective visual range of the image sensor. When the image sensor has rotated to its maximum allowable angle and the environmental object is still outside the stable detection area, a reset instruction is sent to reset the image sensor. The reset instruction returns the image sensor to the angle at which an environmental object was last found, that is, to the angle at which the next environmental object is most likely to appear, which saves a certain amount of time when the real-time environment image is detected and searched again and thus improves recognition efficiency.
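The reset behavior described above might be sketched as a small policy object; the class and method names are illustrative assumptions:

```python
class SensorResetPolicy:
    """Sketch of the reset rule described above: when the sensor has
    rotated to its maximum allowed angle and the tracked object is still
    outside the stable detection area, return to the angle at which an
    environment object was last found."""

    def __init__(self, max_angle):
        self.max_angle = max_angle
        self.last_found_angle = 0.0  # pan angle of the most recent detection

    def on_detection(self, angle):
        # Remember where an environment object was last found.
        self.last_found_angle = angle

    def next_angle(self, current_angle, object_in_stable_area):
        # Reset condition: at the mechanical limit AND target still
        # outside the stable detection area.
        if abs(current_angle) >= self.max_angle and not object_in_stable_area:
            return self.last_found_angle  # the "reset instruction"
        return current_angle
```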
In this method, the real-time environment image is acquired by a direction-adjustable image sensor comprising a camera. The method detects and determines environmental object information in the real-time environment image acquired by the image sensor; tracks a target environmental object in the real-time environment image and determines its real-time status information; and determines, according to that real-time status information, whether to adjust the direction of the image sensor. Tracking environmental objects in real time with a single-camera, direction-adjustable image sensor preserves the sensor's visual range while overcoming the blind-area problem of the prior art, improving the system's adaptability to complexly distributed environmental objects and the reliability with which they are tracked and recognized. Because a single sensor can cover environmental objects at different distances, compared with multi-camera sensor control schemes the amount of sensing data to be computed is effectively reduced, which simplifies the system and improves its operating efficiency.
Because the camera is adjusted in a dynamic tracking manner to cover the target object, when the method is applied to automatic driving, perception blind areas during driving can be effectively avoided: the image sensor can still cover traffic environment objects such as traffic lights when the vehicle changes lanes, turns, or encounters situations such as intersections with specially placed traffic lights. In addition, because the control system tracks and recognizes environmental objects in real time, no map or navigation information is needed for assistance, and temporary traffic control changes, such as temporary traffic lights, can also be effectively tracked and recognized, providing an effective decision basis for the driving system. This improves the efficiency and reliability of environmental object recognition and tracking, and the safety of the automatic driving system. The control method can be applied to the field of automatic driving, where the environmental objects may include traffic lights and other objects in the driving environment. It can also be applied to other fields requiring environmental object recognition, such as the technical field of robotics, to improve recognition efficiency and reliability when tracking and recognizing the environmental objects relevant to those fields.
Furthermore, in this method, the two tasks of detecting and determining the environmental object in the real-time environment image and determining its real-time status information are implemented by different algorithm modules. Determining the real-time status information, which provides the adjustment basis for tracking the environmental object, is implemented by a lightweight search-and-tracking method, reducing calls to the recognition and detection tasks. This decouples the tasks, reduces the overall computational load of the system, lowers the complexity of the system design, and improves the system's operating efficiency.
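The decoupling described above, where an expensive detector runs only when no target is being tracked and a lightweight tracker handles every other frame, might be sketched like this. The `detector` and `tracker` callables, their signatures, and the class name are assumptions of this sketch:

```python
class DetectTrackPipeline:
    """Illustrative decoupling of the two tasks: a heavyweight detector
    runs only when there is no tracked target (or the target was lost),
    while a lightweight tracker updates the real-time status on every
    other frame."""

    def __init__(self, detector, tracker):
        self.detector = detector   # frame -> target data or None (expensive)
        self.tracker = tracker     # (frame, target) -> updated target or None (cheap)
        self.target = None
        self.detector_calls = 0    # instrumented to show how rarely detection runs

    def process(self, frame):
        if self.target is None:
            # No target: fall back to full detection.
            self.detector_calls += 1
            self.target = self.detector(frame)
        else:
            # Target known: lightweight search in the subsequent frame.
            self.target = self.tracker(frame, self.target)
        return self.target
```

With stub callables, five frames in which the tracker loses the target once trigger only two detector calls, illustrating the reduced load.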
Corresponding to the image sensor control method provided in the second embodiment of the present application, there is also provided an image sensor control apparatus, wherein the image sensor includes a camera component and a direction adjustment component, as shown in fig. 6, the apparatus includes:
an environment object recognition unit 610 for detecting and determining environment object information in the real-time environment image from the real-time environment image acquired by the image sensor;
a target object tracking unit 620, configured to track a target environment object in the real-time environment image, and determine real-time status information of the target environment object in the real-time environment image; and
an adjustment control unit 630, configured to determine whether to perform a direction adjustment on the image sensor according to the real-time status information of the target environmental object.
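One iteration of the device in fig. 6 might be sketched as the cooperation of the three units above; the callable interfaces are assumptions made purely for illustration:

```python
def control_step(frame, recognizer, tracker, controller):
    """One illustrative iteration of the device of fig. 6: the
    environment object recognition unit (610) detects objects, the
    target object tracking unit (620) produces real-time status
    information, and the adjustment control unit (630) decides whether
    to reorient the sensor."""
    objects = recognizer(frame)        # unit 610: detect environment objects
    status = tracker(frame, objects)   # unit 620: real-time status of target
    return controller(status)          # unit 630: adjust direction or not
```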
EXAMPLE III
The image sensor control system, method, and device in the above embodiments can be applied in different fields; for example, part of the first embodiment was described taking the application in an automatic driving system as an example. A third embodiment of the present application provides an image sensor control system in an autonomous vehicle. Please refer to fig. 7, which is a schematic diagram of the image sensor control system in the autonomous vehicle, wherein the image sensor 730 may include a camera component 7301 and a direction adjustment component 7302. As shown in fig. 7, the image sensor control system may include:
an information processing unit 710 for detecting and determining a traffic environment object in the real-time environment image according to the real-time environment image acquired by the image sensor, and determining real-time status information of the traffic environment object in the real-time environment image;
and a sensor control unit 720, configured to control generation and transmission of an adjustment instruction for performing direction adjustment on the image sensor according to real-time status information of the environment object in the real-time environment image.
The traffic environment object may be of various kinds, such as a person, a vehicle, or a traffic signal light. When the traffic environment object to be processed is a traffic signal light, the information processing unit may control the generation and transmission of an adjustment instruction for adjusting the direction of the image sensor according to the real-time status information of the traffic signal light in the real-time environment image; when it is determined that the image sensor needs a direction adjustment, the adjustment parameters for the adjustment are determined and the direction of the image sensor is adjusted accordingly, so that the traffic signal light is tracked by adjusting the image sensor while the vehicle is driving. For the implementation of target tracking and image sensor control, reference may be made to the image sensor control system in the first embodiment of the present application, so that the image sensor control system in the autonomous vehicle has corresponding or similar advantages.
In another embodiment, the system can also be used for tracking and sensing traffic environment objects such as oncoming vehicles. In this embodiment, the environmental objects processed by the system may include an oncoming vehicle, and the sensor control unit may control the generation and transmission of an adjustment instruction for adjusting the direction of the image sensor according to the real-time status information of the oncoming vehicle in the real-time environment image; when it is determined that the image sensor needs a direction adjustment, the adjustment parameters are determined and the direction of the image sensor is adjusted accordingly, so that the oncoming vehicle is continuously tracked and recognized by adjusting the image sensor while the vehicle is driving.
While tracking the oncoming vehicle, the distance between the autonomous vehicle and the oncoming vehicle can be determined, with a certain accuracy, from the angle of the image sensor, the position of the oncoming vehicle's image in the real-time status image, and the geometric relationship between the vehicle and the image sensor. In such an embodiment, the sensor control unit may be configured to determine the current angle of the image sensor, and the real-time status information may include the position of the oncoming vehicle's image in the real-time status image; the information processing unit may then determine the distance between the autonomous vehicle and the oncoming vehicle from the current angle and that position information.
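One possible form of the geometric relationship mentioned above is sketched below, assuming a pinhole camera and a known lateral offset between the two lanes; both assumptions, and all names and constants, are illustrative and not taken from the patent:

```python
import math

def oncoming_vehicle_distance(sensor_angle_deg, pixel_offset, focal_px,
                              lateral_offset_m):
    """Estimate the distance to an oncoming vehicle from the sensor's
    current pan angle and the vehicle's image position.

    sensor_angle_deg: current pan angle of the image sensor.
    pixel_offset:     horizontal offset of the vehicle's image from the
                      image center, in pixels.
    focal_px:         assumed pinhole focal length, in pixels.
    lateral_offset_m: assumed known lateral separation of the two lanes.
    """
    # Bearing of the oncoming vehicle relative to our heading: sensor
    # pan angle plus the angular offset inside the image.
    bearing = math.radians(sensor_angle_deg) + math.atan2(pixel_offset, focal_px)
    # With a known lateral lane offset, the longitudinal distance
    # follows from right-triangle geometry.
    return lateral_offset_m / math.tan(bearing)
```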
The determined distance information can be used as a basis for adjusting the vehicle's lighting; for example, it can control the use of the vehicle's high and low beams, switching to low beam when the oncoming vehicle comes within a short distance, so as to avoid dazzling the oncoming vehicle and to improve the safety of the autonomous vehicle in traffic. That is, the information processing unit may provide the vehicle distance information to, for example, components related to light adjustment, so that the automatic light adjustment component adjusts the lighting of the autonomous vehicle according to the distance information.
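The beam-switching decision described above reduces to a threshold on the estimated distance; the threshold value below is an arbitrary illustrative assumption:

```python
def select_beam(distance_m, low_beam_threshold_m=150.0):
    """Use the estimated distance as the basis for light adjustment:
    switch to low beam when the oncoming vehicle is within a threshold
    distance, otherwise allow high beam."""
    return "low" if distance_m <= low_beam_threshold_m else "high"
```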
EXAMPLE IV
In an intelligent traffic system, roadside devices are usually deployed. A roadside device can provide traffic data for autonomous vehicles, such as road conditions and traffic-related environmental object data, and can guide or assist vehicles by providing information to them or interacting with them, thereby ensuring smooth and safe traffic. An embodiment of the present application provides a traffic environment object sensing system for automatic driving. Please refer to fig. 8, which is a schematic diagram of the traffic environment object sensing system for automatic driving. As shown in fig. 8, the system may include: a communication unit 840, a calculation unit 810, a control unit 820, and an image sensor 830; the image sensor 830 includes a camera component 8301 and a direction adjustment component 8302.
the communication unit 840 may be configured to receive sensing task information of the roadside device;
the calculating unit 810 may be configured to detect and determine a traffic environment object in the real-time environment image acquired by the image sensor after the sensing task is started, and to determine real-time status information of the traffic environment object in the real-time environment image; the control unit 820 may be configured to control the generation and transmission of an adjustment instruction for adjusting the direction of the image sensor according to the real-time status information of the environmental object in the real-time environment image. For the camera component and its control system in the autonomous vehicle, reference may be made to the descriptions in the first or third embodiment of the present application.
In one embodiment, whether sensing of the traffic environment object is started can be determined according to information provided by the roadside device. For example, when traffic signals are the target traffic environment objects, the road is segmented by the traffic signals, and their distribution is generally not uniform. When the autonomous vehicle recognizes and tracks a traffic signal as the target, the roadside device may provide the position of the traffic signal, the road area in which the target needs to be tracked, or an instruction to turn on recognition/sensing of the target object, so that sensing and recognition of the traffic environment object is started according to the information or instruction provided by the roadside device. The sensing task information may include information that starts a sensing task when the autonomous vehicle reaches a preset sensing area.
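Starting the sensing task when the vehicle enters a preset sensing area might be sketched as below; representing the areas as axis-aligned rectangles is purely an illustrative assumption:

```python
def should_start_sensing(vehicle_pos, sensing_areas):
    """Sketch of the activation rule above: start the sensing task when
    the vehicle's position falls inside any preset sensing area provided
    by the roadside device. Areas are (x_min, y_min, x_max, y_max)
    rectangles here for simplicity."""
    x, y = vehicle_pos
    return any(x0 <= x <= x1 and y0 <= y <= y1
               for x0, y0, x1, y1 in sensing_areas)
```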
In traffic there are multiple types of traffic environment objects, and in different scenes different types may have different tracking priorities. For example, in traffic in which people participate, people are often the targets whose safety must be ensured first; when approaching an intersection, traffic signs, traffic lights, and other objects representing traffic regulations are generally the ones the system needs to handle. In the intelligent traffic system, the roadside device can provide level information for different types of traffic environment objects, so that the autonomous vehicle can determine, according to this level information, which traffic environment object (type) should be preferentially attended to, recognized, and tracked. A plurality of the camera assemblies described above may be mounted in an autonomous vehicle to track different targets, or different types of targets, respectively.
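Choosing the target type from the roadside level information might be sketched like this; the level values and the convention that a lower number means higher priority are assumptions of this sketch:

```python
def select_target_type(detected_types, level_info):
    """Pick the traffic-environment-object type with the highest
    priority level among those currently detected, per the roadside
    level information. level_info maps type -> level; a lower level
    number is assumed to mean higher priority."""
    candidates = [t for t in detected_types if t in level_info]
    if not candidates:
        return None
    return min(candidates, key=lambda t: level_info[t])
```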
An embodiment of the present application further provides a computer system, including:
one or more processors; an image sensor;
the image sensor comprises a camera component and a direction adjusting component; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform operations comprising:
detecting and determining environmental object information in the real-time environment image according to the real-time environment image acquired by the image sensor;
tracking a target environment object in the real-time environment image, and determining real-time state information of the target environment object in the real-time environment image;
determining whether to perform a directional adjustment on the image sensor based on the real-time status information of the target environmental object.
Fig. 9 illustrates an architecture of a computer system, which may include, in particular, a processor 910, a video display adapter 911, a disk drive 912, an input/output interface 913, a network interface 914, and a memory 920. The processor 910, the video display adapter 911, the disk drive 912, the input/output interface 913, and the network interface 914 may be communicatively connected to the memory 920 via a communication bus 930.
The processor 910 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits, and is configured to execute related programs to implement the technical solution provided in the present application.
The Memory 920 may be implemented in the form of a ROM (Read Only Memory), a RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. Memory 920 may store an operating system 921 for controlling the operation of computer system 900 and a Basic Input Output System (BIOS) for controlling low-level operations of computer system 900. In addition, a web browser 923, a data storage management system 924, an icon font processing system 925, and the like may also be stored. The icon font processing system 925 may be an application program that implements the operations of the foregoing steps in this embodiment of the application. In short, when the technical solution provided in the present application is implemented by software or firmware, the relevant program code is stored in the memory 920 and invoked by the processor 910 for execution.
The input/output interface 913 is used to connect the input/output module to realize information input and output. The i/o module may be configured as a component in a device (not shown) or may be external to the device to provide a corresponding function. The input devices may include a keyboard, a mouse, a touch screen, a microphone, various sensors, etc., and the output devices may include a display, a speaker, a vibrator, an indicator light, etc.
The network interface 914 is used for connecting a communication module (not shown in the figure) to implement communication interaction between the present device and other devices. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, Bluetooth and the like).
The bus 930 includes a path to transfer information between the various components of the device, such as the processor 910, the video display adapter 911, the disk drive 912, the input/output interface 913, the network interface 914, and the memory 920.
In addition, the computer system 900 may also obtain information of specific retrieval conditions from the virtual resource object retrieval condition information database 941 for performing condition judgment, and the like.
It should be noted that although only the processor 910, the video display adapter 911, the disk drive 912, the input/output interface 913, the network interface 914, the memory 920, and the bus 930 are shown above, in a specific implementation the device may also include other components necessary for normal operation. Furthermore, it will be understood by those skilled in the art that the apparatus described above may also include only the components necessary to implement the solution of the present application, and not necessarily all of the components shown in the figures.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present application may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments of the present application.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, the system or system embodiments are substantially similar to the method embodiments and therefore are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The above-described system and system embodiments are only illustrative, wherein the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The image sensor control method, device, and system provided by the present application have been introduced in detail above. Specific examples have been used herein to explain the principles and implementations of the present application, and the above description of the embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, for a person of ordinary skill in the art, the specific embodiments and the scope of application may vary according to the idea of the present application. In view of the above, the content of this specification should not be construed as limiting the present application.

Claims (36)

1. An image sensor control system characterized in that,
the image sensor comprises a camera component and a direction adjusting component;
the control system includes:
a calculation unit and a control unit;
the computing unit is used for detecting and determining an environmental object in the real-time environmental image according to the real-time environmental image acquired by the image sensor, and determining real-time state information of the environmental object in the real-time environmental image;
and the control unit is used for controlling the generation and the transmission of an adjusting instruction for adjusting the direction of the image sensor according to the real-time state information of the environment object in the real-time environment image.
2. The system of claim 1, wherein the control unit is further configured to:
when the image sensor is determined to need to be directionally adjusted, an adjusting instruction for directionally adjusting the image sensor is determined and sent, so that the direction adjusting component can adjust the direction of the camera component according to the adjusting instruction.
3. The system of claim 1, wherein the computing unit comprises:
and the detection calculation subunit is used for detecting and determining the environmental object in the real-time environment image according to the real-time environment image acquired by the image sensor.
4. The system of claim 3, wherein the detection computation subunit is obtained by performing machine learning training based on sample data.
5. The system of claim 3, wherein the detection calculation subunit is further configured to:
and detecting and determining the environmental object and the classification information thereof in the real-time environment image according to the real-time environment image acquired by the image sensor.
6. The system of claim 3, wherein the computing unit comprises:
and the tracking calculation subunit is used for determining the real-time state information of the environment object in the real-time environment image.
7. The system of claim 6, wherein the detection computation subunit is configured to:
determining an environmental object in the real-time environmental image and tracking target data corresponding to the environmental object in a target frame of the real-time environmental image;
and the tracking calculation subunit is used for searching in subsequent frames of the real-time environment image according to the tracking target data and determining the real-time state information of the environment object in the real-time environment image.
8. The system of claim 7, wherein the detection computation subunit is configured to:
re-detecting and determining an environmental object in an image frame of a real-time environmental image and tracking target data of the re-determined environmental object after the tracking calculation subunit loses the tracked environmental object;
and the tracking calculation subunit is used for searching in the subsequent frames of the real-time environment image according to the redetermined tracking target data.
9. The system of claim 7, wherein the tracking computation subunit is implemented based on a lightweight search tracking method.
10. The system of any of claims 1-9, wherein the real-time status information includes location information of the environmental object in the real-time environmental image.
11. The system of claim 10, wherein the real-time status information further comprises speed of movement information of the environmental object in a particular spatial dimension of the real-time environmental image.
12. The system according to any of claims 6-9, wherein the tracking calculation subunit is configured to:
and determining a stable detection area in the real-time environment image, determining the relative position of the environment object and the stable detection area, and taking the relative position as the real-time state information of the environment object.
13. The system of claim 12, wherein the center of the stable detection area is located at the center of the real-time environment image.
14. The system of claim 12, wherein the tracking computation subunit is configured to:
and when the environment object is determined to move out of the stable detection area, determining the relative distance of the environment object relative to the stable detection area, and taking the relative distance as the real-time state information.
15. The system of claim 14, wherein the control unit is configured to:
and determining the adjusting parameters of the adjusting instructions according to the relative distance.
16. The system of claim 15, wherein the adjustment parameters comprise: the amount of adjustment, and the speed of rotation.
17. The system of claim 16, wherein the adjustment amount comprises a turning margin; the direction adjustment component is used for:
and adjusting the direction of the image sensor according to the adjustment amount carrying the rotation allowance.
18. The system of claim 12, wherein the detection computation subunit is configured to:
and determining the moving direction of the environmental object according to the position information of the environmental object in different frames of the real-time environmental image.
19. The system of claim 18, wherein the environment object comprises an object group of a plurality of synchronized sub-environment objects; the detection calculation subunit is configured to:
when all the sub-objects in the object group are located in the stable detection area, determining the sub-object which is closest to the edge of the stable detection area in the moving direction as a target sub-environment object; determining tracking target data of the target sub-environment object;
and the tracking calculation subunit is used for searching in subsequent frames of the real-time environment image according to the tracking target data and determining the real-time state information of the environment object in the real-time environment image.
20. The system of claim 12, wherein the control unit is configured to:
and when the image sensor rotates to a maximum allowable angle and the environmental object is still positioned outside the stable detection area, sending a reset instruction to reset the image sensor.
21. An image sensor control system in an autonomous vehicle,
the image sensor comprises a camera component and a direction adjusting component;
the control system includes:
the information processing unit is used for detecting and determining a traffic environment object in the real-time environment image according to the real-time environment image acquired by the image sensor and determining real-time state information of the traffic environment object in the real-time environment image;
and the sensor control unit is used for controlling the generation and the transmission of an adjusting instruction for adjusting the direction of the image sensor according to the real-time state information of the environment object in the real-time environment image.
22. The system of claim 21, the traffic environment object comprising a traffic light, the information processing unit for controlling generation and transmission of an adjustment instruction for directional adjustment of the image sensor based on real-time status information of the traffic light in a real-time environment image; and
when the image sensor needs to be subjected to direction adjustment, the adjustment parameters for the direction adjustment of the image sensor are determined, and the direction of the image sensor is adjusted according to the adjustment parameters, so that the traffic signal lamp is tracked by adjusting the image sensor in the driving process of the vehicle.
23. The system of claim 21, the traffic environment object comprising an oncoming vehicle, the sensor control unit to:
controlling the generation and the transmission of an adjusting instruction for adjusting the direction of the image sensor according to the real-time state information of the opposite vehicle in the real-time environment image; and
when the image sensor is determined to need to be subjected to direction adjustment, adjusting parameters for performing direction adjustment on the image sensor are determined, and the image sensor is subjected to direction adjustment according to the adjusting parameters so as to track the opposite vehicle by adjusting the image sensor in the driving process of the vehicle.
24. The system of claim 23, wherein the sensor control unit is further configured to:
determining a current angle of the image sensor;
the real-time status information includes position information of an image of the oncoming vehicle in the real-time status image;
the information processing unit is configured to:
and determining the distance information between the automatic driving vehicle and the opposite vehicle according to the current angle and the position information of the image of the opposite vehicle in the real-time state image.
25. The system of claim 24, wherein the information processing unit is configured to:
and providing the vehicle distance information so that the vehicle light automatic adjusting component adjusts the light illumination of the automatic driving vehicle according to the distance information.
26. A traffic environment object sensing system for autonomous driving, comprising:
a communication unit, a calculation unit, a control unit, an image sensor; wherein the image sensor comprises a camera component and a direction adjusting component;
the communication unit is used for receiving induction task information of the road side equipment;
the computing unit is used for detecting and determining a traffic environment object in the real-time environment image according to the real-time environment image acquired by the image sensor after the induction task is started, and determining real-time state information of the traffic environment object in the real-time environment image;
and the control unit is used for controlling the generation and the transmission of an adjusting instruction for adjusting the direction of the image sensor according to the real-time state information of the traffic environment object in the real-time environment image.
27. The system of claim 26, wherein the sensory task information comprises information that initiates a sensory task when the autonomous vehicle reaches a preset sensory area.
28. The system of claim 26, wherein the sensing task information includes level information of different types of traffic environment objects;
the computing unit is configured to determine a traffic environment object of a target type for the current sensing task according to the level information, to detect and determine the traffic environment object of the target type in the real-time environment image acquired by the image sensor, and to determine the real-time state information of the traffic environment object of the target type in the real-time environment image.
29. An image sensor control method for obtaining real-time environment images with a direction-adjustable image sensor, the method comprising:
detecting and determining environmental object information in the real-time environment image according to the real-time environment image acquired by the image sensor;
tracking a target environment object in the real-time environment image, and determining real-time state information of the target environment object in the real-time environment image;
determining whether to perform a directional adjustment on the image sensor based on the real-time status information of the target environmental object.
30. The method of claim 29, wherein detecting and determining environmental object information in the real-time environmental image from the real-time environmental image acquired by the image sensor comprises:
recognizing the real-time environment image acquired by the image sensor with a pre-trained detection calculation module to detect and determine the environmental object information in the real-time environment image.
31. The method according to claim 29 or 30, wherein tracking the target environment object in the real-time environment image and determining the real-time status information of the target environment object in the real-time environment image comprises:
tracking the target environment object in the real-time environment image with a tracking calculation module, and determining the real-time state information of the target environment object in the real-time environment image.
32. The method of claim 31, wherein detecting and determining environmental object information in the real-time environmental image from the real-time environmental image acquired by the image sensor comprises:
determining an environmental object in the real-time environmental image and tracking target data corresponding to the environmental object in a target frame of the real-time environmental image;
tracking the target environment object in the real-time environment image and determining the real-time state information of the target environment object in the real-time environment image includes:
searching subsequent frames of the real-time environment image according to the tracking target data, and determining the real-time state information of the environmental object in the real-time environment image.
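The detect-then-track scheme of the claim above, building tracking target data from a target frame and then searching subsequent frames, can be sketched as a nearest-centre search. The feature choice (box centre and size) and the per-frame shift limit are illustrative assumptions, not the patent's tracking method.

```python
def init_tracking_target(detection):
    """Build tracking-target data from a detection (x, y, w, h) in the
    target frame: here simply the box centre and size, an illustrative
    stand-in for the patent's unspecified tracking features."""
    x, y, w, h = detection
    return {"center": (x + w / 2.0, y + h / 2.0), "size": (w, h)}

def track_in_frame(target, candidate_boxes, max_shift=50.0):
    """Search a subsequent frame's candidate boxes for the tracked
    object: pick the box whose centre is nearest to the previous one,
    within a maximum per-frame shift. Returns the new state or None
    if the object was not found."""
    cx, cy = target["center"]
    best, best_d = None, max_shift
    for (x, y, w, h) in candidate_boxes:
        d = ((x + w / 2.0 - cx) ** 2 + (y + h / 2.0 - cy) ** 2) ** 0.5
        if d <= best_d:
            best, best_d = (x, y, w, h), d
    if best is None:
        return None
    # Update the tracking-target data for the next frame.
    target["center"] = (best[0] + best[2] / 2.0, best[1] + best[3] / 2.0)
    return {"box": best, "center": target["center"]}
```

In practice a production tracker would match on appearance features as well as position, but the centre-distance search is enough to show how the target data threads from the target frame into each subsequent frame.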
33. The method of claim 31, wherein the real-time status information of the target environmental object in the real-time environmental image comprises position information of the target environmental object in the real-time environmental image;
determining the real-time status information of the target environment object in the real-time environment image includes:
determining a stable detection area in the real-time environment image, determining the relative position of the environmental object with respect to the stable detection area, and taking the relative position as the real-time state information of the environmental object.
34. The method of claim 33, wherein determining whether to adjust the direction of the image sensor based on the real-time status information of the target environment object comprises:
when it is determined that the environmental object has moved out of the stable detection area, determining the relative distance of the environmental object with respect to the stable detection area, taking the relative distance as the real-time status information, and determining accordingly whether direction adjustment of the image sensor is required.
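The stable-detection-area logic in the two claims above can be sketched as follows: compute the tracked box's signed offset from a stable area and trigger a direction adjustment only when the offset exceeds a small deadband. The deadband value and the offset convention are illustrative assumptions.

```python
def relative_position(box, stable_area):
    """Signed horizontal/vertical offset of a box centre from a stable
    detection area (x, y, w, h); (0, 0) means the centre is inside."""
    bx = box[0] + box[2] / 2.0
    by = box[1] + box[3] / 2.0
    ax, ay, aw, ah = stable_area
    # Negative when left/above the area, positive when right/below.
    dx = min(0.0, bx - ax) + max(0.0, bx - (ax + aw))
    dy = min(0.0, by - ay) + max(0.0, by - (ay + ah))
    return dx, dy

def direction_adjustment(box, stable_area, deadband=20.0):
    """Decide whether to adjust the sensor direction: only when the
    tracked object has drifted out of the stable area by more than a
    small deadband (the deadband value is illustrative)."""
    dx, dy = relative_position(box, stable_area)
    if abs(dx) <= deadband and abs(dy) <= deadband:
        return None  # still effectively inside: no adjustment needed
    pan = "right" if dx > 0 else ("left" if dx < 0 else None)
    tilt = "down" if dy > 0 else ("up" if dy < 0 else None)
    return {"pan": pan, "tilt": tilt, "offset": (dx, dy)}
```

The deadband keeps the gimbal from chattering when the object hovers near the area's edge, which is the practical point of measuring relative distance rather than reacting to the first pixel outside the boundary.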
35. An image sensor control apparatus, wherein the image sensor comprises a camera component and a direction adjusting component, the apparatus comprising:
an environmental object recognition unit, configured to detect and determine environmental object information in the real-time environment image acquired by the image sensor;
a target object tracking unit, configured to track a target environment object in the real-time environment image and determine real-time state information of the target environment object in the real-time environment image;
an adjustment control unit, configured to determine, according to the real-time state information of the target environment object, whether to adjust the direction of the image sensor.
36. A computer system, comprising:
one or more processors; an image sensor;
the image sensor comprises a camera component and a direction adjusting component; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform operations comprising:
detecting and determining environmental object information in the real-time environment image according to the real-time environment image acquired by the image sensor;
tracking a target environment object in the real-time environment image, and determining real-time state information of the target environment object in the real-time environment image;
determining whether to perform a directional adjustment on the image sensor based on the real-time status information of the target environmental object.
CN201910253101.2A 2019-03-29 2019-03-29 Image sensor control method, device and system Active CN111756990B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910253101.2A CN111756990B (en) 2019-03-29 2019-03-29 Image sensor control method, device and system

Publications (2)

Publication Number Publication Date
CN111756990A true CN111756990A (en) 2020-10-09
CN111756990B CN111756990B (en) 2022-03-01

Family

ID=72672715

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910253101.2A Active CN111756990B (en) 2019-03-29 2019-03-29 Image sensor control method, device and system

Country Status (1)

Country Link
CN (1) CN111756990B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112150832A (en) * 2020-10-23 2020-12-29 连云港杰瑞电子有限公司 Distributed traffic signal control system based on 5G
CN112839171A (en) * 2020-12-31 2021-05-25 上海米哈游天命科技有限公司 Picture shooting method and device, storage medium and electronic equipment
CN114827470A (en) * 2022-04-28 2022-07-29 新石器慧通(北京)科技有限公司 Unmanned vehicle control method and device based on holder angle adjustment
CN116991084A (en) * 2023-09-28 2023-11-03 北京斯年智驾科技有限公司 Unmanned simulation system, method and device and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915545A (en) * 2012-09-20 2013-02-06 华东师范大学 OpenCV(open source computer vision library)-based video target tracking algorithm
CN105847691A (en) * 2016-04-15 2016-08-10 乐视控股(北京)有限公司 Camera shooting device control method and device
US20170050563A1 (en) * 2015-08-19 2017-02-23 Harris Corporation Gimbaled camera object tracking system
CN106713773A (en) * 2017-03-31 2017-05-24 联想(北京)有限公司 Shooting control method and electronic device
US20180131856A1 (en) * 2016-11-07 2018-05-10 Olympus Corporation Mobile photographing apparatus, mobile photographing control apparatus, photographing device, photographing method, control method of mobile photographing apparatus, and recording medium for recording photographing program
CN108416321A (en) * 2018-03-23 2018-08-17 北京市商汤科技开发有限公司 For predicting that target object moves method, control method for vehicle and the device of direction
CN109074078A (en) * 2016-04-26 2018-12-21 高通股份有限公司 Method and apparatus for capturing the image of traffic sign
CN109391762A (en) * 2017-08-03 2019-02-26 杭州海康威视数字技术股份有限公司 A kind of method and apparatus of track up

Similar Documents

Publication Publication Date Title
CN111756990B (en) Image sensor control method, device and system
JP7345504B2 (en) Association of LIDAR data and image data
US10437252B1 (en) High-precision multi-layer visual and semantic map for autonomous driving
US10794710B1 (en) High-precision multi-layer visual and semantic map by autonomous units
US10849543B2 (en) Focus-based tagging of sensor data
JP2021503414A (en) Intelligent driving control methods and devices, electronics, programs and media
CN111797187A (en) Map data updating method and device, electronic equipment and storage medium
US20200082182A1 (en) Training data generating method for image processing, image processing method, and devices thereof
KR20200096410A (en) Feature Extraction based on Deep Learning for LIDAR Position Estimation of Autonomous Vehicles
US11157753B2 (en) Road line detection device and road line detection method
CN112036210B (en) Method and device for detecting obstacle, storage medium and mobile robot
CN106104203A (en) The distance detection method of a kind of mobile object, device and aircraft
CN206623754U (en) Lane detection device
US20210174113A1 (en) Method for limiting object detection area in a mobile system equipped with a rotation sensor or a position sensor with an image sensor, and apparatus for performing the same
CN111582189A (en) Traffic signal lamp identification method and device, vehicle-mounted control terminal and motor vehicle
EP4068205A1 (en) Method for tracking object within video frame sequence, automatic parking method, and apparatus therefor
CN108107897A (en) Real time sensor control method and device
Capito et al. Optical flow based visual potential field for autonomous driving
CN114170826A (en) Automatic driving control method and device, electronic device and storage medium
JPH02114304A (en) Traveling control device for moving vehicle
CN111814657A (en) Unmanned vehicle parking method and system based on image recognition and storage medium
CN112735163B (en) Method for determining static state of target object, road side equipment and cloud control platform
KR20200083941A (en) Obstacle filtering method of non-avoidance planning system in autonomous vehicles
CN115123291B (en) Behavior prediction method and device based on obstacle recognition
US11741718B2 (en) Light interference detection during vehicle navigation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230718

Address after: Room 437, Floor 4, Building 3, No. 969, Wenyi West Road, Wuchang Subdistrict, Yuhang District, Hangzhou City, Zhejiang Province

Patentee after: Wuzhou Online E-Commerce (Beijing) Co.,Ltd.

Address before: Fourth Floor, One Capital Place, P.O. Box 847, Grand Cayman, Cayman Islands

Patentee before: ALIBABA GROUP HOLDING Ltd.