CN112425149A - Image information processing method and terminal equipment


Info

Publication number
CN112425149A
Authority
CN
China
Prior art keywords
image
processing data
data
image information
transmission module
Prior art date
Legal status
Granted
Application number
CN202080002368.5A
Other languages
Chinese (zh)
Other versions
CN112425149B (en)
Inventor
邓宝华
Current Assignee
Streamax Technology Co Ltd
Original Assignee
Streamax Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Streamax Technology Co Ltd filed Critical Streamax Technology Co Ltd
Publication of CN112425149A
Application granted
Publication of CN112425149B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/166 Detection; Localisation; Normalisation using acquisition arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Signal Processing (AREA)
  • Ophthalmology & Optometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The application provides an image information processing method, which comprises the following steps: each time the first data transmission module acquires image data, it sends the image data to at least two image information identification devices respectively; for each image information identification device, the latest processing data returned by that device is stored in a preset memory space through the first data transmission module; a processing data set is acquired from the preset memory space by the second data transmission module in each preset period; and each time K processing data sets have been accumulated, the second data transmission module sends the K processing data sets to the image signal processor. The method addresses the problem that, when different image recognition algorithms identify different image features and transmit their results asynchronously at different frequencies, the image signal processor is forced to carry out intensive information interaction.

Description

Image information processing method and terminal equipment
Technical Field
The present application belongs to the technical field of image information processing, and in particular, to an image information processing method, a terminal device, and a computer-readable storage medium.
Background
In practical applications, in order to better meet the requirements of different application scenarios, it is often necessary, after image data is acquired, to identify features in the image, such as a face region, an eye region or other feature regions, by means of an image recognition algorithm, so as to determine a region of interest in the image and, from it, image processing parameters such as the exposure parameters of the image.
At present, obtaining accurate image feature information usually requires relying on image recognition algorithms, but introducing such algorithms often delays data processing. Moreover, if more than one image recognition algorithm is introduced, the different algorithms usually process data with different periods; after each algorithm identifies its image features, the results are transmitted asynchronously to the image signal processor at different frequencies, so the image signal processor has to carry out intensive information interaction, which consumes considerable transmission and storage resources.
Disclosure of Invention
The embodiments of the application provide an image information processing method, a terminal device and a computer-readable storage medium, which can solve the problem that, after different image features are identified by different image recognition algorithms, the results are transmitted asynchronously to the image signal processor at different frequencies, so that the image signal processor has to carry out intensive information interaction and consumes considerable transmission and storage resources.
In a first aspect, an embodiment of the present application provides an image information processing method, which is applied to a terminal device, where the terminal device includes a first data transmission module, a second data transmission module, and an image signal processor, and the image information processing method includes:
the first data transmission module sends the image data to at least two image information identification devices respectively when acquiring the image data each time so as to acquire processing data respectively returned by each image information identification device aiming at the image data;
for each image information identification device, storing the latest processing data returned by the image information identification device in a preset memory space through the first data transmission module, and taking the latest processing data as the current processing data corresponding to the image information identification device;
acquiring a processing data set from the preset memory space through the second data transmission module in each preset period, wherein each preset period corresponds to one processing data set, and one processing data set comprises the current processing data corresponding to each image information identification device, acquired by the second data transmission module from the preset memory space within the corresponding preset period;
and each time the second data transmission module has accumulated K processing data sets, sending the K processing data sets to the image signal processor, wherein K is a positive integer.
In a second aspect, an embodiment of the present application provides a terminal device, including a first data transmission module, a second data transmission module, and an image signal processor, where:
the first data transmission module is configured to: each time image data is obtained, send the image data to at least two image information identification devices respectively, so as to obtain the processing data returned by each image information identification device for the image data;
the first data transmission module is further configured to: for each image information identification device, store the latest processing data returned by that image information identification device in a preset memory space, and use the latest processing data as the current processing data corresponding to that device;
the second data transmission module is configured to: acquire a processing data set from the preset memory space in each preset period, wherein each preset period corresponds to one processing data set, and one processing data set comprises the current processing data corresponding to each image information identification device, acquired by the second data transmission module from the preset memory space within the corresponding preset period;
the second data transmission module is further configured to: each time K processing data sets have been accumulated, send the K processing data sets to the image signal processor, wherein K is a positive integer.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program, when executed by a processor, implements the image information processing method according to the first aspect.
In a fourth aspect, the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to execute the image information processing method described above in the first aspect.
Compared with the prior art, the embodiments of the application have the following advantages. Each time the first data transmission module acquires image data, it sends the image data to at least two image information identification devices respectively, so as to acquire the processing data returned by each image information identification device for the image data; then, for each image information identification device, the latest processing data returned by that device is stored in a preset memory space through the first data transmission module as the current processing data corresponding to that device. The first data transmission module can therefore interact with each image information identification device separately, obtaining and storing the latest processing data returned by each of them. In addition, a processing data set may be acquired from the preset memory space through the second data transmission module in each preset period, where each preset period corresponds to one processing data set, and one processing data set comprises the current processing data corresponding to each image information identification device, acquired from the preset memory space within that period. Each time the second data transmission module has accumulated K processing data sets, it sends them to the image signal processor. In this way, the second data transmission module conveniently collects, at different time nodes, the latest processing data corresponding to each image information identification device as the processing data set for that time node, and forwards the processing data sets to the image signal processor at a relatively stable frequency, so the image signal processor can obtain the required image features at a relatively stable rate and perform image processing, intensive information interaction is avoided, and transmission and storage resources are saved.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is an exemplary information interaction diagram of a first data transmission module, a second data transmission module, an image signal processor and an image information identification device according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an image information processing method according to an embodiment of the present application;
fig. 3 is a schematic diagram of exemplary information interaction among the second data transmission module, the first data transmission module, and each image information identification device according to an embodiment of the present application.
FIG. 4 is a schematic flowchart of another image information processing method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution of the present application, the following description will be given by way of specific examples.
In practical applications, in order to better meet the requirements of different application scenarios, it is often necessary, after image data is acquired, to identify features in the image, such as a face region, an eye region or other feature regions, by means of an image recognition algorithm, so as to determine a region of interest in the image and, from it, image processing parameters such as the exposure parameters of the image.
For example, in the field of vehicle security, the Driving behavior of a driver can be photographed by a Driving State Monitoring (DSM) camera. At this time, it is necessary to acquire image feature information (such as feature point coordinate information) of a face region and face details such as an eye region and a glasses region, so as to better expose an area of interest such as a face and acquire a captured image with good quality.
At present, obtaining accurate image feature information usually requires relying on image recognition algorithms, but introducing such algorithms often delays data processing. Moreover, if more than one image recognition algorithm is introduced, the different algorithms usually process data with different periods; after each algorithm identifies its image features, the results are transmitted asynchronously to the image signal processor at different frequencies, which consumes considerable transmission and storage resources.
In the embodiments of the application, the first data transmission module can interact with each image information identification device separately, so as to obtain and store the latest processing data returned by each device; through the second data transmission module, the latest processing data corresponding to each image information identification device is conveniently collected at different time nodes as the processing data set for the corresponding time node, and the processing data sets are sent to the image signal processor at a relatively stable frequency. The image signal processor can thus obtain the required image features at a relatively stable rate and perform image processing, intensive information interaction is avoided, and transmission and storage resources are saved.
The image information processing method in the embodiment of the present application may be applied to a terminal device, and the specific type of the terminal device is not limited at all.
Illustratively, the terminal device may be one of an in-vehicle device, a server, a desktop computer, a mobile phone, a tablet computer, a wearable device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and the like.
In this embodiment, the terminal device may include a first data transmission module, a second data transmission module, and an image signal processor.
The first data transmission module may be a certain hardware module in the terminal device, or may be a set of at least two hardware modules in the terminal device, or may be a certain part of a specified hardware module in the terminal device for implementing a specified function, and so on. The second data transmission module may be a certain hardware module in the terminal device, or may be a set of at least two hardware modules in the terminal device, or may also be a certain part of a specified hardware module in the terminal device for implementing a specified function, and so on. It should be noted that the first data transmission module and the second data transmission module may be different independent hardware structures, or may be located in the same hardware structure and correspond to different parts of the same hardware structure. For example, the first data transmission module and the second data transmission module may be different data processing units in the same hardware structure, to run different threads, respectively, and so on.
The image signal processor (ISP) may be configured to process the image data output by the image sensor to obtain an output image for display. In this embodiment, the image signal processor may perform processing such as exposure on the image data in combination with the processing data respectively returned by the image information recognition apparatuses for the image data.
In an embodiment of the application, any image information recognition device may be disposed in the terminal device, or may be an external device capable of exchanging information with the terminal device, for example an external device such as a cloud server. An image recognition algorithm may be deployed in the image information recognition device to perform image recognition on the input image data.
The number of the image information recognition devices may be set according to actual scene requirements, and is not limited herein.
Fig. 1 is a schematic diagram illustrating an exemplary information interaction among the first data transmission module, the second data transmission module, the image signal processor, and the image information recognition device.
As shown in fig. 1, the first data transmission module may perform information interaction with each image information recognition device, respectively, and the first data transmission module and the second data transmission module may perform information interaction, and the second data transmission module and the image signal processor may perform information interaction.
Fig. 2 shows a flowchart of an image information processing method provided in an embodiment of the present application.
Specifically, as shown in fig. 2, the image information processing method may include:
step S201, when the first data transmission module acquires image data each time, the first data transmission module sends the image data to at least two image information identification devices, so as to acquire processing data that each of the image information identification devices respectively returns to the image data.
In the embodiment of the present application, for example, the image data may be acquired by an image sensor in a camera of the terminal device, or may be transmitted to the terminal device through other devices in communication connection with the terminal device; alternatively, the image data may be local image data or the like stored in the terminal device in advance. The specific manner of acquiring the image data is not limited herein.
In an embodiment, the image data may be image data acquired in real time by the second data transmission module and sent to the first data transmission module at a preset period.
An image recognition algorithm may be deployed in any one of the image information recognition devices, and the image recognition algorithms in different image information recognition devices may be different. The type of image recognition algorithm may be determined according to actual scene requirements.
For example, if the image data is acquired from a vehicle-mounted security system, the image information required by the image signal processor may be one or more of the following information:
1. image illumination information. For example, the image illumination information may be used to indicate whether the image is in a backlight state, a dim light state, a daytime state, a night scene state, or the like.
2. Face region information. The face region information may include face edge coordinates, feature point coordinates such as facial feature point coordinates, mask information, and the like.
3. Eye information. The eye information may include, for example, eye center coordinates, eye width and height, eye contour coordinates, and the like.
4. Glasses information. The glasses information may include judgment information on whether glasses are worn, glasses type information, judgment information on whether light is reflected, and the like.
5. Face orientation information. The face orientation information may include determination information on a front face, a left face, or a right face.
Accordingly, in some embodiments, any one of the image information recognition devices is used to recognize at least one of image illumination information, face region information, eye information, glasses information, and face orientation information, and a different image information recognition device is used to recognize different information.
Of course, the image information recognition device can also be used for recognizing other image information according to scene requirements.
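For illustration only, the categories of image information listed above could be carried in a single processing-data record along the lines of the following sketch; the field names are assumptions made for this example and are not defined by the present application.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Hypothetical record for the processing data that one image information
# recognition device might return; all field names are illustrative only.
@dataclass
class ProcessingData:
    illumination: Optional[str] = None                # e.g. "backlight", "dim", "daytime", "night"
    face_edge_coords: List[Tuple[int, int]] = field(default_factory=list)
    facial_feature_points: List[Tuple[int, int]] = field(default_factory=list)
    mask_info: Optional[bool] = None                  # whether a mask is detected
    eye_center: Optional[Tuple[int, int]] = None
    eye_width_height: Optional[Tuple[int, int]] = None
    eye_contour: List[Tuple[int, int]] = field(default_factory=list)
    glasses_worn: Optional[bool] = None
    glasses_type: Optional[str] = None
    glasses_reflecting: Optional[bool] = None
    face_orientation: Optional[str] = None            # "front", "left" or "right"
```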
In the embodiment of the present application, the policy by which the first data transmission module sends the image data to at least two image information identification devices may be determined according to the source of the image data, the data type, preset information of a user, and the like. For example, if the image data may come from two cameras, then image data from camera A may be transmitted to image information identification device a and image information identification device b, while image data from camera B may be transmitted to image information identification device a and image information identification device c.
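A minimal sketch of such a source-based sending policy is given below; the camera names, device identifiers and the send function are assumptions used only for illustration.

```python
from typing import Callable, Dict, List

# Hypothetical routing table: image source -> image information recognition devices.
ROUTING_POLICY: Dict[str, List[str]] = {
    "camera_A": ["device_a", "device_b"],
    "camera_B": ["device_a", "device_c"],
}

def dispatch(image_data: bytes, source: str,
             send_fn: Callable[[str, bytes], None]) -> None:
    """Send the same image data to every recognition device configured for its source."""
    for device_id in ROUTING_POLICY.get(source, []):
        send_fn(device_id, image_data)
```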
Step S202, for each image information recognition device, storing the latest processing data returned by the image information recognition device in a preset memory space through the first data transmission module, and using the latest processing data as the current processing data corresponding to the image information recognition device.
In this embodiment of the application, the first data transmission module may be respectively associated with each image information recognition device, and respectively acquire and store current processing data corresponding to each image information recognition device, so that the latest processing data corresponding to each image information recognition device may be stored in the preset memory space.
The preset memory space may be organised in various ways. Illustratively, the preset memory space includes memory subspaces corresponding one to one with the image information recognition devices, so that the current processing data of different image information recognition devices do not interfere with one another when stored.
Step S203: acquiring, by the second data transmission module, a processing data set from the preset memory space in each preset period, where each preset period corresponds to one processing data set, and one processing data set comprises the current processing data corresponding to each image information identification device, acquired by the second data transmission module from the preset memory space within the corresponding preset period.
In this embodiment of the application, since the preset memory space may store the latest processing data corresponding to each image information recognition device, the second data transmission module may conveniently acquire the latest processing data corresponding to each image information recognition device at different time nodes, so as to serve as the processing data set corresponding to the corresponding time node.
Step S204: each time the second data transmission module has accumulated K processing data sets, the K processing data sets are sent to the image signal processor, where K is a positive integer.
The value of K may be set according to actual scene requirements.
In some embodiments, the value of K is determined according to the preset period and the operation speed of the image signal processor.
For example, if the preset period is 60 ms and each data processing pass of the image signal processor takes 300 ms, the value of K may be 300/60 = 5; that is, the second data transmission module sends 5 processing data sets to the image signal processor each time it has accumulated 5 processing data sets.
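As a rough illustration of this sizing rule, using the figures above (the formula is an assumption drawn from this example, not a requirement of the application):

```python
import math

preset_period_ms = 60     # acquisition period of the second data transmission module
isp_pass_ms = 300         # duration of one data processing pass of the image signal processor

# Accumulate enough processing data sets to cover one ISP pass: 300 / 60 = 5.
K = math.ceil(isp_pass_ms / preset_period_ms)
print(K)  # 5
```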
In the embodiment of the application, sending the K processing data sets to the image signal processor only after the second data transmission module has accumulated them avoids the heavy occupation of hardware resources caused by intensive interaction over memory data, thereby saving transmission resources and storage resources.
In addition, according to the embodiment of the application, the image signal processor can acquire the required image characteristics at a stable speed and perform image processing, so that the processed image is stably output, and the quality of the output image can be improved.
In the embodiment of the application, each time the first data transmission module acquires image data, it sends the image data to at least two image information identification devices respectively, so as to acquire the processing data returned by each image information identification device for the image data; then, for each image information identification device, the latest processing data returned by that device is stored in a preset memory space through the first data transmission module as the current processing data corresponding to that device. The first data transmission module can therefore interact with each image information identification device separately, obtaining and storing the latest processing data returned by each of them. In addition, a processing data set may be acquired from the preset memory space through the second data transmission module in each preset period, where each preset period corresponds to one processing data set, and one processing data set comprises the current processing data corresponding to each image information identification device, acquired from the preset memory space within that period. Each time the second data transmission module has accumulated K processing data sets, it sends them to the image signal processor. In this way, the second data transmission module conveniently collects, at different time nodes, the latest processing data corresponding to each image information identification device as the processing data set for that time node, and forwards the processing data sets to the image signal processor at a relatively stable frequency, so the image signal processor can obtain the required image features at a relatively stable rate and perform image processing, intensive information interaction is avoided, and transmission and storage resources are saved.
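The flow of steps S201 to S204 can be sketched as follows. This is a simplified, single-process illustration; the class and function names, the synchronous call to each recognition device, and the threading details are assumptions for this sketch and not part of the present application.

```python
import threading
import time
from typing import Any, Dict, List

class LatestValueStore:
    """Preset memory space: keeps only the latest processing data per recognition device."""
    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._current: Dict[str, Any] = {}

    def put(self, device_id: str, processing_data: Any) -> None:
        with self._lock:
            self._current[device_id] = processing_data   # overwrite earlier rounds

    def snapshot(self) -> Dict[str, Any]:
        with self._lock:
            return dict(self._current)                    # one processing data set

def first_module_on_image(image_data: Any, devices: List[Any],
                          store: LatestValueStore) -> None:
    """Steps S201/S202: fan the image out and store the latest result per device."""
    for dev in devices:
        processing_data = dev.recognize(image_data)       # asynchronous in practice
        store.put(dev.device_id, processing_data)

def second_module_loop(store: LatestValueStore, isp_send,
                       period_s: float, k: int) -> None:
    """Steps S203/S204: snapshot every preset period; send every K snapshots to the ISP."""
    batch: List[Dict[str, Any]] = []
    while True:
        time.sleep(period_s)
        batch.append(store.snapshot())
        if len(batch) == k:
            isp_send(batch)
            batch = []
```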
In some embodiments, the preset memory space includes memory subspaces corresponding to the image information recognition devices one to one, respectively, where the memory subspaces are used to store the latest processing data returned by the corresponding image information recognition devices, each memory subspace includes two paired memory blocks, a corresponding identifier of one of the two paired memory blocks is a first identifier, and a corresponding identifier of the other of the two paired memory blocks is a second identifier;
the step of storing, by the first data transmission module, latest processing data returned by the image information recognition device in a preset memory space for each image information recognition device, and using the latest processing data as current processing data corresponding to the image information recognition device, includes:
for each image information recognition device, storing, by the first data transmission module, the latest processing data returned by the image information recognition device in a target memory block as the current processing data of that device, where the target memory block is the memory block corresponding to the first identifier in the memory subspace corresponding to that image information recognition device, and, after the storing, the target memory block holds only the latest round of processing data and no processing data of earlier rounds;
after the storing is finished, updating the identifier of the target memory block to the second identifier, and updating the identifier of the memory block paired with the target memory block to the first identifier;
the acquiring, by the second data transmission module, a processing data set from the preset memory space in a preset period includes:
and acquiring, by the second data transmission module in each preset period, the current processing data from each memory block corresponding to the second identifier in the preset memory space, to obtain the processing data set.
Fig. 3 is a schematic diagram illustrating exemplary information interaction among the second data transmission module, the first data transmission module, and each image information recognition device.
For the image information recognition device 1, the first data transmission module includes a memory subspace corresponding to the image information recognition device 1. The memory subspace may include two paired memory blocks, where one of the two paired memory blocks corresponds to the first identifier buf1 and the other corresponds to the second identifier buf0.
It can be understood that the memory block identified by the first identifier buf1 is the block into which the processing data transmitted by the image information recognition device 1 can be written, and the memory block identified by the second identifier buf0 is the block that can be read by the second data transmission module. The memory used for writing data is thus separated from the memory used for reading data, avoiding operation conflicts caused by writing and reading at the same time.
In addition, in this embodiment of the application, for each image information recognition device, the first data transmission module stores the latest processing data returned by that device in the target memory block; after the storing is finished, the identifier of the target memory block is updated to the second identifier and the identifier of the memory block paired with it is updated to the first identifier, as sketched below.
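A minimal sketch of one such memory subspace with two paired blocks is shown below. It assumes a single writer (the first data transmission module) and a single reader (the second data transmission module) per subspace, and ignores the finer synchronisation details an actual implementation would need.

```python
import threading
from typing import Any, Optional

class PairedMemoryBlocks:
    """One memory subspace: two paired blocks, one marked writable (buf1), one readable (buf0)."""
    def __init__(self) -> None:
        self._blocks: list = [None, None]
        self._write_index = 0                 # index of the block currently marked buf1
        self._lock = threading.Lock()

    def write_latest(self, processing_data: Any) -> None:
        """Store the latest round in the buf1 block, then swap the identifiers so the
        block just written becomes buf0 (readable) and the other becomes buf1."""
        idx = self._write_index
        self._blocks[idx] = processing_data   # only the latest round is kept in this block
        with self._lock:
            self._write_index = 1 - idx

    def read_current(self) -> Optional[Any]:
        """Read from the block currently marked buf0, i.e. the most recently completed write."""
        with self._lock:
            read_index = 1 - self._write_index
        return self._blocks[read_index]
```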
In some embodiments, the step in which the first data transmission module, each time it acquires image data, sends the image data to at least two image information identification devices respectively so as to acquire the processing data returned by each image information identification device for the image data includes:
sending real-time image data to the first data transmission module through the second data transmission module in a preset period;
the first data transmission module sends the image data to at least two image information identification devices respectively when acquiring the image data sent by the second data transmission module each time so as to acquire processing data respectively returned by each image information identification device for the image data;
the acquiring, by the second data transmission module, a processing data set from the preset memory space in a preset period includes:
and the second data transmission module acquires a processing data set from the preset memory space while sending real-time image data to the first data transmission module each time.
In the embodiment of the application, the second data transmission module sends real-time image data to the first data transmission module in each preset period, and acquires the processing data set from the preset memory space at the same time as it sends the real-time image data. This reduces the time delay between the acquired processing data set and the real-time image data being sent, improves the efficiency of the information interaction, and reduces the occupation of information transmission resources.
In some embodiments, as shown in fig. 4, after sending the K processing data sets to the image signal processor, the method further includes:
step S401, after the image signal processor receives the K processing data sets each time, determining an interested area in the image data acquired in real time according to the K processing data sets;
and step S402, carrying out exposure processing on the image data acquired in real time according to the region of interest.
In the embodiment of the present application, the region of interest (ROI) may be outlined by a square frame, a circle, an ellipse, an irregular polygon, or the like, delimiting the region to be processed. The content of the region of interest may be determined according to the actual scene and the corresponding image recognition algorithm. For example, in an application scenario of vehicle security, the region of interest may be a human face region.
The region of interest may be determined based on at least part of the K processing data sets, for example from the face contour feature point coordinates, the eye feature point coordinates, mask information, and the like in the processing data. There are various specific ways of determining it. For example, the minimum square region that includes the face contour feature points may be used as the region of interest; in addition, the distance between the edge of the region of interest and a specified feature point of the human face may be determined according to the distance between the camera and the face.
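As one illustrative way of realising the minimum square region that includes the face contour feature points mentioned above (the optional margin parameter, e.g. derived from the camera-to-face distance, is an assumption of this sketch):

```python
from typing import List, Tuple

def face_roi_from_contour(points: List[Tuple[int, int]],
                          margin: int = 0) -> Tuple[int, int, int]:
    """Return (x, y, side) of the smallest square covering the face contour feature
    points, optionally enlarged by a margin on every side."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    side = max(x_max - x_min, y_max - y_min) + 2 * margin
    # Centre the square on the bounding box of the contour points.
    cx = (x_min + x_max) // 2
    cy = (y_min + y_max) // 2
    return cx - side // 2, cy - side // 2, side
```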
In some embodiments, the step S401 may include:
after the image signal processor receives the K processing data sets each time, determining an initial region of interest in the image data acquired in real time according to the K processing data sets;
determining estimated offset information of the initial region of interest according to the K processing data sets and N historical processing data sets, wherein N is a positive integer;
and determining the region of interest in the image data acquired in real time according to the estimated offset information and the initial region of interest.
For the K processing data sets that the image signal processor receives each time, the region of interest in the image data acquired in real time is determined from those K processing data sets; however, the K processing data sets lag behind the image data acquired by the image signal processor in real time. For example, if it takes 300 ms for the second data transmission module to accumulate K processing data sets each time, and one processing cycle of the image data by an image information recognition apparatus is 30 ms, the delay of the K processing data sets relative to the image data acquired in real time is at least 330 ms.
It can be seen that the initial region of interest determined from the K processing data sets may have an offset with respect to the real region of interest of the image data acquired in real time.
In this embodiment of the application, the estimated offset information of the initial region of interest may be determined according to the K processed data sets and the N historical processed data sets. The number of the historical processing data sets can be determined according to factors such as precision requirements and the value of K.
In some embodiments, the estimated offset information comprises a compensation region;
determining the region of interest in the image data acquired in real time according to the estimated offset information and the initial region of interest, including:
combining the compensation region and the initial region of interest to obtain the region of interest;
the exposing processing of the image data acquired in real time according to the region of interest includes:
and carrying out exposure processing on the image data acquired in real time according to the first weight corresponding to the compensation region and the second weight corresponding to the initial region of interest.
In this embodiment of the application, the first weight may be used to indicate the exposure degree corresponding to the compensation region, and the second weight may be used to indicate the exposure degree corresponding to the initial region of interest. The values of the first weight and the second weight may be predetermined. Illustratively, the first weight may be smaller than the second weight, for example 0.5 times the second weight. In that case the initial region of interest is exposed more strongly, while the compensation region is exposed less strongly than the initial region of interest but slightly more than other regions of no interest; this highlights the initial region of interest while keeping the exposure processing more accurate.
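The following sketch shows one way the two weights could enter an exposure-metering computation. The weighted-average metering and the background weight value are assumptions used for illustration; the application does not fix a particular exposure algorithm.

```python
import numpy as np

def weighted_exposure_metric(image: np.ndarray,
                             roi_mask: np.ndarray,
                             compensation_mask: np.ndarray,
                             second_weight: float = 1.0,
                             first_weight: float = 0.5) -> float:
    """Weighted mean luminance: the initial region of interest counts with second_weight,
    the compensation region with first_weight (e.g. 0.5 x second_weight), and all other
    pixels with a small assumed background weight."""
    background_weight = 0.1
    weights = np.full(image.shape[:2], background_weight, dtype=np.float32)
    weights[compensation_mask] = first_weight
    weights[roi_mask] = second_weight     # the initial ROI overrides any overlap
    luminance = image.astype(np.float32) if image.ndim == 2 else image.mean(axis=2)
    return float((luminance * weights).sum() / weights.sum())
```

An exposure controller could then adjust exposure time or gain until this metric approaches a target luminance.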
In some embodiments, the determining of the estimated offset information of the initial region of interest according to the K processing data sets and N historical processing data sets includes:
calculating a first moving speed of a target object corresponding to the history period according to the history period corresponding to the N historical processing data sets and the N historical processing data sets, wherein the target object is associated with the initial region of interest;
calculating a second moving speed of the target object in the target period according to the target period corresponding to the K processing data sets and the K processing data sets;
determining a target acceleration of the target object according to the first moving speed and the second moving speed;
and determining the estimated offset information of the initial region of interest according to the target acceleration.
The first moving speed may include a first lateral speed and/or a first longitudinal speed, the second moving speed may include a second lateral speed and/or a second longitudinal speed, and accordingly, the target acceleration may also include a lateral target acceleration and/or a longitudinal target acceleration.
In the embodiment of the application, the first moving speed of the target object in the history period can be calculated to obtain its movement during the history period, and the second moving speed of the target object in the target period can be calculated to obtain its movement during the target period. Comparing the movement of the photographed target object between these two different periods makes it possible to estimate the movement trend of the target object, and thereby determine the estimated offset information.
For example, for the history period, the corresponding coordinates of a certain specified feature point in each processing data set corresponding to the history period may be obtained, so as to obtain the corresponding image position information of the specified feature point at different time points in the history period, so as to obtain a first average speed of the specified feature point in the history period, and then the first average speed may be used as the first moving speed. For the target period, similarly, the second moving speed is obtained by the above-described method of obtaining the first moving speed.
Then, the target acceleration of the target object can be determined from the first moving speed and the second moving speed, so that the movement trend of the target object can be estimated from how the second moving speed changes relative to the first moving speed, and the estimated offset information can be determined accordingly.
For example, if the first lateral speed has the same direction as the second lateral speed and the first lateral speed is greater than the second lateral speed, it may be predicted that the target object is decelerating in the direction of the second lateral speed. In that case, if the second lateral speed is greater than a preset speed threshold, the compensation region is obtained by expanding the initial region of interest along the direction of the second lateral speed; if the second lateral speed is smaller than the preset speed threshold, it can be estimated that the target object may move in the opposite direction, so the compensation region is obtained by expanding the initial region of interest simultaneously along the direction of the second lateral speed and the opposite direction.
If the first lateral speed and the second lateral speed differ in direction and the first lateral speed is greater than the second lateral speed, it can be estimated that the target object is accelerating in the direction of the second lateral speed. In this case, the compensation region may be obtained by expanding the initial region of interest along the direction of the second lateral speed.
Illustratively, the magnitude of the second lateral speed may be multiplied by a preset coefficient to obtain the width by which the compensation region is expanded along the specified lateral direction.
It will be appreciated that, for the first longitudinal speed and the second longitudinal speed, the compensation region in the longitudinal direction can be obtained by the same method, as in the sketch below.
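A simplified one-dimensional (lateral) sketch of this estimation is given below. It assumes the target position is summarised by one feature-point x-coordinate per processing data set; the speed threshold and the preset coefficient are placeholder values, and the comparison of the two speeds stands in for an explicit target-acceleration computation.

```python
from typing import List, Tuple

def mean_lateral_speed(x_positions: List[float], period_ms: float) -> float:
    """Average lateral pixel speed (pixels/ms) over successive samples of one period."""
    if len(x_positions) < 2:
        return 0.0
    return (x_positions[-1] - x_positions[0]) / (period_ms * (len(x_positions) - 1))

def lateral_compensation(initial_roi: Tuple[int, int, int, int],
                         v_hist: float, v_curr: float,
                         speed_threshold: float = 0.05,
                         coeff: float = 100.0) -> Tuple[int, int, int, int]:
    """Expand the initial ROI (x, y, w, h) laterally according to the estimated trend."""
    x, y, w, h = initial_roi
    expand = int(abs(v_curr) * coeff)   # expansion width = |second lateral speed| * preset coefficient
    same_direction = (v_hist * v_curr) > 0
    decelerating = same_direction and abs(v_hist) > abs(v_curr)
    if decelerating and abs(v_curr) < speed_threshold:
        # The object may reverse direction: expand on both sides.
        return x - expand, y, w + 2 * expand, h
    if v_curr >= 0:
        # Expand along the direction of the second (current) lateral speed.
        return x, y, w + expand, h
    return x - expand, y, w + expand, h
```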
In some embodiments, the image information processing method further includes:
determining a first distance between a camera acquiring the image data and the target object in the historical period, and determining a second distance between the camera and the target object in the target period;
the calculating a first moving speed of the target object corresponding to the history period according to the history period corresponding to the N history processing data sets and the N history processing data sets includes:
calculating a first initial moving speed of the target object corresponding to the history period according to the history period corresponding to the N history processing data sets and the N history processing data sets;
obtaining the first moving speed according to the first distance and a preset distance-pixel displacement mapping relation;
the calculating a second moving speed of the target object in the target period according to the target period corresponding to the K processing data sets and the K processing data sets includes:
calculating a second initial moving speed of the target object corresponding to the target period according to the target period corresponding to the K processing data sets and the K processing data sets;
and obtaining the second moving speed according to the second distance and a preset distance-pixel displacement mapping relation.
In the embodiment of the present application, when the distances between the camera and the target object are different, there is also a difference in the correspondence between the pixel displacement in the acquired image and the moving speed of the target object in the real scene.
Therefore, the distance-pixel displacement mapping relationship may be established in advance, so that pixel displacements observed in images taken at different distances between the camera and the target object can be mapped onto a common three-dimensional reference, and the first moving speed and the second moving speed can then be determined from the pixel displacements after this unified mapping.
For example, for a certain specified length in the three-dimensional space, the mapping relationship between different distances and pixel displacements may be determined in advance to obtain the distance-pixel displacement mapping relationship. At this time, for a certain distance, by querying the distance-pixel displacement mapping relationship, it can be determined how many lengths of pixel displacement in an image captured by the camera at the distance can indicate the specified length in the three-dimensional space. It should be noted that the distance-pixel displacement mapping relationship may also be regarded as a distance-pixel moving speed mapping relationship.
Therefore, after the first initial moving speed is calculated, the first moving speed may be obtained from the first distance and the preset distance-pixel displacement mapping relationship; and after the second initial moving speed is calculated, the second moving speed may be obtained from the second distance and the preset distance-pixel displacement mapping relationship. The first moving speed and the second moving speed obtained in this way can be regarded as moving speeds in the same spatial coordinate system, which ensures the accuracy of the estimated offset information.
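A minimal sketch of such a distance-to-pixel-displacement mapping is given below, assuming a simple pinhole-style proportionality; the focal constant is a hypothetical calibration value, and a pre-built lookup table queried by distance would serve the same purpose.

```python
def pixel_speed_to_metric(pixel_speed: float, distance_mm: float,
                          focal_px: float = 800.0) -> float:
    """Map a pixel speed observed at a given camera-to-object distance to a speed in
    real-world units: under a pinhole approximation, one unit of real displacement
    projects to roughly focal_px / distance_mm pixels, so invert that factor."""
    return pixel_speed * distance_mm / focal_px

# Usage sketch: put both periods' speeds into the same spatial frame before comparing.
# first_speed  = pixel_speed_to_metric(first_initial_speed,  first_distance_mm)
# second_speed = pixel_speed_to_metric(second_initial_speed, second_distance_mm)
```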
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 5 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
As shown in fig. 5, the terminal device 5 of this embodiment includes: a first data transmission module 501, a second data transmission module 502 and an image signal processor 503, wherein:
the first data transmission module 501 is configured to: the method comprises the steps that when image data are obtained each time, the image data are respectively sent to at least two image information identification devices, so that processing data which are respectively returned by the image information identification devices aiming at the image data are obtained;
the first data transmission module 501 is further configured to: for each image information recognition device, storing the latest processing data returned by the image information recognition device in a preset memory space through the first data transmission module 501, and using the latest processing data as the current processing data corresponding to the image information recognition device;
the second data transmission module 502 is configured to: acquire a processing data set from the preset memory space in each preset period, wherein each preset period corresponds to one processing data set, and one processing data set comprises the current processing data corresponding to each image information identification device, acquired by the second data transmission module 502 from the preset memory space within the corresponding preset period;
the second data transmission module 502 is further configured to: each time K processing data sets have been accumulated, send the K processing data sets to the image signal processor 503, where K is a positive integer.
Optionally, the preset memory space includes memory subspaces corresponding to the image information recognition apparatuses one to one, where the memory subspaces are used to store the latest processed data returned by the corresponding image information recognition apparatuses, each memory subspace includes two paired memory blocks, an identifier corresponding to one of the two paired memory blocks is a first identifier, and an identifier corresponding to the other of the two paired memory blocks is a second identifier;
the first data transmission module 501 is configured to:
for each image information recognition device, the first data transmission module 501 stores the latest processing data returned by the image information recognition device in a target memory block as the current processing data of that device, where the target memory block is the memory block corresponding to the first identifier in the memory subspace corresponding to that image information recognition device, and, after the storing, the target memory block holds only the latest round of processing data and no processing data of earlier rounds;
after the storage is finished, updating the identifier of the target memory block after the storage is finished as a second identifier, and updating the identifier of the memory block matched with the target memory block as a first identifier;
the second data transmission module 502 is configured to:
and acquiring, in each preset period, the current processing data from each memory block corresponding to the second identifier in the preset memory space, to obtain the processing data set.
Optionally, the second data transmission module 502 is configured to:
sending real-time image data to the first data transmission module 501 at a preset period;
the first data transmission module 501 is configured to:
when image data sent by the second data transmission module 502 is acquired, the image data is sent to at least two image information identification devices respectively, so as to acquire processing data respectively returned by each image information identification device for the image data;
the second data transmission module 502 is further configured to:
the second data transmission module 502 obtains a processing data set from the preset memory space while sending real-time image data to the first data transmission module 501 each time.
Alternatively, the value of K is determined according to the preset period and the operation speed of the image signal processor 503.
Optionally, the image signal processor 503 is configured to:
after the K processing data sets are received each time, determining an interested area in the image data acquired in real time according to the K processing data sets;
and according to the region of interest, carrying out exposure processing on the image data acquired in real time.
Optionally, the image signal processor 503 is configured to:
after the K processing data sets are received each time, determining an initial region of interest in the image data acquired in real time according to the K processing data sets;
determining estimated offset information of the initial region of interest according to the K processing data sets and N historical processing data sets, wherein N is a positive integer;
and determining the region of interest in the image data acquired in real time according to the estimated offset information and the initial region of interest.
Optionally, the estimated offset information includes a compensation area;
the image signal processor 503 is configured to:
combining the compensation region and the initial region of interest to obtain the region of interest;
and carrying out exposure processing on the image data acquired in real time according to the first weight corresponding to the compensation region and the second weight corresponding to the initial region of interest.
Optionally, the image signal processor 503 is configured to:
calculating a first moving speed of a target object corresponding to the history period according to the history period corresponding to the N historical processing data sets and the N historical processing data sets, wherein the target object is associated with the initial region of interest;
calculating a second moving speed of the target object in the target period according to the target period corresponding to the K processing data sets and the K processing data sets;
determining a target acceleration of the target object according to the first moving speed and the second moving speed;
and determining the estimated offset information of the initial region of interest according to the target acceleration.
Optionally, the image signal processor 503 is configured to:
determining a first distance between a camera acquiring the image data and the target object in the historical period, and determining a second distance between the camera and the target object in the target period;
calculating a first initial moving speed of the target object corresponding to the history period according to the history period corresponding to the N historical processing data sets and the N historical processing data sets;
obtaining the first moving speed according to the first initial moving speed, the first distance, and a preset distance-pixel displacement mapping relation;
calculating a second initial moving speed of the target object corresponding to the target period according to the target period corresponding to the K processing data sets and the K processing data sets;
and obtaining the second moving speed according to the second initial moving speed, the second distance, and the preset distance-pixel displacement mapping relation.
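One possible form of that distance correction is sketched below; the patent only names a preset distance-to-pixel-displacement mapping, so the pinhole-model mapping, focal length, and units here are purely assumptions. Under that assumption, one pixel of displacement corresponds to roughly distance / focal_length_px metres at the object, so the same pixel-domain speed implies a faster physical speed for a farther object.

```python
def pixels_per_metre(distance_m, focal_length_px=1000.0):
    # Assumed pinhole-style mapping between image displacement and object displacement.
    return focal_length_px / distance_m

def corrected_speed(initial_speed_px_s, distance_m):
    """Map an initial speed in pixels/s to metres/s at the given camera-to-object distance."""
    return initial_speed_px_s / pixels_per_metre(distance_m)

print(corrected_speed(60.0, distance_m=0.6))  # ~0.036 m/s at 0.6 m
print(corrected_speed(60.0, distance_m=1.2))  # ~0.072 m/s at 1.2 m
```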
The first data transmission module 501 may include at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, so that the first data transmission module 501 may implement the steps executed on the first data transmission module 501 in any of the various image information processing method embodiments described above.
The second data transmission module 502 may include at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, so that the second data transmission module 502 may implement the steps performed on the second data transmission module 502 in any of the various image information processing method embodiments described above.
The image signal processor 503 may also include at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, so that the image signal processor 503 may implement the steps executed on the image signal processor 503 in any of the various image information processing method embodiments described above.
It should be noted that the first data transmission module 501, the second data transmission module 502, and the image signal processor 503 may each include more or fewer components as needed; the components respectively included in the first data transmission module 501, the second data transmission module 502, and the image signal processor 503 are not limited herein.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above method embodiments.
The embodiments of the present application further provide a computer program product which, when run on a terminal device, enables the terminal device to implement the steps in the above method embodiments.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program includes computer program code, and the computer program code may be in a source code form, an object code form, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (19)

1. An image information processing method applied to a terminal device including a first data transmission module, a second data transmission module, and an image signal processor, the image information processing method comprising:
each time image data is acquired, sending, by the first data transmission module, the image data to at least two image information identification devices respectively, so as to acquire processing data respectively returned by each image information identification device for the image data;
for each image information identification device, storing, by the first data transmission module, the latest processing data returned by the image information identification device in a preset memory space, and taking the latest processing data as the current processing data corresponding to the image information identification device;
acquiring, by the second data transmission module, processing data sets from the preset memory space in preset periods, wherein each preset period corresponds to one processing data set, and one processing data set comprises the current processing data, corresponding to each image information identification device, that the second data transmission module acquires from the preset memory space in the corresponding preset period;
and sending, by the second data transmission module, the K processing data sets to the image signal processor each time K processing data sets are accumulated, wherein K is a positive integer.
2. The image information processing method according to claim 1, wherein the preset memory space includes memory subspaces corresponding to the respective image information recognition apparatuses one to one, the memory subspaces are used to store the latest processed data returned by the corresponding image information recognition apparatuses, each memory subspace includes two paired memory blocks, an identifier corresponding to one of the two paired memory blocks is a first identifier, and an identifier corresponding to the other of the two paired memory blocks is a second identifier;
the step of storing, by the first data transmission module, latest processing data returned by the image information recognition device in a preset memory space for each image information recognition device, and using the latest processing data as current processing data corresponding to the image information recognition device, includes:
for each image information recognition device, storing, by the first data transmission module, the latest processing data returned by the image information recognition device in a target memory block as the current processing data of the image information recognition device, wherein the target memory block is the memory block corresponding to the first identifier in the memory subspace corresponding to the image information recognition device, and after the storing, the target memory block stores no processing data of any round other than the latest round;
after the storing is finished, updating the identifier of the target memory block to the second identifier, and updating the identifier of the memory block paired with the target memory block to the first identifier;
the acquiring, by the second data transmission module, a processing data set from the preset memory space in a preset period includes:
and acquiring, by the second data transmission module, each piece of current processing data from each memory block corresponding to the second identifier in the preset memory space in a preset period, to obtain the processing data set.
3. The image information processing method according to claim 1, wherein the sending, by the first data transmission module, each time image data is acquired, the image data to at least two image information recognition devices, respectively, so as to acquire processing data that each of the image information recognition devices returns for the image data, includes:
sending real-time image data to the first data transmission module through the second data transmission module in a preset period;
sending, by the first data transmission module, the image data to at least two image information identification devices respectively each time the image data sent by the second data transmission module is acquired, so as to acquire processing data respectively returned by each image information identification device for the image data;
the acquiring, by the second data transmission module, a processing data set from the preset memory space in a preset period includes:
and acquiring, by the second data transmission module, a processing data set from the preset memory space each time real-time image data is sent to the first data transmission module.
4. The image information processing method according to claim 1, wherein a value of K is determined in accordance with the preset period and an operation speed of the image signal processor.
5. The image information processing method according to any one of claims 1 to 4, further comprising, after sending the K processing data sets to the image signal processor:
after the image signal processor receives the K processing data sets each time, determining a region of interest in the image data acquired in real time according to the K processing data sets;
and according to the region of interest, carrying out exposure processing on the image data acquired in real time.
6. The image information processing method according to claim 5, wherein the determining, by the image signal processor after receiving the K processing data sets each time, a region of interest in the image data acquired in real time according to the K processing data sets comprises:
after the image signal processor receives the K processing data sets each time, determining an initial region of interest in the image data acquired in real time according to the K processing data sets;
determining estimated offset information of the initial region of interest according to the K processing data sets and N historical processing data sets, wherein N is a positive integer;
and determining the region of interest in the image data acquired in real time according to the estimated offset information and the initial region of interest.
7. The image information processing method according to claim 6, wherein the estimated offset information includes a compensation area;
determining the region of interest in the image data acquired in real time according to the estimated offset information and the initial region of interest, including:
combining the compensation region and the initial region of interest to obtain the region of interest;
the exposing processing of the image data acquired in real time according to the region of interest includes:
and carrying out exposure processing on the image data acquired in real time according to the first weight corresponding to the compensation region and the second weight corresponding to the initial region of interest.
8. The image information processing method of claim 6, wherein the determining estimated offset information of the initial region of interest according to the K processing data sets and the N historical processing data sets comprises:
calculating a first moving speed of a target object corresponding to the history period according to the history period corresponding to the N historical processing data sets and the N historical processing data sets, wherein the target object is associated with the initial region of interest;
calculating a second moving speed of the target object in the target period according to the target period corresponding to the K processing data sets and the K processing data sets;
determining a target acceleration of the target object according to the first moving speed and the second moving speed;
and determining the estimated offset information of the initial region of interest according to the target acceleration.
9. The image information processing method according to claim 8, further comprising:
determining a first distance between a camera acquiring the image data and the target object in the historical period, and determining a second distance between the camera and the target object in the target period;
the calculating a first moving speed of the target object corresponding to the history period according to the history period corresponding to the N history processing data sets and the N history processing data sets includes:
calculating a first initial moving speed of the target object corresponding to the history period according to the history period corresponding to the N history processing data sets and the N history processing data sets;
obtaining the first moving speed according to the first initial moving speed, the first distance, and a preset distance-pixel displacement mapping relation;
the calculating a second moving speed of the target object in the target period according to the target period corresponding to the K processing data sets and the K processing data sets includes:
calculating a second initial moving speed of the target object corresponding to the target period according to the target period corresponding to the K processing data sets and the K processing data sets;
and obtaining the second moving speed according to the second initial moving speed, the second distance, and the preset distance-pixel displacement mapping relation.
10. A terminal device, comprising a first data transmission module, a second data transmission module, and an image signal processor, wherein:
the first data transmission module is configured to: the method comprises the steps that when image data are obtained each time, the image data are respectively sent to at least two image information identification devices, so that processing data which are respectively returned by the image information identification devices aiming at the image data are obtained;
the first data transmission module is further configured to: for each image information identification device, storing the latest processing data returned by the image information identification device in a preset memory space through the first data transmission module, and taking the latest processing data as the current processing data corresponding to the image information identification device;
the second data transmission module is configured to: acquiring processing data sets from the preset memory space in preset periods, wherein each preset period corresponds to one processing data set, one processing data set is included in the corresponding preset period, and the second data transmission module acquires current processing data corresponding to each image information identification device from the preset memory space;
the second data processing module is further configured to: and when K processing data sets are obtained accumulatively each time, sending the K processing data sets to an image signal processor, wherein K is a positive integer.
11. The terminal device according to claim 10, wherein the preset memory space includes memory subspaces corresponding to the respective image information recognition apparatuses one to one, and the memory subspaces are used to store the latest processing data returned by the corresponding image information recognition apparatuses, each memory subspace includes two paired memory blocks, one of the two paired memory blocks corresponds to the first identifier, and the other of the two paired memory blocks corresponds to the second identifier;
the first data transmission module is configured to:
for each image information recognition device, storing the latest processing data returned by the image information recognition device in a target memory block as the current processing data of the image information recognition device, wherein the target memory block is the memory block corresponding to the first identifier in the memory subspace corresponding to the image information recognition device, and after the storing, the target memory block stores no processing data of any round other than the latest round;
after the storing is finished, updating the identifier of the target memory block to the second identifier, and updating the identifier of the memory block paired with the target memory block to the first identifier;
the second data transmission module is configured to:
and acquiring each piece of current processing data from each memory block corresponding to the second identifier in the preset memory space in a preset period, to obtain the processing data set.
12. The terminal device of claim 10, wherein the second data transmission module is configured to:
sending real-time image data to the first data transmission module in a preset period;
the first data transmission module is configured to:
when image data sent by the second data transmission module are acquired, the image data are respectively sent to at least two image information identification devices, so as to acquire processing data respectively returned by each image information identification device for the image data;
the second data transmission module is further configured to:
and the second data transmission module acquires a processing data set from the preset memory space each time it sends real-time image data to the first data transmission module.
13. The terminal device according to claim 10, wherein a value of K is determined based on the preset period and an operation speed of the image signal processor.
14. The terminal device of any of claims 10 to 13, wherein the image signal processor is configured to:
after the K processing data sets are received each time, determining a region of interest in the image data acquired in real time according to the K processing data sets;
and according to the region of interest, carrying out exposure processing on the image data acquired in real time.
15. The terminal device of claim 14, wherein the image signal processor is to:
after the K processing data sets are received each time, determining an initial region of interest in the image data acquired in real time according to the K processing data sets;
determining estimated offset information of the initial region of interest according to the K processing data sets and N historical processing data sets, wherein N is a positive integer;
and determining the region of interest in the image data acquired in real time according to the estimated offset information and the initial region of interest.
16. The terminal device of claim 15, wherein the estimated offset information includes a compensation region;
the image signal processor is configured to:
combining the compensation region and the initial region of interest to obtain the region of interest;
and carrying out exposure processing on the image data acquired in real time according to the first weight corresponding to the compensation region and the second weight corresponding to the initial region of interest.
17. The terminal device of claim 15, wherein the image signal processor is to:
calculating a first moving speed of a target object corresponding to the history period according to the history period corresponding to the N historical processing data sets and the N historical processing data sets, wherein the target object is associated with the initial region of interest;
calculating a second moving speed of the target object in the target period according to the target period corresponding to the K processing data sets and the K processing data sets;
determining a target acceleration of the target object according to the first moving speed and the second moving speed;
and determining the estimated offset information of the initial region of interest according to the target acceleration.
18. The terminal device of claim 17, wherein the image signal processor is to:
determining a first distance between a camera acquiring the image data and the target object in the historical period, and determining a second distance between the camera and the target object in the target period;
calculating a first initial moving speed of the target object corresponding to the history period according to the history period corresponding to the N history processing data sets and the N history processing data sets;
obtaining the first moving speed according to the first initial moving speed, the first distance, and a preset distance-pixel displacement mapping relation;
the calculating a second moving speed of the target object in the target period according to the target period corresponding to the K processing data sets and the K processing data sets includes:
calculating a second initial moving speed of the target object corresponding to the target period according to the target period corresponding to the K processing data sets and the K processing data sets;
and obtaining the second moving speed according to the second initial moving speed, the second distance, and the preset distance-pixel displacement mapping relation.
19. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, implements the image information processing method according to any one of claims 1 to 9.
CN202080002368.5A 2020-10-19 2020-10-19 Image information processing method, terminal device, and computer-readable storage medium Active CN112425149B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/121894 WO2022082361A1 (en) 2020-10-19 2020-10-19 Image information processing method and terminal device

Publications (2)

Publication Number Publication Date
CN112425149A (en) 2021-02-26
CN112425149B (en) 2022-01-28

Family

ID=74783094

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080002368.5A Active CN112425149B (en) 2020-10-19 2020-10-19 Image information processing method, terminal device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN112425149B (en)
WO (1) WO2022082361A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6307974B1 (en) * 1997-06-23 2001-10-23 Canon Kabushika Kaisha Image processing apparatus, system, and method with adaptive transfer
US20060018563A1 (en) * 2004-01-16 2006-01-26 Ruggiero Carl J Video image processing with processing time allocation
US20110211726A1 (en) * 2007-11-30 2011-09-01 Cognex Corporation System and method for processing image data relative to a focus of attention within the overall image
CN202815869U (en) * 2012-09-11 2013-03-20 北京铁道工程机电技术研究所有限公司 Vehicle microcomputer image and video data extraction apparatus
CN103098077A (en) * 2010-07-07 2013-05-08 数字光学欧洲有限公司 Real-time video frame pre-processing hardware
US20150023602A1 (en) * 2013-07-19 2015-01-22 Kamil Wnuk Fast recognition algorithm processing, systems and methods
CN107181724A (en) * 2016-03-11 2017-09-19 华为技术有限公司 A kind of recognition methods for cooperateing with stream, system and the server using this method
US20180284737A1 (en) * 2016-05-09 2018-10-04 StrongForce IoT Portfolio 2016, LLC Methods and systems for detection in an industrial internet of things data collection environment with large data sets
US20200068002A1 (en) * 2018-08-23 2020-02-27 Apple Inc. Synchronized wireless and video operations

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2260195C (en) * 1996-06-28 2003-09-23 T. Eric Hopkins Image acquisition system
KR20180052002A (en) * 2016-11-09 2018-05-17 삼성전자주식회사 Method for Processing Image and the Electronic Device supporting the same
CN111524159B (en) * 2019-02-01 2024-07-19 北京京东乾石科技有限公司 Image processing method and apparatus, storage medium, and processor
CN110070083A (en) * 2019-04-24 2019-07-30 深圳市微埃智能科技有限公司 Image processing method, device, electronic equipment and computer readable storage medium
CN110995994B (en) * 2019-12-09 2021-09-14 上海瑾盛通信科技有限公司 Image shooting method and related device
CN111597953A (en) * 2020-05-12 2020-08-28 杭州宇泛智能科技有限公司 Multi-path image processing method and device and electronic equipment

Also Published As

Publication number Publication date
WO2022082361A1 (en) 2022-04-28
CN112425149B (en) 2022-01-28

Similar Documents

Publication Publication Date Title
CN111210429B (en) Point cloud data partitioning method and device and obstacle detection method and device
US9811732B2 (en) Systems and methods for object tracking
TWI770420B (en) Vehicle accident identification method and device, electronic equipment
CN111582054B (en) Point cloud data processing method and device and obstacle detection method and device
CN111553946B (en) Method and device for removing ground point cloud and method and device for detecting obstacle
CN111783905B (en) Target fusion method and device, storage medium and electronic equipment
TW201539378A (en) Object detection system
CN114862828A (en) Light spot searching method and device, computer readable medium and electronic equipment
CN107341460B (en) Face tracking method and device
US20200090309A1 (en) Method and device for denoising processing, storage medium, and terminal
CN112425149B (en) Image information processing method, terminal device, and computer-readable storage medium
CN117934888A (en) Data aggregation method, system, device and storage medium
CN116935640A (en) Road side sensing method, device, equipment and medium based on multiple sensors
CN114827464B (en) Target tracking method and system based on mobile camera
CN114387324A (en) Depth imaging method, depth imaging device, electronic equipment and computer readable storage medium
CN113658251A (en) Distance measuring method, device, electronic equipment, storage medium and system
CN113936316A (en) DOE (DOE-out-of-state) detection method, electronic device and computer-readable storage medium
CN115457282A (en) Point cloud data processing method and device
CN113167578A (en) Distance measuring method and device
CN114694375B (en) Traffic monitoring system, traffic monitoring method, and storage medium
CN112991210B (en) Image processing method and device, computer readable storage medium and electronic equipment
CN114783041B (en) Target object recognition method, electronic device, and computer-readable storage medium
US20180001821A1 (en) Environment perception using a surrounding monitoring system
CN112700657B (en) Method and device for generating detection information, road side equipment and cloud control platform
KR101807541B1 (en) Census pattern generating method for stereo matching

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant