
CN115734088B - Image jelly effect eliminating method and related device - Google Patents

Image jelly effect eliminating method and related device

Info

Publication number
CN115734088B
CN115734088B (application number CN202211335568.XA)
Authority
CN
China
Prior art keywords
aps
image
images
sub
exposure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211335568.XA
Other languages
Chinese (zh)
Other versions
CN115734088A (en)
Inventor
尹程龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Ruishi Zhixin Technology Co ltd
Original Assignee
Shenzhen Ruishi Zhixin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ruishi Zhixin Technology Co ltd filed Critical Shenzhen Ruishi Zhixin Technology Co ltd
Priority to CN202211335568.XA priority Critical patent/CN115734088B/en
Publication of CN115734088A publication Critical patent/CN115734088A/en
Application granted granted Critical
Publication of CN115734088B publication Critical patent/CN115734088B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)

Abstract

The application provides an image jelly effect eliminating method and a related device. The image jelly effect eliminating method comprises the following steps: acquiring the overall exposure time interval corresponding to an APS image to be processed that is acquired by an APS image sensor; acquiring target event stream data synchronously collected by an EVS image sensor within that overall exposure time interval; calculating, from the target event stream data, instantaneous APS images of the different rows of sub APS images in the APS image to be processed at the same world time; and synthesizing the plurality of instantaneous APS images corresponding to the different rows of sub APS images to obtain an APS image with the jelly effect eliminated. By implementing this scheme, the de-jelly processing of the APS image refers to the event stream data synchronously collected by the EVS image sensor during the exposure time of the APS image sensor, without relying on the APS image sensor's own hardware configuration; the method can therefore be widely applied to de-jelly processing for APS image sensors of different configurations, improving the feasibility of jelly effect elimination.

Description

Image jelly effect eliminating method and related device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for eliminating an image jelly effect.
Background
Image sensor exposure modes fall into two categories: global exposure (global shutter) and rolling shutter exposure. In the global exposure mode, all pixels of the frame are exposed simultaneously; in the rolling shutter mode, the frame is exposed by progressive line-by-line scanning. Compared with the global exposure mode, the rolling shutter mode permits longer exposure times, so its low-light performance and noise control are better.
Currently, most image sensors use the rolling shutter exposure mode. Because of the time difference between the exposure start moments of successive rows, images of moving scenes captured in this mode exhibit tilting and twisting, known as the jelly effect. In the related art, the jelly effect can be alleviated by raising the frame rate of the image sensor, but generally only some image sensors in high-end or professional cameras support 120 or 240 frames per second. On the one hand, raising the frame rate cannot completely eliminate the time difference between row exposures; on the other hand, the frame rate of most cameras only supports 60 frames per second. The jelly effect elimination approach of raising the sensor frame rate provided in the related art therefore still has substantial application limitations.
Disclosure of Invention
The embodiments of the present application provide an image jelly effect eliminating method and a related device, which can at least solve the substantial application limitations of the related-art approach of eliminating the jelly effect by raising the frame rate of the image sensor.
A first aspect of an embodiment of the present application provides a method for eliminating an image jelly effect, including: acquiring the overall exposure time interval corresponding to an APS image to be processed that is acquired by an APS image sensor; acquiring target event stream data synchronously collected by an EVS image sensor within the overall exposure time interval; calculating, from the target event stream data, instantaneous APS images of the different rows of sub APS images in the APS image to be processed at the same world time; and synthesizing the plurality of instantaneous APS images corresponding to the different rows of sub APS images to obtain an APS image with the jelly effect eliminated.
A second aspect of an embodiment of the present application provides an image jelly effect eliminating apparatus, including: a first acquisition module, configured to acquire the overall exposure time interval corresponding to the APS image to be processed that is acquired by the APS image sensor; a second acquisition module, configured to acquire the target event stream data synchronously collected by the EVS image sensor within the overall exposure time interval; a calculation module, configured to calculate, from the target event stream data, instantaneous APS images of the different rows of sub APS images in the APS image to be processed at the same world time; and a synthesis module, configured to synthesize the plurality of instantaneous APS images corresponding to the different rows of sub APS images to obtain an APS image with the jelly effect eliminated.
A third aspect of an embodiment of the present application provides an electronic device, including a memory and a processor, wherein the processor is configured to execute a computer program stored in the memory; when the processor executes the computer program, the steps of the image jelly effect eliminating method provided by the first aspect of the embodiment of the present application are implemented.
A fourth aspect of the embodiment of the present application provides a computer readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements the steps in the image jelly effect eliminating method provided in the first aspect of the embodiment of the present application.
From the above, according to the image jelly effect eliminating method and the related device provided by the scheme of the application: the overall exposure time interval corresponding to the APS image to be processed, acquired by the APS image sensor, is obtained; target event stream data synchronously collected by the EVS image sensor within that interval are acquired; instantaneous APS images of the different rows of sub APS images in the APS image to be processed at the same world time are calculated from the target event stream data; and the plurality of instantaneous APS images corresponding to the different rows are synthesized to obtain the APS image with the jelly effect eliminated. By implementing this scheme, the de-jelly processing of the APS image refers to the event stream data synchronously collected by the EVS image sensor during the APS exposure time, without relying on the APS image sensor's own hardware configuration; the method can therefore be widely applied to de-jelly processing for APS image sensors of different configurations, improving the feasibility of jelly effect elimination.
Drawings
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 3 is a basic flow chart of an image jelly effect eliminating method according to an embodiment of the present application;
FIG. 4 is a schematic view of rolling shutter exposure according to an embodiment of the present application;
FIG. 5 is a schematic view of another rolling shutter exposure according to an embodiment of the present application;
FIG. 6 is a detailed flowchart of an image jelly effect eliminating method according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a program module of an image jelly effect eliminating device according to an embodiment of the application.
Detailed Description
In order to make the objects, features and advantages of the present application more comprehensible, the technical solutions in the embodiments of the present application will be clearly described in conjunction with the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In the description of the embodiments of the present application, it should be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on the orientation or positional relationships shown in the drawings, merely to facilitate description of the embodiments of the present application and simplify description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present application.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the embodiments of the present application, the meaning of "plurality" is two or more, unless explicitly defined otherwise.
In the embodiments of the present application, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured" and the like are to be construed broadly, including, for example, permanently connected, removably connected, or integrally formed; mechanically or electrically connected; directly connected or indirectly connected through an intermediate medium; and may denote internal communication between two elements or an interaction relationship between the two elements. The specific meaning of the above terms in the embodiments of the present application will be understood by those of ordinary skill in the art according to the specific circumstances.
An image jelly effect eliminating method and related device according to an embodiment of the present application will be described in detail with reference to the accompanying drawings.
In order to solve the application limitations of the related-art approach of eliminating the jelly effect by raising the image sensor frame rate, an embodiment of the present application provides an image jelly effect eliminating method applied to the scenario shown in fig. 1, which may include an APS (Active-Pixel Sensor) camera 101, an EVS (Event-based Vision Sensor) camera 102, and an electronic device 103.
It should be noted that the active pixel sensor is a commonly used image sensor in which each pixel cell has a photodetector and at least one active transistor. In a metal-oxide-semiconductor (MOS) active pixel sensor, a MOS field-effect transistor (MOSFET) serves as the amplifier; APS variants include the early NMOS type and the now more common complementary MOS (CMOS) type. The event-based vision sensor is a novel sensor modeled on the human retina: each pixel emits a pulse in response to a brightness change produced by motion. The sensor can therefore capture scene brightness (i.e. light intensity) changes at an extremely high equivalent frame rate, recording events at specific times and positions in the image and producing an event stream instead of a frame stream, which mitigates the information redundancy, data storage, and real-time processing burdens of a conventional camera.
It should be noted that the APS image sensor and the EVS image sensor of this embodiment may be discrete image sensors or a single integrated image sensor. In the integrated case, the whole photosensitive area of the sensor is divided into several sub-areas whose pixel arrays correspond to the APS data mode and the EVS data mode respectively; compared with several separately arranged sensor modules, this effectively reduces the device volume and facilitates miniaturization of the overall hardware architecture.
In addition, the electronic device 103 is a variety of terminal devices having data processing functions, including, but not limited to, a smart phone, a tablet computer, a laptop portable computer, a desktop computer, and the like.
In the application scenario shown in fig. 1, APS images may be acquired by the APS camera 101 and event stream data by the EVS camera 102, and the acquired data are then transmitted to the electronic device 103. The electronic device 103 executes the following image jelly effect elimination flow on the received APS image to be processed and event stream data: first, acquire the overall exposure time interval corresponding to the APS image to be processed; then, acquire the target event stream data corresponding to the overall exposure time interval; next, calculate, from the target event stream data, the instantaneous APS images of the different rows of sub APS images in the APS image to be processed at the same world time; finally, synthesize the instantaneous APS images corresponding to the different rows to obtain the APS image with the jelly effect eliminated.
Fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device mainly comprises a memory 201 and one or more processors 202; the memory 201 stores a computer program 203 that can run on the processor 202 and is communicatively connected to the processor 202, and the processor 202 implements the flow of the image jelly effect eliminating method when executing the computer program 203.
It should be noted that the memory 201 may be a high-speed random access memory (RAM) or a non-volatile memory, such as disk storage. The memory 201 is used to store executable program code, and the processor 202 is coupled to the memory 201.
An embodiment of the present application further provides a computer readable storage medium, which may be provided in the foregoing electronic device, and the computer readable storage medium may be a memory in the foregoing embodiment shown in fig. 2.
The computer readable storage medium stores a computer program which, when executed by a processor, implements the flow of the aforementioned image jelly effect elimination method. Further, the computer-readable medium may be any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a RAM, a magnetic disk, or an optical disk.
Fig. 3 is a basic flowchart of an image jelly effect eliminating method according to an embodiment of the present application, where the image jelly effect eliminating method may be executed by the electronic device in fig. 1 or fig. 2, and specifically includes the following steps:
Step 301, acquiring an integral exposure time interval corresponding to an APS image to be processed acquired by an APS image sensor.
Fig. 4 is a schematic view of rolling shutter exposure according to the present embodiment; the image height in fig. 4 can be understood as the total number of exposure rows, for example 12 rows as shown. In practical application, when the APS image sensor works in the rolling shutter mode, it is exposed row by row within a preset overall exposure time interval to acquire a single-frame APS image, which serves as the APS image to be processed. It should be appreciated that the overall exposure time interval of this embodiment is bounded by the exposure start time of the first row and the exposure termination time of the last row.
In an optional implementation of this embodiment, the step of acquiring the overall exposure time interval corresponding to the APS image to be processed acquired by the APS image sensor comprises: calculating the last-row exposure termination time from the single-row exposure duration, the exposure time difference between adjacent rows, the first-row exposure start time, and the total number of exposure rows of the APS image to be processed; and defining the overall exposure time interval corresponding to the APS image to be processed from the first-row exposure start time and the last-row exposure termination time.
The calculation formula of the last-row exposure termination time is expressed as follows:

t_end^{h-1} = t_start^{0} + (h − 1) · t_gap + t_ex

wherein t_end^{h-1} denotes the exposure termination time of the last row, t_ex denotes the single-row exposure duration, t_gap denotes the exposure time difference between adjacent rows, t_start^{0} denotes the first-row exposure start time, and h denotes the total number of exposure rows.
Referring again to fig. 4, it should be understood that the exposure start times of different rows differ; the exposure time difference between adjacent rows in this embodiment is the difference between the exposure start times of a row and the next row. Since the first-row exposure start time t_start^{0} is known, once the last-row exposure termination time t_end^{h-1} has been calculated, the overall exposure time interval corresponding to the APS image to be processed is obtained as [t_start^{0}, t_end^{h-1}].
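As an illustrative aid (not part of the patent text), the interval computation above can be sketched in Python; the function and variable names are hypothetical:

```python
def overall_exposure_interval(t_start0, t_ex, t_gap, h):
    """Return (start, end) of the whole-frame exposure interval.

    t_start0 : exposure start time of the first row
    t_ex     : single-row exposure duration
    t_gap    : exposure start-time difference between adjacent rows
    h        : total number of exposure rows
    """
    # Row n starts at t_start0 + n * t_gap; the last row is n = h - 1,
    # and its exposure terminates t_ex after it starts.
    t_end_last = t_start0 + (h - 1) * t_gap + t_ex
    return (t_start0, t_end_last)

# Example: 12 rows, 2 ms per-row exposure, 0.1 ms row-to-row offset
start, end = overall_exposure_interval(0.0, 2e-3, 1e-4, 12)
```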
Step 302, acquiring target event stream data synchronously acquired by the EVS image sensor based on the whole exposure time interval.
Specifically, in practical applications event data exist independently at a very high rate: whether or not the instant falls within the exposure time of the APS image sensor, the EVS image sensor outputs event data whenever a light intensity (i.e. brightness) change occurs. This embodiment therefore needs to align the APS image with the event stream, that is, extract from the full event stream the specific event data corresponding to the overall exposure time interval of the APS image.
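A minimal sketch of this alignment step (an illustration, not the patent's own code; the (x, y, p, t) event layout follows the definition given below, and the function name is an assumption):

```python
def select_target_events(events, interval):
    """Keep only the events whose timestamp falls inside the APS frame's
    overall exposure interval.

    events   : iterable of (x, y, p, t) tuples from the EVS sensor
    interval : (t_begin, t_end) overall exposure interval of the APS frame
    """
    t_begin, t_end = interval
    return [e for e in events if t_begin <= e[3] <= t_end]
```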
In the present embodiment, the event stream data of the EVS image sensor is described by the following formulation:

p = +1, if log(L_{x,y}(t1)) − log(L_{x,y}(t0)) ≥ c
p = −1, if log(L_{x,y}(t1)) − log(L_{x,y}(t0)) ≤ −c
(no event otherwise)

It should be understood that one event datum is denoted (x, y, p, t), where x, y are the pixel coordinates, p is the event polarity, t is the timestamp, L(t) is the light intensity at time t, and c is a preset contrast threshold.
In practical applications, the EVS image sensor comprises a pixel array in which each pixel operates independently and outputs an event when the detected light intensity change reaches the threshold. In this embodiment, when log(L_{x,y}(t1)) − log(L_{x,y}(t0)) ≥ c, the EVS image sensor generates a positive event (i.e. an UP event) with p = 1, indicating that the light intensity at the current instant is stronger than at the previous instant; when log(L_{x,y}(t1)) − log(L_{x,y}(t0)) ≤ −c, it generates a negative event (i.e. a DN event) with p = −1, indicating that the light intensity at the current instant has weakened relative to the previous instant; when −c < log(L_{x,y}(t1)) − log(L_{x,y}(t0)) < c, the EVS image sensor generates no event.
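The thresholded log-intensity rule above can be sketched as follows for one pixel (an illustration under the stated model; names are assumptions):

```python
import math

def event_polarity(L_prev, L_curr, c):
    """Return +1 (UP event), -1 (DN event) or None (no event) for one pixel.

    L_prev, L_curr : light intensity at the previous / current instant
    c              : the sensor's contrast threshold
    """
    d = math.log(L_curr) - math.log(L_prev)
    if d >= c:
        return 1      # brightness increased enough: positive event
    if d <= -c:
        return -1     # brightness decreased enough: negative event
    return None       # change below threshold: no event fired
```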
Step 303, respectively calculating instantaneous APS images of different rows of sub APS images in the APS images to be processed at the same world time according to the target event stream data.
Fig. 5 is a schematic view of another rolling shutter exposure provided in this embodiment. The world time (see t_world in fig. 5) is absolute time; since the exposure start times of the sub APS images in different rows differ, the relative times corresponding to the sub APS images of different rows at the same world time also differ.
In this embodiment, a single row of sub APS image can be formulated as:

B_n = (1 / t_ex) ∫_{t_start^{n}}^{t_end^{n}} L_n(t) dt

wherein B_n denotes the nth-row sub APS image, t_start^{n} and t_end^{n} denote the row exposure start time and the row exposure termination time of the nth-row sub APS image respectively, t_ex denotes the single-row exposure duration, and L_n(t) denotes the instantaneous APS image of the nth-row sub APS image at time t.
It should be noted that, with the single-row exposure duration t_ex, the exposure time difference between adjacent rows t_gap, the 0th-row exposure start time t_start^{0} and its termination time t_end^{0} = t_start^{0} + t_ex, the exposure start time of the nth row is t_start^{n} = t_start^{0} + n · t_gap and its termination time is t_end^{n} = t_start^{0} + n · t_gap + t_ex.
In an optional implementation of this embodiment, for each row of sub APS images of the APS image to be processed, event integration data corresponding to that row are determined from the row exposure time interval and the target event stream data; the instantaneous APS images of the different rows of sub APS images at the same world time are then calculated from the event integration data corresponding to the respective rows.
Specifically, in this embodiment, the integration is performed on the event polarities in the target event stream data over the row exposure time interval, from the row exposure start time to the row exposure termination time, yielding the event integration data corresponding to each row of sub APS images. The integration formula of this embodiment is expressed as:

E_n(t) = ∫_{t_world}^{t} p(s) ds,  t ∈ [t_start^{n}, t_end^{n}]

wherein E_n(t) denotes the event integration data, p denotes the event polarity, t_world denotes the selected world time, and t_start^{n} and t_end^{n} denote the row exposure start time and the row exposure termination time of the nth-row sub APS image respectively.
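With discrete events, the integral of the polarity reduces to a signed count of events between the two instants; a sketch of this interpretation (the reference time t_world and all names are assumptions, not the patent's code):

```python
def event_integral(row_events, t_world, t):
    """Discrete E_n(t): integrate event polarity from t_world to t.

    row_events : list of (p, ts) events observed on row n
    t_world    : reference world time
    t          : query time inside the row's exposure interval
    Integrating backwards in time (t < t_world) flips the sign.
    """
    lo, hi, sign = (t_world, t, 1) if t >= t_world else (t, t_world, -1)
    return sign * sum(p for p, ts in row_events if lo <= ts <= hi)
```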
It is noted that from the above formulated description of the event stream data one obtains log(L_{x,y}(t1) / L_{x,y}(t0)) = c · p; combining this with the above integration formula further gives:

L_n(t) = L_n(t_world) · exp(c · E_n(t))

Substituting this into the formulated description of the single-row sub APS image described above then gives:

B_n = (L_n(t_world) / t_ex) ∫_{t_start^{n}}^{t_end^{n}} exp(c · E_n(t)) dt

Let J_n = ∫_{t_start^{n}}^{t_end^{n}} exp(c · E_n(t)) dt; then it is possible to obtain:

L_n(t_world) = B_n · t_ex / J_n
Next, from the event integration data corresponding to the different rows of sub APS images, the instantaneous APS images of the different rows at the same world time are calculated using a preset instantaneous image calculation formula, expressed as:

L_n(t_world) = B_n · t_ex / ∫_{t_start^{n}}^{t_end^{n}} exp(c · E_n(t)) dt

wherein L_n(t_world) denotes the instantaneous APS image of the nth-row sub APS image at world time t_world, B_n denotes the nth-row sub APS image, E_n(t) denotes the event integration data, t_start^{n} and t_end^{n} denote the row exposure start time and the row exposure termination time of the nth-row sub APS image respectively, t_ex denotes the single-row exposure duration of the APS image to be processed, and c denotes the contrast threshold corresponding to the light intensity variation of the EVS image sensor. It should be noted that L_n(t) and E_n(t) are intermediate quantities in this embodiment and have no direct practical counterpart.
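Under this reading, a minimal numerical sketch of the instantaneous-image formula (midpoint-rule integration; all names and the discretization are assumptions, not the patent's implementation):

```python
import math

def instantaneous_row(B_n, row_events, t_world, t_s, t_e, c, steps=100):
    """Recover the row's instantaneous intensity at t_world from its
    exposure B_n via L_n(t_world) = B_n * t_ex / integral of exp(c * E_n(t)).

    row_events : list of (p, ts) events observed on this row
    t_s, t_e   : row exposure start / termination times
    c          : contrast threshold of the EVS sensor
    """
    t_ex = t_e - t_s
    dt = t_ex / steps
    J = 0.0
    for k in range(steps):
        t = t_s + (k + 0.5) * dt          # midpoint sample
        # discrete E_n(t): signed count of events between t_world and t
        E = sum(p for p, ts in row_events
                if min(t_world, t) <= ts <= max(t_world, t))
        if t < t_world:
            E = -E
        J += math.exp(c * E) * dt
    return B_n * t_ex / J
```

With no events the integral equals t_ex and the row is returned unchanged, matching the intuition that a static row needs no correction.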
And step 304, synthesizing a plurality of instant APS images corresponding to the APS images of different rows to obtain the APS image with the jelly effect eliminated.
Specifically, for all rows of this embodiment, the instantaneous images at the same world time are calculated by the above instantaneous image calculation formula; since all rows are thereby sampled at the same instant, the resulting instantaneous images of the different rows exhibit no jelly effect, i.e. the jelly effect is eliminated.
Further, in this embodiment, several jelly-effect-free APS images corresponding to different world times within the overall exposure time interval may be obtained, and these jelly-effect-free APS images may be synthesized to obtain the final jelly-effect-free APS image.
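The patent text does not specify how the several jelly-free frames are combined; per-pixel averaging is one simple assumption, sketched here with each frame represented as a list of per-row values:

```python
def fuse_frames(frames):
    """Combine several jelly-effect-free frames (computed at different
    world times) by per-row/per-pixel averaging. This averaging rule is
    an assumption, not taken from the patent."""
    h = len(frames[0])
    return [sum(f[r] for f in frames) / len(frames) for r in range(h)]
```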
Fig. 6 shows a refined image jelly effect eliminating method according to an embodiment of the present application, which includes:
Step 601: calculate the last-row exposure termination time from the single-row exposure duration, the exposure time difference between adjacent rows, the first-row exposure start time, and the total number of exposure rows of the APS image to be processed acquired by the APS image sensor.
Specifically, the calculation formula of the last-row exposure termination time in this embodiment is expressed as:

t_end^{h-1} = t_start^{0} + (h − 1) · t_gap + t_ex

wherein t_end^{h-1} denotes the exposure termination time of the last row, t_ex denotes the single-row exposure duration, t_gap denotes the exposure time difference between adjacent rows, t_start^{0} denotes the first-row exposure start time, and h denotes the total number of exposure rows.
Step 602: define the overall exposure time interval corresponding to the APS image to be processed from the first-row exposure start time and the last-row exposure termination time.
And 603, acquiring target event stream data synchronously acquired by the EVS image sensor based on the whole exposure time interval.
Step 604, for each row of sub APS images of the APS image to be processed, performing integral operation on the corresponding event polarity in the target event stream data according to the row exposure start time and the row exposure end time in the row exposure time interval, to obtain event integral data corresponding to each row of sub APS images.
Specifically, the single-row sub APS image of this embodiment is formulated as:

B_n = (1 / t_ex) ∫_{t_start^{n}}^{t_end^{n}} L_n(t) dt

wherein B_n denotes the nth-row sub APS image, t_start^{n} and t_end^{n} denote the row exposure start time and the row exposure termination time of the nth-row sub APS image respectively, t_ex denotes the single-row exposure duration, and L_n(t) denotes the instantaneous APS image of the nth-row sub APS image at time t.
In addition, the integration formula of this embodiment is expressed as:

E_n(t) = ∫_{t_world}^{t} p(s) ds,  t ∈ [t_start^{n}, t_end^{n}]

wherein E_n(t) denotes the event integration data, p denotes the event polarity, t_world denotes the selected world time, and t_start^{n} and t_end^{n} denote the row exposure start time and the row exposure termination time of the nth-row sub APS image respectively.
Step 605: calculate the instantaneous APS images of the different rows of sub APS images at the same world time using the preset instantaneous image calculation formula, from the event integration data corresponding to the different rows of sub APS images.
Specifically, the instantaneous image calculation formula of this embodiment is expressed as:

L_n(t_world) = B_n · t_ex / ∫_{t_start^{n}}^{t_end^{n}} exp(c · E_n(t)) dt

wherein L_n(t_world) denotes the instantaneous APS image of the nth-row sub APS image at world time t_world, B_n denotes the nth-row sub APS image, E_n(t) denotes the event integration data, t_start^{n} and t_end^{n} denote the row exposure start time and the row exposure termination time of the nth-row sub APS image respectively, t_ex denotes the single-row exposure duration of the APS image to be processed, and c denotes the contrast threshold corresponding to the light intensity variation of the EVS image sensor. It should be noted that L_n(t) and E_n(t) are intermediate quantities in this embodiment and have no direct practical counterpart.
Step 606: synthesize the plurality of instantaneous APS images corresponding to the different rows of sub APS images to obtain an APS image with the jelly effect eliminated.
Step 607, synthesizing a plurality of APS images with jelly effects removed corresponding to different world times in the whole exposure time interval, to obtain a whole APS image with jelly effects removed.
It should be understood that, the sequence number of each step in this embodiment does not mean the order of execution of the steps, and the execution order of each step should be determined by its functions and internal logic, and should not be construed as a unique limitation on the implementation process of the embodiment of the present application.
Fig. 7 is a schematic diagram of an apparatus for eliminating image jelly effect according to an embodiment of the present application. The image jelly effect eliminating apparatus can be used to realize the image jelly effect eliminating method in the foregoing embodiment, and mainly includes:
a first obtaining module 701, configured to obtain an overall exposure time interval corresponding to an APS image to be processed, which is collected by an APS image sensor;
A second acquiring module 702, configured to acquire target event stream data synchronously acquired by the EVS image sensor based on the overall exposure time interval;
A calculating module 703, configured to calculate, from the target event stream data, instantaneous APS images of the different rows of sub APS images in the APS image to be processed at the same world time;
And a synthesizing module 704, configured to synthesize the plurality of instantaneous APS images corresponding to the different rows of sub APS images to obtain an APS image with the jelly effect eliminated.
In some implementations of this embodiment, the first obtaining module is specifically configured to: calculate the final line exposure termination time according to the single line exposure duration, the exposure time difference between adjacent lines, the initial line exposure start time and the total number of exposure lines of the APS image to be processed collected by the APS image sensor; and define the overall exposure time interval corresponding to the APS image to be processed based on the initial line exposure start time and the final line exposure termination time. The calculation formula of the final line exposure termination time is expressed as:

$$t_h^{end}=t_1^{start}+(h-1)\,t_{gap}+t_{ex}$$

wherein $t_h^{end}$ represents the final line exposure termination time, $t_{ex}$ represents the single line exposure duration, $t_{gap}$ represents the exposure time difference between adjacent lines, $t_1^{start}$ represents the initial line exposure start time, and $h$ represents the total number of exposure lines.
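The rolling-shutter timing described above can be sketched as follows. Symbol names mirror the patent's notation ($t_{ex}$, $t_{gap}$, $t_1^{start}$, $h$); the concrete sensor values are illustrative assumptions, not taken from the application:

```python
def row_exposure_interval(n, t1_start, t_ex, t_gap):
    """Exposure window [start, end] of the n-th row (1-indexed) under a
    rolling shutter: each row starts t_gap later than the previous one."""
    start = t1_start + (n - 1) * t_gap
    return start, start + t_ex

def overall_exposure_interval(t1_start, t_ex, t_gap, h):
    """Overall interval: from the first row's exposure start to the last
    row's exposure end, i.e. t_h_end = t1_start + (h - 1) * t_gap + t_ex."""
    first_start, _ = row_exposure_interval(1, t1_start, t_ex, t_gap)
    _, last_end = row_exposure_interval(h, t1_start, t_ex, t_gap)
    return first_start, last_end

# Illustrative values: 1080 rows, 4 ms per-row exposure, 10 us row offset
start, end = overall_exposure_interval(0.0, 4e-3, 10e-6, 1080)
```

The event stream read out by the second acquiring module would then be cropped to this `[start, end]` interval before any per-row processing.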
In some implementations of this embodiment, the formulation of a single row of sub APS images is described as:
$$B_n=\frac{1}{t_{ex}}\int_{t_n^{start}}^{t_n^{end}} L_n(t)\,dt$$

wherein $B_n$ denotes the nth row sub APS image, $t_n^{start}$ and $t_n^{end}$ respectively represent the row exposure start time and the row exposure end time of the nth row sub APS image, $t_{ex}$ represents the single row exposure time, and $L_n(t)$ represents the instantaneous APS image of the nth row sub APS image at time $t$.
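A minimal numerical sketch of this blur model: a captured row is the time-average of the latent instantaneous rows over its exposure window, so averaging uniformly sampled instantaneous rows approximates $B_n$ (the array shapes here are illustrative assumptions):

```python
import numpy as np

def blurry_row(instant_rows):
    """Discrete version of B_n = (1/t_ex) * integral of L_n(t) dt:
    the mean of instantaneous rows sampled uniformly over the exposure."""
    return np.mean(np.stack(instant_rows), axis=0)

# A static scene: every instantaneous row is identical, so the
# "blurry" row equals the instantaneous one (no motion blur).
rows = [np.array([0.2, 0.5, 0.8]) for _ in range(10)]
b = blurry_row(rows)
```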
In some implementations of this embodiment, the calculating module is specifically configured to: for each row of sub APS images of the APS image to be processed, determine the event integration data corresponding to that row of sub APS images according to the row exposure time interval and the target event stream data; and respectively calculate the instantaneous APS images of the sub APS images of different rows at the same universal time according to the event integration data corresponding to the sub APS images of different rows.
Further, in some implementations of this embodiment, the calculating module is specifically configured to: perform an integral operation on the corresponding event polarities in the target event stream data according to the row exposure start time and the row exposure end time of the row exposure time interval, to obtain the event integration data corresponding to each row of sub APS images; and respectively calculate the instantaneous APS images of the sub APS images of different rows at the same universal time by adopting a preset instantaneous image calculation formula according to the event integration data corresponding to the sub APS images of different rows.
Wherein, the integral operation formula is expressed as:
$$E_n(t)=\int_{t_0}^{t} p(s)\,ds,\qquad t\in\left[t_n^{start},\,t_n^{end}\right]$$

wherein $E_n(t)$ represents the event integration data, $p$ represents the event polarity, $t_0$ represents the selected universal time, and $t_n^{start}$ and $t_n^{end}$ respectively represent the line exposure start time and the line exposure end time of the nth line sub APS image.
The instantaneous image calculation formula is expressed as:
$$L_n(t_0)=\frac{t_{ex}\,B_n}{\displaystyle\int_{t_n^{start}}^{t_n^{end}} e^{\,c\,E_n(t)}\,dt}$$

wherein $L_n(t_0)$ represents the instantaneous APS image of the nth row sub APS image at universal time $t_0$, $B_n$ denotes the nth row sub APS image, $E_n(t)$ denotes the event integration data, $t_n^{start}$ and $t_n^{end}$ respectively represent the row exposure start time and the row exposure end time of the nth row sub APS image, $t_{ex}$ represents the single row exposure time of the APS image to be processed, and $c$ represents the contrast threshold of the EVS image sensor corresponding to the light intensity variation.
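Putting the blur model and the event integral together, a per-pixel sketch of this EDI-style reconstruction follows. The $t_0$-anchored event integral, the sampling resolution `n_steps`, and the event layout are assumptions for illustration, not the application's actual implementation:

```python
import numpy as np

def instantaneous_pixel(B_n, ev_t, ev_p, t0, t_start, t_end, c, n_steps=200):
    """Estimate L_n(t0) for one pixel of row n from its blurry value B_n
    and the events (timestamps ev_t, polarities ev_p in {-1, +1}) that
    this pixel fired. Implements L_n(t0) = t_ex * B_n / integral of
    exp(c * E_n(t)) dt over the row's exposure window."""
    ev_t = np.asarray(ev_t, dtype=float)
    ev_p = np.asarray(ev_p, dtype=float)

    def E(t):
        # E_n(t): signed sum of event polarities between t0 and t
        if t >= t0:
            return ev_p[(ev_t >= t0) & (ev_t < t)].sum()
        return -ev_p[(ev_t >= t) & (ev_t < t0)].sum()

    t_ex = t_end - t_start
    dt = t_ex / n_steps
    # Midpoint Riemann sum of exp(c * E_n(t)) over [t_start, t_end]
    ts = t_start + (np.arange(n_steps) + 0.5) * dt
    integral = sum(np.exp(c * E(t)) for t in ts) * dt
    return t_ex * B_n / integral

# With no events the scene is static over the exposure, E_n(t) = 0,
# the integral equals t_ex, and the estimate reduces to B_n itself.
L0 = instantaneous_pixel(0.5, [], [], t0=0.002, t_start=0.0,
                         t_end=0.004, c=0.2)
```

Running the same computation for every pixel of every row at a common $t_0$ yields the per-row instantaneous images that the synthesizing module stacks into one frame.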
In some implementations of this embodiment, the image jelly effect eliminating apparatus further includes a third obtaining module, configured to obtain a plurality of APS images with the jelly effect eliminated, corresponding to different universal times within the overall exposure time interval; correspondingly, the synthesizing module is further configured to: synthesize the plurality of APS images with the jelly effect eliminated to obtain an overall APS image with the jelly effect eliminated.
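The loop implied by the third obtaining module can be sketched as follows: given any per-row instantaneous-image estimator, rebuild every row at each chosen universal time and stack the rows into global-shutter-like frames. The callable interface here is a hypothetical illustration, not the apparatus's actual API:

```python
import numpy as np

def dejellied_frames(reconstruct_row, h, t_refs):
    """reconstruct_row(n, t0) -> 1-D array: the instantaneous n-th row at
    universal time t0. Returns one jelly-free frame per reference time;
    the sequence over t_refs spans the overall exposure interval."""
    return [np.stack([reconstruct_row(n, t0) for n in range(1, h + 1)])
            for t0 in t_refs]

# Toy estimator: row intensity depends only on the row index, so every
# reference time yields the same 3x4 frame.
frames = dejellied_frames(lambda n, t0: np.full(4, float(n)), h=3,
                          t_refs=[0.001, 0.002])
```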
It should be noted that the image jelly effect eliminating method in the foregoing embodiments may be implemented based on the image jelly effect eliminating apparatus provided in this embodiment. Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working process of the image jelly effect eliminating apparatus described in this embodiment, reference may be made to the corresponding process in the foregoing method embodiments, which will not be repeated here.
Based on the technical solution of the embodiments of the present application, the jelly effect removal processing is performed on the APS image with reference to the event stream data synchronously collected by the EVS image sensor within the exposure time of the APS image sensor, without relying on the hardware configuration of the APS image sensor itself. The method can therefore be widely applied to jelly effect removal for APS image sensors of different configurations, improving the feasibility of the jelly effect removal processing.
It should be noted that the apparatus and methods disclosed in the several embodiments provided by the present application may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of the modules is merely a logical function division, and there may be other division manners in actual implementation, e.g., multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via some interfaces, devices or modules, and may be in electrical, mechanical or other forms.
The modules illustrated as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules.
If the integrated modules are implemented in the form of software functional modules and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a readable storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned readable storage medium includes: a USB flash disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but those skilled in the art should understand that the present application is not limited by the described order of actions, as some steps may be performed in another order or simultaneously in accordance with the present application. Further, those skilled in the art should also understand that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily all required by the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The foregoing describes the image jelly effect eliminating method and apparatus provided by the present application. For those skilled in the art, there will be changes in the specific implementation and application scope according to the ideas of the embodiments of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (8)

1. An image jelly effect eliminating method, characterized by comprising the following steps:
Acquiring an integral exposure time interval corresponding to an APS image to be processed, which is acquired by an APS image sensor;
acquiring target event stream data synchronously acquired by an EVS image sensor based on the integral exposure time interval;
Respectively calculating, according to the target event stream data, instantaneous APS images of different rows of sub APS images in the APS image to be processed at the same universal time;
synthesizing a plurality of instant APS images corresponding to the sub APS images of different rows to obtain APS images with jelly effect eliminated;
The step of respectively calculating the instantaneous APS images of different lines of sub APS images in the APS image to be processed at the same universal time according to the target event stream data includes: determining event integration data corresponding to each row of sub-APS images of the APS image to be processed according to a row exposure time interval and the target event stream data; respectively calculating instantaneous APS images of the sub APS images of different rows at the same world time according to event integral data corresponding to the sub APS images of different rows;
The step of calculating the instantaneous APS images of the sub APS images of different rows at the same universal time according to the event integral data corresponding to the sub APS images of different rows, respectively, includes:
Respectively calculating the instantaneous APS images of the sub APS images of different rows at the same universal time by adopting a preset instantaneous image calculation formula according to the event integration data corresponding to the sub APS images of different rows;
The instantaneous image calculation formula is expressed as:
$$L_n(t_0)=\frac{t_{ex}\,B_n}{\displaystyle\int_{t_n^{start}}^{t_n^{end}} e^{\,c\,E_n(t)}\,dt}$$

wherein $L_n(t_0)$ represents the instantaneous APS image of the sub APS image of the nth row at universal time $t_0$, $B_n$ denotes the sub APS image of the nth row, $E_n(t)$ denotes the event integration data, $t_n^{start}$ and $t_n^{end}$ respectively represent the line exposure start time and the line exposure end time of the sub APS image of the nth row, $t_{ex}$ represents the single line exposure time of the APS image to be processed, and $c$ represents the contrast threshold of the EVS image sensor corresponding to the light intensity variation.
2. The image jelly effect elimination method according to claim 1, wherein the step of determining event integration data corresponding to each row of sub APS images according to a row exposure time interval and the target event stream data comprises:
According to the line exposure starting time and the line exposure ending time of the line exposure time interval, carrying out integral operation on the corresponding event polarity in the target event stream data to obtain event integral data corresponding to each line of sub APS images;
The integral operation formula is expressed as:
$$E_n(t)=\int_{t_0}^{t} p(s)\,ds,\qquad t\in\left[t_n^{start},\,t_n^{end}\right]$$

wherein $E_n(t)$ represents the event integration data, $p$ represents the event polarity, $t_0$ represents the selected universal time, and $t_n^{start}$ and $t_n^{end}$ respectively represent the line exposure start time and the line exposure end time of the sub APS image of the nth row.
3. The method for eliminating the image jelly effect according to claim 1, wherein the step of acquiring the integral exposure time interval corresponding to the APS image to be processed acquired by the APS image sensor comprises:
Calculating the end line exposure termination time according to the single line exposure time length, the exposure time difference between adjacent lines, the initial line exposure time and the total exposure line number of the APS image to be processed acquired by the APS image sensor;
defining an integral exposure time interval corresponding to the APS image to be processed based on the initial line exposure starting time and the final line exposure ending time;
The calculation formula of the final row exposure termination time is expressed as follows:
$$t_h^{end}=t_1^{start}+(h-1)\,t_{gap}+t_{ex}$$

wherein $t_h^{end}$ represents the end line exposure termination time, $t_{ex}$ represents the single line exposure duration, $t_{gap}$ represents the exposure time difference between the adjacent lines, $t_1^{start}$ represents the initial line exposure start time, and $h$ represents the total number of exposure lines.
4. The image jelly effect elimination method according to claim 1, wherein the formulation of the single line of the sub APS image is described as:
$$B_n=\frac{1}{t_{ex}}\int_{t_n^{start}}^{t_n^{end}} L_n(t)\,dt$$

wherein $B_n$ denotes the sub APS image of the nth row, $t_n^{start}$ and $t_n^{end}$ respectively represent the line exposure start time and the line exposure end time of the sub APS image of the nth row, $t_{ex}$ represents the single line exposure time, and $L_n(t)$ represents the instantaneous APS image of the sub APS image of the nth row at time $t$.
5. The image jelly effect elimination method according to any one of claims 1 to 4, wherein after the step of synthesizing a plurality of instantaneous APS images corresponding to the sub APS images of different rows to obtain an APS image after elimination of the jelly effect, further comprising:
acquiring a plurality of APS images with the jelly effect eliminated, corresponding to different universal times within the integral exposure time interval;
And synthesizing a plurality of APS images after the jelly effect is eliminated, and obtaining an integral APS image after the jelly effect is eliminated.
6. An image jelly effect eliminating device, characterized by comprising:
The first acquisition module is used for acquiring an integral exposure time interval corresponding to the APS image to be processed, which is acquired by the APS image sensor;
the second acquisition module is used for acquiring target event stream data synchronously acquired by the EVS image sensor based on the integral exposure time interval;
the calculating module is used for respectively calculating, according to the target event stream data, the instantaneous APS images of different rows of sub APS images in the APS image to be processed at the same universal time;
the synthesizing module is used for synthesizing a plurality of instant APS images corresponding to the sub APS images of different rows to obtain APS images with jelly effect eliminated;
The calculating module is specifically configured to: determine event integration data corresponding to each row of sub APS images of the APS image to be processed according to a row exposure time interval and the target event stream data; and respectively calculate the instantaneous APS images of the sub APS images of different rows at the same universal time by adopting a preset instantaneous image calculation formula according to the event integration data corresponding to the sub APS images of different rows;
The instantaneous image calculation formula is expressed as:
$$L_n(t_0)=\frac{t_{ex}\,B_n}{\displaystyle\int_{t_n^{start}}^{t_n^{end}} e^{\,c\,E_n(t)}\,dt}$$

wherein $L_n(t_0)$ represents the instantaneous APS image of the sub APS image of the nth row at universal time $t_0$, $B_n$ denotes the sub APS image of the nth row, $E_n(t)$ denotes the event integration data, $t_n^{start}$ and $t_n^{end}$ respectively represent the line exposure start time and the line exposure end time of the sub APS image of the nth row, $t_{ex}$ represents the single line exposure time of the APS image to be processed, and $c$ represents the contrast threshold of the EVS image sensor corresponding to the light intensity variation.
7. An electronic device comprising a memory and a processor, wherein:
the memory is configured to store a computer program;
the processor, when executing the computer program stored on the memory, implements the steps of the image jelly effect elimination method according to any one of claims 1 to 5.
8. A computer-readable storage medium storing a computer program, characterized in that the steps in the image jelly effect elimination method according to any one of claims 1 to 5 are implemented when the computer program is executed by a processor.
CN202211335568.XA 2022-10-28 2022-10-28 Image jelly effect eliminating method and related device Active CN115734088B (en)

Publications (2)

Publication Number Publication Date
CN115734088A CN115734088A (en) 2023-03-03
CN115734088B (en) 2024-10-11


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113592887A (en) * 2021-06-25 2021-11-02 荣耀终端有限公司 Video shooting method, electronic device and computer-readable storage medium
CN114723624A (en) * 2022-03-16 2022-07-08 深圳锐视智芯科技有限公司 Image processing method, system, equipment and computer readable storage medium

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
KR20190032818A (en) * 2017-09-20 2019-03-28 삼성전자주식회사 An electronic device including a plurality of camera using a rolling shutter system
WO2019067054A1 (en) * 2017-09-28 2019-04-04 Apple Inc. Generating static images with an event camera
EP3844945B1 (en) * 2018-10-25 2023-11-29 Samsung Electronics Co., Ltd. Method and apparatus for dynamic image capturing based on motion information in image
CN111445414B (en) * 2020-03-27 2023-04-14 北京市商汤科技开发有限公司 Image processing method and device, electronic equipment and storage medium
CN113781470B (en) * 2021-09-24 2024-06-11 商汤集团有限公司 Parallax information acquisition method, device, equipment and binocular shooting system
CN217656677U (en) * 2021-12-10 2022-10-25 深圳锐视智芯科技有限公司 Mode-switchable image sensor
CN114612305B (en) * 2022-03-14 2024-04-02 中国科学技术大学 Event-driven video super-resolution method based on stereogram modeling
CN114881921B (en) * 2022-03-23 2024-08-16 清华大学 Anti-occlusion imaging method and device based on event and video fusion


Non-Patent Citations (1)

Title
《Bringing a Blurry Frame Alive at High Frame-Rate With an Event Camera》; L. Pan, C. Scheerlinck, X. Yu, R. Hartley, M. Liu and Y. Dai; 《2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)》; 20190620; pp. 6814-6822 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant