CN112911165A - Endoscope exposure method, device and computer readable storage medium
- Publication number: CN112911165A
- Application number: CN202110231587.7A
- Authority: CN (China)
- Prior art keywords: image, endoscope, exposure, target, brightness
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
All under H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION; H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof:
- H04N23/71—Circuitry for evaluating the brightness variation (under H04N23/70—Circuitry for compensating brightness variation in the scene)
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils (under H04N23/50—Constructional details)
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof (under H04N23/50)
- H04N23/555—Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes (under H04N23/50)
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time (under H04N23/70)
Abstract
The embodiments of this application disclose an endoscope exposure method, an endoscope exposure device, and a computer-readable storage medium, belonging to the technical field of image processing. Because the position and size of the real visual field area remain essentially unchanged throughout a single use of the endoscope, the visual field area is acquired once after a visual field area acquisition instruction is received, and exposure is then adjusted automatically based on that visual field area for the rest of the session. The visual field area does not need to be acquired repeatedly, so the computational overhead of acquiring it is greatly reduced. In addition, this solution adjusts exposure automatically based on the brightness information of the visual field area in the image rather than of the complete image, so the exposure adjustment is more reasonable, overexposure and underexposure are avoided, and the image quality is higher.
Description
Technical Field
Embodiments of this application relate to the technical field of image processing, and in particular to an endoscope exposure method, an endoscope exposure device, and a computer-readable storage medium.
Background
In medical endoscopy, the development of endoscope systems has greatly improved doctors' efficiency in disease diagnosis, surgical treatment, and related work. When an endoscope is used to acquire images that form a surgical video, reasonable exposure yields images of appropriate brightness, improves image quality, and provides medical staff with comfortable, reasonable, and more valuable images. Devising a more reasonable exposure method for endoscope use, one that avoids overexposure and underexposure and provides more valuable images for doctors' diagnosis, has therefore become a focus of endoscope system development.
Disclosure of Invention
The embodiments of this application provide an endoscope exposure method, an endoscope exposure device, and a computer-readable storage medium, which can obtain high-quality images through reasonable exposure and provide more valuable information for doctors' diagnosis. The technical solution is as follows:
in one aspect, an endoscopic exposure method is provided, the method comprising:
after receiving a visual field area acquisition instruction, determining a target visual field area of the endoscope according to a multi-frame image acquired by the endoscope;
and adjusting the exposure parameters of the endoscope according to the brightness information of the target visual field area in the first image acquired at the current time.
Optionally, the determining a target view area of the endoscope according to the multiple frames of images acquired by the endoscope includes:
inputting each frame of the multi-frame images into a first target detection model or a first image segmentation model, outputting the alternative view area corresponding to each image, and obtaining the target view area based on the multiple alternative view areas corresponding to the multi-frame images; or,
and simultaneously inputting the multi-frame images into a second target detection model or a second image segmentation model, and outputting the target view field.
Optionally, in a manner of determining the target view area based on the first target detection model or the second target detection model, the obtained target view area is represented by a rectangular frame, and the target view area represents a circumscribed rectangular area or an inscribed rectangular area of the real view of the endoscope; or,
in a manner of determining the target field of view region based on the first image segmentation model or the second image segmentation model, the obtained target field of view region is represented by an image mask, and the target field of view region represents a real field of view of the endoscope.
Optionally, the adjusting the exposure parameter of the endoscope according to the brightness information of the target view field in the first image acquired at the current time includes:
counting the brightness information of the target view field in the first image to obtain the average brightness of the first image;
and if the exposure adjustment condition is determined to be met currently according to the average brightness of the first image, the reference brightness of the endoscope exposure and historical average brightness, adjusting the exposure parameter of the endoscope, wherein the historical average brightness represents the average brightness of at least one frame of image acquired before the first image.
Optionally, after the counting luminance information of the target view field in the first image to obtain an average luminance of the first image, the method further includes:
determining that the exposure adjustment condition is currently satisfied if a difference between the average brightness of the first image and the reference brightness exceeds a first range; or,
determining that the exposure adjustment condition is currently satisfied if a difference between the average brightness of the first image and the reference brightness does not exceed the first range and a difference between the average brightness of the first image and the historical average brightness exceeds a second range.
Optionally, the adjusting exposure parameters of the endoscope comprises:
determining scene illumination according to the average brightness of the first image and exposure time and gain included by the exposure parameters;
adjusting the reference brightness according to the scene illumination, wherein the adjusted reference brightness is positively correlated with the scene illumination;
determining an exposure adjustment direction according to the average brightness of the first image and the adjusted reference brightness;
and adjusting the exposure time and/or the gain according to the exposure adjustment direction.
Optionally, the endoscope comprises a bayonet and a front lens for insertion into the body, the bayonet being connected to the front lens;
the method further comprises the following steps:
before receiving the view field acquisition instruction, adjusting the object distance and/or the focal distance of the front end lens based on the bayonet of the endoscope so as to stabilize the view field acquired by the front end lens.
In another aspect, there is provided an endoscopic exposure apparatus, the apparatus including:
the determining module is used for determining a target visual field area of the endoscope according to the multi-frame images collected by the endoscope after receiving a visual field area acquisition instruction;
and the adjusting module is used for adjusting the exposure parameters of the endoscope according to the brightness information of the target visual field area in the first image acquired at the current time.
Optionally, the determining module includes:
the first determining submodule is used for respectively inputting each frame of image in the multi-frame images into a first target detection model or a first image segmentation model, respectively outputting alternative view areas corresponding to the corresponding images, and obtaining the target view areas based on a plurality of alternative view areas corresponding to the multi-frame images; or,
and the second determining submodule is used for simultaneously inputting the multi-frame images into a second target detection model or a second image segmentation model and outputting the target view field.
Optionally, in a manner of determining the target view area based on the first target detection model or the second target detection model, the obtained target view area is represented by a rectangular frame, and the target view area represents a circumscribed rectangular area or an inscribed rectangular area of the real view of the endoscope; or,
in a manner of determining the target field of view region based on the first image segmentation model or the second image segmentation model, the obtained target field of view region is represented by an image mask, and the target field of view region represents a real field of view of the endoscope.
Optionally, the adjusting module includes:
the statistic submodule is used for counting the brightness information of the target view field in the first image to obtain the average brightness of the first image;
a first adjusting sub-module, configured to adjust an exposure parameter of the endoscope if it is determined that an exposure adjustment condition is currently satisfied according to the average brightness of the first image, the reference brightness of the endoscope exposure, and a historical average brightness, where the historical average brightness represents an average brightness of at least one frame of image acquired before the first image.
Optionally, the adjusting module further comprises:
a third determining sub-module, configured to determine that the exposure adjustment condition is currently satisfied if a difference between the average brightness of the first image and the reference brightness exceeds a first range; or,
a fourth determination sub-module, configured to determine that the exposure adjustment condition is currently satisfied if a difference between the average brightness of the first image and the reference brightness does not exceed the first range and a difference between the average brightness of the first image and the historical average brightness exceeds a second range.
Optionally, the adjusting module includes:
a fifth determining submodule, configured to determine scene illuminance according to the average brightness of the first image and the exposure time and gain included in the exposure parameter;
the second adjusting submodule is used for adjusting the reference brightness according to the scene illumination, and the adjusted reference brightness is positively correlated with the scene illumination;
a sixth determining submodule, configured to determine an exposure adjustment direction according to the average brightness of the first image and the adjusted reference brightness;
and the third adjusting submodule is used for adjusting the exposure time and/or the gain according to the exposure adjusting direction.
Optionally, the endoscope comprises a bayonet and a front lens for insertion into the body, the bayonet being connected to the front lens;
the device further comprises:
and the visual field stabilizing module is used for adjusting the object distance and/or the focal distance of the front-end lens based on the bayonet of the endoscope before receiving the visual field area acquisition instruction so as to stabilize the visual field area obtained by the front-end lens.
In another aspect, an endoscope exposure apparatus is provided, comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with one another through the communication bus. The memory stores a computer program, and the processor executes the program stored in the memory to implement the steps of the endoscope exposure method described above. Optionally, the endoscope exposure apparatus is part or all of a computer device.
In another aspect, a computer-readable storage medium is provided, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the endoscopic exposure method described above.
In another aspect, a computer program product is provided, comprising instructions which, when run on a computer, cause the computer to perform the steps of the endoscope exposure method described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
Because the position and size of the real visual field area remain essentially unchanged throughout a single use of the endoscope, the visual field area is acquired once after a visual field area acquisition instruction is received, and exposure is then adjusted automatically based on that visual field area for the rest of the session. The visual field area does not need to be acquired repeatedly, so the computational overhead of acquiring it is greatly reduced. In addition, this solution adjusts exposure automatically based on the brightness information of the visual field area in the image rather than of the complete image, so the exposure adjustment is more reasonable, overexposure and underexposure are avoided, and the image quality is higher.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic view of a field of view region of an endoscope according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an endoscopic system provided in an embodiment of the present application;
FIG. 3 is a detailed schematic diagram of an endoscopic system provided in an embodiment of the present application;
FIG. 4 is a schematic structural diagram of another endoscope system provided by an embodiment of the present application;
FIG. 5 is a schematic view of a bayonet adjustment and imaging circle of an endoscope according to an embodiment of the present disclosure;
FIG. 6 is a flowchart of an endoscopic exposure method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a representation of a field of view provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of another endoscopic exposure method provided by an embodiment of the present application;
fig. 9 is a flowchart of a method for determining scene change according to an embodiment of the present application;
fig. 10 is a flowchart of a method for estimating scene illuminance according to an embodiment of the present application;
FIG. 11 is a flow chart of an automatic exposure adjustment provided by an embodiment of the present application;
FIG. 12 is a flowchart of an exposure parameter calculation provided by an embodiment of the present application;
FIG. 13 is a flow chart of another exposure parameter calculation provided by embodiments of the present application;
FIG. 14 is a flowchart of another exposure parameter calculation provided by an embodiment of the present application;
fig. 15 is a schematic structural diagram of an endoscopic exposure apparatus provided in an embodiment of the present application;
fig. 16 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 17 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present application clearer, the embodiments are described below in further detail with reference to the accompanying drawings.
To facilitate understanding of the embodiments of the present application, some terms referred to in the embodiments of the present application will be described first.
Visual field area: the region imaged during endoscopic surgery, also referred to as the main body region. As shown in fig. 1, the visual field area of an endoscope is a circular area.
Illuminance: i.e., scene illumination. It represents the energy of the light illuminating the photographed scene; for a fixed scene, the illumination is a fixed value.
Brightness: i.e., image brightness, the brightness of the actual image. Image brightness is related both to the scene illumination and to the exposure parameter settings of the camera (such as an endoscope); it is a variable.
AE (Auto Exposure): the camera automatically adjusts the exposure parameters according to the current ambient illumination to obtain an image with appropriate brightness. Automatic exposure mainly involves three exposure parameters: aperture, shutter, and gain. The input of the automatic exposure module is brightness statistics, and the output is the adjusted exposure parameters.
Aperture (f-stop): a mechanism within the camera lens that adjusts the size of its opening. When the opening is large, more light passes through it per unit time to reach the photosensitive chip; when the opening is small, less light reaches it per unit time. Adjusting the aperture means adjusting the size of the aperture opening so as to control the amount of light that can reach the photosensitive chip through the optical lens per unit time.
Shutter: i.e., exposure time, another important parameter in determining image brightness. As the exposure time increases, image brightness increases; as it decreases, image brightness decreases. Exposure time is expressed as a shutter speed; the current international standard series of shutter speeds is 8 s, 4 s, 2 s, 1 s, 1/2 s, 1/4 s, 1/8 s, 1/15 s, 1/30 s, 1/60 s, 1/125 s, 1/250 s, 1/500 s, 1/1000 s, 1/2000 s, 1/4000 s, and so on, where each value is approximately half of the preceding one.
Gain: the photosensitive chip (sensor) that records the image in a camera has a sensitivity value, commonly expressed as ISO (after the International Organization for Standardization's film-speed standards), which represents the gain the camera system applies to the electrical signal. The analog electrical signal produced by the image sensor passes through a module called AGC (Automatic Gain Control) before being converted into a digital signal. The AGC module amplifies the input analog electrical signal so that the strength of the output signal meets the final requirement on image brightness. Generally, when the incident light is weak and the aperture and exposure time settings cannot meet the exposure requirement, adjusting the signal gain is a very effective means of exposure adjustment.
Next, the endoscope system to which the endoscope exposure method provided in the embodiments of the present application applies is described. Fig. 2 is a schematic structural diagram of an endoscope system according to an embodiment of the present application. As shown in fig. 2, the endoscope system includes an endoscope, a light source, a camera system host, a display device, and a storage device.
The endoscope's long tube is inserted into the patient's body to photograph the site to be observed; the endoscope captures images of that site and sends them to the camera system host. The light source provides the illumination light emitted from the front end of the endoscope's long tube so that the endoscope can capture clear images. The camera system host receives the images transmitted by the endoscope, processes them, and transmits the processed images to the display device and the storage device. The camera system host also provides unified control of the entire endoscope system, for example controlling the endoscope to send acquired images to the host. The display device receives the processed images sent by the camera system host and displays them. The storage device receives the processed images sent by the camera system host and stores them.
With the endoscope system shown in fig. 2, a doctor examines the processed image displayed on the display device for bleeding sites, tumor sites, and abnormal sites of the examined target. During surgery, the endoscope system shown in fig. 2 can provide real-time images of the surgical procedure. In addition, the doctor can retrieve the images from the storage device and use the video they form for postoperative review and surgical training.
For a clearer understanding of the principles of the endoscope system, its components are explained here. Fig. 3 is a detailed structural schematic diagram of an endoscope system according to an embodiment of the present disclosure.
As shown in fig. 3, the camera system host in the endoscope system includes an image input unit, an image processing unit, an intelligent processing unit, a video encoding unit, a control unit, and an operation unit.
The image input unit receives images sent by the endoscope and transmits the received images to the image processing unit.
The image processing unit receives the image sent by the image input unit and processes it, that is, performs ISP (Image Signal Processor) operations on the image, including brightness transformation, sharpening, moiré removal, scaling, and the like. After processing the image, the image processing unit sends the processed image to the intelligent processing unit, the video encoding unit, or the display device. The image processing unit also receives images that have been intelligently analyzed by the intelligent processing unit and performs ISP operations on them again.
The intelligent processing unit receives the processed image sent by the image processing unit and performs intelligent analysis on it, including deep-learning-based scene classification, instrument or instrument-head detection, gauze detection, moiré classification, dense-fog classification, and the like. After the intelligent analysis, the intelligent processing unit sends the analyzed image to the image processing unit or the video encoding unit.
The video encoding unit receives the image processed by the image processing unit or the image intelligently analyzed by the intelligent processing unit, encodes and compresses it, and sends the compressed image to the storage device.
The control unit sends function instructions to each unit of the endoscope system and controls each module to execute particular functions, such as controlling the illumination of the light source, the image processing mode of the image processing unit, the intelligent analysis mode of the intelligent processing unit, and the encoding and compression mode of the video encoding unit. The control unit also receives trigger instructions sent by the operation unit and responds to them to start the camera system host. When a user operates a switch, button, or touch panel on the camera system host, the operation unit receives the user's trigger instruction and sends it to the control unit.
In fig. 3, the light source in the endoscope system includes an illumination control unit and an illumination unit. The illumination control unit receives function commands from the control unit in the camera system host and sends illumination commands to the illumination unit, thereby controlling the illumination unit to provide illumination light for the endoscope. The illumination unit receives the illumination commands and provides illumination light to the endoscope.
In fig. 3, the endoscope in the endoscope system has an image pickup optical system, an imaging unit, a processing unit, and an operation unit. The image pickup optical system consists of one or more lenses and focuses light from the region to be observed in the patient so that the region can be imaged clearly. The imaging unit consists of an image sensor such as a CMOS (complementary metal-oxide-semiconductor) or CCD (charge-coupled device) sensor and photoelectrically converts the light received at each pixel to generate an image, which it sends to the processing unit. The processing unit receives the image, converts it into a digital image signal, and sends the converted image to the image input unit of the camera system host. When a user operates a switch, button, or touch panel on the endoscope, the operation unit receives the trigger instruction and sends it to the control unit of the camera system host.
The method provided in the embodiments of the present application applies to scenarios in which an endoscope system is used to process images; optionally, the endoscope exposure method may also apply to other image processing scenarios, which are not enumerated here.
Fig. 4 is a schematic structural diagram of another endoscope system provided in the embodiments of the present application. The endoscope system in fig. 4 is similar to that of the embodiments above and is not described again here. In addition, fig. 4 shows that the endoscope includes a front lens for insertion into the body and a bayonet for focusing and zooming, i.e., a bayonet that integrates focusing and zooming optics; the bayonet is connected to the front lens. The focusing function of the bayonet adapts to different object distances to keep the image sharp; that is, the bayonet adjusts the object distance of the front lens, changing the degree of blur without changing the size of the field-of-view ring (e.g., the imaging circle). The zoom function of the bayonet adjusts the focal length and changes the size of the field-of-view ring in the image. By adjusting the bayonet for focus and zoom, the visual field area obtained through the lens is stable and the image is clear.
In other embodiments, the endoscope includes a bayonet used only for focusing, i.e., with a fixed focal length. Whether or not the endoscope's bayonet supports zooming, the size and position of the field of view are fixed once the bayonet has been adjusted to obtain a stable, clear field of view. Generally, in a surgical scene, the visual field area does not change after the bayonet has been adjusted and fixed.
In one example, an optical zoom lens group is connected to the rear end of the bayonet. After the bayonet is fixed, the endoscope acquires images inside the body: light reaches the image sensor through the front lens and the optical zoom lens group and is finally imaged after a series of signal processing.
Fig. 5 is a schematic diagram of bayonet adjustment and the imaging circle of an endoscope according to an embodiment of the present application. With the same lens (endoscope), adjusting the bayonet to a large focal length yields a large imaging circle, and adjusting it to a small focal length yields a small imaging circle. That is, a long-focal-length optical bayonet produces a larger field-of-view circle, and a short-focal-length optical bayonet a smaller one. Note that the zoom range supported by one bayonet is fixed; if adjusting the bayonet still cannot produce a suitable visual field area, the adjustable focal length range of that bayonet is unsuitable, the bayonet needs to be replaced, and the focal length is then adjusted with the replacement bayonet to obtain a suitable visual field area.
When ISP operations are performed on an image, adjustments are made according to the current image brightness so that the result is comfortable and reasonable for the user. In fact, the imaging area of the endoscope (i.e., the visual field area) is not the complete image but a local region; performing exposure processing on the entire image inevitably introduces overexposure or underexposure. This solution therefore performs automatic exposure for the visual field region: the range of the visual field region is first acquired automatically, and automatic exposure is then performed for that region. That is, the endoscope exposure method of this solution comprises two processes, visual field region acquisition and automatic exposure, described next.
The execution body of the endoscope exposure method provided in the embodiments of the present application is not limited; the method may be executed by the camera system host described above or by an external device. For convenience, the following embodiments describe the method as performed by a computer device.
Fig. 6 is a flowchart of an endoscope exposure method according to an embodiment of the present application, where the method includes the following steps:
step 601: and after receiving the view field acquisition instruction, determining a target view field of the endoscope according to the multi-frame images collected by the endoscope.
In endoscope use scenarios, the size and position of the endoscope's visual field area remain essentially unchanged for a long time during a single use, but when the scene is switched during surgery (for example, from an intestinal procedure to an abdominal procedure), the size and position of the visual field area need to be reacquired and the visual field acquisition function must be restarted. In other words, although the endoscope scene does not change for a long time and the field of view corresponding to one operation is basically stable, the field of view changes once the operation changes and must be acquired again. In the embodiment of the application, whether the scene has switched is judged manually; if it has, a visual field area acquisition instruction is sent to the computer device by manual operation, and the computer device's visual field acquisition module is restarted to obtain an accurate visual field area adapted to the scene. Of course, the computer device may also be triggered by an instruction to reacquire the field of view at any time during the procedure.
Optionally, before the view field acquisition instruction is received, the object distance and/or focal length of the front lens is adjusted via the endoscope's bayonet so that the view field obtained by the front lens is stable, and the view field acquisition instruction is received after the bayonet has been fixed. That is, when the endoscope is used, its bayonet needs to be fixed so that the visual field area is fixed: once the bayonet is fixed, the position and size of the visual field area remain stable, the visual field area acquired by this solution is highly accurate, and once the visual field area has been acquired it does not need to be acquired repeatedly before the next view field acquisition instruction is received, which reduces computational overhead. Note that the endoscope's visual field area may change after the bayonet is readjusted and refixed, in which case the visual field area needs to be reacquired.
Optionally, the view area acquisition instruction may be triggered by clicking or touching a start button on the display device, such as a start-surgery button or a start-image-capture button. In addition to receiving the instruction after the endoscope's bayonet is fixed at the beginning of the operation, the computer device can be triggered to receive a view area acquisition instruction at any time during the operation as needed.
Based on this, in the embodiment of the application, after receiving the view area acquisition instruction, the computer device determines the target view area of the endoscope according to the multi-frame images acquired by the endoscope. In one implementation, the computer device determines a target field of view of the endoscope, i.e., acquires a stable field of view, based on a plurality of frames of images most recently acquired by the endoscope. In the present embodiment, the stable visual field region finally determined is set as the target visual field region.
The target visual field area is acquired with deep learning methods, including but not limited to target detection and image segmentation. In terms of processing, the target view field can be acquired either by a single-frame method plus temporal processing, which makes the acquired view field more accurate, or directly from temporally consecutive multi-frame images. In this solution, the target view field is not acquired from single-frame image information alone; instead, multi-frame information is integrated to obtain a more robust target view field.
In the embodiment of the present application, one implementation of the single-frame acquisition method plus temporal integration of the target view area is as follows: the computer device inputs each frame of the multi-frame images into a first target detection model or a first image segmentation model, outputs the alternative view area corresponding to each image, and obtains the target view area based on the multiple alternative view areas corresponding to the multi-frame images.
Illustratively, for the multi-frame images, the computer device inputs each single-frame image into the first target detection model, outputs the alternative view area corresponding to that image, and then temporally integrates the multiple alternative view areas corresponding to the multi-frame images, using for example sequence non-maximum suppression (Seq-NMS), to obtain a more robust target view area. Alternatively, for the multi-frame images, each single-frame image is input into the first image segmentation model, the alternative view area corresponding to that image is output, and a more robust target view area is then obtained from the segmentation results (i.e., the multiple alternative view areas) through a voting mechanism, as sketched below.
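The patent does not spell out the voting mechanism. As a rough illustration only, a pixel-wise vote over the per-frame segmentation masks could look like the following sketch (the function name, the 0.5 threshold, and the choice of frames are assumptions, not details from the patent):

```python
import numpy as np

def vote_field_of_view(candidate_masks, threshold=0.5):
    # Stack N per-frame binary masks (each HxW, 1 = field of view)
    # and compute each pixel's share of '1' votes across the frames.
    stack = np.stack(candidate_masks).astype(np.float32)  # (N, H, W)
    vote_ratio = stack.mean(axis=0)                       # (H, W)
    # Keep a pixel only if enough frames voted it into the field of view.
    return (vote_ratio >= threshold).astype(np.uint8)

# e.g., fuse the masks of the most recent frames into one stable mask:
# target_mask = vote_field_of_view(recent_masks)
```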
In the embodiment of the present application, one implementation manner of determining the target view area directly based on the multi-frame image processing in the time sequence is as follows: and the computer equipment simultaneously inputs the multi-frame images into a second target detection model or a second image segmentation model and outputs a target view field. That is, based on a video target detection method or a video image segmentation method in time series, the input of the model is a continuous multi-frame image, and the output is directly the target view field.
Optionally, in the above manner of determining the target view area based on the first target detection model or the second target detection model, the obtained target view area is represented by a rectangular frame, and the target view area represents a circumscribed rectangular area or an inscribed rectangular area of the real view of the endoscope. Illustratively, in one implementation, the alternative field of view region is represented by a rectangular box, for example, by diagonal vertex coordinates of the rectangular box (e.g., coordinates of the top left vertex and the bottom right vertex). The target visual field area represented by the circumscribed rectangle area can cover the whole real visual field of the endoscope, and the target visual field area represented by the inscribed rectangle area does not cover a background area (such as a black background area in fig. 1).
Optionally, in the above manner of determining the target field of view based on the first image segmentation model or the second image segmentation model, the obtained target field of view is represented by an image mask, and the target field of view represents a real field of view of the endoscope. Illustratively, a pixel point with a value of '1' in the image mask represents an effective region in the corresponding image, a pixel point with a value of '0' represents an ineffective region in the corresponding image, and the effective region constitutes a target view region in the image. As can be seen, the target field of view region represented by the image mask can truly reflect the true field of view of the endoscope.
As can be seen from the above, the target visual field region can be represented in three different ways in this embodiment, shown as mode 1, mode 2, and mode 3 in fig. 7. Mode 1 is the circumscribed rectangular area: the target visual field area is represented by the minimum circumscribed rectangle of the endoscope's real field of view (the light-gray filled area). Mode 2 is the inscribed rectangular area: the target visual field area is represented by the maximum inscribed rectangle of the endoscope's real field of view. Mode 3 is the image mask representation, in which the mask value of each pixel in the dark-gray filled region is '1' and that of each pixel in the white filled region is '0'.
Alternatively, if the obtained target field of view is the circumscribed rectangle of the endoscope's real field of view (mode 1), the computer device can also compute an inscribed rectangle (mode 2): for example, it determines the maximum inscribed circle (or ellipse) of the circumscribed rectangle and then the maximum inscribed rectangle of that circle (or ellipse), yielding a target view area represented by an inscribed rectangle. Conversely, if the obtained target visual field area is an inscribed rectangle of the real field of view, the computer device can compute the corresponding circumscribed rectangle; that is, mode 1 and mode 2 are interconvertible. If the obtained target field of view represents the real field of view itself (mode 3), the computer device can compute the circumscribed or inscribed rectangle, obtaining mode 1 or mode 2. If the target visual field region is given in mode 1, the computer device can compute the inscribed circle (or ellipse) region and convert it into an image mask, so mode 1 can yield mode 3; similarly, from mode 2 the computer device can compute the circumscribed circle (or ellipse) region and convert it into an image mask, so mode 2 can yield mode 3. Modes 1, 2, and 3 can thus be converted into one another to satisfy multiple representations of the field of view, and different representations achieve different exposure effects.
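As a concrete illustration of these conversions, the following sketch derives the mode-2 inscribed rectangle and a mode-3 image mask from a mode-1 circumscribed rectangle, assuming the real field of view is the circle inscribed in that rectangle (the function names and the circular-field assumption are illustrative, not from the patent):

```python
import numpy as np

def inscribed_rect_from_circumscribed(x0, y0, x1, y1):
    # Mode 1 -> mode 2: take the maximum inscribed circle of the box,
    # then the maximum axis-aligned rectangle inscribed in that circle
    # (a square with half-side r / sqrt(2)).
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    r = min(x1 - x0, y1 - y0) / 2.0       # radius of the inscribed circle
    h = r / np.sqrt(2.0)                  # half-side of the inscribed square
    return (cx - h, cy - h, cx + h, cy + h)

def mask_from_circumscribed(x0, y0, x1, y1, height, width):
    # Mode 1 -> mode 3: rasterize the inscribed circle as an image mask,
    # '1' inside the field of view and '0' outside.
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    r = min(x1 - x0, y1 - y0) / 2.0
    yy, xx = np.mgrid[0:height, 0:width]
    return ((xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2).astype(np.uint8)
```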
Alternatively, in the above mode 1 and/or mode 2, in addition to the above-described rectangular representation of the field of view, the field of view may also be represented by any other polygonal or curved shape, such as an octagon, a diamond, an irregular shape, and the like, which is not limited in the embodiments of the present application.
Optionally, after the target view area is determined, the target view area in the image is displayed on the computer device, whether the target view area is accurate is determined manually, if the acquired target view area is accurate, the computer device locks the target view area according to the instruction, and waits for the next time of receiving a view area acquisition instruction. If the acquired target view field is not accurate, the computer device re-acquires the target view field according to the instruction, for example, re-acquires the multi-frame image, and re-determines the target view field based on the re-acquired multi-frame image.
According to the scheme, after the view area acquisition instruction is received, the target view area is acquired once, and in the process of using the endoscope at this time, the subsequent computer equipment automatically adjusts exposure based on the target view area without frequently acquiring the view area, so that the calculation overhead for acquiring the view area can be fully reduced.
In addition, referring to fig. 8, the automatic exposure for the visual field in the endoscope exposure method provided in this embodiment mainly comprises visual field acquisition, scene-switch determination, and automatic exposure adjustment. Step 601 introduced the visual field acquisition methods (e.g., deep learning and temporal integration) and scene-switch determination (e.g., manual determination); automatic exposure adjustment is introduced next in step 602.
Step 602: and adjusting the exposure parameters of the endoscope according to the brightness information of the target visual field area in the first image acquired at the current time.
A common medical exposure area is a circular area (as shown in fig. 1): the circular area is the actually visible scene, and the background area is a black, unlit region; with some poorer-quality scopes, the background area often exhibits light leakage. To better gather brightness statistics for the current scene, the main body area in the image is selected using the acquired target view field information. For example, the image is divided into the two regions shown in fig. 1, the main body area (target visual field area) and the background area, distinguished according to the acquired target view field information, and the main body area is used as the range for brightness statistics in this solution.
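With a mode-3 mask, for instance, the brightness statistic reduces to a masked mean over the luma plane. A minimal sketch (assuming an 8-bit luma input; the helper name is an assumption):

```python
import numpy as np

def masked_mean_brightness(luma, fov_mask):
    # Average brightness over the main body (field-of-view) area only;
    # background pixels are excluded from the statistic.
    pixels = luma[fov_mask == 1]
    return float(pixels.mean()) if pixels.size else 0.0
```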
In the embodiment of the application, after the target visual field area has been acquired, the computer device adjusts the exposure parameters of the endoscope according to the brightness information of the target visual field area in the first image acquired at the current time. That is, every frame acquired after the target field-of-view region is obtained undergoes automatic exposure adjustment according to this solution. Optionally, if the multi-frame images used to acquire the target view area are the most recently acquired images, the first image is the frame among them with the latest acquisition time, and every frame acquired after the first image is an image acquired after exposure has been automatically adjusted according to this solution.
In the embodiments of the present application, there are many implementations of adjusting the exposure parameters of the endoscope based on the brightness information of the first image in the target visual field area, and one implementation will be described as an example.
In the embodiment of the application, the computer device counts brightness information of a target visual field area in the first image to obtain average brightness of the first image, and if it is determined that the exposure adjustment condition is currently met according to the average brightness of the first image, reference brightness of endoscope exposure and historical average brightness, the computer device adjusts the exposure parameter of the endoscope. Wherein the historical average brightness characterizes an average brightness of at least one frame of image preceding the first image.
That is, the computer device's subsequent exposure adjustment operations are all performed based on the target visual field area. The average brightness of the first image over the target visual field area is counted first; because the relationship among the average brightness of the first image, the reference brightness of endoscope exposure, and the historical average brightness reflects, to a certain extent, whether the current exposure is appropriate and whether the brightness change of the currently acquired images is smooth, these three values are used to judge whether the exposure adjustment condition is currently met, and the exposure parameters are adjusted automatically when it is.
Illustratively, if the difference between the average brightness of the first image and the reference brightness exceeds a first range, it is determined that the exposure adjustment condition is currently satisfied. That is, if the average brightness of the first image acquired at the current time is not near the reference brightness, the current exposure is inappropriate, and regardless of whether the brightness change of the currently acquired images is smooth, the exposure parameters need to be adjusted so that the brightness of images acquired after the adjustment falls near the reference brightness. Optionally, the reference brightness is a constant value or a range.
If the difference between the average brightness of the first image and the reference brightness does not exceed the first range but the difference between the average brightness of the first image and the historical average brightness exceeds a second range, it is determined that the exposure adjustment condition is currently met. That is, if the average brightness of the first image is near the reference brightness but its brightness change relative to the historical images is not smooth, then although the current exposure is appropriate (i.e., the exposure of the first image itself is appropriate), a scene change (e.g., an illumination change) has caused too large a brightness change, so the exposure parameters need to be adjusted to adapt to the change and keep the exposure of subsequently acquired images reasonable.
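Putting the two conditions together, the check could look like the following sketch (treating each range as a scalar tolerance is a simplification; the tolerance values are assumptions, except that fig. 10 uses 3 for the first range):

```python
def exposure_adjustment_needed(y_avg_cur, y_ref, y_avg_hist,
                               first_range=3.0, second_range=8.0):
    # Condition 1: current average brightness is far from the reference.
    if abs(y_avg_cur - y_ref) > first_range:
        return True
    # Condition 2: near the reference, but the jump versus the historical
    # average brightness is too large (e.g., a scene/illumination change).
    return abs(y_avg_cur - y_avg_hist) > second_range
```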
In other embodiments, as shown in fig. 9, a scene change flag of 1 indicates that the exposure adjustment condition is satisfied, a scene change flag of 0 indicates that it is not, and the computer device updates this flag. Starting from the last update of the scene change flag, the computer device counts how many consecutive acquired frames have an average brightness outside the reference brightness interval (the first range around the reference brightness), i.e., how many consecutive frames have a brightness difference from the reference brightness exceeding the first range; if the count exceeds a count threshold, it determines that the exposure adjustment condition is satisfied, i.e., that the scene has changed and the exposure parameters need to be adjusted. If the current count does not exceed the threshold and the scene change flag is not 1, the scene has not changed significantly. Otherwise, if the count does not exceed the threshold but the scene change flag is 1 and the historical frame has been updated, the computer device counts, since the last flag update, the blocks whose average brightness change relative to the historical frame is within the second range, i.e., the proportion of blocks whose brightness change is insignificant; if that proportion is below a proportion threshold, or the image brightness after the scene change has not yet entered the reference brightness interval, it determines that the exposure adjustment condition is satisfied and the exposure parameters need to be adjusted.
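One possible reading of the consecutive-exceedance part of fig. 9, as a sketch only (the count threshold and the state handling are assumptions, and the block-proportion branch is omitted):

```python
class SceneChangeDetector:
    # Tracks consecutive frames whose average brightness leaves the
    # reference interval; raises the scene change flag (fig. 9) once
    # the run of out-of-interval frames is long enough.

    def __init__(self, y_ref, first_range=3.0, count_threshold=5):
        self.y_ref = y_ref
        self.first_range = first_range
        self.count_threshold = count_threshold
        self.run = 0    # consecutive out-of-interval frames
        self.flag = 0   # scene change flag: 1 = adjust exposure

    def update(self, y_avg_cur):
        if abs(y_avg_cur - self.y_ref) > self.first_range:
            self.run += 1
        else:
            self.run = 0
        if self.run > self.count_threshold:
            self.flag, self.run = 1, 0
        return self.flag
```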
As can be seen from the above, in one implementation of the present scheme, exposure is not adjusted for every frame of image but only when the exposure adjustment condition is determined to be satisfied according to the above method. This ensures that the exposure parameters are not adjusted when the scene illumination does not change significantly, further reducing the computational overhead.
Next, one implementation of adjusting the exposure parameters of the endoscope in the embodiment of the present application will be described. In an embodiment of the application, the computer device determines the scene illumination according to the average brightness of the first image and the exposure time and gain included in the exposure parameters. And then, the computer equipment adjusts the reference brightness according to the scene illumination, and the adjusted reference brightness is positively correlated with the scene illumination. Then, the computer device determines an exposure adjustment direction according to the average brightness of the first image and the adjusted reference brightness, and adjusts the exposure time and/or the gain according to the exposure adjustment direction.
Illustratively, the computer device estimates scene illumination according to information such as a currently acquired first image and a current exposure parameter, further adjusts reference brightness according to the scene illumination, reduces the reference brightness when the scene illumination is low, and increases the reference brightness when the scene illumination is high.
Fig. 10 is a flowchart of a method for estimating scene illumination according to an embodiment of the present application. Referring to fig. 10, if the difference between the average luminance y_avg_cur of the first image and the reference luminance y_avg_pre does not exceed the first range (3, as shown in fig. 10), the stability count SC (stable_count) is incremented by 1, and it is determined whether SC exceeds the count threshold th1. If SC exceeds th1, the scene illumination cur_lum_level is calculated according to the formula cur_lum_level = 1000 × y_avg_cur / (exp_time × G), where exp_time and G are the current exposure time (ExposureTime) and gain (Gain), respectively; SC is then reset to zero and counting starts again. If the difference between the average brightness of the first image and the reference brightness exceeds the first range, SC is reset to zero and counting starts again before the scene illumination is estimated.
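The estimate can be sketched as follows; th1 and the first-range width stand in for the values shown in fig. 10 and are assumptions:

```python
def estimate_scene_illumination(y_avg_cur, y_ref, exp_time, gain, state,
                                first_range=3.0, th1=10):
    """Sketch of the fig. 10 illumination estimate (assumed thresholds).

    state is a one-element list holding the stability count SC so the
    caller can carry it across frames.
    """
    if abs(y_avg_cur - y_ref) <= first_range:
        state[0] += 1                 # brightness is stable: SC += 1
        if state[0] > th1:
            state[0] = 0              # reset SC and count again
            # cur_lum_level = 1000 * y_avg_cur / (exp_time * gain)
            return 1000.0 * y_avg_cur / (exp_time * gain)
    else:
        state[0] = 0                  # brightness unstable: restart the count
    return None                       # no fresh estimate for this frame
```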
Alternatively, if the scene illumination is higher than the reference illumination, the reference brightness is increased by an adjustment step size, or increased according to a linear or non-linear function; if the scene illumination is lower than the reference illumination, the reference brightness is decreased by the adjustment step size, or decreased according to a linear or non-linear function, consistent with the positive correlation between the adjusted reference brightness and the scene illumination described above.
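A minimal sketch of the step-based variant, assuming a hypothetical step size and clamping limits:

```python
def adjust_reference_brightness(y_ref, cur_lum_level, ref_lum_level,
                                step=2.0, y_ref_min=80.0, y_ref_max=160.0):
    """Raise y_ref in bright scenes, lower it in dim ones (assumed limits)."""
    if cur_lum_level > ref_lum_level:
        y_ref += step   # bright scene: a brighter target is acceptable
    elif cur_lum_level < ref_lum_level:
        y_ref -= step   # dim scene: lower the target to limit gain and noise
    return max(y_ref_min, min(y_ref, y_ref_max))
```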
Then, the computer device determines an exposure adjustment direction according to the average brightness of the first image and the adjusted reference brightness: the direction indicates that the exposure needs to be reduced when the average brightness of the first image is greater than the adjusted reference brightness, and that the exposure needs to be increased when the average brightness of the first image is less than the adjusted reference brightness. Thereafter, the computer device adjusts the exposure time and/or gain according to the exposure adjustment direction. For example, when the exposure needs to be reduced, the exposure time and/or gain is decreased, and when the exposure needs to be increased, the exposure time and/or gain is increased.
It should be noted that in the present embodiment the exposure time and/or gain of the endoscope are mainly adjusted; in some other embodiments the aperture of the endoscope may also be adjusted, for example, the aperture is decreased in the case of decreasing the exposure and increased in the case of increasing the exposure.
The flow of the above-described automatic exposure is explained next with reference to fig. 11. Fig. 11 is a flowchart of automatic exposure adjustment according to an embodiment of the present application. Referring to fig. 11, the modules of the computer device that perform automatic exposure include a brightness estimation module, a scene change smoothing module (also referred to as an exposure control sensitivity adaptation module), an illumination-adaptive reference brightness module (also referred to as a reference brightness adjustment module), and an exposure parameter calculation module.
The brightness estimation module is used for calculating the average brightness of the image. The inputs of the brightness estimation module include the position and size of the target view field region (e.g., the coordinates of a rectangular box, or an image mask) and the brightness statistics (brightness information) of the current image (i.e., the first image); the output of the brightness estimation module is the average brightness y_avg of the current image.
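A sketch of such a module over a single luminance plane is shown below, assuming the region is given either as a rectangle or as a binary mask (NumPy is used for brevity):

```python
import numpy as np

def average_brightness(luma, rect=None, mask=None):
    """Average luminance inside the target view field region.

    luma -- 2-D array of per-pixel luminance (e.g., the Y channel)
    rect -- (x, y, w, h) rectangular region, or
    mask -- boolean array of the same shape as luma
    Exactly one of rect/mask is expected.
    """
    if rect is not None:
        x, y, w, h = rect
        return float(luma[y:y + h, x:x + w].mean())
    if mask is not None:
        return float(luma[mask].mean())
    raise ValueError("a rectangular box or an image mask is required")
```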
The scene change smoothing module is used for automatically adjusting the sensitivity of the exposure, where the sensitivity is used to judge whether the exposure adjustment condition is satisfied, i.e., whether exposure adjustment is needed. The inputs of the scene change smoothing module include the average brightness y_avg of the current image, the difference y_Δ between y_avg and the reference brightness of the endoscope exposure (also referred to as the target brightness), and a historical image brightness y_pre (e.g., the average brightness of the previous frame image). The scene change smoothing module determines and outputs the sensitivity according to y_Δ and the difference between y_avg and y_pre. Illustratively, the output sensitivity is 1 or 0.
If the sensitivity output by the scene change smoothing module is 1, the subsequent reference brightness adjustment and exposure parameter calculation (including setting the register values that store the exposure parameters) are carried out. If the sensitivity output by the scene change smoothing module is 0, no subsequent reference brightness adjustment or exposure parameter calculation is needed, i.e., exposure is not adjusted when the scene change is not significant.
The illumination-adaptive reference brightness module is used for adaptively adjusting the reference brightness according to the scene illumination. Its inputs include the average brightness of the current image and the current exposure parameters (including exposure time and gain), from which it estimates the scene illumination; the gain of the endoscope exposure includes at least one of a video gain value and a digital gain value. When the scene illumination is high, the reference brightness is raised appropriately, preventing image overexposure on the premise of ensuring the visual effect; when the scene illumination is low, the reference brightness is lowered appropriately, balancing image brightness against digital noise and preventing the image from being too dark (i.e., preventing underexposure). After the reference brightness is adjusted, the module determines the exposure adjustment direction by comparing the average brightness of the current image with the adjusted reference brightness, and outputs the adjusted reference brightness y_ref together with the exposure adjustment direction.
The exposure parameter calculation module is used for automatically calculating exposure parameters. The input of the exposure parameter calculation module comprises the average brightness of the first image, the exposure adjustment direction, the adjusted reference brightness, the current exposure parameter (comprising exposure time and gain), and the output comprises the adjusted exposure time and/or gain.
It should be noted that the bit widths shown in fig. 11 represent the numbers of storage bits occupied by the brightness, the exposure time, and the gain in the registers of the computer device in one embodiment; for example, the brightness is represented by 8 bits of register data, and the exposure time and the gain are each represented by 16 bits of register data.
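For instance, a computed value might be clamped to its register width before being written back, as in this small sketch (the rounding policy is an assumption):

```python
def to_register(value, bits):
    """Clamp a computed parameter to an unsigned register of the given width.

    Per the embodiment of fig. 11, brightness uses 8-bit registers while
    exposure time and gain use 16-bit registers.
    """
    return max(0, min(int(round(value)), (1 << bits) - 1))

# y_avg    -> to_register(y_avg, 8)     8-bit brightness register
# ST, gain -> to_register(st, 16)       16-bit exposure-time / gain registers
```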
The detailed implementation of the exposure parameter calculation module is described next with reference to fig. 12 to 14. Referring to fig. 12, each time one frame of image is acquired and it is determined that the exposure parameters currently need to be adjusted, the exposure parameter calculation module receives an interrupt signal and acquires the current ST (Shutter Time, also called exposure time) and G (Gain). If the average brightness y_avg of the current image is greater than the adjusted reference brightness y_ref, flow A shown in fig. 13 is executed; otherwise, flow B shown in fig. 14 is executed. The exposure time and/or gain is then adjusted by the trial method shown in flow A or flow B.
In the flow shown in fig. 13, y_avg > y_ref, so the exposure needs to be reduced. If the current exposure time ST is less than the light source period T (the stroboscopic period of the endoscope light source), i.e., ST < T, the gain is set to its minimum value G_min, i.e., G_new = G_min, and a new exposure time is calculated according to the trial method: ST_new = ST × y_ref / y_avg.
If ST ≥ T and G > G_min, the gain is reduced, i.e., G_new = G + 20 × lg(y_ref / y_avg) (the correction term is negative here since y_ref < y_avg), keeping the exposure time unchanged: ST_new = ST.
If ST ≥ T and G ≤ G_min, the gain is kept at the minimum value, G_new = G_min, and a new exposure time is calculated according to the trial method: ST_new = ST × y_ref / y_avg. Since the endoscope exposure must not be affected by the stroboscopic light source, the adjusted exposure time must be an integral multiple of the stroboscopic period T. After ST_new is obtained, the multiple N of T is calculated, N = ST_new / T (taken as an integer); if N > 0, ST_new is set to N × T and the gain is recalculated as G_new = G − 20 × lg(y_ref / y_avg) to compensate, and if N ≤ 0, the adjustment is determined to be complete. Finally, the adjusted exposure time ST_new and gain G_new are obtained by the flow shown in fig. 13.
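Flow A can be sketched as follows; the function name and the integer quantization of ST_new to a multiple of T are our reading of details the flowchart leaves implicit:

```python
import math

def reduce_exposure(st, g, y_avg, y_ref, t, g_min):
    """Sketch of flow A (fig. 13): y_avg > y_ref, reduce the exposure.

    st, g -- current exposure time and gain
    t     -- stroboscopic period of the endoscope light source
    Returns (st_new, g_new).
    """
    if st < t:
        # Short exposures are unaffected by the strobe: trial method on ST.
        return st * y_ref / y_avg, g_min
    if g > g_min:
        # Reduce the gain first; the lg term is negative since y_ref < y_avg.
        return st, g + 20.0 * math.log10(y_ref / y_avg)
    # Gain already minimal: shorten ST, then snap it to a multiple of T.
    st_new = st * y_ref / y_avg
    n = int(st_new // t)
    if n > 0:
        # Compensate the quantization by raising the gain (printed formula).
        return n * t, g - 20.0 * math.log10(y_ref / y_avg)
    return st_new, g_min
```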
In the flow shown in fig. 14, y_avg ≤ y_ref, so the exposure needs to be increased. If the current exposure time ST equals the maximum exposure time ST_max, the exposure time is kept unchanged, ST_new = ST_max, and the gain is increased: G_new = G + 20 × lg(y_ref / y_avg).
If ST ≠ ST_max and ST ≥ N_max × T, the exposure time is increased, ST_new = ST × y_ref / y_avg, keeping the gain unchanged: G_new = G.
If ST ≠ ST_max, ST < N_max × T, and ST < T, the gain is set to the minimum value, G_new = G_min, and a new exposure time is calculated according to the trial method: ST_new = ST × y_ref / y_avg. The multiple N of T is then calculated, N = ST_new / T (taken as an integer), and if N > 0 the exposure time is recalculated as ST_new = N × T. Here N_max represents the number of stable frames corresponding to the frame rate.
If ST ≠ ST_max, ST < N_max × T, and ST ≥ T, then ST can be written as N_r × T, where N_r is a positive integer within the specified range; the gain is increased, G_new = G + 20 × lg(y_ref / y_avg), keeping the exposure time unchanged: ST_new = ST. If G_new ≥ G_min × (N_r + 1) / N_r, the gain is instead set to the minimum value, G_new = G_min, and the exposure time is increased by one step: ST_new = ST + T.
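Under the same assumptions, flow B might look like this; the names and the integer quantization are again ours, not the patent's:

```python
import math

def increase_exposure(st, g, y_avg, y_ref, t, g_min, st_max, n_max):
    """Sketch of flow B (fig. 14): y_avg <= y_ref, increase the exposure."""
    gain_step = 20.0 * math.log10(y_ref / y_avg)  # >= 0 in this flow
    if st == st_max:
        # Exposure time already maximal: only the gain can grow.
        return st_max, g + gain_step
    if st >= n_max * t:
        # Long-exposure branch: lengthen ST, keep the gain.
        return st * y_ref / y_avg, g
    if st < t:
        # Sub-period exposure: trial method on ST, then snap to N * T.
        st_new = st * y_ref / y_avg
        n = int(st_new // t)
        return (n * t if n > 0 else st_new), g_min
    # ST is N_r periods long: prefer a gain bump, else step ST by one period.
    n_r = int(st // t)
    g_new = g + gain_step
    if g_new >= g_min * (n_r + 1) / n_r:
        return st + t, g_min
    return st, g_new
```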
It should be noted that figs. 12 to 14 are only an exemplary illustration of the embodiment of the present application and are not intended to limit it; after the view field region is determined according to the present scheme, the exposure time and gain may also be adjusted by other methods to perform exposure automatically and reasonably.
According to the scheme, the view field region is acquired automatically and accurately, without manual acquisition, which improves the accuracy of the brightness statistics and facilitates the subsequent automatic exposure adjustment. Moreover, for the same scene, the view field region is acquired with little overhead and does not need to be acquired continuously, saving substantial resources. The scheme also obtains a more robust view field region by combining multiple frames of images, making the exposure adjustment more reasonable. In addition, if the view field region is acquired automatically by the camera-system host in the endoscope system, no additional hardware equipment needs to be introduced, so the cost is low.
In summary, in the embodiment of the present application, based on the characteristic that the position and size of the real view field region remain substantially unchanged for a long time during a single use of the endoscope, the view field region is acquired once after the view field region acquisition instruction is received, and exposure is subsequently adjusted automatically based on that region throughout the use of the endoscope. The view field region therefore does not need to be acquired frequently, which substantially reduces the computational overhead of acquiring it. In addition, the scheme automatically adjusts exposure based on the brightness information of the view field region in the image rather than the complete image, so the exposure adjustment is more reasonable, overexposure and underexposure are avoided, and the image quality is higher.
All the above optional technical solutions can be combined arbitrarily to form an optional embodiment of the present application, and the present application embodiment is not described in detail again.
Fig. 15 is a schematic structural diagram of an endoscopic exposure apparatus 1500 provided in an embodiment of the present application, where the endoscopic exposure apparatus 1500 may be implemented by software, hardware, or a combination of the two as part or all of a computer device, which may be the computer device in the above embodiments. Referring to fig. 15, the apparatus 1500 includes:
the determining module 1501 is configured to determine a target field of view region of the endoscope according to the multiple frames of images acquired by the endoscope after receiving the field of view region acquisition instruction;
the adjusting module 1502 is configured to adjust an exposure parameter of the endoscope according to brightness information of a target view field in the first image acquired at the current time.
Optionally, the determining module 1501 includes:
the first determining submodule is used for inputting each frame of the multiple frames of images into the first target detection model or the first image segmentation model respectively, outputting the alternative view field region corresponding to each image respectively, and obtaining the target view field region based on the multiple alternative view field regions corresponding to the multiple frames of images; or,
and the second determining submodule is used for simultaneously inputting the multi-frame images into the second target detection model or the second image segmentation model and outputting the target view field.
Optionally, in a mode of determining a target visual field area based on the first target detection model or the second target detection model, the obtained target visual field area is represented by a rectangular frame, and the target visual field area represents a circumscribed rectangular area or an inscribed rectangular area of the real visual field of the endoscope; or,
in a manner of determining a target field of view region based on the first image segmentation model or the second image segmentation model, the obtained target field of view region is represented using an image mask, the target field of view region characterizing a real field of view of the endoscope.
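As a concrete illustration of the two representations, the sketch below derives the circumscribed and inscribed rectangles from a roughly circular image mask (the real field of view of an endoscope is typically circular); the helper itself is an assumption for illustration, not part of the patent:

```python
import numpy as np

def rects_from_circular_mask(mask):
    """Return the circumscribed and inscribed axis-aligned rectangles,
    each as (x, y, w, h), for a non-empty, roughly circular boolean mask.
    """
    ys, xs = np.nonzero(mask)
    x0, x1 = xs.min(), xs.max()
    y0, y1 = ys.min(), ys.max()
    circumscribed = (x0, y0, x1 - x0 + 1, y1 - y0 + 1)

    # For a circle of radius r centered at (cx, cy), the largest inscribed
    # axis-aligned square has half-side r / sqrt(2).
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    r = min(x1 - x0, y1 - y0) / 2.0
    h = r / np.sqrt(2.0)
    inscribed = (int(cx - h), int(cy - h), int(2 * h), int(2 * h))
    return circumscribed, inscribed
```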
Optionally, the adjusting module 1502 includes:
the statistic submodule is used for counting the brightness information of the target view field in the first image to obtain the average brightness of the first image;
and the first adjusting sub-module is used for adjusting the exposure parameter of the endoscope if the exposure adjusting condition is determined to be met currently according to the average brightness of the first image, the reference brightness of the endoscope exposure and the historical average brightness, and the historical average brightness represents the average brightness of at least one frame of image acquired before the first image.
Optionally, the adjusting module 1502 further comprises:
a third determining sub-module, configured to determine that the exposure adjustment condition is currently satisfied if a difference between the average brightness of the first image and the reference brightness exceeds a first range; or,
and the fourth determination submodule is used for determining that the exposure adjustment condition is currently met if the difference value between the average brightness of the first image and the reference brightness does not exceed the first range and the difference value between the average brightness of the first image and the historical average brightness exceeds the second range.
Optionally, the adjusting module 1502 includes:
the fifth determining submodule is used for determining the scene illumination according to the average brightness of the first image and the exposure time and the gain included by the exposure parameters;
the second adjusting submodule is used for adjusting the reference brightness according to the scene illumination, and the adjusted reference brightness is in positive correlation with the scene illumination;
the sixth determining submodule is used for determining the exposure adjusting direction according to the average brightness of the first image and the adjusted reference brightness;
and the third adjusting submodule is used for adjusting the exposure time and/or the gain according to the exposure adjusting direction.
Optionally, the endoscope comprises a bayonet and a front-end lens for insertion into the body, the bayonet and the front-end lens being connected;
the apparatus 1500 further comprises:
and the visual field stabilizing module is used for adjusting the object distance and/or the focal distance of the front-end lens based on the bayonet of the endoscope before receiving a visual field area acquisition instruction so as to stabilize the visual field area acquired by the front-end lens.
In the embodiment of the application, based on the characteristic that the position and size of the real view field region remain substantially unchanged for a long time during a single use of the endoscope, the view field region is acquired once after the view field region acquisition instruction is received, and exposure is subsequently adjusted automatically based on that region throughout the use of the endoscope, so the view field region does not need to be acquired frequently, which substantially reduces the computational overhead of acquiring it. In addition, the scheme automatically adjusts exposure based on the brightness information of the view field region in the image rather than the complete image, so the exposure adjustment is more reasonable, overexposure and underexposure are avoided, and the image quality is higher.
It should be noted that: in the endoscope exposure apparatus provided in the above embodiment, only the division of the above functional modules is exemplified when performing automatic exposure, and in practical applications, the above functions may be distributed by different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the above described functions. In addition, the endoscope exposure device provided by the above embodiment and the endoscope exposure method embodiment belong to the same concept, and the specific implementation process thereof is described in the method embodiment, which is not described herein again.
Fig. 16 is a schematic structural diagram of a terminal 1600 according to an embodiment of the present application. The terminal 1600 may be: a smartphone, a tablet, a laptop, or a desktop computer. Terminal 1600 may also be referred to by other names such as computer device, user equipment, portable terminal, laptop terminal, desktop terminal, and so forth.
Generally, terminal 1600 includes: a processor 1601, and a memory 1602.
In some embodiments, the terminal 1600 may also optionally include: peripheral interface 1603 and at least one peripheral. Processor 1601, memory 1602 and peripheral interface 1603 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1603 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1604, a display 1605, a camera assembly 1606, audio circuitry 1607, a positioning assembly 1608, and a power supply 1609.
The Radio Frequency circuit 1604 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1604 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 1604 converts the electrical signal into an electromagnetic signal to be transmitted, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1604 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1604 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display 1605 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 1605 is a touch display screen, it also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 1601 as a control signal for processing. At this point, the display 1605 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 1605, disposed on the front panel of the terminal 1600; in other embodiments, there may be at least two displays 1605, respectively disposed on different surfaces of the terminal 1600 or in a folded design; in still other embodiments, the display 1605 may be a flexible display disposed on a curved or folded surface of the terminal 1600. The display 1605 may even be arranged in a non-rectangular, irregular pattern, i.e., a special-shaped screen. The display 1605 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 1606 is used to capture images or video. Optionally, camera assembly 1606 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1606 can also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1607 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1601 for processing or inputting the electric signals to the radio frequency circuit 1604 to achieve voice communication. For stereo sound acquisition or noise reduction purposes, the microphones may be multiple and disposed at different locations of terminal 1600. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1601 or the radio frequency circuit 1604 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1607 may also include a headphone jack.
The positioning component 1608 is configured to locate a current geographic Location of the terminal 1600 for purposes of navigation or LBS (Location Based Service). The Positioning component 1608 may be a Positioning component based on the united states GPS (Global Positioning System), the chinese beidou System, the russian graves System, or the european union galileo System.
In some embodiments, terminal 1600 also includes one or more sensors 1610. The one or more sensors 1610 include, but are not limited to: acceleration sensor 1611, gyro sensor 1612, pressure sensor 1613, fingerprint sensor 1614, optical sensor 1615, and proximity sensor 1616.
Acceleration sensor 1611 may detect acceleration in three coordinate axes of a coordinate system established with terminal 1600. For example, the acceleration sensor 1611 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1601 may control the display screen 1605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1611. The acceleration sensor 1611 may also be used for acquisition of motion data of a game or a user.
The gyroscope sensor 1612 can detect the body orientation and rotation angle of the terminal 1600, and can cooperate with the acceleration sensor 1611 to capture the user's 3D actions on the terminal 1600. Based on the data collected by the gyroscope sensor 1612, the processor 1601 can implement the following functions: motion sensing (such as changing the UI according to a user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
Pressure sensors 1613 may be disposed on the side frames of terminal 1600 and/or underlying display 1605. When the pressure sensor 1613 is disposed on the side frame of the terminal 1600, a user's holding signal of the terminal 1600 can be detected, and the processor 1601 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1613. When the pressure sensor 1613 is disposed at the lower layer of the display 1605, the processor 1601 controls the operability control on the UI interface according to the pressure operation of the user on the display 1605. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1614 is configured to collect a user's fingerprint, and the processor 1601 identifies the user based on the fingerprint collected by the fingerprint sensor 1614, or the fingerprint sensor 1614 itself identifies the user based on the collected fingerprint. Upon recognizing the user's identity as a trusted identity, the processor 1601 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 1614 may be disposed on the front, back, or side of the terminal 1600. When a physical key or vendor Logo is provided on the terminal 1600, the fingerprint sensor 1614 may be integrated with the physical key or vendor Logo.
The optical sensor 1615 is used to collect ambient light intensity. In one embodiment, the processor 1601 may control the display brightness of the display screen 1605 based on the ambient light intensity collected by the optical sensor 1615. Specifically, when the ambient light intensity is high, the display luminance of the display screen 1605 is increased; when the ambient light intensity is low, the display brightness of the display screen 1605 is adjusted down. In another embodiment, the processor 1601 may also dynamically adjust the shooting parameters of the camera assembly 1606 based on the ambient light intensity collected by the optical sensor 1615.
A proximity sensor 1616, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 1600. The proximity sensor 1616 is used to collect the distance between the user and the front surface of the terminal 1600. In one embodiment, when the proximity sensor 1616 detects that the distance between the user and the front surface of the terminal 1600 gradually decreases, the processor 1601 controls the display 1605 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1616 detects that the distance gradually increases, the processor 1601 controls the display 1605 to switch from the off-screen state back to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 16 is not intended to be limiting of terminal 1600, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
Embodiments of the present application also provide a non-transitory computer-readable storage medium, wherein when the instructions in the storage medium are executed by a processor of a terminal, the terminal is enabled to execute the endoscope exposure method provided in the above embodiments.
Embodiments of the present application also provide a computer program product containing instructions, which when run on a terminal, cause the terminal to execute the endoscope exposure method provided by the above embodiments.
Fig. 17 is a schematic diagram illustrating a server structure of an endoscopic exposure apparatus according to an exemplary embodiment. The server may be a server in a cluster of background servers. Specifically, the method comprises the following steps:
the server 1700 includes a Central Processing Unit (CPU)1701, a system memory 1704 including a Random Access Memory (RAM)1702 and a Read Only Memory (ROM)1703, and a system bus 1705 connecting the system memory 1704 and the central processing unit 1701. The server 1700 also includes a basic input/output system (I/O system) 1706 for facilitating the transfer of information between devices within the computer, and a mass storage device 1707 for storing an operating system 1713, application programs 1714, and other program modules 1715.
The basic input/output system 1706 includes a display 1708 for displaying information and an input device 1709 such as a mouse, keyboard, etc. for user input of information. Wherein a display 1708 and an input device 1709 are connected to the central processing unit 1701 via an input-output controller 1710 connected to the system bus 1705. The basic input/output system 1706 may also include an input/output controller 1710 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, the input-output controller 1710 may also provide output to a display screen, a printer, or other type of output device.
The mass storage device 1707 is connected to the central processing unit 1701 through a mass storage controller (not shown) connected to the system bus 1705. The mass storage device 1707 and its associated computer-readable media provide non-volatile storage for the server 1700. That is, the mass storage device 1707 may include a computer-readable medium (not shown), such as a hard disk or CD-ROM drive.
Without loss of generality, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that computer storage media is not limited to the foregoing. The system memory 1704 and mass storage device 1707 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server 1700 may also operate with remote computers connected to a network through a network, such as the Internet. That is, the server 1700 may be connected to the network 1712 through the network interface unit 1711 connected to the system bus 1705, or may be connected to other types of networks or remote computer systems (not shown) using the network interface unit 1711.
The memory further includes one or more programs, and the one or more programs are stored in the memory and configured to be executed by the CPU. The one or more programs include instructions for performing the endoscopic exposure method provided by the embodiments of the present application.
Embodiments of the present application also provide a non-transitory computer-readable storage medium, wherein when the instructions in the storage medium are executed by a processor of a server, the server is enabled to execute the endoscope exposure method provided by the above embodiments.
Embodiments of the present application also provide a computer program product containing instructions, which when run on a server, cause the server to execute the endoscope exposure method provided by the above embodiments.
In some embodiments, a computer-readable storage medium is also provided, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the endoscopic exposure method in the above-mentioned embodiments. For example, the computer readable storage medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is noted that the computer-readable storage medium referred to in the embodiments of the present application may be a non-volatile storage medium, in other words, a non-transitory storage medium.
It should be understood that all or part of the steps for implementing the above embodiments may be implemented by software, hardware, firmware or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The computer instructions may be stored in the computer-readable storage medium described above.
That is, in some embodiments, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the steps of the endoscopic exposure method described above.
It is to be understood that reference herein to "at least one" means one or more, and "a plurality" means two or more. In the description of the embodiments of the present application, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, to facilitate a clear description of the technical solutions of the embodiments of the present application, terms such as "first" and "second" are used to distinguish identical or similar items having substantially the same functions and actions. Those skilled in the art will appreciate that the terms "first", "second", etc. do not denote any order, quantity, or importance.
The above-mentioned embodiments are provided not to limit the present application, and any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.
Claims (10)
1. An endoscopic exposure method, characterized in that the method comprises:
after receiving a visual field area acquisition instruction, determining a target visual field area of the endoscope according to a multi-frame image acquired by the endoscope;
and adjusting the exposure parameters of the endoscope according to the brightness information of the target visual field area in the first image acquired at the current time.
2. The method of claim 1, wherein determining a target field of view region of the endoscope from a plurality of frames of images acquired by the endoscope comprises:
inputting each frame of image in the multiple frames of images into a first target detection model or a first image segmentation model respectively, outputting an alternative view field region corresponding to each image respectively, and obtaining the target view field region based on a plurality of alternative view field regions respectively corresponding to the multiple frames of images; or,
and simultaneously inputting the multi-frame images into a second target detection model or a second image segmentation model, and outputting the target view field.
3. The method according to claim 2, wherein in the determining the target field of view region based on the first target detection model or the second target detection model, the resulting target field of view region is represented by a rectangular box, the target field of view region representing a circumscribed rectangular region or an inscribed rectangular region of the endoscope's real field of view; or,
in a manner of determining the target field of view region based on the first image segmentation model or the second image segmentation model, the obtained target field of view region is represented by an image mask, and the target field of view region represents a real field of view of the endoscope.
4. The method according to any one of claims 1-3, wherein adjusting the exposure parameters of the endoscope based on brightness information of the target field of view in the first image acquired at the current time comprises:
counting the brightness information of the target view field in the first image to obtain the average brightness of the first image;
and if the exposure adjustment condition is determined to be met currently according to the average brightness of the first image, the reference brightness of the endoscope exposure and historical average brightness, adjusting the exposure parameter of the endoscope, wherein the historical average brightness represents the average brightness of at least one frame of image acquired before the first image.
5. The method according to claim 4, wherein after the counting of the brightness information of the target view field region in the first image to obtain the average brightness of the first image, the method further comprises:
determining that the exposure adjustment condition is currently satisfied if a difference between the average brightness of the first image and the reference brightness exceeds a first range; or,
determining that the exposure adjustment condition is currently satisfied if a difference between the average brightness of the first image and the reference brightness does not exceed the first range and a difference between the average brightness of the first image and the historical average brightness exceeds a second range.
6. The method of claim 4, wherein the adjusting the exposure parameters of the endoscope comprises:
determining scene illumination according to the average brightness of the first image and exposure time and gain included by the exposure parameters;
adjusting the reference brightness according to the scene illumination, wherein the adjusted reference brightness is positively correlated with the scene illumination;
determining an exposure adjustment direction according to the average brightness of the first image and the adjusted reference brightness;
and adjusting the exposure time and/or the gain according to the exposure adjustment direction.
7. A method according to any of claims 1-3, wherein the endoscope comprises a bayonet and a front lens for insertion into the body, the bayonet and the front lens being connected;
the method further comprises the following steps:
before receiving the view field acquisition instruction, adjusting the object distance and/or the focal distance of the front end lens based on the bayonet of the endoscope so as to stabilize the view field acquired by the front end lens.
8. An endoscopic exposure apparatus, characterized in that the apparatus comprises:
the determining module is used for determining a target visual field area of the endoscope according to the multi-frame images collected by the endoscope after receiving a visual field area acquisition instruction;
and the adjusting module is used for adjusting the exposure parameters of the endoscope according to the brightness information of the target visual field area in the first image acquired at the current time.
9. The apparatus of claim 8, wherein the determining module comprises:
the first determining submodule is used for inputting each frame of image in the multiple frames of images into a first target detection model or a first image segmentation model respectively, outputting an alternative view field region corresponding to each image respectively, and obtaining the target view field region based on a plurality of alternative view field regions corresponding to the multiple frames of images; or,
the second determining submodule is used for simultaneously inputting the multi-frame images into a second target detection model or a second image segmentation model and outputting the target view field;
in a mode of determining the target visual field area based on the first target detection model or the second target detection model, the obtained target visual field area is represented by a rectangular frame, and the target visual field area represents a circumscribed rectangular area or an inscribed rectangular area of the real visual field of the endoscope; or,
in a manner of determining the target field of view region based on the first image segmentation model or the second image segmentation model, the obtained target field of view region is represented by an image mask, and the target field of view region represents a real field of view of the endoscope;
wherein the adjustment module comprises:
the statistic submodule is used for counting the brightness information of the target view field in the first image to obtain the average brightness of the first image;
a first adjusting sub-module, configured to adjust an exposure parameter of the endoscope if it is determined that an exposure adjustment condition is currently satisfied according to an average brightness of the first image, a reference brightness of the endoscope exposure, and a historical average brightness, where the historical average brightness represents an average brightness of at least one frame of image acquired before the first image;
wherein the adjusting module further comprises:
a third determining sub-module, configured to determine that the exposure adjustment condition is currently satisfied if a difference between the average brightness of the first image and the reference brightness exceeds a first range; or,
a fourth determination sub-module configured to determine that the exposure adjustment condition is currently satisfied if a difference between the average brightness of the first image and the reference brightness does not exceed the first range and a difference between the average brightness of the first image and the historical average brightness exceeds a second range;
wherein the adjustment module comprises:
a fifth determining submodule, configured to determine scene illuminance according to the average brightness of the first image and the exposure time and gain included in the exposure parameter;
the second adjusting submodule is used for adjusting the reference brightness according to the scene illumination, and the adjusted reference brightness is positively correlated with the scene illumination;
a sixth determining submodule, configured to determine an exposure adjustment direction according to the average brightness of the first image and the adjusted reference brightness;
a third adjusting submodule, configured to adjust the exposure time and/or the gain according to the exposure adjusting direction;
the endoscope comprises a bayonet and a front-end lens which is inserted into a body, wherein the bayonet is connected with the front-end lens;
the device further comprises:
and the visual field stabilizing module is used for adjusting the object distance and/or the focal distance of the front-end lens based on the bayonet of the endoscope before receiving the visual field area acquisition instruction so as to stabilize the visual field area obtained by the front-end lens.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the method of any of the preceding claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110231587.7A CN112911165B (en) | 2021-03-02 | 2021-03-02 | Endoscope exposure method, device and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110231587.7A CN112911165B (en) | 2021-03-02 | 2021-03-02 | Endoscope exposure method, device and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112911165A true CN112911165A (en) | 2021-06-04 |
CN112911165B CN112911165B (en) | 2023-06-16 |
Family
ID=76108597
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110231587.7A Active CN112911165B (en) | 2021-03-02 | 2021-03-02 | Endoscope exposure method, device and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112911165B (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113744266A (en) * | 2021-11-03 | 2021-12-03 | 武汉楚精灵医疗科技有限公司 | Method and device for displaying focus detection frame, electronic equipment and storage medium |
CN114052909A (en) * | 2021-12-01 | 2022-02-18 | 辽宁北镜医疗科技有限公司 | Multifunctional navigation system and navigation method in near-infrared fluoroscopy |
CN114449175A (en) * | 2022-01-13 | 2022-05-06 | 瑞芯微电子股份有限公司 | Automatic exposure adjusting method, automatic exposure adjusting device, image acquisition method, medium and equipment |
CN114727027A (en) * | 2022-03-09 | 2022-07-08 | 浙江华诺康科技有限公司 | Exposure parameter adjusting method and device, computer equipment and storage medium |
CN115835448A (en) * | 2022-12-28 | 2023-03-21 | 无锡车联天下信息技术有限公司 | Method and device for adjusting light, endoscope equipment and medium |
CN115984282A (en) * | 2023-03-21 | 2023-04-18 | 菲特(天津)检测技术有限公司 | Spandex product detection method, device, equipment and storage medium |
CN118695104A (en) * | 2024-08-26 | 2024-09-24 | 南京诺源医疗器械有限公司 | Dynamic exposure method and system for medical endoscope image |
Citations (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6836288B1 (en) * | 1999-02-09 | 2004-12-28 | Linvatec Corporation | Automatic exposure control system and method |
JP2011183055A (en) * | 2010-03-10 | 2011-09-22 | Hoya Corp | Endoscope processor, and endoscope unit |
US20130208140A1 (en) * | 2012-02-15 | 2013-08-15 | Harman Becker Automotive Systems Gmbh | Brightness adjustment system |
US20140078277A1 (en) * | 2012-09-19 | 2014-03-20 | Omnivision Technologies, Inc. | Acquiring global shutter-type video images with cmos pixel array by strobing light during vertical blanking period in otherwise dark environment |
CN106456276A (en) * | 2014-03-17 | 2017-02-22 | 直观外科手术操作公司 | System and method for tissue contact detection and for auto-exposure and illumination control |
CN106791475A (en) * | 2017-01-23 | 2017-05-31 | 上海兴芯微电子科技有限公司 | Exposure adjustment method and the vehicle mounted imaging apparatus being applicable |
CN108292366A (en) * | 2015-09-10 | 2018-07-17 | 美基蒂克艾尔有限公司 | The system and method that suspect tissue region is detected in endoscopic surgery |
US20180243043A1 (en) * | 2017-02-24 | 2018-08-30 | Sony Olympus Medical Solutions Inc. | Endoscope device |
CN110458883A (en) * | 2019-03-07 | 2019-11-15 | 腾讯科技(深圳)有限公司 | A kind of processing system of medical imaging, method, apparatus and equipment |
CN110477844A (en) * | 2019-08-22 | 2019-11-22 | 重庆金山医疗技术研究院有限公司 | A kind of method, system and capsule endoscopic preventing acquisition picture overexposure |
WO2020029732A1 (en) * | 2018-08-06 | 2020-02-13 | Oppo广东移动通信有限公司 | Panoramic photographing method and apparatus, and imaging device |
WO2020038072A1 (en) * | 2018-08-22 | 2020-02-27 | Oppo广东移动通信有限公司 | Exposure control method and device, and electronic device |
WO2020038069A1 (en) * | 2018-08-22 | 2020-02-27 | Oppo广东移动通信有限公司 | Exposure control method and device, and electronic apparatus |
CN111144376A (en) * | 2019-12-31 | 2020-05-12 | 华南理工大学 | Video target detection feature extraction method |
CN111182232A (en) * | 2019-12-31 | 2020-05-19 | 浙江华诺康科技有限公司 | Exposure parameter adjusting method, device, equipment and computer readable storage medium |
CN111343388A (en) * | 2019-04-11 | 2020-06-26 | 杭州海康慧影科技有限公司 | Method and device for determining exposure time |
CN111343389A (en) * | 2019-05-16 | 2020-06-26 | 杭州海康慧影科技有限公司 | Automatic exposure control method and device |
CN111343387A (en) * | 2019-03-06 | 2020-06-26 | 杭州海康慧影科技有限公司 | Automatic exposure method and device for camera equipment |
US20210004959A1 (en) * | 2018-10-31 | 2021-01-07 | Tencent Technology (Shenzhen) Company Limited | Colon polyp image processing method and apparatus, and system |
CN112235512A (en) * | 2020-09-16 | 2021-01-15 | 浙江大华技术股份有限公司 | Image exposure parameter adjusting method, equipment and device |
CN112312033A (en) * | 2020-10-23 | 2021-02-02 | 浙江华诺康科技有限公司 | Exposure parameter determination method and device and electronic device |
US20210150251A1 (en) * | 2018-05-18 | 2021-05-20 | Hangzhou Hikvision Digital Technology Co., Ltd. | Method and apparatus for calculating a luminance value of a region of interest |
- 2021-03-02: Application CN202110231587.7A filed in China; granted as CN112911165B (status: Active)
Patent Citations (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6836288B1 (en) * | 1999-02-09 | 2004-12-28 | Linvatec Corporation | Automatic exposure control system and method |
JP2011183055A (en) * | 2010-03-10 | 2011-09-22 | Hoya Corp | Endoscope processor, and endoscope unit |
US20130208140A1 (en) * | 2012-02-15 | 2013-08-15 | Harman Becker Automotive Systems Gmbh | Brightness adjustment system |
US20140078277A1 (en) * | 2012-09-19 | 2014-03-20 | Omnivision Technologies, Inc. | Acquiring global shutter-type video images with cmos pixel array by strobing light during vertical blanking period in otherwise dark environment |
CN106456276A (en) * | 2014-03-17 | 2017-02-22 | 直观外科手术操作公司 | System and method for tissue contact detection and for auto-exposure and illumination control |
CN108292366A (en) * | 2015-09-10 | 2018-07-17 | 美基蒂克艾尔有限公司 | The system and method that suspect tissue region is detected in endoscopic surgery |
CN106791475A (en) * | 2017-01-23 | 2017-05-31 | 上海兴芯微电子科技有限公司 | Exposure adjustment method and the vehicle mounted imaging apparatus being applicable |
US20180243043A1 (en) * | 2017-02-24 | 2018-08-30 | Sony Olympus Medical Solutions Inc. | Endoscope device |
US20210150251A1 (en) * | 2018-05-18 | 2021-05-20 | Hangzhou Hikvision Digital Technology Co., Ltd. | Method and apparatus for calculating a luminance value of a region of interest |
WO2020029732A1 (en) * | 2018-08-06 | 2020-02-13 | Oppo广东移动通信有限公司 | Panoramic photographing method and apparatus, and imaging device |
WO2020038072A1 (en) * | 2018-08-22 | 2020-02-27 | Oppo广东移动通信有限公司 | Exposure control method and device, and electronic device |
WO2020038069A1 (en) * | 2018-08-22 | 2020-02-27 | Oppo广东移动通信有限公司 | Exposure control method and device, and electronic apparatus |
US20210004959A1 (en) * | 2018-10-31 | 2021-01-07 | Tencent Technology (Shenzhen) Company Limited | Colon polyp image processing method and apparatus, and system |
CN111343387A (en) * | 2019-03-06 | 2020-06-26 | 杭州海康慧影科技有限公司 | Automatic exposure method and device for camera equipment |
CN110458883A (en) * | 2019-03-07 | 2019-11-15 | 腾讯科技(深圳)有限公司 | A kind of processing system of medical imaging, method, apparatus and equipment |
CN111343388A (en) * | 2019-04-11 | 2020-06-26 | 杭州海康慧影科技有限公司 | Method and device for determining exposure time |
CN111343389A (en) * | 2019-05-16 | 2020-06-26 | 杭州海康慧影科技有限公司 | Automatic exposure control method and device |
CN110477844A (en) * | 2019-08-22 | 2019-11-22 | 重庆金山医疗技术研究院有限公司 | A kind of method, system and capsule endoscopic preventing acquisition picture overexposure |
CN111144376A (en) * | 2019-12-31 | 2020-05-12 | 华南理工大学 | Video target detection feature extraction method |
CN111182232A (en) * | 2019-12-31 | 2020-05-19 | 浙江华诺康科技有限公司 | Exposure parameter adjusting method, device, equipment and computer readable storage medium |
CN112235512A (en) * | 2020-09-16 | 2021-01-15 | 浙江大华技术股份有限公司 | Image exposure parameter adjusting method, equipment and device |
CN112312033A (en) * | 2020-10-23 | 2021-02-02 | 浙江华诺康科技有限公司 | Exposure parameter determination method and device and electronic device |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113744266A (en) * | 2021-11-03 | 2021-12-03 | 武汉楚精灵医疗科技有限公司 | Method and device for displaying focus detection frame, electronic equipment and storage medium |
CN113744266B (en) * | 2021-11-03 | 2022-02-08 | 武汉楚精灵医疗科技有限公司 | Method and device for displaying focus detection frame, electronic equipment and storage medium |
CN114052909A (en) * | 2021-12-01 | 2022-02-18 | 辽宁北镜医疗科技有限公司 | Multifunctional navigation system and navigation method in near-infrared fluoroscopy |
CN114449175A (en) * | 2022-01-13 | 2022-05-06 | 瑞芯微电子股份有限公司 | Automatic exposure adjusting method, automatic exposure adjusting device, image acquisition method, medium and equipment |
CN114727027A (en) * | 2022-03-09 | 2022-07-08 | 浙江华诺康科技有限公司 | Exposure parameter adjusting method and device, computer equipment and storage medium |
CN114727027B (en) * | 2022-03-09 | 2024-04-05 | 浙江华诺康科技有限公司 | Exposure parameter adjusting method, device, computer equipment and storage medium |
CN115835448A (en) * | 2022-12-28 | 2023-03-21 | 无锡车联天下信息技术有限公司 | Method and device for adjusting light, endoscope equipment and medium |
CN115984282A (en) * | 2023-03-21 | 2023-04-18 | 菲特(天津)检测技术有限公司 | Spandex product detection method, device, equipment and storage medium |
CN115984282B (en) * | 2023-03-21 | 2023-06-16 | 菲特(天津)检测技术有限公司 | Spandex product detection method, device, equipment and storage medium |
CN118695104A (en) * | 2024-08-26 | 2024-09-24 | 南京诺源医疗器械有限公司 | Dynamic exposure method and system for medical endoscope image |
Also Published As
Publication number | Publication date |
---|---|
CN112911165B (en) | 2023-06-16 |
Similar Documents
Publication | Title |
---|---|
CN112911165B (en) | Endoscope exposure method, device and computer readable storage medium |
CN111641778B (en) | Shooting method, device and equipment |
CN109547701B (en) | Image shooting method and device, storage medium and electronic equipment |
WO2021093793A1 (en) | Capturing method and electronic device |
WO2019183819A1 (en) | Photographic method, photographic apparatus, and mobile terminal |
WO2021129198A1 (en) | Method for photography in long-focal-length scenario, and terminal |
KR20160118001A (en) | Photographing apparatus, method for controlling the same, and computer-readable recording medium |
US20230247286A1 (en) | Photographing Method and Electronic Device |
KR20190012465A (en) | Electronic device for acquiring image using plurality of cameras and method for processing image using the same |
CN116896675A (en) | Electronic device for stabilizing image and method of operating the same |
JP2012199675A (en) | Image processing apparatus, image processing method, and program |
CN108616691B (en) | Photographing method and device based on automatic white balance, server and storage medium |
CN114092364A (en) | Image processing method and related device |
CN109302632B (en) | Method, device, terminal and storage medium for acquiring live video picture |
CN111064895B (en) | Virtual shooting method and electronic equipment |
WO2022206589A1 (en) | Image processing method and related device |
KR20170009089A (en) | Method and photographing device for controlling a function based on a gesture of a user |
CN114693593A (en) | Image processing method, device and computer device |
US10009545B2 (en) | Image processing apparatus and method of operating the same |
CN114757866A (en) | Definition detection method, device and computer storage medium |
CN111050211B (en) | Video processing method, device and storage medium |
CN108304841B (en) | Method, device and storage medium for nipple positioning |
CN115150543B (en) | Shooting method, shooting device, electronic equipment and readable storage medium |
CN115601316A (en) | Image processing method, image processing device, electronic equipment and computer readable storage medium |
CN114913113A (en) | Method, device and equipment for processing image |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |