US20220201987A1 - Lighting controller for sea lice detection - Google Patents
- Publication number
- US20220201987A1 (application US 17/697,388)
- Authority
- US
- United States
- Legal status: Pending
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K61/00—Culture of aquatic animals
- A01K61/10—Culture of aquatic animals of fish
- A01K61/13—Prevention or treatment of fish diseases
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K63/00—Receptacles for live fish, e.g. aquaria; Terraria
- A01K63/06—Arrangements for heating or lighting in, or attached to, receptacles for live fish
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H04N5/2256
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A40/00—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
- Y02A40/80—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
- Y02A40/81—Aquaculture, e.g. of fish
Definitions
- This specification generally describes lighting controllers, particularly those used for aquaculture.
- Image analysis can be performed to detect sea lice or other skin features, including lesions, on the fish. Detection can be automatic and can inform various mitigation techniques. For sea lice detection, mitigation can include methods of delousing.
- Illuminator lights with specific frequencies are controlled by a lighting controller to coincide with camera exposures. The specific frequency of light is chosen for properties likely to aid in the detection of sea lice as well as skin lesions, a shortened operculum, or other physical deformities and skin features.
- Illuminator light controllers can use pulse patterns to illuminate a fish with specific frequency light.
- Red and blue light-emitting diodes (LEDs) can be used as illuminators.
- A camera can transfer images to a computer, which performs visual analysis to detect attached sea lice.
- The different colors of light can highlight different features of interest while improving clarity for sea lice detection.
- This analysis can inform sea lice detection.
- The wavelength of a beam of light can change depending on the medium in which the beam propagates.
- The visible spectrum is continuous. Wavelength ranges for given colors within the continuous spectrum are approximate, but wavelength or frequency can be used to clearly differentiate two or more colors.
- The detection information for specific fish can be stored.
- The stored data can be used for lice mitigation, other diagnoses, or in producing analytics.
- A fish can be detected, by a system employing image analysis, to have a certain quantity of sea lice attached to the right-side gill.
- This information can be passed to an automatic delouser, which can remove the sea lice.
- This information can also be stored on a server to inform population analytics.
- The lighting controller can use pairs of light pulses.
- The lighting controller can use a red light and a blue light to illuminate a fish.
- The red light and the blue light can alternate illuminating the fish such that, at one point, the fish is illuminated by the red light and, at another point, by the blue light.
- Images can be captured of the fish while it is illuminated by the red light.
- Images can also be captured of the fish while it is illuminated by the blue light.
- Image processing can combine an image captured under red illumination with an image captured under blue illumination to determine whether the fish has a certain condition. Conditions can include a sea lice infection, a lesion on the body of the fish, or a physical deformity such as a shortened operculum.
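As a rough sketch of how two single-illuminant exposures might be combined, the toy heuristic below flags pixels whose red-lit and blue-lit intensities differ strongly. The difference heuristic, the list-of-lists image representation, and all function names are illustrative assumptions, not the patented analysis method.

```python
def combine_channels(red_img, blue_img):
    """Per-pixel contrast between a red-lit and a blue-lit exposure.
    A feature that reflects the two wavelengths differently produces a
    large value here. (Illustrative heuristic only.)"""
    return [
        [abs(r - b) for r, b in zip(red_row, blue_row)]
        for red_row, blue_row in zip(red_img, blue_img)
    ]

def flag_candidates(contrast, threshold=50):
    """Return (row, col) pixels whose red/blue contrast exceeds a threshold."""
    return [
        (i, j)
        for i, row in enumerate(contrast)
        for j, v in enumerate(row)
        if v > threshold
    ]

red = [[10, 200, 12], [11, 13, 12]]    # toy 2x3 red-lit exposure
blue = [[12, 40, 11], [10, 12, 13]]    # toy 2x3 blue-lit exposure
contrast = combine_channels(red, blue)
print(flag_candidates(contrast))       # pixel (0, 1) stands out
```

A real implementation would operate on registered camera frames and feed the contrast map into the detection stages described below, but the combination step itself can be this simple.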
- The lighting controller can be used in any area with fish.
- The lighting controller can be used within a fish pen.
- The lighting controller can also be used within a fish run.
- The lighting controller can include a blue light with a specific frequency range.
- The lighting controller can include a blue light that can produce peak power within a wavelength range of 450 nanometers to 480 nanometers.
- The lighting controller can have a certain frequency at which illuminators alternate.
- The lighting controller can use pairs of light pulses which alternate on and off more than sixty times a second.
- The specific frequency can be chosen to ensure that a fish does not perceive the illuminators alternating.
- The specific frequency can be chosen to ensure that a fish perceives the illuminators as steady sources of light.
- Camera exposures can be timed to coincide with periods of time in which a fish is illuminated.
- A camera can open for exposures for a portion of the time between when an illuminator is on and illuminating a fish and when the illuminator is off and not illuminating the fish.
- A camera can open for exposures for a portion of the time between when an illuminator is off and not illuminating a fish and when the illuminator is on and illuminating the fish.
- The lighting controller can activate illuminators without any overlap. For example, the lighting controller can illuminate a fish with a blue light for a period of time, stop illuminating the fish with the blue light, and then illuminate the fish with a red light for a period of time.
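The non-overlapping alternation described above can be sketched as a pulse schedule. The function name, the 5 ms pulse widths, and the 1 ms guard gap are assumptions chosen to match the examples given later for patterns 200 and 300, not fixed values from the claims.

```python
def pulse_schedule(n_pairs, blue_ms=5, red_ms=5, gap_ms=1):
    """Build a non-overlapping blue/red pulse schedule.
    Each entry is (color, start_ms, end_ms); a short gap separates
    pulses so the fish is never lit by both LEDs at once."""
    schedule, t = [], 0
    for _ in range(n_pairs):
        schedule.append(("blue", t, t + blue_ms))
        t += blue_ms + gap_ms
        schedule.append(("red", t, t + red_ms))
        t += red_ms + gap_ms
    return schedule

for color, start, end in pulse_schedule(2):
    print(f"{color:>4}: {start}-{end} ms")
```

Running this prints two blue/red pairs with every pulse ending before the next begins, which is the property the controller relies on to attribute each camera exposure to exactly one illuminant.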
- Machine learning can be used to inform elements of the detection process.
- The lighting controller can vary the time of camera exposure or illumination depending on current background lighting levels or the type of fish detected in the field of view.
- The lighting controller or image analysis process can use positive or negative detection results to inform machine learning.
- The lighting controller can use a learning data set of known sea-lice-infected fish and adjust illumination frequency, exposure lengths, or other parameters to produce a greater number of accurate detections or fewer inaccurate detections.
- An image buffer can be used to aid in image capture.
- A camera can capture an exposure for an amount of time and save the resulting image to an image buffer. The camera can continue to save images to the image buffer until the buffer is full. Images saved to the buffer can be transferred to another device or computer.
- An image buffer can be used to reduce the time between consecutive image captures. Reducing this time can be advantageous when combining two or more images (e.g., an image of a fish illuminated with a red light and an image of the fish illuminated with a blue light).
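The capture-until-full, then-offload behavior of the image buffer can be sketched as a small fixed-capacity container. The class name and API are illustrative assumptions; real camera buffers are measured in bytes rather than frame counts.

```python
from collections import deque

class ImageBuffer:
    """Fixed-capacity buffer: the camera stores frames back-to-back
    until the buffer is full, then pauses while frames are offloaded."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.frames = deque()

    def capture(self, frame):
        """Store one frame; return False when the buffer is full."""
        if len(self.frames) >= self.capacity:
            return False  # camera must wait for an offload
        self.frames.append(frame)
        return True

    def offload(self):
        """Transfer all buffered frames to downstream storage."""
        out = list(self.frames)
        self.frames.clear()
        return out

buf = ImageBuffer(capacity=2)
print(buf.capture("blue-lit frame"))   # True
print(buf.capture("red-lit frame"))    # True
print(buf.capture("extra frame"))      # False: offload needed first
print(buf.offload())
```

Because two consecutive captures only touch the in-memory buffer, the gap between a blue-lit and a red-lit frame can be kept short, with the slower transfer deferred to the offload step.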
- FIG. 1 is a diagram showing an example of a system for sea lice detection.
- FIG. 2 is a diagram of an exposure pattern.
- FIG. 3 is a diagram of an alternative exposure pattern.
- FIG. 4 is a diagram of another alternative exposure pattern.
- FIG. 5 is a flow diagram illustrating an example of a process for sea lice detection using a lighting controller.
- FIGS. 6A, 6B, and 6C are diagrams of custom Bayer filters.
- FIG. 7 is a diagram of a method for image collection using a beam splitter.
- FIG. 8 is a diagram of a method for image collection using a rotating mirror.
- FIG. 9 is a diagram of a method for image collection using a pair of stereo cameras.
- FIG. 1 is a diagram showing an example of a system 100 for sea lice detection.
- The system 100 comprises a fish pen 101, a control unit 120, two primary illuminators 102 and 104, a camera 105, and the fish 109.
- The fish pen 101 is formed with netting, e.g., rope, nylon, or silk.
- Illuminators can be controlled with signals from a lighting controller.
- The lighting controller can be connected to the control unit 120.
- The fish 109 can be a member of a population of fish located within the fish pen 101.
- The fish is a salmon with sea lice on its body.
- The detection of sea lice can include specific species of sea lice, for example, several species of ectoparasitic copepods of the genera Lepeophtheirus and Caligus.
- The type of fish being analyzed can affect the process of sea lice detection. For example, upon detection of a salmon, a system can adapt its detection process for Lepeophtheirus salmonis, a species of sea lice that can be especially problematic for salmon.
- A detection of a specific species of sea lice can be separated from other sea lice detections. For example, a detection of Lepeophtheirus salmonis can be separated from sea lice detections of Caligus curtis and Lepeophtheirus hippoglossi.
- The fish pen 101 is shown at an initial time t1 and a later time t2.
- The status (e.g., on/off) as well as the position of the contents of the fish pen 101 can change from time t1 to t2.
- The times t1 and t2 correspond to the time at which a first image is captured (t1) and the time at which a second image is captured (t2).
- Different exposure techniques can enable sea lice detection with only a single image capture. The various exposure techniques as well as exposure patterns are discussed below.
- The two primary illuminators 102 and 104 are LEDs transmitting light within specific frequency ranges. Illuminator 102 transmits light within the wavelength range of 440 nm to 485 nm and appears blue.
- The blue light region is distinct from the cyan light region in that the blue light region stretches from 450 nm up to 485 nm, while cyan light starts at 485 nm and extends to 500 nm.
- Blue light can have peak power between 450 nm and 485 nm, while cyan light can have peak power between 485 nm and 500 nm.
- The light of a blue LED used in the lighting controller can be concentrated towards the lower wavelengths of the blue region, creating a separation between blue light and cyan light.
- The separation can be thousands of gigahertz or greater, which equates to roughly ten percent of the entire visible spectrum.
- A greater separation between red light (e.g., 625 nm to 780 nm) and blue light (e.g., 450 nm to 485 nm) can result in greater accuracy in detecting sea lice, as well as skin lesions, a shortened operculum, or other physical deformities and skin features.
- Illuminator 104 transmits light within the wavelength range of 620 nm to 750 nm and appears red. Frequency can be tuned to maximize frequency-space separation while retaining visible light for camera image capture and minimizing environmental disruptions (e.g., light absorption, light scattering).
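The frequency-space separation between the two bands follows directly from f = c / λ. The snippet below uses the band edges quoted above (485 nm for blue, 625 nm for red) with vacuum wavelengths; in water the wavelengths shorten but the frequencies are unchanged.

```python
C = 299_792_458  # speed of light in m/s (vacuum value)

def freq_thz(wavelength_nm):
    """Frequency in terahertz for a given vacuum wavelength."""
    return C / (wavelength_nm * 1e-9) / 1e12

blue_edge = freq_thz(485)  # upper-wavelength edge of the blue band
red_edge = freq_thz(625)   # lower-wavelength edge of the red band
print(f"blue 485 nm ≈ {blue_edge:.0f} THz")
print(f"red  625 nm ≈ {red_edge:.0f} THz")
print(f"separation ≈ {blue_edge - red_edge:.0f} THz")
```

Even these nearest band edges sit well over a hundred terahertz apart, which is why exposures under the two illuminants can be treated as nearly independent spectral measurements of the same fish.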
- The camera 105 captures visible light images.
- The exposures of camera 105 can be timed with the illumination of any illuminators in the fish pen 101 (e.g., illuminator 102, illuminator 104, additional illuminators).
- The exposure of camera 105 and the illumination of any illuminator can be controlled by the control unit 120.
- Secondary illuminators can be used. Secondary illuminators can provide additional light for exposures of camera 105. For example, secondary illuminators can be used to brighten the image of the fish, which can be useful where surface light is minimal. Secondary illuminators can also enable control of the ambient light of an image capture, which can be useful in controlling for varying water or location conditions.
- More or fewer illuminators can be used.
- Secondary illuminators may not be required, for example in applications where background light is sufficient or does not pose challenges for sea lice detection.
- Fewer illuminators can also be used by installing custom image filters to capture an image or images.
- Stage A in FIG. 1 shows a particular wavelength being selected by the control unit 120 .
- This wavelength can be used by light emitting diodes (LEDs) within the fish pen shown in item 101 .
- The LEDs can be activated to illuminate the fish 109.
- Other forms of light can be used.
- Instead of LEDs, incandescent light bulbs can be used.
- Other forms of light production may be used for either the primary set or a secondary set of illuminators.
- The wavelengths can be set before imaging events take place.
- An LED can be installed which emits light in the blue visible spectrum, with wavelengths between 440 nm and 485 nm.
- Another LED can be installed which emits light in the red visible spectrum, with wavelengths between 620 nm and 750 nm.
- It can be advantageous to use dissimilar frequencies, one with a longer wavelength (towards infrared) and another with a shorter wavelength (towards ultraviolet). Light's reaction in water should be considered, as it can prevent some frequencies from propagating effectively and therefore from functioning properly as a primary illuminator.
- The frequency of the illumination LEDs can be tuned remotely.
- Revolving LED wheels can be used to pick from a variety of LEDs. LEDs can be chosen based on effectiveness; criteria can include an ability to produce images likely to result in true positive sea lice detections.
- Stage B in FIG. 1 shows the pen 101 at time t1.
- The fish pen 101 contains the fish 109, the camera 105, and the primary illuminators 102 and 104.
- Secondary illuminators are not used, while the primary illuminators 102 and 104 are used and are set to the colors blue and red, respectively.
- The illuminators can be controlled by a lighting controller (e.g., control unit 120) connected to the camera 105 or by the camera 105 itself.
- The camera can time exposures as shown in FIG. 2, FIG. 3, and FIG. 4. The specifics of the different exposure patterns are discussed later in this application.
- The blue LED illuminator fires and bathes the fish 109 in blue light.
- The camera 105 opens exposures to coincide with the blue LED illuminator.
- The camera 105 can open exposures simultaneously with the flash of an illuminator or after the beginning of the flash.
- Stage C in FIG. 1 shows an image 110 created by an exposure of camera 105 and the illumination of the blue LED 102 .
- The dot pattern in the image 110 represents the blue color of the illuminator used to capture the image.
- The fish 109 is shown with sea lice 111 attached near the head.
- Multiple fish can be detected within an image.
- The image taken by camera 105 can show multiple fish.
- The multiple fish can have individual sea lice detections.
- Stage D in FIG. 1 shows the pen 101 at time t2, temporally separated from the fish pen 101 at time t1.
- The fish 109 has moved from left to right as the second illuminator, the red LED 104, fires.
- The firing of the illuminator 104 coincides with the exposure of camera 105.
- The red LED illuminator 104 fires and bathes the fish 109 in red light.
- The camera 105 opens exposures to coincide with the red LED illuminator.
- The camera 105 can open exposures simultaneously with the flash of an illuminator or after the beginning of the flash.
- Stage E in FIG. 1 shows an image 115 created by an exposure of camera 105 and the illumination of the red LED 104 .
- The image 115 is primarily red owing to the illumination of the red LED 104. This is represented by the absence of the dots used to represent the blue illumination in the image 110.
- The fish 109 is shown with sea lice 111 attached.
- The exposure of camera 105 need not be simultaneous with the illuminators.
- The blue LED 102 can fire before the camera 105 begins capturing images, or after. Images captured by the camera 105 can be selected based on illuminator status during image capture.
- Stage F in FIG. 1 involves feature selection.
- Feature selection can be a form of image analysis performed on images (e.g., image 110 , image 115 ).
- Image 110 and image 115 can be combined.
- Image analysis can be performed to detect features on the body of the fish 109.
- The image analysis can be performed by various computational methods including algorithms, neural networks, or linear regressions.
- The image analysis may be composed of multiple steps.
- A rough object identifier may be used to detect the fish 109 within the image 110.
- A second object identifier may use the output of the first object identifier to locate objects on the fish 109 (e.g., the sea lice 111).
- The multiple steps can be performed by various computational methods including algorithms, neural networks, or linear regressions.
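The two-step analysis, a rough fish identifier followed by a louse locator scoped to the fish, can be sketched on a toy character-grid "image". Both detectors here are placeholder heuristics standing in for the neural networks or other methods named above; `F` marks fish body pixels, `L` a louse, and `.` background.

```python
def find_fish_box(image):
    """Stage 1 (toy rough identifier): bounding box of all
    non-background pixels, standing in for a fish detector."""
    cells = [(i, j) for i, row in enumerate(image)
             for j, px in enumerate(row) if px != "."]
    rows = [i for i, _ in cells]
    cols = [j for _, j in cells]
    return min(rows), min(cols), max(rows), max(cols)

def find_lice(image, box):
    """Stage 2: locate louse pixels ('L') only inside the fish box."""
    r0, c0, r1, c1 = box
    return [(i, j) for i in range(r0, r1 + 1)
            for j in range(c0, c1 + 1) if image[i][j] == "L"]

image = [
    "........",
    ".FFLFF..",
    ".FFFFF..",
    "........",
]
box = find_fish_box(image)
print(box)                    # (1, 1, 2, 5)
print(find_lice(image, box))  # [(1, 3)]
```

Restricting the second detector to the first detector's output is what lets detections be attributed to a specific fish, as the tallying step below relies on.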
- Stage G in FIG. 1 involves detecting sea lice based on the image analysis performed.
- The image of the body of the fish can be separated from the background.
- Other pre-processing methods can prepare subsequent stages of image analysis.
- Sea lice surrounding and overlaying the image of the body can be detected and counted and attributed to a specific fish.
- Tallies of sea lice can be kept for individual fish, groups of fish, or whole populations.
- Detected sea lice data can be used by the system to inform further steps either for mitigation or analytics.
- Stage H in FIG. 1 shows a possible act related to the detection of sea lice.
- The act can be a form of sea lice mitigation.
- Techniques can include focused laser light, where coordinates from the detected sea lice data can be used to target the lasers.
- Sea lice mitigation can take place in sync with detection or after detection.
- Detected sea lice data can be stored for future sea lice mitigation, or for analytics, by other devices within the system.
- The system 100 can store detected sea lice data and inform human workers to proceed with a sea lice mitigation technique. For example, infected fish can be tagged with a location which workers can use to catch and delouse the fish.
- The detection output 121 can include data related to the event of sea lice detection.
- The detection output 121 can include instructions for sea lice mitigation, data related to the fish 109, or data related to the sea lice 111.
- The detection output can specify that seven sea lice are on the fish 109, at specific coordinates or attached to specific features of the fish.
- The output can specify that sea lice mitigation for fish 109 should be conducted by hand. This data can be stored or used within other systems connected to or within system 100.
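A detection output such as item 121 might be modeled as a small record carrying the fish identity, per-louse coordinates, and a mitigation instruction. All field names and coordinate values below are illustrative assumptions, not structures from the patent; the seven entries echo the seven-lice example above.

```python
from dataclasses import dataclass

@dataclass
class DetectionOutput:
    """Hypothetical shape for a detection output like item 121."""
    fish_id: str
    lice_coords: list      # (row, col) image coordinates, made up here
    mitigation: str = "none"

    @property
    def lice_count(self) -> int:
        return len(self.lice_coords)

out = DetectionOutput(
    fish_id="fish-109",
    lice_coords=[(12, 34), (15, 40), (18, 41), (20, 39),
                 (52, 8), (55, 10), (60, 12)],
    mitigation="delouse by hand",
)
print(out.lice_count)  # 7
```

Keeping the coordinates in the record is what allows downstream consumers, whether a targeting laser or a human worker, to act on the same detection event.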
- The system 100 can also be useful in detecting other conditions. For example, skin lesions on a fish can be detected using similar methods and processes.
- A system can perform other analysis. For example, a system can analyze images illuminated by different frequencies of light for elements denoting skin lesions or physical deformities such as a shortened operculum.
- FIG. 2 is a diagram of an exposure pattern 200 which can be used by a lighting controller of system 100 .
- The exposure pattern 200 comprises a blue LED 201, a red LED 204, and a camera 206.
- The boxes similar to item 202 represent time intervals in which the blue LED 201 is illuminating.
- The boxes similar to item 205 represent time intervals in which the red LED 204 is illuminating.
- The boxes similar to item 207 represent time intervals in which the camera 206 is open for exposures.
- The time interval of the blue LED illumination 202 and the time interval of the red LED illumination 205 can be considered an initial pair of light pulses 203. Succeeding pairs of light pulses may follow the initial pair.
- The blue LED 201 can fire for a duration of 5 milliseconds.
- The camera 206 opens for exposure during this window.
- The exposure duration can vary but can overlap with the period of illumination from the blue LED 201.
- An image captured during the illumination of the blue LED 201 can be stored using a computer device connected to the camera 206 or by the camera 206 itself.
- The blue LED 201 then stops illuminating, after which the red LED 204 begins illuminating. In some implementations, an overlap between the two LEDs can be used. For example, if the blue LED 201 illuminates from time 0 to time 5 ms, the red LED 204 can fire from time 4 ms to 9 ms. Furthermore, the time intervals of the LEDs' illumination need not be identical. For example, the blue LED 201 can illuminate for 5 ms while the red LED 204 illuminates for 10 ms.
- A gap between sequential illuminations can be inserted.
- The pattern 200 can contain a 1 ms period of non-illumination.
- Periods of non-illumination can be inserted to prevent a subject from being illuminated simultaneously by the blue LED 201 and the red LED 204.
- The camera 206 can then start exposures again.
- A delay can be inserted to transfer an image to a storage device or into memory.
- The delay can be 40 ms. Different implementations can use different delay lengths.
- The delay corresponds to the time from the beginning of one exposure to the beginning of the next exposure.
- The next exposure can be of an illumination that has not previously been captured. For example, if the illumination of the blue LED 201 was captured in exposure number one, the illumination of the red LED 204 can be captured in exposure number two. In this example, the time between exposure one and exposure two can be considered a delay.
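The alternation of which illuminant each exposure captures, with the 40 ms example delay between exposure starts, can be sketched as a simple plan generator. The function name and the default period are assumptions drawn from the 40 ms example above.

```python
def exposure_plan(n_exposures, period_ms=40):
    """Alternate which illuminator each exposure captures: exposure 1
    catches the blue pulse, exposure 2 the red pulse, and so on, with
    `period_ms` between the starts of consecutive exposures."""
    colors = ["blue", "red"]
    return [(k * period_ms, colors[k % 2]) for k in range(n_exposures)]

for start, color in exposure_plan(4):
    print(f"t={start:3d} ms -> capture {color} pulse")
```

Each consecutive pair of entries in the plan yields one blue-lit and one red-lit frame of roughly the same scene, which is the input the image-combination step expects.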
- The LEDs 201 and 204 can alternate. This alternating can be advantageous as it can help maintain a more steady illumination level. At a rate of around 100 Hz (for example, alternating at up to 120 Hz), the alternating LEDs 201 and 204 may appear similar to steady, non-flashing lights. Advantageous implementations may maintain a higher alternating rate for the light source, as steady non-flashing lights are more attractive to some fish than flashing lights.
- The exposure pattern 200 can continue for as long as is required. In some implementations, the exposures will end after a subject has left the field of view of camera 206. Multiple images can be combined or processed separately. Single images can also be processed.
- The red LED 204 can emit peak power at a specific wavelength.
- The red LED 204 can emit peak power at a wavelength between 625 nm and 780 nm.
- The blue LED 201 can emit peak power at a specific wavelength.
- The blue LED 201 can emit peak power at a wavelength between 450 nm and 485 nm.
- FIG. 3 is a diagram of an exposure pattern 300 which can be used by a lighting controller of system 100.
- The exposure pattern 300 comprises a blue LED 301, a red LED 304, and a camera 306.
- The boxes similar to item 302 represent time intervals in which the blue LED 301 is illuminating.
- The boxes similar to item 305 represent time intervals in which the red LED 304 is illuminating.
- The boxes similar to item 307 represent time intervals in which the camera 306 is open for exposures.
- The time interval of the blue LED illumination 302 and the time interval of the red LED illumination 305 can be considered an initial pair of light pulses 303. Succeeding pairs of light pulses may follow the initial pair.
- Proceeding within FIG. 3 from left to right shows the progression of the exposure pattern from the beginning to a later time.
- The blue LED 301 fires at time zero for a duration of 5 milliseconds.
- The camera 306 opens for exposure during this window for a duration of 4 milliseconds.
- An image captured during the illumination of the blue LED 301 can be stored using a computer device connected to the camera 306 or by the camera 306 itself.
- The blue LED 301 stops illuminating. After the blue LED has stopped illuminating, the red LED 304 begins illuminating.
- An overlap between the two LEDs can be implemented. For example, if the blue LED 301 illuminates from time 0 to time 5 ms, the red LED 304 can fire from time 4 ms to 9 ms.
- The time intervals of the LEDs' illumination need not be identical. For example, the blue LED 301 can illuminate for 5 ms while the red LED 304 illuminates for 10 ms.
- A gap between sequential illuminations can be inserted.
- The pattern 300 can contain a 1 ms period of non-illumination.
- Periods of non-illumination can be inserted to prevent a subject from being illuminated simultaneously by the blue LED 301 and the red LED 304.
- The camera 306 can then start exposures again.
- The delay between the first and second exposures is shorter than in exposure pattern 200.
- A shorter delay can be accomplished by using a larger buffer to store multiple images captured within exposures.
- a graph of the buffer is shown in item 310 .
- the buffer graph 310 shows, relative to the horizontal axis of time, the amount of image data held in the image buffer.
- Item 311 shows the buffer storage increase as the image 307 is captured.
- Item 312 shows the buffer storage increase again as the image 308 is captured.
- An image from both exposure 307 and exposure 308 can be stored within the image buffer if the data stored is below a limit like the buffer limit line shown in item 314 .
- the exposure pattern can delay to give time for the images stored in the buffer to be transferred out of the buffer onto another storage device.
- Different implementations can use different delay lengths.
- the delay can be the time between two consecutive groups of exposures.
- the delay for pattern 300 can be 80 ms as measured from the beginning of exposure 307 to the beginning of exposure 309 . This delay may be calibrated to give enough time for the buffer to transfer data.
- the process of buffer transfer can be seen in graph 310 as a downward slanted line.
- different delay lengths as well as number of exposures captured within an exposure group can vary. For example, instead of two exposures within the first exposure group, four can be implemented. In general, the number of exposures per group before a period of non-exposure depends on the size of the image buffer used. During a period of non-exposure, data can be offloaded from the image buffer. With a large image buffer, more images can be captured with less delay in between consecutive shots.
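The relationship above between buffer size, exposures per group, and the inter-group delay can be sketched with simple arithmetic. The image size and transfer rate below are hypothetical figures, not values from the patent; they are chosen only to illustrate how a two-exposure group and an 80 ms drain period could arise.

```python
def max_exposures_per_group(buffer_limit_mb, image_size_mb):
    """Number of back-to-back exposures that fit in the image buffer
    before the camera must pause to offload data."""
    return int(buffer_limit_mb // image_size_mb)


def drain_time_ms(images, image_size_mb, transfer_mb_per_ms):
    """Delay needed between exposure groups for the buffer to empty,
    i.e., the downward slanted line in graph 310."""
    return images * image_size_mb / transfer_mb_per_ms
```

For example, with a hypothetical 48 MB buffer and 20 MB images, two exposures fit per group; at an assumed 0.5 MB/ms transfer rate, draining those two images takes 80 ms.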
- the camera 306 can resume exposures.
- the moment to resume exposures can coincide with buffer storage availability as well as illumination from illuminators (e.g., the blue LED 301 , the red LED 304 ).
- exposure 307 is timed with illumination from the blue LED 301 .
- Exposure 308 is timed with illumination from the red LED 304 .
- the camera 306 can resume exposures.
- the first exposure after a period of non-exposure can be timed with the blue LED 301 or with the red LED 304 . In this case, the exposure after a period of non-exposure is timed with the blue LED 301 .
- the exposure 309 after a period of non-exposure can also coincide with the buffer storage availability as shown in graph 310 .
- the LEDs 301 and 304 can alternate. This alternating can be advantageous as it can help maintain a more steady illumination level. At a rate of around 100 Hz or higher, the alternating LEDs 301 and 304 may appear similar to steady non-flashing lights which are more attractive to some fish than flashing lights.
- the exposure pattern 300 can continue for as long as is required. In some implementations, the exposures will end after a subject has left the field of view of camera 306 . Multiple images can be combined or processed separately. Single images can also be processed.
- the red LED 304 can emit peak power at a specific wavelength.
- the red LED 304 can emit peak power at a wavelength between 625 nm and 780 nm.
- the blue LED 301 can emit peak power at a specific wavelength.
- the blue LED 301 can emit peak power at a wavelength between 450 nm and 485 nm.
- FIG. 4 is a diagram of an exposure pattern 400 which can be implemented by a lighting controller of system 100 .
- the exposure pattern 400 involves a blue LED 401 , a red LED 404 , and a camera 406 .
- the boxes similar to item 402 represent time intervals in which the blue LED 401 is illuminating.
- the boxes similar to item 405 represent time intervals in which the red LED 404 is illuminating.
- the boxes similar to item 407 represent time intervals in which the camera 406 is open for exposures.
- the time interval of the blue LED illumination 402 and the time interval of the red LED illumination 405 can be considered an initial pair of light pulses 403 . Succeeding pairs of light pulses may follow the initial pair. In some cases, intervals in which neither the blue LED 401 nor the red LED 404 is illuminated may be inserted between or within a pair or pairs of light pulses.
- the time of illumination can be set for each light source used in an exposure pattern.
- the blue LED 401 can fire at time zero for a duration of 5 milliseconds.
- the camera 406 can open for exposure. An image captured during the illumination of the blue LED 401 can be stored using a computer device connected to the camera 406 or the camera 406 itself.
- the blue LED 401 stops illuminating. After the blue LED has stopped illuminating, the red LED 404 begins illuminating.
- an overlap between the two LEDs can be used. For example, if the blue LED 401 illuminates from time 0 to time 5 ms, the red LED 404 can fire from time 4 ms to 9 ms.
- the time intervals of the LEDs illumination need not be identical. For example, the blue LED 401 can illuminate for 5 ms while the red LED 404 illuminates for 10 ms. Other durations can also be used.
- a gap between sequential illuminations can be inserted.
- the pattern 400 can contain a 1 ms period of non-illumination.
- periods of non-illumination can be inserted to prevent a subject being illuminated simultaneously by the blue LED 401 and the red LED 404 .
- After a delay, the camera 406 starts exposures again. In some implementations, this delay can be inserted to transfer the image to a storage device or somewhere within memory. For example, the delay can be 40 ms from exposure 407 to exposure 408 . Different implementations can use different delay lengths. The delay corresponds to the time difference between two sequential camera exposures. For example, if the blue LED 401 illumination was captured in exposure 407 , the red LED 404 illumination can be captured in exposure 408 after the delay.
- the camera 406 exposes again shown in item 409 .
- the exposure 409 captures an image while no illuminators are illuminated. In the moments before, the blue LED 401 illuminates, followed by the red LED 404 , but the exposure pattern 400 includes a period of non-illumination after the red LED 404 within the sequence.
- the exposure 409 can be used to get additional data. For example, the exposure 409 can be used to get data on background lighting. This can be useful in situations where other regions of light may be of interest.
- the images captured without illumination from the blue LED 401 or the red LED 404 can be used in other processes. For example, the exposure 409 can be used to get readings on water condition.
- the pattern of blue LED exposure 407 , red LED exposure 408 followed by non-LED exposure 409 can be used repeatedly in the exposure pattern 400 .
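The repeating blue, red, then non-illuminated exposure sequence can be sketched as a label cycle, with the unlit frames set aside for background-lighting or water-condition analysis as described above. The function names are assumptions for illustration.

```python
from itertools import cycle, islice


def exposure_labels(n):
    """Label n consecutive exposures under the repeating
    blue / red / no-illumination cycle of pattern 400."""
    return list(islice(cycle(["blue", "red", "none"]), n))


def ambient_frames(frames):
    """Pick out the frames captured with no LED lit; these can feed
    background-lighting or water-condition analysis."""
    labels = exposure_labels(len(frames))
    return [f for f, label in zip(frames, labels) if label == "none"]
```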
- the LEDs 401 and 404 can alternate. This alternating can be advantageous as it can help maintain a more steady illumination level. At a rate of around 80 to 120 Hz, the alternating LEDs 401 and 404 may appear similar to steady non-flashing lights when perceived by an eye of a human or a fish. Advantageous implementations may include maintaining a higher alternating rate for the light source as steady non-flashing lights are more attractive to some fish than flashing lights.
- the exposure pattern 400 can continue for as long as is required. In some implementations, the exposures will end after a subject has left the field of view of camera 406 . Multiple images can be combined or processed separately. Single images can also be processed.
- the LEDs used as illuminators in the example exposure patterns 200 , 300 , and 400 can be replaced by non-LED light sources.
- the LEDs need not be red and blue wavelength but can be of any wavelength.
- Advantageous implementations can include using red and blue LEDs with wavelength ranges of 440 nm to 485 nm for blue and 620 nm to 750 nm for red.
- the rate at which the LEDs alternate can be fast enough to make the alternating LEDs appear as steady, non-flashing lights, when perceived by an eye of a human or a fish.
- the LEDs can alternate at a frequency of 80 to 120 Hz.
- a high alternating rate is advantageous as it allows the flashing to be less noticeable by the fish being illuminated as well as reducing the time, and thus the visual differences, between consecutive snapshots of the fish when exposure patterns are used. Reducing visual differences can help reduce complexity and improve the resulting accuracy of any later image combination.
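The benefit of a high alternation rate can be made concrete with simple arithmetic: at a given rate, the blue-lit and red-lit snapshots are half a cycle apart, so a swimming fish moves only a small distance between them. This is an illustrative calculation, not text from the patent, and the function names are assumptions.

```python
def snapshot_gap_ms(alternation_hz):
    """Time between the blue-lit and red-lit snapshots when the LEDs
    alternate at alternation_hz full blue+red cycles per second."""
    return 1000.0 / alternation_hz / 2.0


def fish_displacement_mm(alternation_hz, speed_m_per_s):
    """How far a fish moves between consecutive snapshots; a smaller
    displacement reduces the visual differences that later image
    combination must reconcile.  (m/s times ms yields mm.)"""
    return speed_m_per_s * snapshot_gap_ms(alternation_hz)
```

At 100 Hz, consecutive snapshots are 5 ms apart, so a fish swimming at an assumed 1 m/s moves only about 5 mm between them.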
- the exposure patterns 200 , 300 , and 400 can be swapped without departing from the ideas therein.
- the first exposure can be with the red LED 404 illuminating while the second can be an exposure with no illumination.
- FIG. 5 shows a process 500 for sea lice detection using a lighting controller.
- the process 500 includes preparing an illumination system and a camera system ( 502 ).
- control unit 120 from FIG. 1 can select the wavelength used for illuminating the fish 109 .
- the process 500 includes detecting fish motion within the field of view of the camera system ( 504 ). For example, as the fish 109 swims within the field of view of the camera 105 , the illuminators 102 , 104 , 106 , or 107 and the camera 105 can coordinate through signals sent from control unit 120 in a manner similar to those discussed in FIG. 2 , FIG. 3 , or FIG. 4 .
- the process 500 includes using a lighting controller exposure pattern, involving the illumination system and the camera, to capture fish images ( 506 ).
- a specific exposure pattern similar to pattern 200 of FIG. 2 can be used with the blue LED 201 and red LED 204 functioning as the illumination system and the camera 206 functioning as the camera.
- the process 500 includes analyzing captured fish images for sea lice ( 508 ).
- the control unit can gather image 110 and image 115 and perform image analysis to detect sea lice 111 .
- the process 500 includes storing results within a computer system ( 510 ).
- control unit 120 can store the results of the image analysis involving image 110 and image 115 .
- the process 500 includes employing mitigation techniques based on results ( 512 ).
- the mitigation techniques can include targeted treatments which can be comprised of lasers, fluids, or mechanical devices such as a brush or suction.
- the control unit 120 can activate lasers to focus intense light on a fish to remove sea lice from the fish.
- the lasers can use sea lice location data gleaned from the image analysis performed.
- the control unit 120 can also delegate the mitigation to other systems or devices (e.g., other computer systems, humans).
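The flow of process 500 (prepare 502, detect motion 504, capture 506, analyze 508, store 510, mitigate 512) can be sketched as one control-loop pass. This is a hedged sketch only: the five collaborator interfaces (`camera`, `illuminators`, `detector`, `mitigator`, `store`) are hypothetical names, not part of the patent.

```python
def run_detection_cycle(camera, illuminators, detector, mitigator, store):
    """One pass of process 500: wait for fish motion (504), capture
    with an exposure pattern (506), analyze for sea lice (508),
    store results (510), and mitigate if lice were found (512)."""
    if not camera.motion_detected():
        return None
    images = camera.capture_with_pattern(illuminators)
    result = detector.analyze(images)      # e.g., sea lice locations
    store(result)                          # persist detection output
    if result["lice"]:
        mitigator.treat(result["lice"])    # lasers, fluids, brush, suction
    return result
```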
- more or less than two lights can be used for illuminating the subject.
- another LED of a different frequency or color can be added.
- the illumination of any additional LED can be captured by a camera as images like the images 110 and 115 .
- more than one camera can be used.
- an additional camera can be used to capture images.
- an additional camera can capture alternate angles of a subject.
- an additional camera within the fish pen 101 can capture one side of fish 109 while the camera 105 captures the other.
- the illumination from illuminators can be of any frequency.
- infrared and ultraviolet light can be used instead of the blue and red LED lights used by illuminator 102 and illuminator 104 respectively.
- the cameras used to capture images of scenes illuminated by illuminators can have the ability to capture the specific frequency of the illuminator.
- a camera can have the ability to sense and record the ultraviolet light within an image. Any frequency can be used within an exposure pattern like those in FIG. 2 , FIG. 3 , and FIG. 4 .
- more than one fish can be processed within a system like system 100 .
- the pen 101 in FIG. 1 can show not only the fish 109 but an additional fish.
- the additional fish can be captured by the camera 105 .
- Both the fish 109 and the additional fish can be processed by the control unit 120 , and data representing their respective detection results can be contained within the resulting detection output 121 .
- Any number of fish can be processed in this way. Possible limitations to the number of fish processed can exist in hardware or software used.
- more than one exposure pattern can be used.
- both pattern 200 from FIG. 2 and pattern 300 from FIG. 3 can be used.
- Combinations of patterns can bring alterations to a given pattern and may result in a new pattern which can be used by a device.
- patterns can be used based on external or internal stimuli. In some situations, it may be beneficial or desirable to choose one exposure pattern over another or a specific combination of one or more exposure patterns.
- the exposure patterns may contain an additional light or additional lights.
- the exposure pattern 200 , 300 , and 400 can be modified with the addition of a light.
- more than one light can be added.
- an additional light can fire between the illumination 202 and the illumination 205 .
- the additional light can illuminate a given subject in a separate or similar frequency to the frequencies illuminated by illuminator 201 or illuminator 204 .
- the additional light can illuminate in ultraviolet.
- An exposure pattern can be altered.
- the illumination of the ultraviolet light source can be captured by an exposure after the exposure 207 .
- the exposure patterns may contain an additional camera or additional cameras.
- the exposure pattern 200 , 300 , and 400 can be modified with the addition of a camera.
- more than one camera can be added.
- an additional camera can be used to capture exposures after exposure 207 .
- the additional camera can capture an exposure of a given subject in a separate or similar frequency to the frequencies illuminated by illuminator 201 or illuminator 204 .
- the additional camera can capture exposures of light in the ultraviolet spectrum.
- An exposure pattern can be altered.
- an exposure capturing ultraviolet light can be added to the exposure pattern 200 after the exposure 207 .
- the sea lice on a fish can be detected anywhere within a field of view of a camera.
- the sea lice detected on a fish can be on any part of the body.
- the part of body, location, or number can be included within the detection output 121 .
- a system can alter detection techniques based on detection circumstances. For example, in the case of various fish species, the detection method can be altered to use algorithms associated with the species or other frequencies of illuminator light.
- water quality can be a circumstance of detection that could be registered by the system and alter following sea lice detections. For example, if the water is murky, an increase in the brightness or quantity of lights used can be instigated and carried out by the system. Adjusting the lighting based on fish environment conditions can be a part of the illuminator controller or a separate subsystem depending on implementation.
- Detection techniques can also be altered by the detection of a species of fish. For example, different species could be considered a detection circumstance and registered by the system. The registering of different species could invoke different forms of detection methods.
- Any alteration in sea lice detection method can result in alterations of sea lice detection output and results. For example, if a sea lice detection method was altered based on the sighting of a particular species of salmon, the output can be altered to save the sea lice detection data with species-specific feature recognition. The output can also be altered to include mitigation techniques tailored to the particular species of salmon.
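Choosing a detection configuration from detection circumstances, such as a species-specific algorithm or brighter lighting in murky water, can be sketched as a small dispatch function. The names and the turbidity threshold below are illustrative assumptions, not details from the patent.

```python
def choose_detection_config(species=None, turbidity=0.0):
    """Pick detection settings from detection circumstances:
    a species-specific algorithm when a species is registered,
    and a lighting boost when the water is murky (turbidity on
    an assumed 0..1 scale, > 0.5 treated as murky)."""
    config = {"algorithm": "generic", "light_gain": 1.0}
    if species is not None:
        config["algorithm"] = f"{species}_model"   # hypothetical model name
    if turbidity > 0.5:
        config["light_gain"] = 1.0 + turbidity     # brighten in murky water
    return config
```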
- more than two modes of light can be used in an exposure pattern.
- an exposure pattern can use a blue light, a red light, and a yellow light.
- other ranges of light can be used to illuminate the subject for image capture.
- a system can use ultraviolet light.
- the process 500 can also be useful in detecting other conditions.
- skin lesions on a fish can be detected using similar methods and processes.
- a system can perform other analysis. For example, a system can analyze images illuminated by different frequencies of light for elements denoting skin lesions or physical deformities such as shortened operculum.
- a lighting controller can use a blue illuminator composed of light with multiple wavelengths.
- a graph of output power versus wavelength for blue light can resemble a Gaussian shape with peak power at a 465 nm wavelength and 10% power at 450 nm and 495 nm wavelengths.
- Other implementations could have different proportions of wavelengths or different ranges of wavelengths.
- a graph of output power versus wavelength for blue light can resemble a Gaussian shape with peak power at 460 nm and 0% power at 455 nm and 485 nm wavelengths.
- a lighting controller can use a red illuminator composed of light with multiple wavelengths.
- a graph of output power versus wavelength for red light can resemble a Gaussian shape with peak power at a 630 nm wavelength and 10% power at 605 nm and 645 nm.
- Other implementations could have different proportions of wavelengths or different ranges of wavelengths.
- a graph of output power versus wavelength for red light can resemble a Gaussian shape with peak power at 635 nm and 0% power at 610 nm and 640 nm wavelengths.
- FIG. 6A, 6B, and 6C are diagrams of custom Bayer filters for use within sea lice detection.
- FIG. 6A includes two different color filters on the pixel array 600 .
- the pixel array 600 can be used in fish imaging. Pixel 602 corresponds with the color filter red. Pixel 603 corresponds with the color filter blue. Pixel array 600 is partially filled for illustration purposes. Matching pattern and shading on two or more pixels of the array 600 denotes pixels of the same filter type. By adjusting a normal Bayer filter, the pixel array 600 can increase a camera's light sensitivity for key frequencies. In some implementations of sea lice detection, these key frequencies are red light (e.g., 625 nm to 780 nm wavelength) and blue light (e.g., 450 nm to 485 nm wavelength). The color filters on the pixel array 600 correspond to these frequencies. In the arrangement shown in pixel array 600 , the amount of light captured in both the red and blue spectrum is effectively doubled compared with a normal red, green and blue pixel array used in some standard cameras.
- the additional light sensitivity can reduce the number of images that need to be captured for sea lice detection. For example, a scene could be illuminated with both blue and red LEDs simultaneously. A camera could then capture an image. In some implementations, separate images could be extracted from the red and blue components of a single image.
- the color arrangement can be swapped. For example, blue pixels can take the place of red pixels and vice versa.
- color filters able to transmit different ranges of wavelengths can be used.
- the pixels able to register blue light like item 603 in pixel array 600 could be swapped with pixels able to register ultraviolet light.
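The two-filter array of FIG. 6A and the single-capture channel extraction described above can be sketched as follows. The checkerboard layout is an assumption consistent with each of red and blue covering half the pixels (double the share a standard RGGB Bayer pattern gives either color); the function names are illustrative.

```python
def red_blue_mosaic(rows, cols):
    """Checkerboard filter map in the spirit of pixel array 600:
    'R' and 'B' alternate, so each channel covers half the pixels."""
    return [["R" if (r + c) % 2 == 0 else "B" for c in range(cols)]
            for r in range(rows)]


def split_channels(image, mosaic):
    """Separate one capture taken with both LEDs lit into red and
    blue sample grids; positions without that filter read as 0."""
    red = [[px if f == "R" else 0 for px, f in zip(irow, frow)]
           for irow, frow in zip(image, mosaic)]
    blue = [[px if f == "B" else 0 for px, f in zip(irow, frow)]
            for irow, frow in zip(image, mosaic)]
    return red, blue
```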
- FIG. 6B is another custom Bayer filter which includes three different color filters on the pixel array 610 .
- Pixel 612 corresponds with the color filter blue.
- Pixel 614 corresponds to a blank color filter which allows all wavelengths to register evenly.
- Pixel 616 corresponds with the color filter red.
- Pixel array 610 is partially filled for illustration purposes. Matching pattern and shading on two or more pixels of the array 610 denotes filters of the same type. By adjusting a normal Bayer filter, the pixel array 610 allots an equal number of pixels to each channel (e.g., red filter channel, blue filter channel, blank filter channel). The structure is uniform and can potentially be more easily interpreted by neural networks working with output images.
- the color filters can accept light with wavelength within a particular range (e.g., 625 nm to 780 nm for the red filter 616 , 450 nm to 485 nm for the blue filter 612 , and the full visible spectrum for the blank filter 614 ).
- the additional light sensitivity can reduce the number of images that need to be captured for sea lice detection. For example, a scene could be illuminated with both blue and red LEDs simultaneously. A camera could then capture a single image and from that image, separate images could be extracted for both the red and blue components.
- the color arrangement can be flipped.
- blue pixels can take the place of red pixels and vice versa.
- color filters able to transmit different ranges of wavelengths can be used.
- the pixels able to register blue light like item 612 in pixel array 610 could be swapped with pixels able to register ultraviolet light.
- FIG. 6C is another custom Bayer filter which includes three different color filters on the pixel array 620 .
- Pixel 622 corresponds with the color filter red.
- the red color filter of pixel 622 allows light to pass through if the wavelength of the light is within the range 625 nm to 780 nm.
- Pixel 624 corresponds with the color filter blue, in this implementation allowing light through with wavelength within the range 450 nm to 485 nm.
- Pixel 626 corresponds to a blank color filter which allows, in this implementation, all wavelengths to register evenly.
- Pixel array 620 is partially filled for illustration purposes.
- Matching pattern and shading on two or more pixels of the array 620 denotes filters of the same type.
- the pixel array 620 creates smaller two by two windows (i.e. a group of four mutually connected pixels forming a square) made up of the specific filter channels used (e.g., red filter channel, blue filter channel, blank filter channel).
- This type of structure has the advantage of granularity as well as applications for other fish related identification work. For example, for applications in which images are needed in more light wavelengths than just red and blue, the blank filter data can be used.
- the pixel array 620 is well suited for full spectrum photography as well as sea lice detection specific photography concentrated within the wavelengths specified of red and blue.
- the color arrangement can be flipped while maintaining the general pattern.
- the additional light sensitivity can reduce the number of images that need to be captured for sea lice detection. For example, a scene could be illuminated with both blue and red LEDs simultaneously. A camera could then capture an image. In some implementations, separate images could be extracted from the red and blue components of a single image.
- the arrangement of pixels can be changed while preserving the overall pattern.
- the locations of red pixels similar to red pixel 622 and blue pixels similar to blue pixel 624 can be switched while preserving the overall pattern and benefits of the pixel array 620 as shown.
- color filters able to transmit different ranges of wavelengths can be used.
- the pixels able to register blue light like item 624 in pixel array 620 could be swapped with pixels able to register ultraviolet light.
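The two-by-two window structure of pixel array 620 can be sketched by expanding a block-level channel pattern into a per-pixel filter map. The particular block layout below is an assumption for illustration ('R' red, 'B' blue, 'N' blank/full-spectrum), not the exact arrangement of FIG. 6C.

```python
def windowed_mosaic(block_pattern):
    """Expand a block-level channel pattern so that every entry
    becomes a 2x2 window of one filter type, in the spirit of
    pixel array 620."""
    pixel_map = []
    for block_row in block_pattern:
        expanded = [f for f in block_row for _ in range(2)]  # widen each block
        pixel_map.append(list(expanded))
        pixel_map.append(list(expanded))                     # repeat the row
    return pixel_map
```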
- FIG. 7 is a diagram which shows a system 700 comprised of an incident light beam 701 , a primary lens 702 , a beam splitter 704 , a red filter 705 , a blue filter 706 , a camera 707 , and another camera 708 .
- the system 700 can be used for image collection.
- the incident light beam 701 can be the light from an exposure of a fish within a pen.
- the primary lens 702 can be made out of glass and can help direct the light towards the beam splitter 704 .
- additional lenses or mirrors can be used for focusing the incident beam.
- the beam splitter 704 is constructed such that a portion of the incident beam 701 is reflected and a portion of the incident beam 701 is transmitted creating two beams of light from the incident beam 701 . Additional optical elements not shown can be used within the beam splitter 704 and other devices within the system 700 . For example, within the beam splitter 704 can be multiple lenses and mirrors as well as gluing and connecting agents.
- the red filter 705 and the blue filter 706 can be tuned to allow specific frequency light through.
- the red filter 705 can be tuned to allow only light with wavelength between 620 nm and 750 nm.
- the blue filter 706 can be tuned to allow only light with wavelength between 440 nm and 485 nm.
- the camera 707 and the camera 708 can capture a light beam using a light detector.
- the light detector captures incoming light and creates an image.
- the light detector can encode the captured light as a list of pixels with color and intensity.
- the pixel information can be stored as an image and can be used by other devices and systems.
- Stage A of FIG. 7 shows the incident beam 701 moving to the lens 702 .
- the incident beam 701 can be the light from an exposure of a scene.
- the incident beam can be comprised of the light reflected off a fish swimming in a pen.
- Stage B of FIG. 7 shows the incident beam 701 split by the beam splitter 704 .
- the beam splitter 704 can have multiple lenses and mirrors used to direct the two outbound light beams.
- Stage C of FIG. 7 shows the output of the beam splitter 704 passing through the red filter 705 .
- the light before the red filter 705 can be any wavelength reflected or transmitted from the beam splitter 704 .
- the light after the red filter 705 can be any wavelength within the range of the filter (e.g., 620 nm and 750 nm).
- Stage C of FIG. 7 shows the output of the beam splitter 704 passing through the blue filter 706 .
- the beam passing through the red filter 705 and the blue filter 706 can be separate such that light passing through the red filter 705 does not also pass through the blue filter 706 .
- the light before the blue filter 706 can be any wavelength reflected or transmitted from the beam splitter 704 .
- the light after the blue filter 706 can be any wavelength within the range of the filter (e.g., 440 nm and 485 nm).
- Stage D of FIG. 7 shows the output of the red filter 705 reaching the camera 707 .
- the camera 707 can use a light detector to capture the incoming light from the red filter 705 and create an image. This image can be a stored group of pixels with colors and intensities. Images captured by the camera 707 can be used for sea lice detection.
- Stage D′ of FIG. 7 shows the output of the blue filter 706 reaching the camera 708 .
- the camera 708 can use a light detector to capture the incoming light from the blue filter 706 and create an image. This image can be a stored group of pixels with colors and intensities. Images captured by the camera 708 can be used for sea lice detection.
- Possible advantages of the system 700 are that it preserves the spatial resolution of each channel and that its color filters (e.g., the red filter 705 , the blue filter 706 ) are easier to construct than the devices used in some other image collection methods (e.g., custom image chips requiring per-pixel accuracy). Simple colored optical filters can be manufactured.
- Some potential drawbacks include the cost of the beam splitter 704 and the fact that after splitting, the light captured by camera 707 and camera 708 will be less intense than the incident beam 701 . This can be alleviated with a greater intensity light on the subject of the image but greater intensity light can affect the subject's behavior. For example, a more intense light may scare fish away from the field of view captured by the incident beam 701 . This could result in fewer opportunities to collect images of fish.
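The intensity loss described above can be made concrete: each camera receives only the splitter's share of the incident beam, further attenuated by its color filter, and restoring the original sensor-level intensity requires a proportionally brighter source. The function names and example fractions below are illustrative assumptions.

```python
def intensity_at_camera(incident, split_fraction, filter_transmission):
    """Light reaching one camera in system 700: the beam splitter
    sends split_fraction of the incident beam 701 to that arm, and
    the color filter transmits filter_transmission of the rest."""
    return incident * split_fraction * filter_transmission


def required_source_boost(split_fraction, filter_transmission):
    """Factor by which subject illumination must rise to restore the
    pre-split intensity at the sensor; a brighter source, however,
    may scare fish out of the field of view."""
    return 1.0 / (split_fraction * filter_transmission)
```

For an assumed 50/50 splitter and a filter passing half the remaining light, each camera sees a quarter of the incident intensity, so the source would need to be four times brighter to compensate.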
- FIG. 8 is a diagram which shows the system 800 comprised of an incident beam 801 , a primary lens 802 , a spinning mirror 804 , a red filter 805 , a blue filter 806 , a camera 807 , and another camera 808 .
- the system 800 can be used for image collection. In some implementations, images collected can be used in the process of detecting sea lice.
- the incident light beam 801 can be the light from an exposure of a fish within a pen.
- the primary lens 802 can be made out of glass and can help direct the light towards the spinning mirror 804 .
- additional lenses or mirrors can be used for focusing the incident beam.
- the spinning mirror 804 is constructed such that the incident beam 801 is reflected at an angle.
- Two angles are vital to the system 800 : the angle that reflects the incident beam 801 towards the red filter 805 and the camera 807 , and the angle that reflects the incident beam 801 towards the blue filter 806 and the camera 808 . These two angles can be separate portions of a rotation of the spinning mirror 804 .
- Additional optical elements not shown can be used within the spinning mirror 804 and other devices within the system 800 .
- within the spinning mirror 804 can be multiple lenses and mirrors as well as gluing and connecting agents.
- the red filter 805 and the blue filter 806 can be tuned to allow specific frequency light through.
- the red filter 805 can be tuned to allow only light with wavelength between 620 nm and 750 nm.
- the blue filter 806 can be tuned to allow only light with wavelength between 440 nm and 485 nm.
- the camera 807 and the camera 808 can capture a light beam using a light detector.
- the light detector captures incoming light and creates an image.
- the light detector can encode the captured light as a list of pixels with color and intensity.
- the pixel information can be stored as an image and can be used by other devices and systems.
- Stage A of FIG. 8 shows the incident beam 801 moving to the lens 802 .
- the incident beam 801 can be the light from an exposure of a scene.
- the incident beam can be comprised of the light reflected off a fish swimming in a pen.
- Stage B of FIG. 8 shows the incident beam 801 reflected by the spinning mirror 804 .
- the spinning mirror 804 can have multiple lenses and mirrors used to accept and direct the outbound light beam.
- Stage C of FIG. 8 shows the output of the spinning mirror 804 passing through the red filter 805 .
- the light before the red filter 805 can be any wavelength reflected by the spinning mirror 804 .
- the light after the red filter 805 can be any wavelength within the range of the filter (e.g., 620 nm and 750 nm).
- Stage C′ of FIG. 8 shows the blue filter 806 .
- the output of the spinning mirror 804 can be directed towards the blue filter 806 .
- the directed light can pass through the blue filter 806 .
- the light before the blue filter 806 can be any wavelength reflected or transmitted from the mirror 804 .
- the light after the blue filter 806 can be any wavelength within the range of the filter (e.g., 440 nm and 485 nm).
- the cases of output from the spinning mirror 804 can be separate such that light passing through the red filter 805 does not also pass through blue filter 806 .
- Stage D of FIG. 8 shows the output of the red filter 805 reaching the camera 807 .
- the camera 807 can use a light detector to capture the incoming light from the red filter 805 and create an image. This image can be a stored group of pixels with colors and intensities. Images captured by the camera 807 can be used for sea lice detection.
- Stage D′ of FIG. 8 shows the camera 808 .
- the output of the spinning mirror 804 can be directed towards the blue filter 806 .
- the output of the blue filter 806 can be directed towards the camera 808 .
- the camera 808 can use a light detector to capture the incoming light from the blue filter 806 and create an image. This image can be a stored group of pixels with colors and intensities. Images captured by the camera 808 can be used for sea lice detection.
- the spinning mirror 804 can rotate at high speed and direct the portion of the incident beam 801 reflected from the spinning mirror 804 into a camera (e.g., the camera 807 , the camera 808 ).
- the process of rotating the spinning mirror 804 between directing light towards the camera 807 or the camera 808 can introduce a slight delay between the two cameras as they take their images.
- the motion of rotation can also affect the period of exposure for camera 807 or camera 808 .
- the mirror can snap between locations, which could allow for longer imaging without warping due to the movement of the image.
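The inter-camera delay introduced by the rotation can be estimated from the mirror's spin rate and the angular separation of the two reflecting positions. Both the function name and the assumed 90 degree geometry are illustrative, not specified by the patent.

```python
def inter_camera_delay_ms(rotations_per_s, angle_between_deg=90.0):
    """Delay between the mirror angle that feeds camera 807 and the
    angle that feeds camera 808, for a mirror spinning at a constant
    rate; the 90 degree separation is an assumed geometry."""
    period_ms = 1000.0 / rotations_per_s
    return period_ms * (angle_between_deg / 360.0)
```

For example, at an assumed 100 rotations per second and a 90 degree separation, the two cameras capture their images 2.5 ms apart.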
- FIG. 9 is a diagram which shows the system 900 for stereo camera image capture.
- the system 900 is comprised of an incident beam 901 , an incident beam 902 , a primary lens 904 , a primary lens 905 , a red filter 906 , a blue filter 907 , a camera 909 , and another camera 910 .
- the camera 909 and the camera 910 can be connected to form a stereo camera system.
- the system 900 can be used for image collection.
- images collected can be used in the process of detecting sea lice.
- the incident light beams 901 and 902 can be the light from an exposure of a fish within a pen.
- the primary lenses 904 and 905 can be made out of glass and can help direct the light towards the red filter 906 or the blue filter 907 .
- additional lenses or mirrors can be used for focusing the incident beam.
- the red filter 906 and the blue filter 907 can be tuned to allow specific frequency light through.
- the red filter 906 can be tuned to allow only light with wavelength between 620 nm and 750 nm.
- the blue filter 907 can be tuned to allow only light with wavelength between 440 nm and 485 nm.
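The filter tuning described above can be modeled as simple passband checks; the band edges below are the example ranges given in the text.

```python
# Minimal model of the tuned color filters: each filter passes only light
# whose wavelength falls inside its band (values from the text above).

RED_BAND = (620, 750)    # nm, red filter 906
BLUE_BAND = (440, 485)   # nm, blue filter 907

def transmits(band, wavelength_nm):
    """True if a filter with this passband transmits the given wavelength."""
    low, high = band
    return low <= wavelength_nm <= high

print(transmits(RED_BAND, 650))    # True: 650 nm light passes the red filter
print(transmits(BLUE_BAND, 650))   # False: the blue filter blocks 650 nm light
```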
- the camera 909 and the camera 910 can capture a light beam using a light detector.
- the light detector captures incoming light and creates an image.
- the light detector can encode the captured light as a list of pixels with color and intensity.
- the pixel information can be stored as an image and can be used by other devices and systems.
- Stage A of FIG. 9 shows the incident beams 901 and 902 moving towards the lenses 904 and 905 respectively.
- the incident beams 901 and 902 can be light from simultaneous exposures of a scene.
- the incident beams 901 and 902 can be comprised of the light reflected off a fish swimming in a pen.
- Stage B of FIG. 9 shows the incident beams 901 and 902 focused by lenses 904 and 905 respectively.
- the light output from lens 904 and lens 905 can be sent towards the red filter 906 and the blue filter 907 .
- the process of directing light towards the filters can be comprised of multiple lenses and mirrors.
- Stage C of FIG. 9 shows the output of the lenses 904 and 905 passing through the red filter 906 and the blue filter 907 respectively.
- the light directed towards the red filter 906 can be any wavelength transmitted by the lens 904 or other optical element.
- the light after the red filter 906 can be any wavelength within the range of the filter (e.g., 620 nm and 750 nm).
- the light directed towards the blue filter 907 can be any wavelength transmitted by the lens 905 or other optical element.
- the light transmitted through the blue filter 907 can be any wavelength within the range of the filter (e.g., 440 nm and 485 nm).
- Stage D of FIG. 9 shows the output of the red filter 906 reaching the camera 909 .
- the output of the blue filter 907 can be directed towards the camera 910 .
- the cameras 909 or 910 can use a light detector to capture the incoming light from the filter (e.g., the red filter 906 , the blue filter 907 ) to create an image.
- This image can be a stored group of pixels with colors and intensities. Images captured by the camera 909 and the camera 910 can be used for sea lice detection.
- by employing stereo cameras, each with a different color filter in front, the system 900 allows the cameras to take pictures simultaneously with no reduction in incident light beyond the losses in the various optical elements, including the filters. This represents a possible advantage over other image capture techniques.
- a possible disadvantage of the stereo camera setup can include the introduction of parallax between the two images. For example, a pixel at coordinate (x, y) in an image captured by camera 909 will not be the same as a pixel at coordinate (x, y) in an image captured by camera 910 . The introduction of parallax between two images can potentially complicate a multi-frame registration process.
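The parallax issue can be made concrete with the standard stereo relation, in which a scene point at depth Z appears offset horizontally by the disparity f·B/Z between the two images. The focal length, baseline, and depth below are assumed example values, not parameters of the system 900.

```python
# Sketch of stereo parallax: a pixel (x, y) in one camera's image maps to a
# shifted location in the other camera's image, which a multi-frame
# registration process must account for. All numbers are illustrative.

def disparity_px(focal_px, baseline_m, depth_m):
    """Horizontal pixel offset between the two views for a point at depth_m."""
    return focal_px * baseline_m / depth_m

def corresponding_pixel(x, y, focal_px, baseline_m, depth_m):
    """Estimate where a pixel from camera 909's image lands in camera 910's."""
    return (x - disparity_px(focal_px, baseline_m, depth_m), y)

# A point 2 m away, viewed with an 800 px focal length and a 10 cm baseline:
print(corresponding_pixel(400, 300, focal_px=800, baseline_m=0.1, depth_m=2.0))
# (360.0, 300): the matching pixel sits 40 columns away in the other image
```

Because the shift depends on depth, a single global translation cannot align the two images exactly, which is why registration is more involved here than for the shared-lens designs above.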
- Embodiments of the invention and all of the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
- Embodiments of the invention can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus.
- the computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter affecting a machine-readable propagated signal, or a combination of one or more of them.
- data processing apparatus encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers.
- the apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
- a propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
- a computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- a computer program does not necessarily correspond to a file in a file system.
- a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code).
- a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
- the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output.
- the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
- processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
- a processor will receive instructions and data from a read only memory or a random access memory or both.
- the essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data.
- a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks.
- a computer need not have such devices.
- a computer can be embedded in another device, e.g., a tablet computer, a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few.
- Computer readable media suitable for storing computer program instructions and data include all forms of non volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks.
- the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
- embodiments of the invention can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
- Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
- Embodiments of the invention can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the invention, or any combination of one or more such back end, middleware, or front end components.
- the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
- the computing system can include clients and servers.
- a client and server are generally remote from each other and typically interact through a communication network.
- the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- In each instance where an HTML file is mentioned, other file types or formats may be substituted. For instance, an HTML file may be replaced by an XML, JSON, plain text, or other type of file. Moreover, where a table or hash table is mentioned, other data structures (such as spreadsheets, relational databases, or structured files) may be used.
Description
- This application is a continuation of U.S. application Ser. No. 16/743,023, filed Jan. 15, 2020, the contents of which are incorporated by reference herein.
- This specification generally describes lighting controllers, particularly those used for aquaculture.
- Sea lice feed on the mucus, epidermal tissue, and blood of host marine fish. Sea lice infestations can be a major problem in fish farming, since heavy infections can lead to deep lesions, particularly on the head region. Sea lice infestations can kill salmon or render them unsuitable for market.
- By capturing a detailed image of a fish, image analysis can be performed to detect sea lice or other skin features, including lesions, on the fish. Detection can be automatic and can inform various techniques of mitigation. For sea lice detection, mitigation can include methods of delousing. To capture an image, illuminator lights with specific frequencies are controlled by a lighting controller to coincide with camera exposures. The specific frequency of light is chosen for properties likely to aid in the detection of sea lice as well as skin lesions, shortened operculum or other physical deformities and skin features. Illuminator light controllers can use pulse patterns to illuminate a fish with specific frequency light.
- Advantageous implementations can include one or more of the following features. For example, red and blue light-emitting diodes (LEDs) alternately cast light on a fish within the field of view of one or more cameras. A camera can transfer images to a computer which performs visual analysis to detect attached sea lice. The different color light can highlight different features of interest along with improving clarity for sea lice detection. By combining images or analyzing separate images, analysis can inform sea lice detection.
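The capture-and-analyze flow above can be sketched as a two-stage pipeline. The detector functions below are stand-ins for whatever model performs the visual analysis (the text later mentions algorithms, neural networks, or linear regressions), and the boxes and coordinates are invented for illustration.

```python
# Hedged sketch of the detection flow: a first stage locates a fish in the
# frame, and a second stage searches only that region for attached sea lice.

def detect_fish(image):
    """First-stage identifier (stub): return a fish bounding box (x, y, w, h)."""
    return (50, 40, 200, 120)

def detect_lice(image, box):
    """Second-stage identifier (stub): return lice positions inside the box."""
    x, y, w, h = box
    return [(x + 10, y + 5)]          # one illustrative detection

def analyze(image):
    """Run both stages and report lice coordinates in image space."""
    return detect_lice(image, detect_fish(image))

print(analyze("frame"))               # [(60, 45)]
```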
- The wavelength of a beam of light can change depending on the medium in which the beam propagates. The visible spectrum is continuous. Wavelength ranges for given colors within the continuous spectrum are approximate but wavelength or frequency can be used to clearly differentiate two or more colors.
- In some implementations, the detection information for specific fish can be stored. The stored data can be used for lice mitigation, other diagnoses, or in producing analytics. For example, a fish can be detected by a system employing image analysis to have a certain quantity of sea lice attached to the right-side gill. This information can be passed to an automatic delouser, which can remove the sea lice. In addition, this information can be stored on a server to inform population analytics.
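A per-fish record of the kind described above might look like the following sketch; the field names and identifiers are illustrative, not part of the specification.

```python
# Minimal storage format for per-fish sea lice detections, supporting both
# mitigation (where are the lice?) and analytics (how many per fish?).
from dataclasses import dataclass, field

@dataclass
class LiceDetection:
    fish_id: str
    location: str    # e.g., "right-side gill"
    count: int

@dataclass
class DetectionStore:
    records: list = field(default_factory=list)

    def add(self, detection):
        self.records.append(detection)

    def total_for(self, fish_id):
        """Tally of detected lice for one fish, for delousing or analytics."""
        return sum(d.count for d in self.records if d.fish_id == fish_id)

store = DetectionStore()
store.add(LiceDetection("fish-109", "right-side gill", 3))
store.add(LiceDetection("fish-109", "head", 4))
print(store.total_for("fish-109"))   # 7
```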
- In some implementations, the lighting controller can use pairs of light pulses. For example, the lighting controller can use a red light and a blue light to illuminate a fish. The red light and the blue light can alternate illuminating the fish such that, at some point, the fish is illuminated by the red light and at another point the fish is illuminated by the blue light. Images can be captured of the fish while it is being illuminated by the red light. Images can also be captured of the fish while it is being illuminated by the blue light. Image processing can combine an image captured with red light illumination and an image captured with blue light illumination to determine if the fish has a certain condition. Conditions can include a sea lice infection, a lesion on the body of the fish, or a physical deformity such as a shortened operculum.
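Combining a red-illuminated capture with a blue-illuminated capture can be sketched as a per-pixel pairing of the two intensity maps; the tiny 2×2 "images" below are invented for illustration.

```python
# Sketch of image combination: pair the intensity measured under red
# illumination with the intensity measured under blue illumination for each
# pixel, producing a two-channel image for downstream condition analysis.

def combine(red_img, blue_img):
    """Zip two equally sized grayscale captures into one (red, blue) image."""
    return [
        [(r, b) for r, b in zip(red_row, blue_row)]
        for red_row, blue_row in zip(red_img, blue_img)
    ]

red = [[10, 20], [30, 40]]    # intensities captured under red light
blue = [[5, 5], [50, 5]]      # intensities captured under blue light
merged = combine(red, blue)
print(merged[1][0])           # (30, 50): both channels for the same pixel
```

A condition detector can then examine both channels of each pixel at once, which is the point of capturing the two illuminations close together in time.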
- The lighting controller can be used in any area with fish. For example, the lighting controller can be used within a fish pen. The lighting controller can also be used within a fish run.
- In some implementations, the lighting controller can include a blue light with a specific frequency range. For example, the lighting controller can include a blue light that can produce peak power within a wavelength range of 450 nanometers to 480 nanometers.
- In some implementations, the lighting controller can have a certain frequency at which illuminators alternate. For example, the lighting controller can use pairs of light pulses which alternate on and off more than sixty times a second. The specific frequency can be chosen to ensure that a fish does not perceive the illuminators alternating. The specific frequency can be chosen to ensure that a fish perceives the illuminators as steady sources of light.
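The alternation-rate constraint above implies a simple timing budget: if pulse pairs alternate more than sixty times a second, one red-plus-blue pair must complete in under 1/60 of a second. A quick calculation (the even split between the two pulses is an assumption):

```python
# Timing budget for the pulse-pair rate described above.

def max_pair_period_ms(pairs_per_second):
    """Longest allowed duration of one red+blue pulse pair, in milliseconds."""
    return 1000.0 / pairs_per_second

pair_ms = max_pair_period_ms(60)
print(round(pair_ms, 2))        # 16.67 ms per pair at the 60 Hz threshold
print(round(pair_ms / 2, 2))    # 8.33 ms per pulse if the pair splits evenly
```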
- In some implementations, camera exposures can be timed to coincide with periods of time in which a fish is illuminated. For example, a camera can open for exposures for a portion of time between when an illuminator is on and illuminating a fish and when the illuminator is off and not illuminating the fish. In some implementations, a camera can open for exposures for a portion of time between when an illuminator is off and not illuminating a fish and when the illuminator is on and illuminating the fish.
- In some implementations, the lighting controller can activate illuminators without any overlap. For example, the lighting controller can illuminate a fish with a blue light for a period of time. The lighting controller can then stop illuminating the fish with the blue light. The lighting controller can then illuminate a fish with a red light for a period of time.
- In some implementations, machine learning can be used to inform elements of the detection process. For example, the lighting controller can vary the time of camera exposure or illumination depending on current background lighting levels or the type of fish detected in the field of view. In some cases, the lighting controller or image analysis process can use positive or negative detection results to inform machine learning. For example, the lighting controller can use a learning data set of known sea lice infected fish and adjust illumination frequency, exposure lengths, or other parameter to produce a greater number of accurate detections or fewer inaccurate detections.
- In some implementations, an image buffer can be used to help aid in image capture. For example, a camera can capture an exposure for an amount of time and save a resulting image to an image buffer. The camera can continue to save images to the image buffer until the image buffer is full. Images saved to the image buffer can be transferred to another device or computer. In some cases, an image buffer can be used to reduce the amount of time in between consecutive image captures. Reducing the amount of time in between consecutive image captures can be advantageous when combining two or more images (e.g., an image captured of a fish illuminated with a red light and an image captured of the fish illuminated with a blue light).
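The buffering behavior described above can be sketched as a small fixed-capacity store; the capacity and the string stand-ins for images are illustrative.

```python
# Minimal image buffer: captures accumulate until the buffer is full, then
# the whole batch is transferred to another device and the buffer empties.

class ImageBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.images = []

    def save(self, image):
        """Store one capture; returns False if the buffer is already full."""
        if len(self.images) >= self.capacity:
            return False
        self.images.append(image)
        return True

    def transfer(self):
        """Hand off all buffered images and reset the buffer."""
        batch, self.images = self.images, []
        return batch

buf = ImageBuffer(capacity=2)
buf.save("red_frame")
buf.save("blue_frame")
print(buf.save("extra"))     # False: the buffer is full
print(len(buf.transfer()))   # 2 images transferred; the buffer is empty again
```

Because a save into the buffer is faster than a transfer to another device, buffering shortens the gap between the red-lit and blue-lit captures that are later combined.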
- The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features and advantages of the invention will become apparent from the description, the drawings, and the claims.
-
FIG. 1 is a diagram showing an example of a system for sea lice detection. -
FIG. 2 is a diagram of an exposure pattern. -
FIG. 3 is a diagram of an alternative exposure pattern. -
FIG. 4 is a diagram of another alternative exposure pattern. -
FIG. 5 is a flow diagram illustrating an example of a process for sea lice detection using a lighting controller. -
FIGS. 6A, 6B, and 6C are diagrams of custom Bayer filters. -
FIG. 7 is a diagram of a method for image collection using a beam splitter. -
FIG. 8 is a diagram of a method for image collection using a rotating mirror. -
FIG. 9 is a diagram of a method for image collection using a pair of stereo cameras. - Like reference numbers and designations in the various drawings indicate like elements.
-
FIG. 1 is a diagram showing an example of a system 100 for sea lice detection. The system 100 is comprised of a fish pen 101, a control unit 120, two primary illuminators 102 and 104, a camera 105, and the fish 109. The fish pen 101 is formed with netting, e.g., rope, nylon, or silk. Illuminators can be controlled with signals from a lighting controller. In some implementations, the lighting controller can be connected to the control unit 120. The fish 109 can be a member of a population of fish located within the fish pen 101. In this example, the fish is a salmon with sea lice on its body. - In some implementations, the detection of sea lice can include specific species of sea lice, for example, several species of ectoparasitic copepods of the genera Lepeophtheirus and Caligus. The type of fish being analyzed can affect the process of sea lice detection. For example, upon detection of a salmon, a system can adapt its detection process for Lepeophtheirus salmonis, a species of sea lice which can be especially problematic for salmon. In some implementations, a detection of a specific species of sea lice can be separated from other sea lice detections. For example, a detection of Lepeophtheirus salmonis can be separated from sea lice detections of Caligus curtis and Lepeophtheirus hippoglossi.
- In FIG. 1, the fish pen 101 is shown at an initial time τ1 and a later time τ2. The status (e.g., on/off) as well as the position of the contents of the fish pen 101 can change from time τ1 to τ2. - The times τ1 and τ2 correspond to the time at which a first image is captured (τ1) and the time at which a second image is captured (τ2). In some implementations, different exposure techniques can enable sea lice detection with only a single image capture. The various exposure techniques as well as exposure patterns are discussed below.
- The two primary illuminators 102 and 104 emit light of different colors. Illuminator 102 transmits light within the wavelength range of 440 nm to 485 nm and appears blue. The blue light region is distinct from the cyan light region in that the blue light region stretches from 450 nm up to 485 nm, while cyan light starts at 485 nm and increases to 500 nm. Blue light can have peak power between 450 nm and 485 nm wavelengths while cyan light can have peak power between 485 nm and 500 nm wavelengths. Furthermore, the light of a blue LED used in the lighting controller can be concentrated towards the lower wavelengths of the blue light region, creating a separation between blue light and cyan light. The separation can be thousands of gigahertz or greater, which equates to roughly ten percent of the entire visible spectrum. A greater separation between red light (e.g., 625 nm to 780 nm wavelength) and blue light (e.g., 450 nm to 485 nm wavelength) can result in greater accuracy in sea lice detection as well as detections of skin lesions, shortened operculum or other physical deformities and skin features. -
Illuminator 104 transmits light within the wavelength range of 620 nm to 750 nm and appears red. Frequency can be tuned to maximize frequency-space separation while retaining visible light for camera image capture and minimizing environmental disruptions (e.g., light absorption, light scattering). - The camera 105 captures visible light images. The exposures of camera 105 can be timed with the illumination of any other illuminators in the fish pen 101 (e.g., illuminator 102, illuminator 104, additional illuminators). The exposure of camera 105 and the illumination of any illuminator can be controlled by the control unit 120. - In some implementations, secondary illuminators can be used. Secondary illuminators can provide additional light for exposures of camera 105. For example, secondary illuminators can be used to brighten the image of the fish. This can be useful in situations where surface light is minimal. Secondary illuminators can also enable the ability to control the ambient light of an image capture, which can be useful in controlling for varying water or location conditions. - In some implementations, more or fewer illuminators can be used. For example, in some situations, secondary illuminators may not be required. These situations may include applications where background light is sufficient or does not pose challenges for sea lice detection. Fewer illuminators can also be used by installing custom image filters to capture an image or images.
- Stage A in FIG. 1 shows a particular wavelength being selected by the control unit 120. This wavelength can be used by light emitting diodes (LEDs) within the fish pen shown in item 101. The LEDs can be activated to illuminate the fish 109. In some implementations, other forms of light can be used. For example, instead of LEDs, incandescent light bulbs can be used. Other forms of light production may be used for either the primary set or a secondary set of illuminators. - In some implementations, the wavelengths can be set before imaging events take place. For example, an LED can be installed which emits light in the blue visible spectrum with wavelengths between 440 nm and 485 nm. Another LED can be installed which emits light in the red visible spectrum with wavelengths between 620 nm and 750 nm. In general, it can be advantageous to use dissimilar frequencies, one with a longer wavelength (towards infrared) and another with a shorter wavelength (towards ultraviolet). Light's reaction in water should be considered, as it can prevent some frequencies of light from propagating effectively and therefore functioning properly as a primary illuminator.
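The advantage of dissimilar wavelengths can be checked numerically with the relation ν = c/λ. The wavelengths below are the example band edges above (485 nm for blue, 620 nm for red), taken as vacuum wavelengths for the sketch.

```python
# Frequency separation between the example blue (440-485 nm) and red
# (620-750 nm) bands, using frequency = c / wavelength.

C = 2.998e8  # speed of light in vacuum, m/s

def freq_thz(wavelength_nm):
    """Frequency in terahertz of light with the given vacuum wavelength."""
    return C / (wavelength_nm * 1e-9) / 1e12

blue_edge = freq_thz(485)    # lowest-frequency blue light in the band
red_edge = freq_thz(620)     # highest-frequency red light in the band
print(round(blue_edge - red_edge))   # about 135 THz between the nearest edges
```

Even at their nearest edges, the two bands are separated by over a hundred terahertz, which keeps the two illuminations easy to distinguish after filtering.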
- In some implementations, the frequency of the illumination LEDs can be tuned remotely. For example, revolving LED wheels can be used to pick from a variety of LEDs. LEDs can be chosen based on effectiveness. Criteria can include an ability to produce images likely to result in true positive sea lice detection.
- Stage B in FIG. 1 shows the pen 101 at time τ1. The fish pen 101 contains the fish 109, the camera 105, and the primary illuminators 102 and 104. In this example, secondary illuminators are not used while the primary illuminators 102 and 104 are used and are set to the colors blue and red, respectively. The illuminators can be controlled by a lighting controller (e.g., control unit 120) connected to the camera 105 or by the camera 105 itself. The camera can time exposures as shown in FIG. 2, FIG. 3, and FIG. 4. The specifics of the different exposure patterns will be discussed later in this application. - At time τ1, the blue LED illuminator fires and bathes the
fish 109 in blue light. The camera 105 opens exposures to coincide with the blue LED illuminator. The camera 105 can open exposures simultaneously with the flash of an illuminator or after the beginning of the flash. - Stage C in FIG. 1 shows an image 110 created by an exposure of camera 105 and the illumination of the blue LED 102. The dot pattern in the image 110 represents the color blue of the illuminator used to capture the image. In the image 110, the fish 109 is shown with sea lice 111 attached near the head. - In some implementations, multiple fish can be detected within an image. For example, the image taken by
camera 105 can show multiple fish. The multiple fish can have individual sea lice detections. - Stage D in FIG. 1 shows the pen 101 at time τ2, temporally separated from the fish pen 101 at time τ1. The fish 109 has moved from left to right as the second illuminator, the red LED 104, fires. The firing of the illuminator 104 coincides with the exposure of camera 105. At time τ2, the red LED illuminator 104 fires and bathes the fish 109 in red light. The camera 105 opens exposures to coincide with the red LED illuminator. The camera 105 can open exposures simultaneously with the flash of an illuminator or after the beginning of the flash. - Stage E in FIG. 1 shows an image 115 created by an exposure of camera 105 and the illumination of the red LED 104. In contrast to the primarily blue image 110, the image 115 is primarily red owing to the illumination of the red LED 104. This is represented by the absence of the dots used to represent the blue illumination in the image 110. In the image 115, the fish 109 is shown with sea lice 111 attached. - In some implementations, the exposure of camera 105 need not be simultaneous with illuminators. For example, the blue LED 102 can fire before the camera 105 begins capturing images or after. Images captured by the camera 105 can be selected based on illuminator status during image capture. - Stage F in
FIG. 1 involves feature selection. Feature selection can be a form of image analysis performed on images (e.g., image 110, image 115). In some implementations, image 110 and image 115 can be combined. Image analysis can be performed to detect features on the body of the fish 109. The image analysis can be performed by various computational methods including algorithms, neural networks, or linear regressions. - In some implementations, the image analysis may be composed of multiple steps. For example, a rough object identifier may be used to detect the fish 109 within the image 110. A second object identifier may use the output of the first object identifier to locate objects on the fish 109 (e.g., the sea lice 111). The multiple steps can be performed by various computational methods including algorithms, neural networks, or linear regressions. - Stage G in
FIG. 1 involves detecting sea lice based on the image analysis performed. In some implementations, the image of the body of the fish can be separated from the background. Other pre-processing methods can prepare stages of image analysis. Sea lice surrounding and overlaying the image of the body can be detected, counted, and attributed to a specific fish. Tallies of sea lice can be kept for individual fish, groups of fish, or whole populations. Detected sea lice data can be used by the system to inform further steps, either for mitigation or analytics. - Stage H in FIG. 1 shows a possible act related to the detection of sea lice. In some implementations, the act can be a form of sea lice mitigation. Techniques can include focused laser light, where coordinates from the detected sea lice data can be used to target the lasers. Sea lice mitigation can take place in sync with detection or after detection. Detected sea lice data can be stored for future sea lice mitigation, or for analytics, by other devices within the system. In some implementations, the system 100 can store detected sea lice data and inform human workers to proceed with a sea lice mitigation technique. For example, infected fish can be tagged with a location which workers can use to catch and delouse the fish. - Stage I in
FIG. 1 shows the output 121 of the control unit 120. The detection output 121 can include data related to the event of sea lice detection. For example, the detection output 121 can include instructions for sea lice mitigation, data related to the fish 109, or data related to the sea lice 111. For example, the detection output can specify that seven sea lice are on the fish 109 at specific coordinates or attached to specific features of the fish. The output can specify that sea lice mitigation for fish 109 should be conducted by hand. This data can be stored or used within other systems connected to or within system 100. - The system 100 can also be useful in detecting other conditions. For example, skin lesions on a fish can be detected using similar methods and processes. In some implementations, instead of, or in addition to, analyzing images illuminated by different frequencies of light for elements denoting sea lice infection, a system can perform other analysis. For example, a system can analyze images illuminated by different frequencies of light for elements denoting skin lesions or physical deformities such as shortened operculum. -
FIG. 2 is a diagram of an exposure pattern 200 which can be used by a lighting controller of system 100. The exposure pattern 200 is comprised of a blue LED 201, a red LED 204, and a camera 206. The boxes similar to item 202 represent time intervals in which the blue LED 201 is illuminating. The boxes similar to item 205 represent time intervals in which the red LED 204 is illuminating. The boxes similar to item 207 represent time intervals in which the camera 206 is open for exposures. The time interval of the blue LED illumination 202 and the time interval of the red LED illumination 205 can be considered an initial pair of light pulses 203. Succeeding pairs of light pulses may follow the initial pair. - Proceeding within FIG. 2 from left to right shows the progression of the exposure pattern from its beginning to a later time. In this example, the blue LED 201 can fire for a duration of 5 milliseconds. The camera 206 opens for exposure during this window. The exposure duration can vary but can overlap with the period of illumination from the blue LED 201. An image captured during the illumination of the blue LED 201 can be stored using a computer device connected to the camera 206 or by the camera 206 itself. - At the end of the illumination window, the
blue LED 201 stops illuminating. After the blue LED has stopped illuminating, thered LED 204 begins illuminating. In some implementations, an overlap between the two LEDs can be used. For example, if theblue LED 201 illuminates fromtime 0 to time 5 ms, thered LED 204 can fire from time 4 ms to 9 ms. Furthermore, the time intervals of the LEDs illumination need not be identical. For example, theblue LED 201 can illuminate for 5 ms while thered LED 204 illuminates for 10 ms. - In some implementations, a gap between sequential illuminations can be inserted. For example, after the illumination of the
blue LED 201 but before the illumination of the red LED 204, the pattern 200 can contain a 1 ms period of non-illumination. In some implementations, periods of non-illumination can be inserted to prevent a subject being illuminated simultaneously by the blue LED 201 and the red LED 204. - After a delay, the
camera 206 can start exposures again. In some implementations, this delay can be inserted to transfer an image to a storage device or somewhere within memory. For example, the delay can be 40 ms. Different implementations can use different delay lengths. In this example, the delay corresponds to the time from the beginning of one exposure to the beginning of the next exposure. The next exposure can be of an illumination that has not previously been captured. For example, if the illumination of the blue LED 201 was captured in exposure number one, the illumination of the red LED 204 can be captured in exposure number two. In this example, the time between exposure one and exposure two can be considered a delay. - While the camera is not capturing an exposure, the
LEDs 201 and 204 can continue illuminating according to the exposure pattern 200. - The
exposure pattern 200 can continue for as long as is required. In some implementations, the exposures will end after a subject has left the field of view of camera 206. Multiple images can be combined or processed separately. Single images can also be processed. - In some implementations, the
red LED 204 can emit peak power at a specific wavelength. For example, the red LED 204 can emit peak power at a wavelength between 625 nm and 780 nm. In some implementations, the blue LED 201 can emit peak power at a specific wavelength. For example, the blue LED 201 can emit peak power at a wavelength between 450 nm and 485 nm. -
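The start-to-start exposure delay just described can be sketched as a small scheduling routine. This is an illustrative reconstruction only: the function names are invented, and the 5 ms pulses and 40 ms delay are the example values given above.

```python
# Sketch of the pattern-200 idea: LEDs alternate in short pulses while the
# camera exposes no more often than every `delay_ms`, alternating which
# color it captures. Names and structure are illustrative, not the patent's.

def led_pulses(total_ms, pulse_ms=5.0):
    """Alternating blue/red pulse train: blue at 0-5 ms, red at 5-10 ms, ..."""
    t, color = 0.0, "blue"
    while t < total_ms:
        yield color, t, t + pulse_ms
        t += pulse_ms
        color = "red" if color == "blue" else "blue"

def exposure_times(total_ms, delay_ms=40.0, pulse_ms=5.0):
    """Camera exposures spaced at least delay_ms apart (start to start).

    Each exposure is aligned with the next pulse of the color that has not
    yet been captured, as in the example where exposure one captures the
    blue illumination and exposure two captures the red.
    """
    exposures, t_next, want = [], 0.0, "blue"
    for color, start, end in led_pulses(total_ms, pulse_ms):
        if color == want and start >= t_next:
            exposures.append((want, start, end))
            t_next = start + delay_ms       # start-to-start delay
            want = "red" if want == "blue" else "blue"
    return exposures

for color, start, end in exposure_times(200.0):
    print(f"capture {color:4s} at {start:5.1f}-{end:5.1f} ms")
```

With these example values, consecutive captures alternate between the blue and red illumination and begin at least 40 ms apart.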
FIG. 3 is a diagram of an exposure pattern 300 which can be implemented by a lighting controller of system 100. The exposure pattern 300 is comprised of a blue LED 301, a red LED 304, and a camera 306. The boxes similar to item 302 represent time intervals in which the blue LED 301 is illuminating. The boxes similar to item 305 represent time intervals in which the red LED 304 is illuminating. The boxes similar to item 307 represent time intervals in which the camera 306 is open for exposures. The time interval of the blue LED illumination 302 and the time interval of the red LED illumination 305 can be considered an initial pair of light pulses 303. Succeeding pairs of light pulses may follow the initial pair. - Proceeding within
FIG. 3 from left to right shows the progression of the exposure pattern from beginning to a later time. In this example, the blue LED 301 fires at time zero for a duration of 5 milliseconds. The camera 306 opens for exposure during this window for a duration of 4 milliseconds. An image captured during the illumination of the blue LED 301 can be stored using a computer device connected to the camera 306 or the camera 306 itself. - At the end of the illumination window, the
blue LED 301 stops illuminating. After the blue LED has stopped illuminating, the red LED 304 begins illuminating. In some implementations, an overlap between the two LEDs can be implemented. For example, if the blue LED 301 illuminates from time 0 to time 5 ms, the red LED 304 can fire from time 4 ms to 9 ms. Furthermore, the time intervals of the LEDs' illumination need not be identical. For example, the blue LED 301 can illuminate for 5 ms while the red LED 304 illuminates for 10 ms. - In some implementations, a gap between sequential illuminations can be inserted. For example, after the illumination of the
blue LED 301 but before the illumination of the red LED 304, the pattern 300 can contain a 1 ms period of non-illumination. In some implementations, periods of non-illumination can be inserted to prevent a subject being illuminated simultaneously by the blue LED 301 and the red LED 304. - After
the initial exposure 307, the camera 306 can start exposures again. In this example, the delay between the first and second exposures is shorter than in exposure pattern 200. A shorter delay can be accomplished by using a larger buffer to store multiple images captured within exposures. A graph of the buffer is shown in item 310. The buffer graph 310 shows, relative to the horizontal axis of time, the amount of image data held in the image buffer. Item 311 shows the buffer storage increase as the image 307 is captured. Item 312 shows the buffer storage increase again as the image 308 is captured. An image from both exposure 307 and exposure 308 can be stored within the image buffer if the data stored is below a limit like the buffer limit line shown in item 314. - In order to stay within the
buffer limit 314, the exposure pattern can delay to give time for the images stored in the buffer to be transferred out of the buffer onto another storage device. Different implementations can use different delay lengths. The delay can be the time between two consecutive groups of exposures. For example, the delay for pattern 300 can be 80 ms as measured from the beginning of exposure 307 to the beginning of exposure 309. This delay may be calibrated to give enough time for the buffer to transfer data. The process of buffer transfer can be seen in graph 310 as a downward slanted line. - In some implementations, different delay lengths, as well as the number of exposures captured within an exposure group, can vary. For example, instead of two exposures within the first exposure group, four can be implemented. In general, the number of exposures per group before a period of non-exposure depends on the size of the image buffer used. During a period of non-exposure, data can be offloaded from the image buffer. With a large image buffer, more images can be captured with less delay in between consecutive shots.
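A rough sketch of the buffer arithmetic behind this grouping follows. The byte counts and transfer rate below are invented for illustration; only the two-exposure group and the 80 ms figure come from the example above.

```python
# Sketch of the buffer reasoning in pattern 300: a group of back-to-back
# exposures fills an image buffer, then the pattern pauses while the buffer
# drains to external storage. All numeric values here are hypothetical.

def exposures_per_group(buffer_limit_bytes, image_bytes):
    """How many exposures fit in one group before the buffer limit is hit."""
    return buffer_limit_bytes // image_bytes

def drain_time_ms(images_in_buffer, image_bytes, transfer_bytes_per_ms):
    """Minimum non-exposure delay needed to empty the buffer."""
    return images_in_buffer * image_bytes / transfer_bytes_per_ms

image_bytes = 8_000_000                               # hypothetical 8 MB frame
print(exposures_per_group(16_000_000, image_bytes))   # 2-image buffer -> 2
print(exposures_per_group(32_000_000, image_bytes))   # 4-image buffer -> 4
print(drain_time_ms(2, image_bytes, 200_000))         # -> 80.0 ms
```

A larger buffer raises the exposures-per-group count, matching the observation that more images can be captured with less delay between consecutive shots.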
- After a period of non-exposure, the
camera 306 can resume exposures. The moment to resume exposures can coincide with buffer storage availability as well as illumination from the illuminators (e.g., the blue LED 301, the red LED 304). For example, exposure 307 is timed with illumination from the blue LED 301, and exposure 308 is timed with illumination from the red LED 304. The first exposure after a period of non-exposure can be timed with the blue LED 301 or with the red LED 304; in this case, it is timed with the blue LED 301. The exposure 309 after a period of non-exposure can also coincide with buffer storage availability as shown in graph 310. - While the camera is not exposing, the
LEDs 301 and 304 can continue illuminating according to the exposure pattern 300. - The
exposure pattern 300 can continue for as long as is required. In some implementations, the exposures will end after a subject has left the field of view of camera 306. Multiple images can be combined or processed separately. Single images can also be processed. - In some implementations, the
red LED 304 can emit peak power at a specific wavelength. For example, the red LED 304 can emit peak power at a wavelength between 625 nm and 780 nm. In some implementations, the blue LED 301 can emit peak power at a specific wavelength. For example, the blue LED 301 can emit peak power at a wavelength between 450 nm and 485 nm. -
FIG. 4 is a diagram of an exposure pattern 400 which can be implemented by a lighting controller of system 100. The exposure pattern 400 is comprised of a blue LED 401, a red LED 404, and a camera 406. The boxes similar to item 402 represent time intervals in which the blue LED 401 is illuminating. The boxes similar to item 405 represent time intervals in which the red LED 404 is illuminating. The boxes similar to item 407 represent time intervals in which the camera 406 is open for exposures. The time interval of the blue LED illumination 402 and the time interval of the red LED illumination 405 can be considered an initial pair of light pulses 403. Succeeding pairs of light pulses may follow the initial pair. In some cases, intervals in which neither the blue LED 401 nor the red LED 404 is illuminated may be inserted between or within a pair or pairs of light pulses. - Proceeding within
FIG. 4 from left to right shows the progression of the exposure pattern from beginning to a later time. The time of illumination can be set for each light source used in an exposure pattern. For example, in some implementations, the blue LED 401 can fire at time zero for a duration of 5 milliseconds. When the subject is illuminated with a light source, the camera 406 can open for exposure. An image captured during the illumination of the blue LED 401 can be stored using a computer device connected to the camera 406 or the camera 406 itself. - At the end of the illumination window, the
blue LED 401 stops illuminating. After the blue LED has stopped illuminating, the red LED 404 begins illuminating. In some implementations, an overlap between the two LEDs can be used. For example, if the blue LED 401 illuminates from time 0 to time 5 ms, the red LED 404 can fire from time 4 ms to 9 ms. Furthermore, the time intervals of the LEDs' illumination need not be identical. For example, the blue LED 401 can illuminate for 5 ms while the red LED 404 illuminates for 10 ms. Other durations can also be used. - In some implementations, a gap between sequential illuminations can be inserted. For example, after the illumination of the
blue LED 401 but before the illumination of the red LED 404, the pattern 400 can contain a 1 ms period of non-illumination. In some implementations, periods of non-illumination can be inserted to prevent a subject being illuminated simultaneously by the blue LED 401 and the red LED 404. - After a delay, the
camera 406 starts exposures again. In some implementations, this delay can be inserted to transfer the image to a storage device or somewhere within memory. For example, the delay can be 40 ms from exposure 407 to exposure 408. Different implementations can use different delay lengths. The delay corresponds to the time difference between two sequential camera exposures. For example, if the blue LED 401 illumination was captured in exposure 407, the red LED 404 illumination can be captured in exposure 408 after a given delay. - After another delay, the
camera 406 exposes again, as shown in item 409. The exposure 409 captures an image while no illuminators are illuminated. In the moments before, the blue LED 401 illuminates, followed by the red LED 404, but the exposure pattern 400 includes a period of non-illumination after the red LED 404 within the sequence. The exposure 409 can be used to get additional data. For example, the exposure 409 can be used to get data on background lighting. This can be useful in situations where other sources of light may be of interest. The images captured without illumination from the blue LED 401 or the red LED 404 can be used in other processes. For example, the exposure 409 can be used to get readings on water condition. The pattern of blue LED exposure 407 and red LED exposure 408 followed by non-LED exposure 409 can be used repeatedly in the exposure pattern 400. - While the camera is not exposing, the
LEDs 401 and 404 can continue illuminating according to the exposure pattern 400. - The
exposure pattern 400 can continue for as long as is required. In some implementations, the exposures will end after a subject has left the field of view of camera 406. Multiple images can be combined or processed separately. Single images can also be processed. - The LEDs used as illuminators in the
example exposure patterns 200, 300, and 400 can be selected based on experimental results. - In experiments capturing images of lice on salmon, analysis was performed on frequencies ranging from violet (400 nm wavelength) to near-infrared (1000 nm wavelength). A classifier was trained with a regularization parameter over which frequency bins were used as input, and it chose to use the shortest and longest wavelengths. Other combinations of various greens and blues (which matched the LEDs capable of functioning within the lighting apparatus) were tried, but the performance of the red and blue LED combination was superior. Additional subjective tests comparing various lighting schemes reached the same conclusion.
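The frequency-bin selection described in this experiment can be illustrated with a toy reconstruction: a classifier trained with an L1 (sparsity) penalty over per-bin features tends to keep only the informative bins. The data here is synthetic, arranged so that only the 400 nm and 1000 nm bins carry signal, mimicking the reported outcome; none of this code is from the actual experiment.

```python
import numpy as np

# Synthetic per-wavelength-bin features; only the extreme bins carry signal.
rng = np.random.default_rng(0)
bins_nm = [400, 500, 600, 700, 800, 900, 1000]
X = rng.normal(size=(400, len(bins_nm)))
y = (2.0 * X[:, 0] - 2.0 * X[:, -1] > 0).astype(float)

def train_l1_logreg(X, y, lam=0.05, lr=0.1, steps=2000):
    """L1-regularized logistic regression fit by proximal gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * (X.T @ (p - y) / len(y))
        # Soft-thresholding step enforces the sparsity penalty.
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

w = train_l1_logreg(X, y)
kept = [nm for nm, wi in zip(bins_nm, w) if abs(wi) > 0.1]
print(kept)  # the penalty should keep only the informative extreme bins
```

The sparsity penalty drives the weights of uninformative middle bins toward zero, leaving the shortest and longest wavelengths, analogous to the selection reported above.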
- The rate at which the LEDs alternate can be fast enough to make the alternating LEDs appear as steady, non-flashing lights, when perceived by an eye of a human or a fish. For example, the LEDs can alternate at a frequency of 80 to 120 Hz. A high alternating rate is advantageous as it allows the flashing to be less noticeable by the fish being illuminated as well as reducing the time, and thus the visual differences, between consecutive snapshots of the fish when exposure patterns are used. Reducing visual differences can help reduce complexity and improve the resulting accuracy of any later image combination.
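As a quick arithmetic check of the quoted range:

```python
# At the quoted 80-120 Hz alternation rate, how long is one alternation
# period, i.e., how close together are consecutive snapshots of the fish?

def alternation_period_ms(freq_hz):
    """Time between successive LED alternations at a given rate."""
    return 1000.0 / freq_hz

for f in (80, 100, 120):
    print(f"{f} Hz -> {alternation_period_ms(f):.2f} ms between alternations")
# 80 Hz -> 12.50 ms, 100 Hz -> 10.00 ms, 120 Hz -> 8.33 ms
```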
- Specific orders have been shown for the
exposure patterns 200, 300, and 400. In some implementations, the order of exposures within the exposure patterns can be changed. For example, in FIG. 4, the first exposure can be with the red LED 404 illuminating while the second can be an exposure with no illumination. -
FIG. 5 shows a process 500 for sea lice detection using a lighting controller. - The
process 500 includes preparing an illumination system and a camera system (502). For example, control unit 120 from FIG. 1 can select the wavelength used for illuminating the fish 109. - The
process 500 includes detecting fish motion within the field of view of the camera system (504). For example, as the fish 109 swims within the field of view of the camera 105, the illuminators 102 and 104 and the camera 105 can coordinate through signals sent from control unit 120 in a manner similar to those discussed in FIG. 2, FIG. 3, or FIG. 4. - The
process 500 includes using a lighting controller exposure pattern, involving the illumination system and the camera, to capture fish images (506). For example, a specific exposure pattern similar to pattern 200 of FIG. 2 can be used, with the blue LED 201 and red LED 204 functioning as the illumination system and the camera 206 functioning as the camera. - The
process 500 includes analyzing captured fish images for sea lice (508). For example, the control unit can gather image 110 and image 115 and perform image analysis to detect sea lice 111. - The
process 500 includes storing results within a computer system (510). For example, control unit 120 can store the results of the image analysis involving image 110 and image 115. - The
process 500 includes employing mitigation techniques based on results (512). The mitigation techniques can include targeted treatments which can be comprised of lasers, fluids, or mechanical devices such as a brush or suction. For example, the control unit 120 can activate lasers to focus intense light on a fish to remove sea lice from the fish. The lasers can use sea lice location data gleaned from the image analysis performed. The control unit 120 can also delegate the mitigation to other systems or devices (e.g., other computer systems, humans). - In some implementations, more or fewer than two lights can be used for illuminating the subject. For example, instead of the
blue LED 102 and the red LED 104, another LED of a different frequency or color can be added. The illumination of any additional LED can be captured by a camera as images like the images 110 and 115. - In some implementations, more than one camera can be used. For example, instead of the
camera 105 capturing images, an additional camera can be used to capture images. In some implementations, an additional camera can capture alternate angles of a subject. For example, an additional camera within the fish pen 101 can capture one side of fish 109 while the camera 105 captures the other. - In some implementations, the illumination from illuminators can be of any frequency. For example, instead of the blue and red LED lights used by
illuminator 102 and illuminator 104 respectively, infrared and ultraviolet light can be used. The cameras used to capture images of scenes illuminated by illuminators can have the ability to capture the specific frequency of the illuminator. For example, if an illuminator is illuminating ultraviolet light on the subject, a camera can have the ability to sense and record the ultraviolet light within an image. Any frequency can be used within an exposure pattern like those in FIG. 2, FIG. 3, and FIG. 4. - In some implementations, more than one fish can be processed within a system like
system 100. For example, the pen 101 in FIG. 1 can show not only the fish 109 but an additional fish. The additional fish can be captured by the camera 105. Both the fish 109 and the additional fish can be processed by control unit 120, and data representing their respective detection results can be contained within the resulting detection output 121. Any number of fish can be processed in this way. Possible limitations to the number of fish processed can exist in the hardware or software used. - In some implementations, more than one exposure pattern can be used. For example, both
pattern 200 from FIG. 2 and pattern 300 from FIG. 3 can be used. Combinations of patterns can bring alterations to a given pattern and may result in a new pattern which can be used by a device. In some implementations, patterns can be used based on external or internal stimuli. In some situations, it may be beneficial or desirable to choose one exposure pattern over another or a specific combination of one or more exposure patterns. - In some implementations, the exposure patterns may contain an additional light or additional lights. The
exposure patterns 200, 300, and 400 can each be altered to include an additional light. For example, within the exposure pattern 200, an additional light can fire between the illumination 202 and the illumination 205. The additional light can illuminate a given subject in a separate or similar frequency to the frequencies illuminated by illuminator 201 or illuminator 204. For example, the additional light can illuminate in ultraviolet. An exposure pattern can be altered accordingly. For example, the illumination of the ultraviolet light source can be captured by an exposure after the exposure 207. - In some implementations, the exposure patterns may contain an additional camera or additional cameras. The
exposure patterns 200, 300, and 400 can each be altered to include an additional camera. For example, within the exposure pattern 200, an additional camera can be used to capture exposures after exposure 207. The additional camera can capture an exposure of a given subject in a separate or similar frequency to the frequencies illuminated by illuminator 201 or illuminator 204. For example, the additional camera can capture exposures of light in the ultraviolet spectrum. An exposure pattern can be altered accordingly. For example, an exposure capturing ultraviolet light can be added to the exposure pattern 200 after the exposure 207. - The sea lice on a fish can be detected anywhere within a field of view of a camera. For example, the sea lice detected on a fish can be on any part of the body. The part of the body, location, or number can be included within the
detection output 121. - In some implementations, a system can alter detection techniques based on detection circumstances. For example, in the case of various fish species, the detection method can be altered to use algorithms associated with the species or other frequencies of illuminator light. Furthermore, water quality can be a circumstance of detection that could be registered by the system and alter subsequent sea lice detections. For example, if the water is murky, an increase in the brightness or quantity of lights used can be instigated and carried out by the system. Adjusting the lighting based on fish environment conditions can be a part of the illuminator controller or a separate subsystem depending on implementation. Detection techniques can also be altered by the detection of a species of fish. For example, different species could be considered a detection circumstance and registered by the system. The registering of different species could invoke different forms of detection methods.
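One hypothetical way to organize such circumstance-based adjustments is a small configuration routine; the species naming scheme, turbidity threshold, and brightness rule below are all invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch of circumstance-based adjustment: a registered species
# selects a species-specific detection model, and murky water (high turbidity)
# triggers brighter illumination. All names and thresholds are illustrative.

@dataclass
class DetectionConfig:
    model: str
    light_level: float   # relative brightness, 1.0 = default

def configure_detection(species: str, turbidity: float) -> DetectionConfig:
    # Species registered by the system selects the detection algorithm.
    model = f"sea_lice_{species.lower().replace(' ', '_')}"
    # Murky water triggers an increase in light brightness.
    light = 1.0 if turbidity < 0.5 else 1.0 + turbidity
    return DetectionConfig(model=model, light_level=light)

cfg = configure_detection("Atlantic salmon", turbidity=0.8)
print(cfg.model, cfg.light_level)   # sea_lice_atlantic_salmon 1.8
```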
- Any alteration in sea lice detection method can result in alterations of sea lice detection output and results. For example, if a sea lice detection method was altered based on the sighting of a particular species of salmon, the output can be altered to save the sea lice detection data with species-specific feature recognition. The output can also be altered to include mitigation techniques tailored to the particular species of salmon.
- In some implementations, more than two modes of light can be used in an exposure pattern. For example, instead of blue and red light, an exposure pattern can use a blue light, a red light, and a yellow light.
- In some implementations, other ranges of light can be used to illuminate the subject for image capture. For example, instead of visible light, a system can use ultraviolet light.
- The
process 500 can also be useful in detecting other conditions. For example, skin lesions on a fish can be detected using similar methods and processes. In some implementations, instead of, or in addition to, analyzing images illuminated by different frequencies of light for elements denoting sea lice infection, a system can perform other analysis. For example, a system can analyze images illuminated by different frequencies of light for elements denoting skin lesions or physical deformities such as a shortened operculum. - In some implementations, a lighting controller can use a blue illuminator composed of light with multiple wavelengths. For example, a graph of output power versus wavelength for blue light can resemble a Gaussian shape with peak power at 465 nm wavelength and 10% power at 450 nm and 495 nm wavelengths. Other implementations could have different proportions of wavelengths or different ranges of wavelengths. For example, a graph of output power versus wavelength for blue light can resemble a Gaussian shape with peak power at 460 nm and 0% power at 455 nm and 485 nm wavelengths.
- In some implementations, a lighting controller can use a red illuminator composed of light with multiple wavelengths. For example, a graph of output power versus wavelength for red light can resemble a Gaussian shape with peak power at 630 nm wavelength and 10% power at 605 nm and 645 nm. Other implementations could have different proportions of wavelengths or different ranges of wavelengths. For example, a graph of output power versus wavelength for red light can resemble a Gaussian shape with peak power at 635 nm and 0% power at 610 nm and 640 nm wavelengths.
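A Gaussian spectral profile like the ones described can be modeled directly. The example profiles above are only approximately Gaussian (their quoted 10% points are not symmetric about the peak), so this sketch fits a symmetric curve to the short-wavelength 10% point of the blue example; the function names are illustrative.

```python
import math

# Model an illuminator's output power versus wavelength as a Gaussian curve,
# deriving the width from a stated fractional-power point.

def sigma_from_point(peak_nm, point_nm, fraction):
    """Width such that relative power equals `fraction` at `point_nm`."""
    return abs(point_nm - peak_nm) / math.sqrt(2.0 * math.log(1.0 / fraction))

def relative_power(wavelength_nm, peak_nm, sigma_nm):
    """Relative output power (1.0 at the peak wavelength)."""
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2.0 * sigma_nm ** 2))

# Blue example: peak at 465 nm with 10% power at 450 nm.
sigma = sigma_from_point(peak_nm=465.0, point_nm=450.0, fraction=0.10)
print(round(relative_power(450.0, 465.0, sigma), 2))   # 0.1 by construction
print(round(relative_power(465.0, 465.0, sigma), 2))   # 1.0 at the peak
```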
-
FIGS. 6A, 6B, and 6C are diagrams of custom Bayer filters for use within sea lice detection. -
FIG. 6A includes two different color filters on the pixel array 600. The pixel array 600 can be used in fish imaging. Pixel 602 corresponds with the color filter red. Pixel 603 corresponds with the color filter blue. Pixel array 600 is partially filled for illustration purposes. Matching pattern and shading on two or more pixels of the array 600 denotes pixels of the same filter type. By adjusting a normal Bayer filter, the pixel array 600 can increase a camera's light sensitivity for key frequencies. In some implementations of sea lice detection, these key frequencies are red light (e.g., 625 nm to 780 nm wavelength) and blue light (e.g., 450 nm to 485 nm wavelength). The color filters on the pixel array 600 correspond to these frequencies. In the arrangement shown in pixel array 600, the amount of light captured in both the red and blue spectrum is effectively doubled compared with a normal red, green, and blue pixel array used in some standard cameras.
- In some implementations, the color arrangement can be swapped. For example, blue pixels can take the place or red pixels and vice versa.
- In some implementations, color filters able to transmit different ranges of wavelengths can be used. For example, the pixels able to register blue light like
item 603 in pixel array 600 could be swapped with pixels able to register ultraviolet light. -
FIG. 6B is another custom Bayer filter which includes three different color filters on the pixel array 610. Pixel 612 corresponds with the color filter blue. Pixel 614 corresponds to a blank color filter which allows all wavelengths to register evenly. Pixel 616 corresponds with the color filter red. Pixel array 610 is partially filled for illustration purposes. Matching pattern and shading on two or more pixels of the array 610 denotes filters of the same type. By adjusting a normal Bayer filter, the pixel array 610 allots an equal number of pixels to each channel (e.g., red filter channel, blue filter channel, blank filter channel). The structure is uniform and can potentially be more easily interpreted by neural networks working with output images. The color filters can accept light with wavelength within a particular range (e.g., 625 nm to 780 nm for the red filter 616, 450 nm to 485 nm for the blue filter 612, and the full visible spectrum for the blank filter 614). In some implementations, the color arrangement can be flipped.
- In some implementations, the color arrangement can be flipped. For example, blue pixels can take the place or red pixels and vice versa.
- In some implementations, color filters able to transmit different ranges of wavelengths can be used. For example, the pixels able to register blue light like
item 612 in pixel array 610 could be swapped with pixels able to register ultraviolet light. -
FIG. 6C is another custom Bayer filter which includes three different color filters on the pixel array 620. Pixel 622 corresponds with the color filter red. In this implementation, the red color filter of pixel 622 allows light to pass through if the wavelength of the light is within the range 625 nm to 780 nm. Pixel 624 corresponds with the color filter blue, in this implementation allowing light through with wavelength within the range 450 nm to 485 nm. Pixel 626 corresponds to a blank color filter which allows, in this implementation, all wavelengths to register evenly. Pixel array 620 is partially filled for illustration purposes. - Matching pattern and shading on two or more pixels of the
array 620 denotes filters of the same type. By adjusting a normal Bayer filter, the pixel array 620 creates smaller two-by-two windows (i.e., a group of four mutually connected pixels forming a square) made up of the specific filter channels used (e.g., red filter channel, blue filter channel, blank filter channel). This type of structure has the advantage of granularity as well as applications for other fish-related identification work. For example, for applications in which images are needed in more light wavelengths than just red and blue, the blank filter data can be used. In this way, the pixel array 620 is well suited for full-spectrum photography as well as sea lice detection-specific photography concentrated within the specified red and blue wavelengths. In some implementations, the color arrangement can be flipped while maintaining the general pattern.
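One possible reading of the two-by-two window structure can be sketched as a tiling routine; the cycling order of the red, blue, and blank channels here is an assumption for illustration.

```python
import numpy as np

# Sketch of a three-channel layout built from 2x2 windows, one possible
# reading of the FIG. 6C description. The exact tiling order is assumed.

def tiled_filter(h_windows, w_windows):
    """Each 2x2 window is a single channel; windows cycle red/blue/blank."""
    channels = ["red", "blue", "blank"]
    grid = np.empty((h_windows * 2, w_windows * 2), dtype=object)
    for i in range(h_windows):
        for j in range(w_windows):
            c = channels[(i * w_windows + j) % len(channels)]
            grid[i * 2:(i + 1) * 2, j * 2:(j + 1) * 2] = c
    return grid

layout = tiled_filter(2, 3)
print(layout[0:2, 0:2])   # one 2x2 window, all the same channel
```

Because each channel occupies whole 2x2 squares, per-channel images can be pulled out at window granularity while the blank channel retains full-spectrum data.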
- In some implementations, the arrangement of pixels can be changed while preserving the overall pattern. For example, the locations of red pixels similar to
red pixel 622 and blue pixels similar to blue pixel 624 can be switched while preserving the overall pattern and benefits of the pixel array 620 as shown. - In some implementations, color filters able to transmit different ranges of wavelengths can be used. For example, the pixels able to register blue light like
item 624 in pixel array 620 could be swapped with pixels able to register ultraviolet light. -
FIG. 7 is a diagram which shows a system 700 comprised of an incident light beam 701, a primary lens 702, a beam splitter 704, a red filter 705, a blue filter 706, a camera 707, and another camera 708. The system 700 can be used for image collection. - The incident light beam 701 can be the light from an exposure of a fish within a pen. The
primary lens 702 can be made out of glass and can help direct the light towards the beam splitter 704. In some implementations, additional lenses or mirrors can be used for focusing the incident beam. - The
beam splitter 704 is constructed such that a portion of the incident beam 701 is reflected and a portion of the incident beam 701 is transmitted, creating two beams of light from the incident beam 701. Additional optical elements not shown can be used within the beam splitter 704 and other devices within the system 700. For example, within the beam splitter 704 there can be multiple lenses and mirrors as well as gluing and connecting agents. - The
red filter 705 and the blue filter 706 can be tuned to allow specific frequency light through. For example, the red filter 705 can be tuned to allow only light with wavelength between 620 nm and 750 nm. The blue filter 706 can be tuned to allow only light with wavelength between 440 nm and 485 nm. - The
camera 707 and the camera 708 can capture a light beam using a light detector. The light detector captures incoming light and creates an image. For example, the light detector can encode the captured light as a list of pixels with color and intensity. The pixel information can be stored as an image and can be used by other devices and systems. - Stage A of
FIG. 7 shows the incident beam 701 moving to the lens 702. The incident beam 701 can be the light from an exposure of a scene. For example, the incident beam can be comprised of the light reflected off a fish swimming in a pen. - Stage B of
FIG. 7 shows the incident beam 701 split by the beam splitter 704. The beam splitter 704 can have multiple lenses and mirrors used to direct the two outbound light beams. - Stage C of
FIG. 7 shows the output of the beam splitter 704 passing through the red filter 705. The light before the red filter 705 can be any wavelength reflected or transmitted from the beam splitter 704. The light after the red filter 705 can be any wavelength within the range of the filter (e.g., 620 nm and 750 nm). - Stage C′ of
FIG. 7 shows the output of the beam splitter 704 passing through the blue filter 706. The beams passing through the red filter 705 and the blue filter 706 can be separate such that light passing through the red filter 705 does not also pass through the blue filter 706. The light before the blue filter 706 can be any wavelength reflected or transmitted from the beam splitter 704. The light after the blue filter 706 can be any wavelength within the range of the filter (e.g., 440 nm and 485 nm). - Stage D of
FIG. 7 shows the output of the red filter 705 reaching the camera 707. The camera 707 can use a light detector to capture the incoming light from the red filter 705 and create an image. This image can be a stored group of pixels with colors and intensities. Images captured by the camera 707 can be used for sea lice detection. - Stage D′ of
FIG. 7 shows the output of the blue filter 706 reaching the camera 708. The camera 708 can use a light detector to capture the incoming light from the blue filter 706 and create an image. This image can be a stored group of pixels with colors and intensities. Images captured by the camera 708 can be used for sea lice detection. - A possible advantage of the system 700 is that it preserves the spatial resolution of each channel. It is also easier to construct color filters (e.g.,
red filter 705, blue filter 706) than the devices in some other image collection methods (e.g., custom image chips requiring per-pixel accuracy). Simple colored optical filters can be manufactured. Some potential drawbacks include the cost of the beam splitter 704 and the fact that after splitting, the light captured by camera 707 and camera 708 will be less intense than the incident beam 701. This can be alleviated with a greater intensity of light on the subject of the image, but greater intensity light can affect the subject's behavior. For example, a more intense light may scare fish away from the field of view captured by the incident beam 701. This could result in fewer opportunities to collect images of fish. -
FIG. 8 is a diagram which shows the system 800 comprised of an incident beam 801, a primary lens 802, a spinning mirror 804, a red filter 805, a blue filter 806, a camera 807, and another camera 808. The system 800 can be used for image collection. In some implementations, images collected can be used in the process of detecting sea lice. - The
incident light beam 801 can be the light from an exposure of a fish within a pen. The primary lens 802 can be made out of glass and can help direct the light towards the spinning mirror 804. In some implementations, additional lenses or mirrors can be used for focusing the incident beam. - The
spinning mirror 804 is constructed such that the incident beam 801 is reflected at an angle. Two angles are vital to the system 800: the angle which reflects the incident beam 801 towards the red filter 805 and the camera 807, and the angle which reflects the incident beam 801 towards the blue filter 806 and the camera 808. These two angles can correspond to separate portions of a rotation of the spinning mirror 804. Additional optical elements not shown can be used within the spinning mirror 804 and other devices within the system 800. For example, before or after the spinning mirror 804 there can be multiple lenses and mirrors as well as gluing and connecting agents. - The
red filter 805 and the blue filter 806 can be tuned to allow only light of specific wavelengths through. For example, the red filter 805 can be tuned to allow only light with a wavelength between 620 nm and 750 nm. The blue filter 806 can be tuned to allow only light with a wavelength between 440 nm and 485 nm. - The
camera 807 and the camera 808 can capture a light beam using a light detector. The light detector captures incoming light and creates an image. For example, the light detector can encode the captured light as a list of pixels with color and intensity. The pixel information can be stored as an image and can be used by other devices and systems. - Stage A of
FIG. 8 shows the incident beam 801 moving to the lens 802. The incident beam 801 can be the light from an exposure of a scene. For example, the incident beam can be comprised of the light reflected off a fish swimming in a pen. - Stage B of
FIG. 8 shows the incident beam 801 reflected by the spinning mirror 804. The spinning mirror 804 can have multiple lenses and mirrors used to accept and direct the outbound light beam. - Stage C of
FIG. 8 shows the output of the spinning mirror 804 passing through the red filter 805. The light before the red filter 805 can be any wavelength reflected by the spinning mirror 804. The light after the red filter 805 can be any wavelength within the range of the filter (e.g., 620 nm to 750 nm). - Stage C′ of
FIG. 8 shows the blue filter 806. During the course of a rotation of the spinning mirror 804, the output of the spinning mirror 804 can be directed towards the blue filter 806. The directed light can pass through the blue filter 806. The light before the blue filter 806 can be any wavelength reflected or transmitted from the mirror 804. The light after the blue filter 806 can be any wavelength within the range of the filter (e.g., 440 nm to 485 nm). The two cases of output from the spinning mirror 804 can be separate in time such that light passing through the red filter 805 does not also pass through the blue filter 806. - Stage D of
FIG. 8 shows the output of the red filter 805 reaching the camera 807. The camera 807 can use a light detector to capture the incoming light from the red filter 805 and create an image. This image can be a stored group of pixels with colors and intensities. Images captured by the camera 807 can be used for sea lice detection. - Stage D′ of
FIG. 8 shows the camera 808. During the course of a rotation of the spinning mirror 804, the output of the spinning mirror 804 can be directed towards the blue filter 806. The output of the blue filter 806 can be directed towards the camera 808. The camera 808 can use a light detector to capture the incoming light from the blue filter 806 and create an image. This image can be a stored group of pixels with colors and intensities. Images captured by the camera 808 can be used for sea lice detection. - The
spinning mirror 804 can rotate at high speed and direct the portion of the incident beam 801 reflected from the spinning mirror 804 into a camera (e.g., the camera 807, the camera 808). Rotating the spinning mirror 804 between directing light towards the camera 807 and directing light towards the camera 808 can introduce a slight delay between the two cameras as they take their images. The motion of rotation can also affect the period of exposure for the camera 807 or the camera 808. In some implementations, the mirror can snap between positions, which could allow for longer exposures without warping due to motion of the image. -
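The inter-channel delay introduced by the rotation can be estimated from the mirror's rotation rate. The following is an illustrative sketch only, not from the patent: the rotation speed and the angular separation between the two reflection angles are assumed values for illustration.

```python
def inter_channel_delay_ms(rotations_per_second, fraction_of_turn):
    """Time for the spinning mirror to rotate from the angle that directs
    light towards camera 807 to the angle that directs light towards
    camera 808, where fraction_of_turn is the portion of one full
    rotation separating the two angles (e.g., 0.5 for half a turn)."""
    period_s = 1.0 / rotations_per_second
    return fraction_of_turn * period_s * 1000.0

# Assumption: the mirror spins at 100 rotations per second and the two
# reflection angles are half a rotation apart.
print(inter_channel_delay_ms(100, 0.5))  # 5.0 ms between the two exposures
```

Even at high rotation speeds the two channels are captured milliseconds apart, which matters for a fast-moving fish and motivates the snap-between-positions variant mentioned above.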
FIG. 9 is a diagram which shows the system 900 for stereo camera image capture. The system 900 is comprised of an incident beam 901, an incident beam 902, a primary lens 904, a primary lens 905, a red filter 906, a blue filter 907, a camera 909, and another camera 910. In some implementations, the camera 909 and the camera 910 can be connected to form a stereo camera system. The system 900 can be used for image collection. In some implementations, images collected can be used in the process of detecting sea lice. - The incident light beams 901 and 902 can be the light from an exposure of a fish within a pen. The
primary lenses 904 and 905 can be made out of glass and can help direct the light towards the red filter 906 or the blue filter 907. In some implementations, additional lenses or mirrors can be used for focusing the incident beams. - The
red filter 906 and the blue filter 907 can be tuned to allow only light of specific wavelengths through. For example, the red filter 906 can be tuned to allow only light with a wavelength between 620 nm and 750 nm. The blue filter 907 can be tuned to allow only light with a wavelength between 440 nm and 485 nm. - The
camera 909 and the camera 910 can capture a light beam using a light detector. The light detector captures incoming light and creates an image. For example, the light detector can encode the captured light as a list of pixels with color and intensity. The pixel information can be stored as an image and can be used by other devices and systems. - Stage A of
FIG. 9 shows the incident beams 901 and 902 moving towards the lenses 904 and 905. - Stage B of
FIG. 9 shows the incident beams 901 and 902 focused by the lenses 904 and 905. The light from the lens 904 and the lens 905 can be sent towards the red filter 906 and the blue filter 907. The process of directing light towards the filters can involve multiple lenses and mirrors. - Stage C of
FIG. 9 shows the output of the lenses 904 and 905 passing through the red filter 906 and the blue filter 907, respectively. The light directed towards the red filter 906 can be any wavelength transmitted by the lens 904 or other optical elements. The light after the red filter 906 can be any wavelength within the range of the filter (e.g., 620 nm to 750 nm). The light directed towards the blue filter 907 can be any wavelength transmitted by the lens 905 or other optical elements. The light transmitted through the blue filter 907 can be any wavelength within the range of the filter (e.g., 440 nm to 485 nm). - Stage D of
FIG. 9 shows the output of the red filter 906 reaching the camera 909. The output of the blue filter 907 can be directed towards the camera 910. The cameras 909 and 910 can use light detectors to capture the incoming light from the filters (e.g., the red filter 906, the blue filter 907) to create an image. This image can be a stored group of pixels with colors and intensities. Images captured by the camera 909 and the camera 910 can be used for sea lice detection. - The
system 900, by employing stereo cameras each with a different color filter in front, allows the cameras to take pictures simultaneously with no reduction in incident light besides the losses in the various optical elements, including the filters. This represents a possible advantage over other image capture techniques. A possible disadvantage of the stereo camera setup is the introduction of parallax between the two images. For example, a pixel at coordinate (x, y) in an image captured by the camera 909 will not correspond to the same point in the scene as the pixel at coordinate (x, y) in an image captured by the camera 910. The parallax between the two images can complicate a multi-frame registration process. - A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. For example, various forms of the flows shown above may be used, with steps re-ordered, added, or removed.
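The magnitude of the parallax between the two cameras can be estimated with standard pinhole stereo geometry. The following is an illustrative sketch only, not from the patent: the focal length, camera baseline, and fish distance are assumed values, and the formula d = f·B/Z applies to a rectified stereo pair.

```python
def disparity_px(focal_length_px, baseline_m, depth_m):
    """d = f * B / Z for a rectified stereo pair: the same scene point
    appears shifted horizontally by d pixels between the two images."""
    return focal_length_px * baseline_m / depth_m

# Assumptions: 1400 px focal length, 6 cm baseline, fish 1 m from the rig.
d = disparity_px(1400, 0.06, 1.0)
print(d)  # 84.0 pixel offset that a registration step must account for
```

Because the shift shrinks with distance (a fish 2 m away would show half the offset), a registration process cannot use a single fixed translation between the red-filtered and blue-filtered images; it must account for depth-dependent parallax.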
- Embodiments of the invention and all of the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the invention can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter affecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
- A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
- The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
- Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a tablet computer, a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer readable media suitable for storing computer program instructions and data include all forms of non volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
- To provide for interaction with a user, embodiments of the invention can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
- Embodiments of the invention can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the invention, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
- The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of the invention. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
- Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
- In each instance where an HTML file is mentioned, other file types or formats may be substituted. For instance, an HTML file may be replaced by an XML, JSON, plain text, or other types of files. Moreover, where a table or hash table is mentioned, other data structures (such as spreadsheets, relational databases, or structured files) may be used.
- Particular embodiments of the invention have been described. Other embodiments are within the scope of the following claims. For example, the steps recited in the claims can be performed in a different order and still achieve desirable results.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/697,388 US20220201987A1 (en) | 2020-01-15 | 2022-03-17 | Lighting controller for sea lice detection |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/743,023 US11297806B2 (en) | 2020-01-15 | 2020-01-15 | Lighting controller for sea lice detection |
US17/697,388 US20220201987A1 (en) | 2020-01-15 | 2022-03-17 | Lighting controller for sea lice detection |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/743,023 Continuation US11297806B2 (en) | 2020-01-15 | 2020-01-15 | Lighting controller for sea lice detection |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220201987A1 true US20220201987A1 (en) | 2022-06-30 |
Family
ID=74181391
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/743,023 Active US11297806B2 (en) | 2020-01-15 | 2020-01-15 | Lighting controller for sea lice detection |
US17/697,388 Pending US20220201987A1 (en) | 2020-01-15 | 2022-03-17 | Lighting controller for sea lice detection |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/743,023 Active US11297806B2 (en) | 2020-01-15 | 2020-01-15 | Lighting controller for sea lice detection |
Country Status (6)
Country | Link |
---|---|
US (2) | US11297806B2 (en) |
EP (1) | EP3869949B1 (en) |
JP (1) | JP7256928B2 (en) |
CA (1) | CA3165180C (en) |
CL (1) | CL2022001758A1 (en) |
WO (1) | WO2021146040A1 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CL2016002664A1 (en) * | 2015-10-22 | 2018-01-05 | Intervet Int Bv | A method for automatic monitoring of sea lice in salmon aquaculture |
CA3084294A1 (en) | 2017-12-20 | 2019-06-27 | Intervet International B.V. | System for external fish parasite monitoring in aquaculture |
US11980170B2 (en) | 2017-12-20 | 2024-05-14 | Intervet Inc. | System for external fish parasite monitoring in aquaculture |
US12127535B2 (en) | 2017-12-20 | 2024-10-29 | Intervet Inc. | Method and system for external fish parasite monitoring in aquaculture |
US11659820B2 (en) * | 2020-03-20 | 2023-05-30 | X Development Llc | Sea lice mitigation based on historical observations |
CN111696114A (en) * | 2020-04-13 | 2020-09-22 | 浙江大学 | Method and device for identifying hunger degree of penaeus vannamei based on underwater imaging analysis |
US11388889B2 (en) * | 2020-08-14 | 2022-07-19 | Martineau & Associates | Systems and methods for aquatic organism imaging |
US11700839B2 (en) * | 2021-09-01 | 2023-07-18 | X. Development | Calibration target for ultrasonic removal of ectoparasites from fish |
CN115005143B (en) * | 2022-05-26 | 2023-08-04 | 国信中船(青岛)海洋科技有限公司 | Culture cabin with automatic fish capturing device and capturing method |
US20240015405A1 (en) * | 2022-07-07 | 2024-01-11 | Mineral Earth Sciences Llc | Dynamic lighting for plant imaging |
Citations (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3787867A (en) * | 1971-04-12 | 1974-01-22 | Automatic Power Division Pennw | Navigational aid system |
US4940323A (en) * | 1987-06-15 | 1990-07-10 | Downing John C | Light stimulator |
US8303129B1 (en) * | 2010-06-10 | 2012-11-06 | Scott Thielen | Wrist-mounted illumination device |
US20130050465A1 (en) * | 2010-02-05 | 2013-02-28 | Esben Beck | Method and device for destroying parasites on fish |
US20130273599A1 (en) * | 2010-12-23 | 2013-10-17 | Xpertsea Solutions Inc. | Photo-coupled data acquisition system and method |
US20150136037A1 (en) * | 2012-06-14 | 2015-05-21 | Koninklijke Philips N.V. | Illumination system for cultivation of aquatic animals |
US20170003160A1 (en) * | 2015-07-03 | 2017-01-05 | Smart Catch LLC | Marine animal data capture and aggregation device |
US20170150701A1 (en) * | 2015-11-29 | 2017-06-01 | F&T Water Solutions LLC | Recirculating Aquaculture System and Treatment Method for Aquatic Species |
US20170172114A1 (en) * | 2014-03-28 | 2017-06-22 | Cooke Aquaculture Inc. | Method and apparatus for removal of sea lice from live fish |
US20180303073A1 (en) * | 2015-10-22 | 2018-10-25 | Intervet Inc. | A Method for Automatic Sea Lice Monitoring in Salmon Aquaculture |
US20190228218A1 (en) * | 2018-01-25 | 2019-07-25 | X Development Llc | Fish biomass, shape, and size determination |
US20190340440A1 (en) * | 2018-05-03 | 2019-11-07 | X Development Llc | Fish measurement station keeping |
US20190363791A1 (en) * | 2016-11-23 | 2019-11-28 | Agency For Science, Technology And Research | Light emitting diode communication device, method of forming and operating the same |
US20200107524A1 (en) * | 2018-10-05 | 2020-04-09 | X Development Llc | Sensor positioning system |
US20200113158A1 (en) * | 2017-06-28 | 2020-04-16 | Observe Technologies Limited | Data collection system and method for feeding aquatic animals |
US20200155882A1 (en) * | 2018-11-21 | 2020-05-21 | Ali Tohidi | Fire forecasting |
US20200170226A1 (en) * | 2017-05-29 | 2020-06-04 | Ecotone As | Method and System for Underwater Hyperspectral Imaging of Fish |
US20200267947A1 (en) * | 2016-05-24 | 2020-08-27 | Itecsolutions Systems & Services As | Arrangement and method for measuring the biological mass of fish, and use of the arrangement |
WO2020180188A1 (en) * | 2019-03-06 | 2020-09-10 | Submerged As | Sea lice detection device and method for detection of sea lice |
US20200288678A1 (en) * | 2017-12-20 | 2020-09-17 | Intervet Inc. | System for external fish parasite monitoring in aquaculture |
US20200292857A1 (en) * | 2017-08-04 | 2020-09-17 | SMR Patents S.à.r.l. | Modulation control method for a filter device, electro- or magneto-optic modulated anti-flicker filter device, camera system with such a filter device, and a rearview device with such a camera system |
US20200288680A1 (en) * | 2017-12-20 | 2020-09-17 | Intervet Inc. | System for external fish parasite monitoring in aquaculture |
US20200288679A1 (en) * | 2017-12-20 | 2020-09-17 | Intervet Inc. | Method and system for external fish parasite monitoring in aquaculture |
US10842894B1 (en) * | 2019-07-30 | 2020-11-24 | Steribin, LLC | Systems and methods for treating a contaminated container |
US10856520B1 (en) * | 2020-01-10 | 2020-12-08 | Ecto, Inc. | Methods for generating consensus feeding appetite forecasts |
US10925262B2 (en) * | 2012-12-19 | 2021-02-23 | Signify Holding B.V. | Illumination system and method for enhancing growth of aquatic animals |
US10935783B1 (en) * | 2019-09-17 | 2021-03-02 | Aquabyte, Inc. | Optical system for capturing digital images in an aquaculture environment in situ |
US20210142052A1 (en) * | 2019-11-12 | 2021-05-13 | X Development Llc | Entity identification using machine learning |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3875820B2 (en) | 2000-01-14 | 2007-01-31 | ペンタックス株式会社 | Electronic endoscope capable of switching between normal light illumination and special wavelength light illumination, and rotary three-primary color filter / shutter used therein |
EP2962556B1 (en) | 2014-06-30 | 2018-10-24 | Ardeo Technology AS | A system and method for monitoring and control of ectoparasites of fish |
US11134659B2 (en) * | 2015-01-22 | 2021-10-05 | Signify Holding B.V. | Light unit for counting sea lice |
EP3259958A1 (en) * | 2015-02-17 | 2017-12-27 | Philips Lighting Holding B.V. | Lighting device |
GB2539495B (en) | 2015-06-19 | 2017-08-23 | Ace Aquatec Ltd | Improvements relating to time-of-flight cameras |
WO2019245722A1 (en) | 2018-06-19 | 2019-12-26 | Aquabyte, Inc. | Sea lice detection and classification in an aquaculture environment |
-
2020
- 2020-01-15 US US16/743,023 patent/US11297806B2/en active Active
- 2020-12-23 WO PCT/US2020/066800 patent/WO2021146040A1/en unknown
- 2020-12-23 EP EP20839535.0A patent/EP3869949B1/en active Active
- 2020-12-23 JP JP2022537167A patent/JP7256928B2/en active Active
- 2020-12-23 CA CA3165180A patent/CA3165180C/en active Active
-
2022
- 2022-03-17 US US17/697,388 patent/US20220201987A1/en active Pending
- 2022-06-28 CL CL2022001758A patent/CL2022001758A1/en unknown
Patent Citations (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3787867A (en) * | 1971-04-12 | 1974-01-22 | Automatic Power Division Pennw | Navigational aid system |
US4940323A (en) * | 1987-06-15 | 1990-07-10 | Downing John C | Light stimulator |
US20130050465A1 (en) * | 2010-02-05 | 2013-02-28 | Esben Beck | Method and device for destroying parasites on fish |
US9072281B2 (en) * | 2010-02-05 | 2015-07-07 | Stingray Marine Solutions As | Method and device for destroying parasites on fish |
US8303129B1 (en) * | 2010-06-10 | 2012-11-06 | Scott Thielen | Wrist-mounted illumination device |
US20130273599A1 (en) * | 2010-12-23 | 2013-10-17 | Xpertsea Solutions Inc. | Photo-coupled data acquisition system and method |
US9103812B2 (en) * | 2010-12-23 | 2015-08-11 | Xpertsea Solutions Inc. | Photo-coupled data acquisition system and method |
US20150308948A1 (en) * | 2010-12-23 | 2015-10-29 | Xpertsea Solutions Inc. | Photo-coupled data acquisition system and method |
US9410881B2 (en) * | 2010-12-23 | 2016-08-09 | Xpertsea Solutions Inc | Photo-coupled data acquisition system and method |
US20150136037A1 (en) * | 2012-06-14 | 2015-05-21 | Koninklijke Philips N.V. | Illumination system for cultivation of aquatic animals |
US10925262B2 (en) * | 2012-12-19 | 2021-02-23 | Signify Holding B.V. | Illumination system and method for enhancing growth of aquatic animals |
US20170172114A1 (en) * | 2014-03-28 | 2017-06-22 | Cooke Aquaculture Inc. | Method and apparatus for removal of sea lice from live fish |
US10843207B2 (en) * | 2014-03-28 | 2020-11-24 | Cooke Aquaculture Inc. | Method and apparatus for removal of sea lice from live fish |
US20170003160A1 (en) * | 2015-07-03 | 2017-01-05 | Smart Catch LLC | Marine animal data capture and aggregation device |
US20180303073A1 (en) * | 2015-10-22 | 2018-10-25 | Intervet Inc. | A Method for Automatic Sea Lice Monitoring in Salmon Aquaculture |
US20210068375A1 (en) * | 2015-10-22 | 2021-03-11 | Intervet Inc. | Method for Automatic Sea Lice Monitoring in Salmon Aquaculture |
US10863727B2 (en) * | 2015-10-22 | 2020-12-15 | Intervet Inc. | Method for automatic sea lice monitoring in salmon aquaculture |
US20170150701A1 (en) * | 2015-11-29 | 2017-06-01 | F&T Water Solutions LLC | Recirculating Aquaculture System and Treatment Method for Aquatic Species |
US20200267947A1 (en) * | 2016-05-24 | 2020-08-27 | Itecsolutions Systems & Services As | Arrangement and method for measuring the biological mass of fish, and use of the arrangement |
US20190363791A1 (en) * | 2016-11-23 | 2019-11-28 | Agency For Science, Technology And Research | Light emitting diode communication device, method of forming and operating the same |
US20200170226A1 (en) * | 2017-05-29 | 2020-06-04 | Ecotone As | Method and System for Underwater Hyperspectral Imaging of Fish |
US20200113158A1 (en) * | 2017-06-28 | 2020-04-16 | Observe Technologies Limited | Data collection system and method for feeding aquatic animals |
US20200170227A1 (en) * | 2017-06-28 | 2020-06-04 | Observe Technologies Limited | Decision making system and method of feeding aquatic animals |
US20200292857A1 (en) * | 2017-08-04 | 2020-09-17 | SMR Patents S.à.r.l. | Modulation control method for a filter device, electro- or magneto-optic modulated anti-flicker filter device, camera system with such a filter device, and a rearview device with such a camera system |
US20200288678A1 (en) * | 2017-12-20 | 2020-09-17 | Intervet Inc. | System for external fish parasite monitoring in aquaculture |
US20200288680A1 (en) * | 2017-12-20 | 2020-09-17 | Intervet Inc. | System for external fish parasite monitoring in aquaculture |
US20200288679A1 (en) * | 2017-12-20 | 2020-09-17 | Intervet Inc. | Method and system for external fish parasite monitoring in aquaculture |
US20190228218A1 (en) * | 2018-01-25 | 2019-07-25 | X Development Llc | Fish biomass, shape, and size determination |
US20190340440A1 (en) * | 2018-05-03 | 2019-11-07 | X Development Llc | Fish measurement station keeping |
US20200107524A1 (en) * | 2018-10-05 | 2020-04-09 | X Development Llc | Sensor positioning system |
US20200155882A1 (en) * | 2018-11-21 | 2020-05-21 | Ali Tohidi | Fire forecasting |
WO2020180188A1 (en) * | 2019-03-06 | 2020-09-10 | Submerged As | Sea lice detection device and method for detection of sea lice |
US10842894B1 (en) * | 2019-07-30 | 2020-11-24 | Steribin, LLC | Systems and methods for treating a contaminated container |
US10935783B1 (en) * | 2019-09-17 | 2021-03-02 | Aquabyte, Inc. | Optical system for capturing digital images in an aquaculture environment in situ |
US20210080715A1 (en) * | 2019-09-17 | 2021-03-18 | Aquabyte, Inc. | Optical system for capturing digital images in an aquaculture environment in situ |
US20210142052A1 (en) * | 2019-11-12 | 2021-05-13 | X Development Llc | Entity identification using machine learning |
US10856520B1 (en) * | 2020-01-10 | 2020-12-08 | Ecto, Inc. | Methods for generating consensus feeding appetite forecasts |
Also Published As
Publication number | Publication date |
---|---|
EP3869949A1 (en) | 2021-09-01 |
JP2023503176A (en) | 2023-01-26 |
CA3165180A1 (en) | 2021-07-22 |
CL2022001758A1 (en) | 2023-06-02 |
US20210212298A1 (en) | 2021-07-15 |
EP3869949B1 (en) | 2024-07-17 |
CA3165180C (en) | 2023-08-08 |
WO2021146040A1 (en) | 2021-07-22 |
US11297806B2 (en) | 2022-04-12 |
JP7256928B2 (en) | 2023-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220201987A1 (en) | Lighting controller for sea lice detection | |
KR102669768B1 (en) | Event camera system for pupil detection and eye tracking | |
US11381753B2 (en) | Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging | |
DK181217B1 (en) | Method and system for external fish parasite monitoring in aquaculture | |
US10708573B2 (en) | Apparatus and methods for three-dimensional sensing | |
US20230316516A1 (en) | Multi-chamber lighting controller for aquaculture | |
US7542628B2 (en) | Method and apparatus for providing strobed image capture | |
US10007337B2 (en) | Eye gaze imaging | |
CN106062665A (en) | User interface based on optical sensing and tracking of user's eye movement and position | |
JP6327753B2 (en) | Pupil detection light source device, pupil detection device, and pupil detection method | |
CN110312079A (en) | Image collecting device and its application system | |
US11179035B2 (en) | Real-time removal of IR LED reflections from an image | |
US11582397B2 (en) | Enhanced controller synchronization verification |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: X DEVELOPMENT LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MESSANA, MATTHEW;THORNTON, CHRISTOPHER;YOUNG, GRACE CALVERT;SIGNING DATES FROM 20200410 TO 20200420;REEL/FRAME:059296/0652 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
AS | Assignment |
Owner name: TIDALX AI INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:X DEVELOPMENT LLC;REEL/FRAME:068477/0306 Effective date: 20240712 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |