Automated Fish Measuring System Final Report

Prepared For: The North Pacific Research Board

In Reference To: Automated Fish Measuring System
Award Number: 1525
Performance Period: 07/01/2015 – 09/30/2017

Prepared By: Peter Brodsky, Principal Investigator
Principal Engineer, Applied Physics Laboratory, University of Washington (APL-UW)
Seattle, WA 98105
206-543-4216
[email protected]

Submission Date: November 2017


Table of Contents

Abstract
Keywords
Recommended Citation
Chronology
Introduction
Objectives
Industry Partnership
Development Evolution
Enclosure Design
Calibration
Final Enclosure
Web Data Access
Image Processing Algorithm
Deployment
Performance and Data Analysis
Conclusions
Management and Policy Implications
Data Products
Outreach and Education
Acknowledgements

Abstract

The problem of mortality among Pacific halibut taken as bycatch in trawl fisheries is addressed by means of an automated length-measurement method that significantly reduces fish time out of water. A chute enclosure with a down-looking video camera and custom measurement algorithms was designed, built, and tested with replica fish at our facility in Seattle. The system was deployed aboard an Amendment 80 factory trawl vessel in the Bering Sea for two months during winter 2017. Results of that deployment have been analyzed and demonstrate great promise for this technology.

Keywords

Pacific halibut, Exempted Fishing Permit, deck sorting, image processing, video camera, electronic monitoring

Recommended Citation

Brodsky, P. 2017. Automated Fish Measurement System. North Pacific Research Board Final Report 1525.


Chronology

This was a new project and was the first NPRB-funded project for PI Peter Brodsky. The work was motivated by discussions between PI Brodsky and Mark Fina at US Seafoods in the 2014/2015 timeframe. These included informal meetings with NOAA/NMFS monitoring personnel regarding electronic monitoring systems. A number of papers describing work already performed in this general area were reviewed. These included Testing the Use of Electronic Monitoring to Quantify At-sea Halibut in the Central Gulf of Alaska Rockfish, by J. Bonney and K. McGauley (2008), and Final Report on EFP 12-01: Halibut deck sorting experiment to reduce halibut mortality on Amendment 80 Catcher Processors, by J. Gauvin (2012), among others. In addition, in 2014 US Seafoods allowed us to mount an inexpensive GoPro video camera above the measurement table on one of their deck-sorting vessels to capture imagery data. Results from that informal experiment were encouraging enough to motivate our proposal to NPRB. Funded work began in July 2015 and continued through September 2017. Semi-annual progress reports were submitted throughout this period, except for the final six months, which constituted a no-cost extension. The extension was necessary due to delays in the hosting vessel's schedule and the volume of video and imagery data which had to be manually processed.

Introduction

APL-UW, in conjunction with its industry partner, United States Seafoods, LLC (US Seafoods), developed a camera system for use aboard commercial fishing vessels to speed the process of measuring halibut prior to returning them to the sea. Specifically, the system provides the length measurements that are used to generate weight estimates required for monitoring deck-sorting of halibut bycatch. In addition to accurately measuring fish for scientific analysis, a high-priority goal was to minimize impact on normal discard operations, thus increasing industry buy-in to use of such a system.

A number of former and ongoing electronic monitoring projects have purposes similar to this research. Many if not most of these projects use still images as the raw data source and rely on expensive machine vision cameras. We are attempting to demonstrate that the use of multi-frame video offers an attractive, cost-effective alternative. Video is widely and inexpensively available through ruggedized off-the-shelf surveillance cameras. It is also inherently robust by virtue of the number of images available over the course of a single fish traversing the field of view.

The original concept was a standalone video camera mounted outside and above the discard chute. The camera would be connected by Ethernet cable to a computer housed inside the vessel and executing custom algorithms developed by researchers in the UW Electrical Engineering Department's Information Processing Laboratory. The algorithms track fish as they slide down the chute, generating accurate measurements based on aggregation of multiple video frames for each fish. The resulting measurements would be displayed in realtime on the processing computer, as well as on a client application running on any computer connected to the system via the ship's LAN (e.g., in the wheelhouse). A throughput goal of at least one fish per second was targeted, as was the ability to use a commercial off-the-shelf (COTS) video camera rather than a highly specialized (and expensive) computer vision device.

As development and testing proceeded, it became clear that the challenges of outdoor shadowing necessitated the use of an enclosure to provide a controlled lighting environment. Practical constraints imposed by the vessel provided by US Seafoods also dictated that all processing be done within the enclosure rather than on a computer located belowdecks inside the vessel. The resulting enclosure provided a non-intrusive environment by which fish could be electronically monitored with minimal impact to normal vessel operations, and it simplified installation and connections. All data were logged and, while not utilized aboard the demonstration vessel, a web-based service was provided by which the rest of the vessel could have seen the results in realtime had an Ethernet connection been provided.

Objectives

The specific objectives of this project were simple: demonstrate that an electronic, video-based monitoring system could measure deck-sorted bycaught halibut as they were returned to the sea, in order to reduce mortality. Specifically, show that such a system could measure fish as accurately as human observers, and do so faster and on more fish.

Industry Partnership

This work was performed in conjunction with, and with the support of, US Seafoods, a Seattle-based company whose vessels participate in Amendment 80 fisheries, the management program targeted for the deck-sorting Exempted Fishing Permit (EFP). During the course of the research described here, US Seafoods provided access to their dockside facilities, loaned APL-UW fish slide equipment from their inventory, and ultimately hosted the measurement system itself aboard their factory trawl vessel, Seafreeze America.

Development Evolution

Given the difficulty of obtaining, keeping, and working with actual fish, initial prototyping relied on plywood or foam replicas cut to accurate halibut shape and scale. In addition, realistic color was achieved by painting from photos of a variety of real fish. Painting was done on both sides: dark brown/mottled on one and white to light gray on the other. Replica fish were tested with both dark and light side up to emulate a realistic scenario.

Camera selection was based on several requirements. First, the camera must be IP (Internet Protocol) compatible. This enables the video feed to be posted directly on any LAN and enables any computer on the network to receive the data over Ethernet, either hard-wired or wirelessly (see the sketch below). In addition, the camera must be robust enough to survive outdoor weather conditions. Fortunately, the security and surveillance communities have driven development of rugged, affordable, high-quality cameras, resulting in ready commercial availability.

The first iteration of the system design used a commercial treadmill acting as the "runway" on which fish were run underneath a free-standing camera mounted on a simple wood frame. This design suffered from a number of flaws relating to realism which ultimately precluded its implementation. For one, the specific camera, a Hikvision NC303-XB, was only rated to IP-66 standards (IP in this context is Ingress Protection: for shipboard use a camera should be rated no less than IP-67). More fundamentally, use of a moving platform to carry the fish is unrealistic, as it does not enable them to rotate or to translate laterally as real fish will on a chute slide. Nevertheless, this system was very useful for initial algorithm development, especially the Kalman filter-based tracking component.
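For readers unfamiliar with IP cameras, the sketch below shows how such a camera's network video feed can be read over a LAN with OpenCV. It is illustrative only: the report does not include code, the Python/OpenCV interface is used here for brevity, and the RTSP URL, credentials, and stream path are hypothetical placeholders that depend on the camera model.

```python
# Minimal sketch (not from the report): reading an IP camera's video feed
# over the LAN with OpenCV. The RTSP URL, credentials, and stream path are
# hypothetical placeholders; actual values depend on the camera model.
import cv2

RTSP_URL = "rtsp://user:[email protected]:554/stream1"  # hypothetical

def read_frames(url: str) -> None:
    cap = cv2.VideoCapture(url)
    if not cap.isOpened():
        raise RuntimeError("Could not open video stream: " + url)
    try:
        while True:
            ok, frame = cap.read()           # one BGR frame per iteration
            if not ok:
                break                        # stream dropped or ended
            # ... hand the frame to segmentation/tracking here ...
            cv2.imshow("chute view", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    read_frames(RTSP_URL)
```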


Figure 1. Left: NC303-XB Hikvision camera. Right: camera mounted above treadmill with foam halibut on the runway. Note the camera calibration "chessboards" on the left.

To gain realism, the next iteration of the runway design was a simple inclined slide, consisting of a stainless steel U-shaped frame, also supplied by US Seafoods, with blue HDPE (High Density Polyethylene) covering the runway. HDPE is tough, waterproof, and available in a variety of colors. Again, the camera was fitted on a simple overhead mount looking down, although we replaced the simple Hikvision unit with a much more capable model from Vivotek (IP8381-E). The latter includes superior optics and more image tuning options, and is fully IP-67 rated.

Figure 2. Left: inclined slide. Right: Vivotek camera.

Testing this system with actual fish at US Seafoods' dockside facility demonstrated the difficulty of this approach. Namely, shadows created by external features and the mount itself proved extremely vexing to the segmentation and tracking software. The challenges found in the stationary setup suggested that the situation would be much worse on an actual vessel: pitching, rolling, and yawing at sea would cause shadows to move in unpredictable ways, creating an even more challenging environment. Tuning the algorithms to adapt to this would have required time and resources beyond the scope of the project, such as significant time on a chartered vessel or construction of a 3D rotating structure. Development of an enclosed design provided a simple, cost-effective means of addressing these challenges.


Enclosure Design

Development of the enclosed chute system began with a plywood prototype; this enabled us to optimize size and shape, develop lighting techniques, and select an appropriate camera.

Figure 3. Prototype enclosure with trap door view

It immediately became clear that the "bullet"-style Hikvision and Vivotek cameras would not work with this design, as their fields of view (FOV) were too narrow given the height constraint imposed by the enclosure. A goal of the enclosure was that it fit within an existing discard chute aboard a vessel (specifications for which were provided by US Seafoods) and that it present as low a vertical profile as possible to avoid interference with other deck equipment. This necessitated switching to a dome-style camera. The prototype ultimately enabled us to select, and design interfaces between, all the necessary components:

• Camera: KT&C KNC-p2CR28V12IR 2.1 Megapixel Outdoor IR Rugged IP Dome
• Computer: SmallPC "iBrick"; 2.10 GHz processor, 4 GB RAM, 64 GB SSD
• Lighting: 12 V Cool White LED strips
• Monitor: NavPixel NPD0835
• Assorted electronics, including Ethernet switch and AC/DC power supply

With the exception of the computer and monitor, electronics were installed in a durable fiberglass housing, with waterproof glands providing cable pass-throughs between the housing and the enclosure body itself. The monitor selection was particularly important: this was the only component that would reside completely outside in the weather. An early design decision was to enable observers at the discard chute to monitor system performance in realtime during operation. The NavPixel monitor is IP-68 rated and designed for harsh marine use, and it proved up to the task, never failing despite freezing weather and saltwater spray. Note that the computer, while not outside the enclosure, was fully exposed to saltwater splash inside. For this reason the military-grade iBrick was selected as the processor, and it too performed without failure during the entire at-sea deployment.


Figure 4. Left: Dome video camera, Center: iBrick computer, Right: NavPixel Monitor

Calibration

Another factor in the decision to move to an enclosed chute design relates to camera calibration. This process generates coefficients that enable the processing algorithms to convert the segmented images in pixel space into physical units (e.g., centimeters). A free-form outdoor design would have required the calibration procedure to be repeated onboard the vessel any time the camera was moved or rotated, either intentionally or otherwise. The procedure itself, while not technically complex, is tedious and time-consuming, requiring numerous placements of a calibration "chessboard" over the runway in various orientations. Moving to a self-contained system enabled us to perform the calibration once in the laboratory before delivery. One interesting impact of moving to the dome camera is that its wide FOV is achieved by use of a "fish-eye" lens: this creates significant distortion near the FOV limits and makes accurate calibration critical.
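The procedure described above follows the standard OpenCV chessboard calibration workflow. The sketch below is a minimal illustration of that workflow, not the project's actual calibration code: the board geometry, square size, and file paths are assumptions, and a strongly fish-eye dome lens may require OpenCV's dedicated fisheye model rather than the standard model shown here.

```python
# Sketch of chessboard calibration with OpenCV's standard camera model.
# Board geometry, square size, and file paths are illustrative assumptions.
import glob
import cv2
import numpy as np

BOARD_COLS, BOARD_ROWS = 9, 6        # interior chessboard corners (assumed)
SQUARE_SIZE_CM = 2.5                 # physical square size (assumed)

# 3D coordinates of the board corners in board-local units (Z = 0 plane)
objp = np.zeros((BOARD_ROWS * BOARD_COLS, 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD_COLS, 0:BOARD_ROWS].T.reshape(-1, 2) * SQUARE_SIZE_CM

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_images/*.jpg"):           # hypothetical folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, (BOARD_COLS, BOARD_ROWS))
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsic matrix and distortion coefficients, later used to undistort frames
# and convert pixel lengths to centimeters on the (known) chute plane.
rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", rms)
np.savez("camera_calibration.npz", K=K, dist=dist)     # hypothetical output file
```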

Final Enclosure

Once the required elements had been finalized via the prototype, an operational model (aluminum) was designed by APL-UW engineers and fabricated by a Seattle-area shop. The result is a strong, lightweight structure with flared entry/exit to funnel fish in/out and removable doors to enable internal access to the electronics, camera, and computer. Two images below show the enclosure after delivery. In the first, the access panels are off, allowing camera and computer installation. The second shows the enclosure with the monitor installed during test operations at APL-UW. In both cases, note the electronics housing mounted on the left side. As delivered, the enclosure measured approximately 60" long by 24" wide by 32" high and weighed approximately 55 lbs. with all components included.


Figure 5. Left: bare enclosure. Right: in-lab enclosure testing. Note measured fish (ATF) displayed on the monitor.

The enclosure was put through extensive testing to fine-tune lighting, camera settings, and algorithm parameters. A significant hindrance was the difficulty in obtaining and working with actual fish. Ultimately, US Seafoods did procure and deliver to us a large number of frozen arrowtooth flounder (ATF) to substitute for halibut. Testing with these proved logistically challenging, and ATF do not reach the sizes that halibut do. However, results were encouraging enough to plan delivery of the system for deployment on a vessel engaged in trawl fishing under the deck-sorting EFP.

Web Data Access

As mentioned above, a web-based tool was developed simultaneously with the image processing and data storage application. This tool enables any computer with a web browser and Ethernet connectivity to the processing computer to see and display the following:

1. The camera chute view in realtime.
2. Fish measurements as they are made, superimposed on segmented images.
3. Previous measurements and images via a flip-back history mechanism.

An example of the web tool, from data taken during lab testing, is shown below. Note that the measurement result pane is always slightly behind the realtime view in time, as the former displays only the last valid measurement. Our goal was to make this view of the system's performance available to personnel physically removed from the system location, for example at a computer in the wheelhouse or down below. The only hardware required would be an Ethernet cable connected to a port available in our electronics housing. Ultimately, time and resource constraints did not permit this connection, and thus this capability was not exercised during the Seafreeze America deployment.


Figure 6: Web-based tool realtime view
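The report does not detail the web tool's implementation, so the sketch below only illustrates one common pattern by which such a realtime view and last-measurement display could be served to browsers on a ship's LAN. The use of Flask, the endpoint names, the video source, and the port are assumptions for illustration; they are not the project's actual server code.

```python
# Illustrative sketch only -- the report does not specify the web tool's stack.
# Shows one common pattern (Flask assumed): an MJPEG live view plus a JSON
# endpoint for the most recent measurement.
import json
import cv2
from flask import Flask, Response

app = Flask(__name__)
# In a real system the processing pipeline would update this record.
latest_measurement = {"length_cm": None, "weight_lbs": None, "quality": None}

def mjpeg_stream():
    cap = cv2.VideoCapture(0)   # placeholder source; the real system used the chute camera
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n" + jpg.tobytes() + b"\r\n")

@app.route("/live")
def live():
    # Realtime chute view, viewable in any browser on the ship's LAN
    return Response(mjpeg_stream(), mimetype="multipart/x-mixed-replace; boundary=frame")

@app.route("/latest")
def latest():
    # Last valid measurement (length, weight estimate, quality metric)
    return Response(json.dumps(latest_measurement), mimetype="application/json")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```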

Image Processing Algorithm

The measurement algorithm is actually composed of three individual components: a) segmentation, b) tracking, and c) measurement.

Segmentation is the process by which the image of the fish is distinguished from the background. In our software, a Gaussian Mixture Model (GMM) is used to model the background and extract the foreground (the fish). The resulting segmented portion of the image is then refined through histogram backprojection.
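A minimal sketch of this segmentation step is shown below. It uses OpenCV's MOG2 background subtractor as a stand-in for the GMM background model and calcBackProject for the refinement stage; it is written in Python for brevity (the project's own OpenCV implementation is not reproduced in this report), and all threshold and histogram parameters are illustrative assumptions.

```python
# Sketch of the segmentation step: a GMM background model (OpenCV's MOG2)
# extracts foreground blobs, which are then refined with histogram
# backprojection. Parameter values are illustrative assumptions only.
import cv2
import numpy as np

bg_model = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=True)

def segment(frame: np.ndarray) -> np.ndarray:
    """Return a binary mask of candidate fish pixels for one video frame."""
    fg = bg_model.apply(frame)
    fg = cv2.threshold(fg, 200, 255, cv2.THRESH_BINARY)[1]   # drop shadow pixels (value 127)
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Refine with histogram backprojection: build a hue-saturation histogram
    # of the detected foreground region and re-project it onto the frame.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], fg, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    backproj = cv2.calcBackProject([hsv], [0, 1], hist, [0, 180, 0, 256], 1)
    refined = cv2.threshold(backproj, 50, 255, cv2.THRESH_BINARY)[1]
    return cv2.bitwise_and(fg, refined)
```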

In the tracking step, a Kalman filter-based tracker that estimates the position and velocity of each fish is used to associate the segmentations of the same fish in contiguous frames. These segmentations are then rotated to the horizontal based on their orientations, which are acquired by principal component analysis (PCA), and aggregated together to get the final segmentation of the fish.
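The sketch below illustrates the two pieces of this step: a constant-velocity Kalman filter on the fish centroid and a PCA-based orientation estimate for a binary segmentation mask. It is a simplified stand-in for the actual tracker; the state model and noise covariances are assumptions, not values from the project software.

```python
# Sketch of the tracking step: a constant-velocity Kalman filter on the fish
# centroid (state = [x, y, vx, vy]) and PCA to recover the blob's orientation
# so per-frame segmentations can be rotated to horizontal before aggregation.
import cv2
import numpy as np

def make_tracker() -> cv2.KalmanFilter:
    kf = cv2.KalmanFilter(4, 2)                       # 4 state vars, 2 measured (x, y)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2      # assumed
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1  # assumed
    return kf

def update(kf: cv2.KalmanFilter, centroid_xy) -> np.ndarray:
    """Predict, then correct with the measured centroid; returns the new state."""
    kf.predict()
    return kf.correct(np.array(centroid_xy, np.float32).reshape(2, 1))

def orientation_deg(mask: np.ndarray) -> float:
    """Principal-axis angle of a binary segmentation mask, via PCA."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([xs, ys]).astype(np.float64)
    mean, eigvecs = cv2.PCACompute(pts, np.empty((0)))
    major = eigvecs[0]                                # first principal component
    return float(np.degrees(np.arctan2(major[1], major[0])))
```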

In the measurement step, a scoring function is used to locate the head and tail endpoints of the final segmentation. The length of the fish is then calculated by connecting the head/tail endpoints through a morphological midline along the center of the fish body. Note that in addition to the length measurement, the software generates a quality metric, from 0 to 1, representing the algorithm's own estimate of its ability to track a fish through multiple video frames and segment the fish from the background each time. This metric was intended to give us an objective quantity for statistically analyzing system performance post hoc. In fact, the quality metric proved to be overly optimistic in many cases, with high values assigned to measurements that were clearly not worthy of them. This became evident as we manually evaluated measurements by eye. The result was the need for a much more laborious quality control mechanism, described in detail below (Performance and Data Analysis). Finally, the system also converts the calculated length into an equivalent weight estimate, using the formula from Clark (1991):

W = 9.205 × 10^-6 × L^3.24

where L is the measured length in centimeters and W is the resulting weight in pounds. Note that a "measurement" is the length of the fish from the tip of the nose to the center of the tail fork. As mentioned above, we imposed a requirement on the system that it be capable of visually displaying each measurement on the local monitor in realtime. An example of a displayed measurement, with associated weight estimate and quality metric, is shown below.

Figure 7: Sample displayed measurement
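For reference, the length-to-weight conversion quoted above is a one-line calculation; the snippet below simply transcribes the Clark (1991) formula as given in the text (L in centimeters, W in pounds). The example value is illustrative only.

```python
# Direct transcription of the length-to-weight conversion quoted above
# (Clark 1991): L in centimeters, W in pounds.
def estimated_weight_lbs(length_cm: float) -> float:
    return 9.205e-6 * length_cm ** 3.24

# Example: a 90 cm fish
print(round(estimated_weight_lbs(90.0), 1))   # ~19.8 lbs
```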

Implementation of these algorithms makes extensive use of the OpenCV suite of open-source computer vision software. OpenCV is available for most computer operating systems, with interfaces to a number of programming languages, including C++.

Deployment

In mid-January 2017, the system was installed aboard the US Seafoods factory trawler Seafreeze America while at the Seattle facility. The unit was placed directly in the exit slide of the discard chute, which puts it immediately downstream of the measurement table used by the observers to perform manual measurements. Due to logistical complications around this time (the vessel getting ready for sea), electrical power was not available at that location on the vessel at the time of installation; therefore no shipboard testing was possible before departure. We left the system with instructions for hardwiring AC power, which was performed by the ship's crew en route to Alaska. Basic system operating instructions, with some simple troubleshooting steps, were printed, laminated, and supplied as well. In fact, operation was basic enough that little instruction was required; the crew simply needed to flip a power switch on to begin and flip it off when a haul's worth of discard was complete.


Figure 8. Measurement enclosure installed on discard chute. Note the view straight to the sea through the enclosure in the image on the right.

Performance and Data Analysis

Upon return from approximately 45 days of use aboard the Seafreeze America, the entire system with collected data was returned to APL-UW for analysis. Automated halibut measurements, in both CSV and Postgres DB form, were stored on the main computer drive. Raw video footage was written to a 1 TB external drive. For correlation of our automated measurements with those made by onboard observers, US Seafoods provided us with transcribed measurement logs produced by the latter. These logs, in the form of an Excel spreadsheet, were then compared with measurements made by the automated system. Using the observer logs as truth gives us an independent measure of the system's accuracy and viability. Note that EFP rules require only every Nth fish be measured before discard, where N was 5 during the Seafreeze America deployment. However, some questions remain as to whether this decimation factor was rigorously adhered to in actual operations.

A very brief statistical look at the data:

• Total time of captured video: approximately 15 hours, spread over 380 individual video files
• Total number of automated fish measurements: 5,329
• Total number of vessel-recorded fish measurements: 706

The large discrepancy between the last two numbers cannot be explained by the factor-of-5 decimation for hand measurement (the ratio here is approximately 7.5). Rather, it is due primarily to challenging conditions within the system chute resulting in a number of spurious "measurements" which are not in fact real. This is described in greater detail below.

It became immediately clear that correlating the vessel log measurements with those from the automated system would be a significant challenge. Most critical was the lack of absolute timestamps on the vessel log measurements. The vessel logs only listed the particular fish's time out of water, and then only to minute (60 second) resolution. We had hoped that the vessel's own deck camera might provide an independent source of correlation, but the recordings were unfortunately lost. As a result, a very laborious post-deployment process was required to try to match vessel logs with automatically collected data.

Our first task, independent of the hand-vs-automated correlation, was sanity checking the automated measurements themselves. As noted above, the calculated quality metric was not ultimately useful as an independent arbiter of measurement accuracy. In fact, the metric was heavily skewed toward positive results. A histogram of the metric is shown below.

Figure 9: Quality metric histogram

Many measurements that were assigned high quality metrics were in fact blatantly wrong, as determined by a cursory look. Examples are shown in the figures below, where the captured image is a) of two different fish measured together as a single unit, or b) of a partial fish (with tail cut off). In both cases, the quality metric was assigned a value near 0.8, which would suggest measurement accuracy far better than what is obviously observed.

Figure 10: Multiple fish image capture (invalid)

Figure 11: Cut-off fish image capture (invalid)

On the other hand, many of the resulting measurements (over 1,700) were deemed reasonable. One example is shown below. Determination of "reasonable" was a somewhat subjective process, performed by painstakingly reviewing all 5,000+ images individually. The result was that of the 5,329 measurements taken overall, 1,706 were judged valid and the other 3,623 invalid. This is a ratio of approximately 32%/68% valid/invalid, i.e., roughly one third of the measurements were considered usable for comparison.

Figure 12: Valid measurement image

We did notice a very evident trend toward decreasing accuracy and validity ratios as time went by. This was almost certainly due to deteriorating conditions inside the enclosure, namely dirt and salt accumulation on the camera's glass lens cover and curling of the HDPE runway edges. The latter could in theory also distort the fish's distance from the camera by pushing fish upward, thereby skewing measurements toward the high end; however, the curling was only apparent at the very end of the chute exit, by which point most tracked measurements had already been made (before the fish reached that part of the chute). The operating crew evidently did wipe the camera down occasionally, as was apparent from the raw video footage.

Causes of measurement failure were varied, and included smearing on the camera lens cover (though we believe this was a minor impact). More significant were the rate at which fish were sent through the chute (greater than one per second) and the tendency for fish to occasionally overlap and "pile up" near the exit. The former was actually not as significant as we had feared; experimentation in our lab suggests the algorithms are fast enough to handle fish at a high rate. However, the latter (overlapping fish) is a very difficult geometry for the segmentation logic to handle. Finally, glare on the water used to lubricate the fish down the chute proved particularly difficult to deal with (see below).

Ultimately, without better observer timetagging or independent deck camera data, fish-by-fish correlation between the measurements recorded by the observers and those from the automated system proved impossible. As an attempt at comparison, we generated a simple statistical synopsis, shown in Table 1 below.

                        Number of Measurements   Length Mean (cm)   Length Standard Deviation (cm)
Vessel Observer Logs    706                      60.8               19.2
Automated System        1,706                    74.9               29.9

Table 1: Comparison of measurement statistics
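The synopsis in Table 1 amounts to counts, means, and standard deviations of the two sets of lengths. The sketch below shows how such a synopsis could be computed from exported length lists; it is illustrative only and is not the report's analysis code, and the file names and column label are hypothetical placeholders.

```python
# Illustrative sketch (not the report's analysis code) of computing the
# Table 1 synopsis from the two measurement sets. File names and the column
# label are hypothetical placeholders.
import csv
import statistics

def load_lengths_cm(path, column="length_cm"):
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f) if row[column]]

for label, path in [("Vessel observer logs", "observer_lengths.csv"),
                    ("Automated system", "automated_valid_lengths.csv")]:
    lengths = load_lengths_cm(path)
    print(f"{label}: n={len(lengths)}, "
          f"mean={statistics.mean(lengths):.1f} cm, "
          f"stdev={statistics.stdev(lengths):.1f} cm")
```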


Length distributions also suggest reasonable agreement, per the histograms below. Both show peaks in the 45-55 cm range, and both show a high-end tail toward a maximum of ~120 cm.

Figure 13: Histogram of length measurements from observer logs


Figure 14: Length measurement histogram from automated system

Conclusions

A video-based fish discard measurement system is clearly viable. The system as delivered, with no prior testing on live halibut, was able to cleanly measure over 1,700 fish in an actual operational environment. The hardware and computer networking design proved more than robust enough to handle the extreme conditions of winter in the Bering Sea; no hardware or software failures were reported.

On the other hand, this remains a hard problem. To the extent that fish discard flow rates and live fish movement cannot be rigorously controlled in an operational environment, accurate automated measurement of non-stationary fish is a challenge, albeit one in which progress has been made. Lessons learned from this experiment lead to the following recommendations for a follow-on effort:

1. Better design for the runway. While the HDPE proved robust and maintained color, it tended to lift and curl near the ends of the chute. Options might include permanent glue-down or use of an equivalently colored paint which would adhere to aluminum.
2. Improved algorithm tolerance to background variation. We noticed a tendency for the segmentation algorithm to be fooled by water rivulets running down the chute alongside the fish. This water is required for the fish to slide cleanly, so the system must be robust to its presence.
3. Access to live or live-like fish before deployment. A major factor contributing to degraded performance was motion of fish as they move down the chute. While lateral movement is expected, live fish also tend to arch up and even flip over. The latter may be rare, but the former is common and at best leads to erroneous measurements, as the midsection of the fish is at a different distance from the camera than the head and tail. This motion is important and is unfortunately one of the most difficult behaviors to emulate in a controlled (laboratory) environment. It might be possible to fabricate spring-actuated replica fish which could articulate, at least crudely, in the arching manner described above.
4. Better timekeeping of hand-made measurements by onboard observers. This is critical to our ability to evaluate the automated system's performance. Technology exists which could be brought to bear here, for example an audio capture device into which an observer would simply state the measurement value. Such devices can convert spoken words into equivalent numbers and would timetag the measurements automatically.

Management and Policy Implications

As EFPs have advanced the use of deck-sorting toward regulatory implementation, the use of observer sampling for estimating the amount of halibut returned to the water has the potential to limit halibut mortality savings. Observer sampling protocols must balance a loss of precision (arising as fewer fish are measured) against a delay in returning halibut to the water (as more fish are measured). The development of a reliable camera system for measuring fish can avoid the need to make this trade-off. Camera fish measurement would allow both accurate and rapid measurement of all fish returned to the water from the deck. These more precise data would allow for better estimation of the halibut mortality (and associated mortality reduction) from deck-sorted halibut. A camera measurement system could also free up observers to perform more viability samples and more closely oversee activities on deck generally.

Data Products

The data products generated by the measurement system can be summarized as follows:

1. Timetagged raw video footage of the system field of view, collected at all times the system was powered on. This footage is useful both as raw evidence of the deck-sorted halibut discard process generally and as a backup mechanism for validating the collected measurements. In total, the system recorded 15H:5M:32S of video in AVI format, spread over 380 individual video files.
2. Fish measurements as detected and tracked by the system, collected in comma-separated-value (CSV) files suitable for import into spreadsheet programs like Microsoft Excel. As stated above, 5,329 measurements were made over the 45 days of at-sea operation. Each CSV line represents a single measured fish (see the parsing sketch following this list) and consists of:
   1. Count
   2. Time (seconds since file start)
   3. Frame number
   4. Length (pixels)
   5. Length (cm)
   6. Width (pixels)
   7. Width (cm)
   8. Estimated weight (lbs)
   9. Estimated segmentation quality (0-1)
   10. Image filename
3. One segmented image for every measurement, in JPG format, i.e., the file referenced by the 10th field in the CSV record.
4. Equivalent measurement data stored in a single Postgres database, exportable to a number of commonly used formats. Each record of the DB contains:
   1. Date/time of the measurement
   2. Length (cm)
   3. Width (cm)
   4. Estimated weight (lbs)
   5. Estimated segmentation quality (0-1)
   6. Image filename

In addition, we were provided observer measurement logs by US Seafoods in the form of Excel spreadsheets. These contained 706 measurements over the period our system was in operation.
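As noted in item 2 above, the sketch below shows one way the per-fish CSV records could be read into typed records for later filtering. It is illustrative only: whether the files carry a header row is an assumption, and the field names used here are chosen for readability rather than taken from the actual files.

```python
# Sketch of reading the ten-field per-fish CSV records listed above.
# Field order follows the list; the absence of a header row is assumed,
# so this reader supplies its own field names.
import csv
from dataclasses import dataclass

FIELDS = ["count", "time_s", "frame", "length_px", "length_cm",
          "width_px", "width_cm", "weight_lbs", "quality", "image_file"]

@dataclass
class FishMeasurement:
    count: int
    time_s: float
    frame: int
    length_px: float
    length_cm: float
    width_px: float
    width_cm: float
    weight_lbs: float
    quality: float
    image_file: str

def read_measurements(path):
    out = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f, fieldnames=FIELDS):
            out.append(FishMeasurement(
                int(row["count"]), float(row["time_s"]), int(row["frame"]),
                float(row["length_px"]), float(row["length_cm"]),
                float(row["width_px"]), float(row["width_cm"]),
                float(row["weight_lbs"]), float(row["quality"]),
                row["image_file"]))
    return out

# Example: keep only measurements the algorithm scored above 0.8
# good = [m for m in read_measurements("haul_042.csv") if m.quality > 0.8]
```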

Outreach and Education

Before development began, the author met several times with NOAA/NMFS personnel at the Alaska Fisheries Science Center in Seattle to discuss techniques and the operational impact of this work in the larger context of electronic monitoring. Those conversations informed a number of design requirements, including acceptable fish throughput rates and realistic resource allocation (i.e., funding request and expenses).

Much of the development and testing work was done with assistance from undergraduate students at the University of Washington. These students represented a wide cross-section of disciplines, including Oceanography, Electrical Engineering, Mechanical Engineering, and Computer Science. They advertised the system and its purpose to peers and faculty, and numerous student groups from both UW and local high schools visited APL-UW for demonstrations and discussion.

Figure 15. UW Oceanography undergrad Alexis Harper after assisting with installation aboard Seafreeze America

The author attended the Alaska Marine Science Symposium (AMSS) in three consecutive years (2015, 2016, 2017) to discuss the project and its goals with other researchers. In addition, a poster with flyer was presented at the 2017 AMSS, as shown below.


Figure 16. 2017 AMSS poster and flyer

During installation of the system aboard the Seafreeze America, ship's crew were asked about its relevance and intrusiveness, if any. Response was uniformly positive. In addition, upon return of the vessel to Dutch Harbor, AK after three months of use, crew were interviewed again. Sentiment again ran positive. Initial concerns about camera lens icing or fogging proved unfounded, the system operated "turnkey" as designed, and it held up completely to the rigors of operating in an environment as harsh as that aboard a vessel in the Bering Sea. One recommended change was that the enclosure entrance be made slightly larger to accommodate bigger fish; however, the system did not preclude any fish from going down the chute.

During early phases of the development program, we consulted frequently with Farron Wallace at NOAA's National Marine Fisheries Service in Seattle, where he leads efforts to integrate electronic monitoring into the North Pacific fisheries industry. While we had originally planned to present the results of this project to fishing and related trade associations, time and budgetary constraints limited our interactions to those with our direct industry partner, US Seafoods. APL-UW is working to include a summary of this project, including a link to the NPRB project synopsis web page, on its public website.

Finally, we have begun and will continue a collaboration with others engaged in camera-based electronic monitoring. For example, Dr. Craig Rose at NOAA Fisheries has been developing a system based on still imagery and laser triggers to measure fish under similar circumstances. We have visited Dr. Rose at his facility, he has visited ours, and we have exchanged notes and lessons learned. Now that our seagoing data has been received and processed, we propose to share the results with Dr. Rose and his team.


Acknowledgements

Dr. Mark Fina at US Seafoods acted as liaison between APL-UW and industry and worked the logistics of both dockside testing at their facility and deployment of the final system aboard the F/T Seafreeze America. Alexis Harper, UW Oceanography undergraduate, provided essential support during all phases of development, deployment, and data analysis. Dr. Craig Rose and Suzanne Romain of NOAA Fisheries provided invaluable advice on selection of hardware and construction processes for "bulletproofing" the deployed system given its likely use and abuse. Prof. Jenq-Neng Hwang at UW EE provided graduate and undergraduate student access and technical consultation services. Tsung-Wei Huang, UW EE graduate student, developed the image processing segmentation, tracking, and measurement algorithms. Jestoni Orcejola and Dave Dyer, APL-UW mechanical engineers, designed the final enclosure, electronics housings, and camera mounting system. Yuhao Zhu, UW CS undergraduate, developed the web-based client interface and associated server-side software. The captain and crew of F/T Seafreeze America hosted and operated the system.
