Insect Identification for Environmental Monitoring and Ecological Science

The goal of this project is to develop robotic technology and computer vision algorithms for automating biodiversity surveys of small arthropods including (a) macroinvertebrates in freshwater streams, (b) soil mesofauna, and (c) freshwater zooplankton.

Team Members

Team Alumni

Data Sets

Current Status

Stoneflies

We collected specimens for 9 taxa of Stoneflies as shown in this image:

The images were collected using the apparatus pictured here:

The specimen is dropped into a well on the right where it enters an alcohol solution. It is then pumped along a plastic tube into the field of view of the microscope, where a side jet captures it and causes it to rotate. When the side jet is cut off, the specimen falls to the bottom of the tube where it is photographed. Additional pulses of the side jet will cause the specimen to rotate into different positions, so that we can capture a good view of the back (dorsal side) of the specimen. The specimen is then pumped out the left-hand side of the apparatus.

We apply computer vision algorithms that incorporate machine learning techniques. On these 9 taxa of Stoneflies, we have achieved greater than 95% correct classification using two different methods. The first method wraps a standard dictionary-based bag-of-keypoints classifier within an outer boosting loop. This remarkably simple mechanism appears to work by making the dictionaries learned in the second and subsequent iterations discriminative. See our ICML 2009 paper for details.
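The boosting-wrapped dictionary learner can be sketched as follows. This is a minimal illustration under stated assumptions, not the ICML 2009 implementation: it uses a toy two-class problem, synthetic keypoint descriptors in place of real detections, and scikit-learn's KMeans and decision trees as stand-ins for the paper's codebook and weak learners.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy data: each "image" is a bag of keypoint descriptors (one row each).
# Class-1 descriptors are shifted so the problem is learnable.
def make_image(label, n_kp=30, dim=8):
    return rng.normal(loc=1.5 * label, size=(n_kp, dim))

images = [make_image(y) for y in (0, 1) * 40]
labels = np.array([0, 1] * 40)
N_WORDS = 16

def histograms(imgs, codebook):
    """Represent each image as a normalized histogram over visual words."""
    H = np.array([np.bincount(codebook.predict(img), minlength=N_WORDS)
                  for img in imgs], dtype=float)
    return H / H.sum(axis=1, keepdims=True)

def boosted_bag_of_keypoints(images, labels, n_rounds=3):
    """Outer boosting loop around a dictionary-based bag-of-keypoints
    classifier: each round learns its codebook from keypoints of images
    sampled by the current boosting weights, so later dictionaries
    concentrate on images the ensemble still gets wrong."""
    n = len(images)
    w = np.full(n, 1.0 / n)            # per-image boosting weights
    ensemble = []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, p=w)
        pool = np.vstack([images[i] for i in idx])
        codebook = KMeans(n_clusters=N_WORDS, n_init=4, random_state=0).fit(pool)
        H = histograms(images, codebook)
        weak = DecisionTreeClassifier(max_depth=2, random_state=0)
        weak.fit(H, labels, sample_weight=w)
        miss = weak.predict(H) != labels
        err = np.clip(np.sum(w * miss), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)       # AdaBoost step size
        w *= np.exp(np.where(miss, alpha, -alpha))  # up-weight mistakes
        w /= w.sum()
        ensemble.append((alpha, codebook, weak))
    return ensemble

def predict(ensemble, imgs):
    votes = np.zeros(len(imgs))
    for alpha, codebook, weak in ensemble:
        votes += alpha * (2.0 * weak.predict(histograms(imgs, codebook)) - 1)
    return (votes > 0).astype(int)

ensemble = boosted_bag_of_keypoints(images, labels)
print("training accuracy:", (predict(ensemble, images) == labels).mean())
```

The key point is that the weight-biased sampling feeds mostly hard images into the k-means step, which is what makes the later codebooks discriminative.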

The second method applies our new framework of evidence trees to accumulate classification votes from each detected point in the image. The detected points (represented as SIFT descriptors) are dropped through a random forest of evidence trees. Each leaf of each tree stores the training examples that reached it (the evidence). All of the evidence accumulated from all of the detected points is then combined using out-of-bag stacking to create a second-level stacked classifier that makes the final decision. See our CVPR 2009 paper for details.
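A rough sketch of the evidence-tree pipeline, with simplifications that go beyond the CVPR 2009 method: synthetic three-class descriptors stand in for SIFT, scikit-learn's random forest plays the role of the evidence trees (its leaves store per-class totals rather than the raw training examples), and the stacking level reuses the training images instead of out-of-bag evidence.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_image(label, n_kp=25, dim=8):
    # Stand-in for an image's SIFT descriptors.
    return rng.normal(loc=label, scale=1.2, size=(n_kp, dim))

train = [(make_image(y), y) for y in list(range(3)) * 30]
test = [(make_image(y), y) for y in list(range(3)) * 10]

# Level 1: a random forest over individual descriptors ("evidence trees").
# Each descriptor is labeled with its image's class.
X = np.vstack([img for img, _ in train])
y = np.repeat([lab for _, lab in train], 25)
forest = RandomForestClassifier(n_estimators=25, max_depth=6,
                                random_state=0).fit(X, y)

def evidence(img):
    """Drop every descriptor through every tree and accumulate the
    per-class evidence stored at the leaves it reaches; normalize to
    one vector per image."""
    e = np.zeros(3)
    for tree in forest.estimators_:
        leaves = tree.apply(img)
        e += tree.tree_.value[leaves, 0, :].sum(axis=0)
    return e / e.sum()

# Level 2: a stacked classifier over the accumulated evidence vectors.
E_train = np.array([evidence(img) for img, _ in train])
stacker = LogisticRegression(max_iter=1000).fit(E_train,
                                                [lab for _, lab in train])

E_test = np.array([evidence(img) for img, _ in test])
acc = (stacker.predict(E_test) == [lab for _, lab in test]).mean()
print(f"test accuracy: {acc:.2f}")
```

Because each descriptor contributes evidence from every tree, the stacked classifier sees a smoothed, image-level summary rather than a single hard vote per point.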

EPTs

We have collected and photographed a database of 50 taxa of Mayflies, Stoneflies, and Caddisflies (EPTs) using a new version of the stonefly apparatus. The first part of this database (29 taxa) is called the EPT29 dataset, and it will be made available shortly. Each image is automatically segmented and rotated to produce the images shown here. There are more than 29 example images because Caddisflies build houses, and some of the images show only these houses.
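The segmentation-and-rotation step can be illustrated with a small sketch. This is not our production pipeline: it assumes a clean image where simple thresholding segments the specimen, and uses second-order image moments to rotate the specimen's major axis to horizontal.

```python
import numpy as np
from scipy import ndimage

def orientation(mask):
    """Angle (radians) of the principal axis of a binary mask,
    computed from second-order central image moments."""
    ys, xs = np.nonzero(mask)
    x0, y0 = xs.mean(), ys.mean()
    mu20 = ((xs - x0) ** 2).mean()
    mu02 = ((ys - y0) ** 2).mean()
    mu11 = ((xs - x0) * (ys - y0)).mean()
    return 0.5 * np.arctan2(2 * mu11, mu20 - mu02)

def normalize_rotation(image, threshold=0.5):
    """Segment by simple thresholding, then rotate the image so the
    segmented specimen's major axis is horizontal."""
    mask = image > threshold
    angle = np.degrees(orientation(mask))
    return ndimage.rotate(image, angle, reshape=True, order=1)

# Synthetic "specimen": an elongated Gaussian blob tilted by 30 degrees.
yy, xx = np.mgrid[-64:64, -64:64]
t = np.radians(30)
u = xx * np.cos(t) + yy * np.sin(t)
v = -xx * np.sin(t) + yy * np.cos(t)
img = np.exp(-(u / 40) ** 2 - (v / 10) ** 2)

out = normalize_rotation(img)
print(f"residual tilt: {np.degrees(orientation(out > 0.5)):.1f} degrees")
```

Moment-based orientation is only defined up to 180 degrees, so a real pipeline would still need a dorsal/ventral or head/tail check to fix the remaining ambiguity.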

Soil Mesofauna

We have collected and identified specimens of 29 taxa of soil mesofauna. We have also developed a robotic apparatus for photographing specimens placed in a Petri dish, as shown in the following image:

Specimens are placed in the Petri dish, which rests on a computer-controlled X-Y stage. The dish is surrounded by a ring of LEDs that provide incident lighting. Under the dish is an LCD screen that provides transmitted light in any desired color. This is helpful because, as can be seen above, many of the specimens are transparent or translucent. The microscope/camera scans across the Petri dish. At each position, it takes 40 images: 10 different focal depths, each captured under four different combinations of lighting and exposure time.

Our plan is to analyze these 40 images and try to identify the specimens in the field of view. If a specimen is successfully identified, a robot arm with an automated pipette will extract it from the dish and place it in a designated well of a 96-well plate. We are still developing the robotic system. Here is a video showing the robot arm being debugged using a styrofoam mock-up of the microscope.

Robotic Extraction Video (Windows Media format).
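The capture loop at each stage position can be sketched as a simple schedule generator. The focal-depth values, lighting names, and grid size below are illustrative placeholders; only the 10 x 4 = 40 images-per-position structure comes from the description above.

```python
from itertools import product

# Hypothetical capture settings; the real stage geometry, focal steps,
# and exposure choices are not specified on this page.
FOCAL_DEPTHS_MM = [round(0.1 * k, 1) for k in range(10)]        # 10 depths
LIGHTING = [("incident", "short"), ("incident", "long"),
            ("transmitted", "short"), ("transmitted", "long")]  # 4 combos

def capture_schedule(nx, ny):
    """Yield one capture instruction per image: the stage visits an
    nx-by-ny grid, and at each position takes 10 focal depths x 4
    lighting/exposure combinations = 40 images."""
    for x, y in product(range(nx), range(ny)):
        for depth, (light, exposure) in product(FOCAL_DEPTHS_MM, LIGHTING):
            yield {"stage": (x, y), "focus_mm": depth,
                   "lighting": light, "exposure": exposure}

schedule = list(capture_schedule(3, 2))
print(len(schedule))  # 3 * 2 positions x 40 images each = 240
```

Driving the capture from one flat schedule like this keeps the stage, focus, and lighting controllers decoupled from the identification code that later consumes the 40-image stacks.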

Publications

Lin, J., Dietterich, T. (2013). Is Fine Grained Classification Different? Poster abstract. CVPR-2013 Workshop on Fine-Grained Visual Classification. PDF Preprint

Lin, J. (2013). A Study of Methods for Fine-Grained Object Classification of Arthropod Specimens. MS Thesis. School of EECS, Oregon State University. PDF download

Yao, Q., Liu, Q., Dietterich, T. G., Todorovic, S., Lin, J., Diao, G., Yang, B., Tang, J. (2013). Segmentation of touching insects based on optical flow and NCuts. Biosystems Engineering, 114, 67-77. PDF Preprint.

Lin, J., Larios, N., Lytle, D., Moldenke, A., Paasch, R., Shapiro, L., Todorovic, S., Dietterich, T. (2011). Fine-Grained Recognition for Arthropod Field Surveys: Three Image Collections. First Workshop on Fine-Grained Visual Categorization (CVPR-2011). PDF Preprint.

Larios, N., Lin, J., Zhang, M., Lytle, D., Moldenke, A., Shapiro, L. G., and Dietterich, T. G. (2011). Stacked Spatial-Pyramid Kernel: An Object-Class Recognition Method to Combine Scores from Random Trees, IEEE Workshop on Applications of Computer Vision. 329-335. PDF Preprint.

Lytle, D. A., Martinez-Munoz, G., Zhang, W., Larios, N., Shapiro, L., Paasch, R., Moldenke, A., Mortensen, E. N., Todorovic, S., Dietterich, T. G. (2010). Automated processing and identification of benthic invertebrate samples. Journal of the North American Benthological Society, 29(3), 867-874. PDF preprint.

Larios, N., Soran, B., Shapiro, L., Martinez-Munoz, G., Lin, J., Dietterich, T. G. (2010). Haar Random Forest Features and SVM Spatial Matching Kernel for Stonefly Species Identification. IEEE International Conference on Pattern Recognition. In press. PDF Preprint.

Dietterich, T. G. (2009). Machine Learning in Ecosystem Informatics and Sustainability. Abstract of Invited Talk. Proceedings of the 2009 International Joint Conference on Artificial Intelligence (IJCAI-2009). Pasadena, CA. PDF Preprint.

Zhang, W. (2009). Image Features and Learning Algorithms for Biological, Generic and Social Object Recognition. Doctoral Dissertation. School of Electrical Engineering and Computer Science, Oregon State University. PDF Version.

Martinez-Munoz, G., Zhang, W., Payet, N., Todorovic, S., Larios, N., Yamamuro, A., Lytle, D., Moldenke, A., Mortensen, E., Paasch, R., Shapiro, L., Dietterich, T. (2009). Dictionary-Free Categorization of Very Similar Objects via Stacked Evidence Trees. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR-2009), Miami Beach, FL. PDF Preprint.

Zhang, W., Surve, A., Fern, X., Dietterich, T. (2009). Learning Non-Redundant Codebooks for Classifying Complex Objects. In International Conference on Machine Learning (ICML-2009). Montreal, Canada. 1241-1248. PDF Preprint.

Zhang, W., Deng, H. (2008). Understanding Visual Dictionaries via Maximum Mutual Information Curves. 19th International Conference on Pattern Recognition (ICPR-08). 1-4. PDF Preprint.

Zhang, W., Dietterich, T. G. (2008). Learning Visual Dictionaries and Decision Lists for Object Recognition. 19th International Conference on Pattern Recognition (ICPR-08). 1-4. DOI 10.1109/ICPR.2008.4761769. PDF Preprint.

Sarpola, M.J., Paasch, R.K., Dietterich, T.G., Lytle, D.A., Mortensen, E. N., Moldenke, A.R., and Shapiro, L. (2008). An aquatic insect imaging device to automate insect classification, Transactions of the American Society of Agricultural and Biological Engineers, 51 (6): 2217-2225. PDF Preprint.

Larios, N., Deng, H., Zhang, W., Sarpola, M., Yuen, J., Paasch, R., Moldenke, A., Lytle, D., Ruiz Correa, S., Mortensen, E., Shapiro, L., Dietterich, T. (2008). Automated Insect Identification through Concatenated Histograms of Local Appearance Features. Machine Vision and Applications, 19 (2):105-123. PDF Preprint.

Peterson, C., Paasch, R. K., Ge, P., Dietterich, T. G. (2007). Product innovation for interdisciplinary design under changing requirements. International Conference on Engineering Design (ICED2007), Paris, France. PDF preprint.

E. N. Mortensen, E. L. Delgado, H. Deng, D. Lytle, A. Moldenke, R. Paasch, L. Shapiro, P. Wu, W. Zhang, T. G. Dietterich (2007). Pattern Recognition for Ecological Science and Environmental Monitoring: An Initial Report. In N. MacLeod and M. O'Neill (Eds.) Automated Taxon Identification in Systematics. 189-206. CRC Press, Boca Raton. PDF preprint.

Deng, H., Zhang, W., Mortensen, E., Dietterich, T. (2007). Principal Curvature-based Region Detector for Object Recognition. IEEE Conference on Computer Vision and Pattern Recognition (CVPR-2007). Minneapolis, MN. 1-4. DOI 10.1109/CVPR.2007.382972. PDF Preprint.

Larios, N., Deng, H., Zhang, W., Sarpola, M., Yuen, J., Paasch, R., Moldenke, A., Lytle, D., Ruiz Correa, S., Mortensen, E., Shapiro, L. G., Dietterich T. G. (2007). Automated Insect Identification through Concatenated Histograms of Local Appearance Features. IEEE Workshop on Applications of Computer Vision (WACV-2007), 26-32. Austin, TX. PDF Preprint.

Deng, H., Mortensen, E. N., Shapiro, L., Dietterich, T. G. (2006). Reinforcement Matching Using Region Context. In S. Lucey and T. Chen (Eds.) Beyond Patches. Workshop at IEEE Conference on Computer Vision and Pattern Recognition. IEEE. New York. PDF preprint.

Zhang, W., Deng, H., Dietterich, T. G., Mortensen, E. N. (2006). A Hierarchical Object Recognition System Based on Multi-scale Principal Curvature Regions. Proceedings of the International Conference on Pattern Recognition, Vol. I. 778-782. PDF Preprint.

Presentations

Boosted Evidence Trees for Object Recognition with Applications to Arthropod Biodiversity Studies, Caltech, January 19, 2011. PDF slides.

Dietterich, T. (2009). Vision and Machine Learning for Automated Arthropod Biodiversity Studies. PDF of Powerpoint Presentation. Cornell AI Seminar, March 2009.

Dietterich, T. (2007). BugID: Rapid Throughput Insect Population Counting Using Computer Vision. PDF of Powerpoint Presentation. Oregon State University Ecosystem Informatics Colloquium, May 2007.

Acknowledgement of Financial Support

Support for this project is provided by the National Science Foundation under NSF Grant numbers IIS-0326052 and IIS-0705765. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation (NSF).

Other Research Directions

Here is a special specimen that we recently studied very carefully.