The accurate identification of insects is critical in ecosystem research and in pest control in agriculture and forestry. Writing in the International Journal of Systems, Control and Communications, a team from China has focused on the identification of insects in the Wudalianchi Scenic Area in Heilongjiang Province. This region of China is considered one of the most useful for studying species adaptation and the evolution of biological communities. In such studies, rapid and accurate insect identification in the field is critical.
Yao Xiao, Aocheng Zhou, Lin Zhou, and Yue Zhao of the School of Technology at Beijing Forestry University have developed an automatic insect identification system based on the SE-ResNeXt convolutional neural network, which they suggest could reduce researchers' workload as well as the rate of incorrect species assignments. The team demonstrated 98 percent accuracy with their system, which, coupled with field expertise, could meaningfully improve such studies. The development of a website and app built on the neural network should also improve data storage and visualisation. Such efforts could ultimately supplant the archaic physical storage of insect specimens, especially given that preserved specimens do not reflect the current state of ecosystems.
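As an illustrative aside (not the authors' code), the "SE" in SE-ResNeXt refers to a squeeze-and-excitation block, which learns to reweight a network's feature channels according to global image context. The short PyTorch sketch below shows a minimal version of such a block; the class name, reduction ratio, and tensor sizes are assumptions for illustration rather than details taken from the paper.

```python
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Minimal squeeze-and-excitation block (illustrative sketch only).

    'Squeeze': global average pooling collapses each channel to one number.
    'Excitation': a small bottleneck MLP produces per-channel weights in (0, 1),
    which are then used to rescale the original feature map.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        weights = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * weights  # channel-wise rescaling of the input features


if __name__ == "__main__":
    # Toy check: a batch of 4 feature maps with 64 channels and 32x32 spatial size.
    features = torch.randn(4, 64, 32, 32)
    out = SEBlock(channels=64)(features)
    print(out.shape)  # torch.Size([4, 64, 32, 32])
```

In an SE-ResNeXt architecture, blocks of this kind sit inside the residual units of a ResNeXt backbone, letting the classifier emphasise the channels most informative for distinguishing visually similar insect species.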
Research in locations such as the Wudalianchi Scenic Area is vital for conservation efforts, especially given the rapid and widespread decline in biodiversity being seen worldwide, affecting insects and other invertebrates as well as birds, fish, and many other organisms.
The team suggests that their app is particularly suited to research in forest environments. More broadly, however, the app and the associated website could also find use in education, public engagement, and raising conservation awareness.
Xiao, Y., Zhou, A., Zhou, L. and Zhao, Y. (2023) ‘Automatic insect identification system based on SE-ResNeXt’, Int. J. Systems, Control and Communications, Vol. 14, No. 1, pp.81–98.