DESCRIPTION
Development of a method and a device for assessing the sexual dimorphism of poultry egg embryos based on computer vision techniques and artificial neural networks
An important welfare problem in the egg-layer industry is the culling of day-old male chicks, since chicken production strongly favors one sex. Few methods for determining sex when sorting day-old chicks are used in practice. Many researchers have tried different strategies to determine the sex of an embryo before the chick hatches from the egg, and even before incubation, based on differences in DNA content in the blastoderm, hormonal differences (estrogens) in the allantoic fluid, and the fluorescent properties of embryo blood. It has been reliably established that sex differences exist in embryo weight during incubation, embryo composition, egg odor, DNA content, blood fluorescence intensity, and Raman scattering. None of these methods, however, has entered practice, because they are all invasive.
A promising and effective method for determining embryonic sex in the egg must not compromise the integrity of the eggshell or the embryo inside, and must not negatively affect the embryo's development in the egg or after hatching. In addition, it must be fast enough to be applied to large numbers of eggs, economically feasible, and ethically acceptable.
The novelty and significance of the project lie in applying multispectral processing of data extracted from digital images of the studied object (the egg with its embryo), together with a new approach to image segmentation and masking in the training of convolutional neural networks, in which it is necessary not only to assign a class to the whole image but also to segment its regions by sample class. These methods will make it possible to create simple and fast technical means that can be used effectively within existing poultry production technologies, including incubation.
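As a minimal illustrative sketch of this idea (not the project's actual implementation), the following PyTorch model takes a multispectral image and produces both a per-pixel segmentation map and a whole-image class, trained with a joint loss. The number of spectral channels (8), segmentation classes (3), and image classes (2), and all layer sizes, are assumptions chosen only for illustration.

import torch
import torch.nn as nn

class MultispectralEggNet(nn.Module):
    """Toy U-Net-style CNN with a segmentation head and a classification head."""

    def __init__(self, in_channels=8, num_seg_classes=3, num_img_classes=2):
        super().__init__()
        # Encoder: two convolutional stages with one downsampling step
        self.enc1 = self._block(in_channels, 32)
        self.enc2 = self._block(32, 64)
        self.pool = nn.MaxPool2d(2)
        # Decoder: one upsampling stage with a skip connection from enc1
        self.up = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = self._block(64, 32)
        # Per-pixel head: segment image regions by class (assumed classes)
        self.seg_head = nn.Conv2d(32, num_seg_classes, kernel_size=1)
        # Whole-image head: classify the entire image (e.g. two sexes)
        self.cls_head = nn.Linear(64, num_img_classes)

    @staticmethod
    def _block(cin, cout):
        return nn.Sequential(
            nn.Conv2d(cin, cout, kernel_size=3, padding=1),
            nn.BatchNorm2d(cout),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        e1 = self.enc1(x)                          # (B, 32, H, W)
        e2 = self.enc2(self.pool(e1))              # (B, 64, H/2, W/2)
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        seg = self.seg_head(d1)                    # (B, seg_classes, H, W)
        cls = self.cls_head(e2.mean(dim=(2, 3)))   # global-average pooling
        return seg, cls

# Joint training objective: segment regions by class AND classify the image.
model = MultispectralEggNet()
x = torch.randn(4, 8, 128, 128)                    # batch of multispectral images
seg_target = torch.randint(0, 3, (4, 128, 128))    # per-pixel class labels
cls_target = torch.randint(0, 2, (4,))             # whole-image labels
seg_out, cls_out = model(x)
loss = (nn.CrossEntropyLoss()(seg_out, seg_target)
        + nn.CrossEntropyLoss()(cls_out, cls_target))
loss.backward()

The joint loss reflects the requirement stated above: the network must both assign a class to the whole image and segment its regions by sample class, with the multispectral bands entering simply as additional input channels.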
Supervisor:
Aleynikov A.F.