Supervised learning with neural networks faces two well-known difficulties: selecting an appropriate network structure and interpreting the roles of hidden units. In this work, GAd (Genetic Algorithm with Degeneration) is proposed to address these difficulties by optimizing the network structure. GAd is a real-coded genetic algorithm that introduces the idea of genetic damage: a damage rate is attached to every gene, and genes with low effectiveness are inactivated through this accumulated damage. The performance of GAd for structural learning is first demonstrated on a simple optimization problem. GAd is then applied to learning a logic function, showing that it is an efficient algorithm for the structural learning of layered neural networks.
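To make the core idea concrete, the following is a minimal illustrative sketch of a real-coded genetic algorithm in which every gene carries a damage rate and fully damaged genes are inactivated (set to zero), pruning the corresponding connection. The effectiveness criterion, the damage update rule, the thresholds, and the toy fitness function are all assumptions for illustration only; they are not the paper's actual specification of GAd.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not from the paper): population size, gene count,
# mutation scale, damage increment, and the damage level at which a gene is inactivated.
POP_SIZE, N_GENES = 20, 10
SIGMA, DAMAGE_STEP, DAMAGE_LIMIT = 0.1, 0.05, 1.0


def fitness(genes):
    # Toy objective: match a sparse target vector; an inactivated (zeroed) gene
    # that the target does not need costs nothing, so degeneration is rewarded.
    target = np.array([1.0, -2.0, 0.0, 0.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0])
    return -np.sum((genes - target) ** 2)


# Each individual carries real-coded genes plus a per-gene damage rate.
pop = [{"genes": rng.normal(0, 1, N_GENES), "damage": np.zeros(N_GENES)}
       for _ in range(POP_SIZE)]

for gen in range(200):
    offspring = []
    for parent in pop:
        child = {"genes": parent["genes"].copy(), "damage": parent["damage"].copy()}
        # Real-coded mutation of the gene values.
        child["genes"] += rng.normal(0, SIGMA, N_GENES)
        # Genes with small magnitude are treated here as "less effective" and
        # accumulate damage; others recover slightly (a heuristic assumption).
        weak = np.abs(child["genes"]) < 0.5
        child["damage"] = np.clip(
            child["damage"] + np.where(weak, DAMAGE_STEP, -DAMAGE_STEP),
            0.0, DAMAGE_LIMIT)
        # Fully damaged genes are inactivated (forced to zero), which prunes
        # the corresponding network connection.
        child["genes"][child["damage"] >= DAMAGE_LIMIT] = 0.0
        offspring.append(child)
    # Elitist selection over parents and offspring.
    pop = sorted(pop + offspring, key=lambda ind: fitness(ind["genes"]),
                 reverse=True)[:POP_SIZE]

best = pop[0]
print("best fitness:", fitness(best["genes"]))
print("inactivated genes:", np.where(best["genes"] == 0.0)[0])
```

In this sketch the genes can be read as the weights of a layered network, so inactivating a gene corresponds to removing a connection; the damage mechanism thereby drives the structural simplification that the abstract attributes to GAd.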