SGAS: Sequential Greedy Architecture Search
Figure 1. Comparison of search-evaluation Kendall τ coefficients. We show Kendall τ correlations for architecture rankings between the search and evaluation phases of DARTS and SGAS. Architectures are obtained from 10 independent search runs.
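The Kendall τ coefficient used in Figure 1 measures how consistently two rankings agree: it is the normalized difference between the number of concordant and discordant pairs. A minimal, dependency-free sketch (the variable names are illustrative, not from the paper):

```python
from itertools import combinations

def kendall_tau(search_scores, eval_scores):
    """Kendall rank correlation between two equally long score lists.

    Counts concordant pairs (same relative order in both lists) and
    discordant pairs (opposite order), then normalizes by the total
    number of pairs. Returns a value in [-1, 1]; ties are ignored
    for simplicity (i.e., this is the tau-a variant without tie
    correction).
    """
    assert len(search_scores) == len(eval_scores)
    n = len(search_scores)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        sign = (search_scores[i] - search_scores[j]) * (eval_scores[i] - eval_scores[j])
        if sign > 0:
            concordant += 1
        elif sign < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)
```

A τ near 1 (identical rankings) means search-phase validation accuracy is a reliable predictor of final evaluation performance; a τ near 0 means the two rankings are essentially uncorrelated, which is the failure mode Figure 1 highlights for DARTS.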
Architecture design has become a crucial component of successful deep learning. Recent progress in automatic neural architecture search (NAS) shows a lot of promise. However, discovered architectures often fail to generalize in the final evaluation. Architectures with a higher validation accuracy during the search phase may perform worse in the evaluation (see Figure 1). Aiming to alleviate this common issue, we introduce sequential greedy architecture search (SGAS), an efficient method for neural architecture search. By dividing the search procedure into subproblems, SGAS chooses and prunes candidate operations in a greedy fashion. We apply SGAS to search architectures for Convolutional Neural Networks (CNN) and Graph Convolutional Networks (GCN). Extensive experiments show that SGAS is able to find state-of-the-art architectures for tasks such as image classification, point cloud classification and node classification in protein-protein interaction graphs with minimal computational cost.
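The core idea of dividing the search into subproblems and committing to operations greedily can be sketched as follows. This is a simplified illustration, not the paper's implementation: `edges`, `ops`, and `score_fn` are placeholders standing in for the cell's edge set, the candidate operation set, and the paper's selection criteria (Criterion 1 / Criterion 2):

```python
def sequential_greedy_search(edges, ops, score_fn):
    """Sketch of sequential greedy architecture search.

    At each step, score every candidate operation on every undecided
    edge, greedily commit the edge whose best operation stands out
    most, and repeat on the shrunken subproblem until all edges are
    decided. `score_fn(edge, ops, decided)` returns a dict mapping
    each op to a score and may condition on decisions made so far.
    """
    decided = {}               # edge -> chosen operation
    remaining = list(edges)    # edges not yet decided
    while remaining:
        edge_scores = {e: score_fn(e, ops, decided) for e in remaining}
        # pick the edge with the highest-scoring candidate operation
        best_edge = max(remaining, key=lambda e: max(edge_scores[e].values()))
        # commit (prune all other candidate ops on this edge)
        best_op = max(edge_scores[best_edge], key=edge_scores[best_edge].get)
        decided[best_edge] = best_op
        remaining.remove(best_edge)
    return decided
```

Because each decision permanently prunes the other candidate operations on that edge, the remaining optimization problem shrinks after every step, which is what keeps the search cost low.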
Figure 2. Best cell architecture for image classification tasks.
Figure 3. Best cell architectures on ModelNet40 and PPI for each criterion.
Table 1. Performance comparison with state-of-the-art image classifiers on CIFAR-10. We report the average and best performance of SGAS (Cri.1) and SGAS (Cri.2), where Criterion 1 and Criterion 2 are used in the search, respectively. *Note that the mean and standard deviation are computed across 10 independently searched architectures.
Table 2. Comparison with state-of-the-art image classifiers on ImageNet. We transfer the top 3 performing architectures on CIFAR-10 to ImageNet in the mobile setting. ×+ denotes multiply-add operations. The average and best performance of SGAS (Cri.1) and SGAS (Cri.2) are reported.
Table 3. Comparison with state-of-the-art architectures for 3D object classification on ModelNet40. 10 architectures are derived for both SGAS and Random Search within the same search space.
Table 4. Comparison with state-of-the-art architectures for node classification on PPI. SGAS (small) is a small network built by stacking the cell searched by SGAS (Cri.1).