Abstract: In recent years, deep convolutional neural networks (DCNNs) have delivered notable successes in visual tasks, particularly in applications related to image classification. However, they are sensitive to the selection of their architectural and learning hyperparameters, which together span an exponentially large search space for modern DCNN models. Traditional hyperparameter selection methods include manual tuning, grid search, and random search, but these either require expert domain knowledge or are computationally burdensome. Bayesian optimization and evolutionary-inspired techniques, on the other hand, have emerged as viable alternatives for the hyperparameter selection problem. In this work, an automated system that combines the advantages of evolutionary processes and state-of-the-art Bayesian optimization is proposed. Specifically, the search space is first partitioned into a discrete architectural subspace and a subspace of continuous and categorical learning hyperparameters; the former is then traversed efficiently by a stochastic genetic search, combined with a genetic-Bayesian search of the latter. A series of experiments on prominent image classification tasks reveals that the proposed method yields overall classification accuracy improvements over several well-established techniques, as well as significant reductions in computational cost compared to brute-force search.
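To make the search-space partitioning concrete, the following is a minimal, self-contained Python sketch of the scheme described above, not the authors' implementation. All names (`ARCH_SPACE`, `LEARN_SPACE`, `evaluate`, and the specific hyperparameter choices) are hypothetical assumptions; the architectural subspace is evolved with a toy genetic loop, and plain random sampling stands in for the genetic-Bayesian tuning of the learning subspace, which a real system would replace with a Bayesian optimizer.

```python
import math
import random

# Hypothetical discrete architectural subspace.
ARCH_SPACE = {
    "num_conv_blocks": [2, 3, 4, 5],
    "filters": [32, 64, 128],
    "kernel_size": [3, 5],
}

# Hypothetical continuous and categorical learning subspace.
LEARN_SPACE = {
    "learning_rate": (1e-4, 1e-1),  # continuous, sampled log-uniformly
    "optimizer": ["sgd", "adam"],   # categorical
}

def sample_arch():
    """Draw a random point from the discrete architectural subspace."""
    return {k: random.choice(v) for k, v in ARCH_SPACE.items()}

def sample_learn():
    """Draw a random point from the learning subspace."""
    lo, hi = LEARN_SPACE["learning_rate"]
    return {
        "learning_rate": 10 ** random.uniform(math.log10(lo), math.log10(hi)),
        "optimizer": random.choice(LEARN_SPACE["optimizer"]),
    }

def mutate_arch(arch, rate=0.3):
    """Genetic mutation: resample each architectural gene with some probability."""
    child = dict(arch)
    for k, choices in ARCH_SPACE.items():
        if random.random() < rate:
            child[k] = random.choice(choices)
    return child

def evaluate(arch, learn):
    """Placeholder fitness: in practice, train the DCNN defined by `arch`
    with the settings in `learn` and return validation accuracy.
    Here: a deterministic pseudo-score so the sketch runs end to end."""
    rng = random.Random(str(sorted(arch.items())) + str(sorted(learn.items())))
    return rng.random()

def search(generations=10, population=8):
    """Outer genetic loop over architectures; inner loop tunes the learning
    subspace per architecture (random sampling as a stand-in for BO)."""
    pop = [sample_arch() for _ in range(population)]
    best = (None, None, -1.0)
    for _ in range(generations):
        scored = []
        for arch in pop:
            learn = max((sample_learn() for _ in range(5)),
                        key=lambda l: evaluate(arch, l))
            fit = evaluate(arch, learn)
            scored.append((fit, arch, learn))
            if fit > best[2]:
                best = (arch, learn, fit)
        # Keep the fittest half, refill the population with mutants.
        scored.sort(key=lambda t: t[0], reverse=True)
        elite = [a for _, a, _ in scored[: population // 2]]
        pop = elite + [mutate_arch(random.choice(elite))
                       for _ in range(population - len(elite))]
    return best

if __name__ == "__main__":
    arch, learn, fit = search()
    print(arch, learn, round(fit, 3))
```

The key design point the sketch illustrates is the decoupling: the expensive combinatorial architecture choices and the cheaper-to-model continuous/categorical learning settings are searched by different strategies suited to each subspace, rather than by one method over the joint space.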