In the second phase of the Neural Search project, we developed a new ResNet-based Neural Search model that extends the Quantum Neural Search framework with ResNet-inspired Fitness and Transition blocks. We integrated this improved architecture into the AlphaZero framework for the game of Go and trained it within a reinforcement learning loop. The updated model demonstrated improved training stability and stronger exploration than the baseline model (without Neural Search). We also conducted hyperparameter tuning to further improve training stability and performance.
The code for the project is in this public repository.
We designed a new ResNet-Based Neural Search that builds on the Quantum Neural Search framework but incorporates ResNet-inspired Fitness and Transition blocks.
def transition_resnet(x):
    out = conv_block1(x)    # extract initial features
    out = conv_block2(out)  # expand each feature into several candidates
    out = out + tile(x)     # add the residual to each expanded output
    return F.relu(out)
The code for the ResNet-based Transition function is here: transition_function.py
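As a concrete sketch, the Transition block above can be written as a PyTorch module roughly as follows. The class structure, channel counts, and the expansion factor are illustrative assumptions made for this sketch; the actual implementation is in transition_function.py.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransitionResNetBlock(nn.Module):
    """Expands each input state into several candidate next states,
    with a ResNet-style skip connection (shapes are illustrative)."""

    def __init__(self, in_channels, expand_factor=4):
        super().__init__()
        self.expand_factor = expand_factor
        # conv_block1: extract initial features at the same channel width
        self.conv_block1 = nn.Sequential(
            nn.Conv2d(in_channels, in_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(in_channels),
            nn.ReLU(),
        )
        # conv_block2: expand each feature into `expand_factor` candidates
        self.conv_block2 = nn.Conv2d(
            in_channels, in_channels * expand_factor, kernel_size=3, padding=1
        )

    def forward(self, x):
        out = self.conv_block1(x)    # extract initial features
        out = self.conv_block2(out)  # expand each feature
        # add the residual to each expanded candidate by tiling the input
        out = out + x.repeat(1, self.expand_factor, 1, 1)
        return F.relu(out)
```

With an 8-channel 19×19 board state and an expansion factor of 4, a `(2, 8, 19, 19)` input becomes a `(2, 32, 19, 19)` output, i.e. four candidate next states stacked along the channel dimension.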
The Fitness function mirrors the structure of the Transition function. Instead of expanding states, however, it evaluates the candidates produced by the Transition function and assigns a score to each pixel, which guides the search by allowing it to combine pixels from the current state and the candidate next states.
The code for the ResNet-based Fitness function is here: fitness_function.py
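One way to realize that per-pixel combination is a gating mechanism: the block scores each candidate pixel and uses the score to mix current and candidate values. The gating formulation, class names, and shapes below are assumptions for illustration; the actual code is in fitness_function.py.

```python
import torch
import torch.nn as nn

class FitnessResNetBlock(nn.Module):
    """Scores each candidate pixel and mixes current and candidate states
    per pixel via a sigmoid gate (structure is an assumption)."""

    def __init__(self, in_channels, expand_factor=4):
        super().__init__()
        self.expand_factor = expand_factor
        channels = in_channels * expand_factor
        # feature extraction over the stacked candidates
        self.conv_block1 = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
        )
        # one fitness score per candidate pixel
        self.score = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, current, candidates):
        # per-pixel fitness score for every candidate, squashed to [0, 1]
        gate = torch.sigmoid(self.score(self.conv_block1(candidates)))
        # tile the current state so it lines up with each candidate
        tiled = current.repeat(1, self.expand_factor, 1, 1)
        # take a per-pixel combination of current and candidate states
        return gate * candidates + (1 - gate) * tiled
```

A pixel whose gate is near 1 keeps the candidate value, while a gate near 0 falls back to the current state, so each output state can mix pixels from both.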
Our ResNet-based Search is constructed using a combination of the Transition and Fitness blocks. The search module iterates multiple times to expand and evaluate candidate states efficiently.
search = nn.Sequential(*[
    quantum_neural_search(
        transition=TransitionResNetBlock(num_input_filters, num_output_filters, input_size),
        fitness=FitnessResNetBlock(num_input_filters, num_output_filters, input_size),
    )
    for _ in range(num_search)
])
where