Results for training and inference time of the models:
Overview of inference time for small and medium methods
The table reports the total inference time in seconds, along with the average time per patch and per bug, for different beam sizes, for both small and medium methods (one sheet each).
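The per-patch and per-bug averages in the table are derived from the total inference time by dividing by the number of items. A minimal sketch of that calculation is below; the function name and the numbers used are hypothetical placeholders, not the actual measurements:

```python
def average_time(total_seconds: float, num_items: int) -> float:
    """Average inference time per item (a patch or a bug)."""
    return total_seconds / num_items

# Made-up example: 1200 s of total inference time over 400 patches.
print(average_time(1200.0, 400))  # 3.0 seconds per patch
```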
Training the models on the small and medium BFPs took 6 and 15 hours respectively, running on a server with three consumer-level GPUs.