The documentation states that influence trimming can be used "to reduce the computation time for boosted models without substantially losing accuracy". By default, the weight_trim_rate parameter is 0.95. After disabling influence trimming by setting that parameter to 0, I actually see a large speed-up. With a dataset of 262144 samples I get a 5x speed-up; with a dataset ten times larger, a 3x speed-up. This is the opposite of the expected behavior: trimming should make training faster, not slower. Can anyone explain why this might be happening? Thanks!
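
For reference, here is a minimal timing sketch of what I'm doing, assuming this is OpenCV's cv::ml::Boost (the weight_trim_rate parameter with a 0.95 default matches that API); the synthetic data below is illustrative, not my actual dataset:

```python
import time
import numpy as np
import cv2

# Synthetic two-class problem; sizes are illustrative only.
n = 262144
X = np.random.rand(n, 10).astype(np.float32)
y = (X[:, 0] > 0.5).astype(np.int32)

for rate in (0.95, 0.0):           # 0.95 = default trimming, 0 = trimming disabled
    boost = cv2.ml.Boost_create()
    boost.setWeightTrimRate(rate)  # setter for the weight_trim_rate parameter
    t0 = time.time()
    boost.train(X, cv2.ml.ROW_SAMPLE, y)
    print(f"weight_trim_rate={rate}: {time.time() - t0:.1f}s")
```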