Conditionals in nn layers introduce transient read/write variables that disable fused operators such as CP relu_maxpooling_backward, which in turn forces redundant execution of the sparsity-introducing sel+ operator. This operator causes an unnecessary dense-to-sparse-to-dense conversion and becomes the heavy hitter after the native BLAS change. Note: some fused operators, such as CP relu_maxpooling, are still applied because there is no conditional between those layers.
Without conditionals in the dropout layer: https://github.com/apache/incubator-systemml/blob/master/scripts/nn/layers/dropout.dml#L49-L53
With a conditional, sel+ shows up as a heavy hitter:
[~firstname.lastname@example.org] Since you created the layers, I think you should decide how best to restructure the DML. My recommendation would be to split a layer into two separate layers wherever a conditional would otherwise be needed.
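To make the recommendation concrete, here is a minimal sketch of the two-layer restructuring, using a hypothetical dropout-style layer. The function and variable names are illustrative only, not the actual nn layer code; the point is that each variant is conditional-free, and the caller picks the variant, so operator fusion is not broken inside the layer:

```dml
# Hypothetical single layer: the conditional introduces transient
# read/write variables and blocks fused operators.
forward_with_conditional = function(matrix[double] X, double p)
    return (matrix[double] out) {
  if (p < 1.0) {
    mask = rand(rows=nrow(X), cols=ncol(X), min=0, max=1) <= p
    out = X * mask / p
  } else {
    out = X
  }
}

# Restructured variant 1: always applies the mask, no conditional.
forward_dropout = function(matrix[double] X, double p)
    return (matrix[double] out) {
  mask = rand(rows=nrow(X), cols=ncol(X), min=0, max=1) <= p
  out = X * mask / p
}

# Restructured variant 2: identity pass-through for the p == 1 case.
forward_identity = function(matrix[double] X)
    return (matrix[double] out) {
  out = X
}
```

The caller (e.g. the network script) then decides once, outside the layer, which variant to invoke, instead of every forward call re-evaluating the branch inside the layer body.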