All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the feature maps remain unchanged, so the convolutions inside a dense block all use a stride of one. Pooling layers are instead inserted between dense blocks.
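To make this concrete, below is a minimal sketch of such a layer in PyTorch. The framework choice and the names (DenseLayer, DenseBlock, growth_rate) are illustrative assumptions, not taken from the text: each layer applies batch normalization, ReLU, and a stride-1 convolution whose padding preserves height and width, which is what makes the channel-wise concatenation valid; pooling is then applied between blocks rather than inside them.

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """BN -> ReLU -> 3x3 conv with stride 1, then concatenate with the input."""
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.relu = nn.ReLU(inplace=True)
        # stride=1 and padding=1 keep the spatial dimensions unchanged
        self.conv = nn.Conv2d(in_channels, growth_rate, kernel_size=3,
                              stride=1, padding=1, bias=False)

    def forward(self, x):
        out = self.conv(self.relu(self.bn(x)))
        # channel-wise concatenation works because H and W still match
        return torch.cat([x, out], dim=1)

class DenseBlock(nn.Module):
    """Stack of DenseLayers; channel count grows by growth_rate per layer."""
    def __init__(self, in_channels, growth_rate, num_layers):
        super().__init__()
        layers = []
        channels = in_channels
        for _ in range(num_layers):
            layers.append(DenseLayer(channels, growth_rate))
            channels += growth_rate
        self.block = nn.Sequential(*layers)

    def forward(self, x):
        return self.block(x)

# Pooling is applied between dense blocks, where downsampling is allowed
between_block_pool = nn.AvgPool2d(kernel_size=2, stride=2)
```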