This answer quotes ChatGPT

Yes, in a convolutional neural network (CNN), the weights within a convolutional kernel are shared, both across positions of the same feature map and across different feature maps in a batch. The kernel applies the same weights at every position it convolves, so it extracts the same kind of feature everywhere. Consequently, when the same kernel is applied to different feature maps of the same batch, its weights are shared, which reduces the parameter count and improves the efficiency and generalization ability of the model.

In the feature-map stack you mentioned, suppose the convolution kernel has shape (Cout, Cin, kh, kw), where Cout and Cin are the numbers of output and input channels, and kh and kw are the height and width of the kernel. Each feature map in the stack is then convolved with this same kernel, so all of them share the same weights.
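To make the sharing concrete, here is a minimal NumPy sketch (the function name `conv2d` and the specific shapes are just illustrative, not from any particular framework): one weight tensor of shape (Cout, Cin, kh, kw) is applied at every spatial position of every image in the batch, so convolving a single image gives exactly the same result as convolving it inside the batch.

```python
import numpy as np

def conv2d(batch, weight):
    """Naive 'valid' convolution: batch (N, Cin, H, W), weight (Cout, Cin, kh, kw)."""
    N, Cin, H, W = batch.shape
    Cout, _, kh, kw = weight.shape
    out = np.zeros((N, Cout, H - kh + 1, W - kw + 1))
    for n in range(N):                      # every image in the batch...
        for co in range(Cout):
            for i in range(H - kh + 1):
                for j in range(W - kw + 1):
                    # ...uses the same weight tensor at every position
                    out[n, co, i, j] = np.sum(
                        batch[n, :, i:i + kh, j:j + kw] * weight[co]
                    )
    return out

rng = np.random.default_rng(0)
batch = rng.standard_normal((4, 3, 8, 8))    # N=4 images, Cin=3 channels
weight = rng.standard_normal((2, 3, 3, 3))   # Cout=2, kh=kw=3: one shared kernel

out = conv2d(batch, weight)
# The same kernel applied to one image alone matches its result inside
# the batch, because the weights are shared across all inputs.
print(np.allclose(out[:1], conv2d(batch[:1], weight)))  # True
```

Note that the loops never index `weight` by `n`, `i`, or `j`: that is the weight sharing, and it is why a convolutional layer needs only Cout·Cin·kh·kw parameters regardless of batch or image size.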