The main reason NumPy isn't used directly in neural-network implementations is that it has no native GPU support. The tensor types in PyTorch and TensorFlow have mature GPU (and TPU) backends while still covering a good portion of NumPy's ndarray API. There is recent work to put NumPy-style code on the same footing for deep learning; see
https://github.com/google/jax
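As a minimal sketch of what JAX offers: you write ordinary NumPy-style code against `jax.numpy`, and JAX can run it on GPU/TPU, JIT-compile it, and differentiate it automatically. The loss function and array values below are made up for illustration.

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # NumPy-style ops: jnp.dot and jnp.mean mirror np.dot / np.mean,
    # but the arrays can live on an accelerator.
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# jax.grad returns a new function that computes d(loss)/d(w)
# by automatic differentiation -- no hand-derived gradient needed.
grad_loss = jax.grad(loss)

x = jnp.array([[1.0, 2.0], [3.0, 4.0]])
y = jnp.array([1.0, 2.0])
w = jnp.zeros(2)

g = grad_loss(w, x, y)
```

The same code runs unchanged on CPU, GPU, or TPU depending on how JAX is installed, which is exactly the footing plain NumPy lacks.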