I have searched the documentation, but I couldn't find the answer. Does spaCy use ReLU, Softmax, or both as its activation function?
Thanks
By default, spaCy uses both, as shown on the layers and model architectures page for spaCy 3.0:
https://spacy.io/usage/layers-architectures
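As a minimal sketch of what that page illustrates, here is how both activations can appear in one model built with Thinc (the ML library spaCy 3.x is built on): ReLU in the hidden layers and a Softmax output layer. The widths and layer count below are illustrative assumptions, not spaCy's exact defaults.

```python
from thinc.api import chain, Relu, Softmax

n_hidden = 64   # hypothetical hidden width
n_classes = 10  # hypothetical number of output classes

# ReLU activations in the hidden layers, Softmax over the output classes.
model = chain(
    Relu(nO=n_hidden, dropout=0.2),
    Relu(nO=n_hidden, dropout=0.2),
    Softmax(nO=n_classes),
)
```

Roughly speaking, the built-in components follow the same pattern: non-linear activations (ReLU, Maxout, etc.) in the encoder's hidden layers, and a Softmax layer where a component such as the tagger needs a probability distribution over classes.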