Reading the TensorFlow text summarization (textsum) model's documentation, it states: "The results described below are based on model trained on multi-gpu and multi-machine settings. It has been simplified to run on only one machine for open source purpose."
Further on in the guide, this command is invoked:
bazel build -c opt --config=cuda textsum/...
Doesn't this command relate to CUDA/GPU? And why is the command truncated?
This is a Bazel command: --config=cuda means "use the CUDA build configuration (and generate GPU-compatible code)", and textsum/... means "all targets under the directory textsum". In other words, the command isn't truncated; the ... is Bazel's wildcard syntax for "all targets under this package and its subpackages", and you should type it literally when entering the command.
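As a quick illustration (a minimal sketch; the exact targets under textsum depend on the version of the models repository you checked out), you can ask Bazel to expand the wildcard yourself, and you can drop --config=cuda for a CPU-only build if no CUDA toolchain is installed:

# List every target that the "textsum/..." wildcard expands to.
bazel query 'textsum/...'

# Build all of those targets with the CUDA configuration, as in the guide.
bazel build -c opt --config=cuda textsum/...

# CPU-only variant: same targets, but without the CUDA build configuration.
bazel build -c opt textsum/...

The built binaries end up under bazel-bin/, which is where the rest of the guide picks up.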