LightSeq is built using Docker and trtis containers from NVIDIA GPU Cloud (NGC). Before building, you must install Docker and nvidia-docker and log in to the NGC registry.
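As a quick reference, logging in to the NGC registry typically looks like the sketch below; the username is the literal string $oauthtoken and the password is your own NGC API key (shown here only as a placeholder).
# log in to the NGC registry (nvcr.io)
# Username: $oauthtoken      (use this literal string)
# Password: <your NGC API key>
docker login nvcr.io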
First, build the Docker image that serves as the trtis build environment.
cur_dir=$(pwd)
git clone https://github.com/NVIDIA/tensorrt-inference-server.git
cd tensorrt-inference-server && git checkout r19.05
docker build -t tensorrtserver_build --target trtserver_build .
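Building this image can take a while. As an optional sanity check, you can confirm the image was created before moving on:
# the freshly built image should appear in the local image list
docker images | grep tensorrtserver_build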
Now start the container and mount LightSeq into it.
cd ${cur_dir}
git clone https://github.com/bytedance/lightseq.git
cp -r ./lightseq ./tensorrt-inference-server/src/custom/lightseq
docker run --gpus all -it --rm -v ${cur_dir}/tensorrt-inference-server/src:/workspace/src tensorrtserver_build
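Once inside the container, you can optionally verify that the LightSeq sources were mounted where the build expects them (the path follows the -v mapping in the docker run command above):
# inside container: LightSeq sources should be visible under /workspace/src/custom
ls /workspace/src/custom/lightseq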
Finally, build LightSeq inside the container.
# inside container
cd /workspace
# Remove compute capability 5.2 from the build targets, since it lacks fp16 support
sed -i '/COMPUTE_CAPABILITIES/s/5.2,6.0,6.1,7.0,7.5/6.0,6.1,7.0,7.5/g' ./.bazelrc
bazel build -c opt src/custom/lightseq/...
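After the build finishes, the compiled custom backend should land under Bazel's output tree. The exact library name depends on the LightSeq build targets, so the listing below is indicative rather than exact:
# list the build outputs (library names may vary with the build targets)
ls bazel-bin/src/custom/lightseq/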