Updated README.md and linked to the published article.
"Covid-19 AI demo in all-Docker" deployment including dockerised Flask, FastAPI, Tensorflow Serving and HA Proxy etc etc.
Full documentation is published here at Deploy ML/DL models into a consolidated AI demo service stack.
As a jump start, we can simply use docker-compose to deploy the following dockerised components onto an AWS Ubuntu server.
Please refer to section "Dockerised Components" in the full documentation.
Default start-up: `docker-compose up -d`
Scale-up start-up: `docker-compose up --scale fastapi=2 --scale flask=2 -d`
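For orientation, a minimal `docker-compose.yml` sketch that would support the scale-up command above might look like the following. This is an assumption-based illustration, not the project's actual file: image names, build contexts, ports, and the model name are all placeholders; only the service names match the components listed.

```yaml
# Hypothetical sketch - images, ports, and paths are assumptions.
version: "3"
services:
  tfserving:
    image: tensorflow/serving
    volumes:
      - ./models:/models          # exported TensorFlow models (see note below)
    environment:
      - MODEL_NAME=covid19        # assumed model name
  fastapi:
    build: ./fastapi              # assumed build context
    depends_on:
      - tfserving
  flask:
    build: ./flask                # assumed build context
    depends_on:
      - tfserving
  haproxy:
    image: haproxy
    ports:
      - "80:80"                   # single public entry point
    depends_on:
      - fastapi
      - flask
```

Note that for `--scale fastapi=2 --scale flask=2` to work, the scaled services must not declare a fixed `container_name` or host port mapping; HAProxy then load-balances across the replicas.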
Sample application on AWS. Note: this service runs on a temporary AWS address and is not up 24/7.
Please see section "2. Test demo APIs" within full documentation on section "Dockerised Components"
Please see section "3. Benchmark-test demo APIs" within full documentation on section "Dockerised Components"
Full documentation is published here.
The exported TensorFlow models are omitted from the models directory, since they are ~250 MB (larger than GitHub's 100 MB file-size limit). I will upload these large files separately.
The models are exported from the Jupyter pipelines here and here.