I don't really know anything about NLP, so I can't comment on the analysis quality. What I meant to say is that a full-on Linux container seems like overkill for distributing what is essentially just a Python app.
If I just wanted to run an instance of this locally to play around with, I'd probably git clone it, create a virtualenv, run "pip install -r requirements.txt" (or "python setup.py develop") to grab the dependencies, and then run whichever Python script starts the web server.
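Concretely, the workflow I have in mind would look something like this. The repo URL and the server script name are placeholders, since I haven't looked at the actual project layout:

```shell
# Hypothetical repo URL -- substitute the project's actual one.
git clone https://github.com/example/the-app.git
cd the-app

# Isolate dependencies in a virtualenv instead of a container.
python -m venv .venv
source .venv/bin/activate

# Either install the pinned dependencies...
pip install -r requirements.txt
# ...or install the package itself in editable mode.
pip install -e .

# Start the web server (script name is a guess).
python run_server.py
```

That's maybe six commands, which is why the container feels heavyweight to me for this use case.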
And if I wanted to deploy the app to EC2, I'd probably just grab an AMI and launch an instance from it.