| Project Domains | Mentors | Project Difficulty |
| --- | --- | --- |
| NLP, Deep Learning, Transformers, Model Deployment, Pre-processing | Rakshita Kowlikar, Prithvi Tambewagh | Hard |
Project Description
Turn a pre-trained T5 model into a grammar-fixing engine:
- Clean error-annotated sentences from FCE or Lang-8 and pair each original sentence with its corrected version.
- Fine-tune T5 with Hugging Face Transformers on Google Colab GPUs.
- Measure quality with GLEU, precision, recall, and F1.
- Wrap the model in a lightweight FastAPI service so users can POST text and get corrected output.
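Of the metrics above, GLEU is the one specific to grammar correction: it rewards n-grams shared with the reference and penalises n-grams the hypothesis copied from the source that the reference removed. A minimal pure-Python sketch of a simplified sentence-level GLEU (an illustration of the idea, not the official scorer):

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def sentence_gleu(source, reference, hypothesis, max_n=4):
    """Simplified sentence-level GLEU: BLEU-style clipped n-gram precision
    against the reference, minus a penalty for n-grams that match the
    source but not the reference (i.e. errors the system failed to fix)."""
    src, ref, hyp = source.split(), reference.split(), hypothesis.split()
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams = ngrams(hyp, n)
        if not hyp_ngrams:
            continue
        ref_ngrams = ngrams(ref, n)
        src_ngrams = ngrams(src, n)
        # n-grams shared with the reference (clipped counts, as in BLEU)
        match = sum(min(c, ref_ngrams[g]) for g, c in hyp_ngrams.items())
        # penalty: n-grams kept from the source that the reference removed
        leftover = src_ngrams - ref_ngrams
        penalty = sum(min(c, leftover[g]) for g, c in hyp_ngrams.items())
        precisions.append(max(0, match - penalty) / sum(hyp_ngrams.values()))
    if not precisions or min(precisions) == 0:
        return 0.0
    # BLEU-style brevity penalty against the reference length
    bp = 1.0 if len(hyp) >= len(ref) else math.exp(1 - len(ref) / len(hyp))
    return bp * math.exp(sum(math.log(p) for p in precisions) / len(precisions))
```

A perfect correction scores 1.0, while an uncorrected copy of the source is driven toward 0 by the penalty term.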
Optional extras include batch endpoints, ONNX/quantized inference, and WandB experiment tracking.
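Precision, recall, and F1 in grammar correction are usually computed over individual correction edits rather than whole sentences. A minimal sketch, assuming each edit is represented as a `(start, end, replacement)` tuple over the source tokens (the tuple format is an illustrative assumption, not a fixed standard):

```python
def edit_prf(system_edits, gold_edits):
    """Edit-level precision, recall, and F1 (simplified): a system edit
    counts as a true positive only if it exactly matches a gold edit."""
    sys_set, gold_set = set(system_edits), set(gold_edits)
    tp = len(sys_set & gold_set)
    precision = tp / len(sys_set) if sys_set else 1.0
    recall = tp / len(gold_set) if gold_set else 1.0
    if precision + recall == 0:
        return 0.0, 0.0, 0.0
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```

For example, a system that proposes two edits where only one matches the single gold edit gets precision 0.5, recall 1.0, and F1 about 0.67.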