Abstract: This paper presents a comprehensive Machine Learning Operations (MLOps) framework tailored for neural style transfer, focusing on modularity, maintainability, and scalability. Leveraging deep learning models, particularly VGG16, and inspired by seminal works such as "A Learned Representation for Artistic Style," our framework integrates modern MLOps principles to improve development processes and reproducibility. The implementation uses PyTorch for the neural networks, FastAPI for the backend, and MLflow, DVC, and DagsHub for detailed experiment tracking and version control. The frontend is developed with Streamlit for user-friendly interaction, while Docker ensures deployment portability. Continuous integration and deployment are managed via GitHub Actions, with AWS ECS and Fargate providing scalability and reliability. Terraform is employed for Infrastructure as Code, enhancing system architecture agility. This end-to-end approach aims to improve model performance, streamline pipelines, and uphold reproducibility and sustainability in neural style transfer applications. Our integrated MLOps framework demonstrates significant potential for advancing neural style transfer technology.

Keywords: Machine Learning Operations (MLOps), neural style transfer, deep learning, model deployment, reproducibility, scalability, maintainability


DOI: 10.17148/IJARCCE.2024.13550
