Docker has become a popular tool for developers around the world to deploy applications in a reproducible and robust way. Docker and Docker Compose have reduced the time required to set up new technologies and introduce complex technology stacks into our applications. If you’re working on DS, ML, or scientific computing, then this video is for you. It covers best practices for creating Docker containers for data-intensive applications, from optimizing your image build to ensuring that your containers are stable and effective in deployment workflows. The audience will leave the video confident that Docker can be applied across a variety of DS, ML, and research projects.
Once you are done writing the code for an application, you still need to build the infrastructure to launch it successfully. Doing so, however, can be a complicated procedure. The easiest route is often a cloud platform or PaaS (Platform as a Service) such as Heroku, one of the first cloud platforms. Another widely used approach today is containerization with Docker. Docker has become a standard tool for developers around the world to deploy applications in a reproducible and robust manner.
Much like a cloud platform, one of Docker's greatest advantages is how effectively it reduces the time needed to set up new software and assemble complex technology stacks for applications. In this video, presented by Tania Allard, we take a deeper look at Docker and how to properly deploy your applications with it. The video also covers best practices for building Docker containers for data-intensive applications, from optimizing your image build to ensuring that your containers are secure, and more. To learn more about Docker and how to use it for application deployment, feel free to watch the video below.
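To give a flavor of the image-build optimizations and security practices the talk covers, here is a minimal Dockerfile sketch for a data-science application. The package file and entry-point names (`requirements.txt`, `train.py`) are illustrative assumptions, not taken from the video:

```dockerfile
# Pin a slim, versioned base image for smaller, reproducible builds
FROM python:3.11-slim

# Create a non-root user so the container does not run as root
RUN useradd --create-home appuser
WORKDIR /home/appuser/app

# Copy and install dependencies first so this layer stays cached
# until requirements.txt changes (illustrative file name)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code last; code edits invalidate fewer cached layers
COPY . .

# Drop root privileges before running the workload
USER appuser
CMD ["python", "train.py"]
```

Pinning the base image, ordering `COPY` instructions from least to most frequently changed, and switching to a non-root user are common recommendations for keeping data-intensive images small, cacheable, and more secure.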