Having uploaded the files to blob storage, I checked that they could be read from there, which was as easy as pointing pandas at the file's endpoint URL and reading it straight into a dataframe. Writing a file back, however, was not as straightforward. Eventually I discovered the correct way to do it, which is to use the Python Azure SDK to access the files. Microsoft has updated its libraries several times over the last few years, so it took a while to find an approach that worked – many of the solutions I came across simply didn't work. This meant constantly rebuilding the container, as installing the Azure libraries kept breaking the build process.
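Something along these lines illustrates the two halves – reading is just a URL, writing goes through the SDK. The account, container, and blob names below are placeholders, and it assumes the current azure-storage-blob (v12) package:

```python
import io

import pandas as pd
from azure.storage.blob import BlobServiceClient

# Reading: pandas can load a CSV directly from the blob's endpoint URL
# (assuming the container is public or the URL carries a SAS token).
df = pd.read_csv("https://myaccount.blob.core.windows.net/mycontainer/data.csv")

# Writing: serialise the dataframe to an in-memory buffer and upload it
# with the SDK's blob client.
service = BlobServiceClient.from_connection_string("<connection string>")
blob = service.get_blob_client(container="mycontainer", blob="output.csv")

buffer = io.StringIO()
df.to_csv(buffer, index=False)
blob.upload_blob(buffer.getvalue(), overwrite=True)
```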

However, I learnt a lot about Docker and Azure during this time. Rather than deleting the image and rebuilding from scratch each time, I discovered that when you make a change and rebuild, Docker only rebuilds the layers of the image that are affected. The same goes for pushing the image to Azure – only the changed layers are uploaded. I also discovered a way to run the scheduled task manually, which meant I didn't have to keep waiting for the next scheduled run. I can see myself using Docker containers much more in the future, and they are definitely something I will be learning more about.
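Roughly, the cycle looks like this – the registry, image, and resource-group names are placeholders, and the last command assumes the scheduled job runs as an Azure Container Instances container group:

```bash
# Rebuild the image – Docker's layer cache means only the layers after
# the change are actually rebuilt.
docker build -t myregistry.azurecr.io/my-task:latest .

# Push to Azure Container Registry – likewise, only the changed layers
# are uploaded.
az acr login --name myregistry
docker push myregistry.azurecr.io/my-task:latest

# Kick off the task by hand instead of waiting for the next scheduled run
# (assuming it runs as an Azure Container Instances container group).
az container start --name my-task --resource-group my-resource-group
```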
