On my own machine I have now split the app into two parts, which talk to each other by reading and writing the same files. However, I need to work out the best way to do this on Azure with containers.
I experimented with FastAPI, giving the backend an endpoint that the frontend could call. It works absolutely fine and would be a great solution except for one thing: the backend container would need to be running at all times so that it could be reached. That costs considerably more than spinning up a container once a day to update the data and then shutting it down again.
I therefore looked at Azure File Storage, which seemed simple and cheap. You upload the CSV files and Pandas can read them easily. However, working out how to get the backend to write files to Azure File Storage proved surprisingly difficult. I considered a database, but at the current small scale its cost compared to file storage put me off.
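For what it's worth, a sketch of the write side using the `azure-storage-file-share` package looks something like this. The connection string, share name, and file path are placeholders, and the SDK import is kept inside the function so the serialisation half runs without it.

```python
# Sketch of writing a Pandas DataFrame to Azure File Storage.
# Assumes the azure-storage-file-share package; all names are placeholders.
import io
import pandas as pd

def upload_dataframe(df: pd.DataFrame, conn_str: str, share: str, path: str) -> None:
    # Imported here so the rest of the module works without the SDK installed
    from azure.storage.fileshare import ShareFileClient

    data = df.to_csv(index=False).encode("utf-8")
    client = ShareFileClient.from_connection_string(
        conn_str, share_name=share, file_path=path
    )
    client.upload_file(data)  # creates the file on the share and writes the bytes

# The serialisation half can be checked locally: a CSV written to memory
# reads back unchanged, which is exactly what the frontend's read side sees.
df = pd.DataFrame({"ticker": ["ABC", "XYZ"], "price": [1.5, 2.5]})
csv_bytes = df.to_csv(index=False).encode("utf-8")
roundtrip = pd.read_csv(io.BytesIO(csv_bytes))
```

The read side is then just `pd.read_csv` pointed at the share, which is the part that already worked easily.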
I then considered ways to run the backend on a schedule. Threading in Python works, and you can easily schedule a script to run, but the container itself still has to stay up. I tried creating a cron job inside the container to run the script, but this proved difficult. I then discovered Azure Container Registry Tasks, which are set up through the Azure CLI: rather than creating a Container Instance, you create a task in the registry where the image is stored and give it a CRON schedule. I got this running successfully, but was still left with the problem of how data would be shared between the frontend and backend.
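For reference, creating a scheduled ACR task looks roughly like this. The task name, registry name, and image are placeholders, and the schedule is a standard CRON expression (here, 06:00 UTC daily); this is a sketch of the provisioning command, not the exact setup used.

```shell
# Create an ACR task that runs a stored container image on a CRON schedule.
# "dailyupdate", "myregistry", and the image name are hypothetical.
az acr task create \
  --name dailyupdate \
  --registry myregistry \
  --cmd myregistry.azurecr.io/backend-updater:latest \
  --schedule "0 6 * * *" \
  --context /dev/null
```

The `--context /dev/null` tells ACR there is no source repository to build from, so the task simply runs the named image on the timer instead of rebuilding it.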