I have a web application that runs unit tests over our data, and I want to deploy it as an Azure Web Site.
The problem is that this app downloads fairly large zip files (~50MB, about 500 files inside), extracts them, and runs tests over the extracted files.
Where should I save these large files on Azure Web Sites, and where should I extract them? On localhost I've been using Path.GetTempPath(), but on Azure Web Sites it reports that there is no space in that folder, even though my site has 1000MB of storage in total with about 990MB free.
Is there any way to use these 1000MB for my file operations?
If that isn't possible, should I use Azure Blob Storage for the extracted files?


1 Answer

In the case of Web Sites, when your storage requirements fit within the constraints of the provided local storage, you can certainly use local storage.

However, Path.GetTempPath() is not your best choice for an Azure Web Site. I would put all the files in a folder under your web app's root folder, i.e. Server.MapPath("~/tmp/"). Make sure to create the folder first if it doesn't exist. There you can use all the storage you have.
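A minimal sketch of that approach (assuming a System.Web / ASP.NET app; the folder name, zip URL and file names are just placeholders):

    using System.IO;
    using System.IO.Compression;   // ZipFile is in System.IO.Compression.FileSystem
    using System.Net;
    using System.Web;

    public static class TestFileStore
    {
        // Resolve a working folder under the web app root and make sure it exists.
        public static string GetWorkFolder()
        {
            string folder = HttpContext.Current.Server.MapPath("~/tmp/");
            Directory.CreateDirectory(folder);   // no-op if the folder is already there
            return folder;
        }

        // Download the zip into the app-root folder and extract it alongside.
        public static string DownloadAndExtract(string zipUrl)
        {
            string folder = GetWorkFolder();
            string zipPath = Path.Combine(folder, "data.zip");
            string extractPath = Path.Combine(folder, "extracted");

            using (var client = new WebClient())
            {
                client.DownloadFile(zipUrl, zipPath);
            }

            if (Directory.Exists(extractPath))
            {
                Directory.Delete(extractPath, recursive: true);   // start from a clean folder
            }
            ZipFile.ExtractToDirectory(zipPath, extractPath);

            return extractPath;
        }
    }

This way the downloaded and extracted files count against the site's 1000MB quota rather than the much smaller temp area.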

As for Blob storage - you would have to unzip the archive and upload each file separately to its own blob, and whenever you need to work with the files, you would have to download them again. I don't believe this is a real solution as long as you have enough local storage to use.
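If you do end up going the Blob route anyway, the pattern with the classic WindowsAzure.Storage SDK would look roughly like this (the connection string, container name and folder path are placeholders):

    using System.IO;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    public static class BlobUploader
    {
        // Upload every extracted file as a separate block blob.
        public static void UploadFolder(string connectionString, string containerName, string folderPath)
        {
            CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
            CloudBlobClient client = account.CreateCloudBlobClient();
            CloudBlobContainer container = client.GetContainerReference(containerName);
            container.CreateIfNotExists();

            foreach (string filePath in Directory.GetFiles(folderPath, "*", SearchOption.AllDirectories))
            {
                CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(filePath));
                using (var stream = File.OpenRead(filePath))
                {
                    blob.UploadFromStream(stream);   // one round trip per file
                }
            }
        }
    }

Every upload, and later every download, is a separate network round trip, which is exactly why local storage is the simpler option here.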

