
Azure Website Slow on First Request part II

Earlier this year I wrote about how to mimic the “always-on” feature even in a free Azure Website, by running a custom webjob that would periodically request the site, thus keeping it in memory and always fast and responsive.

A friendly reader notified me last week that my original solution no longer works, as Microsoft no longer lets the site stay in memory even if it frequently receives client requests. I tested that over the last week and it is indeed true: now only requests to the site from the Azure portal or from the Kudu (scm) interface will keep it in memory. What a bummer.

But fear not, I have made a new webjob that will enable you to keep your free Azure Website in memory. I have had it running for three days without interruption now, so it works exactly as my first webjob did when I released it.

The solution is simple: instead of requesting a publicly available page on our site, you periodically request the Kudu (scm) portal with your username and password. This way you trick Azure into thinking that you are accessing the portal, which stops it from unloading your site.

The code looks like this:

using System;
using System.Configuration;
using System.Diagnostics;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

namespace SJKP.AzureKeepWarm
{
    class Program
    {
        static void Main(string[] args)
        {
            var runner = new Runner();
            var siteUrl = ConfigurationManager.AppSettings["SiteUrl"];
            var waitTime = int.Parse(ConfigurationManager.AppSettings["WaitTime"]);
            runner.HitSite(siteUrl, waitTime).Wait();
        }

        private class Runner
        {
            // Authenticate against the Kudu (scm) portal with your deployment credentials.
            private HttpClient client = new HttpClient(new HttpClientHandler()
            {
                Credentials = new NetworkCredential("[Your-WebsiteName]", "[YourPassWord]")
            });

            public async Task HitSite(string siteUrl, int waitTime)
            {
                while (true)
                {
                    try
                    {
                        var request = await client.GetAsync(new Uri(siteUrl));
                        Trace.TraceInformation("{0}: {1}", DateTime.Now, request.StatusCode);
                    }
                    catch (Exception ex)
                    {
                        Trace.TraceError("{0}: {1}", DateTime.Now, ex.Message);
                    }
                    await Task.Delay(waitTime * 1000);
                }
            }
        }
    }
}

In the above code you have to change [Your-WebsiteName] to the name of your website, prepended with $. So in my case, where the website is named statsofpoe, the name becomes $statsofpoe. [YourPassWord] needs to be replaced with the password you find in your publish profile file, where you take the value of userPWD.

Finally you need to change the value of the SiteUrl app setting to point to your site's Kudu (scm) portal; in my case I use a deeplink into it.
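For reference, the two app settings could be declared like this in the webjob's App.config. This is just a sketch: the site name and interval below are placeholder values you would replace with your own (the Kudu portal of a site lives under the site's .scm.azurewebsites.net host).

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <appSettings>
    <!-- URL into the Kudu (scm) portal of the site you want to keep warm -->
    <add key="SiteUrl" value="https://yoursitename.scm.azurewebsites.net/" />
    <!-- Seconds to wait between requests -->
    <add key="WaitTime" value="300" />
  </appSettings>
</configuration>
```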

That’s it. Now you have a free Azure Website that is as responsive as a paid one with the always-on feature enabled.

Categories: Windows Azure


Simon J.K. Pedersen

7 replies

  1. Legend! This insight is very helpful. I really needed a way to keep my WebJob running, and recursive pings weren’t working.

  2. I’m glad you like it, we will see for how long Microsoft lets this work around go untouched 🙂

  3. Good solution!

    Azure Mobile Services (consuming database data through JS calls) can only be awakened by a JS API call.

    During demos, I am simulating some API calls periodically using JS’s setInterval, keeping a Chrome window open on a remote system to keep the db server awake.

    Azure schedulers can also do GET, POST and PUT http requests, but they can’t do JS calls, so they fail here.

    Please suggest if you have any other solutions in mind

  4. I don’t really understand. A call to a mobile service from JavaScript is also an http request, either POST or more likely GET. Have you tried looking in the network tab of your browser’s developer tools to see what actually goes on when running the code in your setInterval? I would assume that you can fake that request from server-side code pretty easily.

  5. Wondering is this approach still valid? I tried it and got “WebJob is stopping due to website shutting down”. 🙁

  6. Hi Louie,

    Unfortunately, or rather as expected, Microsoft has changed things so that you can’t keep a site alive by calling it from itself. But fear not: you can create a webjob in another site and have that ping your site, the same way as described in the blog post. I have tried running the job to keep all my free sites alive from a site on a standard plan with Always On enabled, and it works fine.
    If you can’t afford a standard site for the purpose, you could just create a page in your web application that you call every 5 minutes or so to keep the webjobs alive. One free way to do this is to use Application Insights and create a web test that you point at the specific page that will open a connection to the Kudu (scm) site.
    If you want to make sure that your approach is working you can go to:
    https://[sitename].scm.azurewebsites.net, open the Process Explorer, and click properties for the w3wp (scm) process to see when it was started.
    Kudu Process Explorer
