I am setting up Azure Front Door (AFD) in front of a public-facing website to improve its global page-speed performance.
For the root documents/HTML I don't have caching enabled (I'm still working through appropriate cache controls, and since the site currently sends no-cache, no-store it wouldn't be worth it yet). I have noticed that the response served through AFD is now missing a Content-Encoding header, and the uncompressed HTML is being called out in Lighthouse scoring.
The origin server is hosted in IIS and runs an ASP.NET Core application. When requesting from the origin server directly I get a Brotli-encoded response back as expected.
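For what it's worth, this is roughly how I've been comparing the two responses (a minimal sketch; the hostnames are placeholders for my origin and AFD endpoint):

// Compare Content-Encoding from the origin vs. the AFD endpoint.
// Automatic decompression is disabled so the header survives inspection.
using System;
using System.Net;
using System.Net.Http;

var handler = new HttpClientHandler { AutomaticDecompression = DecompressionMethods.None };
using var client = new HttpClient(handler);

foreach (var url in new[] { "https://origin.example.com/", "https://site.azurefd.net/" })
{
    using var request = new HttpRequestMessage(HttpMethod.Get, url);
    request.Headers.Add("Accept-Encoding", "br");
    using var response = await client.SendAsync(request);
    var encoding = string.Join(",", response.Content.Headers.ContentEncoding);
    Console.WriteLine($"{url} -> Content-Encoding: {(encoding.Length == 0 ? "(none)" : encoding)}");
}

The origin prints br; the AFD endpoint prints (none).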
I’ve read and think I’ve covered the troubleshooting described here:
https://learn.microsoft.com/en-us/azure/frontdoor/standard-premium/troubleshoot-compression
I have set up my web.config as follows:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <location path="." inheritInChildApplications="false">
    <system.webServer>
      <handlers>
        <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModuleV2" resourceType="Unspecified" />
      </handlers>
      <aspNetCore processPath="dotnet" arguments=".\PublicWebsite.dll" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" hostingModel="inprocess">
        <handlerSettings>
          <handlerSetting name="enableShadowCopy" value="true" />
          <handlerSetting name="shadowCopyDirectory" value="../ShadowCopyDirectory/" />
        </handlerSettings>
      </aspNetCore>
    </system.webServer>
  </location>
  <system.webServer>
    <httpCompression noCompressionForHttp10="false" noCompressionForProxies="false" />
  </system.webServer>
</configuration>
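I haven't shown the application code, but assuming the origin's Brotli encoding comes from the standard ASP.NET Core response compression middleware rather than IIS dynamic compression, the registration looks something like this sketch (simplified, not my exact Program.cs):

// Sketch only: assumes Brotli is produced by the response compression
// middleware, not by IIS itself.
using System.IO.Compression;
using Microsoft.AspNetCore.ResponseCompression;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddResponseCompression(options =>
{
    // Off by default as a BREACH mitigation; required for HTTPS traffic.
    options.EnableForHttps = true;
    options.Providers.Add<BrotliCompressionProvider>();
});
builder.Services.Configure<BrotliCompressionProviderOptions>(options =>
{
    options.Level = CompressionLevel.Fastest;
});

var app = builder.Build();

// Must run early so it wraps the response body stream.
app.UseResponseCompression();

app.MapGet("/", () => Results.Content("<html>...</html>", "text/html"));

app.Run();

Note the middleware only compresses when the incoming request advertises br in Accept-Encoding, so whatever AFD forwards to the origin matters here.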
The HTML document is 10 KB, so it is above any potential 1 KB minimum size for compression (though I would hope that limit isn't in play, given that caching is disabled).
My expectation is that end users receive a Brotli-encoded response, but currently the body arrives raw and uncompressed.