Serving gzipped CSS and JavaScript from Amazon CloudFront via S3

Note: this page is an English rendering of a Stack Overflow Q&A, provided under the CC BY-SA 4.0 license; if you reuse it you must follow the same license and attribute the original authors. Original question: http://stackoverflow.com/questions/5442011/

Tags: javascript, css, amazon-s3, gzip, amazon-cloudfront

Asked by Donald Jenkins

I've been looking for ways of making my site load faster and one way that I'd like to explore is making greater use of Cloudfront.

Because Cloudfront was originally not designed as a custom-origin CDN and because it didn't support gzipping, I have so far been using it to host all my images, which are referenced by their Cloudfront cname in my site code, and optimized with far-future headers.

CSS and JavaScript files, on the other hand, are hosted on my own server, because until now I was under the impression that they couldn't be served gzipped from Cloudfront, and that the gain from gzipping (about 75 per cent) outweighs that from using a CDN (about 50 per cent). Amazon S3 (and thus Cloudfront) did not support serving gzipped content in a standard manner by using the HTTP Accept-Encoding header that browsers send to indicate their support for gzip compression, and so they were not able to gzip and serve components on the fly.

Thus I was under the impression, until now, that one had to choose between two alternatives:

  1. move all assets to the Amazon CloudFront and forget about GZipping;

  2. keep components self-hosted and configure our server to detect incoming requests and perform on-the-fly GZipping as appropriate, which is what I chose to do so far.

There were workarounds to solve this issue, but essentially they didn't work [link].

Now, it seems Amazon Cloudfront supports custom origins, and that it is now possible to use the standard HTTP Accept-Encoding method for serving gzipped content if you are using a custom origin [link].

I haven't so far been able to implement the new feature on my server. The blog post I linked to above, which is the only one I found detailing the change, seems to imply that you can only enable gzipping (bar workarounds, which I don't want to use) if you opt for custom origin, which I'd rather not: I find it simpler to host the corresponding files on my Cloudfront server and link to them from there. Despite carefully reading the documentation, I don't know:

  • whether the new feature means the files should be hosted on my own domain server via custom origin, and if so, what code setup will achieve this;

  • how to configure the CSS and JavaScript headers to make sure they are served gzipped from Cloudfront.

Answered by Skyler Johnson

UPDATE: Amazon now supports gzip compression, so this is no longer needed. Amazon Announcement

Original answer:

The answer is to gzip the CSS and JavaScript files. Yes, you read that right.

gzip -9 production.min.css

This will produce production.min.css.gz. Remove the .gz, upload to S3 (or whatever origin server you're using) and explicitly set the Content-Encoding header for the file to gzip.

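One way to attach that header at upload time is with the AWS CLI; this is just a sketch, and the bucket name, path and cache lifetime are placeholders:

aws s3 cp production.min.css s3://your-bucket/css/production.min.css \
  --content-type "text/css" \
  --content-encoding "gzip" \
  --cache-control "max-age=31536000"

The --content-encoding gzip metadata is what S3 (and therefore CloudFront) will send back to browsers as the Content-Encoding response header.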

It's not on-the-fly gzipping, but you could very easily wrap it up into your build/deployment scripts. The advantages are:

它不是即时的 gzip 压缩,但您可以非常轻松地将其打包到您的构建/部署脚本中。优点是:

  1. It requires no CPU for Apache to gzip the content when the file is requested.
  2. The files are gzipped at the highest compression level (assuming gzip -9).
  3. You're serving the file from a CDN.
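
If you want to fold this into a build/deployment script, a minimal bash sketch might look like the following (the file list, dist/ directory and bucket are assumptions):

mkdir -p dist
for f in production.min.css production.min.js; do
  # write a gzipped copy that keeps the original file name (no .gz suffix)
  gzip -9 -c "$f" > "dist/$f"
  # pick a sensible Content-Type for each extension
  case "$f" in
    *.css) type="text/css" ;;
    *.js)  type="application/javascript" ;;
  esac
  aws s3 cp "dist/$f" "s3://your-bucket/assets/$f" \
    --content-type "$type" \
    --content-encoding "gzip" \
    --cache-control "max-age=31536000"
done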

Assuming that your CSS/JavaScript files are (a) minified and (b) large enough to justify the CPU required to decompress on the user's machine, you can get significant performance gains here.

Just remember: If you make a change to a file that is cached in CloudFront, make sure you invalidate the cache after making this type of change.

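With the AWS CLI, an invalidation for the files you just redeployed might look like this (the distribution ID and paths are placeholders):

aws cloudfront create-invalidation \
  --distribution-id E1234567890ABC \
  --paths "/css/production.min.css" "/js/production.min.js"

Alternatively, putting a version number in the file name (e.g. production.min.v2.css) sidesteps invalidation entirely.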

Answered by Sean

My answer is a take-off on this: http://blog.kenweiner.com/2009/08/serving-gzipped-javascript-files-from.html

Building off Skyler's answer, you can upload a gzipped and a non-gzipped version of the CSS and JS. Be careful with naming and test in Safari, because Safari won't handle .css.gz or .js.gz files.

site.js and site.js.jgz, and site.css and site.gz.css (you'll need to set the Content-Encoding and Content-Type headers correctly to get these to serve right).

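Because .jgz and .gz.css aren't standard extensions, S3 won't guess a useful Content-Type for them, so it's worth setting both headers explicitly at upload time. A sketch with the AWS CLI for the JavaScript pair (bucket and paths are placeholders; the CSS pair is handled the same way):

aws s3 cp site.js s3://your-bucket/js-050/site.js \
  --content-type "application/javascript"
aws s3 cp site.js.jgz s3://your-bucket/js-050/site.js.jgz \
  --content-type "application/javascript" \
  --content-encoding "gzip"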

Then, in your page, put:

<script type="text/javascript">var sr_gzipEnabled = false;</script> 
<script type="text/javascript" src="http://d2ft4b0ve1aur1.cloudfront.net/js-050/sr.gzipcheck.js.jgz"></script> 

<noscript> 
  <link type="text/css" rel="stylesheet" href="http://d2ft4b0ve1aur1.cloudfront.net/css-050/sr-br-min.css">
</noscript> 
<script type="text/javascript"> 
(function () {
    var sr_css_file = 'http://d2ft4b0ve1aur1.cloudfront.net/css-050/sr-br-min.css';
    if (sr_gzipEnabled) {
      sr_css_file = 'http://d2ft4b0ve1aur1.cloudfront.net/css-050/sr-br-min.css.gz';
    }

    var head = document.getElementsByTagName("head")[0];
    if (head) {
        var scriptStyles = document.createElement("link");
        scriptStyles.rel = "stylesheet";
        scriptStyles.type = "text/css";
        scriptStyles.href = sr_css_file;
        head.appendChild(scriptStyles);
        //alert('adding css to header:'+sr_css_file);
     }
}());
</script> 

gzipcheck.js.jgz is just sr_gzipEnabled = true; this tests that the browser can handle the gzipped code and provides a fallback if it can't.

Then do something similar in the footer assuming all of your js is in one file and can go in the footer.

<div id="sr_js"></div> 
<script type="text/javascript"> 
(function () {
    var sr_js_file = 'http://d2ft4b0ve1aur1.cloudfront.net/js-050/sr-br-min.js';
    if (sr_gzipEnabled) {
       sr_js_file = 'http://d2ft4b0ve1aur1.cloudfront.net/js-050/sr-br-min.js.jgz';
    }
    var sr_script_tag = document.getElementById("sr_js");         
    if (sr_script_tag) {
    var scriptStyles = document.createElement("script");
    scriptStyles.type = "text/javascript";
    scriptStyles.src = sr_js_file;
    sr_script_tag.appendChild(scriptStyles);
    //alert('adding js to footer:'+sr_js_file);
    }
}());
</script> 

UPDATE: Amazon now supports gzip compression (see the Amazon Announcement), so this is no longer needed.

Answered by Danack

Cloudfront supports gzipping.

Cloudfront connects to your server via HTTP 1.0. By default, some web servers, including nginx, don't serve gzipped content over HTTP 1.0 connections, but you can tell them to by adding:

gzip_http_version 1.0;

to your nginx config. The equivalent config could be set for whichever web server you're using.

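To check how your origin responds to an HTTP 1.0 request, you can approximate CloudFront's behaviour with curl and look for a Content-Encoding: gzip header in the output (origin.example.com is a placeholder):

curl -s -o /dev/null -D - --http1.0 -H "Accept-Encoding: gzip" http://origin.example.com/production.min.css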

Setting gzip_http_version 1.0 does have the side effect of making keep-alive connections not work for HTTP 1.0 connections, but as the benefits of compression are huge, it's definitely worth the trade-off.

Taken from http://www.cdnplanet.com/blog/gzip-nginx-cloudfront/

Edit

Serving content that is gzipped on the fly through Amazon CloudFront is dangerous and probably shouldn't be done. Basically, if your web server is gzipping the content, it will not set a Content-Length header and will instead send the data chunked.

If the connection between Cloudfront and your server is interrupted and prematurely severed, Cloudfront still caches the partial result and serves that as the cached version until it expires.

The accepted answer of gzipping it first on disk and then serving the gzipped version is a better idea as Nginx will be able to set the Content-Length header, and so Cloudfront will discard truncated versions.

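A quick way to see which behaviour your setup exhibits is to inspect the response headers from the origin: a pre-gzipped file should come back with a Content-Length, whereas on-the-fly compression typically shows Transfer-Encoding: chunked instead (origin.example.com is a placeholder):

curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" http://origin.example.com/production.min.css | grep -iE "content-length|transfer-encoding"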

Answered by Chris

Yesterday Amazon announced a new feature: you can now enable gzip compression on your distribution.

It works with S3 without adding .gz files yourself. I tried the new feature today and it works great. (You need to invalidate your current objects, though.)

More info

Answered by pingles

We've made a few optimisations for uSwitch.com recently to compress some of the static assets on our site. Although we set up a whole nginx proxy to do this, I've also put together a little Heroku app that proxies between CloudFront and S3 to compress content: http://dfl8.co

Given publicly accessible S3 objects can be accessed using a simple URL structure, http://dfl8.co just uses the same structure. I.e. the following URLs are equivalent:

http://pingles-example.s3.amazonaws.com/sample.css
http://pingles-example.dfl8.co/sample.css
http://d1a4f3qx63eykc.cloudfront.net/sample.css

Answered by Rafi

You can configure CloudFront to automatically compress files of certain types and serve the compressed files.

See AWS Developer Guide
