
Conversation

@ChristianPavilonis ChristianPavilonis commented Jan 19, 2026

Summary

Fixes a pre-existing bug where the first-party proxy panics when the origin server returns compressed (gzip, brotli, zstd) HTML or CSS content. This bug surfaced when mocktioneer was deployed to our Coolify instance, which serves responses with Content-Encoding: zstd or Content-Encoding: br when browsers send Accept-Encoding: gzip, deflate, br, zstd.

Problem

The proxy forwards the client's Accept-Encoding header to the origin server. When a browser requests content, it sends:
Accept-Encoding: gzip, deflate, br, zstd
The origin respects this and returns compressed content. However, finalize_proxied_response calls take_body_str() directly on the response body, which:

  1. Assumes the body is valid UTF-8
  2. Panics via .expect() when it encounters compressed bytes (which are not valid UTF-8)

This worked in curl testing (curl doesn't send an Accept-Encoding header by default) but failed 100% of the time in browsers.
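To make the failure mode concrete: a gzip stream starts with the magic bytes `0x1f 0x8b`, and `0x8b` is a UTF-8 continuation byte that can never appear without a lead byte, so strict UTF-8 validation rejects a compressed body immediately. A minimal std-only illustration (not the Fastly SDK call itself, which wraps this conversion in `.expect()`):

```rust
fn main() {
    // First bytes of any gzip stream: magic (0x1f, 0x8b), method, flags.
    // 0x8b is a lone UTF-8 continuation byte, so validation fails.
    let gzip_prefix = vec![0x1f, 0x8b, 0x08, 0x00];
    assert!(String::from_utf8(gzip_prefix).is_err());

    // Plain HTML bytes convert fine, which is why curl testing passed.
    assert!(String::from_utf8(b"<html></html>".to_vec()).is_ok());
}
```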

Solution

  1. Added decompress_body helper - Decompresses response bodies based on Content-Encoding header, supporting gzip, deflate, and brotli.
  2. Filter Accept-Encoding header - Instead of forwarding the client's encoding preferences, we now always send Accept-Encoding: gzip, deflate, br to the origin, ensuring we only receive encodings we can handle.
  3. Proper error handling - Changed finalize_proxied_response to return Result<Response, ...> and use String::from_utf8() with proper error handling instead of the panicking take_body_str().
  4. Strip Content-Encoding from response - Since we decompress before rewriting, the final response is uncompressed (CDN can re-compress for clients).
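The Accept-Encoding filtering in step 2 is plain header logic, so it can be sketched std-only (the names below are hypothetical, not the actual helpers in this PR):

```rust
/// Encodings the proxy's decompress_body helper can actually decode.
const SUPPORTED_ENCODINGS: &str = "gzip, deflate, br";

/// Hypothetical sketch of step 2: ignore whatever the client advertised
/// (e.g. "gzip, deflate, br, zstd") and only ask the origin for
/// encodings we can handle, so zstd responses never reach the proxy.
fn filtered_accept_encoding(_client_header: Option<&str>) -> &'static str {
    SUPPORTED_ENCODINGS
}

fn main() {
    assert_eq!(
        filtered_accept_encoding(Some("gzip, deflate, br, zstd")),
        "gzip, deflate, br"
    );
    // Per step 4, after decompressing, Content-Encoding would be removed
    // from the response so downstream caches see an uncompressed body.
}
```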

Future Improvements

  • Add zstd support - We could add the zstd crate and handle Content-Encoding: zstd in decompress_body. This would allow the origin to use the most efficient compression.
  • Replace take_body_str() usage elsewhere - The Fastly SDK's take_body_str() method calls .expect() internally and will panic on invalid UTF-8. We should audit the other usages in the codebase and replace them with take_body().into_bytes() + String::from_utf8() for graceful error handling.
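The replacement pattern suggested in the second bullet can be sketched std-only (`take_body_str()` / `take_body()` are the Fastly SDK methods named above; `body_to_string` is a hypothetical helper, not code from this PR):

```rust
/// Hypothetical helper replacing take_body_str(): convert raw body
/// bytes to a String, returning an error instead of panicking when
/// the bytes are not valid UTF-8 (e.g. a still-compressed body).
fn body_to_string(bytes: Vec<u8>) -> Result<String, String> {
    String::from_utf8(bytes)
        .map_err(|e| format!("response body is not valid UTF-8: {e}"))
}

fn main() {
    // Plain text converts fine.
    assert_eq!(
        body_to_string(b"<html></html>".to_vec()).unwrap(),
        "<html></html>"
    );
    // Compressed bytes now yield an Err instead of a panic.
    assert!(body_to_string(vec![0x1f, 0x8b, 0x08, 0x00]).is_err());
}
```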

@aram356 should we add zstd support and replace the panicking functions?

@ChristianPavilonis ChristianPavilonis changed the title fix compression issues for proxy Fix proxy panic when origin returns compressed HTML/CSS responses Jan 19, 2026
@ChristianPavilonis ChristianPavilonis marked this pull request as ready for review January 19, 2026 23:10
@jevansnyc

I think in the context of this we also need to think about the possibility of running out of memory when using take_body_str() for larger responses. That will kill the request and throw a 503.


@aram356 aram356 left a comment


🤔 We already have functions to deal with proxying compressed content from publisher domains. See publisher.rs

I think it's best to try to use the existing functions and refactor so that proxying any content, whether from a creative or a publisher domain, goes through a consistent path.

@jevansnyc
Copy link
Collaborator

I agree on being consistent and am totally fine using the existing functions. I'm just not sure I've ever really considered a super large publisher-domain response (I don't see where a creative could overflow the memory allocation), so I mainly wanted to clarify that Fastly Compute has a 128MB memory limit by default (it can be increased). While there's not much HTML in a single request that would hit this, there may be buffering happening elsewhere while trying to hold / pull the response together.
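One way to guard against unbounded buffering is to cap how much of a body ever gets read into memory. A std-only sketch (the name `read_capped` and the calling convention are hypothetical, not part of this PR or the Fastly SDK):

```rust
use std::io::Read;

/// Hypothetical guard: buffer at most `limit` bytes of a response body.
/// Returns Ok(None) when the body exceeds the limit, so the caller can
/// fail gracefully instead of exhausting the instance's memory budget.
fn read_capped<R: Read>(mut body: R, limit: u64) -> std::io::Result<Option<Vec<u8>>> {
    let mut buf = Vec::new();
    // take(limit + 1) lets us detect "too big" without reading it all.
    body.by_ref().take(limit + 1).read_to_end(&mut buf)?;
    if buf.len() as u64 > limit {
        Ok(None)
    } else {
        Ok(Some(buf))
    }
}

fn main() -> std::io::Result<()> {
    let body: &[u8] = b"0123456789";
    assert_eq!(read_capped(body, 4)?, None);
    assert_eq!(read_capped(body, 20)?, Some(b"0123456789".to_vec()));
    Ok(())
}
```

For truly large bodies a streaming rewrite would be needed instead, but a cap at least turns an out-of-memory kill into an explicit error path.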

@ChristianPavilonis ChristianPavilonis force-pushed the fix/proxy-compression-issues branch from b02d52c to 9de521a Compare January 20, 2026 20:58
@ChristianPavilonis ChristianPavilonis force-pushed the feature/auction-integrations branch from d694498 to eaee2cf Compare January 20, 2026 20:59
