
Adding content negotiation to Servoiardi

For this website, I wanted some pages, like my Moka, to be viewable and usable from the shell, using utilities like curl. It would also make sense if error messages were viewable from the shell, especially as I already support uploading from the shell for my pastebin. To make this work, I need Servoiardi (my web-server) to detect whether a request originates from the command-line or a browser, and then respond with either a "text/plain" or a "text/html" version of a resource.

This process is called "Server-Driven Content Negotiation".

Using User-Agent

My initial approach to this problem was to inspect the User-Agent header and match it against a set of known TTY and browser clients. My initial implementation of the User-Agent header is shown below.

use super::*;

const NAME: Name = Name::new_static("User-Agent", "user-agent");

#[derive(Copy, Clone)]
pub enum UserAgentKind { Browser, Tty }

#[derive(Debug)]
pub struct UserAgent<'a>(&'a str);

impl<'a> Header<'a> for UserAgent<'a>
{
    fn name() -> Name { NAME.clone() }
    fn decode_value(s: &'a str) -> http::Result<Self> { Ok(UserAgent(s)) }
    fn encode_value(&self) -> String { self.0.to_owned() }
}

const TTY_SIGNATURES: &[&str] = &[ "curl", "wget" ];

impl<'a> UserAgent<'a>
{
    pub fn kind(&self) -> UserAgentKind
    {
        let lower = self.0.to_lowercase();
        if TTY_SIGNATURES.iter().any(|sig| lower.contains(sig))
        {
            UserAgentKind::Tty
        }
        else
        {
            UserAgentKind::Browser
        }
    }
}

This kind of matching on the user agent is common - it's what sites like wttr.in do - but it's not perfect. Any terminal client that hasn't been explicitly included in the TTY_SIGNATURES list is classified as a browser and gets the wrong version of the website. Browser settings and command-line options also can't affect which version of the page gets displayed; someone using curl or wget might want to download an HTML copy, for example.

The Accept header

A better option for this purpose is the Accept header. This header consists of a list of media-type patterns, each with an optional quality value. For example:

Accept: text/html, application/xhtml+xml, application/xml;q=0.9, */*;q=0.8

This example expresses a preference for HTML and XHTML, then XML, then anything else.

My implementation of the Accept header stores an ordered list of AcceptItem structures, each associating a quality stored as a floating point number with a media-type pattern.
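As a rough sketch of what parsing into such a structure might look like - AcceptItem and parse_accept here are simplified, illustrative names, not Servoiardi's actual types:

```rust
// Illustrative sketch: parse an Accept header value into an ordered list
// of items, each pairing a media-type pattern with a quality value.
// These names are hypothetical, not Servoiardi's real API.
#[derive(Debug, PartialEq)]
struct AcceptItem {
    pattern: String, // e.g. "text/html" or "*/*"
    quality: f32,    // defaults to 1.0 when no q parameter is given
}

fn parse_accept(value: &str) -> Vec<AcceptItem> {
    value
        .split(',')
        .map(|item| {
            let mut parts = item.trim().split(';');
            let pattern = parts.next().unwrap_or("").trim().to_owned();
            // Look for an optional "q=..." parameter after the pattern.
            let quality = parts
                .filter_map(|p| p.trim().strip_prefix("q="))
                .next()
                .and_then(|q| q.parse().ok())
                .unwrap_or(1.0);
            AcceptItem { pattern, quality }
        })
        .collect()
}

fn main() {
    let items = parse_accept("text/html, text/plain;q=0.9, */*;q=0.1");
    assert_eq!(items[0].pattern, "text/html");
    assert_eq!(items[0].quality, 1.0); // no q parameter: defaults to 1.0
    assert_eq!(items[1].quality, 0.9);
    assert_eq!(items[2].pattern, "*/*");
}
```

The real implementation works with a proper media-type pattern type rather than a raw string, but the ordered-list-plus-quality shape is the same idea.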

Quality Values

To provide a generic way of expressing the quality of a resource according to dimensions such as the Accept header, I used the following Quality enum:

#[derive(PartialEq, Eq, PartialOrd, Ord, Clone, Copy, Debug)]
pub(crate) enum Quality
{
    NoMatch,        // The resource is definitely not a match.
    Unknown,        // It is not known whether the resource might be a match.
    Match(u32, i32) // The resource matches, with a quality between 0 and
                    // u32::MAX, and a nudge value.
}
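Because the ordering traits are derived, the variant order and field order already give the comparisons we want; a few assertions illustrate this (the enum is reproduced here without pub(crate) so the example stands alone):

```rust
// Derived Ord follows declaration order: NoMatch < Unknown < Match,
// and Match values compare first by quality, then by nudge.
#[derive(PartialEq, Eq, PartialOrd, Ord, Clone, Copy, Debug)]
enum Quality {
    NoMatch,
    Unknown,
    Match(u32, i32),
}

fn main() {
    assert!(Quality::NoMatch < Quality::Unknown);
    // Any match, however poor, beats Unknown.
    assert!(Quality::Unknown < Quality::Match(0, i32::MIN));
    // Equal qualities are broken by the nudge value.
    assert!(Quality::Match(500, -1) < Quality::Match(500, 0));
    // A higher quality beats any nudge.
    assert!(Quality::Match(500, i32::MAX) < Quality::Match(501, i32::MIN));
}
```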

I used integers rather than floats to quantify the quality of a match, as integers are totally ordered (floats are not, thanks to NaN). The nudge value is included to allow infinitesimal adjustment of the quality of a match. One use of this, in the case of the Accept header, is handling the following header, sent by Microsoft Edge.

Accept: text/html, application/xhtml+xml, image/jxr, */*

Strictly interpreted, this means "I don't care whatsoever", though the clear intention is to prefer HTML, XHTML, and JPEG XR. To solve this, when calculating qualities from Accept headers, each match is nudged negatively by the index of the item in the header, making earlier items infinitesimally better, and later items infinitesimally worse.
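A minimal sketch of the nudging, assuming qualities are scaled onto the full u32 range; to_quality is an illustrative helper, not Servoiardi's actual code:

```rust
// Same shape as the Quality enum above, reproduced so this runs standalone.
#[derive(PartialEq, Eq, PartialOrd, Ord, Clone, Copy, Debug)]
enum Quality {
    NoMatch,
    Unknown,
    Match(u32, i32),
}

// Hypothetical helper: `q` is the quality scaled onto 0..=u32::MAX, and
// `index` is the item's position within the Accept header. Negating the
// index makes earlier items win ties.
fn to_quality(q: u32, index: usize) -> Quality {
    Quality::Match(q, -(index as i32))
}

fn main() {
    // "text/html, application/xhtml+xml, image/jxr, */*": every item has
    // q=1.0, so all map to the maximum quality, nudged by position.
    let html = to_quality(u32::MAX, 0);
    let xhtml = to_quality(u32::MAX, 1);
    let wildcard = to_quality(u32::MAX, 3);
    assert!(html > xhtml);     // text/html is infinitesimally better
    assert!(xhtml > wildcard); // */* comes last, so it loses every tie
}
```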

The Vary header

When servers can serve multiple versions of a web-page, each copy needs to be cached separately, and caches need to be able to understand which requests should result in which response.

Normally, caches assume that the URL determines what response is served, and other non-cache-related headers are ignored. In our case this would be a problem, as we use request headers to determine the response type. This is where the Vary header comes in. It contains a list of the names of all headers whose value could have affected the response.

To implement this, I added a vary<H: Header>() function to my http::Request struct, which marks a header in the request as varying the response. A Vary header is then added to whatever response the request elicits. This header should be included with any response, including 304-not-modified. However, this isn't currently implemented, as caching is handled before requests are passed to services. In the future, I may rework caching so that all requests are passed to services, and the caching is handled afterwards, before the response is rendered.
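A simplified sketch of the idea, using plain header names instead of the generic vary<H: Header>() function; the types here are illustrative stand-ins, not Servoiardi's real ones:

```rust
// Illustrative sketch of tracking which headers varied the response.
use std::collections::BTreeSet;

struct Request {
    // BTreeSet keeps the names deduplicated and in a stable order.
    varied_by: BTreeSet<&'static str>,
}

impl Request {
    // Mark a header as having influenced the choice of response.
    fn vary(&mut self, header_name: &'static str) {
        self.varied_by.insert(header_name);
    }

    // Render the Vary header value to attach to the response, if any
    // header influenced it.
    fn vary_header(&self) -> Option<String> {
        if self.varied_by.is_empty() {
            None
        } else {
            Some(self.varied_by.iter().copied().collect::<Vec<_>>().join(", "))
        }
    }
}

fn main() {
    let mut req = Request { varied_by: BTreeSet::new() };
    req.vary("Accept");
    req.vary("Accept"); // marking the same header twice is harmless
    assert_eq!(req.vary_header().as_deref(), Some("Accept"));
}
```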

My Server-Driven Content Negotiation Algorithm

This all comes together in a content negotiation algorithm. It is somewhat inspired by Apache's algorithm, but with a few changes to suit my needs.

My algorithm splits a request into different "Dimensions". At the moment, I've implemented just two: UriParam, which reads a preference from a URI parameter, and MediaType, which evaluates the request's Accept header.

Each potential Choice is evaluated in each dimension, and the best overall choice is sent as a response. I also added a "strict" and a "non-strict" mode: the strict mode results in a 406-not-acceptable response if no response satisfying the request's requirements can be found, while the non-strict mode will always send the best response it can offer, even if it isn't acceptable.

The server-driven-content-negotiation algorithm works like this:

  1. Associate each choice of response with a quality, initially the maximum possible value.
  2. For each dimension:
    1. If none of the choices of response care about this dimension:
      • Skip to evaluating the next dimension.
    2. Mark the request as being varied by this dimension.
    3. For each choice of response, calculate a new quality by combining the current quality of that response with the quality calculated by evaluating the response in this dimension.
    4. If the greatest of the newly calculated qualities is not a match:
      • If in strict mode: Discard any newly calculated qualities that are Unknown, retaining any that are NoMatch.
      • If not in strict mode: Discard all newly calculated qualities.
    5. Update the qualities associated with any responses that have non-discarded newly calculated values.
  3. Find the highest associated quality.
  4. If the highest associated quality is NoMatch:
    • Return HTTP 406 "Not Acceptable".
  5. Choose the first potential response with the highest associated quality.
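The steps above can be sketched roughly as follows. Dimension, negotiate, and PreferFirst are illustrative stand-ins, not Servoiardi's real types, and step 2.4 is simplified: this version keeps all newly calculated qualities in strict mode, rather than only retaining the NoMatch ones:

```rust
// Same shape as the Quality enum above, reproduced so this runs standalone.
#[derive(PartialEq, Eq, PartialOrd, Ord, Clone, Copy, Debug)]
enum Quality { NoMatch, Unknown, Match(u32, i32) }

// A hypothetical dimension of negotiation, e.g. UriParam or MediaType.
trait Dimension<C> {
    fn applies(&self, choices: &[C]) -> bool;
    fn evaluate(&self, choice: &C) -> Quality;
}

fn negotiate<C>(choices: &[C], dimensions: &[&dyn Dimension<C>], strict: bool)
    -> Option<usize>
{
    // 1. Start every choice at the maximum possible quality.
    let mut qualities = vec![Quality::Match(u32::MAX, 0); choices.len()];

    for dim in dimensions {
        // 2.1 Skip dimensions that no choice cares about.
        if !dim.applies(choices) { continue; }
        // (2.2 would mark the request as varied by this dimension.)

        // 2.3 Combine each current quality with this dimension's
        // evaluation, here by taking the worse of the two.
        let new: Vec<Quality> = choices.iter().zip(&qualities)
            .map(|(c, &q)| q.min(dim.evaluate(c)))
            .collect();

        // 2.4/2.5 (simplified) If the best new quality is a match, keep
        // the new values; otherwise only strict mode keeps them, while
        // non-strict mode ignores a dimension nobody matches.
        if new.iter().any(|q| matches!(q, Quality::Match(..))) || strict {
            qualities = new;
        }
    }

    // 3-5. Pick the first choice with the highest quality, unless the
    // highest quality is NoMatch (which would yield a 406 response).
    let best = *qualities.iter().max()?;
    if best == Quality::NoMatch { return None; }
    qualities.iter().position(|&q| q == best)
}

// A toy dimension that prefers lower-numbered choices.
struct PreferFirst;
impl Dimension<u32> for PreferFirst {
    fn applies(&self, _: &[u32]) -> bool { true }
    fn evaluate(&self, c: &u32) -> Quality { Quality::Match(u32::MAX - *c, 0) }
}

fn main() {
    let dims: [&dyn Dimension<u32>; 1] = [&PreferFirst];
    assert_eq!(negotiate(&[10u32, 0, 5], &dims, true), Some(1));
    // With no dimensions at all, the first choice wins by default.
    assert_eq!(negotiate::<u32>(&[0, 1, 2], &[], true), Some(0));
}
```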

Because the dimensions are evaluated in order, earlier dimensions can overrule later ones in non-strict mode. For example, the UriParam dimension can override the MediaType dimension, meaning that a URI parameter can override whatever the browser sends in its Accept header. Likewise, because the choices are evaluated in order, the first choice is chosen over subsequent choices when there is no preference. For our use-case of sending text/plain to the terminal and text/html to browsers, we need to give our text/plain response priority, as curl and other utilities don't specifically prefer any response type, while browsers explicitly prefer HTML.

I wrapped up the content negotiation into a simple macro, and used it to build a generic TextPage struct that can serve text to a client as either text/html or text/plain. This is now what I use for error messages and my Moka service.

Future expansion

In the future, it would be good to add support for more dimensions of content negotiation, like different languages and encodings.

It would also be nice to fully implement plain-text output for Matthewdown, and render text output rather than HTML for the rest of the website. Reincorporating User-Agent detection, it might also be possible to include ANSI escape codes in served content.

Written by Francis Wharf