
Today I fixed an annoyance in one of my side projects, Tiny Helpers. Tiny Helpers is a resource collection of free online developer tools. The site includes tool screenshots and, if available, the maintainer's GitHub profile image.

And these profile images were the problem: when you navigated the site, you requested many GitHub profile images, and eventually they stopped showing up.

Comparison of the maintainer listing: the left side shows many profile images, whereas the right side shows many broken images.

So what's going on there? The answer could be found in the developer tools network panel.

DevTools showing a request to the GitHub profile images API with the response code "429 Too Many Requests"

GitHub limits profile image requests when you make too many. I couldn't find the exact rate limits, but one thing is clear: the more projects are listed on Tiny Helpers, the quicker visitors will hit these limits.

Luckily, modern hosting can help out with this problem!

A serverless image/caching proxy

Tiny Helpers is hosted on Vercel. Vercel provides CI/CD functionality, CDNs and also serverless functions. Drop a file into your /api project directory, write some JavaScript and start making requests to your new API!
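
To give you an idea of how little code such a function needs, here's a minimal sketch (the file name /api/hello.js and the name query parameter are made up for this example):

// /api/hello.js

// A hypothetical, minimal Vercel serverless function
module.exports = (req, res) => {
  const { name = 'world' } = req.query;
  res.send(`Hello ${name}!`);
};

Deploy this file, and the function is reachable at /api/hello?name=Stefan.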

An API endpoint alone wasn't solving GitHub's rate limit problem, though. I needed a way to reduce the number of requests, and this is where Vercel's Edge Caching comes into play. You can not only deploy serverless functions but also instruct Vercel to cache the responses on its edge network. All you need to do is define a cache-control header!
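
A single header line in the function response is enough to opt into edge caching. The 12-hour value below is the one I ended up using; stale-while-revalidate is an optional extra directive that Vercel also understands:

// Cache the response on Vercel's edge for 12 hours (43200 seconds)
// and serve a stale copy while it's revalidated in the background
res.setHeader('Cache-Control', 's-maxage=43200, stale-while-revalidate');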

With all this functionality, I could:

  • Deploy a new API endpoint that accepts query params for the user and profile image size (/api/user-image?user=stefanjudis&size=40).
  • Fetch and proxy the GitHub profile images in a serverless function.
  • Cache the returned image to save requests to GitHub.

And here's the code to get all this working.

// /api/user-image.js

// `got` streamlines Node.js request handling
const got = require('got');

module.exports = async (req, res) => {
  try {
    const { user, size } = req.query;

    // Guard against calls without a user name
    if (!user) {
      res.status(400);
      return res.send('Missing "user" query parameter');
    }
    const GITHUB_URL = `https://github.com/${user}.png${
      size ? `?size=${size}` : ''
    }`;
    const imageRequest = got(GITHUB_URL);

    // Use the `got` promises to:
    //   1. receive the content type via `imageResponse`
    //   2. receive the buffer via `imageBuffer`
    const [imageResponse, imageBuffer] = await Promise.all([
      imageRequest,
      imageRequest.buffer(),
    ]);

    // Define a caching header to cache the image on the edge
    // FYI: Caching is tricky, and for now, I went with 12h caching time
    // There might be better configurations, but it does the trick for now
    // 
    // Read more: https://vercel.com/docs/concepts/functions/edge-caching
    res.setHeader('Cache-Control', 's-maxage=43200');
    res.setHeader('Content-Type', imageResponse.headers['content-type']);
    res.send(imageBuffer);
  } catch (error) {
    // Handle thrown 404s
    if (error.message.includes('404')) {
      res.status(404);
      return res.send('Not found');
    }

    // Fail hard if it's not a 404
    res.status(500);
    res.send(error.message);
  }
};
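
A quick client-side fetch (assuming the function lives at /api/user-image.js as shown above) confirms that the proxied image comes back with the right headers:

// Hypothetical quick check of the deployed proxy endpoint
fetch('/api/user-image?user=stefanjudis&size=40').then((response) => {
  console.log(response.status); // 200
  console.log(response.headers.get('content-type')); // e.g. "image/png"
  console.log(response.headers.get('cache-control')); // "s-maxage=43200"
});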

Deploying my new image proxy took me thirty minutes. With all these new tools in our toolbelt, it's a great time to be a Frontend developer. ♥️
