
How to Implement a Rate Limiter in TypeScript

A simple in-memory rate-limiting middleware for Express

Quick Answer

This in-memory rate limiter middleware counts requests per IP address within a fixed time window and returns `429 Too Many Requests` once the limit is exceeded.

Code Snippet

import { Request, Response, NextFunction } from 'express';

interface RateLimitStore {
  [key: string]: { count: number; resetTime: number };
}

const rateLimit = (maxRequests: number, windowMs: number) => {
  // Each limiter instance gets its own store, so per-route limits don't collide
  const store: RateLimitStore = {};

  return (req: Request, res: Response, next: NextFunction) => {
    const key = req.ip ?? 'unknown'; // req.ip is typed string | undefined
    const now = Date.now();

    if (!store[key] || now > store[key].resetTime) {
      store[key] = { count: 1, resetTime: now + windowMs };
      return next();
    }

    if (store[key].count < maxRequests) {
      store[key].count++;
      return next();
    }

    res.status(429).json({ error: 'Too many requests' });
  };
};

export default rateLimit;
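Wiring the middleware into an Express app might look like this; the route paths, limits, and windows below are illustrative, not part of the snippet itself:

```typescript
import express from 'express';
import rateLimit from './rateLimit'; // the middleware exported above

const app = express();

// Strict limit on login attempts: 5 per 15 minutes per IP.
app.post('/login', rateLimit(5, 15 * 60 * 1000), (_req, res) => {
  res.json({ ok: true });
});

// Looser limit for the rest of the API: 100 per minute.
app.use(rateLimit(100, 60 * 1000));

app.listen(3000);
```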

What Is a Rate Limiter?

Rate limiting prevents abuse, brute-force attacks, and accidental DDoS by capping how many requests a client can make in a time window. This in-memory implementation is suitable for single-process servers. For multi-process or distributed environments, use Redis-backed rate limiting with the same middleware interface.

How It Works

  1. Each incoming request is keyed by `req.ip`.
  2. If no record exists or the time window has expired, a new count is started.
  3. If the count is below the limit, the request passes through and the count increments.
  4. Once the limit is reached, `429 Too Many Requests` is returned until the window resets.
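The four steps above can be exercised outside Express with minimal mocks; the `MockReq`/`MockRes` types here are stand-ins that only cover the members the middleware touches (`.ip`, `.status()`, `.json()`), which is an assumption of this sketch:

```typescript
// Minimal stand-ins for Express's Request/Response/NextFunction
type MockReq = { ip: string };
type MockRes = { status(code: number): MockRes; json(body: unknown): void };

interface RateLimitStore { [key: string]: { count: number; resetTime: number } }

const rateLimit = (maxRequests: number, windowMs: number) => {
  const store: RateLimitStore = {};
  return (req: MockReq, res: MockRes, next: () => void) => {
    const key = req.ip;                                      // step 1: key by IP
    const now = Date.now();
    if (!store[key] || now > store[key].resetTime) {
      store[key] = { count: 1, resetTime: now + windowMs };  // step 2: new window
      return next();
    }
    if (store[key].count < maxRequests) {
      store[key].count++;                                    // step 3: under limit
      return next();
    }
    res.status(429).json({ error: 'Too many requests' });    // step 4: limited
  };
};

// Drive three requests from one IP through a limit of 2 per window.
const limiter = rateLimit(2, 60_000);
const results: string[] = [];
const res: MockRes = {
  status(code) { results.push(String(code)); return this; },
  json() {},
};
for (let i = 0; i < 3; i++) {
  limiter({ ip: '1.2.3.4' }, res, () => { results.push('next'); });
}
console.log(results); // first two pass, third is limited: ['next', 'next', '429']
```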

Common Use Cases

  • Login endpoints - Prevent brute-force password attacks
  • Public APIs - Limit unauthenticated requests per IP
  • OTP endpoints - Prevent SMS/email OTP flooding
  • Expensive queries - Protect resource-intensive endpoints

Key Benefits

  • Zero dependencies — no Redis or external service required for single servers
  • Configurable limit and window per route
  • Returns standard 429 Too Many Requests with no library needed
  • Drop-in Express middleware compatible with any route

Common Mistakes to Avoid

  • Using in-memory storage behind a load balancer — each process has its own store, so limits are per-instance not per-user. Use Redis for distributed deployments.
  • Keying by IP address behind a reverse proxy — `req.ip` may resolve to the proxy's IP. Set `app.set('trust proxy', 1)` to get the real client IP.
  • Not cleaning up expired entries — the store grows unbounded. Add a periodic cleanup or use a library.
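For the last point, a minimal periodic sweep might look like the sketch below; the `sweepExpired` name and the interval are hypothetical, not part of the snippet:

```typescript
interface RateLimitStore { [key: string]: { count: number; resetTime: number } }

// Example store state: one window already elapsed, one still live.
const store: RateLimitStore = {
  'expired-ip': { count: 5, resetTime: Date.now() - 1_000 },
  'active-ip':  { count: 2, resetTime: Date.now() + 60_000 },
};

// Delete entries whose window has elapsed. This is safe because an expired
// entry would be overwritten on the next request from that IP anyway.
function sweepExpired(s: RateLimitStore, now: number = Date.now()): void {
  for (const key of Object.keys(s)) {
    if (now > s[key].resetTime) delete s[key];
  }
}

// In the middleware module you would schedule this periodically, e.g.
// setInterval(() => sweepExpired(store), 60_000).unref();
sweepExpired(store);
console.log(Object.keys(store)); // only the live entry remains: ['active-ip']
```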


Frequently Asked Questions

How do I use Redis for distributed rate limiting?

Replace the in-memory `store` object with Redis INCR and EXPIRE commands. The `express-rate-limit` package with a Redis store adapter is the standard production approach.
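The INCR/EXPIRE pattern can be sketched as below. `FakeRedis` is a stand-in so the logic runs without a server; the method names `incr` and `expire` follow the ioredis/node-redis clients, but the exact client API you use is up to you:

```typescript
// Stand-in for a Redis client exposing INCR and EXPIRE (ioredis-style names).
class FakeRedis {
  private data = new Map<string, { value: number; expiresAt: number }>();
  async incr(key: string): Promise<number> {
    const entry = this.data.get(key);
    if (!entry || Date.now() > entry.expiresAt) {
      this.data.set(key, { value: 1, expiresAt: Infinity }); // no TTL yet
      return 1;
    }
    entry.value++;
    return entry.value;
  }
  async expire(key: string, seconds: number): Promise<void> {
    const entry = this.data.get(key);
    if (entry) entry.expiresAt = Date.now() + seconds * 1000;
  }
}

const redis = new FakeRedis();

// Fixed-window check: INCR the per-IP key; on the first hit, set the TTL so
// the key (and the count) disappears when the window ends. INCR is atomic in
// real Redis, so concurrent processes share one accurate counter.
async function isAllowed(ip: string, maxRequests: number, windowSeconds: number) {
  const count = await redis.incr(`ratelimit:${ip}`);
  if (count === 1) await redis.expire(`ratelimit:${ip}`, windowSeconds);
  return count <= maxRequests;
}

(async () => {
  const answers: boolean[] = [];
  for (let i = 0; i < 3; i++) answers.push(await isAllowed('1.2.3.4', 2, 60));
  console.log(answers); // [true, true, false]
})();
```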

About This TypeScript Code Snippet

This free TypeScript code snippet for a rate limiter is copy-paste friendly. Whether you are building a web app, API, or backend service, this advanced-level example will help you implement rate limiting quickly and correctly.

All snippets in the Snippetly library follow TypeScript best practices and are tested for real-world use. You can adapt this code to work with React, Vue, Node.js, or any project that uses TypeScript.

Tags: middleware, security, rate-limiting | Language: TypeScript | Difficulty: advanced | Category: Node.js
