Coding Challenges ​

Quick Reference ​

Scan this table in 5 minutes before your round.

| Challenge | Key Technique | Time / Space | Common Gotcha |
| --- | --- | --- | --- |
| Autocomplete with Debounce | useDebounce hook + AbortController | O(1) debounce, O(n) filter | Stale closures; forgetting to abort in-flight requests |
| Fuzzy Search | Score-based character matching | O(n * m) where m = query length | Not handling case sensitivity; scoring consecutive matches the same as scattered ones |
| Deep Clone | Recursive traversal + WeakMap | O(n) time and space | Circular references cause infinite loops; missing Map/Set/Date/RegExp |
| Flatten (Array & Object) | Recursion or explicit stack | O(n) where n = total elements | Forgetting depth parameter; mutating input; mixing array/object flatten |
| LRU Cache | Map insertion-order + delete/re-set | O(1) get and put | Not deleting before re-setting to refresh order; off-by-one on capacity |

Challenge 1: Autocomplete with Debounce ​

What the interviewer asks ​

Build an autocomplete input. As the user types, fetch suggestions from an API after a short delay. Cancel stale requests and handle loading/error states.

Implementation ​

useDebounce hook ​

ts
import { useEffect, useState } from 'react';

function useDebounce<T>(value: T, delayMs: number): T {
  const [debounced, setDebounced] = useState(value);

  useEffect(() => {
    const timer = setTimeout(() => setDebounced(value), delayMs);
    return () => clearTimeout(timer);
  }, [value, delayMs]);

  return debounced;
}

useAutocomplete hook ​

ts
import { useEffect, useRef, useState } from 'react';

interface UseAutocompleteResult {
  results: string[];
  isLoading: boolean;
  error: string | null;
}

function useAutocomplete(query: string): UseAutocompleteResult {
  const [results, setResults] = useState<string[]>([]);
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);
  const abortRef = useRef<AbortController | null>(null);

  const debouncedQuery = useDebounce(query, 300);

  useEffect(() => {
    if (!debouncedQuery.trim()) {
      setResults([]);
      return;
    }

    // Cancel previous in-flight request
    abortRef.current?.abort();
    const controller = new AbortController();
    abortRef.current = controller;

    setIsLoading(true);
    setError(null);

    fetch(`/api/search?q=${encodeURIComponent(debouncedQuery)}`, {
      signal: controller.signal,
    })
      .then((res) => {
        if (!res.ok) throw new Error(`HTTP ${res.status}`);
        return res.json();
      })
      .then((data: string[]) => {
        setResults(data);
        setIsLoading(false);
      })
      .catch((err) => {
        if (err.name === 'AbortError') return; // expected, not an error
        setError(err.message);
        setIsLoading(false);
      });

    return () => controller.abort();
  }, [debouncedQuery]);

  return { results, isLoading, error };
}

Full component with keyboard navigation ​

tsx
import { useState, useRef, KeyboardEvent } from 'react';

function Autocomplete() {
  const [query, setQuery] = useState('');
  const [activeIndex, setActiveIndex] = useState(-1);
  const { results, isLoading, error } = useAutocomplete(query);
  const listRef = useRef<HTMLUListElement>(null);

  const handleKeyDown = (e: KeyboardEvent<HTMLInputElement>) => {
    switch (e.key) {
      case 'ArrowDown':
        e.preventDefault();
        setActiveIndex((prev) => Math.min(prev + 1, results.length - 1));
        break;
      case 'ArrowUp':
        e.preventDefault();
        setActiveIndex((prev) => Math.max(prev - 1, 0));
        break;
      case 'Enter':
        if (activeIndex >= 0 && results[activeIndex]) {
          setQuery(results[activeIndex]);
          setActiveIndex(-1);
        }
        break;
      case 'Escape':
        setActiveIndex(-1);
        break;
    }
  };

  return (
    <div style={{ position: 'relative', width: 320 }}>
      <input
        value={query}
        onChange={(e) => {
          setQuery(e.target.value);
          setActiveIndex(-1);
        }}
        onKeyDown={handleKeyDown}
        placeholder="Search..."
        role="combobox"
        aria-expanded={results.length > 0}
        aria-activedescendant={
          activeIndex >= 0 ? `option-${activeIndex}` : undefined
        }
      />

      {isLoading && <span>Loading...</span>}
      {error && <span style={{ color: 'red' }}>{error}</span>}

      {results.length > 0 && (
        <ul ref={listRef} role="listbox" style={{ listStyle: 'none', padding: 0 }}>
          {results.map((item, i) => (
            <li
              key={item}
              id={`option-${i}`}
              role="option"
              aria-selected={i === activeIndex}
              style={{
                padding: '8px',
                background: i === activeIndex ? '#e2e8f0' : 'transparent',
                cursor: 'pointer',
              }}
              onClick={() => {
                setQuery(item);
                setActiveIndex(-1);
              }}
            >
              {item}
            </li>
          ))}
        </ul>
      )}
    </div>
  );
}

How it works ​

  1. useDebounce delays the query value by 300ms so we do not fire a request on every keystroke.
  2. useAutocomplete watches the debounced value, creates a new AbortController before every fetch, and aborts the previous one. This guarantees only the latest request populates state.
  3. The component tracks activeIndex for keyboard navigation. Arrow keys move the highlight, Enter selects, Escape dismisses.

Edge cases to mention ​

  • Empty or whitespace-only input should clear results, not fire a request.
  • If the user clears the input while a request is in-flight, the abort prevents a stale result from appearing.
  • encodeURIComponent protects against special characters in the query string.
  • ARIA attributes (combobox, listbox, option, aria-activedescendant) are required for screen reader support.

Follow-up questions ​

  • How would you cache previous results? (Keep a Map<string, string[]> ref, check before fetching.)
  • How would you handle rate limiting from the API? (Exponential backoff, or queue requests.)
  • What changes for a server component approach in Next.js? (Move fetch to a server action, stream results.)
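
The caching follow-up can be sketched outside React as a thin wrapper around any fetcher. This is a minimal sketch: `createCachedSearch` and the injectable `fetcher` parameter are illustrative names, not part of the hooks above.

```ts
// Hypothetical sketch: memoize search results per query string.
// `fetcher` stands in for the real network call; injecting it keeps
// the cache logic testable without fetch.
type Fetcher = (query: string) => Promise<string[]>;

function createCachedSearch(fetcher: Fetcher) {
  const cache = new Map<string, string[]>();

  return async function searchCached(query: string): Promise<string[]> {
    const hit = cache.get(query);
    if (hit) return hit; // serve from cache, skip the network

    const results = await fetcher(query);
    cache.set(query, results);
    return results;
  };
}
```

Inside useAutocomplete you would hold the Map in a useRef and consult it before calling fetch, so repeated queries resolve instantly.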

Challenge 2: Fuzzy Search Algorithm ​

What the interviewer asks ​

Implement a fuzzy search that ranks results by match quality. Typing "pmt" should match "payment" higher than "apartment".

Implementation ​

ts
interface FuzzyResult<T> {
  item: T;
  score: number;
}

function fuzzyScore(query: string, target: string): number {
  const q = query.toLowerCase();
  const t = target.toLowerCase();

  let score = 0;
  let qi = 0;
  let consecutive = 0;
  let prevMatchIndex = -2; // intentionally < -1

  for (let ti = 0; ti < t.length && qi < q.length; ti++) {
    if (t[ti] === q[qi]) {
      // Consecutive matches are worth more
      consecutive = ti === prevMatchIndex + 1 ? consecutive + 1 : 1;
      score += consecutive;

      // Bonus: match at start of word
      if (ti === 0 || t[ti - 1] === ' ' || t[ti - 1] === '-' || t[ti - 1] === '_') {
        score += 2;
      }

      prevMatchIndex = ti;
      qi++;
    }
  }

  // All query characters must be found in order
  if (qi < q.length) return 0;

  // Bonus for shorter targets (tighter match)
  score += 1 / t.length;

  return score;
}

function fuzzySearch<T>(
  query: string,
  items: T[],
  keyFn: (item: T) => string
): FuzzyResult<T>[] {
  if (!query) return [];

  return items
    .map((item) => ({ item, score: fuzzyScore(query, keyFn(item)) }))
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score);
}

Usage ​

ts
const products = ['payment', 'apartment', 'permit', 'department'];

const results = fuzzySearch('pmt', products, (p) => p);
// [{ item: 'permit', score: ~5.2 }, { item: 'payment', score: ~5.1 },
//  { item: 'apartment', score: ~3.1 }, { item: 'department', score: ~3.1 }]
// Note: 'permit' narrowly edges out 'payment' via the shorter-target bonus.

How it works ​

  • Walk through the target string looking for each query character in order.
  • Consecutive matches earn an increasing bonus (1, 2, 3, ...), so a contiguous run like "pay" against "payment" scores 1 + 2 + 3 for those characters, while scattered p...m...t matches score 1 each.
  • Word-boundary matches get a bonus (matching the "p" in "payment" at index 0 is worth more).
  • Shorter targets get a small bonus so "payment" beats "department" when scores are close.
  • If any query character is not found, the score is 0 (no partial credit).

Edge cases to mention ​

  • Empty query should return an empty array, not all items.
  • Identical strings should always score highest.
  • Unicode characters: toLowerCase handles most Latin scripts but not all; toLocaleLowerCase is safer for internationalized apps.

Follow-up questions ​

  • How would you highlight the matched characters in the UI? (Return matched indices alongside the score.)
  • What about typo tolerance? (Levenshtein distance or allow skipping one query char with a penalty.)
  • How would you handle 10k+ items? (Web Worker, pre-sorted index, or limit to top-N with early exit.)
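
The highlighting follow-up can be sketched by re-running the same greedy walk and recording which target indices matched. `fuzzyMatch` is a hypothetical helper name; the UI would wrap the returned indices in highlight markup.

```ts
// Hypothetical companion to fuzzyScore: same left-to-right greedy walk,
// but it records which target indices matched so the UI can highlight them.
function fuzzyMatch(query: string, target: string): number[] | null {
  const q = query.toLowerCase();
  const t = target.toLowerCase();
  const indices: number[] = [];
  let qi = 0;

  for (let ti = 0; ti < t.length && qi < q.length; ti++) {
    if (t[ti] === q[qi]) {
      indices.push(ti);
      qi++;
    }
  }

  // null means not every query character was found in order
  return qi === q.length ? indices : null;
}
```

For example, fuzzyMatch('pmt', 'payment') returns [0, 3, 6], the positions of p, m, and t.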

Challenge 3: Deep Clone ​

What the interviewer asks ​

Implement a deepClone function that handles objects, arrays, Date, RegExp, Map, Set, and circular references.

Implementation ​

ts
function deepClone<T>(value: T, seen = new WeakMap()): T {
  // Primitives and functions (functions are not cloned)
  if (value === null || typeof value !== 'object') {
    return value;
  }

  // Circular reference check
  if (seen.has(value as object)) {
    return seen.get(value as object);
  }

  // Special built-in types
  if (value instanceof Date) return new Date(value.getTime()) as T;
  if (value instanceof RegExp) return new RegExp(value.source, value.flags) as T;

  if (value instanceof Map) {
    const map = new Map();
    seen.set(value as object, map);
    value.forEach((v, k) => map.set(deepClone(k, seen), deepClone(v, seen)));
    return map as T;
  }

  if (value instanceof Set) {
    const set = new Set();
    seen.set(value as object, set);
    value.forEach((v) => set.add(deepClone(v, seen)));
    return set as T;
  }

  // Arrays and plain objects
  const clone: any = Array.isArray(value) ? [] : {};
  seen.set(value as object, clone);

  for (const key of Reflect.ownKeys(value as object)) {
    clone[key] = deepClone((value as any)[key], seen);
  }

  return clone;
}

How it works ​

  1. Primitives (number, string, boolean, null, undefined, symbol, bigint) and functions are returned as-is. Functions cannot be meaningfully cloned.
  2. Before recursing into any object, we check the WeakMap. If the object was already visited, we return the previously created clone, breaking the cycle.
  3. Date and RegExp are reconstructed from their internal values.
  4. Map and Set are created empty, registered in seen immediately (to handle circular refs within them), then populated.
  5. For plain objects and arrays, Reflect.ownKeys captures both string and symbol keys.

structuredClone() comparison ​

| Feature | deepClone (ours) | structuredClone() |
| --- | --- | --- |
| Circular references | Yes (WeakMap) | Yes (built-in) |
| Date, RegExp, Map, Set | Yes | Yes |
| Functions | Returns reference | Throws DataCloneError |
| DOM nodes | Not handled | Throws DataCloneError |
| Symbol keys | Yes (Reflect.ownKeys) | No (stripped) |
| Prototype chain | Lost (plain object) | Lost (plain object) |
| Performance | JS-level recursion | Native (faster for large trees) |

When to use structuredClone: Most production code. It is native, handles cycles, and covers the common types.

When you need a custom clone: When your objects contain functions, symbol-keyed properties, or you need to preserve class instances with custom logic.

Edge cases to mention ​

  • structuredClone cannot clone functions or DOM nodes (throws).
  • Prototype chain is lost in both approaches; the clone is a plain object.
  • Reflect.ownKeys includes symbol keys and non-enumerable properties, both of which Object.keys would miss.

Follow-up questions ​

  • How would you preserve the prototype chain? (Object.create(Object.getPrototypeOf(value)) instead of {}.)
  • How does JSON.parse(JSON.stringify(x)) compare? (Loses undefined, Date becomes string, no functions, no circular refs, no Map/Set.)
  • What about ArrayBuffer and TypedArray? (Use .slice() to clone the buffer.)
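
The prototype-chain follow-up can be sketched by swapping the plain `{}` for Object.create of the source's prototype. This is a minimal sketch, not the full implementation above: `cloneWithProto` is a hypothetical name, and the Date/RegExp/Map/Set branches are omitted for brevity.

```ts
// Hypothetical sketch of the prototype follow-up: same recursive shape
// as deepClone, but the clone is created with the source's prototype,
// so class instances keep their methods. Built-in branches omitted.
function cloneWithProto<T>(value: T, seen = new WeakMap()): T {
  if (value === null || typeof value !== 'object') return value;
  if (seen.has(value as object)) return seen.get(value as object);

  const clone: any = Array.isArray(value)
    ? []
    : Object.create(Object.getPrototypeOf(value));
  seen.set(value as object, clone); // register before recursing (cycles)

  for (const key of Reflect.ownKeys(value as object)) {
    clone[key] = cloneWithProto((value as any)[key], seen);
  }
  return clone;
}
```

With this variant, cloning a class instance preserves instanceof checks and prototype methods, which both deepClone and structuredClone lose.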

Challenge 4: Flatten ​

What the interviewer asks ​

Implement flatten for arrays (with optional depth) and for objects (dot-notation keys).

Array Flatten ​

Recursive ​

ts
function flattenArray<T>(arr: unknown[], depth: number = Infinity): T[] {
  const result: T[] = [];

  for (const item of arr) {
    if (Array.isArray(item) && depth > 0) {
      result.push(...flattenArray<T>(item, depth - 1));
    } else {
      result.push(item as T);
    }
  }

  return result;
}

Iterative (stack-based) ​

ts
function flattenArrayIterative<T>(arr: unknown[], depth: number = Infinity): T[] {
  const stack: Array<{ value: unknown; depth: number }> = arr
    .map((value) => ({ value, depth }))
    .reverse(); // reverse so we process left-to-right

  const result: T[] = [];

  while (stack.length > 0) {
    const { value, depth: d } = stack.pop()!;

    if (Array.isArray(value) && d > 0) {
      for (let i = value.length - 1; i >= 0; i--) {
        stack.push({ value: value[i], depth: d - 1 });
      }
    } else {
      result.push(value as T);
    }
  }

  return result;
}

Usage ​

ts
flattenArray([1, [2, [3, [4]]]]);    // [1, 2, 3, 4] (default depth Infinity)
flattenArray([1, [2, [3, [4]]]], 1); // [1, 2, [3, [4]]]
flattenArray([1, [2, [3, [4]]]], 2); // [1, 2, 3, [4]]
flattenArray([1, [2, [3, [4]]]], 3); // [1, 2, 3, 4] — each depth level unwraps one layer

Object Flatten ​

ts
function flattenObject(
  obj: Record<string, unknown>,
  prefix: string = '',
  result: Record<string, unknown> = {}
): Record<string, unknown> {
  for (const [key, value] of Object.entries(obj)) {
    const newKey = prefix ? `${prefix}.${key}` : key;

    if (
      value !== null &&
      typeof value === 'object' &&
      !Array.isArray(value) &&
      !(value instanceof Date)
    ) {
      flattenObject(value as Record<string, unknown>, newKey, result);
    } else {
      result[newKey] = value;
    }
  }

  return result;
}

Usage ​

ts
flattenObject({ a: { b: { c: 1 } }, d: [2, 3] });
// { 'a.b.c': 1, 'd': [2, 3] }

flattenObject({ user: { name: 'Alice', address: { city: 'Mumbai' } } });
// { 'user.name': 'Alice', 'user.address.city': 'Mumbai' }

How it works ​

Array flatten: The recursive version spreads nested arrays into the result while decrementing depth. The iterative version uses an explicit stack to avoid call-stack overflow on deeply nested input.

Object flatten: Recursively walks object keys, building up a dot-separated prefix. Non-plain-objects (arrays, Dates, null) are treated as leaf values.

Edge cases to mention ​

  • depth = 0 should return a shallow copy, not the original reference.
  • Arrays inside objects: decide whether to flatten them or keep as-is (usually keep).
  • Keys containing dots in the source object create ambiguity when flattening; mention this if asked.
  • The iterative array approach avoids stack overflow for arrays nested thousands of levels deep.

Follow-up questions ​

  • How would you implement unflatten (reverse the object operation)? (Split keys on ., build nested structure.)
  • What is Array.prototype.flat(depth) doing under the hood? (Essentially the recursive approach with a depth check.)
  • How does this relate to Redux state normalization? (Flattening nested API responses into { byId, allIds } shape.)
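
The unflatten follow-up can be sketched as the reverse walk: split each key on dots and rebuild the nesting. A minimal sketch; `unflattenObject` is a hypothetical name, and it assumes source keys contained no literal dots (the ambiguity noted above).

```ts
// Hypothetical sketch: reverse of flattenObject. Splits each key on '.'
// and rebuilds the nested structure level by level.
function unflattenObject(
  flat: Record<string, unknown>
): Record<string, unknown> {
  const result: Record<string, unknown> = {};

  for (const [path, value] of Object.entries(flat)) {
    const parts = path.split('.');
    let node = result;
    // Walk/create intermediate objects for all but the last segment
    for (let i = 0; i < parts.length - 1; i++) {
      node = (node[parts[i]] ??= {}) as Record<string, unknown>;
    }
    node[parts[parts.length - 1]] = value;
  }

  return result;
}
```

For example, unflattenObject({ 'a.b.c': 1, 'd': 2 }) rebuilds { a: { b: { c: 1 } }, d: 2 }.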

Challenge 5: LRU Cache ​

What the interviewer asks ​

Implement an LRU (Least Recently Used) cache with get and put in O(1) time.

Implementation ​

ts
class LRUCache<K, V> {
  private capacity: number;
  private cache: Map<K, V>;

  constructor(capacity: number) {
    if (capacity <= 0) throw new Error('Capacity must be positive');
    this.capacity = capacity;
    this.cache = new Map();
  }

  get(key: K): V | undefined {
    if (!this.cache.has(key)) return undefined;

    // Move to most recent: delete and re-insert
    const value = this.cache.get(key)!;
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }

  put(key: K, value: V): void {
    // If key exists, delete first to refresh insertion order
    if (this.cache.has(key)) {
      this.cache.delete(key);
    }

    this.cache.set(key, value);

    // Evict least recently used (first entry in Map)
    if (this.cache.size > this.capacity) {
      const lruKey = this.cache.keys().next().value!;
      this.cache.delete(lruKey);
    }
  }

  get size(): number {
    return this.cache.size;
  }
}

Usage ​

ts
const cache = new LRUCache<string, number>(3);

cache.put('a', 1);
cache.put('b', 2);
cache.put('c', 3);

cache.get('a');    // 1 — 'a' is now most recent
cache.put('d', 4); // evicts 'b' (least recently used)

cache.get('b');    // undefined — evicted
cache.get('c');    // 3
cache.get('d');    // 4

How it works ​

JavaScript's Map preserves insertion order. The first key in iteration order is the oldest entry. By deleting and re-setting a key on every access, we move it to the end (most recent). When the map exceeds capacity, we evict the first key.

Both get and put are O(1) because Map.delete, Map.set, Map.has, and Map.keys().next() are all O(1) operations.

Why this works but a plain object would not ​

Plain objects ({}) do not guarantee insertion order for integer-like keys (they are sorted numerically). Map guarantees insertion order for all key types.
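
The ordering difference is easy to demonstrate; this snippet is purely illustrative.

```ts
// Integer-like string keys are iterated in ascending numeric order on
// plain objects, regardless of insertion order -- fatal for an LRU.
const obj: Record<string, number> = {};
obj['10'] = 1;
obj['2'] = 2;
const objOrder = Object.keys(obj); // ['2', '10'] — numeric order wins

// Map preserves insertion order for every key type.
const map = new Map<string, number>();
map.set('10', 1);
map.set('2', 2);
const mapOrder = [...map.keys()]; // ['10', '2'] — insertion order kept
```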

Edge cases to mention ​

  • put with an existing key should update the value and refresh its position.
  • Capacity of 0 is invalid; throw or handle explicitly.
  • get for a missing key should return undefined, not throw.
  • Map.keys().next().value gives the least recently used key in O(1).

Follow-up: Adding TTL (Time-To-Live) ​

ts
interface CacheEntry<V> {
  value: V;
  expiresAt: number;
}

class LRUCacheWithTTL<K, V> {
  private capacity: number;
  private defaultTTL: number;
  private cache: Map<K, CacheEntry<V>>;

  constructor(capacity: number, defaultTTLMs: number = 60_000) {
    this.capacity = capacity;
    this.defaultTTL = defaultTTLMs;
    this.cache = new Map();
  }

  get(key: K): V | undefined {
    const entry = this.cache.get(key);
    if (!entry) return undefined;

    // Check expiry
    if (Date.now() > entry.expiresAt) {
      this.cache.delete(key);
      return undefined;
    }

    // Refresh position
    this.cache.delete(key);
    this.cache.set(key, entry);
    return entry.value;
  }

  put(key: K, value: V, ttlMs?: number): void {
    this.cache.delete(key);
    this.cache.set(key, {
      value,
      expiresAt: Date.now() + (ttlMs ?? this.defaultTTL),
    });

    if (this.cache.size > this.capacity) {
      const lruKey = this.cache.keys().next().value!;
      this.cache.delete(lruKey);
    }
  }
}

The TTL version wraps each value with an expiresAt timestamp. On get, if the entry has expired, it is deleted and undefined is returned. Expired entries are also naturally evicted when capacity is exceeded, since they sit at the front of the map (oldest entries).

Follow-up questions ​

  • How would you implement this without Map (classic doubly-linked list + hash map)? (Node with prev/next pointers, O(1) unlink and append.)
  • How is LRU used in frontend caching? (API response caching, image caching, memoization of expensive computations.)
  • What is the difference between LRU and LFU (Least Frequently Used)? (LRU evicts by recency, LFU evicts by access count; LFU needs a frequency counter and is O(1) with a more complex data structure.)
  • In a fintech dashboard context, which cache strategy would you use for real-time price data vs. static reference data? (Short TTL + LRU for prices, long TTL + LRU for reference data like currency codes.)
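
The first follow-up above can be sketched with the classic design: a hash map from key to list node plus a doubly linked list ordered least-to-most recently used. A minimal sketch under those assumptions; `LinkedLRUCache` and its helpers are hypothetical names, and Map is used only as a plain hash map (no reliance on its iteration order).

```ts
// Node in the recency list; head = least recent, tail = most recent.
class LRUNode<K, V> {
  prev: LRUNode<K, V> | null = null;
  next: LRUNode<K, V> | null = null;
  constructor(public key: K, public value: V) {}
}

class LinkedLRUCache<K, V> {
  private nodes = new Map<K, LRUNode<K, V>>(); // hash map: key -> node
  private head: LRUNode<K, V> | null = null;   // least recently used
  private tail: LRUNode<K, V> | null = null;   // most recently used

  constructor(private capacity: number) {}

  // O(1) pointer surgery: remove a node from the list
  private unlink(node: LRUNode<K, V>): void {
    if (node.prev) node.prev.next = node.next; else this.head = node.next;
    if (node.next) node.next.prev = node.prev; else this.tail = node.prev;
    node.prev = node.next = null;
  }

  // O(1): append a node at the most-recent end
  private append(node: LRUNode<K, V>): void {
    node.prev = this.tail;
    if (this.tail) this.tail.next = node; else this.head = node;
    this.tail = node;
  }

  get(key: K): V | undefined {
    const node = this.nodes.get(key);
    if (!node) return undefined;
    this.unlink(node); // move to most recent position
    this.append(node);
    return node.value;
  }

  put(key: K, value: V): void {
    const existing = this.nodes.get(key);
    if (existing) {
      existing.value = value;
      this.unlink(existing);
      this.append(existing);
      return;
    }
    const node = new LRUNode(key, value);
    this.nodes.set(key, node);
    this.append(node);
    if (this.nodes.size > this.capacity && this.head) {
      const lru = this.head; // evict from the least-recent end
      this.unlink(lru);
      this.nodes.delete(lru.key);
    }
  }
}
```

The Map-only version above is shorter and idiomatic in JavaScript; this layout is worth knowing because it is the language-agnostic answer interviewers usually expect.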

Frontend interview preparation reference.