
🚀 Triple-Layered Web Caching Strategy: How Memory, IndexedDB and HTTP Cache Improved Speed by 96%

1. The Modern Web Performance Challenge

In today’s digital age, page load speed is one of the most critical factors affecting user experience and SEO rankings. According to Google research, 53% of users will leave a website if it takes more than 3 seconds to load. With modern web applications, especially SPAs (Single Page Applications), continuous API calls to fetch data can cause slow loading, significantly reducing user satisfaction.

We encountered this issue with our AVA Manifest system, which provides configuration data and game information for our website:

  • High response time: 1.82 seconds for an API request
  • Large data volume: 28.5MB of transferred data
  • High request count: 674 requests for a single page load
  • High TTFB (Time To First Byte): 1.72 seconds

These numbers seriously impact UX and reduce the likelihood of users continuing to use our website.

2. Performance Optimization Concept

After analyzing the problem, we realized that the data from the API doesn’t change frequently, only updating after each new release. This opened up a significant opportunity to apply a multi-layered caching strategy:

  1. Memory Cache: Store data in JavaScript memory for fastest access
  2. IndexedDB: Store data in the user’s browser with large capacity and long lifetime
  3. HTTP Cache: Leverage the browser’s default caching mechanism combined with stale-while-revalidate

We also designed version control mechanisms and automatic stale data removal to ensure users always have the latest data without affecting the experience.

3. Selected Optimization Strategies

3.1. Multi-layered Cache with Different Lifetimes

  • Memory Cache (5 minutes): Store data in JavaScript memory, fastest access speed
  • IndexedDB (15 minutes): Store data in the browser, large capacity, persists between sessions
  • HTTP Cache (15 minutes + 1 hour stale-while-revalidate): Leverage browser’s default cache, combined with stale-while-revalidate for background updates
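The HTTP layer is configured on the server via the `Cache-Control` response header. A minimal sketch of how that header could be built from the durations above (the header-building helper and the Express-style route shown in the comment are our illustrative assumptions, not the original server code):

```javascript
// Durations for the HTTP layer, expressed in seconds as Cache-Control requires
const CACHE_DURATION_SECONDS = {
  HTTP: 15 * 60,   // 15 minutes of freshness
  STALE: 60 * 60   // 1 extra hour during which a stale response may be served
};

// Build the Cache-Control header value for the strategy described above
function buildCacheControlHeader({ HTTP, STALE }) {
  return `public, max-age=${HTTP}, stale-while-revalidate=${STALE}`;
}

// Illustrative Express-style usage (route name is hypothetical):
// app.get('/api/manifest', (req, res) => {
//   res.set('Cache-Control', buildCacheControlHeader(CACHE_DURATION_SECONDS));
//   res.json(manifestData);
// });

console.log(buildCacheControlHeader(CACHE_DURATION_SECONDS));
```

With this header, the browser serves the response from its HTTP cache for 15 minutes, and for up to one hour after that it may serve the stale copy while refetching in the background.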

3.2. Version Control and Automatic Cleanup

  • Version Control: Create a hash from data to check for changes
  • Auto Cleanup: Automatically delete stale data after expiration
  • Background Revalidation: Update data in the background while users use cached data

4. Why Choose These Strategies?

4.1. Aligned with User Behavior

Game sessions typically last 15-30 minutes, so the cache configuration is designed to match this timeframe:

  • Memory cache (5 minutes): Sufficient to handle repeated requests on the same screen
  • IndexedDB (15 minutes): Matches the average duration of a game session
  • HTTP Cache (15 minutes + 1 hour SWR): Ensures smooth UX even when there are server-side changes

4.2. Balance Between Performance and Freshness

The “stale-while-revalidate” strategy allows us to serve cached data immediately (increasing speed) while silently updating new data from the server (ensuring freshness).

4.3. Reducing Server Load

With a high cache hit rate (95%), we significantly reduce the number of requests to the server, helping:

  • Reduce bandwidth costs
  • Decrease backend system load
  • Increase system load capacity

5. Solution Implementation

5.1. Data Structure for Memory Cache

Memory Cache Configuration

// Define cache durations for different strategies
const CACHE_DURATION = {
  MEMORY: 5 * 60 * 1000,        // 5 minutes for memory cache
  HTTP: 15 * 60 * 1000,         // 15 minutes for HTTP cache
  STALE: 60 * 60 * 1000         // 1 hour for stale-while-revalidate
};

// In-memory cache objects
const appDataCache = {
  data: null,        // The cached data
  version: null,     // Version hash for comparison
  timestamp: null    // When the data was cached
};
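A minimal sketch of read/write helpers over this object (the helper names `setMemoryCache` and `getMemoryCache` are ours, added for illustration; the original code performs these checks inline):

```javascript
const CACHE_DURATION = { MEMORY: 5 * 60 * 1000 };
const appDataCache = { data: null, version: null, timestamp: null };

// Store data in the memory cache and stamp it with the current time
function setMemoryCache(data, version) {
  appDataCache.data = data;
  appDataCache.version = version;
  appDataCache.timestamp = Date.now();
}

// Return cached data only while it is fresher than CACHE_DURATION.MEMORY
function getMemoryCache() {
  const { data, timestamp } = appDataCache;
  if (data !== null && Date.now() - timestamp < CACHE_DURATION.MEMORY) {
    return data;
  }
  return null; // cache miss or expired entry
}
```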

5.2. IndexedDB Initialization

IndexedDB Setup

// Initialize IndexedDB
function initializeDatabase() {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open('app-cache', 1);
    
    // Handle database opening errors
    request.onerror = () => reject(request.error);
    
    // Handle successful database open
    request.onsuccess = () => resolve(request.result);
    
    // Handle database upgrades/creation
    request.onupgradeneeded = (event) => {
      const db = event.target.result;
      
      // Create object store if it doesn't exist
      if (!db.objectStoreNames.contains('data-store')) {
        const store = db.createObjectStore('data-store', { keyPath: 'id' });
        
        // Create indexes for efficient queries
        store.createIndex('version', 'version');
        store.createIndex('timestamp', 'timestamp');
      }
    };
  });
}
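The cache flow in section 5.5 calls `saveToIndexedDB` and `loadFromIndexedDB`. A sketch of what those helpers might look like on top of `initializeDatabase` (this is our assumed shape, not the original implementation; it also assumes the `generateDataHash` function from section 5.4 and the `CACHE_DURATION` constants are in scope):

```javascript
// Persist data to the 'data-store' object store under the given id
async function saveToIndexedDB(id, data) {
  const db = await initializeDatabase();
  return new Promise((resolve, reject) => {
    const transaction = db.transaction('data-store', 'readwrite');
    transaction.objectStore('data-store').put({
      id,
      data,
      version: generateDataHash(data), // hash from section 5.4
      timestamp: Date.now()
    });
    transaction.oncomplete = resolve;
    transaction.onerror = () => reject(transaction.error);
  });
}

// Load a record by id, treating anything older than the
// 15-minute IndexedDB lifetime as a cache miss
async function loadFromIndexedDB(id) {
  const db = await initializeDatabase();
  return new Promise((resolve, reject) => {
    const request = db.transaction('data-store', 'readonly')
      .objectStore('data-store')
      .get(id);
    request.onsuccess = () => {
      const record = request.result;
      if (record && Date.now() - record.timestamp < CACHE_DURATION.HTTP) {
        resolve(record);
      } else {
        resolve(null); // missing or expired
      }
    };
    request.onerror = () => reject(request.error);
  });
}
```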

5.3. Automatic Cleanup Mechanism for Stale Data

Cleanup Process

// Remove expired data from IndexedDB
async function cleanupExpiredData() {
  try {
    const db = await initializeDatabase();
    const transaction = db.transaction('data-store', 'readwrite');
    const store = transaction.objectStore('data-store');
    
    // Get all stored items
    const request = store.getAll();
    
    request.onsuccess = () => {
      const items = request.result;
      const now = Date.now();
      let removedCount = 0;
      
      // Check each item for expiration
      items.forEach(item => {
        const age = now - item.timestamp;
        
        // Remove if older than the 15-minute IndexedDB lifetime
        if (age > CACHE_DURATION.HTTP) {
          console.log(`Removing expired item: ${item.id}`);
          store.delete(item.id);
          removedCount++;
        }
      });
      
      if (removedCount > 0) {
        console.log(`Cleanup completed: removed ${removedCount} items`);
      }
    };
    
    // Return a promise that resolves when transaction is complete
    return new Promise((resolve, reject) => {
      transaction.oncomplete = resolve;
      transaction.onerror = () => reject(transaction.error);
    });
  } catch (error) {
    console.error('Cleanup failed:', error);
  }
}

5.4. Version Control with Hash Function

Version Hashing

// Generate a hash from data for version control
function generateDataHash(data) {
  // Convert data to string
  const str = JSON.stringify(data);
  
  // Simple hash function
  let hash = 0;
  for (let i = 0; i < str.length; i++) {
    const char = str.charCodeAt(i);
    hash = ((hash << 5) - hash) + char;
    hash = hash & hash; // Convert to 32-bit integer
  }
  
  // Return positive hex string
  return Math.abs(hash).toString(16);
}

// Check if versions are different
function hasVersionChanged(oldVersion, newData) {
  const newVersion = generateDataHash(newData);
  return oldVersion !== newVersion;
}
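One caveat worth noting: `JSON.stringify` is sensitive to key order, so two logically identical objects can produce different hashes if the server serializes keys in a different order. A hedged sketch of a key-sorting serializer that avoids this (the `stableStringify` helper is our addition, not part of the original code):

```javascript
// Serialize with object keys sorted so that logically equal objects
// always produce the same string, and therefore the same version hash
function stableStringify(value) {
  if (value === null || typeof value !== 'object') {
    return JSON.stringify(value);
  }
  if (Array.isArray(value)) {
    return '[' + value.map(stableStringify).join(',') + ']';
  }
  const keys = Object.keys(value).sort();
  const entries = keys.map(k => JSON.stringify(k) + ':' + stableStringify(value[k]));
  return '{' + entries.join(',') + '}';
}
```

Swapping `JSON.stringify(data)` for `stableStringify(data)` inside `generateDataHash` would make version comparison robust to key reordering.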

5.5. Cache Handling Logic in React/Redux Application

Cache Flow Logic

// Example with Redux Saga
function* fetchDataWithCaching(action) {
  // Step 1: Try memory cache first (fastest)
  if (appDataCache.data && Date.now() - appDataCache.timestamp < CACHE_DURATION.MEMORY) {
    console.log('Using memory cache');
    
    // Return cached data immediately
    yield put({ type: 'FETCH_DATA_SUCCESS', payload: appDataCache.data });
    
    // Revalidate after responding: the UI already has cached data above,
    // so this blocking yield does not delay rendering
    const freshData = yield call(api.fetchData);
    if (freshData && hasVersionChanged(appDataCache.version, freshData)) {
      console.log('Updating cache with new version');
      
      // Update the cache with new data
      appDataCache.data = freshData;
      appDataCache.version = generateDataHash(freshData);
      appDataCache.timestamp = Date.now();
      
      // Save to IndexedDB
      yield call(saveToIndexedDB, 'app-data', freshData);
      
      // Update UI with new data
      yield put({ type: 'FETCH_DATA_SUCCESS', payload: freshData });
    }
    
    return; // Exit early
  }
  
  // Step 2: Try IndexedDB if memory cache missing/expired
  try {
    const cachedData = yield call(loadFromIndexedDB, 'app-data');
    
    if (cachedData) {
      console.log('Using IndexedDB cache');
      
      // Update memory cache
      appDataCache.data = cachedData.data;
      appDataCache.version = cachedData.version;
      appDataCache.timestamp = Date.now();
      
      // Return cached data immediately
      yield put({ type: 'FETCH_DATA_SUCCESS', payload: cachedData.data });
      
      // Background revalidation
      const freshData = yield call(api.fetchData);
      if (freshData && hasVersionChanged(cachedData.version, freshData)) {
        // Update caches with new data
        appDataCache.data = freshData;
        appDataCache.version = generateDataHash(freshData);
        appDataCache.timestamp = Date.now();
        
        // Save to IndexedDB
        yield call(saveToIndexedDB, 'app-data', freshData);
        
        // Update UI with new data
        yield put({ type: 'FETCH_DATA_SUCCESS', payload: freshData });
      }
      
      return; // Exit early
    }
  } catch (error) {
    console.error('Error loading from IndexedDB:', error);
  }
  
  // Step 3: No cache available, fetch from API
  try {
    console.log('Fetching from API');
    yield put({ type: 'FETCH_DATA_LOADING' });
    
    const freshData = yield call(api.fetchData);
    
    if (freshData) {
      // Update memory cache
      appDataCache.data = freshData;
      appDataCache.version = generateDataHash(freshData);
      appDataCache.timestamp = Date.now();
      
      // Save to IndexedDB
      yield call(saveToIndexedDB, 'app-data', freshData);
      
      // Update UI
      yield put({ type: 'FETCH_DATA_SUCCESS', payload: freshData });
    }
  } catch (error) {
    yield put({ type: 'FETCH_DATA_ERROR', error });
  }
}

6. Mechanism of Operation

6.1. Data Request Processing Flow

  1. Check Memory Cache:
    • If data exists and hasn’t expired (< 5 minutes) → Use immediately
    • Silently call API to check for new version
  2. Check IndexedDB:
    • If memory cache doesn’t exist or has expired → Look in IndexedDB
    • If data exists and hasn’t expired (< 15 minutes) → Use and update memory cache
    • Silently call API to check for new version
  3. Call API (if no cache or all have expired):
    • Use fetch with HTTP Cache-Control headers
    • Save results to both memory cache and IndexedDB
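For step 3, a sketch of how the fetch in this flow could be issued so the browser's HTTP cache participates (the URL, `cache: 'default'` mode, and helper names are our illustrative assumptions; the actual freshness window comes from the server's `Cache-Control: max-age=900, stale-while-revalidate=3600` response header):

```javascript
// Build fetch options that let the browser HTTP cache participate
function buildFetchOptions() {
  return {
    method: 'GET',
    cache: 'default',  // honor the server's Cache-Control response header
    headers: { Accept: 'application/json' }
  };
}

// Fetch JSON from the API, surfacing HTTP errors explicitly
async function fetchFromApi(url) {
  const response = await fetch(url, buildFetchOptions());
  if (!response.ok) {
    throw new Error(`API request failed: ${response.status}`);
  }
  return response.json();
}
```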

6.2. Background Revalidation

When data is served from cache, we still send a background request to the server to check if there’s a new version:

  1. Calculate hash from current data
  2. Compare with hash of new data
  3. If different → Update cache and UI
  4. Users aren’t interrupted during this process

6.3. Automatic Cleanup of Stale Data

Whenever data is accessed from IndexedDB, we run a cleanup process to remove expired data:

  1. Scan all items in IndexedDB
  2. Check timestamp of each item
  3. Remove items that have exceeded cache time

7. Optimization Results

7.1. Performance Metrics Before and After Optimization

| Metric         | Before  | After   | Improvement |
|----------------|---------|---------|-------------|
| Load Time      | 1.82s   | 70.79ms | 96.1% ⬇️    |
| Data Volume    | 28.5 MB | 3.0 MB  | 89.5% ⬇️    |
| Request Count  | 674     | 533     | 20.9% ⬇️    |
| TTFB           | 1.72s   | 61.23ms | 96.4% ⬇️    |
| Cache Hit Rate | 0%      | 95%     | 95% ⬆️      |

7.2. Benefits

  • Significantly improved page load speed: 96.1% reduction in response time
  • Bandwidth savings: 89.5% reduction in data transfer
  • Smooth user experience: Data served immediately from cache
  • Reduced server load: 95% of requests served from cache
  • Offline functionality: Application can still function when connection is lost

7.3. Limitations

  • Data update delay: It can take up to 15 minutes for users to see the latest data
  • Complexity in version management: Requires a good version control mechanism
  • Debugging challenges: Cache can make debugging more complex

8. Future Improvements

8.1. Cache Partitioning

Divide cache into different parts based on the frequency of data changes:

  • Static data (changes by version): Long-term cache
  • Dynamic data (changes frequently): Short-term cache
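A sketch of how per-partition lifetimes could be expressed (the partition names and TTL values are illustrative assumptions, not a finalized design):

```javascript
// Per-partition cache lifetimes in milliseconds
const PARTITION_TTL = {
  static: 24 * 60 * 60 * 1000,  // version-gated data: cache for a day
  dynamic: 5 * 60 * 1000        // frequently changing data: 5 minutes
};

// Resolve the lifetime for a partition, falling back to the short TTL
// when the partition is unknown (safer to over-refresh than serve stale)
function ttlForPartition(partition) {
  return PARTITION_TTL[partition] ?? PARTITION_TTL.dynamic;
}
```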

8.2. Service Worker

Integrate Service Worker to:

  • Better control cache
  • Support complete offline functionality
  • Automatically update when there’s a new version

// Simple Service Worker registration
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker.register('/service-worker.js')
      .then(registration => {
        console.log('Service Worker registered with scope:', registration.scope);
      })
      .catch(error => {
        console.error('Service Worker registration failed:', error);
      });
  });
}

// Basic service-worker.js example
const CACHE_NAME = 'app-cache-v1';
const URLS_TO_CACHE = [
  '/',
  '/index.html',
  '/styles/main.css',
  '/scripts/main.js',
  '/api/static-data'
];

// Install event - cache critical assets
self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(cache => {
        console.log('Opened cache');
        return cache.addAll(URLS_TO_CACHE);
      })
  );
});

// Fetch event - serve from cache if available
self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request)
      .then(response => {
        // Cache hit - return response
        if (response) {
          return response;
        }
        
        // Clone the request
        const fetchRequest = event.request.clone();
        
        // Make network request and cache the response
        return fetch(fetchRequest).then(response => {
          // Check if valid response
          if (!response || response.status !== 200 || response.type !== 'basic') {
            return response;
          }
          
          // Clone the response
          const responseToCache = response.clone();
          
          // Open cache and store response
          caches.open(CACHE_NAME)
            .then(cache => {
              cache.put(event.request, responseToCache);
            });
            
          return response;
        });
      })
  );
});

8.3. Cache API

Use Cache API combined with Service Worker to:

  • Store complete HTTP responses
  • Manage cache at request/response level

8.4. Data Optimization

  • Apply gzip/brotli for data transfer
  • Use GraphQL to only fetch necessary data
  • Break data into smaller parts to load as needed

// Example GraphQL query to fetch only needed data
const fetchPartialData = async (neededFields) => {
  const query = `
    query GetAppData {
      appData {
        ${neededFields.join('\n')}
      }
    }
  `;
  
  const response = await fetch('/graphql', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ query })
  });
  
  const result = await response.json();
  return result.data.appData;
};

// Usage
const userProfile = await fetchPartialData([
  'id',
  'username',
  'email',
  `preferences {
    theme
    notifications
  }`
]);

9. Summary and Recommendations


9.1. When to Apply Multi-layered Caching Strategy

  • Data changes infrequently or on a cycle
  • High API response times
  • Large data volumes
  • Desire for offline application functionality

9.2. Basic Implementation Steps

  1. Analyze data: Identify data types and change frequency
  2. Design cache strategy: Define layers and lifetimes
  3. Implement IndexedDB: Initialize DB, create object stores and indexes
  4. Build cache logic: Implement storage, retrieval, and cleanup
  5. Version control: Create hash mechanism to compare versions
  6. Integrate with data flow: Apply in Redux saga or React hooks

9.3. Final Recommendations

  1. Measure before and after: Always collect metrics before and after optimization to evaluate effectiveness
  2. Consider tradeoffs: Balance between speed and data freshness
  3. Prioritize UX: Always prioritize user experience, display data from cache first then update later
  4. Start simple: Implement step by step, beginning with memory cache then gradually adding more complex layers
  5. Handle errors well: Always have fallback mechanisms when cache fails

Performance optimization is a continuous journey, not a destination. By applying multi-layered caching strategies, you can significantly improve user experience and reduce load on your backend systems.
