We Cut API Latency by 67% By Fixing One Simple Thing | Metrics Bus
Written by Prashant Basnet
👋 Welcome to my Signature, a space between logic and curiosity.
I’m a Software Development Engineer who loves turning ideas into systems that work beautifully.
This space captures the process: the bugs, breakthroughs, and “aha” moments that keep me building.
We cut our API latency by 67% and boosted throughput by 31%. We didn't change a single line of business logic. We didn't upgrade our servers. We fixed one stupidly simple thing.
Stop making your users wait for your logs.
The Problem:
As developers, we sprinkle logging throughout our applications, across the entire request journey.
This creates latency: the user waits unnecessarily for logging to finish before we can fulfill the request.
HOW IT HAPPENS
On average, a single request hits somewhere between 1 and 5 log statements.
THE HIDDEN COST
```typescript
logger.info('Processing request', { userId }); // Your code waits here while this writes
logger.debug('Query result', { data });        // Blocks again
logger.info('Request complete', { duration }); // And again
```

The Curiosity?
I was doing some load testing on our API. Our average latency was around 800-900 milliseconds, and our average throughput was around 11 requests per second.
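As a quick sanity check (my arithmetic, not from the article): a 67% cut from the ~850 ms midpoint of that range lands around 280 ms, and a 31% boost over 11 req/s is roughly 14.4 req/s.

```typescript
// Sanity-checking the claimed improvements against the stated baseline.
const baselineLatencyMs = 850;  // midpoint of the 800-900 ms range
const baselineThroughput = 11;  // requests per second

const newLatencyMs = baselineLatencyMs * (1 - 0.67); // 67% reduction
const newThroughput = baselineThroughput * 1.31;     // 31% boost

console.log(newLatencyMs.toFixed(1));  // "280.5"
console.log(newThroughput.toFixed(1)); // "14.4"
```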
Our Innovation:
We used an async metrics bus that accepts the log info but only writes it out after the response has been sent to the client.
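A minimal sketch of what such a bus can look like - the names here (`MetricsBus`, `drain`) are my own illustrative assumptions, not the article's actual code. The hot path does nothing but an in-memory array push; a separate drain step hands the batch to the slow writers later, after the response is already out.

```typescript
// Minimal async metrics bus sketch; all names are illustrative.
type LogEvent = { eventType: string; data: unknown; ts: number };

class MetricsBus {
  private buffer: LogEvent[] = [];

  // Called on the request path: just an array push, no I/O.
  emit(eventType: string, data: unknown): void {
    this.buffer.push({ eventType, data, ts: Date.now() });
  }

  // Called off the request path (e.g. by a timer): empties the
  // buffer and returns the batch for the actual writers.
  drain(): LogEvent[] {
    return this.buffer.splice(0);
  }
}

const bus = new MetricsBus();
bus.emit('request.start', { userId: 42 });
bus.emit('request.end', { durationMs: 12 });
console.log(bus.drain().length); // 2
console.log(bus.drain().length); // 0 - buffer was emptied
```

Because `emit` never touches the network or disk, its cost on the request path is just an in-memory push, independent of how slow the log destinations are.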
THE IMPLEMENTATION
Step 1: Zero-latency emission
```typescript
emit(eventType: string, data: any): void {
  const logObj = { eventType, data }; // build the event object
  this.bus.info(logObj);              // Returns immediately - no I/O
}
```

Step 2: Batched flush
```typescript
private readonly MAX_BUFFER_SIZE = 1000;
private readonly PROCESS_INTERVAL_MS = 5000;

setInterval(() => {
  this.flushLogs(); // Process batch async
}, this.PROCESS_INTERVAL_MS);
```
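The excerpt defines `MAX_BUFFER_SIZE` but only shows the timer trigger; a common companion (an assumption on my part, not shown in the article) is to also flush when the buffer hits that cap, so a traffic spike between timer ticks can't grow it unbounded.

```typescript
// Hypothetical size-triggered flush alongside the periodic timer.
class BufferedLogger {
  private readonly MAX_BUFFER_SIZE = 1000;
  private logBuffer: unknown[] = [];
  flushCount = 0; // exposed only to illustrate the trigger

  emit(log: unknown): void {
    this.logBuffer.push(log);
    // Flush early when the buffer hits its cap, rather than
    // waiting up to PROCESS_INTERVAL_MS for the timer.
    if (this.logBuffer.length >= this.MAX_BUFFER_SIZE) {
      this.flush();
    }
  }

  flush(): void {
    const batch = this.logBuffer.splice(0);
    if (batch.length > 0) this.flushCount++;
    // ...hand the batch off to the async writers here...
  }
}

const logger = new BufferedLogger();
for (let i = 0; i < 1000; i++) logger.emit({ seq: i });
console.log(logger.flushCount); // 1 - the cap triggered an early flush
```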
Step 3: Parallel writes using composite pattern
```typescript
private async flushLogs(): Promise<void> {
  const batch = this.logBuffer.splice(0);

  // Write to all destinations in parallel
  const writePromises = batch.map(log =>
    this.compositeWriter.write(log)
  );

  await Promise.allSettled(writePromises);
  // Individual failures don't fail the batch
}
```

THE RESULTS