Research & Papers

New Kids: An Architecture and Performance Investigation of Second-Generation Serverless Platforms

A new study of seven platforms finds that cold starts have become an afterthought, while warm request latency has dropped roughly 75%, from ~40 ms to ~10 ms.

Deep Dive

A team of seven researchers, led by Trever Schirmer, has published a comprehensive analysis titled 'New Kids: An Architecture and Performance Investigation of Second-Generation Serverless Platforms.' The paper identifies a clear evolution in serverless computing, moving beyond the first generation of containerized, centralized platforms. The new second-generation platforms leverage lightweight isolates (like WebAssembly) and edge-native deployment, fundamentally changing the performance profile for developers.

Through detailed architectural analysis of seven platforms and a massive microbenchmark evaluation of over 38 million function calls, the study quantifies the impact. The shift to this new architecture slashes warm request latency by approximately 75%, from around 40 milliseconds down to just 10 milliseconds. More significantly, it renders the notorious problem of 'cold starts' virtually obsolete, transforming it from a major performance hurdle into a minor consideration.
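A warm-latency microbenchmark of the kind described above can be sketched as follows. This is a minimal illustration, not the paper's actual harness: the study issues real requests against deployed platforms, whereas here a local callable stands in for the function under test, and the helper name `measure_warm_latency` is hypothetical.

```python
import statistics
import time

def measure_warm_latency(invoke, warmups=5, samples=100):
    """Time repeated invocations of an already-warm function.

    `invoke` is any zero-argument callable standing in for a deployed
    serverless function (a stand-in; a real benchmark would issue
    network requests to each platform's endpoint).
    """
    for _ in range(warmups):
        invoke()  # discard initial calls so cold-start cost is excluded
    latencies_ms = []
    for _ in range(samples):
        start = time.perf_counter()
        invoke()
        latencies_ms.append((time.perf_counter() - start) * 1000)
    return {
        "p50_ms": statistics.median(latencies_ms),
        "p99_ms": statistics.quantiles(latencies_ms, n=100)[98],
    }

if __name__ == "__main__":
    # Stand-in workload: a cheap local computation instead of a network call.
    print(measure_warm_latency(lambda: sum(range(1000))))
```

Separating warm-up calls from measured samples is what lets a benchmark like this distinguish the steady-state (~10 ms) figure from cold-start overhead.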

However, the research also notes a key trade-off: this performance leap comes at the cost of a more limited execution environment compared to traditional container-based systems. The findings provide crucial data for architects and developers choosing a platform, highlighting that the serverless landscape is no longer monolithic and that specific use cases will benefit from different architectural generations.

Key Points
  • Identifies a shift from 1st-gen (container/centralized) to 2nd-gen (lightweight isolates/edge) serverless platforms.
  • Benchmarks show a 75% reduction in warm request latency, from ~40ms to ~10ms.
  • Cold starts are minimized to an 'afterthought,' but the execution environment is more constrained.

Why It Matters

Enables developers to build far more responsive, real-time applications by drastically reducing serverless function latency.