Every distributed system needs APIs, and choosing the wrong API paradigm early creates friction that compounds over the life of a project. Build a chatty REST API for a mobile app with limited bandwidth, and you spend months optimizing. Choose gRPC for a public API consumed by third-party JavaScript clients, and you force every consumer to work around browser limitations. Pick GraphQL for a simple two-endpoint internal service, and you maintain a schema and resolver layer that adds complexity without value.
REST, GraphQL, and gRPC each solve real problems, but they solve different problems. This guide walks through the strengths and trade-offs of each paradigm, provides concrete code examples, and offers a decision framework you can apply to your own systems.
REST: Resource-Oriented Simplicity
REST (Representational State Transfer) has been the dominant API paradigm for over fifteen years. Its model is straightforward: resources are identified by URLs, operations map to HTTP methods, and responses are typically JSON. This simplicity is REST's greatest strength -- every developer knows how to call a REST endpoint, every language has an HTTP client, and the tooling ecosystem is enormous.
A typical set of REST endpoints in Express:
```typescript
import express from "express";
import { randomUUID } from "node:crypto";

const app = express();
app.use(express.json());

interface Product {
  id: string;
  name: string;
  price: number;
  category: string;
  inventory: number;
}

const products: Map<string, Product> = new Map();

// GET a single resource
app.get("/api/products/:id", (req, res) => {
  const product = products.get(req.params.id);
  if (!product) {
    return res.status(404).json({ error: "Product not found" });
  }
  res.json(product);
});

// GET a collection with filtering and pagination
app.get("/api/products", (req, res) => {
  const { category, limit = "20", offset = "0" } = req.query;
  let results = Array.from(products.values());
  if (category) {
    results = results.filter((p) => p.category === category);
  }
  const total = results.length;
  const paginated = results.slice(Number(offset), Number(offset) + Number(limit));
  res.json({
    data: paginated,
    pagination: { total, limit: Number(limit), offset: Number(offset) },
  });
});

// POST to create a resource
app.post("/api/products", (req, res) => {
  const product: Product = {
    // Spread first so the generated id cannot be overridden by the request body
    ...req.body,
    id: randomUUID(),
  };
  products.set(product.id, product);
  res.status(201).json(product);
});

// PUT to replace a resource
app.put("/api/products/:id", (req, res) => {
  if (!products.has(req.params.id)) {
    return res.status(404).json({ error: "Product not found" });
  }
  const updated = { ...req.body, id: req.params.id };
  products.set(req.params.id, updated);
  res.json(updated);
});

// DELETE a resource
app.delete("/api/products/:id", (req, res) => {
  if (!products.delete(req.params.id)) {
    return res.status(404).json({ error: "Product not found" });
  }
  res.status(204).send();
});

app.listen(3000);
```
When REST excels:
- Public APIs. REST's ubiquity means third-party developers can integrate without learning a new paradigm or toolchain. OpenAPI/Swagger provides standardized documentation and client generation.
- Resource-oriented domains. When your API naturally maps to CRUD operations on well-defined entities, REST's resource model is a clean fit.
- Caching. HTTP caching works natively with REST. GET requests can be cached at CDN, proxy, and browser levels using standard `Cache-Control` and `ETag` headers. This is a major advantage for read-heavy APIs.
- Simple integrations. Webhooks, third-party services, and no-code tools overwhelmingly expect REST endpoints.
Where REST struggles:
- Over-fetching and under-fetching. A REST endpoint returns a fixed shape. If the client only needs two fields from a 20-field resource, it still receives all 20. If the client needs data from three related resources, it makes three requests. This is particularly painful on mobile networks.
- Versioning. Evolving a REST API without breaking clients is notoriously difficult. URL versioning (`/v1/`, `/v2/`) leads to duplicated code. Header-based versioning is cleaner but less discoverable.
- Chatty interactions. Complex screens that aggregate data from multiple resources require multiple round trips, adding latency.
GraphQL: Flexible Queries for Complex Clients
GraphQL was created at Facebook to solve the exact problems REST struggles with: mobile clients needing precise data shapes and screens that aggregate data from multiple sources. Instead of predefined endpoints, GraphQL exposes a schema that clients query declaratively, requesting exactly the fields they need in a single request.
A GraphQL server with Apollo:
```typescript
import { ApolloServer } from "@apollo/server";
import { startStandaloneServer } from "@apollo/server/standalone";

const typeDefs = `#graphql
  type Product {
    id: ID!
    name: String!
    price: Float!
    category: Category!
    inventory: Int!
    reviews: [Review!]!
  }

  type Category {
    id: ID!
    name: String!
    products: [Product!]!
  }

  type Review {
    id: ID!
    rating: Int!
    comment: String!
    author: String!
    createdAt: String!
  }

  type Query {
    product(id: ID!): Product
    products(category: String, limit: Int = 20, offset: Int = 0): ProductConnection!
  }

  type ProductConnection {
    items: [Product!]!
    totalCount: Int!
    hasMore: Boolean!
  }

  type Mutation {
    createProduct(input: CreateProductInput!): Product!
    updateProduct(id: ID!, input: UpdateProductInput!): Product!
  }

  input CreateProductInput {
    name: String!
    price: Float!
    categoryId: ID!
    inventory: Int!
  }

  input UpdateProductInput {
    name: String
    price: Float
    inventory: Int
  }
`;

// productService, categoryService, and reviewService are assumed data-access modules
const resolvers = {
  Query: {
    product: (_, { id }) => productService.findById(id),
    products: (_, { category, limit, offset }) =>
      productService.findAll({ category, limit, offset }),
  },
  // Field resolvers run only when the client actually requests these fields
  Product: {
    category: (product) => categoryService.findById(product.categoryId),
    reviews: (product) => reviewService.findByProductId(product.id),
  },
  Mutation: {
    createProduct: (_, { input }) => productService.create(input),
    updateProduct: (_, { id, input }) => productService.update(id, input),
  },
};

const server = new ApolloServer({ typeDefs, resolvers });
const { url } = await startStandaloneServer(server, { listen: { port: 4000 } });
console.log(`GraphQL server ready at ${url}`);
```
A client query that fetches exactly what a product detail page needs:
```graphql
query ProductDetail($id: ID!) {
  product(id: $id) {
    name
    price
    inventory
    category {
      name
    }
    reviews {
      rating
      comment
      author
    }
  }
}
```
One request, one response, no over-fetching. The mobile version of this page might omit the reviews field entirely, reducing payload size without any server-side changes.
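On the wire this is an ordinary HTTP POST: the query travels as a string alongside its variables. A minimal client sketch using `fetch`, with the endpoint URL and error handling as illustrative assumptions:

```typescript
// A trimmed version of the ProductDetail query from above
const query = `
  query ProductDetail($id: ID!) {
    product(id: $id) {
      name
      price
      inventory
    }
  }
`;

async function fetchProductDetail(endpoint: string, id: string) {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables: { id } }),
  });
  // GraphQL returns HTTP 200 even for field-level errors; check the errors array
  const { data, errors } = await res.json();
  if (errors?.length) throw new Error(errors[0].message);
  return data.product;
}
```

Because every request is a POST to one endpoint, this is also where GraphQL's HTTP-caching limitations (discussed below) come from.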
When GraphQL excels:
- Mobile applications. Bandwidth and latency constraints make precise data fetching critical. GraphQL eliminates unnecessary data transfer and reduces round trips.
- Backend-for-Frontend (BFF) pattern. When multiple clients (web, mobile, internal tools) consume the same backend but need different data shapes, GraphQL lets each client request exactly what it needs.
- Rapidly evolving frontends. Frontend teams can add or remove fields from their queries without waiting for backend API changes, dramatically reducing coordination overhead.
- Complex, interconnected data. When entities have deep relationships and clients need to traverse those relationships in varied ways, GraphQL's graph model is natural.
Where GraphQL struggles:
- Caching. Because all queries hit a single POST endpoint, HTTP-level caching does not work. You need client-side caching (Apollo Client, urql) or server-side persisted queries.
- Complexity. The N+1 query problem is pervasive in naive GraphQL implementations. You must implement DataLoader or equivalent batching to avoid database query explosions. Rate limiting is also harder because a single query can be cheap or enormously expensive.
- File uploads. GraphQL has no native file upload support. You end up bolting on multipart request handling or falling back to REST for uploads.
- Simple APIs. For an API with five straightforward endpoints, GraphQL's schema, resolver, and tooling overhead is not justified.
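The N+1 problem above is usually solved with the DataLoader library. To show the underlying idea without the dependency, here is a hand-rolled batching loader that collects every `load` call made in the same microtask tick and resolves them all with one batch lookup -- a sketch of the technique, not a replacement for the real library:

```typescript
type BatchFn<K, V> = (keys: K[]) => Promise<V[]>;

function createLoader<K, V>(batchFn: BatchFn<K, V>) {
  let queue: { key: K; resolve: (v: V) => void }[] = [];
  let scheduled = false;

  return function load(key: K): Promise<V> {
    return new Promise<V>((resolve) => {
      queue.push({ key, resolve });
      if (!scheduled) {
        scheduled = true;
        // Flush once the current synchronous work (e.g. one resolver pass) finishes
        queueMicrotask(async () => {
          const batch = queue;
          queue = [];
          scheduled = false;
          const values = await batchFn(batch.map((e) => e.key));
          batch.forEach((e, i) => e.resolve(values[i]));
        });
      }
    });
  };
}
```

A resolver written as `category: (product) => categoryLoader.load(product.categoryId)` then issues one batched category query per request instead of one query per product in the result set.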
gRPC: High-Performance Service Communication
gRPC uses Protocol Buffers (protobuf) for serialization and HTTP/2 for transport. It is a binary protocol, meaning it is not human-readable, but it is significantly faster and more bandwidth-efficient than JSON-based alternatives. gRPC was built by Google for internal microservice communication, and that origin shows in its design priorities: performance, strong typing, and streaming.
Define a service with protobuf:
```protobuf
syntax = "proto3";

package products;

service ProductService {
  rpc GetProduct (GetProductRequest) returns (Product);
  rpc ListProducts (ListProductsRequest) returns (ProductList);
  rpc CreateProduct (CreateProductRequest) returns (Product);
  rpc StreamInventoryUpdates (InventoryFilter) returns (stream InventoryUpdate);
}

message Product {
  string id = 1;
  string name = 2;
  double price = 3;
  string category_id = 4;
  int32 inventory = 5;
}

message GetProductRequest {
  string id = 1;
}

message ListProductsRequest {
  string category = 1;
  int32 limit = 2;
  int32 offset = 3;
}

message ProductList {
  repeated Product products = 1;
  int32 total_count = 2;
}

message CreateProductRequest {
  string name = 1;
  double price = 2;
  string category_id = 3;
  int32 inventory = 4;
}

message InventoryFilter {
  repeated string product_ids = 1;
}

message InventoryUpdate {
  string product_id = 1;
  int32 new_inventory = 2;
  string updated_at = 3;
}
```
Implement the server in Node.js:
```javascript
import * as grpc from "@grpc/grpc-js";
import * as protoLoader from "@grpc/proto-loader";

const packageDefinition = protoLoader.loadSync("products.proto");
const proto = grpc.loadPackageDefinition(packageDefinition).products;

// productStore and inventoryEmitter are assumed application modules
const server = new grpc.Server();
server.addService(proto.ProductService.service, {
  getProduct: (call, callback) => {
    const product = productStore.get(call.request.id);
    if (!product) {
      return callback({
        code: grpc.status.NOT_FOUND,
        message: "Product not found",
      });
    }
    callback(null, product);
  },

  listProducts: (call, callback) => {
    const { category, limit, offset } = call.request;
    const results = productStore.list({ category, limit, offset });
    callback(null, { products: results.items, totalCount: results.total });
  },

  // Server streaming: push real-time inventory updates
  streamInventoryUpdates: (call) => {
    const { productIds } = call.request;
    const listener = (update) => {
      if (productIds.length === 0 || productIds.includes(update.productId)) {
        call.write(update);
      }
    };
    inventoryEmitter.on("update", listener);
    // Stop pushing when the client disconnects, or the listener leaks
    call.on("cancelled", () => {
      inventoryEmitter.off("update", listener);
    });
  },
});

server.bindAsync("0.0.0.0:50051", grpc.ServerCredentials.createInsecure(), () => {
  console.log("gRPC server running on port 50051");
});
```
When gRPC excels:
- Microservice-to-microservice communication. When both client and server are internal services you control, gRPC's strong typing, code generation, and performance are hard to beat. Protobuf serialization is 5-10x faster than JSON parsing.
- Streaming. gRPC natively supports client streaming, server streaming, and bidirectional streaming. This makes it ideal for real-time data feeds, log streaming, and long-running operations.
- Polyglot environments. Proto files generate client and server code for virtually every language. A Python ML service, a Go API gateway, and a C# order processor can all communicate through the same gRPC contracts with zero manual serialization code.
- High throughput. HTTP/2 multiplexing, binary serialization, and header compression make gRPC the best choice when you need to move large volumes of data between services efficiently.
Where gRPC struggles:
- Browser clients. Browsers cannot make native gRPC calls. You need gRPC-Web with a proxy (like Envoy) as an intermediary, which adds infrastructure complexity.
- Human readability. Binary payloads cannot be inspected with curl or browser dev tools. Debugging requires tooling like `grpcurl` or BloomRPC.
- Public APIs. Third-party developers expect REST or GraphQL. Asking external consumers to set up protobuf compilation and gRPC client libraries creates unnecessary adoption friction.
Performance Comparison
Understanding the performance characteristics helps inform the decision.
Serialization size. Protobuf messages are typically 30-50% smaller than equivalent JSON. For a product object with 10 fields, a JSON representation might be 400 bytes, while the protobuf equivalent is 180-220 bytes. This difference compounds at scale -- a service processing millions of messages per hour saves significant bandwidth and memory.
Serialization speed. Protobuf serialization and deserialization are 5-10x faster than JSON in most benchmarks. JSON parsing requires string processing, while protobuf operates on binary data with known field offsets.
Latency. gRPC over HTTP/2 benefits from multiplexed, persistent connections and header compression. REST over HTTP/1.1 reuses connections through keep-alive (the default since HTTP/1.1), but each connection handles only one request at a time, so concurrent requests require multiple connections. GraphQL typically sends larger request payloads (the query string) but consolidates multiple REST calls into a single round trip.
Throughput under load. In benchmarks testing 10,000 concurrent requests for a simple read operation, gRPC consistently handles 2-3x the throughput of equivalent REST endpoints on the same hardware. GraphQL falls between the two, depending on query complexity and resolver efficiency.
Versioning Strategies
API evolution is inevitable, and each paradigm handles it differently.
REST versioning typically uses URL-based (`/v1/products`, `/v2/products`) or header-based (`Accept: application/vnd.api.v2+json`) strategies. URL versioning is explicit and easy to understand but leads to code duplication. A more sustainable approach is to keep endpoints stable and make additive, non-breaking changes: add new fields, new endpoints, and new query parameters without removing existing ones.
GraphQL versioning is, by design, continuous. Fields are added to the schema without breaking existing queries. Deprecated fields are marked with the `@deprecated` directive, and clients are expected to migrate over time. This eliminates version numbers entirely -- but requires discipline to avoid schema bloat.
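In schema form, deprecation looks like this (the field names are illustrative):

```graphql
type Product {
  id: ID!
  name: String!
  # Kept so old clients keep working; introspection and IDE tooling
  # surface the deprecation to anyone still querying it
  title: String @deprecated(reason: "Use `name` instead.")
}
```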
gRPC versioning leverages protobuf's built-in forward and backward compatibility. New fields with new field numbers can be added without breaking existing clients. Removed fields should be marked `reserved` to prevent accidental reuse. For breaking changes, you create a new service version in the proto package (`package products.v2`).
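Reserving a removed field in the proto file prevents its number and name from being reused later. A sketch extending the Product message from above, with a hypothetical removed `sku` field:

```protobuf
message Product {
  // Field 6 used to be `sku`; reserving the number and name stops a
  // future field from reusing them and silently misreading old payloads.
  reserved 6;
  reserved "sku";

  string id = 1;
  string name = 2;
  double price = 3;
  string category_id = 4;
  int32 inventory = 5;
}
```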
A Decision Framework
Rather than arguing that one paradigm is universally best, use this decision framework based on your actual constraints.
Choose REST when you are building a public-facing API, when HTTP caching is important, when your domain maps cleanly to resources and CRUD operations, or when simplicity and broad tooling support matter more than raw performance.
Choose GraphQL when your clients need flexible data fetching, when you have multiple client types with different data requirements, when your data model has complex relationships that clients traverse in varied ways, or when frontend development velocity is a priority.
Choose gRPC when you are building internal microservice communication, when latency and throughput are critical, when you need streaming capabilities, or when you operate in a polyglot environment and want strong contracts with code generation.
Many production systems use more than one paradigm. A common pattern is gRPC for internal service-to-service communication, GraphQL as a BFF layer aggregating data from multiple gRPC services, and REST for public third-party integrations. The paradigms are not mutually exclusive.
Designing APIs That Last
The API paradigm you choose is important, but it is less important than the quality of your API design within that paradigm. Regardless of whether you use REST, GraphQL, or gRPC, invest in clear naming, consistent error handling, comprehensive documentation, and thoughtful versioning. A well-designed REST API will outperform a poorly designed GraphQL API in developer experience every time.
If you are designing APIs for a new system or evaluating whether your current API strategy is serving your team well, Maranatha Technologies can help. Our team has designed and built APIs across all three paradigms for systems ranging from startup MVPs to enterprise platforms. Visit our software architecture services or reach out directly to start the conversation.