Creating A REST API in Rust
Introduction
Why Choose Rust for REST APIs?
Rust has emerged as an exceptional choice for building modern REST APIs, offering a unique combination of performance, safety, and reliability that makes it ideal for backend systems:
Strong Type System: Rust's sophisticated type system makes it perfect for building reliable backend systems. You can model your API contracts precisely using types, ensuring data integrity throughout your application. Serialization and deserialization become type-safe operations, reducing the likelihood of runtime errors.
Rich Ecosystem: The Rust ecosystem has matured significantly, offering robust libraries for web development, database integration, authentication, caching, and more. This ecosystem enables rapid development while maintaining the performance and safety benefits of Rust.
Versatile Applications: Rust excels across diverse domains—from microservices and web APIs to system programming, blockchain applications, IoT devices, and even WebAssembly targets. This versatility means your team's Rust expertise can be leveraged across multiple projects and platforms.
Speed and Performance: Rust delivers near-C performance without the overhead of garbage collection. This translates to faster response times, higher throughput, and more efficient resource utilization—critical factors for APIs that need to handle thousands of concurrent requests.
Memory Safety and Security: Rust's ownership system prevents common security vulnerabilities like buffer overflows, null pointer dereferences, and memory leaks at compile time. For APIs that handle sensitive data or operate in security-critical environments, this built-in safety is invaluable.
Reliability and Stability: The strong type system and compile-time guarantees ensure that many runtime errors are caught before deployment. When your API compiles in Rust, you can be confident it will run reliably in production.
Why REST APIs?
REST (Representational State Transfer) remains the dominant architectural style for web APIs, offering several advantages over alternatives like GraphQL and gRPC:
Simplicity and Familiarity: REST APIs use standard HTTP methods and status codes, making them intuitive for developers to understand and implement. The learning curve is minimal compared to more complex protocols.
Universal Compatibility: REST APIs work with any HTTP client, from web browsers to mobile apps to command-line tools. This universal compatibility makes integration straightforward across different platforms and languages.
Caching and CDN Support: HTTP's built-in caching mechanisms work seamlessly with REST APIs, allowing for easy performance optimization through CDNs and proxy caches.
Tooling and Infrastructure: The REST ecosystem is mature, with extensive tooling for testing, monitoring, documentation, and debugging. Most API gateways, load balancers, and monitoring solutions are optimized for HTTP/REST traffic.
Compared to GraphQL: While GraphQL offers query flexibility, REST is simpler to implement, cache, and secure. REST's predictable endpoint structure makes it easier to optimize and monitor individual operations.
Compared to gRPC: While gRPC provides better performance for service-to-service communication, REST's HTTP foundation makes it more suitable for public APIs and web applications where broad compatibility is essential.
Getting Started
Project Setup
Let's begin by creating a new Rust project and setting up the basic structure:
# Create a new Rust project
cargo new my-api
cd my-api
# Initialize git repository
git init
git add .
git commit -m "feat: Initialize new Rust project with basic structure"
Dependencies
Update your Cargo.toml with the essential dependencies for building a REST API with OpenAPI documentation:
[package]
name = "my-api"
version = "0.1.0"
edition = "2021"
[dependencies]
aide = { version = "0.15.0", features = ["axum-json"] }
axum = "0.8.4"
schemars = "0.9.0"
serde = { version = "1.0", features = ["derive"] }
tokio = { version = "1.0", features = ["full"] }
Key Dependencies Explained:
- aide: Provides OpenAPI documentation generation integrated with Axum
- axum: Modern, ergonomic web framework built on top of hyper and tower
- schemars: Generates JSON schemas from Rust types for OpenAPI specifications
- serde: The de-facto standard for serialization/deserialization in Rust
- tokio: Async runtime that powers the entire application
Building the Core API
Basic Server Setup
Start by replacing the contents of src/main.rs with a basic server setup:
use aide::{
axum::{ApiRouter, IntoApiResponse, routing::get},
openapi::{Info, OpenApi},
};
use axum::{Extension, Json};
mod api;
async fn serve_api(Extension(api): Extension<OpenApi>) -> impl IntoApiResponse {
Json(api)
}
#[tokio::main]
async fn main() {
let mut api = OpenApi {
info: Info {
description: Some("My REST API with OpenAPI documentation".to_string()),
..Info::default()
},
..OpenApi::default()
};
let app = ApiRouter::new().nest_api_service("/api", api::create_api_router());
// Disable inference after API routes are added to prevent
// inclusion of unused schemas
aide::generate::infer_responses(false);
let app = app
.route("/api.json", get(serve_api))
.finish_api(&mut api)
.layer(Extension(api));
let port = 3000;
let port = std::env::var("PORT").map_or(port, |v| v.parse::<u16>().unwrap_or(port));
let listener = tokio::net::TcpListener::bind(("0.0.0.0", port))
.await
.unwrap();
axum::serve(listener, app).await.unwrap();
}
Creating the API Module
Create src/api.rs to organize your API routes:
use aide::axum::ApiRouter;
mod users;
pub(crate) fn create_api_router() -> ApiRouter {
ApiRouter::new().nest_api_service("/users", users::create_users_router())
}
This modular approach allows you to organize related endpoints together and compose them into larger API structures.
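As the API grows, each new resource gets its own module and one more nested service. A short sketch of what that might look like (the posts module here is hypothetical, structured the same way as the users module below):
use aide::axum::ApiRouter;
mod posts; // hypothetical src/api/posts.rs, mirroring users.rs
mod users;
pub(crate) fn create_api_router() -> ApiRouter {
    ApiRouter::new()
        // Each resource contributes its own documented sub-router.
        .nest_api_service("/users", users::create_users_router())
        .nest_api_service("/posts", posts::create_posts_router())
}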
Implementing Endpoints
Create src/api/users.rs with your first set of endpoints:
use aide::axum::{
ApiRouter, IntoApiResponse,
routing::{get_with, post_with},
};
use axum::{Json, extract::Path};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
pub(crate) fn create_users_router() -> ApiRouter {
ApiRouter::new()
.api_route_with(
"/",
post_with(create_user, |o| o.summary("Create a new user")),
|o| o.tag("Users"),
)
.api_route_with(
"/{user_id}",
get_with(get_user, |o| o.summary("Get a user by ID")),
|o| o.tag("Users"),
)
}
/// A user in the system.
#[derive(Serialize, Deserialize, JsonSchema)]
pub struct User {
/// The unique ID for the user.
pub id: String,
/// The user's email address.
pub email: String,
/// The user's display name.
pub name: String,
}
#[derive(Serialize, Deserialize, JsonSchema)]
struct CreateUserRequest {
/// Email address for the new user.
email: String,
/// Display name for the new user.
name: String,
}
#[derive(Serialize, Deserialize, JsonSchema)]
struct GetUserParams {
/// The unique ID of the requested user.
user_id: String,
}
async fn create_user(Json(request): Json<CreateUserRequest>) -> impl IntoApiResponse {
// In a real application, you would save to a database here
let user = User {
id: "123".to_string(), // Generate a real ID
email: request.email,
name: request.name,
};
Json(user)
}
async fn get_user(
Path(GetUserParams { user_id }): Path<GetUserParams>,
) -> impl IntoApiResponse {
// In a real application, you would fetch from a database here
let user = User {
id: user_id,
email: "[email protected]".to_string(),
name: "John Doe".to_string(),
};
Json(user)
}
Key Patterns:
- Type Safety: All request/response types derive JsonSchema for automatic OpenAPI generation
- Documentation: Use api_route_with to add summaries and tags to your endpoints (see the response-documentation sketch below)
- Modular Design: Separate routers for different resource types enable clean organization
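Building on the documentation pattern, you can also declare response types per operation so they appear in the generated spec. A sketch, assuming aide's TransformOperation response helper (verify the exact helper names against the aide version you use):
pub(crate) fn create_users_router() -> ApiRouter {
    ApiRouter::new()
        .api_route_with(
            "/",
            post_with(create_user, |o| {
                o.summary("Create a new user")
                    // Documents the JSON body of the success response.
                    .response::<200, Json<User>>()
            }),
            |o| o.tag("Users"),
        )
        .api_route_with(
            "/{user_id}",
            get_with(get_user, |o| {
                o.summary("Get a user by ID")
                    .response::<200, Json<User>>()
            }),
            |o| o.tag("Users"),
        )
}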
Testing Your API
Run your server and test the endpoints:
cargo run
Test the endpoints:
# Create a user
curl -X POST http://localhost:3000/api/users/ \
-H "Content-Type: application/json" \
-d '{"email": "[email protected]", "name": "Test User"}'
# Get a specific user
curl http://localhost:3000/api/users/123
# View the OpenAPI specification
curl http://localhost:3000/api.json
Adding API Documentation
One of the major advantages of using Aide is the automatic generation of interactive API documentation. You have several options for rendering this documentation.
Option 1: Using axum-swagger-ui
Add the dependency to your Cargo.toml:
axum-swagger-ui = "0.3.0"
Update your src/main.rs:
use axum::{Extension, Json, response::Html};
use axum_swagger_ui::swagger_ui;
// Add this route in your main function
let app = app
.route("/api.json", get(serve_api))
.route("/docs", get(|| async { Html(swagger_ui("/api.json")) }))
.finish_api(&mut api)
.layer(Extension(api));
Option 2: Using Aide's Built-in Swagger Feature
Add the swagger feature to aide:
aide = { version = "0.15.0", features = ["axum-json", "swagger"] }
Update your imports and route:
use aide::swagger::Swagger;
// Add this route
.route("/docs", Swagger::new("/api.json").axum_route())
Option 3: Using Scalar
For a modern, fast documentation interface:
aide = { version = "0.15.0", features = ["axum-json", "scalar"] }
use aide::scalar::Scalar;
// Add this route
.route("/docs", Scalar::new("/api.json").axum_route())
Option 4: Using Redoc
For a clean, three-panel documentation layout:
aide = { version = "0.15.0", features = ["axum-json", "redoc"] }
use aide::redoc::Redoc;
// Add this route
.route("/docs", Redoc::new("/api.json").axum_route())
After implementing any of these options, visit http://localhost:3000/docs to see your interactive API documentation.
Deployment
Deploying to Fly.io
Fly.io is an excellent platform for deploying Rust applications with global distribution.
1. Create a Dockerfile
FROM lukemathwalker/cargo-chef:latest-rust-1 AS chef
WORKDIR /app
FROM chef AS planner
COPY . .
RUN cargo chef prepare --recipe-path recipe.json
FROM chef AS builder
COPY --from=planner /app/recipe.json recipe.json
# Build dependencies - this is the caching Docker layer!
RUN cargo chef cook --release --recipe-path recipe.json
# Build application
COPY . .
RUN cargo build --release --bin my-api
# We do not need the Rust toolchain to run the binary!
FROM debian:bookworm-slim AS runtime
WORKDIR /app
COPY --from=builder /app/target/release/my-api /usr/local/bin
ENTRYPOINT ["/usr/local/bin/my-api"]
2. Create a .dockerignore
.git/
/target
node_modules/
*.log
3. Configure Fly.io
Create fly.toml:
app = 'my-api'
primary_region = 'dfw' # Choose your preferred region
[build]
[env]
PORT = '8080'
[http_service]
internal_port = 8080
force_https = true
auto_stop_machines = true
auto_start_machines = true
min_machines_running = 0
processes = ['app']
[[vm]]
memory = '1gb'
cpu_kind = 'shared'
cpus = 1
4. Deploy
# Install flyctl if you haven't already
curl -L https://fly.io/install.sh | sh
# Login to Fly.io
fly auth login
# Deploy your application
fly deploy
# Open your deployed app
fly open
Deploying to AWS Lambda
For serverless deployment, you can use AWS Lambda with the lambda_http crate.
1. Add Lambda Dependencies
lambda_http = "0.15.1"
2. Modify Your Main Function
Update src/main.rs:
// The aide, axum, and documentation imports from the earlier setup stay in place.
use lambda_http::{Error, run, tracing};
#[tokio::main]
async fn main() -> Result<(), Error> {
tracing::init_default_subscriber();
let mut api = OpenApi {
info: Info {
description: Some("My REST API with OpenAPI documentation".to_string()),
..Info::default()
},
..OpenApi::default()
};
let app = ApiRouter::new().nest_api_service("/api", api::create_api_router());
aide::generate::infer_responses(false);
let app = app
.route("/api.json", get(serve_api))
.route("/docs", get(|| async { Html(swagger_ui("/api.json")) }))
.finish_api(&mut api)
.layer(Extension(api));
run(app).await
}
3. Build for Lambda
# Install cargo-lambda
cargo install cargo-lambda
# Build for Lambda
cargo lambda build --release
# Deploy (requires AWS CLI configuration)
cargo lambda deploy
4. Alternative: Using SAM or CDK
You can also deploy using AWS SAM or CDK for more complex infrastructure requirements. The lambda_http integration makes your Axum application compatible with AWS Lambda's event handling automatically.
Best Practices and Next Steps
Error Handling
Implement comprehensive error handling using custom error types:
// Requires the serde_json crate in Cargo.toml for the json! macro.
use axum::{Json, http::StatusCode, response::IntoResponse};
use serde_json::json;
#[derive(Debug)]
pub enum ApiError {
    NotFound,
    BadRequest(String),
    InternalServerError,
}
impl IntoResponse for ApiError {
    fn into_response(self) -> axum::response::Response {
        // Use owned Strings so the BadRequest variant can move its message out.
        let (status, error_message) = match self {
            ApiError::NotFound => (StatusCode::NOT_FOUND, "Resource not found".to_string()),
            ApiError::BadRequest(msg) => (StatusCode::BAD_REQUEST, msg),
            ApiError::InternalServerError => (
                StatusCode::INTERNAL_SERVER_ERROR,
                "Internal server error".to_string(),
            ),
        };
        let body = Json(json!({ "error": error_message }));
        (status, body).into_response()
    }
}
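With IntoResponse in place, handlers can return Result and let the error type produce the HTTP response. A minimal sketch, reusing the User, GetUserParams, Path, and Json types from the users module (the lookup logic is hypothetical):
// A fallible variant of get_user that returns ApiError on failure.
async fn get_user_checked(
    Path(GetUserParams { user_id }): Path<GetUserParams>,
) -> Result<Json<User>, ApiError> {
    // Pretend lookup: only the hypothetical ID "123" exists.
    if user_id != "123" {
        return Err(ApiError::NotFound);
    }
    Ok(Json(User {
        id: user_id,
        email: "[email protected]".to_string(),
        name: "John Doe".to_string(),
    }))
}
Because ApiError implements IntoResponse, axum converts the Err branch into the matching status code and JSON body automatically.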
Database Integration
Add database support with SQLx:
sqlx = { version = "0.7", features = ["runtime-tokio-rustls", "postgres", "chrono", "uuid"] }
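A minimal sketch of a query helper, assuming a PostgreSQL pool and a hypothetical users table with id, email, and name columns:
use sqlx::PgPool;
/// Row shape for the hypothetical `users` table.
#[derive(sqlx::FromRow)]
struct UserRow {
    id: String,
    email: String,
    name: String,
}
async fn fetch_user(pool: &PgPool, user_id: &str) -> Result<Option<UserRow>, sqlx::Error> {
    // fetch_optional returns Ok(None) when no row matches the ID.
    sqlx::query_as::<_, UserRow>("SELECT id, email, name FROM users WHERE id = $1")
        .bind(user_id)
        .fetch_optional(pool)
        .await
}
The pool itself can be created once at startup (for example with PgPool::connect) and shared with handlers through axum state.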
Authentication and Authorization
Implement JWT-based authentication:
jsonwebtoken = "9.2"
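A small sketch of issuing and verifying tokens with jsonwebtoken (the claim fields and the fixed expiry are illustrative only):
use jsonwebtoken::{DecodingKey, EncodingKey, Header, Validation, decode, encode};
use serde::{Deserialize, Serialize};
/// Minimal claims for a signed access token.
#[derive(Debug, Serialize, Deserialize)]
struct Claims {
    sub: String, // subject, e.g. the user ID
    exp: usize,  // expiry as a UNIX timestamp
}
fn issue_token(user_id: &str, secret: &[u8]) -> Result<String, jsonwebtoken::errors::Error> {
    let claims = Claims {
        sub: user_id.to_owned(),
        // Hypothetical fixed expiry; in practice compute this from the current time.
        exp: 2_000_000_000,
    };
    encode(&Header::default(), &claims, &EncodingKey::from_secret(secret))
}
fn verify_token(token: &str, secret: &[u8]) -> Result<Claims, jsonwebtoken::errors::Error> {
    // Validation::default() checks the signature and the exp claim.
    decode::<Claims>(token, &DecodingKey::from_secret(secret), &Validation::default())
        .map(|data| data.claims)
}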
Testing
Add comprehensive tests:
[dev-dependencies]
tower = { version = "0.4", features = ["util"] }
tower-http = { version = "0.5", features = ["trace"] }
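With those in place, integration tests can drive the router in-process via tower's oneshot, without binding a socket. A sketch reusing create_api_router from src/api.rs (place it in a #[cfg(test)] module inside the crate, for example in src/main.rs):
#[cfg(test)]
mod tests {
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use tower::ServiceExt; // provides `oneshot`
    #[tokio::test]
    async fn get_user_returns_ok() {
        // Turn the documented router into a plain axum Router for the test.
        let mut openapi = aide::openapi::OpenApi::default();
        let app = crate::api::create_api_router().finish_api(&mut openapi);
        let response = app
            .oneshot(
                Request::builder()
                    .uri("/users/123")
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();
        assert_eq!(response.status(), StatusCode::OK);
    }
}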
Conclusion
Building REST APIs in Rust with Axum and Aide provides a powerful combination of performance, type safety, and developer experience. The automatic OpenAPI documentation generation eliminates the common problem of outdated API documentation, while Rust's type system ensures your API contracts are enforced at compile time.
Key benefits of this approach:
- Type-Safe APIs: Rust's type system prevents common API errors at compile time
- Automatic Documentation: OpenAPI specs are generated directly from your code
- High Performance: Near-zero overhead runtime with excellent concurrency
- Production Ready: Built-in support for monitoring, logging, and error handling
- Deployment Flexibility: Easy deployment to cloud platforms, containers, or serverless
As your API grows, you can extend this foundation with databases, authentication, caching, rate limiting, and other production concerns while maintaining the type safety and performance benefits that make Rust an excellent choice for backend development.
The ecosystem continues to mature rapidly, with new crates and tools being developed to support Rust web development. By choosing this stack, you're building on a foundation that will scale with your needs and provide long-term maintainability.