Running R in Production in 2026: A Practical Guide

· 5 min read · Updated March 13, 2026 · intermediate
r production devops docker cron deployment

Running R in production is fundamentally different from running it interactively in RStudio. You need reliability, scheduling, monitoring, and often containerization. This guide covers the essential tools and patterns for production R in 2026.

Why Run R in Production?

R was designed for interactive analysis, not server-side execution. Yet organizations increasingly need to:

  • Run scheduled reports and dashboards
  • Execute ML model inference at scale
  • Power APIs that serve predictions
  • Automate data pipelines that run without human intervention

Modern R tooling makes all of this possible, often with minimal overhead.

Scheduling R Scripts

The simplest production pattern is scheduling R scripts to run at intervals.

cron Jobs

On Linux servers, cron is the standard scheduler:

# Edit crontab
crontab -e

# Run an R script every day at 2 AM
0 2 * * * /usr/bin/Rscript /home/user/scripts/daily_report.R >> /var/log/r_jobs.log 2>&1

Key tips for cron with R:

  • Use absolute paths everywhere
  • Redirect output to a log file
  • Set the working directory explicitly in your script

# Start of your scheduled R script
setwd("/home/user/projects/myproject")

# Load required packages
library(dplyr)
library(glue)

# Your production logic here
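
One more cron pitfall: if a run takes longer than its interval, instances pile up. Wrapping the command in flock prevents overlap. This sketch uses echo as a stand-in for the real Rscript call, and the lock path is an arbitrary choice:

```shell
# -n: exit immediately instead of queuing if a previous run still holds the lock
flock -n /tmp/daily_report.lock echo "job body runs here"
```

In a real crontab entry, you would place flock -n /tmp/daily_report.lock in front of the /usr/bin/Rscript command.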

The targets Package

For complex pipelines, the targets package provides DAG-based scheduling with automatic dependency tracking:

# _targets.R
library(targets)
library(tarchetypes)  # provides tar_render()

tar_option_set(
  packages = c("dplyr", "ggplot2", "readr")
)

list(
  tar_target(raw_data, read_csv("data/input.csv")),
  tar_target(cleaned_data, raw_data |> filter(!is.na(value))),
  tar_target(plot, ggplot(cleaned_data, aes(x, y)) + geom_point()),
  tar_render(report, "report.Rmd")
)

Run with tar_make() to execute only out-of-date targets. This is ideal for data pipelines where you want to avoid re-running expensive steps.

Containerizing R with Docker

Docker has become essential for reproducible R deployments.

Basic Dockerfile for R

FROM r-base:4.3.2

# Install system dependencies
RUN apt-get update && apt-get install -y \
    libcurl4-openssl-dev \
    libssl-dev \
    libxml2-dev \
    libfontconfig1-dev \
    libgit2-dev \
    && rm -rf /var/lib/apt/lists/*

# Set working directory
WORKDIR /app

# Copy your files
COPY . .

# Install R packages
RUN Rscript -e "install.packages(c('tidyverse', 'plumber'))"

# Run your R script or API
CMD ["Rscript", "app.R"]

Multi-stage Build for Smaller Images

# Build stage
FROM r-base:4.3.2 AS builder
RUN apt-get update && apt-get install -y libcurl4-openssl-dev libssl-dev
WORKDIR /app
COPY . .
RUN Rscript -e "install.packages(c('tidyverse', 'plumber'), repos = 'https://cloud.r-project.org')"

# Runtime stage
FROM r-base:4.3.2
COPY --from=builder /usr/local/lib/R/site-library /usr/local/lib/R/site-library
COPY --from=builder /app /app
WORKDIR /app
CMD ["Rscript", "app.R"]

This keeps the final image small by copying only the installed packages, not the build tools.
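
A .dockerignore also keeps the build context small and stops local artifacts leaking into the image. These entries are typical examples, not a canonical list; adjust them to your project:

```
# .dockerignore
.git
.Rhistory
.RData
*.Rproj
renv/library
```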

REST APIs with Plumber

The plumber package turns R functions into REST APIs:

# api.R
library(plumber)

#* @get /predict
#* @param age:numeric
#* @param income:numeric
#* @serializer json
function(age, income) {
  # Load your model (in production, load it once at the top of the
  # file rather than on every request)
  model <- readRDS("model.rds")
  
  # Make prediction
  new_data <- data.frame(age = as.numeric(age), income = as.numeric(income))
  prediction <- predict(model, new_data)
  
  list(
    prediction = prediction,
    confidence = 0.87  # placeholder; derive from your model in practice
  )
}

Run with plumber::plumb("api.R")$run(host = "0.0.0.0", port = 8000).

Dockerizing Plumber

FROM r-base:4.3.2

RUN apt-get update && apt-get install -y \
    libcurl4-openssl-dev libssl-dev

WORKDIR /app
COPY . .

RUN Rscript -e "install.packages('plumber')"

EXPOSE 8000
CMD ["Rscript", "-e", "plumber::plumb('api.R')$run(host='0.0.0.0', port=8000)"]

Environment Management with renv

The renv package ensures consistent package versions across development and production:

# Initialize renv in your project
renv::init()

# Snapshot your current state
renv::snapshot()

# In production, restore the exact same state
renv::restore()

Commit the lockfile, renv.lock, to version control; restoring from it in production guarantees the same package versions you developed against.
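
For reference, renv.lock is plain JSON. A trimmed excerpt looks roughly like this (the versions shown are illustrative):

```json
{
  "R": { "Version": "4.3.2" },
  "Packages": {
    "dplyr": {
      "Package": "dplyr",
      "Version": "1.1.4",
      "Source": "Repository",
      "Repository": "CRAN"
    }
  }
}
```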

Monitoring and Logging

Production R needs observability.

Basic Logging

logger <- function(message, level = "INFO") {
  timestamp <- format(Sys.time(), "%Y-%m-%d %H:%M:%S")
  cat(sprintf("[%s] [%s] %s\n", timestamp, level, message))
}

logger("Starting data processing")
# [2026-03-13 14:30:01] [INFO] Starting data processing
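
Under cron, the stdout redirect already captures this, but writing straight to a file is often more robust. A minimal extension of the same logger (the default log path is an assumption):

```r
log_to_file <- function(message, level = "INFO", file = "/tmp/r_jobs.log") {
  timestamp <- format(Sys.time(), "%Y-%m-%d %H:%M:%S")
  line <- sprintf("[%s] [%s] %s\n", timestamp, level, message)
  cat(line)                              # echo to stdout as before
  cat(line, file = file, append = TRUE)  # and append to the log file
}

log_to_file("Nightly job started")
```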

Error Handling in Production

library(purrr)  # safely()
library(readr)  # read_csv()
library(dplyr)  # summarise()

safe_process <- safely(function() {
  # Your processing code
  result <- read_csv("data.csv")
  result |> summarise(mean = mean(value))
}, otherwise = NULL, quiet = FALSE)

result <- safe_process()

if (is.null(result$result)) {
  logger(paste("Error:", result$error), "ERROR")
  # Send alert, e.g., via slack/email
}
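
The same guard can be written in base R with tryCatch, optionally with a retry loop for transient failures such as flaky network calls. The attempt count and wait time here are arbitrary choices:

```r
with_retry <- function(f, attempts = 3, wait = 1) {
  for (i in seq_len(attempts)) {
    # Capture the error object instead of aborting
    result <- tryCatch(f(), error = function(e) e)
    if (!inherits(result, "error")) return(result)
    message(sprintf("Attempt %d failed: %s", i, conditionMessage(result)))
    if (i < attempts) Sys.sleep(wait)
  }
  stop("All attempts failed: ", conditionMessage(result))
}

# Succeeds on the first try
with_retry(function() 40 + 2)
```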

Health Checks for Plumber APIs

#* @get /health
function() {
  list(
    status = "healthy",
    timestamp = Sys.time(),
    packages = sessionInfo()$basePkgs
  )
}

Performance Optimization

R is single-threaded by default. Production optimizations include:

Parallel Processing

library(furrr)
library(future)

plan(multisession, workers = 4)

# Parallel map (assumes `files` is a character vector of CSV paths
# and process_data() is defined elsewhere)
results <- future_map_dfr(files, ~{
  read_csv(.x) |> process_data()
})
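
furrr is convenient for tidyverse-style code, but the parallel package that ships with base R covers simple cases with no extra dependencies (note that on Windows, mclapply falls back to running serially):

```r
library(parallel)

# Square each element on up to 2 worker processes
results <- mclapply(1:8, function(x) x^2, mc.cores = 2)
unlist(results)
```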

Database Connections

library(DBI)
library(RPostgres)

con <- dbConnect(
  Postgres(),
  host = "db.example.com",
  dbname = "analytics",
  user = Sys.getenv("DB_USER"),
  password = Sys.getenv("DB_PASSWORD")
)

# Always disconnect when done. Note that on.exit() only fires when the
# enclosing function returns, so use it inside a function; at the top
# level, call dbDisconnect(con) directly.
on.exit(dbDisconnect(con))

Use connection pooling for high-traffic APIs with the pool package.
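
To guarantee cleanup even when the body errors, a small wrapper helps. This sketch is database-agnostic, taking connect and disconnect functions as arguments (the names are my own, not a DBI API):

```r
with_connection <- function(connect, disconnect, body) {
  con <- connect()
  on.exit(disconnect(con), add = TRUE)  # runs even if body() throws
  body(con)
}

# With DBI this would look something like:
# with_connection(
#   connect    = function() dbConnect(Postgres(), host = "db.example.com"),
#   disconnect = dbDisconnect,
#   body       = function(con) dbGetQuery(con, "SELECT 1")
# )
```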

CI/CD for R Projects

GitHub Actions example for R:

name: R CI

on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: r-lib/actions/setup-r@v2
      - name: Install dependencies
        run: |
          Rscript -e "install.packages(c('testthat', 'covr'))"
      - name: Run tests
        run: Rscript -e "testthat::test_dir('tests')"
      - name: Coverage
        run: Rscript -e "covr::codecov()"

Common Production Patterns

Pattern             Tool               When to Use
Scheduled scripts   cron + Rscript     Simple daily/weekly jobs
Complex pipelines   targets            DAG workflows with dependencies
REST API            plumber            Real-time predictions
Web app             Shiny              Interactive dashboards
Containerized       Docker + plumber   Scalable microservices

Conclusion

Running R in production is well-supported in 2026. The ecosystem has matured significantly, with tools like targets, plumber, and renv addressing the core challenges of scheduling, API creation, and reproducibility.

Start simple with cron jobs, evolve to targets for complex pipelines, and reach for Docker and plumber when you need scale. The key is treating your R code with the same engineering discipline you would apply to any production system.