
Environment:Haifengl Smile Quarkus Serve Environment

From Leeroopedia


Knowledge Sources

Domains: Infrastructure, Model_Serving, REST_API
Last Updated: 2026-02-08 22:00 GMT

Overview

Quarkus 3.30.6 application server environment with Jakarta REST, Hibernate ORM, and SQLite for running Smile's model inference and chat services.

Description

The Smile `serve` module is a Quarkus-based REST application that hosts ML model inference and LLM chat completion endpoints. It uses Jakarta RESTful Web Services (via RESTEasy Reactive), Hibernate ORM Panache for persistence, Mutiny for reactive streaming, and Quinoa for serving the web UI. The deep learning module integrates PyTorch via Bytedeco JNI bindings with optional CUDA 12.9 support.

Usage

Use this environment for running the Model Serving Pipeline and Chat Completion services. Required by `InferenceService`, `InferenceResource`, `ChatCompletionResource`, and their associated model loading and streaming prediction implementations.
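The concrete endpoint routes live in `InferenceResource` and `ChatCompletionResource` and are not listed on this page. As an illustration only, a client-side request against a hypothetical inference route can be assembled with the JDK's built-in HTTP client (the path `/inference/iris` and the JSON payload are assumptions, not taken from the source):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class InferenceRequestSketch {
    public static void main(String[] args) {
        // Hypothetical endpoint path and payload; the real routes are defined
        // in InferenceResource / ChatCompletionResource and may differ.
        var body = "{\"x\": [5.1, 3.5, 1.4, 0.2]}";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:3000/inference/iris"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        // The request is only constructed here, not sent, so no server is required.
        System.out.println(request.method() + " " + request.uri());
    }
}
```

The request body must match the schema the model expects; a mismatch produces the `BadRequestException` listed under Common Errors below.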

System Requirements

Category | Requirement | Notes
OS | Linux, macOS, or Windows | Linux recommended for production
JDK | Java 25 | Required by Smile base module
Memory | 4GB+ RAM | For model loading and inference
Ports | 3000, 3801, 4173 | Default Quarkus dev ports
GPU | NVIDIA GPU (optional) | For deep learning inference with CUDA 12.9

Dependencies

Quarkus Platform

  • `io.quarkus:quarkus-bom` = 3.30.6
  • `io.quarkus:quarkus-rest` (Jakarta REST endpoints)
  • `io.quarkus:quarkus-rest-jackson` (JSON serialization)
  • `io.quarkus:quarkus-arc` (CDI dependency injection)
  • `io.quarkus:quarkus-hibernate-orm-panache` (ORM persistence)
  • `io.quarkus:quarkus-jdbc-postgresql` (PostgreSQL driver)
  • `io.quarkiverse.jdbc:quarkus-jdbc-sqlite` = 3.0.1 (SQLite driver)
  • `io.quarkiverse.quinoa:quarkus-quinoa` = 2.7.1 (web UI serving)

Deep Learning (Optional)

  • `org.bytedeco:pytorch-platform` = 2.7.1-1.5.12
  • `org.bytedeco:cuda-platform` = 12.9-9.10-1.5.12

Web UI

  • Node.js (for Quinoa web UI build)
  • npm packages defined in `serve/src/main/webui/package.json`

JVM Flags

  • `--add-opens java.base/java.lang=ALL-UNNAMED`
  • `--add-opens java.base/java.nio=ALL-UNNAMED`
  • `--enable-native-access ALL-UNNAMED`
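The `--add-opens` flags exist because Smile's native layer reaches into JDK internals via reflection, which JDK 16+ blocks by default. A minimal stdlib check (the reflective target `java.nio.Buffer.address` is chosen here only as an illustration of the kind of access that fails without the flag):

```java
import java.lang.reflect.Field;
import java.lang.reflect.InaccessibleObjectException;

public class ModuleOpensCheck {
    public static void main(String[] args) throws Exception {
        // Reflective access into java.nio internals; on JDK 16+ this fails
        // unless the package is opened with
        // --add-opens java.base/java.nio=ALL-UNNAMED.
        Field address = java.nio.Buffer.class.getDeclaredField("address");
        boolean open;
        try {
            address.setAccessible(true);
            open = true;
        } catch (InaccessibleObjectException e) {
            open = false;
        }
        System.out.println("java.nio open to unnamed module: " + open);
    }
}
```

Run with and without the flag to see the difference; the `InaccessibleObjectException` entry under Common Errors below is this same failure surfacing at runtime.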

Credentials

No API keys or tokens are required for the base serving setup. Model files (`.sml` format) must be present at the configured model path.
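The startup behavior shown later in Code Evidence loads a single `.sml` file directly, or scans a directory for `.sml` files. A self-contained stdlib sketch of that path-resolution logic (the temp directory stands in for the configured model path):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

public class ModelPathScan {
    // Mirrors the startup logic: a regular file is loaded directly,
    // a directory is scanned for .sml model files.
    static List<Path> findModels(Path path) throws IOException {
        if (Files.isRegularFile(path)) {
            return List.of(path);
        }
        if (Files.isDirectory(path)) {
            try (Stream<Path> entries = Files.list(path)) {
                return entries.filter(p -> p.toString().endsWith(".sml")).toList();
            }
        }
        return List.of();
    }

    public static void main(String[] args) throws IOException {
        // Demo setup: a temp directory standing in for the configured model path.
        Path dir = Files.createTempDirectory("models");
        Files.createFile(dir.resolve("iris.sml"));
        Files.createFile(dir.resolve("readme.txt"));
        System.out.println("models found: " + findModels(dir).size());
    }
}
```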

Quick Install

# Install Quarkus CLI
curl -Ls https://sh.jbang.dev | bash -s - trust add https://repo1.maven.org/maven2/io/quarkus/quarkus-cli/
curl -Ls https://sh.jbang.dev | bash -s - app install --fresh --force quarkus@quarkusio

# Install web UI dependencies
cd serve/src/main/webui && npm install

# Run in dev mode
./gradlew :serve:quarkusDev

Code Evidence

Quarkus version from `gradle.properties:4-8`:

quarkusPluginId=io.quarkus
quarkusPluginVersion=3.30.6
quarkusPlatformGroupId=io.quarkus.platform
quarkusPlatformArtifactId=quarkus-bom
quarkusPlatformVersion=3.30.6

JVM flags for Quarkus dev mode from `serve/build.gradle.kts`:

quarkusDev {
    jvmArgs = listOf(
        "--add-opens", "java.base/java.lang=ALL-UNNAMED",
        "--add-opens", "java.base/java.nio=ALL-UNNAMED",
        "--enable-native-access", "ALL-UNNAMED",
    )
}

CDI startup model loading from `serve/src/main/java/smile/serve/InferenceService.java:43-55`:

@Startup
@ApplicationScoped
public class InferenceService {
    @Inject
    public InferenceService(InferenceServiceConfig config) {
        var path = Paths.get(config.model()).toAbsolutePath().normalize();
        if (Files.isRegularFile(path)) {
            loadModel(path);
        } else if (Files.isDirectory(path)) {
            // loads all .sml files from directory
        }
    }
}

Common Errors

Error Message | Cause | Solution
`InaccessibleObjectException` in Quarkus | Missing JVM module opens | Add `--add-opens java.base/java.lang=ALL-UNNAMED` to JVM args
`ServiceUnavailableException` | Chat service not initialized | Ensure LLM model is configured and loaded
`NotFoundException` on model endpoint | Model ID not found in registry | Verify `.sml` model files exist at configured path
`BadRequestException` on predict | Invalid JSON input body | Ensure request matches model's expected schema
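Three of the exceptions above are standard Jakarta REST `WebApplicationException` subclasses with fixed HTTP status codes, so they can be triaged from access logs alone. A small lookup of the standard mapping:

```java
import java.util.Map;

public class ErrorStatusLookup {
    // Standard Jakarta REST status codes for the exceptions in the table above.
    static final Map<String, Integer> STATUS = Map.of(
            "BadRequestException", 400,
            "NotFoundException", 404,
            "ServiceUnavailableException", 503);

    public static void main(String[] args) {
        System.out.println("NotFoundException -> HTTP " + STATUS.get("NotFoundException"));
        System.out.println("ServiceUnavailableException -> HTTP " + STATUS.get("ServiceUnavailableException"));
    }
}
```

`InaccessibleObjectException` is the odd one out: it is a JVM error at startup, not a request-time REST response.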

Compatibility Notes

  • Quarkus Dev Mode: Requires additional JVM flags (`--add-opens`, `--enable-native-access`) due to Smile's native FFI usage.
  • Model Format: Models must be serialized in Smile's `.sml` format (Java object serialization).
  • CUDA (Optional): Deep learning features require NVIDIA GPU with CUDA 12.9 drivers and the `smile-deep` module.
  • Web UI: Quinoa plugin automatically builds the React frontend during Quarkus build. Requires Node.js.
  • OMP_NUM_THREADS: DevContainer sets `OMP_NUM_THREADS=1` to avoid OpenBLAS thread contention.
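Since `.sml` files are plain Java object serialization, a model round-trip can be sketched with nothing but the JDK. The `DummyModel` record below is a hypothetical stand-in; real `.sml` files contain serialized Smile model classes:

```java
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.file.Files;
import java.nio.file.Path;

public class SmlRoundTrip {
    // Stand-in for a Smile model; real .sml files hold serialized Smile classes.
    record DummyModel(String name, int features) implements Serializable {}

    public static void main(String[] args) throws IOException, ClassNotFoundException {
        Path file = Files.createTempFile("iris", ".sml");
        // Write: plain Java object serialization, the same mechanism as .sml.
        try (var out = new ObjectOutputStream(Files.newOutputStream(file))) {
            out.writeObject(new DummyModel("iris", 4));
        }
        // Read it back the way the serve module would load a model file.
        try (var in = new ObjectInputStream(Files.newInputStream(file))) {
            DummyModel model = (DummyModel) in.readObject();
            System.out.println("loaded model: " + model.name()
                    + " (" + model.features() + " features)");
        }
    }
}
```

Because this is Java serialization, the serving JVM must have the exact model classes on its classpath with compatible `serialVersionUID`s, which is why the `smile-deep` module is required for deep learning models.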
