
Environment:Heibaiying BigData Notes Flink 1.9 Environment

From Leeroopedia


Knowledge Sources
Domains: Infrastructure, Stream_Processing
Last Updated: 2026-02-10 10:00 GMT

Overview

Apache Flink 1.9.0 stream processing environment with Scala 2.11, supporting both Java and Scala APIs for streaming and batch data processing.

Description

This environment provides Apache Flink 1.9.0 with Scala 2.11 binary compatibility. It includes the Flink Java API (`flink-java`), Streaming API (`flink-streaming-java_2.11`), Kafka connector (`flink-connector-kafka_2.11`), and RocksDB state backend (`flink-statebackend-rocksdb_2.11`). The Scala variant uses `flink-scala_2.11` and `flink-streaming-scala_2.11` with Scala 2.11.12. Projects are packaged using maven-shade-plugin for deployment to Flink clusters.
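To illustrate the DataStream API this environment targets, here is a minimal sketch of a Flink 1.9 streaming word count in Java. The socket host/port, class name, and job name are illustrative, not from the source repository; the Flink dependencies must be on the classpath (scope `provided` when run on a cluster).

```java
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class SocketWordCount {
    public static void main(String[] args) throws Exception {
        // Local environment when run from the IDE; cluster environment when submitted via `flink run`
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.socketTextStream("localhost", 9999)  // illustrative source host/port
           .flatMap(new FlatMapFunction<String, Tuple2<String, Integer>>() {
               @Override
               public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                   for (String word : line.toLowerCase().split("\\s+")) {
                       out.collect(Tuple2.of(word, 1));
                   }
               }
           })
           .keyBy(0)   // key by the word (tuple field 0); positional keyBy is the 1.9-era idiom
           .sum(1)     // running count per word
           .print();

        env.execute("Socket Word Count");
    }
}
```

The same pipeline can be written against `flink-streaming-scala_2.11` for the Scala variant.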

Usage

Use this environment for any Flink streaming or Flink batch processing pipeline. It is the mandatory prerequisite for the Flink Kafka Streaming Pipeline workflow and all Flink state management examples.

System Requirements

Category | Requirement | Notes
OS | Linux (CentOS 7.6 recommended) | Any Linux distribution with JDK 8 works
Java | JDK 1.8 | Flink 1.9 requires Java 8
Build Tool | Maven 3.0.4+ | Minimum version required by Flink projects
Hardware | 2 GB+ RAM per TaskManager | Increase for production workloads
Disk | 10 GB+ | For checkpoints and state storage

Dependencies

System Packages

  • `flink` = 1.9.0
  • `java-1.8.0-openjdk-devel`
  • `maven` >= 3.0.4

Java Packages (Maven)

  • `org.apache.flink:flink-java` = 1.9.0 (scope: provided)
  • `org.apache.flink:flink-streaming-java_2.11` = 1.9.0 (scope: provided)
  • `org.apache.flink:flink-connector-kafka_2.11` = 1.9.0 (for Kafka integration)
  • `org.apache.flink:flink-statebackend-rocksdb_2.11` = 1.9.0 (for RocksDB state)
  • `mysql:mysql-connector-java` = 8.0.16 (for MySQL sink)
  • `org.projectlombok:lombok` = 1.18.10 (scope: provided)
  • `org.slf4j:slf4j-log4j12` = 1.7.7 (scope: runtime)
  • `log4j:log4j` = 1.2.17 (scope: runtime)

Scala Packages (for Scala variant)

  • `org.apache.flink:flink-scala_2.11` = 1.9.0 (scope: provided)
  • `org.apache.flink:flink-streaming-scala_2.11` = 1.9.0 (scope: provided)
  • `org.scala-lang:scala-library` = 2.11.12 (scope: provided)

Credentials

No API credentials required for Flink itself. For Kafka integration:

  • Kafka broker addresses configured in application code via `bootstrap.servers` property.
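As a sketch of how `bootstrap.servers` is wired in application code, the snippet below builds a Kafka source with the universal `FlinkKafkaConsumer` from `flink-connector-kafka_2.11`. The broker address, consumer group, and topic name are placeholder values, not from the source repository.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaSourceExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "kafka-host:9092"); // placeholder broker address
        props.setProperty("group.id", "flink-consumer-group");     // placeholder consumer group

        // Consume UTF-8 string records from a placeholder topic and print them
        env.addSource(new FlinkKafkaConsumer<>("input-topic", new SimpleStringSchema(), props))
           .print();

        env.execute("Kafka Source Example");
    }
}
```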

Quick Install

# Download Flink 1.9.0 for Scala 2.11
wget https://archive.apache.org/dist/flink/flink-1.9.0/flink-1.9.0-bin-scala_2.11.tgz
tar -xzf flink-1.9.0-bin-scala_2.11.tgz -C /opt/

# Configure environment
export FLINK_HOME=/opt/flink-1.9.0
export PATH=$PATH:$FLINK_HOME/bin

# Start standalone cluster (web UI at http://localhost:8081 by default)
$FLINK_HOME/bin/start-cluster.sh

Code Evidence

Flink dependencies from `flink-basis-java/pom.xml`:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>1.9.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>1.9.0</version>
    <scope>provided</scope>
</dependency>

Shade plugin configuration from `flink-basis-java/pom.xml`:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.0.0</version>
</plugin>

Common Errors

Error Message | Cause | Solution
`Cannot find database.properties` | Running startup from wrong directory | Execute Flink commands from the Flink root directory, not `bin/`
`ClassNotFoundException: org.apache.hadoop...` | Missing Hadoop dependency | Set `HADOOP_HOME` or download the Flink Hadoop shaded JAR
`NoResourceAvailableException` | No TaskManager slots available | Increase `taskmanager.numberOfTaskSlots` in `flink-conf.yaml`

Compatibility Notes

  • Flink dependencies are marked scope `provided` in `pom.xml` because the Flink cluster supplies them at runtime. Do not bundle them into the uber-JAR.
  • Scala version matters: these Flink artifacts use the `_2.11` suffix. Do not mix them with `_2.12` artifacts.
  • Flink 1.9.0 supports Java 8 only; Java 11 support arrived in Flink 1.10.
  • Flink cluster configuration: Default `jobmanager.heap.size` and `taskmanager.heap.size` are 1024m; increase for production.
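Tying the RocksDB dependency and the checkpoint disk requirement together, here is a minimal sketch of enabling checkpointing with the RocksDB state backend in Flink 1.9. The checkpoint interval and the `file:///` URI are illustrative; a shared filesystem such as HDFS or S3 is the usual choice in production.

```java
import org.apache.flink.contrib.streaming.state.RocksDBStateBackend;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RocksDBCheckpointConfig {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Checkpoint every 10 seconds with exactly-once semantics (interval is illustrative)
        env.enableCheckpointing(10_000, CheckpointingMode.EXACTLY_ONCE);

        // RocksDB keeps working state on local disk and writes checkpoints to the given URI;
        // `true` enables incremental checkpoints. The local path is a placeholder.
        env.setStateBackend(new RocksDBStateBackend("file:///tmp/flink-checkpoints", true));

        // ... build the job graph, then call env.execute(...)
    }
}
```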
