Spring Cloud Data Flow

Spring Cloud Data Flow is a toolkit for building data integration and real-time data processing pipelines.


Pipelines consist of Spring Boot apps, built using the Spring Cloud Stream or Spring Cloud Task microservice frameworks. This makes Spring Cloud Data Flow suitable for a range of data processing use cases, from import/export to event streaming and predictive analytics.
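As a taste of that programming model, here is a minimal sketch of a stream app written with the annotation-based Spring Cloud Stream API; the class name and log message are illustrative, not part of any shipped starter app:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

@SpringBootApplication
@EnableBinding(Sink.class) // bind this app's input channel to the configured middleware
public class LoggingSinkApplication {

    // Called for each message arriving on the sink's input channel.
    @StreamListener(Sink.INPUT)
    public void log(String payload) {
        System.out.println("Received: " + payload);
    }

    public static void main(String[] args) {
        SpringApplication.run(LoggingSinkApplication.class, args);
    }
}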

Overview

The Spring Cloud Data Flow server uses Spring Cloud Deployer to deploy data pipelines onto modern runtimes such as Cloud Foundry and Kubernetes.

A selection of pre-built stream and task/batch starter apps for various data integration and processing scenarios facilitates learning and experimentation.
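For example, a pre-built app can be registered with the server from the Data Flow shell; the Maven coordinates below are illustrative and depend on the release train and binder you choose:

app register --name http --type source --uri maven://org.springframework.cloud.stream.app:http-source-rabbit:2.1.0.RELEASE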

Custom stream and task applications, targeting different middleware or data services, can be built using the familiar Spring Boot-style programming model.
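As a sketch of that model, a minimal custom task app might look like the following; the class name and message are hypothetical:

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask // record each run in the task repository; the app exits when the work is done
public class HelloTaskApplication {

    @Bean
    public CommandLineRunner runner() {
        // The task's unit of work: runs once per launch.
        return args -> System.out.println("Hello from a Spring Cloud Task");
    }

    public static void main(String[] args) {
        SpringApplication.run(HelloTaskApplication.class, args);
    }
}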

A simple stream pipeline DSL makes it easy to specify which apps to deploy and how to connect outputs and inputs. The composed task DSL is useful when a series of task apps needs to run as a directed graph.
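For example, the stream DSL wires apps together with a Unix-style pipe, while the composed task DSL offers sequence (&&) and split (<a || b>) operators; the task names below are placeholders for registered task apps:

http | filter | log
<taskA || taskB> && taskC

The first definition streams HTTP payloads through a filter into a log sink; the second runs taskA and taskB in parallel and, once both complete, runs taskC.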

The dashboard offers a graphical editor for building new pipelines interactively, as well as views of deployable apps and running apps, with metrics provided by Prometheus, InfluxDB, or other monitoring systems.

The Spring Cloud Data Flow server exposes a REST API for composing and deploying data pipelines. A separate shell makes it easy to work with the API from the command line.
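For instance, assuming a server on the default port 9393, a stream definition can be created and listed directly over HTTP; this is a minimal sketch, and the REST API guide documents the full parameter set:

curl -X POST http://localhost:9393/streams/definitions --data-urlencode "name=ticktock" --data-urlencode "definition=time | log"
curl http://localhost:9393/streams/definitions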

Platform Implementations

An easy way to get started with Spring Cloud Data Flow is to follow the platform-specific implementation links in the table below.

Getting Started    Stable Release           Milestone/Snapshot Release
Local              2.0.2.RELEASE [docs]     2.1.0.M1 [docs]
Cloud Foundry      2.0.2.RELEASE [docs]     2.1.0.M1 [docs]
Kubernetes         2.0.2.RELEASE [docs]     2.1.0.M1 [docs]

Community Implementations

Quick Start

Step 1 - There are two ways to get started; the quickest is to download the Docker Compose file for the Spring Cloud Data Flow local server. (Mac users can use 'curl -O' instead of 'wget'.)

wget https://raw.githubusercontent.com/spring-cloud/spring-cloud-dataflow/v2.0.2.RELEASE/spring-cloud-dataflow-server/docker-compose.yml

Step 2 - From the directory where you downloaded docker-compose.yml, start the SCDF system.

DATAFLOW_VERSION=2.0.2.RELEASE SKIPPER_VERSION=2.0.1.RELEASE docker-compose up
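Once the containers are up, you can verify that the server is reachable via its 'about' endpoint, which reports version information:

curl http://localhost:9393/about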

Step 3 - Open the dashboard at http://localhost:9393/dashboard.

Step 4 - Use 'Create Stream(s)' under the "Streams" menu to define and deploy the stream time | log under the name 'ticktock'.
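Equivalently, if you are using the Data Flow shell instead of the dashboard, a single command defines and deploys the same stream:

stream create --name ticktock --definition "time | log" --deploy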

[Screenshots: creating and deploying the TickTock stream]

Once the 'ticktock' stream is deployed, you will see two stream apps (ticktock.log and ticktock.time) under the "Runtime" tab. Click the i icon next to the 'ticktock.log' app to copy the path of the streamed logs.

Step 5 - To verify the deployed stream and its output, copy the path in the "stdout" text box from the dashboard. Then, from another terminal, run:

docker exec -it skipper tail -f <COPIED-STDOUT-PATH>