Spring Cloud Function is a project whose high-level goal is to promote the implementation of business logic via functions: it abstracts away all of the transport details and infrastructure, allowing the developer to keep all the familiar tools and processes and focus firmly on business logic.
Here’s a complete, executable, testable Spring Boot application (implementing a simple string manipulation):
```java
@SpringBootApplication
public class Application {

    @Bean
    public Function<Flux<String>, Flux<String>> uppercase() {
        return flux -> flux.map(value -> value.toUpperCase());
    }

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
```
It’s just a Spring Boot application, so it can be built, run and tested, locally and in a CI build, the same way as any other Spring Boot application. The Function is from java.util.function and Flux is a Reactive Streams Publisher from Project Reactor. The function can be accessed over HTTP or messaging.
Spring Cloud Function has 4 main features:

- Wrappers for @Beans of type Function, Consumer and Supplier, exposing them to the outside world as either HTTP endpoints and/or message stream listeners/publishers with RabbitMQ, Kafka etc.
- Compiling strings that are Java function bodies into bytecode, and turning them into @Beans that can be wrapped as above.
- Deploying a JAR file containing such an application context with an isolated class loader, and registering its functions in the host application's FunctionCatalog.
- Adapters for serverless platforms such as AWS Lambda, Azure and Apache OpenWhisk.

Note: Spring Cloud is released under the non-restrictive Apache 2.0 license. If you would like to contribute to this section of the documentation or if you find an error, please find the source code and issue trackers in the project at GitHub.
Build from the command line (and "install" the samples):
$ ./mvnw clean install
(If you like to YOLO, add -DskipTests.)
Run one of the samples, e.g.
$ java -jar spring-cloud-function-samples/function-sample/target/*.jar
This runs the app and exposes its functions over HTTP, so you can convert a string to uppercase, like this:
$ curl -H "Content-Type: text/plain" localhost:8080/uppercase -d Hello HELLO
You can convert multiple strings (a Flux<String>) by separating them with new lines:

```
$ curl -H "Content-Type: text/plain" localhost:8080/uppercase -d 'Hello
> World'
HELLOWORLD
```
(You can use Ctrl+Q Ctrl+J in a terminal to insert a new line in a literal string like that.)
The sample @SpringBootApplication above has a function that can be decorated at runtime by Spring Cloud Function to be an HTTP endpoint, or a Stream processor, for instance with RabbitMQ, Apache Kafka or JMS.
The @Beans can be Function, Consumer or Supplier (all from java.util.function), and their parametric types can be String or POJO. A Function is exposed as a Spring Cloud Stream Processor if spring-cloud-function-stream is on the classpath. A Consumer is also exposed as a Stream Sink and a Supplier translates to a Stream Source. HTTP endpoints are exposed if the Stream binder is spring-cloud-stream-binder-servlet.
Functions can be of Flux<String> or Flux<Pojo> and Spring Cloud Function takes care of converting the data to and from the desired types, as long as it comes in as plain text or (in the case of the POJO) JSON. TBD: support for Flux<Message<Pojo>> and maybe plain Pojo types (Fluxes implied and implemented by the framework).
Functions can be grouped together in a single application, or deployed one-per-jar. It’s up to the developer to choose. An app with multiple functions can be deployed multiple times in different "personalities", exposing different functions over different physical transports.
One of the main features of Spring Cloud Function is to adapt and support a range of type signatures for user-defined functions, while providing a consistent execution model. That’s why all user-defined functions are transformed into a canonical representation by the FunctionCatalog, using primitives defined by Project Reactor (i.e., Flux<T> and Mono<T>). Users can supply a bean of type Function<String,String>, for instance, and the FunctionCatalog will wrap it into a Function<Flux<String>,Flux<String>>.
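To make the wrapping concrete, here is a minimal sketch (the reverse bean name is made up for illustration, and the bean is assumed to live in a Spring configuration class): the user writes the imperative form, and the catalog registers the reactive form.

```java
// The user declares a plain, imperative function...
@Bean
public Function<String, String> reverse() {
    return value -> new StringBuilder(value).reverse().toString();
}

// ...and the FunctionCatalog registers it as if it had been written as a
// Function<Flux<String>, Flux<String>>, i.e. flux -> flux.map(reverse())
```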
Using Reactor-based primitives not only helps with the canonical representation of user-defined functions, but it also facilitates a more robust and flexible (reactive) execution model.
While users don’t normally have to care about the FunctionCatalog at all, it is useful to know what kind of functions are supported in user code. Generally speaking, users can expect that if they write a function for a plain old Java type (or primitive wrapper), then the function catalog will wrap it to a Flux of the same type. If the user writes a function using Message (from spring-messaging) it will receive and transmit headers from any adapter that supports key-value metadata (e.g. HTTP headers). Here are the details.
User Function | Catalog Registration |
---|---|
Function<S,T> | Function<Flux<S>, Flux<T>> |
Function<Message<S>,Message<T>> | Function<Flux<Message<S>>, Flux<Message<T>>> |
Function<Flux<S>, Flux<T>> | Function<Flux<S>, Flux<T>> (pass through) |
Supplier<T> | Supplier<Flux<T>> |
Supplier<Flux<T>> | Supplier<Flux<T>> |
Consumer<T> | Function<Flux<T>, Mono<Void>> |
Consumer<Message<T>> | Function<Flux<Message<T>>, Mono<Void>> |
Consumer<Flux<T>> | Consumer<Flux<T>> |
Consumer is a little bit special because it has a void return type, which implies blocking, at least potentially. Most likely you will not need to write Consumer<Flux<?>>, but if you do need to do that, remember to subscribe to the input flux. If you declare a Consumer of a non-publisher type (which is normal), it will be converted to a function that returns a publisher, so that it can be subscribed to in a controlled way.
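A short sketch of both styles (the bean names are made up for illustration); the important detail in the reactive variant is the explicit subscribe() call:

```java
// Plain consumer: the catalog wraps it and manages the subscription for you.
@Bean
public Consumer<String> plainLogger() {
    return value -> System.out.println("received: " + value);
}

// Reactive consumer: you own the Flux, so you must subscribe to it yourself,
// otherwise nothing is ever consumed.
@Bean
public Consumer<Flux<String>> reactiveLogger() {
    return flux -> flux.subscribe(value -> System.out.println("received: " + value));
}
```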
We also provide support for Kotlin lambdas (since v2.0). Consider the following:
```kotlin
@Bean
open fun kotlinSupplier(): () -> String {
    return { "Hello from Kotlin" }
}

@Bean
open fun kotlinFunction(): (String) -> String {
    return { it.toUpperCase() }
}

@Bean
open fun kotlinConsumer(): (String) -> Unit {
    return { println(it) }
}
```
The above represents Kotlin lambdas configured as Spring beans. The signature of each maps to the Java equivalent of Supplier, Function and Consumer respectively, so these are signatures supported and recognized by the framework. While the mechanics of Kotlin-to-Java mapping are outside the scope of this documentation, it is important to understand that the same rules for signature transformation outlined in the "Java 8 function support" section apply here as well.
The spring-cloud-function-web module has autoconfiguration that activates when it is included in a Spring Boot web application (with MVC support). There is also a spring-cloud-starter-function-web to collect all the optional dependencies in case you just want a simple getting started experience.
With the web configurations activated your app will have an MVC endpoint (on "/" by default, but configurable with spring.cloud.function.web.path) that can be used to access the functions in the application context. The supported content types are plain text and JSON.
Method | Path | Request | Response | Status |
---|---|---|---|---|
GET | /{supplier} | - | Items from the named supplier | 200 OK |
POST | /{consumer} | JSON object or text | Mirrors input and pushes request body into consumer | 202 Accepted |
POST | /{consumer} | JSON array or text with new lines | Mirrors input and pushes body into consumer one by one | 202 Accepted |
POST | /{function} | JSON object or text | The result of applying the named function | 200 OK |
POST | /{function} | JSON array or text with new lines | The result of applying the named function | 200 OK |
GET | /{function}/{item} | - | Convert the item into an object and return the result of applying the function | 200 OK |
As the table above shows, the behaviour of the endpoint depends on the method and also on the type of incoming request data. When the incoming data is single-valued, and the target function is declared as obviously single-valued (i.e. not returning a collection or Flux), then the response will also contain a single value. For multi-valued responses the client can ask for a server-sent event stream by sending the Accept: text/event-stream header. If there is only one function (consumer etc.) then the name in the path is optional. Composite functions can be addressed using pipes or commas to separate function names (pipes are legal in URL paths, but a bit awkward to type on the command line).
Functions and consumers that are declared with input and output of type Message<?> will see the request headers on the input messages, and the output message headers will be converted to HTTP headers.
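A minimal sketch of such a function (the echo bean name and the X-Caller header are made up for illustration; MessageBuilder is from spring-messaging):

```java
// Echoes the payload and copies one request header into the response headers.
@Bean
public Function<Message<String>, Message<String>> echo() {
    return request -> MessageBuilder
            .withPayload(request.getPayload())
            .setHeader("X-Caller", request.getHeaders().get("X-Caller"))
            .build();
}
```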
When POSTing text the response format might be different with Spring Boot 2.0 and older versions, depending on the content negotiation (provide Content-Type and Accept headers for the best results).
To send or receive messages from a broker (such as RabbitMQ or Kafka) you can leverage the spring-cloud-stream project and its integration with Spring Cloud Function. Please refer to the Spring Cloud Function section of the Spring Cloud Stream reference manual for more details and examples.
Spring Cloud Function provides a "deployer" library that allows you to launch a jar file (or exploded archive, or set of jar files) with an isolated class loader and expose the functions defined in it. This is quite a powerful tool that would allow you to, for instance, adapt a function to a range of different input-output adapters without changing the target jar file. Serverless platforms often have this kind of feature built in, so you could see it as a building block for a function invoker in such a platform (indeed the Riff Java function invoker uses this library).
The standard entry point of the API is the Spring configuration annotation @EnableFunctionDeployer. If that is used in a Spring Boot application the deployer kicks in and looks for some configuration to tell it where to find the function jar. At a minimum the user has to provide a function.location, which is a URL or resource location for the archive containing the functions. It can optionally use a maven: prefix to locate the artifact via a dependency lookup (see FunctionProperties for complete details). A Spring Boot application is bootstrapped from the jar file, using the MANIFEST.MF to locate a start class, so that a standard Spring Boot fat jar works well, for example. If the target jar can be launched successfully then the result is a function registered in the main application’s FunctionCatalog. The registered function can be applied by code in the main application, even though it was created in an isolated class loader (by default).
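As a rough sketch, the host application can be as small as the following; the class name is arbitrary and the function.location value in the comment is a placeholder, not one of the shipped samples:

```java
// Host application that launches an external function jar in an isolated
// class loader. Start it with, for example (placeholder location):
//   --function.location=file:///path/to/functions.jar
@SpringBootApplication
@EnableFunctionDeployer
public class DeployerApplication {

    public static void main(String[] args) {
        SpringApplication.run(DeployerApplication.class, args);
    }
}
```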
There is a sample app that uses the function compiler to create a
function from a configuration property. The vanilla "function-sample"
also has that feature. And there are some scripts that you can run to
see the compilation happening at run time. To run these examples,
change into the scripts
directory:
cd scripts
Also, start a RabbitMQ server locally (e.g. execute rabbitmq-server
).
Start the Function Registry Service:
./function-registry.sh
Register a Function:
./registerFunction.sh -n uppercase -f "f->f.map(s->s.toString().toUpperCase())"
Run a REST Microservice using that Function:
```
./web.sh -f uppercase -p 9000
curl -H "Content-Type: text/plain" -H "Accept: text/plain" localhost:9000/uppercase -d foo
```
Register a Supplier:
./registerSupplier.sh -n words -f "()->Flux.just(\"foo\",\"bar\")"
Run a REST Microservice using that Supplier:
```
./web.sh -s words -p 9001
curl -H "Accept: application/json" localhost:9001/words
```
Register a Consumer:
./registerConsumer.sh -n print -t String -f "System.out::println"
Run a REST Microservice using that Consumer:
```
./web.sh -c print -p 9002
curl -X POST -H "Content-Type: text/plain" -d foo localhost:9002/print
```
Run Stream Processing Microservices:
First register a streaming words supplier:
./registerSupplier.sh -n wordstream -f "()->Flux.interval(Duration.ofMillis(1000)).map(i->\"message-\"+i)"
Then start the source (supplier), processor (function), and sink (consumer) apps (in reverse order):
```
./stream.sh -p 9103 -i uppercaseWords -c print
./stream.sh -p 9102 -i words -f uppercase -o uppercaseWords
./stream.sh -p 9101 -s wordstream -o words
```
The output will appear in the console of the sink app (one message per second, converted to uppercase):
```
MESSAGE-0
MESSAGE-1
MESSAGE-2
MESSAGE-3
MESSAGE-4
MESSAGE-5
MESSAGE-6
MESSAGE-7
MESSAGE-8
MESSAGE-9
...
```
As well as being able to run as a standalone process, a Spring Cloud Function application can be adapted to run on one of the existing serverless platforms. In the project there are adapters for AWS Lambda, Azure, and Apache OpenWhisk. The Oracle Fn platform has its own Spring Cloud Function adapter. And Riff supports Java functions, and its Java Function Invoker acts natively as an adapter for Spring Cloud Function jars.
The AWS adapter takes a Spring Cloud Function app and converts it to a form that can run in AWS Lambda.
The adapter has a couple of generic request handlers that you can use. The most generic is SpringBootStreamHandler, which uses a Jackson ObjectMapper provided by Spring Boot to serialize and deserialize the objects in the function. There is also a SpringBootRequestHandler which you can extend, providing the input and output types as type parameters (enabling AWS to inspect the class and do the JSON conversions itself).
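For example, the handler can be as small as the following sketch, assuming Foo and Bar are the input and output POJOs of your function (as in the sample, where the handler is example.Handler):

```java
package example;

import org.springframework.cloud.function.adapter.aws.SpringBootRequestHandler;

// Empty subclass whose only job is to pin down the input and output types
// so that AWS Lambda can perform the JSON conversion for you.
public class Handler extends SpringBootRequestHandler<Foo, Bar> {
}
```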
If your app has more than one @Bean of type Function etc. then you can choose the one to use by configuring function.name (e.g. as the FUNCTION_NAME environment variable in AWS). The functions are extracted from the Spring Cloud FunctionCatalog (searching first for a Function, then a Consumer and finally a Supplier).
You don’t need the Spring Cloud Function Web or Stream adapter at runtime in Lambda, so you might need to exclude those before you create the JAR you send to AWS. A Lambda application has to be shaded, but a Spring Boot standalone application does not, so you can run the same app using 2 separate jars (as per the sample). The sample app creates 2 jar files, one with an aws classifier for deploying in Lambda, and one executable (thin) jar that includes spring-cloud-function-web at runtime. Spring Cloud Function will try and locate a "main class" for you from the JAR file manifest, using the Start-Class attribute (which will be added for you by the Spring Boot tooling if you use the starter parent). If there is no Start-Class in your manifest you can use the MAIN_CLASS environment variable when you deploy the function to AWS.
Build the sample under spring-cloud-function-samples/function-sample-aws and upload the -aws jar file to Lambda. The handler can be example.Handler or org.springframework.cloud.function.adapter.aws.SpringBootStreamHandler (FQN of the class, not a method reference, although Lambda does accept method references).
./mvnw -U clean package
Using the AWS command line tools it looks like this:
aws lambda create-function --function-name Uppercase --role arn:aws:iam::[USERID]:role/service-role/[ROLE] --zip-file fileb://function-sample-aws/target/function-sample-aws-2.0.0.BUILD-SNAPSHOT-aws.jar --handler org.springframework.cloud.function.adapter.aws.SpringBootStreamHandler --description "Spring Cloud Function Adapter Example" --runtime java8 --region us-east-1 --timeout 30 --memory-size 1024 --publish
The input type for the function in the AWS sample is a Foo with a single property called "value". So you would need this to test it:
{ "value": "test" }
AWS has some platform-specific data types, including batching of messages, which is much more efficient than processing each one individually. To make use of these types you can write a function that depends on those types. Or you can rely on Spring to extract the data from the AWS types and convert it to a Spring Message. To do this you tell AWS that the function is of a specific generic handler type (depending on the AWS service) and provide a bean of type Function<Message<S>,Message<T>>, where S and T are your business data types. If there is more than one bean of type Function you may also need to configure the Spring Boot property function.name to be the name of the target bean (e.g. use FUNCTION_NAME as an environment variable).
The supported AWS services and generic handler types are listed below:
Service | AWS Types | Generic Handler |
---|---|---|
API Gateway | APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent | org.springframework.cloud.function.adapter.aws.SpringBootApiGatewayRequestHandler |
Kinesis | KinesisEvent | org.springframework.cloud.function.adapter.aws.SpringBootKinesisEventHandler |
For example, to deploy behind an API Gateway, use --handler org.springframework.cloud.function.adapter.aws.SpringBootApiGatewayRequestHandler in your AWS command line (or in the UI) and define a @Bean of type Function<Message<Foo>,Message<Bar>> where Foo and Bar are POJO types (the data will be marshalled and unmarshalled by AWS using Jackson).
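A hedged sketch of such a bean, assuming Foo and Bar are simple POJOs with a single value property like the sample's (the uppercase bean name is illustrative; MessageBuilder is from spring-messaging):

```java
// The API Gateway request body is unmarshalled into Foo; the returned Bar is
// marshalled back to JSON for the response body.
@Bean
public Function<Message<Foo>, Message<Bar>> uppercase() {
    return request -> {
        Bar bar = new Bar();
        bar.setValue(request.getPayload().getValue().toUpperCase());
        return MessageBuilder.withPayload(bar).build();
    };
}
```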
The Azure adapter bootstraps a Spring Cloud Function context and channels function calls from the Azure framework into the user functions, using Spring Boot configuration where necessary. Azure Functions has quite a unique, but invasive programming model, involving annotations in user code that are specific to the platform. The Spring Cloud Function Azure adapter trades the convenience of these annotations for portability of the function implementations. Instead of using the annotations you have to write some JSON by hand (at least for now) to guide the platform to call the right methods in the adapter.
This project provides an adapter layer for running a Spring Cloud Function application on Azure. You can write an app with a single @Bean of type Function and it will be deployable in Azure if you get the JAR file laid out right.
The adapter has a generic HTTP request handler that you can use optionally. There is an AzureSpringBootRequestHandler which you must extend, providing the input and output types as type parameters (enabling Azure to inspect the class and do the JSON conversions itself).
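For instance, assuming the Foo and Bar types used by the sample, the handler subclass might look like the sketch below; the execute method is the one the function.json entryPoint (example.FooHandler.execute) refers to, and ExecutionContext comes from the Azure Functions Java library:

```java
package example;

import org.springframework.cloud.function.adapter.azure.AzureSpringBootRequestHandler;

// Fixes the input/output types for Azure and delegates to the adapter,
// which looks up the target Function in the Spring application context.
public class FooHandler extends AzureSpringBootRequestHandler<Foo, Bar> {

    public Bar execute(Foo foo, ExecutionContext context) {
        return handleRequest(foo, context);
    }
}
```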
If your app has more than one @Bean of type Function etc. then you can choose the one to use by configuring function.name. The functions are extracted from the Spring Cloud FunctionCatalog.
You don’t need the Spring Cloud Function Web at runtime in Azure, so you need to exclude it before you create the JAR you deploy to Azure. A function application on Azure has to be shaded, but a Spring Boot standalone application does not, so you can run the same app using 2 separate jars (as per the sample here). The sample app creates the shaded jar file, with an azure classifier for deploying in Azure.

The Azure tooling needs to find some JSON configuration files to tell it how to deploy and integrate the function (e.g. which Java class to use as the entry point, and which triggers to use). Those files can be created with the Maven plugin for a non-Spring function, but the tooling doesn’t work yet with the adapter in its current form. There is an example function.json in the sample which hooks the function up as an HTTP endpoint:
{ "scriptFile" : "../function-sample-azure-2.0.0.BUILD-SNAPSHOT-azure.jar", "entryPoint" : "example.FooHandler.execute", "bindings" : [ { "type" : "httpTrigger", "name" : "foo", "direction" : "in", "authLevel" : "anonymous", "methods" : [ "get", "post" ] }, { "type" : "http", "name" : "$return", "direction" : "out" } ], "disabled" : false }
You can run the sample locally, just like the other Spring Cloud Function samples, and then call it with curl -H "Content-Type: text/plain" localhost:8080/function -d '{"value": "hello foobar"}'.
You will need the az
CLI app and some node.js fu (see https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-java-maven for more detail). To deploy the function on Azure runtime:
```
$ az login
$ mvn azure-functions:deploy
```
On another terminal try this: curl https://<azure-function-url-from-the-log>/api/uppercase -d '{"value": "hello foobar!"}'
. Please ensure that you use the right URL for the function above. Alternatively you can test the function in the Azure Dashboard UI (click on the function name, go to the right hand side and click "Test" and to the bottom right, "Run").
The input type for the function in the Azure sample is a Foo with a single property called "value", so you can test it with something like the following:
{ "value": "foobar" }
The OpenWhisk adapter is in the form of an executable jar that can be used in a docker image to be deployed to OpenWhisk. The platform works in request-response mode, listening on port 8080 on a specific endpoint, so the adapter is a simple Spring MVC application.
Implement a POF (be sure to use the functions
package):
```java
package functions;

import java.util.function.Function;

public class Uppercase implements Function<String, String> {

    public String apply(String input) {
        return input.toUpperCase();
    }
}
```
Install it into your local Maven repository:
./mvnw clean install
Create a function.properties
file that provides its Maven coordinates. For example:
dependencies.function: com.example:pof:0.0.1-SNAPSHOT
Copy the openwhisk runner JAR to the working directory (same directory as the properties file):
cp spring-cloud-function-adapters/spring-cloud-function-adapter-openwhisk/target/spring-cloud-function-adapter-openwhisk-2.0.0.BUILD-SNAPSHOT.jar runner.jar
Generate an m2 repo by running the runner JAR in --thin.dryrun mode with the above properties file:
java -jar -Dthin.root=m2 runner.jar --thin.name=function --thin.dryrun
Use the following Dockerfile:
```
FROM openjdk:8-jdk-alpine
VOLUME /tmp
COPY m2 /m2
ADD runner.jar .
ADD function.properties .
ENV JAVA_OPTS=""
ENTRYPOINT [ "java", "-Djava.security.egd=file:/dev/./urandom", "-jar", "runner.jar", "--thin.root=/m2", "--thin.name=function", "--function.name=uppercase"]
EXPOSE 8080
```
Note you could use a Spring Cloud Function app, instead of just a jar with a POF in it, in which case you would have to change the way the app runs in the container so that it picks up the main class as a source file. For example, you could change the ENTRYPOINT above and add --spring.main.sources=com.example.SampleApplication.
Build the Docker image:
docker build -t [username/appname] .
Push the Docker image:
docker push [username/appname]
Use the OpenWhisk CLI (e.g. after vagrant ssh
) to create the action:
wsk action create example --docker [username/appname]
Invoke the action:
```
wsk action invoke example --result --param payload foo
{
    "result": "FOO"
}
```