Mark Fisher, Dave Syer, Oleg Zhurakousky, Anshul Mehra
3.0.0.M1
Introduction
Spring Cloud Function is a project with the following high-level goals:
- Promote the implementation of business logic via functions.
- Decouple the development lifecycle of business logic from any specific runtime target so that the same code can run as a web endpoint, a stream processor, or a task.
- Support a uniform programming model across serverless providers, as well as the ability to run standalone (locally or in a PaaS).
- Enable Spring Boot features (auto-configuration, dependency injection, metrics) on serverless providers.
It abstracts away all of the transport details and infrastructure, allowing the developer to keep all the familiar tools and processes, and focus firmly on business logic.
Here’s a complete, executable, testable Spring Boot application (implementing a simple string manipulation):
@SpringBootApplication
public class Application {

  @Bean
  public Function<Flux<String>, Flux<String>> uppercase() {
    return flux -> flux.map(value -> value.toUpperCase());
  }

  public static void main(String[] args) {
    SpringApplication.run(Application.class, args);
  }
}
It’s just a Spring Boot application, so it can be built, run and tested, locally and in a CI build, the same way as any other Spring Boot application. The Function is from java.util.function and Flux is a Reactive Streams Publisher from Project Reactor. The function can be accessed over HTTP or messaging.
Spring Cloud Function has 4 main features:
- Wrappers for @Beans of type Function, Consumer and Supplier, exposing them to the outside world as either HTTP endpoints and/or message stream listeners/publishers with RabbitMQ, Kafka etc.
- Compiling strings which are Java function bodies into bytecode, and then turning them into @Beans that can be wrapped as above.
- Deploying a JAR file containing such an application context with an isolated classloader, so that you can pack them together in a single JVM.
- Adapters for AWS Lambda, Azure, Apache OpenWhisk and possibly other "serverless" service providers.
Spring Cloud is released under the non-restrictive Apache 2.0 license. If you would like to contribute to this section of the documentation or if you find an error, please find the source code and issue trackers in the project at github.
Getting Started
Build from the command line (and "install" the samples):
$ ./mvnw clean install
(If you like to YOLO add -DskipTests.)
Run one of the samples, e.g.
$ java -jar spring-cloud-function-samples/function-sample/target/*.jar
This runs the app and exposes its functions over HTTP, so you can convert a string to uppercase, like this:
$ curl -H "Content-Type: text/plain" localhost:8080/uppercase -d Hello
HELLO
You can convert multiple strings (a Flux<String>) by separating them with new lines:
$ curl -H "Content-Type: text/plain" localhost:8080/uppercase -d 'Hello
> World'
HELLOWORLD
(You can use QJ in a terminal to insert a new line in a literal string like that.)
Building and Running a Function
The sample @SpringBootApplication above has a function that can be decorated at runtime by Spring Cloud Function to be an HTTP endpoint, or a Stream processor, for instance with RabbitMQ, Apache Kafka or JMS.
The @Beans can be Function, Consumer or Supplier (all from java.util.function), and their parametric types can be String or POJO. Functions can also be of Flux<String> or Flux<Pojo> and Spring Cloud Function takes care of converting the data to and from the desired types, as long as it comes in as plain text or (in the case of the POJO) JSON. There is also support for Message<Pojo> where the message headers are copied from the incoming event, depending on the adapter. The web adapter also supports conversion from form-encoded data to a Map, and if you are using the function with Spring Cloud Stream then all the conversion and coercion features for message payloads will be applicable as well.
Functions can be grouped together in a single application, or deployed one-per-jar. It’s up to the developer to choose. An app with multiple functions can be deployed multiple times in different "personalities", exposing different functions over different physical transports.
Function Catalog and Flexible Function Signatures
One of the main features of Spring Cloud Function is to adapt and support a range of type signatures for user-defined functions, while providing a consistent execution model. That’s why all user defined functions are transformed into a canonical representation by the FunctionCatalog, using primitives defined by Project Reactor (i.e., Flux<T> and Mono<T>). Users can supply a bean of type Function<String,String>, for instance, and the FunctionCatalog will wrap it into a Function<Flux<String>,Flux<String>>.
Using Reactor based primitives not only helps with the canonical representation of user defined functions, but it also facilitates a more robust and flexible (reactive) execution model.
While users don’t normally have to care about the FunctionCatalog at all, it is useful to know what kind of functions are supported in user code.
Java 8 function support
Generally speaking users can expect that if they write a function for a plain old Java type (or primitive wrapper), then the function catalog will wrap it to a Flux of the same type. If the user writes a function using Message (from spring-messaging) it will receive and transmit headers from any adapter that supports key-value metadata (e.g. HTTP headers). Here are the details.
User Function | Catalog Registration
---|---
Function<S,T> | Function<Flux<S>, Flux<T>>
Function<Message<S>,Message<T>> | Function<Flux<Message<S>>, Flux<Message<T>>>
Function<Flux<S>, Flux<T>> | Function<Flux<S>, Flux<T>> (pass through)
Supplier<T> | Supplier<Flux<T>>
Supplier<Flux<T>> | Supplier<Flux<T>>
Consumer<T> | Function<Flux<T>, Mono<Void>>
Consumer<Message<T>> | Function<Flux<Message<T>>, Mono<Void>>
Consumer<Flux<T>> | Consumer<Flux<T>>
Consumer is a little bit special because it has a void return type, which implies blocking, at least potentially. Most likely you will not need to write Consumer<Flux<?>>, but if you do need to do that, remember to subscribe to the input flux. If you declare a Consumer of a non publisher type (which is normal), it will be converted to a function that returns a publisher, so that it can be subscribed to in a controlled way.
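A minimal sketch of the publisher-based case, assuming a String payload (the subscribe() call is what actually drives the incoming flux; the logging body is purely illustrative):
@Bean
public Consumer<Flux<String>> logger() {
    // subscribing here is what consumes the stream
    return flux -> flux.subscribe(value -> System.out.println("Received: " + value));
}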
Function Component Scan
Spring Cloud Function will scan for implementations of Function, Consumer and Supplier in a package called functions if it exists. Using this feature you can write functions that have no dependencies on Spring - not even the @Component annotation is needed. If you want to use a different package, you can set spring.cloud.function.scan.packages. You can also use spring.cloud.function.scan.enabled=false to switch off the scan completely.
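For example, a class like the following, placed in the functions package, would be picked up by the scan without any Spring annotations (the Lowercase name is just illustrative):
package functions;

import java.util.function.Function;

public class Lowercase implements Function<String, String> {

    @Override
    public String apply(String input) {
        return input.toLowerCase();
    }
}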
Function Routing
Since version 2.2 Spring Cloud Function provides a routing feature allowing you to invoke a single function which acts as a router to an actual function you wish to invoke. This feature is very useful in certain FaaS environments where maintaining configurations for several functions could be cumbersome or exposing more than one function is not possible.
You enable this feature via the spring.cloud.function.routing.enabled property, setting it to true (default is false). This enables the RoutingFunction under the name router, which is loaded in the FunctionCatalog. This function has the following signature:
public class RoutingFunction implements Function<Publisher<Message<?>>, Publisher<?>>, Consumer<Publisher<Message<?>>> {
    . . .
}
This allows the above function to act as both Function and Consumer. As you can see it takes Message<?> as an input argument. This allows you to communicate the name of the actual function you want to invoke by providing a function.name Message header.
In specific execution environments/models the adapters are responsible for translating and communicating function.name via a Message header. For example, when using spring-cloud-function-web you can provide function.name as an HTTP header and the framework will propagate it, as well as other HTTP headers, as Message headers.
Using Message also allows us to benefit from MessageConverters to convert the incoming request to the actual input type of the target function.
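As a hedged illustration (assuming routing is enabled, the app is exposed via spring-cloud-function-web, and a function named uppercase exists in the catalog), a request could select the target function through that header:
curl -H "Content-Type: text/plain" -H "function.name: uppercase" localhost:8080/router -d hello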
Kotlin Lambda support
We also provide support for Kotlin lambdas (since v2.0). Consider the following:
@Bean
open fun kotlinSupplier(): () -> String {
    return { "Hello from Kotlin" }
}

@Bean
open fun kotlinFunction(): (String) -> String {
    return { it.toUpperCase() }
}

@Bean
open fun kotlinConsumer(): (String) -> Unit {
    return { println(it) }
}
The above represents Kotlin lambdas configured as Spring beans. The signature of each maps to a Java equivalent of Supplier, Function and Consumer, respectively, and these signatures are therefore supported and recognized by the framework. While the mechanics of Kotlin-to-Java mapping are outside of the scope of this documentation, it is important to understand that the same rules for signature transformation outlined in the "Java 8 function support" section are applied here as well.
To enable Kotlin support all you need to do is add the spring-cloud-function-kotlin module to your classpath; it contains the appropriate autoconfiguration and supporting classes.
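With Maven, for example, that is a single dependency (shown here without a version on the assumption that it is managed by your Spring Cloud BOM):
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-function-kotlin</artifactId>
</dependency>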
Standalone Web Applications
The spring-cloud-function-web module has autoconfiguration that activates when it is included in a Spring Boot web application (with MVC support). There is also a spring-cloud-starter-function-web to collect all the optional dependencies in case you just want a simple getting started experience.
With the web configurations activated your app will have an MVC endpoint (on "/" by default, but configurable with spring.cloud.function.web.path) that can be used to access the functions in the application context. The supported content types are plain text and JSON.
Method | Path | Request | Response | Status
---|---|---|---|---
GET | /{supplier} | - | Items from the named supplier | 200 OK
POST | /{consumer} | JSON object or text | Mirrors input and pushes request body into consumer | 202 Accepted
POST | /{consumer} | JSON array or text with new lines | Mirrors input and pushes body into consumer one by one | 202 Accepted
POST | /{function} | JSON object or text | The result of applying the named function | 200 OK
POST | /{function} | JSON array or text with new lines | The result of applying the named function | 200 OK
GET | /{function}/{item} | - | Convert the item into an object and return the result of applying the function | 200 OK
As the table above shows the behaviour of the endpoint depends on the method and also the type of incoming request data. When the incoming data is single valued, and the target function is declared as obviously single valued (i.e. not returning a collection or Flux), then the response will also contain a single value. For multi-valued responses the client can ask for a server-sent event stream by sending Accept: text/event-stream.
If there is only a single function (consumer etc.) in the catalog, the name in the path is optional. Composite functions can be addressed using pipes or commas to separate function names (pipes are legal in URL paths, but a bit awkward to type on the command line).
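As a hedged example (uppercase and reverse are hypothetical functions registered in the catalog), a composed call over HTTP could look like:
curl -H "Content-Type: text/plain" localhost:8080/uppercase,reverse -d hello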
For cases where there is more than a single function in the catalog and you want to map a specific function to the root path (e.g., "/"), or you want to compose several functions and then map to the root path, you can do so by providing the spring.cloud.function.definition property, which is essentially used by the spring-cloud-function-web module to provide a default mapping for cases where there is some type of a conflict (e.g., more than one function available etc).
For example,
--spring.cloud.function.definition=foo|bar
The above property will compose the 'foo' and 'bar' functions and map the composed function to the "/" path.
Functions and consumers that are declared with input and output in Message<?> will see the request headers on the input messages, and the output message headers will be converted to HTTP headers.
When POSTing text the response format might be different with Spring Boot 2.0 and older versions, depending on the content negotiation (provide content type and accept headers for the best results).
Standalone Streaming Applications
To send or receive messages from a broker (such as RabbitMQ or Kafka) you can leverage the spring-cloud-stream project and its integration with Spring Cloud Function.
Please refer to the Spring Cloud Function section of the Spring Cloud Stream reference manual for more details and examples.
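As a loose sketch only (the property names belong to Spring Cloud Stream and may differ between versions; uppercase and the destination names are illustrative), the wiring might look like:
spring.cloud.stream.function.definition=uppercase
spring.cloud.stream.bindings.input.destination=uppercase-in
spring.cloud.stream.bindings.output.destination=uppercase-out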
Deploying a Packaged Function
Spring Cloud Function provides a "deployer" library that allows you to launch a jar file (or exploded archive, or set of jar files) with an isolated class loader and expose the functions defined in it. This is quite a powerful tool that would allow you to, for instance, adapt a function to a range of different input-output adapters without changing the target jar file. Serverless platforms often have this kind of feature built in, so you could see it as a building block for a function invoker in such a platform (indeed the Riff Java function invoker uses this library).
The standard entry point of the API is the Spring configuration annotation @EnableFunctionDeployer. If that is used in a Spring Boot application the deployer kicks in and looks for some configuration to tell it where to find the function jar. At a minimum the user has to provide a function.location which is a URL or resource location for the archive containing the functions. It can optionally use a maven: prefix to locate the artifact via a dependency lookup (see FunctionProperties for complete details). A Spring Boot application is bootstrapped from the jar file, using the MANIFEST.MF to locate a start class, so that a standard Spring Boot fat jar works well, for example. If the target jar can be launched successfully then the result is a function registered in the main application’s FunctionCatalog. The registered function can be applied by code in the main application, even though it was created in an isolated class loader (by default).
Functional Bean Definitions
Spring Cloud Function supports a "functional" style of bean declarations for small apps where you need fast startup. The functional style of bean declaration was a feature of Spring Framework 5.0 with significant enhancements in 5.1.
Comparing Functional with Traditional Bean Definitions
Here’s a vanilla Spring Cloud Function application with the familiar @Configuration and @Bean declaration style:
@SpringBootApplication
public class DemoApplication {

  @Bean
  public Function<String, String> uppercase() {
    return value -> value.toUpperCase();
  }

  public static void main(String[] args) {
    SpringApplication.run(DemoApplication.class, args);
  }
}
You can run the above in a serverless platform, like AWS Lambda or Azure Functions, or you can run it in its own HTTP server just by including spring-cloud-starter-function-web on the classpath. Running the main method would expose an endpoint that you can use to ping that uppercase function:
$ curl localhost:8080 -d foo
FOO
The web adapter in spring-cloud-starter-function-web uses Spring MVC, so you need a Servlet container. You can also use Webflux, where the default server is Netty (even though you can still use Servlet containers if you want to) - just include the spring-cloud-starter-function-webflux dependency instead. The functionality is the same, and the user application code can be used in both.
Now for the functional beans: the user application code can be recast into "functional" form, like this:
@SpringBootConfiguration
public class DemoApplication implements ApplicationContextInitializer<GenericApplicationContext> {

  public static void main(String[] args) {
    FunctionalSpringApplication.run(DemoApplication.class, args);
  }

  public Function<String, String> uppercase() {
    return value -> value.toUpperCase();
  }

  @Override
  public void initialize(GenericApplicationContext context) {
    context.registerBean("demo", FunctionRegistration.class,
        () -> new FunctionRegistration<>(uppercase())
            .type(FunctionType.from(String.class).to(String.class)));
  }
}
The main differences are:
- The main class is an ApplicationContextInitializer.
- The @Bean methods have been converted to calls to context.registerBean().
- The @SpringBootApplication has been replaced with @SpringBootConfiguration to signify that we are not enabling Spring Boot autoconfiguration, and yet still marking the class as an "entry point".
- The SpringApplication from Spring Boot has been replaced with a FunctionalSpringApplication from Spring Cloud Function (it’s a subclass).
The business logic beans that you register in a Spring Cloud Function app are of type FunctionRegistration. This is a wrapper that contains both the function and information about the input and output types. In the @Bean form of the application that information can be derived reflectively, but in a functional bean registration some of it is lost unless we use a FunctionRegistration.
An alternative to using an ApplicationContextInitializer and FunctionRegistration is to make the application itself implement Function (or Consumer or Supplier). Example (equivalent to the above):
@SpringBootConfiguration
public class DemoApplication implements Function<String, String> {

  public static void main(String[] args) {
    FunctionalSpringApplication.run(DemoApplication.class, args);
  }

  @Override
  public String apply(String value) {
    return value.toUpperCase();
  }
}
It would also work if you add a separate, standalone class of type Function and register it with the SpringApplication using an alternative form of the run() method. The main thing is that the generic type information is available at runtime through the class declaration.
The app runs in its own HTTP server if you add spring-cloud-starter-function-webflux (it won’t work with the MVC starter at the moment because the functional form of the embedded Servlet container hasn’t been implemented). The app also runs just fine in AWS Lambda or Azure Functions, and the improvements in startup time are dramatic.
The "lite" web server has some limitations for the range of Function signatures - in particular it doesn’t (yet) support Message input and output, but POJOs and any kind of Publisher should be fine.
Testing Functional Applications
Spring Cloud Function also has some utilities for integration testing that will be very familiar to Spring Boot users. For example, here is an integration test for the HTTP server wrapping the app above:
@RunWith(SpringRunner.class)
@FunctionalSpringBootTest
@AutoConfigureWebTestClient
public class FunctionalTests {

  @Autowired
  private WebTestClient client;

  @Test
  public void words() throws Exception {
    client.post().uri("/").body(Mono.just("foo"), String.class).exchange()
        .expectStatus().isOk().expectBody(String.class).isEqualTo("FOO");
  }
}
This test is almost identical to the one you would write for the @Bean version of the same app - the only difference is the @FunctionalSpringBootTest annotation, instead of the regular @SpringBootTest. All the other pieces, like the @Autowired WebTestClient, are standard Spring Boot features.
Or you could write a test for a non-HTTP app using just the FunctionCatalog. For example:
@RunWith(SpringRunner.class)
@FunctionalSpringBootTest
public class FunctionalTests {

  @Autowired
  private FunctionCatalog catalog;

  @Test
  public void words() throws Exception {
    Function<Flux<String>, Flux<String>> function = catalog.lookup(Function.class,
        "function");
    assertThat(function.apply(Flux.just("foo")).blockFirst()).isEqualTo("FOO");
  }
}
(The FunctionCatalog always returns functions from Flux to Flux, even if the user declares them with a simpler signature.)
Limitations of Functional Bean Declaration
Most Spring Cloud Function apps have a relatively small scope compared to the whole of Spring Boot, so we are able to adapt it to these functional bean definitions easily. If you step outside that limited scope, you can extend your Spring Cloud Function app by switching back to @Bean style configuration, or by using a hybrid approach. If you want to take advantage of Spring Boot autoconfiguration for integrations with external datastores, for example, you will need to use @EnableAutoConfiguration. Your functions can still be defined using the functional declarations if you want (i.e. the "hybrid" style), but in that case you will need to explicitly switch off the "full functional mode" using spring.functional.enabled=false so that Spring Boot can take back control.
Dynamic Compilation
There is a sample app that uses the function compiler to create a function from a configuration property. The vanilla "function-sample" also has that feature. And there are some scripts that you can run to see the compilation happening at run time. To run these examples, change into the scripts directory:
cd scripts
Also, start a RabbitMQ server locally (e.g. execute rabbitmq-server).
Start the Function Registry Service:
./function-registry.sh
Register a Function:
./registerFunction.sh -n uppercase -f "f->f.map(s->s.toString().toUpperCase())"
Run a REST Microservice using that Function:
./web.sh -f uppercase -p 9000
curl -H "Content-Type: text/plain" -H "Accept: text/plain" localhost:9000/uppercase -d foo
Register a Supplier:
./registerSupplier.sh -n words -f "()->Flux.just(\"foo\",\"bar\")"
Run a REST Microservice using that Supplier:
./web.sh -s words -p 9001
curl -H "Accept: application/json" localhost:9001/words
Register a Consumer:
./registerConsumer.sh -n print -t String -f "System.out::println"
Run a REST Microservice using that Consumer:
./web.sh -c print -p 9002
curl -X POST -H "Content-Type: text/plain" -d foo localhost:9002/print
Run Stream Processing Microservices:
First register a streaming words supplier:
./registerSupplier.sh -n wordstream -f "()->Flux.interval(Duration.ofMillis(1000)).map(i->\"message-\"+i)"
Then start the source (supplier), processor (function), and sink (consumer) apps (in reverse order):
./stream.sh -p 9103 -i uppercaseWords -c print
./stream.sh -p 9102 -i words -f uppercase -o uppercaseWords
./stream.sh -p 9101 -s wordstream -o words
The output will appear in the console of the sink app (one message per second, converted to uppercase):
MESSAGE-0
MESSAGE-1
MESSAGE-2
MESSAGE-3
MESSAGE-4
MESSAGE-5
MESSAGE-6
MESSAGE-7
MESSAGE-8
MESSAGE-9
...
Serverless Platform Adapters
As well as being able to run as a standalone process, a Spring Cloud Function application can be adapted to run on one of the existing serverless platforms. In the project there are adapters for AWS Lambda, Azure, and Apache OpenWhisk. The Oracle Fn platform has its own Spring Cloud Function adapter. And Riff supports Java functions and its Java Function Invoker acts natively as an adapter for Spring Cloud Function jars.
AWS Lambda
3.0.0.M1
The AWS adapter takes a Spring Cloud Function app and converts it to a form that can run in AWS Lambda.
Introduction
The adapter has a couple of generic request handlers that you can use. The most generic is SpringBootStreamHandler, which uses a Jackson ObjectMapper provided by Spring Boot to serialize and deserialize the objects in the function. There is also a SpringBootRequestHandler which you can extend, and provide the input and output types as type parameters (enabling AWS to inspect the class and do the JSON conversions itself).
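As a hedged sketch (UppercaseHandler, Foo and Bar are hypothetical names from your own application), extending the request handler is usually a one-liner:
import org.springframework.cloud.function.adapter.aws.SpringBootRequestHandler;

// Foo and Bar are your own request/response POJOs
public class UppercaseHandler extends SpringBootRequestHandler<Foo, Bar> {
}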
If your app has more than one @Bean of type Function etc. then you can choose the one to use by configuring function.name (e.g. as FUNCTION_NAME environment variable in AWS). The functions are extracted from the Spring Cloud FunctionCatalog (searching first for Function then Consumer and finally Supplier).
Notes on JAR Layout
You don’t need the Spring Cloud Function Web or Stream adapter at runtime in Lambda, so you might need to exclude those before you create the JAR you send to AWS. A Lambda application has to be shaded, but a Spring Boot standalone application does not, so you can run the same app using 2 separate jars (as per the sample). The sample app creates 2 jar files, one with an aws classifier for deploying in Lambda, and one executable (thin) jar that includes spring-cloud-function-web at runtime. Spring Cloud Function will try and locate a "main class" for you from the JAR file manifest, using the Start-Class attribute (which will be added for you by the Spring Boot tooling if you use the starter parent). If there is no Start-Class in your manifest you can use an environment variable or system property MAIN_CLASS when you deploy the function to AWS.
If you are not using the functional bean definitions but relying on Spring Boot’s auto-configuration, then additional transformers must be configured as part of the maven-shade-plugin execution.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <dependencies>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
    </dependency>
  </dependencies>
  <configuration>
    <createDependencyReducedPom>false</createDependencyReducedPom>
    <shadedArtifactAttached>true</shadedArtifactAttached>
    <shadedClassifierName>aws</shadedClassifierName>
    <transformers>
      <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>META-INF/spring.handlers</resource>
      </transformer>
      <transformer implementation="org.springframework.boot.maven.PropertiesMergingResourceTransformer">
        <resource>META-INF/spring.factories</resource>
      </transformer>
      <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>META-INF/spring.schemas</resource>
      </transformer>
    </transformers>
  </configuration>
</plugin>
Build file setup
In order to run Spring Cloud Function applications on AWS Lambda, you can leverage Maven or Gradle plugins offered by the cloud platform provider.
Maven
In order to use the adapter plugin for Maven, add the plugin dependency to your pom.xml file:
<dependencies>
  <dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-function-adapter-aws</artifactId>
  </dependency>
</dependencies>
As pointed out in the Notes on JAR Layout, you will need a shaded jar in order to upload it to AWS Lambda. You can use the Maven Shade Plugin for that. An example of the setup can be found above.
You can use the Spring Boot Maven Plugin to generate the thin jar.
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <dependencies>
    <dependency>
      <groupId>org.springframework.boot.experimental</groupId>
      <artifactId>spring-boot-thin-layout</artifactId>
      <version>${wrapper.version}</version>
    </dependency>
  </dependencies>
</plugin>
You can find the entire sample pom.xml file for deploying Spring Cloud Function applications to AWS Lambda with Maven here.
Gradle
In order to use the adapter plugin for Gradle, add the dependency to your build.gradle file:
dependencies {
    compile("org.springframework.cloud:spring-cloud-function-adapter-aws:${version}")
}
As pointed out in Notes on JAR Layout, you will need a shaded jar in order to upload it to AWS Lambda. You can use the Gradle Shadow Plugin for that:
buildscript {
    dependencies {
        classpath "com.github.jengelman.gradle.plugins:shadow:${shadowPluginVersion}"
    }
}

apply plugin: 'com.github.johnrengelman.shadow'

assemble.dependsOn = [shadowJar]

import com.github.jengelman.gradle.plugins.shadow.transformers.*

shadowJar {
    classifier = 'aws'
    dependencies {
        exclude(
            dependency("org.springframework.cloud:spring-cloud-function-web:${springCloudFunctionVersion}"))
    }
    // Required for Spring
    mergeServiceFiles()
    append 'META-INF/spring.handlers'
    append 'META-INF/spring.schemas'
    append 'META-INF/spring.tooling'
    transform(PropertiesFileTransformer) {
        paths = ['META-INF/spring.factories']
        mergeStrategy = "append"
    }
}
You can use the Spring Boot Gradle Plugin and Spring Boot Thin Gradle Plugin to generate the thin jar.
buildscript {
    dependencies {
        classpath("org.springframework.boot.experimental:spring-boot-thin-gradle-plugin:${wrapperVersion}")
        classpath("org.springframework.boot:spring-boot-gradle-plugin:${springBootVersion}")
    }
}

apply plugin: 'org.springframework.boot'
apply plugin: 'org.springframework.boot.experimental.thin-launcher'

assemble.dependsOn = [thinJar]
You can find the entire sample build.gradle file for deploying Spring Cloud Function applications to AWS Lambda with Gradle here.
Upload
Build the sample under spring-cloud-function-samples/function-sample-aws and upload the -aws jar file to Lambda. The handler can be example.Handler or org.springframework.cloud.function.adapter.aws.SpringBootStreamHandler (FQN of the class, not a method reference, although Lambda does accept method references).
./mvnw -U clean package
Using the AWS command line tools it looks like this:
aws lambda create-function --function-name Uppercase --role arn:aws:iam::[USERID]:role/service-role/[ROLE] --zip-file fileb://function-sample-aws/target/function-sample-aws-2.0.0.SNAPSHOT-aws.jar --handler org.springframework.cloud.function.adapter.aws.SpringBootStreamHandler --description "Spring Cloud Function Adapter Example" --runtime java8 --region us-east-1 --timeout 30 --memory-size 1024 --publish
The input type for the function in the AWS sample is a Foo with a single property called "value". So you would need this to test it:
{ "value": "test" }
The AWS sample app is written in the "functional" style (as an ApplicationContextInitializer). This is much faster on startup in Lambda than the traditional @Bean style, so if you don’t need @Beans (or @EnableAutoConfiguration) it’s a good choice. Warm starts are not affected.
Type Conversion
Spring Cloud Function will attempt to transparently handle type conversion between the raw input stream and the types declared by your function.
For example, if your function signature is Function<Foo, Bar> we will attempt to convert the incoming stream event to an instance of Foo.
In the event the type is not known or can not be determined (e.g., Function<?, ?>) we will attempt to convert the incoming stream event to a generic Map.
Raw Input
There are times when you may want to have access to the raw input. In this case all you need is to declare your function signature to accept InputStream. For example, Function<InputStream, ?>. In this case we will not attempt any conversion and will pass the raw input directly to the function.
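A minimal sketch of such a function, assuming you simply want to count the incoming bytes (the byte-counting logic is purely illustrative):
@Bean
public Function<InputStream, String> byteCount() {
    return input -> {
        try (InputStream in = input) {
            int count = 0;
            // read the raw stream byte by byte, no conversion applied
            while (in.read() != -1) {
                count++;
            }
            return "Read " + count + " bytes";
        }
        catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    };
}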
Functional Bean Definitions
Your functions will start much quicker if you can use functional bean definitions instead of @Bean. To do this make your main class an ApplicationContextInitializer<GenericApplicationContext> and use the registerBean() methods in GenericApplicationContext to create all the beans you need. Your function needs to be registered as a bean of type FunctionRegistration so that the input and output types can be accessed by the framework. There is an example in github (the AWS sample is written in this style). It would look something like this:
@SpringBootApplication
public class FuncApplication implements ApplicationContextInitializer<GenericApplicationContext> {

  public static void main(String[] args) throws Exception {
    FunctionalSpringApplication.run(FuncApplication.class, args);
  }

  public Function<Foo, Bar> function() {
    return value -> new Bar(value.uppercase());
  }

  @Override
  public void initialize(GenericApplicationContext context) {
    context.registerBean("function", FunctionRegistration.class,
        () -> new FunctionRegistration<Function<Foo, Bar>>(function())
            .type(FunctionType.from(Foo.class).to(Bar.class).getType()));
  }
}
Platform Specific Features
HTTP and API Gateway
AWS has some platform-specific data types, including batching of messages, which is much more efficient than processing each one individually. To make use of these types you can write a function that depends on those types. Or you can rely on Spring to extract the data from the AWS types and convert it to a Spring Message. To do this you tell AWS that the function is of a specific generic handler type (depending on the AWS service) and provide a bean of type Function<Message<S>,Message<T>>, where S and T are your business data types. If there is more than one bean of type Function you may also need to configure the Spring Boot property function.name to be the name of the target bean (e.g. use FUNCTION_NAME as an environment variable).
The supported AWS services and generic handler types are listed below:
Service | AWS Types | Generic Handler
---|---|---
API Gateway | APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent | org.springframework.cloud.function.adapter.aws.SpringBootApiGatewayRequestHandler
Kinesis | KinesisEvent | org.springframework.cloud.function.adapter.aws.SpringBootKinesisEventHandler
For example, to deploy behind an API Gateway, use --handler org.springframework.cloud.function.adapter.aws.SpringBootApiGatewayRequestHandler in your AWS command line (or via the UI) and define a @Bean of type Function<Message<Foo>,Message<Bar>> where Foo and Bar are POJO types (the data will be marshalled and unmarshalled by AWS using Jackson).
Custom Runtime
An AWS Lambda custom runtime can be created really easily using the HTTP export features in Spring Cloud Function Web. To make this work just add Spring Cloud Function AWS and Spring Cloud Function Web as dependencies in your project and set the following in your application.properties:
spring.cloud.function.web.export.enabled=true
Set the handler name in AWS to the name of your function. Then provide a bootstrap script in the root of your zip/jar that runs the Spring Boot application. The functional bean definition style works for custom runtimes too, and is faster than the @Bean style, so the example FuncApplication above would work. A custom runtime can start up much quicker even than a functional bean implementation of a Java lambda - it depends mostly on the number of classes you need to load at runtime. Spring doesn’t do very much here, so you can reduce the cold start time by only using primitive types in your function, for instance, and not doing any work in custom @PostConstruct initializers.
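For orientation only, a hedged sketch of what such a bootstrap script might look like (the jar name is illustrative, not prescriptive; consult the AWS custom runtime documentation for the exact contract):
#!/bin/sh
# AWS sets LAMBDA_TASK_ROOT to the directory containing the unpacked deployment package
cd ${LAMBDA_TASK_ROOT:-.}
java -jar my-function-app.jar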
Microsoft Azure
3.0.0.M1
The Azure adapter bootstraps a Spring Cloud Function context and channels function calls from the Azure framework into the user functions, using Spring Boot configuration where necessary. Azure Functions has quite a unique, but invasive programming model, involving annotations in user code that are specific to the platform. The easiest way to use it with Spring Cloud is to extend a base class and write a method in it with the @FunctionName annotation which delegates to a base class method.
This project provides an adapter layer for a Spring Cloud Function application onto Azure. You can write an app with a single @Bean of type Function and it will be deployable in Azure if you get the JAR file laid out right.
There is an AzureSpringBootRequestHandler which you must extend, and provide the input and output types as annotated method parameters (enabling Azure to inspect the class and create JSON bindings). The base class has two useful methods (handleRequest and handleOutput) to which you can delegate the actual function call, so mostly the function will only ever have one line.
Example:
public class FooHandler extends AzureSpringBootRequestHandler<Foo, Bar> {

  @FunctionName("uppercase")
  public Bar execute(
      @HttpTrigger(name = "req", methods = { HttpMethod.GET,
          HttpMethod.POST }, authLevel = AuthorizationLevel.ANONYMOUS) Foo foo,
      ExecutionContext context) {
    return handleRequest(foo, context);
  }
}
This Azure handler will delegate to a Function<Foo,Bar> bean (or a Function<Publisher<Foo>,Publisher<Bar>>). Some Azure triggers (e.g. @CosmosDBTrigger) result in an input type of List and in that case you can bind to List in the Azure handler, or String (the raw JSON). The List input delegates to a Function with input type Map<String,Object>, or Publisher or List of the same type. The output of the Function can be a List (one-for-one) or a single value (aggregation), and the output binding in the Azure declaration should match.
If your app has more than one @Bean of type Function etc. then you can choose the one to use by configuring function.name. Or if you make the @FunctionName in the Azure handler method match the function name it should work that way (also for function apps with multiple functions). The functions are extracted from the Spring Cloud FunctionCatalog so the default function names are the same as the bean names.
Accessing Azure ExecutionContext
Sometimes there is a need to access the target execution context provided by the Azure runtime in the form of com.microsoft.azure.functions.ExecutionContext. For example, one such need is logging, so it can appear in the Azure console.
For that purpose Spring Cloud Function will register ExecutionContext as a bean in the Application context, so it can be injected into your function. For example:
@Bean
public Function<Foo, Bar> uppercase(ExecutionContext targetContext) {
    return foo -> {
        targetContext.getLogger().info("Invoking 'uppercase' on " + foo.getValue());
        return new Bar(foo.getValue().toUpperCase());
    };
}
Normally type-based injection should suffice; however, if you need to, you can also utilise the bean name under which it is registered, which is targetExecutionContext.
Notes on JAR Layout
You don’t need the Spring Cloud Function Web at runtime in Azure, so you can exclude this before you create the JAR you deploy to Azure, but it won’t be used if you include it, so it doesn’t hurt to leave it in. A function application on Azure is an archive generated by the Maven plugin. The function lives in the JAR file generated by this project. The sample creates it as an executable jar, using the thin layout, so that Azure can find the handler classes. If you prefer you can just use a regular flat JAR file. The dependencies should not be included.
Build file setup
In order to run Spring Cloud Function applications on Microsoft Azure, you can leverage the Maven plugin offered by the cloud platform provider.
In order to use the adapter plugin for Maven, add the plugin dependency to your pom.xml file:
<dependencies>
  <dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-function-adapter-azure</artifactId>
  </dependency>
</dependencies>
Then, configure the plugin. You will need to provide Azure-specific configuration for your application, specifying the resourceGroup, appName and other optional properties, and add the package goal execution so that the function.json file required by Azure is generated for you. Full plugin documentation can be found in the plugin repository.
<plugin>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-functions-maven-plugin</artifactId>
  <configuration>
    <resourceGroup>${functionResourceGroup}</resourceGroup>
    <appName>${functionAppName}</appName>
  </configuration>
  <executions>
    <execution>
      <id>package-functions</id>
      <goals>
        <goal>package</goal>
      </goals>
    </execution>
  </executions>
</plugin>
You will also have to ensure that the files to be scanned by the plugin can be found in the Azure functions staging directory (see the plugin repository for more details on the staging directory and its default location).
You can find the entire sample pom.xml file for deploying Spring Cloud Function applications to Microsoft Azure with Maven here.
As of yet, only the Maven plugin is available; a Gradle plugin has not been created by the cloud platform provider.
Build
./mvnw -U clean package
Running the sample
You can run the sample locally, just like the other Spring Cloud Function samples, and then test it with curl -H "Content-Type: text/plain" localhost:8080/function -d '{"value": "hello foobar"}'.
You will need the az CLI app (see https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-java-maven for more detail). To deploy the function on the Azure runtime:
$ az login
$ mvn azure-functions:deploy
On another terminal try this: curl https://<azure-function-url-from-the-log>/api/uppercase -d '{"value": "hello foobar!"}'. Please ensure that you use the right URL for the function above. Alternatively you can test the function in the Azure Dashboard UI (click on the function name, go to the right hand side and click "Test" and to the bottom right, "Run").
The input type for the function in the Azure sample is a Foo with a single property called "value". So you need to test it with something like this:
{ "value": "foobar" }
The Azure sample app is written in the "non-functional" style (using @Bean). The functional style (with just Function or ApplicationContextInitializer) is much faster on startup in Azure than the traditional @Bean style, so if you don’t need @Beans (or @EnableAutoConfiguration) it’s a good choice. Warm starts are not affected.
Apache OpenWhisk
3.0.0.M1
The OpenWhisk adapter is in the form of an executable jar that can be used in a Docker image to be deployed to OpenWhisk. The platform works in request-response mode, listening on port 8080 on a specific endpoint, so the adapter is a simple Spring MVC application.
Quick Start
Implement a POF (be sure to use the functions package):
package functions;

import java.util.function.Function;

public class Uppercase implements Function<String, String> {

    public String apply(String input) {
        return input.toUpperCase();
    }
}
Install it into your local Maven repository:
./mvnw clean install
Create a function.properties file that provides its Maven coordinates. For example:
dependencies.function: com.example:pof:0.0.1-SNAPSHOT
Copy the openwhisk runner JAR to the working directory (same directory as the properties file):
cp spring-cloud-function-adapters/spring-cloud-function-adapter-openwhisk/target/spring-cloud-function-adapter-openwhisk-2.0.0.BUILD-SNAPSHOT.jar runner.jar
Generate an m2 repo from the --thin.dryrun of the runner JAR with the above properties file:
java -jar -Dthin.root=m2 runner.jar --thin.name=function --thin.dryrun
Use the following Dockerfile:
FROM openjdk:8-jdk-alpine
VOLUME /tmp
COPY m2 /m2
ADD runner.jar .
ADD function.properties .
ENV JAVA_OPTS=""
ENTRYPOINT [ "java", "-Djava.security.egd=file:/dev/./urandom", "-jar", "runner.jar", "--thin.root=/m2", "--thin.name=function", "--function.name=uppercase"]
EXPOSE 8080
You could use a Spring Cloud Function app, instead of just a jar with a POF in it, in which case you would have to change the way the app runs in the container so that it picks up the main class as a source file. For example, you could change the ENTRYPOINT above and add --spring.main.sources=com.example.SampleApplication.
Build the Docker image:
docker build -t [username/appname] .
Push the Docker image:
docker push [username/appname]
Use the OpenWhisk CLI (e.g. after vagrant ssh) to create the action:
wsk action create example --docker [username/appname]
Invoke the action:
wsk action invoke example --result --param payload foo
{
"result": "FOO"
}