As well as being able to run as a standalone process, a Spring Cloud Function application can be adapted to run on one of the existing serverless platforms. The project provides adapters for AWS Lambda, Azure, and Apache OpenWhisk. The Oracle Fn platform has its own Spring Cloud Function adapter, and Riff supports Java functions, with its Java Function Invoker acting natively as an adapter for Spring Cloud Function jars.
The AWS adapter takes a Spring Cloud Function app and converts it to a form that can run in AWS Lambda.
The adapter has a couple of generic request handlers that you can use. The most generic is SpringBootStreamHandler, which uses a Jackson ObjectMapper provided by Spring Boot to serialize and deserialize the objects in the function. There is also a SpringBootRequestHandler which you can extend, providing the input and output types as type parameters (enabling AWS to inspect the class and do the JSON conversions itself).
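For example, a minimal handler could look something like this, where Foo and Bar stand in for your own input and output POJOs:

```java
package example;

import org.springframework.cloud.function.adapter.aws.SpringBootRequestHandler;

// Sketch of a Lambda handler: the type parameters (here the hypothetical Foo and Bar
// POJOs) tell AWS which JSON shapes to convert to and from.
public class Handler extends SpringBootRequestHandler<Foo, Bar> {
}
```

You would then point Lambda at example.Handler instead of the generic stream handler.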
If your app has more than one @Bean of type Function etc. then you can choose the one to use by configuring function.name (e.g. as the FUNCTION_NAME environment variable in AWS). The functions are extracted from the Spring Cloud FunctionCatalog (searching first for Function, then Consumer, and finally Supplier).
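For instance, assuming the target bean is named uppercase (a hypothetical name), you could select it like this:

```
# application.properties
function.name=uppercase
```

or set FUNCTION_NAME=uppercase in the Lambda environment variables.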
You don’t need the Spring Cloud Function Web or Stream adapter at runtime in Lambda, so you might need to exclude those before you create the JAR you send to AWS. A Lambda application has to be shaded, but a Spring Boot standalone application does not, so you can run the same app using two separate jars (as per the sample). The sample app creates two jar files: one with an aws classifier for deploying in Lambda, and one executable (thin) jar that includes spring-cloud-function-web at runtime. Spring Cloud Function will try to locate a "main class" for you from the JAR file manifest, using the Start-Class attribute (which will be added for you by the Spring Boot tooling if you use the starter parent). If there is no Start-Class in your manifest, you can use the MAIN_CLASS environment variable when you deploy the function to AWS.
Build the sample under spring-cloud-function-samples/function-sample-aws and upload the -aws jar file to Lambda. The handler can be example.Handler or org.springframework.cloud.function.adapter.aws.SpringBootStreamHandler (FQN of the class, not a method reference, although Lambda does accept method references).
./mvnw -U clean package
Using the AWS command line tools it looks like this:
aws lambda create-function --function-name Uppercase --role arn:aws:iam::[USERID]:role/service-role/[ROLE] --zip-file fileb://function-sample-aws/target/function-sample-aws-2.0.0.BUILD-SNAPSHOT-aws.jar --handler org.springframework.cloud.function.adapter.aws.SpringBootStreamHandler --description "Spring Cloud Function Adapter Example" --runtime java8 --region us-east-1 --timeout 30 --memory-size 1024 --publish
The input type for the function in the AWS sample is a Foo with a single property called "value". So you would need this to test it:
{ "value": "test" }
Note: The AWS sample app is written in the "functional" style (as an ApplicationContextInitializer), described below.
Your functions will start much quicker if you can use functional bean definitions instead of @Bean. To do this, make your main class an ApplicationContextInitializer<GenericApplicationContext> and use the registerBean() methods in GenericApplicationContext to create all the beans you need. Your function needs to be registered as a bean of type FunctionRegistration so that the input and output types can be accessed by the framework. There is an example in GitHub (the AWS sample is written in this style). It would look something like this:
```java
@SpringBootApplication
public class FuncApplication implements ApplicationContextInitializer<GenericApplicationContext> {

	public static void main(String[] args) throws Exception {
		FunctionalSpringApplication.run(FuncApplication.class, args);
	}

	public Function<Foo, Bar> function() {
		return value -> new Bar(value.uppercase());
	}

	@Override
	public void initialize(GenericApplicationContext context) {
		context.registerBean("function", FunctionRegistration.class,
				() -> new FunctionRegistration<Function<Foo, Bar>>(function())
						.type(FunctionType.from(Foo.class).to(Bar.class).getType()));
	}

}
```
AWS has some platform-specific data types, including batching of messages, which is much more efficient than processing each one individually. To make use of these types you can write a function that depends on those types. Or you can rely on Spring to extract the data from the AWS types and convert it to a Spring Message. To do this you tell AWS that the function is of a specific generic handler type (depending on the AWS service) and provide a bean of type Function<Message<S>,Message<T>>, where S and T are your business data types. If there is more than one bean of type Function you may also need to configure the Spring Boot property function.name to be the name of the target bean (e.g. use FUNCTION_NAME as an environment variable).
The supported AWS services and generic handler types are listed below:

Service | AWS Types | Generic Handler
---|---|---
API Gateway | APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent | org.springframework.cloud.function.adapter.aws.SpringBootApiGatewayRequestHandler
Kinesis | KinesisEvent | org.springframework.cloud.function.adapter.aws.SpringBootKinesisEventHandler
For example, to deploy behind an API Gateway, use --handler org.springframework.cloud.function.adapter.aws.SpringBootApiGatewayRequestHandler in your AWS command line (or via the UI) and define a @Bean of type Function<Message<Foo>,Message<Bar>> where Foo and Bar are POJO types (the data will be marshalled and unmarshalled by AWS using Jackson).
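A sketch of such a bean, assuming the same Foo and Bar POJOs as the sample (with a getValue() getter and a one-argument constructor respectively), might look like this:

```java
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

// Message-based function for the API Gateway handler: the payload carries the
// business data, and the message headers carry the request metadata.
@Bean
public Function<Message<Foo>, Message<Bar>> uppercase() {
	return request -> {
		Bar bar = new Bar(request.getPayload().getValue().toUpperCase());
		return MessageBuilder.withPayload(bar).copyHeaders(request.getHeaders()).build();
	};
}
```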
An AWS Lambda custom runtime can be created really easily using the HTTP export features in Spring Cloud Function Web. To make this work, just add Spring Cloud Function AWS and Spring Cloud Function Web as dependencies in your project and set the following in your application.properties:
spring.cloud.function.web.export.enabled=true
Set the handler name in AWS to the name of your function. Then provide a bootstrap script in the root of your zip/jar that runs the Spring Boot application. The functional bean definition style works for custom runtimes too, and is faster than the @Bean style, so the example FuncApplication above would work. A custom runtime can start up much quicker even than a functional bean implementation of a Java lambda - it depends mostly on the number of classes you need to load at runtime. Spring doesn’t do very much here, so you can reduce the cold start time by only using primitive types in your function, for instance, and not doing any work in custom @PostConstruct initializers.
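The bootstrap script itself can be very small - essentially it just launches the application. A minimal sketch (the jar name is a placeholder for whatever your build produces) could be:

```
#!/bin/sh
# AWS runs this file from the root of the deployed zip/jar
cd ${LAMBDA_TASK_ROOT:-.}
exec java -jar my-function.jar
```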
The Azure adapter bootstraps a Spring Cloud Function context and channels function calls from the Azure framework into the user functions, using Spring Boot configuration where necessary. Azure Functions has quite a unique, but invasive, programming model, involving annotations in user code that are specific to the platform. The easiest way to use it with Spring Cloud is to extend a base class and write a method in it with the @FunctionName annotation which delegates to a base class method.
This project provides an adapter layer for a Spring Cloud Function application onto Azure.
You can write an app with a single @Bean of type Function and it will be deployable in Azure if you get the JAR file laid out right.
There is an AzureSpringBootRequestHandler which you must extend, providing the input and output types as annotated method parameters (enabling Azure to inspect the class and create JSON bindings). The base class has two useful methods (handleRequest and handleOutput) to which you can delegate the actual function call, so mostly the function will only ever have one line.
Example:
```java
public class FooHandler extends AzureSpringBootRequestHandler<Foo, Bar> {

	@FunctionName("uppercase")
	public Bar execute(
			@HttpTrigger(name = "req", methods = { HttpMethod.GET, HttpMethod.POST },
					authLevel = AuthorizationLevel.ANONYMOUS) Foo foo,
			ExecutionContext context) {
		return handleRequest(foo, context);
	}

}
```
This Azure handler will delegate to a Function<Foo,Bar> bean (or a Function<Publisher<Foo>,Publisher<Bar>>). Some Azure triggers (e.g. @CosmosDBTrigger) result in an input type of List, and in that case you can bind to List in the Azure handler, or String (the raw JSON). The List input delegates to a Function with input type Map<String,Object>, or Publisher or List of the same type. The output of the Function can be a List (one-for-one) or a single value (aggregation), and the output binding in the Azure declaration should match.
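As an illustration, a bean that such a List input could delegate to, mapping each incoming document to one output element (one-for-one), might look like this sketch (the "id" field is just an assumed property of the documents):

```java
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

import org.springframework.context.annotation.Bean;

// One output element per input document; the Azure output binding should be a List as well.
@Bean
public Function<List<Map<String, Object>>, List<String>> documentIds() {
	return docs -> docs.stream()
			.map(doc -> (String) doc.get("id"))
			.collect(Collectors.toList());
}
```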
If your app has more than one @Bean of type Function etc. then you can choose the one to use by configuring function.name. Or, if you make the @FunctionName in the Azure handler method match the function name, it should work that way (also for function apps with multiple functions). The functions are extracted from the Spring Cloud FunctionCatalog, so the default function names are the same as the bean names.
Sometimes there is a need to access the target execution context provided by the Azure runtime in the form of com.microsoft.azure.functions.ExecutionContext. One such need is logging, so that messages can appear in the Azure console. For that purpose Spring Cloud Function registers ExecutionContext as a bean in the application context, so it can be injected into your function. For example:
```java
@Bean
public Function<Foo, Bar> uppercase(ExecutionContext targetContext) {
	return foo -> {
		targetContext.getLogger().info("Invoking 'uppercase' on " + foo.getValue());
		return new Bar(foo.getValue().toUpperCase());
	};
}
```
Normally type-based injection should suffice, but if you need to, you can also select it by the bean name under which it is registered, which is targetExecutionContext.
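If you do want to select it by name, a sketch of the same function using Spring's @Qualifier might look like this:

```java
@Bean
public Function<Foo, Bar> uppercase(
		@Qualifier("targetExecutionContext") ExecutionContext targetContext) {
	return foo -> {
		targetContext.getLogger().info("Invoking 'uppercase' on " + foo.getValue());
		return new Bar(foo.getValue().toUpperCase());
	};
}
```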
You don’t need Spring Cloud Function Web at runtime in Azure, so you can exclude it before you create the JAR you deploy to Azure, but it won’t be used if you include it, so it doesn’t hurt to leave it in. A function application on Azure is an archive generated by the Maven plugin. The function lives in the JAR file generated by this project. The sample creates it as an executable jar, using the thin layout, so that Azure can find the handler classes. If you prefer, you can just use a regular flat JAR file. The dependencies should not be included.
You can run the sample locally, just like the other Spring Cloud Function samples, and then call the function with curl -H "Content-Type: text/plain" localhost:8080/function -d '{"value": "hello foobar"}'.
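For example, assuming the sample uses the standard Spring Boot Maven plugin, you could start it locally with:

```
mvn spring-boot:run
```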
You will need the az CLI app (see https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-java-maven for more detail). To deploy the function on the Azure runtime:
```
$ az login
$ mvn azure-functions:deploy
```
On another terminal try this: curl https://<azure-function-url-from-the-log>/api/uppercase -d '{"value": "hello foobar!"}'. Please ensure that you use the right URL for the function above. Alternatively you can test the function in the Azure Dashboard UI (click on the function name, go to the right hand side and click "Test" and to the bottom right, "Run").
The input type for the function in the Azure sample is a Foo with a single property called "value". So you can test it with something like this:
{ "value": "foobar" }
Note: The Azure sample app is written in the "non-functional" style (using @Bean definitions).
The OpenWhisk adapter is in the form of an executable jar that can be used in a docker image to be deployed to OpenWhisk. The platform works in request-response mode, listening on port 8080 on a specific endpoint, so the adapter is a simple Spring MVC application.
Implement a POF (be sure to use the functions package):
```java
package functions;

import java.util.function.Function;

public class Uppercase implements Function<String, String> {

	public String apply(String input) {
		return input.toUpperCase();
	}

}
```
Install it into your local Maven repository:
./mvnw clean install
Create a function.properties file that provides its Maven coordinates. For example:
dependencies.function: com.example:pof:0.0.1-SNAPSHOT
Copy the openwhisk runner JAR to the working directory (same directory as the properties file):
cp spring-cloud-function-adapters/spring-cloud-function-adapter-openwhisk/target/spring-cloud-function-adapter-openwhisk-2.0.0.BUILD-SNAPSHOT.jar runner.jar
Generate an m2 repo from the --thin.dryrun of the runner JAR with the above properties file:
java -jar -Dthin.root=m2 runner.jar --thin.name=function --thin.dryrun
Use the following Dockerfile:
```
FROM openjdk:8-jdk-alpine
VOLUME /tmp
COPY m2 /m2
ADD runner.jar .
ADD function.properties .
ENV JAVA_OPTS=""
ENTRYPOINT [ "java", "-Djava.security.egd=file:/dev/./urandom", "-jar", "runner.jar", "--thin.root=/m2", "--thin.name=function", "--function.name=uppercase"]
EXPOSE 8080
```
Note that you could use a Spring Cloud Function app, instead of just a jar with a POF in it, in which case you would have to change the way the app runs in the container so that it picks up the main class as a source file. For example, you could change the ENTRYPOINT above and add --spring.main.sources=com.example.SampleApplication.
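For example, the modified ENTRYPOINT might look like this (with com.example.SampleApplication standing in for your own application class):

```
ENTRYPOINT [ "java", "-Djava.security.egd=file:/dev/./urandom", "-jar", "runner.jar", "--thin.root=/m2", "--thin.name=function", "--function.name=uppercase", "--spring.main.sources=com.example.SampleApplication"]
```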
Build the Docker image:
docker build -t [username/appname] .
Push the Docker image:
docker push [username/appname]
Use the OpenWhisk CLI (e.g. after vagrant ssh) to create the action:
wsk action create example --docker [username/appname]
Invoke the action:
```
wsk action invoke example --result --param payload foo
{
  "result": "FOO"
}
```