---
apiVersion: v1
kind: List
items:
- apiVersion: v1
kind: ConfigMap
metadata:
labels:
funktion.fabric8.io/kind: Connector
provider: fabric8
project: connector-spark-rest
version: 1.1.9
group: io.fabric8.funktion.connector
name: spark-rest
data:
deployment.yml: |
---
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
labels:
funktion.fabric8.io/kind: Subscription
connector: spark-rest
spec:
replicas: 1
template:
metadata:
labels:
funktion.fabric8.io/kind: Subscription
connector: spark-rest
spec:
containers:
- image: fabric8/connector-spark-rest:1.1.9
name: connector
schema.yml: |
---
component:
kind: component
scheme: spark-rest
syntax: spark-rest:verb:path
title: Spark Rest
description: The spark-rest component is used for hosting REST services which have been defined using the Camel Rest DSL.
label: rest
deprecated: false
async: false
consumerOnly: true
javaType: org.apache.camel.component.sparkrest.SparkComponent
groupId: org.apache.camel
artifactId: camel-spark-rest
version: 2.18.1
componentProperties:
port:
kind: property
type: integer
javaType: int
deprecated: false
secret: false
description: Port number. By default port 4567 is used.
ipAddress:
kind: property
type: string
javaType: java.lang.String
deprecated: false
secret: false
description: Set the IP address that Spark should listen on. If not set, the default address is '0.0.0.0'.
minThreads:
kind: property
type: integer
javaType: int
deprecated: false
secret: false
description: Minimum number of threads in Spark thread-pool (shared globally)
maxThreads:
kind: property
type: integer
javaType: int
deprecated: false
secret: false
description: Maximum number of threads in Spark thread-pool (shared globally)
timeOutMillis:
kind: property
type: integer
javaType: int
deprecated: false
secret: false
description: Thread idle timeout in millis; threads that have been idle for a longer period will be terminated from the thread pool.
keystoreFile:
kind: property
type: string
javaType: java.lang.String
deprecated: false
secret: false
description: Configures the connection to be secure and to use the given keystore file.
keystorePassword:
kind: property
type: string
javaType: java.lang.String
deprecated: false
secret: false
description: Configures the connection to be secure and to use the given keystore password.
truststoreFile:
kind: property
type: string
javaType: java.lang.String
deprecated: false
secret: false
description: Configures the connection to be secure and to use the given truststore file.
truststorePassword:
kind: property
type: string
javaType: java.lang.String
deprecated: false
secret: false
description: Configures the connection to be secure and to use the given truststore password.
sparkConfiguration:
kind: property
type: object
javaType: org.apache.camel.component.sparkrest.SparkConfiguration
deprecated: false
secret: false
description: To use the shared SparkConfiguration
sparkBinding:
kind: property
type: object
javaType: org.apache.camel.component.sparkrest.SparkBinding
deprecated: false
secret: false
description: To use a custom SparkBinding to map to/from Camel message.
properties:
verb:
kind: path
group: consumer
required: true
type: string
javaType: java.lang.String
enum:
- get
- post
- put
- patch
- delete
- head
- trace
- connect
- options
deprecated: false
secret: false
description: The HTTP verb to use, such as get, post, put, patch, delete, head, trace, connect or options.
path:
kind: path
group: consumer
required: true
type: string
javaType: java.lang.String
deprecated: false
secret: false
description: The content path, which supports Spark syntax.
accept:
kind: parameter
group: consumer
type: string
javaType: java.lang.String
deprecated: false
secret: false
description: 'Accept type such as: ''text/xml'' or ''application/json''. By default
we accept all kinds of types.'
bridgeErrorHandler:
kind: parameter
group: consumer
label: consumer
type: boolean
javaType: boolean
optionalPrefix: consumer.
deprecated: false
secret: false
defaultValue: false
description: Allows for bridging the consumer to the Camel routing Error Handler, which means any exceptions that occur while the consumer is trying to pick up incoming messages, or the like, will now be processed as a message and handled by the routing Error Handler. By default the consumer will use the org.apache.camel.spi.ExceptionHandler to deal with exceptions; these will be logged at WARN/ERROR level and ignored.
disableStreamCache:
kind: parameter
group: consumer
type: boolean
javaType: boolean
deprecated: false
secret: false
defaultValue: false
description: Determines whether or not the raw input stream from Spark HttpRequest#getContent() is cached (Camel will read the stream into a light-weight in-memory stream cache). By default Camel will cache the Netty input stream to support reading it multiple times, to ensure Camel can retrieve all data from the stream. However you can set this option to true when you, for example, need to access the raw stream, such as streaming it directly to a file or other persistent store. Mind that if you enable this option then you cannot read the Netty stream multiple times out of the box, and you would need to manually reset the reader index on the Spark raw stream.
mapHeaders:
kind: parameter
group: consumer
type: boolean
javaType: boolean
deprecated: false
secret: false
defaultValue: true
description: If this option is enabled then during binding from Spark to the Camel Message the headers will be mapped as well (eg added as headers on the Camel Message). You can turn off this option to disable this. The headers can still be accessed from the org.apache.camel.component.sparkrest.SparkMessage message with the method getRequest() which returns the Spark HTTP request instance.
transferException:
kind: parameter
group: consumer
type: boolean
javaType: boolean
deprecated: false
secret: false
defaultValue: false
description: If enabled, and an Exchange failed processing on the consumer side, then the caused Exception is sent back serialized in the response as an application/x-java-serialized-object content type. This is turned off by default. If you enable this then be aware that Java will deserialize the incoming data from the request to Java, and that can be a potential security risk.
urlDecodeHeaders:
kind: parameter
group: consumer
type: boolean
javaType: boolean
deprecated: false
secret: false
defaultValue: false
description: If this option is enabled then during binding from Spark to the Camel Message the header values will be URL decoded (eg %20 will become a space character).
exceptionHandler:
kind: parameter
group: consumer (advanced)
label: consumer,advanced
type: object
javaType: org.apache.camel.spi.ExceptionHandler
optionalPrefix: consumer.
deprecated: false
secret: false
description: To let the consumer use a custom ExceptionHandler. Notice if the option bridgeErrorHandler is enabled then this option is not in use. By default the consumer will deal with exceptions; these will be logged at WARN/ERROR level and ignored.
exchangePattern:
kind: parameter
group: consumer (advanced)
label: consumer,advanced
type: string
javaType: org.apache.camel.ExchangePattern
enum:
- InOnly
- RobustInOnly
- InOut
- InOptionalOut
- OutOnly
- RobustOutOnly
- OutIn
- OutOptionalIn
deprecated: false
secret: false
description: Sets the exchange pattern when the consumer creates an exchange.
matchOnUriPrefix:
kind: parameter
group: advanced
label: advanced
type: boolean
javaType: boolean
deprecated: false
secret: false
defaultValue: false
description: Whether or not the consumer should try to find a target consumer by matching the URI prefix if no exact match is found.
sparkBinding:
kind: parameter
group: advanced
label: advanced
type: object
javaType: org.apache.camel.component.sparkrest.SparkBinding
deprecated: false
secret: false
description: To use a custom SparkBinding to map to/from Camel message.
synchronous:
kind: parameter
group: advanced
label: advanced
type: boolean
javaType: boolean
deprecated: false
secret: false
defaultValue: false
description: Sets whether synchronous processing should be strictly used or Camel is allowed to use asynchronous processing (if supported).
documentation.adoc: |+
[[Spark-rest-Spark-restComponent]]
Spark-rest Component
~~~~~~~~~~~~~~~~~~~~
*Available as of Camel 2.14*
The Spark-rest component allows you to define REST endpoints using the
http://sparkjava.com/[Spark REST Java library] with the
link:rest-dsl.html[Rest DSL].
INFO: Spark Java requires a Java 8 runtime.
Maven users will need to add the following dependency to their pom.xml
for this component:
[source,xml]
-------------------------------------------------
<dependency>
  <groupId>org.apache.camel</groupId>
  <artifactId>camel-spark-rest</artifactId>
  <version>${camel-version}</version>
</dependency>
-------------------------------------------------
[[Spark-rest-URIformat]]
URI format
^^^^^^^^^^
[source,java]
----------------------------------
spark-rest://verb:path?[options]
----------------------------------
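For example, a consumer endpoint could restrict the accepted content type with the accept option listed below (illustrative URI):
[source,java]
---------------------------------------------------
from("spark-rest:get:hello?accept=application/json")
    .transform().constant("Hello JSON");
---------------------------------------------------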
[[Spark-rest-URIOptions]]
URI Options
^^^^^^^^^^^
// component options: START
The Spark Rest component supports 11 options which are listed below.
{% raw %}
[width="100%",cols="2,1m,7",options="header"]
|=======================================================================
| Name | Java Type | Description
| port | int | Port number. By default port 4567 is used.
| ipAddress | String | Set the IP address that Spark should listen on. If not set, the default address is '0.0.0.0'.
| minThreads | int | Minimum number of threads in Spark thread-pool (shared globally)
| maxThreads | int | Maximum number of threads in Spark thread-pool (shared globally)
| timeOutMillis | int | Thread idle timeout in millis; threads that have been idle for a longer period will be terminated from the thread pool.
| keystoreFile | String | Configures the connection to be secure and to use the given keystore file.
| keystorePassword | String | Configures the connection to be secure and to use the given keystore password.
| truststoreFile | String | Configures the connection to be secure and to use the given truststore file.
| truststorePassword | String | Configures the connection to be secure and to use the given truststore password.
| sparkConfiguration | SparkConfiguration | To use the shared SparkConfiguration
| sparkBinding | SparkBinding | To use a custom SparkBinding to map to/from Camel message.
|=======================================================================
{% endraw %}
// component options: END
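For example, these component options can be set programmatically before the routes are started. The following is a minimal sketch, assuming the standard Camel setter conventions (setPort, setMinThreads, setMaxThreads) on SparkComponent:
[source,java]
-----------------------------------------------------------------------
import org.apache.camel.CamelContext;
import org.apache.camel.component.sparkrest.SparkComponent;
import org.apache.camel.impl.DefaultCamelContext;

public class ConfigureSparkRest {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        // Look up (and thereby create) the spark-rest component and configure it
        SparkComponent spark = context.getComponent("spark-rest", SparkComponent.class);
        spark.setPort(8080);      // default is 4567
        spark.setMinThreads(2);   // thread-pool sizing (shared globally)
        spark.setMaxThreads(20);
    }
}
-----------------------------------------------------------------------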
// endpoint options: START
The Spark Rest component supports 13 endpoint options which are listed below:
{% raw %}
[width="100%",cols="2,1,1m,1m,5",options="header"]
|=======================================================================
| Name | Group | Default | Java Type | Description
| verb | consumer | | String | *Required* The HTTP verb to use, such as get, post, put, patch, delete, head, trace, connect or options.
| path | consumer | | String | *Required* The content path, which supports Spark syntax.
| accept | consumer | | String | Accept type such as: 'text/xml' or 'application/json'. By default we accept all kinds of types.
| bridgeErrorHandler | consumer | false | boolean | Allows for bridging the consumer to the Camel routing Error Handler, which means any exceptions that occur while the consumer is trying to pick up incoming messages, or the like, will now be processed as a message and handled by the routing Error Handler. By default the consumer will use the org.apache.camel.spi.ExceptionHandler to deal with exceptions; these will be logged at WARN/ERROR level and ignored.
| disableStreamCache | consumer | false | boolean | Determines whether or not the raw input stream from Spark HttpRequest#getContent() is cached (Camel will read the stream into a light-weight in-memory stream cache). By default Camel will cache the Netty input stream to support reading it multiple times, to ensure Camel can retrieve all data from the stream. However you can set this option to true when you, for example, need to access the raw stream, such as streaming it directly to a file or other persistent store. Mind that if you enable this option then you cannot read the Netty stream multiple times out of the box, and you would need to manually reset the reader index on the Spark raw stream.
| mapHeaders | consumer | true | boolean | If this option is enabled then during binding from Spark to the Camel Message the headers will be mapped as well (eg added as headers on the Camel Message). You can turn off this option to disable this. The headers can still be accessed from the org.apache.camel.component.sparkrest.SparkMessage message with the method getRequest() which returns the Spark HTTP request instance.
| transferException | consumer | false | boolean | If enabled, and an Exchange failed processing on the consumer side, then the caused Exception is sent back serialized in the response as an application/x-java-serialized-object content type. This is turned off by default. If you enable this then be aware that Java will deserialize the incoming data from the request to Java, and that can be a potential security risk.
| urlDecodeHeaders | consumer | false | boolean | If this option is enabled then during binding from Spark to the Camel Message the header values will be URL decoded (eg %20 will become a space character).
| exceptionHandler | consumer (advanced) | | ExceptionHandler | To let the consumer use a custom ExceptionHandler. Notice if the option bridgeErrorHandler is enabled then this option is not in use. By default the consumer will deal with exceptions; these will be logged at WARN/ERROR level and ignored.
| exchangePattern | consumer (advanced) | | ExchangePattern | Sets the exchange pattern when the consumer creates an exchange.
| matchOnUriPrefix | advanced | false | boolean | Whether or not the consumer should try to find a target consumer by matching the URI prefix if no exact match is found.
| sparkBinding | advanced | | SparkBinding | To use a custom SparkBinding to map to/from Camel message.
| synchronous | advanced | false | boolean | Sets whether synchronous processing should be strictly used or Camel is allowed to use asynchronous processing (if supported).
|=======================================================================
{% endraw %}
// endpoint options: END
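Endpoint options are appended to the URI as query parameters. For instance, a sketch of a consumer that skips mapping Spark headers to the Camel Message (the mapHeaders option above):
[source,java]
------------------------------------------------
from("spark-rest:get:hello?mapHeaders=false")
    .transform().constant("Bye World");
------------------------------------------------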
[[Spark-rest-PathusingSparksyntax]]
Path using Spark syntax
^^^^^^^^^^^^^^^^^^^^^^^
The path option is defined using Spark REST syntax, where you define
the REST context path with support for parameters and splat. See more
details in the http://sparkjava.com/readme.html#title1[Spark Java Route]
documentation.
The following is a Camel route using a fixed path:
[source,java]
---------------------------------------
from("spark-rest:get:hello")
.transform().constant("Bye World");
---------------------------------------
And the following route uses a parameter which is mapped to a Camel
header with the key "me".
[source,java]
--------------------------------------------
from("spark-rest:get:hello/:me")
.transform().simple("Bye ${header.me}");
--------------------------------------------
[[Spark-rest-MappingtoCamelMessage]]
Mapping to Camel Message
^^^^^^^^^^^^^^^^^^^^^^^^
The Spark Request object is mapped to a Camel Message as an
`org.apache.camel.component.sparkrest.SparkMessage` which has access
to the raw Spark request using the getRequest method. By default the
Spark body is mapped to the Camel message body, and any HTTP headers / Spark
parameters are mapped to Camel Message headers. There is special support
for the Spark splat syntax, which is mapped to the Camel message header
with the key splat.
For example the given route below uses Spark splat (the asterisk
sign) in the context path, which we can access as a header from the
Simple language to construct a response message.
[source,java]
------------------------------------------------------------------------------
from("spark-rest:get:/hello/*/to/*")
.transform().simple("Bye big ${header.splat[1]} from ${header.splat[0]}");
------------------------------------------------------------------------------
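As noted above, the raw Spark request can also be read through the SparkMessage binding via getRequest(). The following is a minimal sketch (the userAgent header name is illustrative, and the usual import of org.apache.camel.component.sparkrest.SparkMessage is assumed):
[source,java]
------------------------------------------------------------------------------
from("spark-rest:get:agent")
    .process(exchange -> {
        // SparkMessage exposes the underlying spark.Request via getRequest()
        SparkMessage spark = exchange.getIn(SparkMessage.class);
        exchange.getIn().setHeader("userAgent", spark.getRequest().userAgent());
    })
    .transform().simple("Hello ${header.userAgent}");
------------------------------------------------------------------------------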
[[Spark-rest-RestDSL]]
Rest DSL
^^^^^^^^
Apache Camel provides a new Rest DSL that allows you to define REST
services in a nice REST style. For example we can define a REST hello
service as shown below:
[source,java]
----------------------------------------------------------------
return new RouteBuilder() {
@Override
public void configure() throws Exception {
rest("/hello/{me}").get()
.route().transform().simple("Bye ${header.me}");
}
};
----------------------------------------------------------------
[source,xml]
--------------------------------------------------------------
<rest uri="/hello/{me}">
  <get>
    <route>
      <transform><simple>Bye ${header.me}</simple></transform>
    </route>
  </get>
</rest>
--------------------------------------------------------------
See more details at the link:rest-dsl.html[Rest DSL].
[[Spark-rest-Moreexamples]]
More examples
^^^^^^^^^^^^^
There is a *camel-example-spark-rest-tomcat* example in the Apache Camel
distribution, which demonstrates how to use camel-spark-rest in a web
application that can be deployed on Apache Tomcat or similar web
containers.
[[Spark-rest-SeeAlso]]
See Also
^^^^^^^^
* link:configuring-camel.html[Configuring Camel]
* link:component.html[Component]
* link:endpoint.html[Endpoint]
* link:getting-started.html[Getting Started]
* link:rest.html[Rest]