Wednesday, March 4, 2015

The new JBoss Demo Central Github Organization and Site

I am pleased to announce, along with the other JBoss Technology Evangelists - Eric Schabell, Thomas Qvarnstrom and Christina Lin - that our Central Organization for JBoss Demo Repositories is available.  The team has worked hard to pull together existing content and to start new content as well.

There are two ways to access jbossdemocentral:
1) The website, with an easy-to-navigate front end to access the source code, videos, articles, etc. for each demo - http://jbossdemocentral.com
2) The github organization, with all the source code repositories for the demos.

Give the demos a try and follow us on twitter and our blogs!

Tuesday, March 3, 2015

Running a simple demo with the Salesforce Connector with Fuse


JBoss Fuse and Apache Camel provide an easy-to-use Salesforce connector to interact with Salesforce.  The component supports producer and consumer endpoints that communicate with Salesforce using Java Data Transfer Objects (DTOs).  There is a companion maven plugin, the Camel Salesforce Plugin, that generates these DTOs.  The component supports three Salesforce APIs: 1) REST API 2) REST Bulk API 3) REST Streaming API.

Salesforce is best known for its customer relationship management (CRM) product, which is composed of Sales Cloud, Service Cloud, Marketing Cloud and Chatter.  They provide a developer platform which allows easy testing of Fuse and Salesforce integration.

Salesforce enables access to its APIs through connected apps secured with OAuth.

Salesforce Setup

First, a developer account is required for our testing, which you can get at the Salesforce developer site.  The Name, Email, Username and Password are required to register.  If you already have a Developer Edition organization, verify that you have the “API Enabled” permission. This permission is enabled by default, but may have been changed by an administrator. After the account is created, the connected app needs to be set up.

The main parameters used with the camel-salesforce component are:
  • clientId
  • clientSecret
  • userName
  • password
  • loginUrl
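As a sketch of how these parameters might be wired up (the bean ids and placeholder values below are illustrative, not real credentials), the component can be configured in a blueprint.xml bean definition:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <!-- Hypothetical configuration of the camel-salesforce component;
       replace the placeholder values with those from your connected app -->
  <bean id="salesforce" class="org.apache.camel.component.salesforce.SalesforceComponent">
    <property name="loginConfig">
      <bean class="org.apache.camel.component.salesforce.SalesforceLoginConfig">
        <property name="clientId" value="YOUR_CLIENT_ID"/>
        <property name="clientSecret" value="YOUR_CLIENT_SECRET"/>
        <property name="userName" value="YOUR_USERNAME"/>
        <property name="password" value="YOUR_PASSWORD"/>
        <property name="loginUrl" value="https://login.salesforce.com"/>
      </bean>
    </property>
  </bean>
</blueprint>
```

Registering the bean with the id salesforce makes it available to routes as the salesforce: endpoint scheme.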

A connected app (previously known as a 'remote access' app) is an application that integrates with Salesforce using APIs such as the REST and SOAP APIs. In addition to standard OAuth capabilities, connected apps allow administrators to set various security policies and have explicit control over who may use the applications.

In Salesforce, from Setup, click Create | Apps, and under Connected Apps click New to create a new connected app if you have not already done so.  The Enable OAuth Settings checkbox should be selected.  The full access OAuth Scope should be selected as well.  The Callback URL you supply here is the same as your Web application's callback URL. Usually it is a servlet if you work with Java. It must be secure: http:// does not work, only https://. For development environments, the callback URL is similar to https://localhost:8443/RestTest/oauth/_callback. When you click Save, the Consumer Key is created and displayed, and a Consumer Secret is created (click the link to reveal it).  A quick note that the OAuth 2.0 specification uses “client” instead of “consumer.” Salesforce supports OAuth 2.0.

New Connected App
After the new connected app has been saved, the OAuth policies need to be set.  The Permitted Users setting should be set to All users may self-authorize.  The IP Relaxation setting should be set to Relax IP restrictions.

Connected Apps Management
Fuse Setup

In our simple demo we are just going to pull some data as a consumer by running a simple query on the Opportunity Object.  In future blogs we will do more complicated operations as producer and consumer.  

Camel Route with salesforce component
First, create a new project with a Blueprint Archetype.  JBoss Developer Studio 7.1.1 and Integration Stack 7.0.3 were used.

Project archetype screen creating a new Fuse Project
We will make two changes to the pom.xml.  The first is adding the camel-salesforce dependency, and the second is adding the camel-salesforce-maven-plugin.  The reason we add the salesforce maven plugin is that it makes it easy to build the DTOs.  In the project in the github repository you will find sample DTOs as well as a sample pom.xml.

So we will build the DTOs before updating the route.  From the project directory in the workspace, run the following command:

mvn camel-salesforce:generate \
  -DcamelSalesforce.clientId=<clientId> \
  -DcamelSalesforce.clientSecret=<clientSecret> \
  -DcamelSalesforce.userName=<userName> \
  -DcamelSalesforce.password=<password>


The four parameters should be updated with the values that were created during the Salesforce setup.  The source is created in a default directory,  ${}/generated-sources/camel-salesforce.  Move the generated classes over to the java source directory so you can use them.  Keep in mind that if fields are changed in Salesforce, then the DTOs will need to be regenerated.

We built a route that contains the properties for connecting and a quick query to retrieve data.  A simple timer is used to start the route, which then consumes data from Salesforce with a query.

salesforce:query?sObjectQuery=SELECT id,name from Opportunity&sObjectClass=org.apache.camel.salesforce.dto.Opportunity.class
For more detail on the route you can view the code in the repository.  
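A minimal blueprint route around this endpoint might look like the following sketch (assuming a configured salesforce component; the timer name and log category are illustrative):

```xml
<route>
  <!-- fire once at startup, then query Salesforce for opportunities -->
  <from uri="timer:salesforceTimer?repeatCount=1"/>
  <to uri="salesforce:query?sObjectQuery=SELECT id,name from Opportunity&amp;sObjectClass=org.apache.camel.salesforce.dto.Opportunity.class"/>
  <to uri="log:opportunities"/>
</route>
```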

Running the project

The steps are in the readme.  Clone the repository and find the camel-blueprint project in the projects folder.  I put the properties in the blueprint.xml as an example to be included with the route.  I also included a unit test that runs the query, which uses a property file.

Step 1: Import the project into JBDS

Step 2: Modify the blueprint.xml and the property file to contain the correct authentication parameters.

Step 3: Right click on the unit test in the src/test/java folder and select Run As Unit Test.

Step 4: Right click on the blueprint.xml in the src/main/resources folder and select Run As Local Camel Context (without Test).

In both steps you should see the JSON data returned for the opportunities.

Monday, March 2, 2015

One Way and Two Way SSL and TLS


Top to Bottom - Internet Explorer, Firefox, Safari, Chrome, and Opera 

Before going into endpoint security with Camel on EAP and Fuse, I wanted to provide a quick primer on Secure Sockets Layer (SSL).  We will have a quick overview and then discuss 1-way and 2-way SSL.  SSL should be the first step in protecting sensitive data across the network pipe; it mitigates man-in-the-middle attacks and eavesdropping.  SSL is the standard security technology for establishing an encrypted link between a web server and a browser.  By providing encryption and trust, it makes sure the data passed between browser and server, or server and server, remains private and unmodified.

Encryption uses a private key/public key pair, which ensures that data encrypted with one key can only be decrypted with the other key of the pair.  This is referred to as the Public-Key Infrastructure (PKI) scheme.  The public key is shared while the private key is kept locally.  This is described further in the 1-way and 2-way SSL sections below.  This also extends to server-to-server communication, in addition to browser-to-server communication.
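As a small illustration of the key-pair principle (a generic Java sketch, not from the original post; the class name is hypothetical), data encrypted with the public key can only be recovered with the matching private key:

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import javax.crypto.Cipher;

public class PkiDemo {
    // Encrypt a message with a freshly generated public key, then decrypt it
    // with the matching private key and return the result.
    public static String roundTrip(String message) throws Exception {
        KeyPairGenerator generator = KeyPairGenerator.getInstance("RSA");
        generator.initialize(2048);
        KeyPair pair = generator.generateKeyPair();

        Cipher cipher = Cipher.getInstance("RSA");
        cipher.init(Cipher.ENCRYPT_MODE, pair.getPublic());   // anyone holding the public key can encrypt
        byte[] encrypted = cipher.doFinal(message.getBytes(StandardCharsets.UTF_8));

        cipher.init(Cipher.DECRYPT_MODE, pair.getPrivate());  // only the private key can decrypt
        return new String(cipher.doFinal(encrypted), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("sensitive data")); // prints "sensitive data"
    }
}
```

Note that in a real TLS handshake the asymmetric keys protect a small secret (such as a pre-master secret) rather than the bulk data, which is then encrypted symmetrically.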

Trust is achieved through the use of certificate trust.  Certificate trust can be thought of as a chain that starts with the Certificate Authority (or CA).  A CA is a company or entity that issues SSL certificates.  Web browsers and operating systems come loaded with a list of recognized issuers, and that list is kept up to date by automatic updates.  Certificates can also be self-signed for testing.

Benefits of SSL through the CIA Triad

I blogged about the CIA Triad Model previously.  The SANS Institute has an excellent beginner's guide to SSL and TLS, which also describes the value of SSL/TLS in relation to the CIA Triad:
The C-I-A (Confidentiality, Integrity, Availability) Model for information security is addressed in several ways by the use of a secure communications protocol. Confidentiality of the information being passed is the main purpose of the SSL and TLS protocols. Integrity is addressed through the use of message authentication in each message from the first handshake. Additionally, non-repudiation is accounted for through certificate passing in addition to the integrity check from the message authentication. Though more responsibility for the Availability portion of the model (in this example) is placed on the server, Availability is slightly addressed since secure communications prevent malicious users from having direct access to the system. 
Difference between SSL and TLS

Transport Layer Security (TLS) and its predecessor, Secure Sockets Layer (SSL), are both cryptographic protocols designed to provide communications security over a computer network. The terms SSL and TLS are often used interchangeably or in conjunction with each other (TLS/SSL), but one is in fact the predecessor of the other - SSL 3.0 served as the basis for TLS 1.0 which, as a result, is sometimes referred to as SSL 3.1.  In its TLS implementation guidelines, the US Government indicates that SSL v3 should not be used for sensitive government communications or for HIPAA-compliant communications.  Charts of SSL/TLS support in browsers show which browser versions are affected by the known vulnerabilities (BEAST, POODLE, CRIME, RC4).


Of course, with no SSL, data across the network is not encrypted.  Running without SSL is usually done only in a development or test environment.

2-way SSL (Mutual or Client Authentication)

In two-way SSL authentication, the SSL client application verifies the identity of the SSL server application, and then the SSL server application verifies the identity of the SSL-client application.

Two-way SSL authentication is also referred to as client or mutual authentication because the application acting as an SSL client presents its certificate to the SSL server after the SSL server authenticates itself to the SSL client.

Establishing the encrypted channel using certificate-based 2-Way SSL involves:
  1. A client requests access to a protected resource.
  2. The server presents its certificate to the client.
  3. The client verifies the server’s certificate.
  4. If successful, the client sends its certificate to the server.
  5. The server verifies the client’s credentials.
  6. If successful, the server grants access to the protected resource requested by the client.
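The server side of this handshake can be sketched in plain JSSE.  This is an illustrative sketch, not from the original post: the server presents the identity in its keystore (step 2) and, because client authentication is required, demands and verifies the client's certificate (steps 4 and 5).  The class and method names, and the JKS keystore layout, are assumptions.

```java
import javax.net.ssl.*;
import java.io.FileInputStream;
import java.security.KeyStore;

public class MutualTls {
    // Load a JKS keystore from disk (path and password are supplied by the caller)
    public static KeyStore loadKeyStore(String path, char[] password) throws Exception {
        KeyStore store = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream(path)) {
            store.load(in, password);
        }
        return store;
    }

    // Build a server socket that demands a client certificate (2-way SSL).
    // Passing null for either array falls back to the JVM defaults.
    public static SSLServerSocket createMutualTlsServerSocket(KeyManager[] keyManagers,
                                                              TrustManager[] trustManagers,
                                                              int port) throws Exception {
        SSLContext ctx = SSLContext.getInstance("TLSv1.2");
        ctx.init(keyManagers, trustManagers, null);
        SSLServerSocket socket =
                (SSLServerSocket) ctx.getServerSocketFactory().createServerSocket(port);
        socket.setNeedClientAuth(true); // this one call turns 1-way SSL into 2-way SSL
        return socket;
    }
}
```

In a real deployment, the KeyManager and TrustManager arrays would come from a KeyManagerFactory initialized with the server's keystore and a TrustManagerFactory initialized with a truststore holding the trusted client (or CA) certificates.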
1-Way SSL

In such mode, the SSL-client application is not verified by the SSL-server application. Only the server is verified.

Using a Customer Context with the Camel Components and Data Virtualization


Cojan van Ballegooijen, Red Hat Senior Solution Architect, Bill Kemp, Red Hat Senior Solution Architect, and I have created an example around a Customer Context use case to show how to use the Camel components in Fuse to access a Data Virtualization virtual database (VDB).  The data service provides the customer context, which is aggregated data from an XML file and a CSV file.  The data on each customer provides the name, the credit score, the number of calls the customer has placed to customer support, and the sentiment (Hot, Cold, Warm) toward the company from social media.  We will review the components and show how to run the demo.  The demo repository is located in jbossdemocentral on github.  In our project directory we have the individual use cases, which are built and deployed when running the scripts.  The Teiid jdbc jar is loaded into the profile with the wrap protocol during the run script.

Use Case 1 - JDBC Component

In the first use case we set up the bean for the sql query that we want to execute and the bean for the datasource properties.  A timer component runs a query every 60 seconds; the results from the query are then split into individual records and sent to the log.  We are using the Blueprint DSL in the blueprint.xml.

blueprint.xml design view
blueprint.xml source view with the query property and datasource properties
Note the URL that accesses the CustomerContext virtual database.  Also note that the query is set in the body and the datasource name is part of the jdbc URI.
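A sketch of what such a route can look like in blueprint.xml (the datasource properties, table name, timer name, and log category are illustrative placeholders for the demo's actual values):

```xml
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <!-- Hypothetical Teiid datasource pointing at the CustomerContext VDB -->
  <bean id="customerDataSource" class="org.teiid.jdbc.TeiidDataSource">
    <property name="databaseName" value="CustomerContext"/>
    <property name="serverName" value="localhost"/>
    <property name="portNumber" value="31000"/>
    <property name="user" value="user"/>
    <property name="password" value="password"/>
  </bean>

  <camelContext xmlns="http://camel.apache.org/schema/blueprint">
    <route>
      <from uri="timer:jdbcTimer?period=60000"/>
      <!-- the query travels in the message body; the datasource name is in the URI -->
      <setBody><constant>SELECT * FROM Customer</constant></setBody>
      <to uri="jdbc:customerDataSource"/>
      <split>
        <simple>${body}</simple>
        <to uri="log:usecase1"/>
      </split>
    </route>
  </camelContext>
</blueprint>
```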

JDBC Component excerpt from the Camel Component Page:
The jdbc component enables you to access databases through JDBC, where SQL queries (SELECT) and operations (INSERT, UPDATE, etc.) are sent in the message body. This component uses the standard JDBC API, unlike the Camel SQL component, which uses spring-jdbc.

Maven users will need to add the camel-jdbc dependency to their pom.xml for this component.  This component can only be used to define producer endpoints, which means that you cannot use the JDBC component in a from() statement.  The URI format for the JDBC component is:

jdbc:dataSourceName[?options]

You can append query options to the URI in the following format, ?option=value&option=value&...

Use Case 2 - SQL Component

The second use case is similar to the first in that a timer component runs a query every 60 seconds; the results from the query are then split into individual records and sent to the log.  Again we are using the Blueprint DSL in the blueprint.xml.  The difference with the SQL component is that the query is part of the URI of the component.  Also, we are loading the datasource into the SqlComponent class.

blueprint.xml design view
blueprint.xml source view of SqlComponent with datasource reference
SQL Component excerpt from the Camel Component Page:
The sql: component allows you to work with databases using JDBC queries. The difference between this component and the JDBC component is that in the case of SQL the query is a property of the endpoint, and it uses the message payload as parameters passed to the query.  From Camel 2.11 onwards this component can create both consumer (e.g. from()) and producer endpoints (e.g. to()).  In previous versions, it could only act as a producer.

This component uses spring-jdbc behind the scenes for the actual SQL handling.  Maven users will need to add the camel-sql dependency to their pom.xml for this component.  The SQL component uses the following endpoint URI notation:

sql:select * from table where id=# order by name[?options]

You can append query options to the URI in the following format, ?option=value&option=value&...
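A sketch of the SqlComponent wiring in blueprint.xml (the datasource bean id, table name, timer name, and log category are illustrative):

```xml
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <!-- SqlComponent with the datasource injected; "customerDataSource" is
       assumed to be defined elsewhere in the blueprint -->
  <bean id="sql" class="org.apache.camel.component.sql.SqlComponent">
    <property name="dataSource" ref="customerDataSource"/>
  </bean>

  <camelContext xmlns="http://camel.apache.org/schema/blueprint">
    <route>
      <from uri="timer:sqlTimer?period=60000"/>
      <!-- with the SQL component the query lives in the endpoint URI -->
      <to uri="sql:SELECT * FROM Customer"/>
      <split>
        <simple>${body}</simple>
        <to uri="log:usecase2"/>
      </split>
    </route>
  </camelContext>
</blueprint>
```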

Use Case 3 - Olingo Component

The Olingo component will be a part of Fuse 6.2, so we decided to wait until that release to document and add this component to the demo.  You can try an example with Camel 2.14, which we have in the project.  We will cover the Olingo component in more detail in a follow-up article.

Olingo Component excerpt from Camel Component Page:
The Olingo2 component utilizes Apache Olingo version 2.0 APIs to interact with OData 2.0 and 3.0 compliant services. A number of popular commercial and enterprise vendors and products support the OData protocol. A sample list of supporting products can be found on the OData website.

Maven users will need to add the camel-olingo2 dependency to their pom.xml for this component.  The URI format for the Olingo component is:

olingo2://endpoint/<resource-path>[?options]
Use Case 4 - JETTY Component for a REST Service

For Use Case 4 we use a REST service to expose the OData Data Virtualization Service.

blueprint.xml design view
blueprint.xml source view of the route
Note the CustomerContextVDB OData service is being used with the DV Username and Password as parameters.  This returns all the data when accessing the Jetty URL, http://localhost:9000/usecase4.
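A sketch of the route (the OData service URL, port, and credentials below are placeholders for the demo's actual values):

```xml
<camelContext xmlns="http://camel.apache.org/schema/blueprint">
  <route>
    <!-- expose a local REST endpoint with Jetty... -->
    <from uri="jetty:http://0.0.0.0:9000/usecase4"/>
    <!-- ...and proxy the request to the CustomerContextVDB OData service -->
    <to uri="http://localhost:8080/odata/CustomerContextVDB/Customer?bridgeEndpoint=true&amp;authUsername=user&amp;authPassword=password"/>
  </route>
</camelContext>
```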

Jetty Component excerpt from the Camel Component Page:
The jetty component provides HTTP-based endpoints for consuming and producing HTTP requests. That is, the Jetty component behaves as a simple Web server. Jetty can also be used as an HTTP client, which means you can also use it with Camel as a producer.

Maven users will need to add the camel-jetty dependency to their pom.xml for this component.  The URI format is:

jetty:http://hostname[:port][/resourceUri][?options]

You can append query options to the URI in the following format, ?option=value&option=value&...

Running the Project

Step 1: Download and unzip the repository, or clone the repository. If running on Windows, it is recommended that the project be extracted to a location near the root drive path due to limitations on the length of file/path names.

Step 2: Add the DV and Fuse products to the software directory.  You can download them from the Customer Support Portal (CSP).

Step 3: Run '' or 'init.bat' to setup the environment locally. 'init.bat' must be run with Administrative privileges.

Step 4: Run '' or 'run.bat' to start the servers, create the container and deploy the bundles.

Step 5: Sign onto the Fuse Management console, http://localhost:8181, with the admin user and check the console log to see the output from the routes for the use cases. You can also view the Camel Diagrams.  Browse to http://localhost:9000/usecase4 to see the data for Use Case 4 through Jetty.

The demo can be run in a docker container in addition to a local install. Full instructions can be found in the support/docker/ directory of the project.

Friday, February 20, 2015

SOA and API Summit February 26

This week I prepared our material for the SOA and API Summit, which will be presented during a live session February 26. You can download the slides and whitepapers now, then attend the session on February 26th by registering at idevnews.  Click on Reserve a Seat, which will enable you to download the whitepapers and slide deck.  My presentation is titled, "Success in the API Economy with Red Hat JBoss".
Title: SOA & APIs Summit
Speakers: Oracle, Red Hat, Axway, Talend
Date/Time: February 26, 2015, 10am PT / 1pm ET - Online Conference

SOA & APIs Summit is a multi-vendor online event where industry experts will show how SOA and APIs are transforming the way F1000s think about IT and business models.

Topics to include:
  • SOA & APIs Power the ‘Extended Enterprise’ 
    How today’s SOA and API architectures help IT more easily adopt end-to-end solutions for Big Data, Cloud, Mobile and SaaS  
  • Integration for Real-Time Business 
    New architectures (API, JSON, REST, SOA) are delivering real-time integration for decisions, analytics and more.  
  • Learn from F1000 Success Stories 
    How savvy API / SOA investments reward F1000s with happier customers, thriving partner networks,  growing revenues, smarter and quicker apps.   
  • Cut Coding with SOA / API Strategies 
    Every day, 1000s of app ideas come to life quicker thanks to smart integration platforms that reduce coding – even eliminate it
  • The API-Driven Business 
    API platforms power and secure new ways to share, communicate and innovate – with internal teams and outside partners.

Maximize information exchange in your enterprise with AMQP

This week I presented a webinar on maximizing information exchange in your enterprise with AMQP.  We went through an AMQP overview, a comparison of technologies with AMQP, Fuse and A-MQ, and a simple demo to show a producer, consumer and broker.  The main features of AMQP include interoperability, queueing, routing, reliability and security.

The demo was a simple example of the ease of use of AMQP with JBoss A-MQ.  I have included the steps below so that you can give it a try.

Step 1: Download and unzip JBoss A-MQ 6.1 from
Step 2: Clone the repository from
Step 3: Add the transport to the transportconnectors section in the activemq.xml file in /etc of the A-MQ install directory
<transportConnector name="amqp" uri="amqp://"/>
Step 4: Start the Fuse server by running amq in /bin of the A-MQ install directory
Step 5: Run mvn -P consumer from the cloned repository
Step 6: Run mvn -P producer from the cloned repository
Note: you should see the messages received by the consumer which are sent by the producer
Step 7: Browse the Management Console at http://localhost:8181 to take a look at the statistics for the producer and consumer

Sunday, February 15, 2015

JBoss Data Virtualization Sizing Architecture Tool

The JBoss Data Virtualization Sizing Architecture Tool is a simple web application that has around 10 - 15 questions.  After all questions are answered and submitted, corresponding recommendations for Data Virtualization will be presented.  The recommendations include:
  • How many servers are needed, with how many cores?
  • How much memory/JVM size for each node?
  • Suggestions of configuration changes for any performance improvement.
Follow the link, sign on with your Red Hat account, and click Start to enter your responses to the questions and get a recommendation.