Wednesday, November 19, 2008

Deploy Ruby on Rails to JBoss (Tomcat)

Ruby on Rails can run on the JVM, packaged as a war file. The Hello World is really very simple, following the steps outlined by Arun Gupta.

1. Install jruby.

2. Install Rails and Warbler

kstam-mbpro-2:ruby kstam$ jruby -S gem install rails warbler --no-ri --no-rdoc
JRuby limited openssl loaded. gem install jruby-openssl for full support.
http://wiki.jruby.org/wiki/JRuby_Builtin_OpenSSL
Successfully installed activesupport-2.1.2
Successfully installed activerecord-2.1.2
Successfully installed actionpack-2.1.2
Successfully installed actionmailer-2.1.2
Successfully installed activeresource-2.1.2
Successfully installed rails-2.1.2
Successfully installed warbler-0.9.11
7 gems installed

3. Create a Hello Rails app: jruby -S rails hello -d mysql

4. Open hello/config/environment.rb in your favorite editor and uncomment the line so it looks like:
config.frameworks -= [ :active_record, :active_resource, :action_mailer ]

5. Use Warbler to create your war file:
jruby -S warble

6. Copy the hello.war to your server/default/deploy directory.

7. Hit http://localhost:8080/hello in your browser.



Conclusion: a Rails application is basically packaged up in the WEB-INF/ directory of the war, and a Ruby filter defined in the web.xml takes care of running it.

I also found that in JBoss 5 there would be no need for deploying it in a war, as the deployer framework of JBoss Microcontainer enables in-place native deployment of Rails applications in a way familiar to traditional Rubyists.

Voices that matter, Professional Ruby Conference

As stated before, I'm a Java guy; however, this week I'm at a Ruby conference in Boston. I'm working on a two-year-old Java-based application, and we started doing Ruby on Rails right when I got on board. The RoR work has been pretty successful, and I have to admit it is much, much easier to work with than Struts and JSPs. There is still a chance Seam may save Java for the web tier, but for the current project Seam is not an option. So we're working on a hybrid architecture where we can run RoR and Java side by side, and that's why I'm at a Ruby conference. When you read this review, you will hopefully understand my point of view.

Schedule: November 17-20, 2008.

For me the presentation by Thomas Enebo, Ruby and the Java Virtual Machine, was an eye opener. Being new to Ruby in general, I did not have much understanding of how JRuby worked on the JVM until now. It turns out that although JRuby trails Ruby a bit in terms of releases, it is identical to Ruby, just with a different set of bugs ;). The best part is that JRuby is faster, and it can call into Java natively. It runs your code as byte code, yet it stays dynamic! On the flipside this means that it won't ever be as fast as running Java on the JVM. Now the cool part is that you can deploy a RoR application as a war file to your appserver, and talk natively to any code written in Java. Currently they are working to finalize the Ruby 1.9 features.

This presentation really came to life for me when I went to the 'Lightning Round' later that night, where Mark Menard (from Vita Rara) spoke about his experiences with JRuby. He runs a Spring-based application and first tried Groovy on Rails, but in the end he found that the Groovy meta-model is too close to Java, so he started experimenting with RoR deployed to the JVM, and discovered that it really works as advertised. He is rewriting his application from the ground up and just replaced Hibernate with ActiveRecord, which proves that you can port pieces of your app while in flight. For me this is very exciting to hear. Not necessarily that Hibernate can be swapped out for ActiveRecord, but that it is not only possible but actually practical to build hybrid apps by running Ruby on the JVM!

Another talk I really liked was given by Ezra Zygmuntowicz: Four Years of Ruby Deployment. He went over the history of Ruby webservers (Mongrel, Thin, Ebb, NginX, Passenger), praised Rack as the web equalizer since it abstracts over these different implementations, and picked Thin + NginX as his preferred setup. He then demoed an application called Nanite, a web-based app to configure a Linux server, with the possibility of cloning a certain configuration to multiple other nodes.


Oh I almost forgot, everyone seems to run their own little consulting firm writing Rails apps, all the speakers wrote a book and everyone uses Twitter.

It's actually pretty cool as I learn more about it!

Monday, October 13, 2008

Anything goes alternative Olympic Games


I've been toying with the idea of an alternative Olympics where anything goes as long as it is human powered. The rules are simple: (if you're stupid enough) you can use any kind of dope, but more interestingly you can use any kind of technology, just as long as it is human powered. I want to see Phelps with one of these things! Maybe he can wear a fin on his back to go with it.

Saturday, October 11, 2008

Hibernate Interceptors, Events and JPA Entity Listeners

Hibernate is an implementation of the JPA specification. The JPA specification is part of the EJB3 specification (JSR-220). See Sun's FAQ if you want to know more about that.

Hibernate Interceptors
In a recent project I wanted to intercept Create, Update and Delete events. From the Hibernate Reference Documentation it was pretty straightforward to find out how to create an Interceptor class. I extended my AuditInterceptor class from the EmptyInterceptor class and implemented the onSave, onFlushDirty and onDelete methods, but then I was a bit puzzled how to activate the interceptor. Chapter 2 talks about specific Hibernate configuration properties, but how do you use those when you're using a JPA persistence.xml? It turns out the answer is very simple. Pretty much any vendor-specific property can be set in the persistence.xml using a properties block. In this case a property named hibernate.ejb.interceptor needs to be set to the fully qualified name of my AuditInterceptor class.
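For illustration, the persistence.xml fragment could look something like the sketch below (the unit name and interceptor package are made up; point the property at your own class):

```xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="1.0">
  <persistence-unit name="myUnit">
    <properties>
      <!-- hypothetical class name; use the fully qualified name of your interceptor -->
      <property name="hibernate.ejb.interceptor"
                value="com.example.audit.AuditInterceptor"/>
    </properties>
  </persistence-unit>
</persistence>
```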

Some advantages are that
  • you don't need to change any of your existing code; just add your interceptor and update the persistence.xml
  • the onFlushDirty method gives you nice, detailed information on what changed
but on the flipside
  • the interceptor fires for every entity bean, and chances are you're only interested in a subset. I ended up doing the filtering in my AuditInterceptor.
  • if your primary key is set by the database, the id is still undefined when onSave fires.
Hibernate Events
The capabilities of the Hibernate Events API go beyond those of the Interceptors. It gives you very fine-grained control over where to hook into the event stream. Check out the org.hibernate.event package for all the different interfaces you can implement.

Some advantages
  • fine-grained control over where to hook into the persistence process (lots of Pre- and Post-event interfaces)
  • the event objects contain a wealth of information, including the previous and current state of the entity.
but on the flipside
  • hooking up your EventListener is done either programmatically or via the Hibernate session-factory configuration block. I guess I could use a hibernate.cfg.xml file and reference that in my persistence.xml, but that started to feel convoluted to me.
JPA EntityListeners
Finally, there are the non-vendor-specific JPA EntityListeners. To use an EntityListener you simply add an @EntityListeners annotation to your entity, referencing your listener class. Then in your listener class you can use other annotations (like @PrePersist, @PreRemove, @PostPersist, @PostRemove, @PreUpdate, @PostUpdate and @PostLoad) to decorate your listener methods.

Some advantages are
  • fine-grained control as to which entities you want to listen to (annotation on the entity, referencing your listener class)
  • pretty good control on where you want to hook in your listener (annotation on a method on the listeners class)
but on the flipside
  • the API only gives you the current state of the entity, and does not provide information as to what changed in case of an update.
Conclusion
In my case I ended up with a hybrid solution: a Hibernate Interceptor for the Update and Delete events, and EntityListeners for the Create event, listening to the @PostPersist event so that the id field on my entities is defined.

Saturday, September 27, 2008

Transformation using Smooks, part 1: XML2XML

Part 1 is sort of the Hello World of Smooks, showing how it is better than using plain XSLT.

Smooks is a fragment-based data transformation framework. It can handle many different data formats and is the default transformation engine of three open source ESBs (JBossESB, Synapse and Mule ESB). In this case I needed an XML2XML transformation. I'm used to using XSLT for that, but I had some date formatting to do, so I figured I would use Smooks. Currently the latest release is v1.0.1, so this is what I used. Note that some of the documentation on the Smooks website already refers to the v1.1.x version (you can tell by the xsd reference, smooks-1.1.xsd), so watch out for that, since it will not work with 1.0.x. However, you can still use the v1.0.x notation in v1.1.x.

So how do you get started?

1. First you will need to download the Smooks libraries, but if you're like me and use JBossESB, you can skip this step. Note that Smooks is very modular, so if you don't need all the features you'll only need a subset of the jars provided by Smooks.

2. Create a transformation unit test, like the one below, to test your transformation. I based this one on the example given at the very end of the Smooks User Guide.

package com.sermo.services.foley.transform;

import java.io.FileNotFoundException;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.StringReader;
import java.io.StringWriter;
import java.io.Writer;

import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

import org.custommonkey.xmlunit.XMLAssert;
import org.custommonkey.xmlunit.XMLUnit;
import org.junit.BeforeClass;
import org.junit.Test;
import org.milyn.Smooks;
import org.milyn.container.ExecutionContext;
import org.milyn.event.report.HtmlReportGenerator;
import org.xml.sax.SAXException;

/**
 * Unit test for a Smooks transformation.
 *
 * The Smooks configuration file is located in src/main/resources/smooks-res.xml.
 * The input file for the transformation is located in src/test/resources/input.xml.
 * The expected output of the transformation is located in src/test/resources/expected.xml.
 * A Smooks execution report will be created in smooks-report.html.
 */
public class SmooksTest {

    public static Smooks smooks = null;

    @BeforeClass
    public static void setupSmooks() throws SAXException, IOException {
        smooks = new Smooks("smooks-res.xml");
    }

    @Test
    public void testTransformation() throws FileNotFoundException, IOException, SAXException {
        StreamSource source = new StreamSource(this.getClass().getResourceAsStream("/input.xml"));
        ExecutionContext executionContext = smooks.createExecutionContext();
        // create the Smooks execution report
        Writer reportWriter = new FileWriter("smooks-report.html");
        executionContext.setEventListener(new HtmlReportGenerator(reportWriter));
        // run the transformation
        StreamResult result = new StreamResult(new StringWriter());
        smooks.filter(source, result, executionContext);
        System.out.println("Smooks output:" + result.getWriter().toString());
        // compare the expected xml (src/test/resources/expected.xml) with the transformation result
        InputStream is = this.getClass().getResourceAsStream("/expected.xml");
        XMLUnit.setIgnoreWhitespace(true);
        XMLAssert.assertXMLEqual(new InputStreamReader(is), new StringReader(result.getWriter().toString()));
    }
}

In this case it transforms the XML in the input.xml file into a format that should correspond to the XML in the expected.xml file. The actual transformation happens in the smooks.filter call, and the result can be accessed through the StreamResult called result. XMLAssert is used to check that the output corresponds to our expectation.

3. Smooks has one configuration file, smooks-res.xml, in which you define the smooks-resource-list; each entry in this list is a resource-config entry. This file contains the definition of the transformation:

<?xml version='1.0' encoding='UTF-8'?>
<smooks-resource-list xmlns="http://www.milyn.org/xsd/smooks-1.0.xsd">

  <resource-config selector="global-parameters">
    <param name="default.serialization.on">false</param>
    <param name="stream.filter.type">SAX</param>
  </resource-config>

  <!-- Date Parser used by all the bean populators -->
  <resource-config selector="decoder:UTCDateTime">
    <resource>org.milyn.javabean.decoders.DateDecoder</resource>
    <param name="format">yyyy-MM-dd HH:mm:ss</param>
  </resource-config>

  <resource-config selector="hibernate-event">
    <resource>org.milyn.javabean.BeanPopulator</resource>
    <param name="beanId">eventType</param>
    <param name="beanClass">java.util.HashMap</param>
    <param name="bindings">
      <binding property="type" selector="hibernate-event/@eventType" />
    </param>
  </resource-config>

  <resource-config selector="category">
    <resource>org.milyn.javabean.BeanPopulator</resource>
    <param name="beanId">category</param>
    <param name="beanClass">java.util.HashMap</param>
    <param name="bindings">
      <binding property="name" selector="category/name" />
      <binding property="type" selector="category/type" />
      <binding property="id" selector="category/id" />
      <binding property="createDate" selector="category/createDate" type="UTCDateTime" />
    </param>
  </resource-config>

  <resource-config selector="category">
    <resource type="ftl">
      <![CDATA[<caffeine-event type="${eventType.type}">
<category>
<cat-name>${category.name}</cat-name>
<cat-type>${category.type}</cat-type>
<id type="integer">${category.id}</id>
<utc-date><#if category.createDate?exists>${category.createDate?string("yyyy-MM-dd'T'HH:mm:ss'Z'")}</#if></utc-date>
</category>
</caffeine-event>]]>
    </resource>
  </resource-config>

</smooks-resource-list>

Note that the first section sets up some global parameters, like which parser Smooks should use under the hood (SAX in this case). Next I defined a date decoder called 'decoder:UTCDateTime' which can parse the date format in the input.xml, so the date can be transformed into another format later. The next two sections set up two Java beans (HashMaps, actually), so we can add name/value pairs. The first bean selects data on the 'hibernate-event' tag, setting the type by selecting the value of the eventType attribute. The second populates another HashMap by looking at the data in the category element. This is what Smooks means by being fragment-based: you can use different types of technologies to parse your incoming message (XML in this case). Finally, the last section builds the output message. Here I chose to use FreeMarker (ftl), which is a very nice templating language.

4. input.xml

<hibernate-event eventType="create">
<category>
<name>Sumatra</name>
<type>Coffee Bean</type>
<id type="integer">5000</id>
<createDate>2009-02-09 21:00:01</createDate>
</category>
</hibernate-event>

This is the incoming XML message.

5. expected.xml

<caffeine-event type="create">
<category>
<cat-name>Sumatra</cat-name>
<cat-type>Coffee Bean</cat-type>
<id type="integer">5000</id>
<utc-date>2009-02-09T21:00:01Z</utc-date>
</category>
</caffeine-event>

This is the XML we want to transform the input.xml into. Since this file is the expected output, we can use XMLUnit to assert that we got what we were expecting.

6. Smooks Execution Report

When you are in the middle of building your smooks-res.xml, the Smooks Execution Report can come in very handy for debugging purposes. The lines

//create smooks report
Writer reportWriter = new FileWriter( "smooks-report.html" );
executionContext.setEventListener( new HtmlReportGenerator( reportWriter ) );

create an HTML based report called 'smooks-report.html', which you can simply open in your favorite browser.


Conclusion

Fragment-based transformation rocks. Have you ever tried to do any date manipulation, or to work on a highly normalized XML, in XSLT? By the way, you can still use XSLT instead of a BeanPopulator; mix and match. I also like having a real templating language to create my outgoing message. By picking the right tool for each task you can expect high transformation performance. There was a nice thread about that on TheServerSide here.

Part 2 will be about Java2XML transformation, and how you'd deploy the transformation as a service to JBossESB.

Friday, August 15, 2008

Javadoc

I've been writing Java for more than a decade now, and the funny thing is that I never actually read how to write Javadoc. Don't get me wrong, I write it alright, but when I came across this document it was kinda fun to discover the true intentions.

Wednesday, August 6, 2008

MySQL Profiling and Performance tools

I'm working with a MySQL database that has been growing and has reached about 2 GB these days. So needless to say we're getting more interested in profiling the database. I started a search for some tools, and this is the list thus far, in no particular order.

1. MyTop
The first tool I came across was mytop, a little tool inspired by 'top'. I also found this quick tutorial on how to interpret the screens. It does depend on some Perl libraries, which you can install using MCPAN:
sudo perl -MCPAN -e shell

then at the cpan prompt:
install DBI
install DBD::mysql
install Term::ReadKey
install Term::ANSIColor
install Time::HiRes

In the directory where you extracted mytop execute:
perl Makefile.PL
make
make test
sudo make install


2. Maatkit
The MySQL toolkit is now called Maatkit. Maatkit can be found on SourceForge and, more recently, on code.google.com. It looks like it can come in very handy when running replicated databases. Peter Zaitsev has nice blog entries on what it can do, or you can just read the docs.

3. MySQL Query Profiling
Recently MySQL added a profiling tool to the server itself, called the MySQL Query Profiler. This tool provides very detailed information on where the time is spent for a query.
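If I read the docs correctly, a profiling session (MySQL 5.0.37 or later) looks roughly like this; the table name is made up:

```sql
SET profiling = 1;                 -- enable profiling for this session
SELECT COUNT(*) FROM some_table;   -- run the query you want to profile (hypothetical table)
SHOW PROFILES;                     -- list recent queries with their ids and durations
SHOW PROFILE FOR QUERY 1;          -- per-stage timing breakdown for query 1
```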

4. MyProfi
MyProfi is a log analyzer and profiler. It extracts the most popular queries, grouping them by their normalized form, and shows statistics for each group. It helps you recognize the most frequently run queries, so you can optimize overall db performance.

Wednesday, July 30, 2008

Where did my class get loaded from?

In Java it is not always easy to figure out from where a certain class was loaded, particularly when the same class is packaged up in more than one archive. So if you want to figure out from which archive a class got loaded, this little piece of code may come in handy.
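The original snippet didn't survive the formatting, but a minimal sketch of the idea, using the class's ProtectionDomain/CodeSource (the usual approach), could look like this:

```java
import java.security.CodeSource;

public class WhereLoaded {

    /** Returns the location a class was loaded from, or null for bootstrap classes. */
    public static String locationOf(Class<?> clazz) {
        CodeSource cs = clazz.getProtectionDomain().getCodeSource();
        return cs == null ? null : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // java.lang.String comes from the bootstrap class loader, so it has no CodeSource
        System.out.println("String loaded from: " + locationOf(String.class));
        // this class itself was loaded from a directory or jar on the classpath
        System.out.println("WhereLoaded loaded from: " + locationOf(WhereLoaded.class));
    }
}
```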

Monday, July 21, 2008

JBoss IDE Timeout waiting for server to come up

"Did not start in 50s". That's a pretty annoying message to receive when you're trying to fire up JBoss from within Eclipse using the JBoss IDE plugin. It turns out that there is a setting called "Server timeout delay", ranging from Unlimited (-1) and Longer (1) down to Shortest (9).

Open the file

/.metadata/.plugins/org.eclipse.core.runtime/.settings/org.eclipse.wst.server.core.prefs

and add

machine-speed=1

Sunday, May 4, 2008

PiecesOfFlare Version 1.1.1 released

I recently upgraded to the Europa version of Eclipse (3.3) and noticed that my PiecesOfFlare (POF) plugin stopped working. So after 2 years, we now have a new release. The POF plugin is a small plugin that lets you hit the save button in Eclipse, and then have the updated (jsp, js, php, etc.) file copied to another destination outside your source tree. When I say outside, think of your webserver or appserver on the same file system, or even a remote machine (using scp).

You can download the plugin from sourceforge, or you can use the update site at http://piecesofflare.sourceforge.net/updates/.

Tuesday, April 8, 2008

Book Review: Business Process Management with JBoss jBPM

Business Process Management with JBoss jBPM
A Practical Guide for Business Analysts, by Matt Cumberlidge

I have been working on Service Orchestration in JBossESB, using jBPM, and a few weeks back Packt Publishing asked me to review the JBoss jBPM book. I have been reading it whenever I had a spare moment. I have to admit the book was not what I initially expected it to be. For some reason I thought it would be an in-depth text book about jBPM and its features. It is not (well, it's got some of that), but what it really is, is what the title already says: "Business Process Management with JBoss jBPM; A Practical Guide for the Business Analyst". So I guess I skipped over the 'for the Business Analyst' part. However, after actually reading the book, I think the author picked the right subject by explaining how to run a BPM project in the first place.

Truth be told I may have some idea about it, but I never read anything that formal on the matter, and the best part is that Matt simply follows a fictitious (or is it?) project about a record company called “Bland Records”. In Matt's words “Bland Records' specialty is in finding talentless, yet attractive youths, assembling them into bands of four or five, partnering the ready-made band with a songwriter and some real musicians, who finish the product with an addictive set of tunes. The end product is released on an unsuspecting public who promptly shoot the band to number one in the charts”.

The book is to the point and an easy read. The first few chapters cover what a Business Process actually is, what the interaction between the Business Analyst and the Developer should be, how to set up interaction channels with the customer, how to put together the project team, and how to find the right project sponsors to optimize the chances of delivering a successful product. In the book he sets up a workshop for the project team where the business process is analyzed and a flow diagram is created, before breaking for lunch. After lunch the team goes through the flow diagram to identify the roles and responsibilities. He goes over terminology like PID (Process Identification Document), SME (Subject Matter Expert) and RACI (Responsible, Accountable, Consulted, Informed), shows how to use the RACI Matrix, and finishes with how to end up with a realistic implementation plan. I really enjoyed these two chapters.

Chapter 3 explains how to install jBPM and gives a quick introduction to the product. The chapter has plenty of screenshots to keep you on track, and you should get a good visual picture of jBPM's capabilities; it finishes by implementing the business process as a proof-of-concept system. Chapter 4 adds a user interface to the prototype, and by the end of the chapter you have a fully functional system for which you can get sign-off from the client. This is a nice break point in the project, where some wrinkles can get ironed out before putting more time into making the thing look nice. Chapter 5 covers how to deploy and set up the proof-of-concept and how to go through some iterations with the team. At this point the integration points with other systems should still be stubbed out. Finally, when the iterations have honed the business process implementation, another sign-off follows, and in Chapter 6 he describes how to convert the proof-of-concept into a production-ready system, setting up a BAM (Business Activity Monitor) to gather process metrics in production. At the end of Chapter 6 the project goes to production, and in Chapter 7 the team comes together for a post mortem, where they try to evaluate the success of the project and come up with an ROI for the system. Assuming all is in order, a final sign-off from the client can now be obtained. Over time the process metrics can be tracked, and future projects may implement process change requests.

The book is a quick read, and I think it would be very wise to hand out a copy to each member of your team when starting a BPM project. The price may be a little steep, but part of it goes to the jBPM project, and I'm sure those costs can be expensed to the project anyway ;). Thumbs up on this one.

Monday, March 17, 2008

Using an EJB3 Interceptor in Seam

For one of the demos for the NEJUG presentation we integrated the JPetStore (Spring-based) and the DVD Store (Seam-based) with JBossESB. The idea is that when orders are placed in either store, they are processed in an ESB-based Order Processing Service, using jBPM for the orchestration. Here I want to show what you need to do to intercept the order when the user hits the 'confirm' button in the Seam-based DVD Store. For this we use an EJB3 interceptor. The beauty of this solution is that no code changes are needed in the DVD Store itself.

1. Modify the ejb-jar.xml
First we need to add some configuration to the ejb-jar.xml, as shown in Figure 1.


Figure 1. Add the interceptor to the ejb-jar.xml


So first we define the interceptor class by referencing com.jboss.dvd.seam.CheckoutInterceptor; next we specify when this class should be called, which is done by adding the second piece of xml, specifying the bean name for which the interceptor should fire. Here we want it to fire when the CheckoutAction is called.
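The figure didn't survive the formatting; the ejb-jar.xml configuration it describes would look roughly like the sketch below, using the standard EJB3 interceptor and interceptor-binding elements:

```xml
<ejb-jar xmlns="http://java.sun.com/xml/ns/javaee" version="3.0">
  <interceptors>
    <interceptor>
      <interceptor-class>com.jboss.dvd.seam.CheckoutInterceptor</interceptor-class>
    </interceptor>
  </interceptors>
  <assembly-descriptor>
    <!-- fire the interceptor only for the CheckoutAction bean -->
    <interceptor-binding>
      <ejb-name>CheckoutAction</ejb-name>
      <interceptor-class>com.jboss.dvd.seam.CheckoutInterceptor</interceptor-class>
    </interceptor-binding>
  </assembly-descriptor>
</ejb-jar>
```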

2. Add the interceptor class

The interceptor class itself looks like

package com.jboss.dvd.seam;

import javax.interceptor.AroundInvoke;
import javax.interceptor.InvocationContext;

public class CheckoutInterceptor {

    @AroundInvoke
    public Object sendOrderToESB(InvocationContext ctx) throws Exception {
        System.out.println("*** Entering CheckoutInterceptor");
        Object target = ctx.getTarget();
        // just making sure
        if (target instanceof CheckoutAction) {
            if (ctx.getMethod().getName().equals("submitOrder")) {
                System.out.println("We will send the following completedOrder object to the ESB");
                Order completedOrder = ((CheckoutAction) target).currentOrder;
                Customer customer = ((CheckoutAction) target).customer;
                completedOrder.setCustomer(customer);
                System.out.println("Completed Order= " + completedOrder);
            }
        }
        try {
            return ctx.proceed();
        } finally {
            System.out.println("*** Exiting CheckoutInterceptor");
        }
    }
}

Figure 2. The CheckoutInterceptor Code

And that is all there is to it. In the interceptor we check which method is called, and if it is the submitOrder method we print out the order. In the demo we added code to serialize the order to XML and drop it onto a gateway queue, on its way to the ESB.

NEJUG on SOA - The SOA-P Store: Bed, Bath and Beyond (II)

I'm happy to say that the presentation went over well. We got some great questions and in general I think people really got it. Tom Cunningham compiled a list of the questions we got:

Is the registry different than UDDI?
Is there portability across ESB implementations (Mule / Service Mix / JBoss)?
Is there portability across different BPEL providers?
When you are talking about components for ESB, are you talking about adapters?
How gunshy should I be using JBoss ESB (are people using it in production)?
What strategies can I use for portability between ESB providers?
Is there any inherent security in the content based routing? Are there plans on adding something in this area (ACEGI/Spring Security in particular)?
What would you typically see in an actions block?
Would an example of an action be something executing a rule?
When do you use a J2EE Servlet Filter approach and when would you use ESB?
How would you roll back a transaction?
How do you integrate with RESTful services?
How tightly coupled is the rules engine with the ESB?
What monitoring tools exist to see what is going on inside the ESB?
Is there a way inside the ESB to log payload in and out of the ESB?
Is the ESB's state recoverable? Will it maintain state?
How do you track long transactions?
What's the connection between Rules and the ESB?
Will hot deploy consume all current requests before redeploying?
Is load balancing simple round robin?
Does JBPM compete with BPEL?
Is there interactive debugging so that you can step through the JPDL flow?
Are the enterprise design patterns (wiretap, splitter, aggregator) available inside of a visual designer in JBDS?
Is the JBPM plugin going to be available for NetBeans?
Who is the jbpm-console intended for?
What's the difference between SOA and ESB?
How do you migrate to ESB?

The most humorous question had to be the last one, where someone wanted to know whether content-based routing could be used as a method of escaping hardware licenses.


Sorry, you had to be there for the answers, but maybe it can be a start for a JBossESB FAQ page.

Monday, March 10, 2008

WhichJar Utility

Where oh where is that class? Setting up your classpath in Java is half the battle, and finding the jar in which the class you are looking for lives can be a tedious task. Measured in software years, it was eons ago that my friend Toby introduced me to a little shell script he wrote called whichJar. I think I have been using it on a daily basis ever since. It may have changed a little over the years, but not too much, so here it goes:

#!/bin/bash
# nasty script to find a string in the contents of a jar file.
# probably a better way to do this, but then again...

if [ "$1foo" = "foo" ] ; then
    echo "usage: `basename $0` string"
    echo "where string is the string to look for in the jar's contents"
    exit 1
fi

for i in `find . -name \*.jar`
do
    className=`(jar tvf $i | grep $1)`
    if [ "$className" != "" ]; then
        echo -e "$i\t$className"
    fi
done

To use it, save the above code to a file called 'whichJar' in your bin directory, and make it executable:
chmod a+x whichJar

Then, for example, to find the class 'javax.jms.Topic' in any of the jars in the current directory or its subdirectories, simply type:
whichJar javax.jms.Topic

Thank you Toby.

NEJUG on SOA - The SOA-P Store: Bed, Bath and Beyond

This Thursday (March 13th) Burr Sutter and I will speak at the NEJUG on SOA "The SOA-P Store: Bed, Bath and Beyond".

This will be a dynamic session focused on demonstrating the customary capabilities and best practices associated with an Enterprise Service Bus for SOA-focused deployments.

We will get people involved and empowered with real boots-on-the-ground knowledge of how to do SOA and not just pontificate on abstract theory and marketing-speak. The live demonstrations will illustrate how typical Struts+Spring+Hibernate web applications can be liberated as services and enter the world of ESB & SOA.

See you there.

JBoss SOA Platform (SOA-P) Documentation

The SOA-P team incorporates a team of technical writers who take the project-specific documentation and turn it into documentation for the SOA Platform, which is the supported 'RHEL' version of JBossESB (where JBossESB would be Fedora). In the true spirit of Open Source, this documentation is available for free under the support/documentation tab of the Red Hat homepage, or you can go directly to the SOA-P 4.2 docs. I was quite impressed with what they did to some of the docs I wrote! Thanks guys.

Service Orchestration using jBPM

I recently wrote up an entry in the JBossESB blog. This code is now available in the SOA Platform as well as on the trunk of the JBossESB project. For the full documentation in pdf format see the jBPMIntegrationGuide.


Figure 1. Service Integration using jBPM.

Wednesday, February 6, 2008

Single SignOn (SSO) with Seam using JOSSO

I recently used seam-gen to create a josso_console application. If you want to use JOSSO, but don't yet want to take the leap of hooking it up to LDAP or another industrial-strength credential store, then this app may come in handy. Figure 1 shows the welcome screen of the josso_console application.


Figure 1. The JOSSO console.

If your objective is simply to hook your Seam application up to JOSSO then you should keep reading too.

1. Seam and JAAS
Seam comes with its own security framework, which is based on JAAS. The easiest way to hook Seam up to JOSSO is to first configure your Seam-based application the conventional JOSSO way, and then hook up Seam and JOSSO using a Seam authenticator. I worked on a jossoAuthenticator that sets the SSO user and role information into the Seam context, so that you can use all the Seam security features while using JOSSO.

2. Configure JOSSO
In your josso-agent-config.xml, add the josso_console as a partner app
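A partner-app entry in josso-agent-config.xml looks roughly like this sketch (element names vary slightly across JOSSO 1.x versions, so check against the file's existing entries):

```xml
<!-- register the josso_console web context as a JOSSO partner application -->
<partner-apps>
    <partner-app>
        <context>/josso_console</context>
    </partner-app>
</partner-apps>
```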


Now your application will have access to the JOSSO cookie.

3. Configure Seam
Next we're going to protect our application by adding standard security constraints to the web.xml. For instance, if we only want users that have the admin role to access our console, you would add
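A minimal sketch of such a constraint, assuming the whole application is restricted to the admin role (the url-pattern and resource name below are placeholders):

```xml
<!-- only authenticated users in the admin role may access the application -->
<security-constraint>
    <web-resource-collection>
        <web-resource-name>console</web-resource-name>
        <url-pattern>/*</url-pattern>
    </web-resource-collection>
    <auth-constraint>
        <role-name>admin</role-name>
    </auth-constraint>
</security-constraint>
<login-config>
    <auth-method>FORM</auth-method>
    <form-login-config>
        <form-login-page>/login-redirect.jsp</form-login-page>
        <form-error-page>/login-redirect.jsp</form-error-page>
    </form-login-config>
</login-config>
<security-role>
    <role-name>admin</role-name>
</security-role>
```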


This web.xml references the login-redirect.jsp which you will need to add to the root of your war file. Now, you will be redirected to the josso login screen when trying to access the web application.
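The login-redirect.jsp itself can be tiny; in the JOSSO examples it simply bounces the browser to the agent's login trigger URL, something like:

```jsp
<%-- login-redirect.jsp: hand authentication off to the JOSSO agent --%>
<% response.sendRedirect(request.getContextPath() + "/josso_login/"); %>
```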

Next we have to propagate the authorization information into Seam context. For this we use the jossoAuthenticator.
Next you need to reference this class in your components.xml, by adding

and commenting out the default authenticator
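In a seam_gen project the default authenticator is declared in components.xml; a sketch of the swap (the authenticate method name on jossoAuthenticator is an assumption here):

```xml
<!-- point Seam security at the JOSSO-aware authenticator -->
<security:identity authenticate-method="#{jossoAuthenticator.authenticate}"/>

<!-- the seam_gen default, now commented out:
<security:identity authenticate-method="#{authenticator.authenticate}"/>
-->
```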

Finally we need to modify the pages.xml, where we reference the jossoAuthenticator on our welcome page (index.xhtml) like

which will cause the jossoAuthenticator.checkLogin to be called for this page, and in the exception class configuration we specify the index.xhtml page
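Putting this together, the pages.xml additions could look roughly like the following (the view-ids and message text are illustrative):

```xml
<!-- run the JOSSO check whenever the welcome page is requested -->
<page view-id="/index.xhtml" action="#{jossoAuthenticator.checkLogin}"/>

<!-- send unauthenticated users back to the welcome page -->
<exception class="org.jboss.seam.security.NotLoggedInException">
    <redirect view-id="/index.xhtml">
        <message>You must be logged in to view this page</message>
    </redirect>
</exception>
```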


All should now be working. If you want, you can obtain the full sources from the josso_console application to see the complete application.

Some other resources in this context you may find useful are:
http://www.josso.org/confluence/display/JOSSO1/JBoss+4.2
http://sdudzin.blogspot.com/2007/12/windows-sso-with-jboss-seam.html, and
http://www.ja-sig.org/wiki/display/CASC/Seam+Identity+Integration+(Seam+1.2.1+-+2.0.0)

Tuesday, February 5, 2008

Sony Vaio VGN-FS740/W Battery Problem

The battery on my wife's computer would no longer hold a charge and it was time to replace it. We replaced it with a generic-brand battery (little more than $100). It charged fine, but then while it was running a message would suddenly pop up, saying that the battery was not inserted correctly and that the machine had to switch to hibernate mode. The battery light would flash rapidly after this, and it would never stay up for long before hibernating again. Re-seating the battery had no effect. It turns out that Sony truly has turned evil: they run a little service that checks whether the battery is a Sony battery ($300!), and if it is not, it kicks into action and sends your machine into hibernation!

After some digging I found this article, and the fix is easy: the service is called ISBMgr, and to get rid of it permanently, fire up msconfig like so


go to the startup tab and uncheck the ISBMgr.



Now when you restart it will ask you if you knew that some services were changed; just say yes and it will never bother you again.

Bad Sony. No more Sony for me..

Thursday, January 31, 2008

Compare two OpenOffice Documents

I received some feedback on a document I wrote and I wanted to know what the changes were. In Word you can track changes, but where is this functionality in OpenOffice? I looked around a bit and found a really nice feature I had been missing: 'compare document', which lets you accept changes on a per-diff basis. Simply open the original document and then under Edit, select 'Compare Document...'



and browse to the edited version document. After that it will bring up a nice dialog box with all the diffs which you can accept or deny. I think it makes a lot of sense not to do this in the document itself like Word does.

Wednesday, January 30, 2008

Post XML code fragments

It turns out that it is not that straightforward to display XML fragments in a blog. Usually I use GeSHi to do the formatting for me. I then look at the source for the HTML page and grab the piece that I need, which I then host on my own site, and I reference it in an iframe like
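The iframe reference itself is a one-liner; the URL below is a placeholder for wherever the formatted fragment is hosted:

```html
<iframe src="http://www.example.com/fragments/my-snippet.html"
        width="100%" height="300" frameborder="0"></iframe>
```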

Switch jUDDI on JBossESB over to Postgres

By default JBossESB uses HSQL to handle persistence for jUDDI. For production it is recommended to switch to, for example, Postgres.

1. First you will need to download the Postgres JDBC driver. I selected the '8.2-507 JDBC 2EE' driver. You can copy it to either the jboss_home/server/default/lib directory, or to the jbossesb.sar/lib directory.

2. Create the juddi database. Run pgAdmin as shown in Figure 1.



Figure 1. The Postgres Admin

and create a juddi user with rights to create tables, see Figure 2.



Figure 2. Add a user called 'juddi'


3. Next you should update the datasource for jUDDI, which is defined in the jbossesb.sar/juddi-ds.xml. Comment out the Hypersonic configuration and add the postgres datasource configuration like
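A sketch of the Postgres datasource configuration; keep whatever jndi-name the Hypersonic entry used so jUDDI can still look it up, and adjust host, user, and password to your setup:

```xml
<datasources>
    <local-tx-datasource>
        <jndi-name>juddiDB</jndi-name>
        <connection-url>jdbc:postgresql://localhost:5432/juddi</connection-url>
        <driver-class>org.postgresql.Driver</driver-class>
        <user-name>juddi</user-name>
        <password>juddi</password>
    </local-tx-datasource>
</datasources>
```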



4. Finally we have to tell jUDDI where it can find the DDL to create the juddi schema. For this we edit the jbossesb.sar/esb.juddi.xml. Modify the 'juddi.sqlFiles' setting by replacing the 'hsqldb' occurrences with 'postgresql'.
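The change boils down to swapping the path segments in the file list; something along these lines (the file name shown is illustrative, keep whatever list your esb.juddi.xml already contains):

```
juddi.sqlFiles=juddi-sql/postgresql/create_database.sql
```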



Now on startup JBossESB will create the jUDDI tables in the Postgres juddi database.

For an overview of jUDDI in JBossESB in pdf format see the Registry Guide.

Friday, January 25, 2008

jBPM ant task to deploy a process definition to a running server

I've been working on jBPM integration in JBossESB to do Service Orchestration, and I created a new ant task that deploys a process definition to a running instance of jBPM. By a running instance I mean an instance that deploys the jBPM-console. The jBPM-console ships with a DeployerServlet, which is used by the jBPM eclipse plugin to deploy process definitions. The ant task is called "DeployProcessToServerTask" and it uses that very same servlet. The java code for the task has been attached to jira JBPM-1117, so it should hopefully make it into SVN soon.

To use the DeployProcessToServer Task you need to define a new task using the taskdef command

<taskdef name="deployToServer" classname="org.jbpm.ant.DeployProcessToServerTask">
<classpath refid="exec-classpath"/>
</taskdef>

So here I defined a task called 'deployToServer', but you can call it whatever you want. The classpath should obviously point to the jar containing this class. The 'deployToServer' task can either deploy a Process Archive (PAR), which is a zip file containing the files you want to deploy, or you can specify a set of files which it will zip up and deploy for you. You can use the following attributes and subelements:







Attributes:
- process: the location of the process archive (no default)
- servername: the name of the server, used to build the deployment url (default: localhost)
- serverport: the port to which the http protocol is bound (default: 8080)
- serverdeployer: the address of the deployer servlet (default: /jbpm-console/upload)
- debug: debug flag which, if set, writes out a debug.par zip file; especially useful if fileSet subelements are used, so you can check what gets loaded to the server (no default)

Subelements:
- fileSet: a fileSet element


The following xml fragment deploys all the files in the processDefinition directory to the server

<target name="deployProcess" description="deploys the process definition">
<echo>Deploy the process definition</echo>
<taskdef name="deployToServer" classname="org.jbpm.ant.DeployProcessToServerTask">
<classpath refid="exec-classpath"/>
</taskdef>
<deployToServer>
<fileset dir="${basedir}/processDefinition" includes="*"/>
</deployToServer>
</target>

Or if you already have a par archive ready to go you could reference that

<target name="deployProcess" description="deploys the process definition">
<echo>Deploy the process definition</echo>
<taskdef name="deployToServer" classname="org.jbpm.ant.DeployProcessToServerTask">
<classpath refid="exec-classpath"/>
</taskdef>
<deployToServer process="${build}/process.par"/>
</target>