Friday, May 21, 2010

DOAG News 02/10: GlassFish and Java EE 6 at Oracle

Some self-promotion again. The latest DOAG News, the quarterly magazine of the German Oracle Users Group (DOAG e.V.), published an article of mine. It covers GlassFish and Java EE 6 at Oracle and is meant as a guide to Java EE 6 and the "new" Oracle product. Beyond this, I wrote about some things seen along the road from Sun to Oracle.

All DOAG members get the magazine for free as part of their membership. You can also get a copy if you like; check the German DOAG publications website for more details.

Thursday, May 20, 2010

Spring and Google vs. Java EE 6

As promised, I have been thinking about the future of Java these days. Collecting information and trends is part of this.
I have heard a lot of people talking about Google I/O as the next JavaOne. The fears about what is going to happen to the #1 Java conference now run by Oracle are big. It seems as if Google could provide a valid alternative with I/O. Unfortunately I was not able to attend, but of course I listened carefully to what was happening there. And believe it or not, I think it's worth talking about. OK: not really as a replacement for J1, but in terms of Google's and Spring's capabilities to provide a growing platform for a different kind of enterprise Java programming model.

This all started with the acquisition by VMware, which enabled a smaller company to move faster than ever before. It was followed by the first major partnership, with salesforce.com. What is called VMforce is meant to be the first step toward something like a PaaS standard.


Our [VMware and Google] shared vision is to make it easy to build, run, and manage applications for the cloud, and to do so in a way that makes the applications portable across clouds. The rich applications should be able to run in an enterprise's private cloud, on Google's AppEngine, or on other public clouds committed to similar openness.
(Source: Steve Herrod, Chief Technology Officer)


Spring and everything around it is heading for the cloud. Literally at the speed of light. Only a few weeks pass between new announcements around this topic. At the end of the day, Spring, VMware and Google are providing a cloud-based deployment platform for Spring-based Java applications. That sounds modern, fast, easy and is potentially very interesting. It may provide the easiest, no-compromise way to publish Java applications. If you look at other cloud alternatives, they are either more restrictive for developers (the "old" Google App Engine) or provide services at an infrastructure level, like Amazon's EC2.

But: Spring and VMware are going to build their own Java universe where they dictate momentum, their 'standards' and, more and more, the commercial consequences as well. From an Enterprise Java point of view it's simpler. Too many things are called "Spring". And this makes it easy at first glance. You don't have to talk about 30-something specifications but about one big framework. And while Spring, and Rod Johnson in particular, have been extremely valuable in moving Java (2)EE after the 1.4 release toward the new, much more pragmatic world of Java EE 5, Spring has also caused polarization and fragmentation. Instead of helping forge the Java community together, it has sought to advance its own cause. Which is perfectly valid - but it should be recognized for what it is. Spring is not necessarily open, is not free, and is not a community or even a multi-vendor effort. Lock-in with Spring is just another type of vendor lock-in. And that is why it will never be a replacement for Java EE.
But there is another takeaway for the Java community and the owner of Java. The hype around innovative and integrated solutions is proof that the Java EE universe is moving along too slowly. Bring in more flexibility. Have more courage with changes. Find a way to adopt trends faster and support better modularity.

Links and readings
Springing Ahead Toward The Open PaaS
VMware to Collaborate with Google on Cloud Computing
Google and VMware's "Open PaaS" Strategy
SpringSource Tool Suite with Google Integration Download
Enabling Cloud Portability with Google App Engine for Business and VMware



(Written with the help of some thoughts of fellow ACE Director Lucas Jellema (@lucasjellema). Thanks!)

Wednesday, May 19, 2010

polishing java



polishing java, originally uploaded by myfear.

Taken today. My little girl enjoying her father's java cup ;) And I thought it was time for some thoughts about the latest developments and the future of Java. And here we are: I am sitting here writing an article. I will publish selected parts of it in the near future. Promised.


Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 Unported License.

Tuesday, May 18, 2010

friendly reminder. do architecture documentation the right way.

I had to work through a lot of documentation lately. If you have bigger systems, this is anything but fun. Especially if nothing more than some document templates exist for the team to put their knowledge into. Also very unhandy is documentation that is not up to date. If you want to make it better, you could take the following suggestions seriously and put them in place for your project. I don't want to call them documentation principles, but they should guide you in writing clear and worth-reading architecture documentation for your software projects. The points are in no particular order. I value them equally, but had to start and end with one of them.

Don’t repeat yourself
Even if it is alluring, because there are still some customers out there believing that only the total amount of documentation counts: it doesn't. Try to be DRY. If this does not work: try harder! Documentation is there to provide a single place for all the information available.

Avoid surprises
According to the principle of least astonishment (POLA), you should avoid solutions that leave the reader wondering. Try to make everything as clear as possible. Try to use different perspectives and views for different stakeholders.

Use standardized structures
If you are not in the position to use a standard software development process, invest the time to develop at least a standard set of documents you want your team to take care of. Take the time to brief your team on the structure and on what your expectations are.

Work iteratively
Compared to a great novel, software documentation is hardly ever finished. You will find the need to document open issues and first thoughts. Don't just put in empty headlines for them. Create a separate chapter and fill in as much information as you have on your open issues. Don't create empty chapters at all. The simplest thing to have in a chapter is a note on why it is (nearly) empty now and when you expect it to be filled.

Five2Nine
According to findings from cognitive psychology, the short-term memory of a person can hold five to nine elements. If you draw charts (especially for a management audience ;)) try to limit yourself to no more elements than that per chart. Any higher number will not work. And don't forget about the legend. A chart without one is worth nothing.

Draw and explain
If you make drawings and charts, don't forget to explain them. If you don't explain your symbols or notation, you will end up with big misinterpretations. Even for a big project it can be sufficient to have the basic perspectives (functional, contextual, distribution, development and runtime) as charts and explain them in more detail. A good starting point for this can be found on the CC-licensed http://www.arc42.com/.

Write for your readers
Before you begin writing, take a moment, relax and think about the audience for your document. If you find out that you have more than one group, it could be worth adding a reading guideline in which you state which chapter is worth reading for whom. If you follow this, you will find it easier to nail down the points and won't find yourself explaining the basics over and over again.

Be complete but don't write an epic
The hardest part. The bigger the project, the more documentation it needs. Anyway: try to find a way to strip and skip all unneeded parts. Think about splitting your whole documentation into several parts, but keep a root document where you put the basics. It does not make sense to have a single document with more than 200 pages.

Document the "why"
Don't just simply put the result into your documents. Explain the reasons why you have chosen the solution. This gives your readers a chance to follow your thoughts and won't put you under pressure to explain even the weird constructs.

Document the "what"
Invest in a short chapter explaining what kind of software you are going to support with your design. What are the basic functional requirements behind it? What is the business case? Why does the project exist? This is also a good place to link to from the "why"s, and it could explain, for example, why you prefer open source over closed source solutions.

As always, this is meant as a general guideline. There are plenty of examples out there where one or more of the mentioned points are violated. The most basic requirement I have concerning documentation is that you make your decisions transparent and readable.

Monday, May 17, 2010

Gmail, Notes/Domino and Migration

I was playing around with Gmail, the GData API and Lotus Notes/Domino lately. The idea was to migrate some old stuff (obviously private stuff) from my Lotus Notes archives to a separate Gmail account. My company is running Notes and it's easy to have separate archives, but I like having a backup from time to time, and it is also very handy to have access to those things from everywhere without carrying your computer with you all the time. Therefore I needed to transfer the archives over to Gmail. If you are a Google Apps Premier or Google Apps Education Edition customer, all you have to do is look at the provided tools, install them and run your migration. If you have a free account, you are stuck. But it would not be Google if there were no way around this ;)

Here is my small how-to on migrating Lotus Notes databases to free Gmail accounts. If you are trying this, you should already have some insight into Java, Notes and MIME/email concepts. If not, you will find plenty of service companies out there to help you. You can even contact Google directly about this. Everybody else: go ahead and read on :)

Tools, sources and environment
  • Get yourself a Gmail account for testing :)
  • Get yourself a copy of the Lotus Notes 8.5.1 client (better yet, the Designer).
  • Have a copy of your favorite Java IDE in place (for me this is still Eclipse) and start a new project.
  • If you are not willing to start from scratch, get the gdata-java-client samples.
  • Add all required google libraries (gdata-core, -client,-appsforyourdomain,-media, etc.) to your project's build path
  • Copy the notes.jar from %LOTUS%\notes\jvm\lib\ext to your project's build path
  • Add the Notes install folder %LOTUS%\notes to your project's run and debug-configuration path environment.
  • Copy the sample.appsforyourdomain.migration.AppsForYourDomainMigrationClient.java file from the gdata samples to your project

Running the first test
If everything is in place, you should be able to run the AppsForYourDomainMigrationClient for the first time. Running it out of the box requires you to set some program arguments: --username, --password and --domain. It places 5 new emails in your inbox. If you look at the example in more detail, you will see that it makes use of the batch processing API. Basically, the code takes a String, parses it, puts it into an Rfc822Msg, and this is put into a MailItemFeed which is processed by the MailItemService.batch() method. All the magic happens in the runSample() method. You see: whatever we are going to push to Google should be an RFC 822 compliant message.
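To make the "RFC 822 compliant" requirement concrete, here is a minimal, hand-built example of such a message in plain Java. This is a standalone sketch with made-up header values; no gdata classes are involved, and the resulting string is exactly the kind of thing that later goes into an Rfc822Msg:

```java
public class Rfc822Sketch {

    public static String buildMinimalMessage() {
        StringBuffer buffer = new StringBuffer();
        // RFC 822 headers, each terminated by CRLF
        buffer.append("From: sender@example.com\r\n");
        buffer.append("To: me@gmail.com\r\n");
        buffer.append("Subject: Migrated from Notes\r\n");
        buffer.append("Date: Fri, 14 May 2010 10:00:00 +0200\r\n");
        buffer.append("MIME-Version: 1.0\r\n");
        buffer.append("Content-Type: text/plain; charset=UTF-8\r\n");
        // An empty line separates the headers from the body
        buffer.append("\r\n");
        buffer.append("Hello from the Notes archive.\r\n");
        return buffer.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildMinimalMessage());
    }
}
```

The migration code below builds exactly such a buffer, only with the headers and body pulled out of the Notes Document instead of hard-coded.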

Getting documents from Domino/Notes
The idea is to fetch a message from Notes/Domino, convert it to an RFC 822 message and send it via the GData API to Google. If you are unsure about the Notes connection setup, see the links below (it gets tricky from time to time). Let's go.
// initialize notes session
NotesThread.sinitThread();
Session session = NotesFactory.createSession();
session.setConvertMime(false);
// open your database
Database d = session.getDatabase(null, "folder/oldstuff.nsf");

Now we have to get all the documents one by one and let the magic happen.
DocumentCollection inbox = d.getAllDocuments();
Document doc = inbox.getFirstDocument();
while (doc != null) {
//...
doc = inbox.getNextDocument(doc);
}

That is all you have to do. Now you are able to iterate over the complete document collection in one archive. But: all you have now is an instance of lotus.domino.Document. Some handy methods are there, but that is by far not enough to take it, serialize it and put it on the wire.

From Document to MIMEEntity
The first thing you have to do with a single Document is convert it to a so-called lotus.domino.MIMEEntity. This is the starting point for all further processing.

MIMEEntity mime = doc.getMIMEEntity();

If the mail was already received via SMTP, this is most likely all you have to do. If you have original Domino documents at hand, this will not work and mime will be null. Beginning with 8.5.1, the Document API offers a public void convertToMIME(int conversiontype, long options) method. This will do the job for you. There are different conversion types available. I am using Document.CVT_RT_TO_PLAINTEXT_AND_HTML for multipart/alternative.
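To picture what that multipart/alternative conversion yields, here is a standalone sketch of such a body in plain Java. The boundary string and content are made up for illustration; the real boundary is generated by Domino during the conversion:

```java
public class MultipartSketch {

    public static String buildMultipartAlternative() {
        String boundary = "=_alt_boundary_001"; // made-up boundary for illustration
        StringBuffer b = new StringBuffer();
        b.append("Content-Type: multipart/alternative; boundary=\"" + boundary + "\"\r\n");
        b.append("\r\n");
        // Plain-text part (what CVT_RT_TO_PLAINTEXT_AND_HTML derives from the rich text)
        b.append("--" + boundary + "\r\n");
        b.append("Content-Type: text/plain; charset=UTF-8\r\n\r\n");
        b.append("Hello in plain text.\r\n");
        // HTML part of the same content
        b.append("--" + boundary + "\r\n");
        b.append("Content-Type: text/html; charset=UTF-8\r\n\r\n");
        b.append("<html><body>Hello in <b>HTML</b>.</body></html>\r\n");
        // Closing boundary
        b.append("--" + boundary + "--\r\n");
        return b.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildMultipartAlternative());
    }
}
```

Each of those parts shows up as a child MIMEEntity, which is why the traversal below has to walk the children one by one.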

From MIMEEntity to Rfc822Msg
Now you are only one step away from an Rfc822Msg. All you have to do now is build a StringBuffer with all the RFC 822 requirements fulfilled. The tricky part is the headers. To get them, you have to work with the lotus.domino.MIMEHeader object.

Vector headers = mime.getHeaderObjects();
for (int j = 0; j < headers.size(); j++) {
    MIMEHeader header = (MIMEHeader) headers.elementAt(j);
    buffer.append(header.getHeaderName() + ": "
            + header.getHeaderValAndParams() + "\r\n");
}

If you work through the Document, you will see that it has a child for every multipart part. So you have to make sure you iterate over each child and add it to the buffer, too.

MIMEEntity child1 = mime.getFirstChildEntity();
while (child1 != null) {
    // ... append child1's headers and content to the buffer ...
    MIMEEntity child2 = child1.getFirstChildEntity(); // descend first
    if (child2 == null) {
        child2 = child1.getNextSibling(); // then try the next sibling
        if (child2 == null) {
            child2 = child1.getParentEntity(); // then climb back up
            if (child2 != null)
                child2 = child2.getNextSibling();
        }
    }
    child1 = child2;
}

If you are working your way through attachments, you will find it useful to call the public void encodeContent(int encoding) method of MIMEEntity to convert attachments to MIMEEntity.ENC_BASE64. Everything else will not work.
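To see what that Base64 step actually produces, here is a standalone illustration using java.util.Base64's MIME encoder. Note this is a modern stand-in for the Notes call, not the Notes API itself: it emits the same RFC 2045-style encoding (lines wrapped at 76 characters with CRLF) that ENC_BASE64 produces on a MIMEEntity:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class AttachmentEncodingSketch {

    public static String encodeAttachment(byte[] raw) {
        // The MIME encoder wraps output at 76 characters per line with CRLF,
        // as RFC 2045 requires for message bodies
        return Base64.getMimeEncoder().encodeToString(raw);
    }

    public static void main(String[] args) {
        byte[] fakeAttachment =
                "some binary attachment content".getBytes(StandardCharsets.UTF_8);
        System.out.println(encodeAttachment(fakeAttachment));
    }
}
```

The encoded text then goes into the buffer as the body of the attachment's MIME part.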

From Rfc822Msg to MailItemEntry
Take your buffer and put it into an Rfc822Msg:

Rfc822Msg rfcMsg = new Rfc822Msg(buffer.toString());

Then add it to a MailItemEntry:

mailItem.setRfc822Msg(rfcMsg);

Labels and properties
You can apply labels and additional properties to the MailItemEntries. A label will show up in Gmail as you are used to. I flagged all migrated emails with a "private" label like this:

mailItem.addLabel(new Label("private"));

It's also possible to take the original folder names of the Document and use them as labels. You can also decide whether you want the message to appear as unread, in the inbox, in the sent folder or wherever, by adding additional properties like this:

mailItem.addMailProperty(MailItemProperty.INBOX);

Sending mail
The API provides a batch approach, but sending single mails within a single batch worked for me. Here is the basic approach:

BatchUtils.setBatchId(mailItem, "" + uniqueId);
BatchUtils.setBatchOperationType(mailItem, BatchOperationType.INSERT);
MailItemFeed feeder = new MailItemFeed();
feeder.getEntries().add(mailItem);
MailItemFeed feedR = mailItemService.batch(domain, destinationUser, feeder);

You can get the status of the submission:

BatchStatus status = BatchUtils.getBatchStatus(returnedEntry);

If something went wrong, you will get a status other than 201.

Further thoughts
You should not try to send messages bigger than 5,000,000 bytes; Google will reject them. It's best to have some migration/error logging in place, allowing you to manually migrate failed documents. For me it was handy to log doc.getUniversalID() for later migration. Another limit at Google is the number of requests per second. I don't know exactly how high or low this limit is, but you will get batch status errors if you exceed it. You should wait more than 30 seconds before continuing with your submissions.
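A simple way to combine the error logging and the wait-and-retry behavior is a small retry loop. This is a standalone sketch: the Submission interface is a stand-in for the real batch call, and the HTTP-style status code 201 mirrors the BatchStatus check above; everything else (method names, parameters) is made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class MigrationRetrySketch {

    // Stand-in for the real batch submission; returns an HTTP-like status code
    interface Submission {
        int submit(String universalId);
    }

    /**
     * Submits each document, retrying failed ones after a pause.
     * Returns the universal IDs that still failed, for a manual pass later.
     */
    public static List<String> migrate(List<String> universalIds, Submission batch,
                                       int maxRetries, long waitMillis)
            throws InterruptedException {
        List<String> failed = new ArrayList<String>();
        for (String unid : universalIds) {
            int status = batch.submit(unid);
            int retries = 0;
            while (status != 201 && retries < maxRetries) {
                Thread.sleep(waitMillis); // back off; > 30 seconds in real use
                status = batch.submit(unid);
                retries++;
            }
            if (status != 201) {
                failed.add(unid); // log the UNID for later manual migration
            }
        }
        return failed;
    }
}
```

In the real migration, the Submission body would build the MailItemEntry, call mailItemService.batch() and map the returned BatchStatus to a status code.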

Now you have your Domino/Notes documents in your Gmail account. It was fun to try this, and you should give it a try yourself if you need to :)

Links and readings
Google Data Protocol Developer's Guide Overview
Java access to the Domino Objects
The Multipart Content-Type

Wednesday, May 12, 2010

DOAG 2010 Conference and Exhibition CfP still running

As you might have heard, the Call for Papers for the DOAG 2010 Conference and Exhibition is still open until June 30, 2010.
With more than 300 speaker slots, the DOAG 2010 Conference, which takes place November 16th-18th, 2010 in Nuremberg, provides current information on the successful use of Oracle products, as well as practical tips and tricks and an exchange of experience. Stay up to date and follow @doagkonferenz on Twitter.


Links:
www.doag2010.org (de)
www.doag2010.org (en)
Submit a session (en)
List of Conference topics

Tuesday, May 4, 2010

Java EE 5 or 6 - which to choose today

Companies are starting new Java EE projects over and over again these days. It is still one of the most widely used enterprise technologies today. If you find yourself in the situation of having to kick off a new project, you keep asking yourself the question of the right technology and product. The time between the launch of a new specification and the first commercial implementations can be hard, because you have to decide what to do: stick to whatever spec is available through your current vendor's implementation, move on to a new product, or choose from the already available parts of the new spec and mix them with old versions. The basic question behind this is: which Java EE version should I start with today?
I have been asked this frequently since the new specification came out. And I always find it difficult to answer. Today I will try to summarize my thoughts on this and post a decision helper for you to find your own answers.

Status quo
Java EE 6 has been out since December 2009. Up to now, the GlassFish Server Open Source Edition 3 is available as the reference implementation. The Oracle GlassFish Server 3 offers a supported distribution. The other vendors are still missing support for Java EE 6. You may rest assured that the first vendors will come out with compliant versions in Q4/2010 at the earliest. Some milestone builds of first servers are already out there (e.g. JBoss). You can also find implementations of separate specifications (JSF 2.0, JPA 2.0), but most of them lack commercial vendor support in current Java EE 5 servers up to now.

The basic decision
The basic decision you have to make is whether you are going to give the current GlassFish v3 a timely try or not. This seems easy at first. But if you try to make this decision from an enterprise point of view, you have to keep some things in mind. This is what the following flowchart tries to demonstrate. You basically have three options:
- Stick to Java EE 5 on whatever platform or server you are using
- Use Java EE 6 with one of the GlassFish 3 distributions
- Develop for Java EE 6 with GlassFish 3 and switch to your vendor's distribution later


Things not fitting the chart
Of course, this is a simple black-and-white approach. As usual, there are some greys in it, too. If you already are a "GlassFish company", this is much simpler than the chart indicates. You probably would even stick to version 3 if you need (some kind of) clustering or failover.
- and if you are running JBoss you could think about using the latest milestone builds.
- and you can also give the recently released WebLogic Server 10.3.3.0 a try if you are only interested in JSF 2.0.
- and you can try to use EclipseLink 2.0 or Hibernate 3.5 for JPA 2.0 support
- and you could decide that it's worth doing some educational projects to skill up your developers
- and ... and ... and

Your development project and the future
Nothing prevents you from thinking about the future of your development project. But you should keep in mind that it is always a certain risk to mix development and production platforms. If you are striving for a GlassFish 3 development environment while planning to run on another vendor's server, you should account for this in your plans. The safest path here is probably the interaction between GlassFish and Oracle WebLogic. Not knowing in detail what the future holds for either of them, I am still expecting some kind of utilities that support the transition from one to the other.

Links and readings
GlassFish.org
hibernate.org
EclipseLink
JPA 2.0
JSF 2.0
JBoss AS 6.0.0.M3
Oracle WebLogic

Monday, May 3, 2010

About the Java EE 6 Web Profile and the Future

I was thinking about the new Java EE 6 Web Profile recently. Besides some current presentations at JAX 2010, it seems as if more and more of the well-known "webcontainer" products are moving in this direction (Apache Web Profile, Resin and others not mentioned here ...).
If you look at the new Web Profile in more detail, you see that it is a specified minimal configuration targeted at small-footprint servers that should support something called "typical" web applications.
It is thought of as a minimal specification, so a vendor is free to add additional services in their concrete implementation. The required elements of the Web Profile are:
  • Java EE 6 (JSR-316)
  • Servlet 3.0 (JSR-315)
  • JavaServer Pages (JSP) 2.2 (JSR-245)
  • Expression Language (EL) 2.2 (JSR-245)
  • Debugging Support for Other Languages 1.0 (JSR-45)
  • Standard Tag Library for JavaServer Pages (JSTL) 1.2 (JSR-52)
  • JavaServer Faces (JSF) 2.0 (JSR-314)
  • Common Annotations for Java Platform 1.1 (JSR-250)
  • Enterprise JavaBeans (EJB) 3.1 Lite (JSR-318)
  • Java Transaction API (JTA) 1.1 (JSR-907)
  • Java Persistence API (JPA) 2.0 (JSR-317)
  • Dependency Injection for Java 1.0 (JSR-330)
  • Contexts and Dependency Injection for Java EE platform 1.0 (JSR-299)
  • Bean Validation 1.0 (JSR-303)
  • Managed Beans 1.0 (JSR-316)
  • Interceptors 1.1 (JSR-318)
Clear from a technology point of view. For me this was simply a new way of pruning: getting rid of some older specs and finding a way to have more Java EE certified servers again (compare this).
Even if I am not sure whether the contained technologies make up a good and simple profile (what about Mail and JSP?), it could be a step in the right direction.
But what I really do not get is the actual hype (and that's what I am feeling) around this. What about the full-blown Java EE 5/6 servers? Geronimo, GlassFish, JBoss, or whatever else is out there are completely usable to me. The times when J2EE servers were hard to install and administrate belong to the past! It's quite easy to set up, update and configure, for example, the latest GlassFish v3. Where is the added value of the Web Profile? Nobody forces you to use any kind of API in your applications. Furthermore, I strongly believe that you should take action to prevent this. Adam Bien told me that:

@AdamBien
http://twitter.com/AdamBien/statuses/13177648403
@myfear webprofile + JAX-RS + JMS -> should be sufficient for 90% of all projects.

But that's exactly the point. The defined Java EE 6 Web Profile is a good start. It's something I would love to call the new webcontainer standard. But it is still some steps away from defining a lightweight new Java Enterprise Edition. And besides the fact that it's easier to certify a product with a minimal set of technologies, I am missing the use case for this.

Could this all simply be about saving time and effort for newly starting Java developers? I hope not. Even if I would probably call the selection of specifications an adequate starting point, there is nothing that will prevent a future Java EE developer from having to know about most of the specifications defined by the full-blown Java EE 6. Show me one who never came across web services, JMS and Mail ...

Does it make a real difference how many technologies you are running in an instance? In terms of performance? In terms of administration? What is more important to me is that an instance only makes use of the container services and component models required by the application. And that is the direction I would love to see Java EE move in the future. Don't try to drive these profile things any further.
Focus on a defined, clear and up-to-date set of technologies. Find a simple and convenient way to drop support for older standards or versions, make container services available to POJOs, and don't force POJOs to implement container services or component models. And last but not least, find a general way to allow plugins for Java EE.

What are your thoughts about this? I would love to hear them...