Enterprise grade Java.
You'll read about Conferences, Java User Groups, Java, Integration, Reactive, Microservices and other technologies.

Friday, February 27, 2009

JSF Versions and Weblogic Server

11:46 Friday, February 27, 2009 Posted by Test 2 comments:
We were playing with the JSF containers in WLS and stumbled upon some problems. I'll try to summarize the findings in one place here. I hope this is helpful for you.

A newly created WebLogic Server domain does not come with any JSF libraries active. If you want to ship your web application with a separate JSF implementation, you are free to do so. Keep in mind, though, that the dependency injection mechanisms do not work with managed beans if you do not use one of the WLS-provided JSF containers.

Declaring something like this:
@EJB
private MySession session;

will lead to a java.lang.NullPointerException because the dependency injection mechanism is missing. I would love to know more details about this; maybe I'll find out more in the future.
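If you are stuck with a self-bundled JSF implementation for now, a manual JNDI lookup in the managed bean can serve as a workaround. The following is only a rough sketch: MyBackingBean and the JNDI name are made up, MySession is the business interface from the snippet above, and the actual binding depends on your server, so check the JNDI tree in the admin console first.

import javax.ejb.EJB;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class MyBackingBean {

    // stays null with a self-bundled JSF implementation, which causes the
    // NullPointerException described above
    @EJB
    private MySession session;

    // fallback: resolve the bean manually via JNDI; the name below is an
    // assumption - look up the real binding in the server's JNDI tree
    private MySession getSession() {
        if (session == null) {
            try {
                InitialContext ctx = new InitialContext();
                session = (MySession) ctx.lookup("MySessionBean#com.example.MySession");
            } catch (NamingException e) {
                throw new IllegalStateException("Manual lookup of MySession failed", e);
            }
        }
        return session;
    }
}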


The following ones are available with WLS 10gR3:







Library Name   Implementation   Version
jsf            Sun Reference    1.2-b20-FCS (or 1_2_03-rc2)
jsf-ri         Sun Reference    1.1.1
jsf-myfaces    MyFaces          1.1.3
jsf-myfaces    MyFaces          1.1.1


If you want to use any of them in your own web application, you have to deploy the libraries and refer to them in the weblogic.xml deployment descriptor:


  1. In the administration console, navigate to the Deployments page using the left-hand menu.

  2. Then select the Install button at the top of the deployments table.

  3. Using the directory browser, navigate to the $BEA_HOME/wlserver_10.3/common/deployable-libraries directory. Then select the appropriate war (e.g. jsf-1.2.war) and click the Next button.

  4. Make sure that "Install this deployment as a library" is selected. Click the Next button on the Install Application Assistant page.

  5. Click the Next button on the Optional Settings page.

  6. Make sure that "Yes, take me to the deployment's configuration screen." is selected. Click the Finish button on the "Review your choices and click Finish" page.

  7. On the "Settings for jsf(1.2,1.2.3.1)" page, set the Deployment Order to 99 so that it is deployed before auto-deployed applications. Then click the Save button.



After this, you have to ensure that your application uses the right libraries by adding the needed entries to the deployment descriptor (weblogic-application.xml for an EAR and weblogic.xml for any WAR file):

<library-ref>
<library-name>jsf</library-name>
<specification-version>1.2</specification-version>
<implementation-version>1.2</implementation-version>
<exact-match>false</exact-match>
</library-ref>

If you are wondering whether it is possible to just deploy the .jar file as a library: theoretically yes :) But there is a restriction in place that prevents web archives from accessing deployed .jar libraries. You can only access .jar libraries from EAR or ejb.jar files; web applications are only allowed to access .war libraries. If you try it, you will get an error like this:

"xxxx" library reference is not allowed in the weblogic.xml file.
Only WAR libraries are allowed.
Unknown WebLogic Shared Library Framework Validation Problem


Thursday, February 26, 2009

Meet me at SIG Fusion Middleware

08:30 Thursday, February 26, 2009 Posted by Test No comments:
I will be at the SIG Fusion Middleware in Cologne on March 24.
The event will take place at the Hilton, Marzellenstr. 13-17, 50668 Köln, and is hosted by the DOAG (Deutsche ORACLE-Anwendergruppe e.V.).


I will give a talk about Clustering and High Availability with Oracle Weblogic Server 10gR3. Slides will become available shortly after the session.

More information about the talk is available in German on my company's website.

Wednesday, February 25, 2009

JCP Approves the Java EE 6 Public Review (JSR-316)

06:00 Wednesday, February 25, 2009 Posted by Test No comments:
The Executive Committee for EE/SE has approved the Public Review for JSR 316, the Java EE 6 Specification.

The vote was 12 YES, 1 NO, 1 ABSTAIN, and 2 did not vote; see the ballot results. You can download the Public Review Draft and, as always, your feedback to the Executive Committee and to the JSR 316 Expert Group is very welcome.

The most interesting comments from the ballot results are those from SAP and SpringSource.


SAP AG voted Yes with the following comment:
SAP votes YES but would like to comment that more work is required for the integration of JSR 299 into the platform to be useful and successful. We also would like the Spec Lead to consider putting more emphasis on architectural rigor regarding a single consolidated and extensible component model to be used across the platform - right now there are three (EJB, JSF and JSR 299).


SAP also has different component models in place already, and none of them is truly widespread in the Java corner of SAP at the moment. If you want more information on this, have a look at the "Getting Started with Application Development" guide on the SAP Community Network.
I am a big fan of SAP's Java AS >= 7. But I am not sure that JEE is all about one common component model, even if we have three or more in place. The only thing that should be enforced is seamless interaction between them.


SpringSource voted Abstain with the following comment:
The introduction of profiles is a step in the right direction. However, we are disappointed not to see a minimal web profile, especially as this has become the choice of most enterprise Java users.

As with previous releases, the inclusion of unproven technology is a risk, especially in JSR 299 and EJB concurrency annotations. The number of substantial changes late in the process also gives us concern about the maturity of the result--especially, the impact of the scope creep of JSR 299 on other specifications.

Experience has shown that tying dependency injection features to a server environment does not match user requirements, as injection is common to all application types. We would have preferred to see a dependency injection model for SE, as we proposed in 2007.

Finally, we are not convinced that the end result matches the goals of Java EE 6 as defined in the original specification request, which we strongly supported.


OK, this is a more serious comment. I am worried about the missing profiles, too. When I saw the "Web Profile" for the first time, I wondered what someone could call "web" :-) The requested "minimal web profile" could be something similar to the JEE 5 web container. Anyway: it is missing, and I expect that we will get into bigger trouble with this when doing our first projects on JEE 6. Besides that, I would love to see some more specialized profiles (e.g. portal, SIP, etc.).

Not being a member of the EG, I am not sure about the "inclusion of unproven technology" and the "number of substantial changes late in the process". But I am sure that both can result in problems. Looking at the different EG members with all their different interests, it's like a miracle to me to see a new consolidated version of the JEE spec "nearly" on time ;) What seems clear is that if the inventors of Spring (a.k.a. the non-JEE-component guys :-)) have reservations about the impact of JSR 299, it is worth looking at.

Monday, February 23, 2009

Creating an app in one month

10:56 Monday, February 23, 2009 Posted by Test No comments:
Found this awesome tutorial today.
Building PhotoKast: Creating an iPhone app in one month.
Besides the fact that it is quite interesting to read about the project, I love the thinking that happens before the actual implementation.

What’s in a name?
Finding a good name for your product is important. Spend some more thought on it and do not only check for an available domain name :)

Designing the 7 deadly sins
Make sure your application concept is addictive. This part gives credit to Tim Chang's speech at a small Web 2.0 event held in Orange County, CA last year. There, he perfectly summarized a key design rule for social networks: build them for one or more of the seven deadly sins. Mapped to software, this could mean:








Lust: Dating
Gluttony: Shopping
Greed: Budget, Financial
Sloth: Productivity
Wrath: Debate, Politics
Envy: Shopping
Pride: Online Identity Management


Easy as 123. Or not.
Make sure your application is intuitive and easy to use.

Short term commitment or marriage?
Think about the shelf life of your product before releasing it.

Friday, February 20, 2009

WebLogic Resource Security

08:48 Friday, February 20, 2009 Posted by Test No comments:
Yesterday a coworker dropped in and asked me about the WebLogic security concept. He was trying to deploy the JEE example application Duke's Bank on the 10.x version and ran into trouble with the changed resource protection.
The web application has a single web.xml DD; a separate weblogic.xml DD is missing. If you have something like this in your web.xml

<security-constraint>
<web-resource-collection>
<web-resource-name>Success</web-resource-name>
<url-pattern>/welcome.jsp</url-pattern>
<http-method>GET</http-method>
<http-method>POST</http-method>
</web-resource-collection>
<auth-constraint>
<role-name>webuser</role-name>
</auth-constraint>
</security-constraint>
<login-config>
<auth-method>BASIC</auth-method>
<realm-name>default</realm-name>
</login-config>
<security-role>
<role-name>webuser</role-name>
</security-role>

then you need to tweak the default behaviour of the WebLogic Server to get this up and running. You have different options in WLS to secure your resources:



  • Deployment Descriptor Only (Java EE standard)

  • Custom Roles

  • Custom Roles and Policies

  • Advanced



If you choose the first option, you need a weblogic.xml DD to define the role-to-principal/group mapping.
If you choose Custom Roles, the role mappings come from a role mapping provider that you configure for the security realm (you can use the Administration Console to configure the provider). Any role mappings in the deployment descriptors are ignored, but the model still uses the policies that are defined in the web.xml and ejb-jar.xml deployment descriptors.
If you choose Custom Roles and Policies, you configure a role mapping provider and an authorization provider for your security realm (again configurable via the Administration Console). Any role mappings or policies in the deployment descriptors are ignored.
If you want to import the basic information from the DD and configure roles and policies on that basis, you need to choose Advanced.

The first step is to configure your realm:



Don't forget to
a) disable Combined Role Mapping and
b) change "Check Roles and Policies" to "All Web applications and EJBs".
When you are finished, you definitely need a server restart, even if WLS seems happy and working without one.

After this, you have to install your deployment.
Careful: you can override the realm settings at deployment time, so don't change anything here.
Once the deployment is installed, you can browse to it in the admin console and have a look at the security settings. In this case, you can see that the security2.war has the resource role "webuser" assigned to the URL pattern "/".



The only thing left to do is to create the needed user or group. The role condition imported from the DD states that a "webuser" could be a "Group : webuser" or a "User : webuser". Therefore you have to go back to your realm and add whatever fits your plans:
a) a user with the name "webuser", or
b) a group with the name "webuser" (and don't forget to assign some users to the group).

And after this, you are done and should see the welcome screen:



If you like, you can download the sample ear file for your configuration tests.

Thursday, February 12, 2009

best free rich text editors

12:15 Thursday, February 12, 2009 Posted by Test No comments:
Found this post from Antonio Lupetti. He lists the best rich text editors ready to use in web projects, with some comments and hints on how to implement them on pages using a few lines of HTML code.
It is quite easy to use one of them within your software product. But be aware of the license.


  1. Yahoo! UI Library: Rich Text Editor
     Offered under a BSD license

  2. Free Rich Text Editor
     Offered under a Creative Commons Attribution 2.5 Generic license

  3. TinyMCE
     Offered under an LGPL license, also known as the GNU Lesser General Public License, formerly the GNU Library General Public License

  4. FCKeditor
     Offered under the GPL, LGPL and MPL open source licenses; a Closed Distribution License (CDL) is also available

  5. Xinha
     Offered under the original htmlarea licence, which is essentially a BSD-style licence


Wednesday, February 11, 2009

XMind 3 for Eclipse 3.4 - Open Source, Open Storm!

10:28 Wednesday, February 11, 2009 Posted by Test No comments:
Are you working with mind maps? Are you developing software with Eclipse? Then you should have a look at XMind 3 for Eclipse.
Simply install the plugin from the official plugin site:
http://www.xmind.net/xmind/updates/xmind3_for_eclipse/
and you are done. For a more detailed explanation of how to use XMind, have a look at the help section on the vendor's homepage.

The plugin is available under two licenses: the Eclipse Public License v1.0 (EPL)
and the GNU Lesser General Public License v3 (LGPL).
If you are wondering how and why such a great tool was released as open source, the answer is simple: you can buy an XMind Pro version, too.
It contains additional features, among them additional export filters for MindJet MindManager, Microsoft Word, Microsoft PowerPoint, PDF and RTF.

Oracle WebLogic Server on Amazon's EC2

05:41 Wednesday, February 11, 2009 Posted by Test No comments:
Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides resizable compute capacity in the cloud. It is designed to make web-scale computing easier for developers.
Amazon EC2’s simple web service interface allows you to obtain and configure capacity with minimal friction. It provides you with complete control of your computing resources and lets you run on Amazon’s proven computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change.

You can now find two new Amazon Machine Images (AMIs) for Oracle WebLogic Server listed in Amazon's public AMI repository.
In a matter of minutes, you can have a fresh, fully functioning WebLogic Server environment up and running, ready to host your JEE applications.
On the Oracle Technology Network there is a separate Cloud Computing Center (you find it under Middleware). There you can find a guide (PDF) to Oracle and WebLogic in the cloud, along with some excellent instructions for getting started with the provided AMIs.
For more detailed information, have a look at the Amazon Elastic Compute Cloud Developer Guide.

There are two AMIs with WebLogic available on EC2: 32- and 64-bit versions of WebLogic Server 10.3.0.0 with JRockit JDK 6.0 R27.6 (Java version 1.6.0_05) installed on Oracle Enterprise Linux 5 Update 2 JeOS-1.0.1.
Keep in mind that this WLS installation does not include the samples, web server plugins and the Workshop components.
More information about licensing, Oracle in the cloud and more general questions can be found in the Oracle Cloud Computing FAQ.

Tuesday, February 10, 2009

Using HTML and HTTP Compression with Weblogic Server

07:06 Tuesday, February 10, 2009 Posted by Test No comments:
Have you ever wondered about the extra spaces in the HTML output of your JSP pages or servlets? Everyone who has dealt with web design for more or less old browsers probably has :)
Besides the formatting, this can also have an impact on overall application performance if you are delivering big pages.
The first thing that came to my mind was a gzip-compressed HTTP response. Apache 2.0 comes with mod_deflate, which adds a filter to gzip the content. Filters can be blanket (for example, everything sent to Internet Explorer is compressed) or selective, compressing only specific MIME types (determined by examining the generated headers, whether produced automatically by Apache or by a CGI or other dynamic component).
To enable blanket compression, apply the SetOutputFilter directive to a web site or Directory container, for example:

<Directory "/your/path/to/htmlroot">
SetOutputFilter Deflate
</Directory>

You can take advantage of this with WLS if you use the WLS Apache plugin, too. Used in combination, you get a gzipped HTTP stream, which reduces the amount of data transferred to the clients.

If you don't have an Apache in front of your server, you can still install a servlet filter that does the gzip compression of the content for you.
There is a quick sample available from BEA (download: http://ftpna2.bea.com/pub/downloads/Gzipfilter_war.zip). Simply add the weblogicx-gzip.jar included in this distribution to your WAR's WEB-INF/lib directory. Register the gzip filter in your web.xml as shown below:

<filter>
<filter-name>GZIPFilter</filter-name>
<filter-class>weblogicx.servlet.gzip.filter.GZIPFilter</filter-class>
</filter>

After this, you have to map all resources that can benefit from compression, such as .txt, .log, .html and .htm, in a filter mapping. You can also use the filter to compress output from JSPs and other dynamic content. Compressing certain image types typically does not pay off, since they are already compressed, so make sure they are not mapped to the gzip filter.

<filter-mapping>
<filter-name>GZIPFilter</filter-name>
<url-pattern>*.html</url-pattern>
</filter-mapping>

Of course there are plenty of other solutions out there. You could even write your own gzip filter; find the right solution for you. This should not be a production solution anyway: it consumes too many server resources and is something mod_deflate can do much more efficiently. But it can be handy for tests and debugging.
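Just to give an idea of what such a hand-written filter could look like, here is a rough sketch. The class name is made up, only getOutputStream() is wrapped (getWriter(), Content-Length handling and double-wrapping guards are left out), so take it as a starting point for experiments rather than a replacement for the BEA sample or mod_deflate.

import java.io.IOException;
import java.util.zip.GZIPOutputStream;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletOutputStream;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpServletResponseWrapper;

public class SimpleGzipFilter implements Filter {

    public void init(FilterConfig filterConfig) throws ServletException {
        // nothing to configure in this sketch
    }

    public void destroy() {
    }

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        String acceptEncoding = request.getHeader("Accept-Encoding");
        if (acceptEncoding == null || acceptEncoding.indexOf("gzip") < 0) {
            // the client cannot handle gzip: pass the request through unchanged
            chain.doFilter(req, res);
            return;
        }

        response.setHeader("Content-Encoding", "gzip");
        final GZIPOutputStream gzip = new GZIPOutputStream(response.getOutputStream());

        // route everything written to the output stream through the GZIPOutputStream
        HttpServletResponseWrapper wrapper = new HttpServletResponseWrapper(response) {
            private final ServletOutputStream zipped = new ServletOutputStream() {
                public void write(int b) throws IOException {
                    gzip.write(b);
                }
            };

            public ServletOutputStream getOutputStream() {
                return zipped;
            }
        };

        chain.doFilter(req, wrapper);
        gzip.finish(); // writes the gzip trailer
    }
}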

To optimize the output of your HTML code, you can use a newly added feature of the WebLogic Server (>= 10.x) called HTML template compression. To use it, you simply have to add the following to your weblogic.xml:

<weblogic-web-app>
<jsp-descriptor>
<compress-html-template>
true
</compress-html-template>
</jsp-descriptor>
</weblogic-web-app>

This removes any extra whitespace from the generated HTML output. For example:

<html>
<body>
<text>
</text>
</body>
</html>

will be rewritten as:
<html><body><text></text></body></html>

Weblogic Server - quick(er) development roundtrips - fast-swap

06:34 Tuesday, February 10, 2009 Posted by Test 1 comment:
, , , ,
This is something that can eat up time in any bigger project. If you make changes to your working application while you need to track down bugs, you inevitably have to redeploy the application on every change. This is done in seconds if you
a) know the WLS administration and/or
b) have a quite lean application.

If you have a full-blown JEE application with all the magic inside, you can easily end up waiting minutes for a successful redeployment.
In practice there are some ways to get rid of this overhead very quickly.

1) Weblogic's ChangeAwareClassLoader (WLS <=10.x)
Java classloaders do not have any standard mechanism to undeploy or unload a set of classes, nor can they load new versions of classes. In order to make updates to classes in a running virtual machine, the classloader that loaded the changed classes must be replaced with a new classloader. When a classloader is replaced, all classes that were loaded from that classloader (or any classloaders that are offspring of that classloader) must be reloaded. Any instances of these classes must be re-instantiated. If you deploy an exploded application and run the WLS in development mode, you can take advantage of the ChangeAwareClassLoader. If you make changes to a single class, the WLS simply replaces the whole classloader and starts over with your newly created class. This was one of the first approaches to quicker development roundtrips. However, it is still not a solution for bigger applications.
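To illustrate the underlying principle with plain JDK means (this is not WLS internals; the class name and directory are made up, the point is only why a "reload" always means a new classloader and new instances):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class ClassReloadDemo {

    public static void main(String[] args) throws Exception {
        // assumption: an exploded deployment with compiled classes in this directory
        URL classesDir = new File("/path/to/exploded-app/WEB-INF/classes").toURI().toURL();

        // first "deployment": one classloader defines the class and creates the instances
        URLClassLoader loaderV1 = new URLClassLoader(new URL[] { classesDir }, null);
        Object beanV1 = loaderV1.loadClass("com.example.MyBean").newInstance();

        // ... recompile com.example.MyBean on disk ...

        // a classloader cannot redefine a class it has already loaded, so the "reload"
        // is a brand new classloader, and all instances have to be created again
        URLClassLoader loaderV2 = new URLClassLoader(new URL[] { classesDir }, null);
        Object beanV2 = loaderV2.loadClass("com.example.MyBean").newInstance();

        // same class name, but two distinct runtime classes with different defining loaders
        System.out.println(beanV1.getClass() == beanV2.getClass()); // prints false
    }
}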

2) Weblogic Fast-Swap (WLS >=10.x)
Java EE 5 introduces the ability to redefine a class at runtime without dropping its ClassLoader or abandoning existing instances. This allows containers to reload altered classes without disturbing running applications, vastly speeding up iterative development cycles and improving the overall development and testing experiences. The usefulness of the Java EE dynamic class redefinition is severely curtailed, however, by the restriction that the shape of the class – its declared fields and methods – cannot change. The purpose of FastSwap is to remove this restriction in WLS, allowing the dynamic redefinition of classes with new shapes to facilitate iterative development.

With FastSwap, Java classes are redefined in-place without reloading the ClassLoader, thereby having the decided advantage of fast turnaround times. This means that you do not have to wait for an application to redeploy and then navigate back to wherever you were in the Web page flow. Instead, you can make your changes, auto compile, and then see the effects immediately.


  • FastSwap is only supported when WLS is running in development mode. It is automatically disabled in production mode.

  • Only changes to class files in exploded directories are supported. Modifications to class files in archived applications, as well as archived JAR files appearing in the application’s classpath are not supported.



To enable FastSwap in your application, add <fast-swap>true</fast-swap> to the weblogic-application.xml file.

FastSwap can also be enabled for a standalone web-application by adding the <fast-swap> element to the weblogic.xml file.

Once FastSwap is enabled at the descriptor level, an appropriate ClassLoader is instantiated when the application is deployed to WLS. It is recommended that you set your IDE to "compile on save" so that Java files are compiled on saving. The FastSwap agent tries to find all classes that have been modified since the last iteration by looking at all directories in the classpath.

FastSwap is supported with POJOs (JARs), Web applications (WARs) and enterprise applications (EARs) deployed in an exploded format. FastSwap is not supported with resource adapters (RARs).


    Supported changes:

  • Addition and Removal of static methods

  • Addition and Removal of instance methods

  • Changes to static and instance method bodies

  • Addition and Removal of static fields

  • Addition and Removal of instance fields




    Limitations:

  • Java Reflection results do not include newly added fields and methods and include removed fields and methods. As a result, use of the reflection API on the modified classes can result in undesired behavior.

  • Changing the hierarchy of an already existing class is not supported by FastSwap.
    For example, either a) changing the list of implemented interfaces of a class; or b) changing the superclass of a class, is not supported.

  • Addition or removal of Java annotations is not supported by FastSwap, since this is tied to the reflection changes mentioned above.

  • Addition or removal of methods on EJB Interfaces is not supported by FastSwap since an EJB Compilation step is required to reflect the changes at runtime.

  • Addition or removal of constants from Enums is not supported.

  • Addition or removal of the finalize method is not supported.



When FastSwap is enabled, after you recompile a class, FastSwap attempts to redefine classes in existing classloaders. If redefinition fails because your changes fall outside the scope of supported FastSwap changes, the JVM throws an UnsupportedOperationException in the WLS window and in the server log file. Your application will not reflect the changes, but will continue to run.

Wednesday, February 4, 2009

Heartstorming instead of Brainstorming!

07:07 Wednesday, February 4, 2009 Posted by Test No comments:
I am a more or less frequent reader of the Daily Dueck.
This is a column published by Prof. Dr. Gunter Dueck. He is an IBM Distinguished Engineer, an IEEE Fellow, a member of the IBM Academy of Technology,
corresponding member of the Göttingen Academy of Sciences and a member of the board of
the DMV (German Society of Mathematicians). He authored some satirical-philosophical
books on humans, management and life (Wild Duck, E-Man, Omnisophie, Supramanie,
Topothesie – on humane keeping of humans).

This month's Daily Dueck (no. 83) settles the score with a common moderation technique: brainstorming. The Daily Dueck is published in German only. If you have to miss these perfectly provocative writings because of that, you should probably try to learn German :) It is even worth the effort ;)

Back to brainstorming. In short, Gunter Dueck proposes to switch to heartstorming. He says ideas arise when the heart is reaching for something, when the heart stands at the ocean and looks out into the distance. Ideas are burning flames reaching out for something.
Brainstorming happens under pressure, and that is something that definitely does not support an idea or a burning heart.

If you come across a burning heart during such a session, try to support it. If someone is willing to put his blood, his heart into something, you should be the one helping him out. "Enablement instead of tracking" is the key.

Tuesday, February 3, 2009

Talking to SAP - JEE integration with SAP

11:44 Tuesday, February 3, 2009 Posted by Test No comments:
This is an old article, written back in 2006 but never published. I think it's still worth reading, and apart from the fact that Oracle has since taken over BEA and some product versions have moved on, it's still up to date.

Today, hardly a single company can do without systems from SAP AG [1]. SAP: for years, these three letters have stood for business applications and effective company processes. However, despite this wide distribution, there are still not enough classic developers who feel comfortable enough in the SAP world to build integration solutions themselves. The need for communication between the worlds is becoming increasingly important, which means that it is time to take a look at the different possibilities available and to find ways to support the interaction between the two worlds.


Introduction


When speaking to developers from a wide range of backgrounds about SAP, you see a wide variety of reactions. Whether excited head-nodding or open rejection of ABAP, BAPIs, IDocs, etc., not a single expression is missing. The majority of the developer community is quite content to consider SAP a black box and to merely investigate how to get around it.
It is not a surprise that non-SAP developer communities have a difficult time with the topic. There is hardly a range of products that has the same exposure and has still been able to remain so opaque. Especially the technical aspects of the SAP products are still mostly handled by a select group of specialists. However, this fact alone does not discourage companies from installing products from the wide range of wonderful and clever abbreviations, such as SD (Sales & Distribution), FI (Finance), CO (Controlling), HR (Human Resources), etc. The main reason, though, is the increasing cost pressure. The desire to optimize relevant business processes using standard products and to uniformly support them through IT usually gets IT departments moving. This approach is completely comprehensible: optimization and standardization save costs, further costs can be saved by employing certified employees instead of training existing ones, and the guaranteed support from the manufacturer is an added bonus.
What is quickly forgotten is the fact that such problem-free packages do have their price. In addition to implementation and company-specific customizing, there are also considerable license fees. But these are only a few reasons why companies are tending to portray only the business-critical basics in their standard products (financial accounting, human resource management, etc.). IT usually takes care of the remaining processes using older individual developments, which have expanded over the years and which take some getting used to. These are usually based on Java, Microsoft or other platforms. What initially looks like a feasible interaction actually creates a lot of questions upon closer examination. Many individual developments could potentially use interfaces to SAP, since they can access the same dataset or even functions portrayed in SAP. Thus, it would be rather profitable to connect SAP to the rest of the world.



Developers and Consultants

This is where the history of SAP software in companies becomes interesting for non-SAP developers as well. Both partners have to sit down and find solutions for questions such as integration, access and data exchange. However, both sides tend to exaggerate in the reports from such discussions. What is sure is that it usually does not take long until this Babylonian language barrier prevents any effective cooperation at all.

When taking a close look at a customer’s different views of software, their development processes or related implementation modules, it can clearly be seen that the SAP products do differ greatly from products made by other manufacturers.  To put it another way, SAP’s product portfolio contains large blocks of products for optimizing and standardizing almost all company processes. Starting with cross-industry topics, such as human resource management and financial accounting to specific solutions for say health care or the automobile industry…almost every industry is represented. Unlike other products, where programmers work with a program until it fits, no programming is performed in the SAP product. Instead, adjustments are made until the product fits the company processes and situation. This adjustment process is not referred to as “development”, but as “customizing”, i.e. customer-specific adjustments. Which is why it is also important that qualified and certified consultants make the adjustments and not developers.


Consultants speak the language of their product and process. This seems to contradict the work of classic developers, who focus on interfaces, programming languages and process environments. This is a classic process-versus-technology vocabulary clash. What makes it even more difficult is that SAP's varied technology uses its own methods. Although consultants speak about technology as well, they are usually referring to proprietary technologies. With so many differences, it is not surprising that initiating a successful interaction does indeed require some work.
Commonalities


Since the fundamental aspects seem to differ so greatly, it is no wonder that the two worlds do not seem to migrate toward each other and that one world does not know much about the other. To create successful integration solutions between SAP and the rest of the world, a step will first have to be taken in the other direction. Many developers are aware of the abbreviation R/3. R/3 is a company information system (also referred to as ERP - Enterprise Resource Planning). R/3 has been used since about 1993 and is a client/server system. The R stands for realtime and the 3 refers to the 3 layers which make up such a system (a database layer, application layer and presentation layer). R/2, the previous version, was designed for operation on mainframe systems. The direct successor of R/3 is mySAP ERP. NetWeaver was presented as the successor at the beginning of 2003. With the R/3 systems, data is stored in standard relational databases, yet the overall business processing is done in an application server. The application server executes the programs written in the proprietary language called ABAP. Quite an expansion was seen in release 4.7 of the SAP R/3 Enterprise. As SAP's official answer to the boom in internet technologies, it was named the SAP Web Application Server (WebAS) in the summer of 2000. Even the initial versions, 6.10 and 6.20, already had their own HTTP and Web service interfaces for the ABAP runtime. A parallel Enterprise Java runtime appeared in the application server in version 6.30 after SAP took over the In-Q-My company. This part of the application server is J2EE 1.3 conformant and enables Enterprise Java programming. Whereas the technical platform used to be somewhat neglected, it was now receiving increased attention in the SAP product portfolio. The current WebAS 6.40 is the basic process environment for almost all SAP products and is delivered with NetWeaver 2004. Whoever signs up on SAP's [2] free developer site is also given the opportunity to download and try a few preview versions of the technical basis. In addition to the current NetWeaver 2004 SP16 version, it also provides the successor version NetWeaver 2004s SP7 and a preview of NetWeaver 2007, with limited functionality.





Differences

In other words, two completely separate hearts beat in WebAS: the ABAP engine for executing ABAP code and the so-called J2EE engine. WebAS can be installed with either one or both engines. Except for the fact that the application server has two different runtimes, the concept as such is not that foreign.

Unfortunately, one does have to admit that the similarities end there - at least when ignoring the J2EE side and focusing on the ABAP process environment. Still, non-SAP developers are faced with a myriad of foreign abbreviations when working in an SAP application environment. The ABAP engine is composed of different parts (incl. lock server, updating processes, spool processes, etc.). These processes can be distributed to different engines if necessary. In the simplest case, all parts run on a single application server. This kind of installation is referred to as a central instance. Whereas this arrangement is extremely efficient for smaller scenarios, it does have to be distributed to numerous physical systems for larger installations. A few components (especially the lock and updating processes) can only exist once per system. These are also managed under the name of "central services". The parts responsible for actually running programs can be freely scaled and distributed. The term "dialog instances" has been established for them.

The abbreviation ABAP originally stood for the German expression "Allgemeiner Berichts-Aufbereitungs-Prozessor" (translation: General Report Formatting Processor). The language was mostly used to program evaluations and did not directly make changes in a database. During later developments in the language, and based on the internationalization of the products, the abbreviation was redefined as "Advanced Business Application Programming" in 1993. No fixed scope has been defined for the language, and it has been continuously expanded in the past. As a 4th-generation language, it was specifically developed for processing mass data in commercial applications. The business logic is realized using ABAP functions, ABAP classes and semantic ABAP business objects. It can be programmed functionally or in an object-oriented way. Data is persisted using the SAP standard ABAP OpenSQL or using NativeSQL or other batch processing functions (batch input, direct input, call transaction). A presentation server is used for the formatting and for display to the user. The client includes the SAP GUI, a native Windows application. The Internet Transaction Server (ITS) can be used to access the SAP GUI functions using a standard browser. The server side of all these display variants is based on the Dynpro presentation technology. The processing of Business Server Pages (BSP) became available with WebAS 6.20. These are comparable to JSP pages, except they are programmed in ABAP. A wide variety of communication interfaces are used to communicate within and outside the SAP world. The most common are RFC, ALE, BAPI, HTTP and Web services. Whereas HTTP and Web services are well known, the others are not.

RFC is the abbreviation for "Remote Function Call". This technology is used to call ABAP functions or function modules across system boundaries and to exchange data. RFC is based on CPI-C, an IBM development, and has been available since release R/2. Basically, RFC is a communication protocol for exchanging messages. RFC is categorized into different specific types for better understanding. These include sRFC (synchronous RFC), aRFC (asynchronous RFC), tRFC (transactional RFC) and qRFC (queued RFC). Technically, this could be considered an RPC (Remote Procedure Call).

The standard technology for realizing distributed systems at SAP is ALE technology. ALE stands for “Application Link Enabling”. It was specifically developed for exchanging data between business systems using the IDoc document format based on RFC. In doing so, ALE ensures consistent and reliable message delivery across system limits. In other words, ALE provides a routing and workflow level for transferring messages.

BAPI stands for “Business Application Programming Interfaces”. As a standardized API, it enables external access to functionalities in the SAP business objects. In the ABAP world, business objects portray real objects, such as an employee or a customer order. They are complete encapsulations containing data related to the object as well as business processes, and can thus be efficiently embedded in the actual structural and implementation details of the physical data. One clear example is the combination of business objects and BAPI. Upon close examination of the SAP business object “Material”, one can see that it includes a check for the availability of the material, etc. These checks are defined virtually as methods and take on the form of a BAPI with the name “Material.CheckAvailability”. Technically, the BAPI is a specific implementation of function modules, which have to meet corresponding design guidelines. Thus, they cannot contain dialogs, have to be RFC-capable and have to deliver a suitable return value. In other words, a BAPI can be considered an implementation guideline. More details can be found in [3].

In order to transfer the somewhat complex data structures to the SAP world and externally, the possibilities are not limited to transferring simple values. With the Intermediate Document (IDoc) SAP has defined a separate data exchange format. It is message-based and asynchronous. In it, defined and effective message types (delivery notes, orders) are assigned to the so-called IDoc types. This means, that an IDoc contains data structures as well as processing logic. It can also be used in R/3 systems, e.g. for transferring data to external systems using BAPIs. 


This short examination of the SAP technology clearly shows that there is no way around function modules, RFC, BAPI or IDoc when communicating with SAP.  Even if this method is described as cross-platform and standardized, it is still not possible to use the functions offered in other worlds without a little extra work. SAP provides so-called “Connectors” to achieve this purpose. There are also many open standards and other ways to connect SAP systems from the Java world.


SAP Java Connector


The Java Connector (JCo) by SAP is probably one of the most widely known. The current version, 2.1.7, can be run on Windows, Linux, Solaris, AIX and many other platforms. SAP customers can download it free of charge from the service website [4]. JCo is actually a complete middleware for communicating with SAP R/3 systems. In addition to the simple connection to SAP systems and their pooling, it also includes numerous libraries for accessing IDocs, RFC and BAPIs. It supports synchronous and asynchronous access, whether from Java to ABAP or vice versa. Only the connection to the Java world is a little more difficult. The Java Native Interface (JNI) is used for accessing the shared libraries of the connector from Java. In other words, the native methods are not called directly: a Java proxy class exists for each class of the connector, and this is what you address from Java.
Although the connector has proven to be relatively stable, it should still be used with caution. Experience with JNI has shown that the JNI interface cannot catch all of the runtime errors on all platforms. This means that the Java Virtual Machine (VM) may simply be terminated if such an error occurs. To effectively avoid this problem, one would have to operate the JCo component on a separate JEE cluster and protect it one hundred percent against downtime.
Another alternative would be to use the Java Resource Adapter (JRA) implementation. However, it only supports synchronous RFC calls and uses a technical user when logging on to the system. This is not suitable for all cases.
JCo can also be used to enable the communication within a combined installation of the J2EE and ABAP engines. If both engines are located on the same hardware, JCo no longer uses only RFC for communicating with the ABAP stack, but also relies on an accelerated construct, the so-called FastRFC. FastRFC is an expansion of the RFC libraries. It no longer uses TCP/IP; instead it relies on a shared memory area. This means that it can only be used when the RFC client and RFC server are installed on the same engine. The programming is explained rather well in the documentation, using numerous examples. Additional support for JCo can be found on SAP's help pages [5]. People who frequently work with JCo should also take a look at some of the tools of the open source community, which provides proxy generators and Eclipse integrations [6] that simplify the use of JCo.
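To give a feeling for the programming model, here is a small sketch of a synchronous BAPI call with the classic JCo 2.x API. It is written from memory against the com.sap.mw.jco classes, the logon data is obviously made up, and the chosen BAPI (BAPI_COMPANYCODE_GETLIST) is just the usual demo candidate, so verify the details against the JCo documentation of your version.

import com.sap.mw.jco.IFunctionTemplate;
import com.sap.mw.jco.JCO;

public class JCoSketch {

    public static void main(String[] args) throws Exception {
        // placeholder logon data: client, user, password, language, host, system number
        JCO.Client client = JCO.createClient("000", "user", "secret", "EN", "sap-host", "00");
        client.connect();

        // the repository delivers the metadata of the function module
        JCO.Repository repository = new JCO.Repository("demoRepository", client);
        IFunctionTemplate template = repository.getFunctionTemplate("BAPI_COMPANYCODE_GETLIST");
        JCO.Function function = template.getFunction();

        // synchronous RFC call
        client.execute(function);

        // read the result table returned by the BAPI
        JCO.Table codes = function.getTableParameterList().getTable("COMPANYCODE_LIST");
        for (int i = 0; i < codes.getNumRows(); i++) {
            codes.setRow(i);
            System.out.println(codes.getString("COMP_CODE") + " " + codes.getString("COMP_NAME"));
        }

        client.disconnect();
    }
}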






SAP Enterprise Connector


One very comfortable possibility for using JCo is available when developing Java in SAP. Once a developer is familiar with the possibilities of the J2EE Engine and those of the SAP development environment, NetWeaver Developer Studio (NWDS), he/she can then use the SAP Enterprise Connector. It is a plug-in for NWDS and is usually integrated into the delivery, so it does not have to be installed separately. It encapsulates the complexity of the JCo and creates a separate proxy class for each called ABAP function. To do so, a small runtime component is required for the Java client. The creation of structures and copying of relevant items is taken care of by a wizard. It can also be used in NWDS to set up connections to an SAP system and to directly search for BAPIs online. Once the right function has been found, a simple click will have the wizard generate the typical Java proxies for the target function.





Webservices


SAP has offered an implementation of Web service standards since Web Application Server 6.20. So-called SOAP processors can be used to address RFC-capable function modules from ABAP and convert them to Web services. The Web service implementation on the J2EE side of the WebAS 6.20 still had to be used with caution at that time. However, with the more recent WebAS releases, the implementation of the J2EE standards has become much more stable and NWDS provides a complete J2EE-conformant development environment. Web services can be provided quite easily in this environment. Even if NWDS is still based on an older Eclipse (version 2.0), the wizards and plug-ins provided by SAP can certainly be used. Users can easily figure out how to use them and can produce quick results. However, the JCo is needed to integrate ABAP functions into the rest of the world using the J2EE engine. There is another option if this seems too complicated, which is to wrap a Web service around a BAPI directly in ABAP. To do so, the ABAP Workbench is used to create a Web service, which only takes a few simple clicks and a few configuration settings for the authentication. A suitable WSDL is created automatically as well. Unfortunately, working with the Workbench is not easy for non-SAP developers. It definitely requires support. In this case, it is best simply to find someone who can quickly take care of the matter.



Enterprise Application Integration (EAI)


The Enterprise Application Integration (EAI) can also be used instead of connectors and SAP specifications. The tendency over the past few years to use this all-purpose tool is still unbroken, even in service-oriented architectures (SOA). The basic principle of EAI is quite simple: A central or even distributable Middleware layer provides the basic functions for connecting different services. These include connectivity, routing and transformation of information or messages. The endpoints are attached using adapters. A wide variety of data can also be transported from the source systems to the target systems. Manufacturers of EAI products also offer adapters. These may have to be configured in the target landscape and the transformations or data flow may have to be defined. Such adapters do tend to be rather costly. However, so are most EAI products, which means that their use is only profitable if the landscape and the adapters are already available. Technically, there are hardly any systems which cannot be connected by an adapter. Each manufacturer takes a slightly different approach. It is understandable that they prefer expanding their own product line and their functions, and not those from other providers. Yet everyone has SAP adapters and a reliable connection can be made using an adapter from any manufacturer.



SAP NetWeaver Exchange Infrastructure (XI/PI)


SAP's method for EAI is the XI. The current version 3.0 is delivered as part of the NetWeaver platform, and the license is free when it is used for internal SAP communication. However, it is quite a different story when distributing SAP data to non-SAP systems using the XI; in this case, additional licenses are required. One does have to admit, though, that the integration using XI is excellent, especially from SAP's side. Basically, the XI communication is based on HTTP and the transferred data are described in XML. Open standards are used throughout. Like classic EAI infrastructures, XI also has adapters; SAP, however, refers to them as "proxies". In addition to ABAP and Java proxies, there are also HTTP, JDBC, IDoc, RFC, File, JMS and many others. These should cover almost all functional requirements. Yet, experts disagree when it comes to performance. One thing is certain: an additional layer usually means additional overhead.



SAP Web Dynpro


Web Dynpro is a modern variant of the classic SAP dynpros. This technology can be used in the NetWeaver Developer Studio to easily create SAP-conformant interfaces. It is a model-driven concept, which minimizes the coding effort and provides visual assistance when designing interfaces. Adaptive RFC (Adaptive Remote Function Call) is used to address SAP backend systems, enabling BAPIs to be integrated into the Web Dynpro's model-driven concept. The necessary BAPIs can be located and selected in the SAP system using a wizard. The corresponding Java proxies are then generated based on the metadata stored in the BAPIs and the data objects connected to the Web Dynpro interface elements. This process is also slightly sensitive to any changes made to the transferred data formats.
For JEE developers, the jump to NWDS and the Web Dynpro technology is a feasible undertaking. Even if they do have to get used to a few characteristics, it is still somewhat similar to known and established concepts, such as MVC, components, etc. Unfortunately, the Web Dynpro runtime libraries are only available for the SAP WebAS J2EE Engine. Thus, it cannot really be referred to as an actual integration, since most of the work is done in the SAP world and on SAP systems, even on the Java side. This scenario is a good alternative, especially when a corresponding infrastructure already exists.



SAP Enterprise Portal


The SAP Enterprise Portal (EP) is also based on the WebAS J2EE Engine Stack and is able to connect SAP backend systems. It uses a connector framework to do so. As a JCA-conform adapter, it is integrated into the J2EE Engine and assumes the communication between SAP and non-SAP applications when in the portal. Once the corresponding connectors have been integrated, the backend systems can either be addressed using a generic iView component or through the direct programming against the connector Framework API. The latter is also supported by a plug-in in NWDS. All in all, the framework not only enables the simple addressing of existing connectors, but also provides a standardized integration point for own connectors in the Enterprise Portal. A separate connector can be developed in NWDS for this purpose and deployed in the J2EE Engine. Currently, connectors are available for SAP ERP and SAP BW. JDBC is also available and can be acquired directly from SAP.



BEA/Oracle Solutions

WebLogic Portal integration


BEA WebLogic Portlets for SAP is a suite of pre-built portlets that visually integrate enterprise application data from SAP into general-purpose BEA WebLogic Portal applications. WebLogic Portal's flexible, robust infrastructure makes it easy to build new portal applications that simplify, personalize, and lower the cost of data access for customers, partners, and employees.

BEA WebLogic Portlets for SAP are built to take advantage of WebLogic Portal's capabilities. For example, administrators can use the portal to map users in the Portal to SAP users, enabling proper data security.

BEA WebLogic Portlets for SAP are supported by a powerful application connectivity architecture and framework that includes a presentation layer, processing layer, data engine layer, and database layer.

The set of included portlets map to different business objects in SAP, and can be incorporated into new or existing portal applications. The portlets are driven by a set of Java page flows that invoke an extensible processing layer and a data engine, which in turn call Java Controls to request data from SAP R/3. Data is retrieved from SAP via BAPI (Business API). The responses are processed by the data engine, and rendered by the Java page flows. Runtime data view configuration is supported using pre-packaged XSL templates.




BEA WebLogic Adapters

BEA WebLogic Adapters are easy-to-use, standards-based adapters to enterprise information systems, providing the critical "last mile" of connectivity to your applications. Designed for use with BEA WebLogic Integration, this growing portfolio of application, technology, and utility adapters conforms to the J2EE Connector Architecture specification and features enhancements that enable faster, simpler, and more robust integration of business-critical applications. The BEA WebLogic Adapter for SAP supports asynchronous, bi-directional interactions between the adapter and your SAP R/3 applications, and enables business process workflows running within BEA WebLogic Integration to transfer data to and from a SAP R/3 application. The adapter allows you to execute SAP IFR XML, IDocs, BAPI calls, or custom RFCs.



BEA WebLogic Integration


BEA WebLogic Integration converges application development and integration technologies into a single, unified platform. It delivers rapid business integration within an enterprise and empowers IT to meet dynamic business goals and new opportunities quickly, with faster time to value.




BEA AquaLogic Data Services Platform


Enable a single unified view of data from any source across the enterprise. BEA AquaLogic Data Services Platform allows data services to function as single access points for unified and consistent information - for easier data access, aggregation and updates, better data consistency, and simpler application development.





BEA AquaLogic Service Bus


BEA AquaLogic Service Bus, through its service integration and management capabilities, accelerates configuration and deployment, and simplifies management of shared services across the SOA. BEA AquaLogic Service Bus delivers intelligent message brokering, dynamic routing and transformation, all in support of heterogeneous service end-points, integrated with service lifecycle management capabilities including service registration, monitoring, and threshold-defined service level agreement enforcement.




The number of technical options for integrating SAP into the Java world is still relatively small. From a technical point of view, there are only two truly feasible solutions: Web services and Java Connectors. Anyone willing to program Java in the SAP world can also use the J2EE Engine stack, portals or even Web Dynpros. Another broad topic is how to integrate Web services: whether to retrieve the relevant data into a JEE system using JCo and then pack it into Web services, or to do so directly in the J2EE Engine or even in the ABAP stack, all depends on the target scenario and the skills available in the developer group.

Integration via EAI/SOA is much easier. In this case, the respective manufacturer usually supports integration using adapters, and the user is only responsible for data transfer and transformation. The most complete solution stack is in fact available from BEA itself.

From a pragmatic point of view, more and more ready-made integration solutions in portals or even the Office suite are becoming available for a variety of application cases. Time management and accessibility inquiries do not have to be reinvented by every company, regardless of the platforms on which they will be used. Finding a suitable, finished product is not as hard as it used to be.


Yet, the search for a single, right way is initially disappointing. Many different factors have to be considered to find it.  In addition to license fees for infrastructure, necessary hardware and expenses for finished solutions, performance and transaction security can also influence the decision. This is where experience can help you find the right solution. Especially when considering complex and major installations, there is no limit to the amount of time which can be spent searching for an optimal solution.

Additional Reading


Monday, February 2, 2009

Perf4j - Release 0.9.9 available

10:34 Monday, February 2, 2009 Posted by Test No comments:
I was playing around with perf4j recently. It is quite impressive how simply you can generate performance statistics for your application.
Even without using any of the delivered aspects, you can simply write your own interceptor and use an appropriate stopwatch wherever needed.
In my sample case, I wrote a simple interceptor. This is the most lightweight approach possible; I skipped essential things like exception handling and so on, but it should be enough to demonstrate the basics ;)


import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
import org.apache.log4j.Logger;
import org.perf4j.StopWatch;
import org.perf4j.log4j.Log4JStopWatch;

public class PerformanceLoggingInterceptor implements MethodInterceptor {

    // log4j logger the stop watch reports its timing statements to
    private final Logger logger = Logger.getLogger("org.perf4j.TimingLogger");

    public Object invoke(MethodInvocation invocation) throws Throwable {
        // new Log4JStopWatch with your own tag
        StopWatch stopWatch = new Log4JStopWatch("myperftag", logger);

        String cls = invocation.getMethod().getDeclaringClass().getName();
        String methodName = invocation.getMethod().getName();

        Object retVal = invocation.proceed();
        stopWatch.stop(cls + "." + methodName);
        return retVal;
    }
}
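
If you want to try the interceptor outside a full container, it can be wired programmatically with Spring's ProxyFactory (the MethodInterceptor above is the AOP Alliance interface Spring uses). MyService and MyServiceImpl are made-up demo types:

import org.springframework.aop.framework.ProxyFactory;

public class ProxyWiringSketch {

    // made-up service used only to demonstrate the wiring
    public interface MyService {
        void doSomething();
    }

    public static class MyServiceImpl implements MyService {
        public void doSomething() {
            // real work would happen here
        }
    }

    public static void main(String[] args) {
        // wrap the target with the performance logging interceptor from above
        ProxyFactory factory = new ProxyFactory(new MyServiceImpl());
        factory.addAdvice(new PerformanceLoggingInterceptor());
        MyService proxied = (MyService) factory.getProxy();

        // every call on the proxy now passes through the interceptor and
        // produces a perf4j timing entry in the log
        proxied.doSomething();
    }
}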



Now you need to configure the interceptor and set up perf4j within your application.
1) Use the right dependency with Maven:

<dependency>
<groupId>org.perf4j</groupId>
<artifactId>perf4j</artifactId>
<version>0.9.9</version>
<scope>compile</scope>
</dependency>

2) Change your log4j.properties to an XML-style version and add the needed appenders. The documentation on the perf4j website is a good place to start with the basic configuration.
Remember to adjust the tags that should be logged, e.g.:

<param name="TagNamesToGraph"
value="doStart,myTag,test2" />


3) If you would like to expose the statistics within your web app, you have to configure the delivered servlet in your web.xml. An example is provided on the perf4j website.

Once this is finished, you can take a look at the running system and get a chart generated by Google, which looks like this:



Since the 0.9.8 release, a bug with non-English locales has been fixed. In addition, you can now use a CSV-enabled logfile parser that generates CSV output.