Friday, December 25, 2009

My first JavaFX script

I just wrote my first JavaFX script. This script is actually a little benchmark I run when first looking at a new language. Is it rigorous and scientific? Not at all. It is a very modest benchmark that gives me a feel for the language's speed. It includes a bunch of operations typical of the kind of code that I write (string manipulations, collection manipulations, etc.). The code is not optimized to any degree and might actually run faster (or slower) if I were to use a more idiomatic style. I also use a number of Java classes. I think this is needed because it is representative of what actual code will look like.
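To give a feel for what the benchmark exercises, here is a minimal sketch of that kind of unscientific timing loop in plain Java; the operations and iteration counts are made up for illustration and are not the actual script I ran:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MiniBench {
    public static void main(String[] args) {
        long start = System.currentTimeMillis();

        // String manipulations
        StringBuilder text = new StringBuilder();
        for (int i = 0; i < 100000; i++) {
            text.append(i).append(',');
        }

        // Collection manipulations
        List<Integer> list = new ArrayList<Integer>();
        Map<String, Integer> map = new HashMap<String, Integer>();
        for (int i = 0; i < 100000; i++) {
            list.add(i);
            map.put("key" + i, i);
        }
        long sum = 0;
        for (Integer value : list) {
            sum += value + map.get("key" + value);
        }

        long elapsed = System.currentTimeMillis() - start;
        System.out.println("length=" + text.length()
                + " sum=" + sum + " elapsed=" + elapsed + " ms");
    }
}
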
So what is the result? It turns out that in this benchmark JavaFX is only about 2 times slower than Java (511 milliseconds in one test compared to 272 using Java). If you compare that to other JVM languages, that is actually quite good. Groovy, for example, runs this particular benchmark in about 2600 milliseconds (about 5 times slower than JavaFX). Of course, JavaFX is a statically typed language, so I was hoping I would get that kind of performance, and I suspect it will improve with subsequent versions. Scala, for example, runs the benchmark in about the same time as Java.
Of course, speed is not the only criterion for language selection. Groovy, for example, being a dynamic language, has a lot of characteristics that make it really appealing. However, for the kind of applications I work on I cannot afford to run 10 times slower than Java. If I can write only a small portion of an application in a language, then that language becomes much less interesting. With the kind of speed JavaFX is giving me, I know I can write most of my code in it, and that makes it really appealing. If JavaFX graphics rendering can get fast enough, then combined with its already compelling UI capabilities it becomes a clear winner in my book.

Sunday, November 1, 2009

In a previous blog post I mentioned using DbC as a technique on a project. I forgot to mention one element that is very important in DbC, and that is invariants. Since I did not use any fancy tools to work with DbC, I had to find a way to define invariants. I chose to simply define a private method called invariants and added a call to this method, in a conditional compile block, at the beginning and end of every public method. As I said in my previous discussion, this simple way of using DbC adds a bit of clutter to the code, but that is a small price to pay for the benefits of DbC.
Invariants are assertions about a class that must always hold. For a linked list, for example:

first != null || size == 0

Combined with preconditions and postconditions, the impact on code correctness is just amazing. Thinking about the assertions just gets you there faster.
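Here is a minimal sketch of the idea, transposed to Java purely for illustration (my project used conditional compilation instead; in Java the closest analog is the assert keyword, which is only active when the JVM runs with -ea). The list class and its invariant are simplified, not the actual project code:

// Sketch: a private invariants() method checked at the boundaries
// of every public method. Asserts cost nothing unless enabled (-ea).
public class SimpleLinkedList {

    private static class Node {
        int value;
        Node next;
    }

    private Node first;
    private int size;

    public void addFirst(int value) {
        assert invariants();        // check on entry

        Node node = new Node();
        node.value = value;
        node.next = first;
        first = node;
        size++;

        assert invariants();        // check on exit
    }

    // Assertions about the class that must always hold.
    private boolean invariants() {
        return first != null || size == 0;
    }
}
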
On my next project I just might use Microsoft's Contract tool. The only annoying thing is that I probably won't be able to use it with Mono.

Sunday, October 18, 2009

Finally a decent Groovy and Scala IDE

Yesterday I installed IntelliJ Community Edition on my Linux system at home. I have not completed my evaluation, but I must say that what I have seen so far looks good.
Groovy support is available "out of the box" while Scala support can be added by downloading a plug-in. I have been reading about both languages and I have even started writing little scripts at work in Groovy (utilities to extract data from text files and such). However, I must say that the lack of a good IDE was quite a hindrance. As a Java programmer used to programs like Eclipse and NetBeans, I expect a lot from my IDE. Some of the productivity gains from those new languages are not so impressive when one is used to working with a good IDE. Take for example Groovy's "def" keyword (Scala offers similar type inference with "val" and "var"). This is often presented as one of the advantages of Groovy over Java. You can replace:

StringBuilder myString = new StringBuilder();

with:

def myString = new StringBuilder()

The thing is that in Eclipse, for example, when I enter the first expression I actually type the following:

myString = new StringBuilder();

Then I just press +F1 and select "Create local variable" and the IDE adds the missing type at the beginning of the line.
Another example is the Groovy @Delegate annotation that generates delegate methods for a given member. Again in Eclipse I just right click the member and select "Generate delegate methods" from the Source menu.
Of course, in this category both Groovy and Scala offer much greater gains than what an IDE like Eclipse can provide. However, those gains have to be weighed against the loss of other IDE functionality. As a Java programmer and Eclipse user I expect a good browser for my language. A browser is an essential part of a good OOP environment; this is so true that Smalltalk development kits have always included one. It is somewhat painful to apply good OOP principles if you don't have a browser (good OO programming tends to result in more numerous small classes).
The other must, of course, is code completion. The large number of core classes and APIs makes this absolutely essential.
I think a good open source environment like IntelliJ will contribute to the adoption of both Groovy and Scala.

Sunday, October 4, 2009

Design by Contract and container classes

I had to write a special ordered linked list class for a project at work. Since I have seen a lot of examples of applying Design by Contract to this type of code, I decided to give it a try. For the DbC preconditions and postconditions I did not use anything fancy; I just wrote a class with static methods that all look like this:


public static void precondition (string description, bool assertion)
{
#if DEBUG
    if (!assertion)
    {
        throw new AssertionException (
            "Precondition error: " + description);
    }
#endif
}



When not compiled in debug mode the method body is empty, so even if you decide not to wrap the calls themselves in a conditional compilation block the overhead is very small. In my case, because the ordered linked list is used in a very performance-critical part of the program, all uses of the precondition and postcondition checks are in #if DEBUG/#endif blocks. This clutters the code, but considering the gains it is not so bad. (Microsoft has something available that is somewhat cleaner, but when I last checked you needed the Team Edition of VS to use it.)
Anyway, I found that for something like a container, DbC is really great. Along with DbC I also wrote a good suite of unit tests for my class. It turns out the DbC checks detected a few errors in my code that would have gone undetected with my initial batch of tests. The DbC failures gave me really good hints about the kinds of tests I had to add, so in the end, with the DbC checks and the updated test suite, I was really confident about my new class (I eventually got 80% coverage in my tests and I plan to write the one or two missing tests I need to get to 100%). I felt that coming up with and writing the preconditions, postconditions and invariants really helped me quickly get to a fully working solution.
I will not hesitate to use this again, despite the clutter, for any class I feel will benefit from DbC.

Sunday, September 13, 2009

Unit tests and the Layers Pattern (part 4)

Last time we looked at the layers of the OPC UA module in a little more detail. Now let's look at how the tests were structured. I was not pleased with how the diagram represented the organization of the tests, so here is a modified version:

========================================================
Interface (Java) : (Unit tests)(Unit tests)
======================================|===========|=====
Logical (Java) : Unit tests | |
======================================|===========|=====
Low-level access : | |
......................................|...........|.....
Java : | |
......................................|...........|.....
JNI (ANSI C++) : V |
..................................................|.....
C++/CLI : |
..................................................V.....
C# : Unit tests
========================================================

The unit tests are divided into the following categories:

Horizontal (single layer)


Tests of classes in the logical layer


Most are classical JUnit tests; they test class methods in an isolated manner. I also have "behavior driven" tests here that use a mock implementation of the lower layers. Because of the layered approach, the mock implementation is quite simple. It uses a Map in the background with backdoor methods to set up parameter values. The methods that fake UA method calls don't do anything except change predefined parameter values.
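Just to give an idea of the shape of that mock, here is a minimal sketch; the interface name and its methods are hypothetical and simplified, not the actual project code:

// Hypothetical, simplified version of the low-level access interface.
public interface LowLevelAccess {
    Object read(String parameterId);
    void write(String parameterId, Object value);
    void callMethod(String methodId);
}

// Mock used by the logical-layer tests: a Map plus backdoor methods.
public class MockLowLevelAccess implements LowLevelAccess {

    private final java.util.Map<String, Object> parameters =
            new java.util.HashMap<String, Object>();

    // Backdoor used by the tests to set up parameter values.
    public void setParameter(String parameterId, Object value) {
        parameters.put(parameterId, value);
    }

    public Object read(String parameterId) {
        return parameters.get(parameterId);
    }

    public void write(String parameterId, Object value) {
        parameters.put(parameterId, value);
    }

    // A faked UA method call does nothing except change a predefined value.
    public void callMethod(String methodId) {
        parameters.put(methodId + ".done", Boolean.TRUE);
    }
}
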

Tests of classes in the low-level layer


Same thing here, except that I use NUnit since this layer is written in C#. The difference is that I don't have lower layers of my own code here; the next layer down is external, namely the OPC UA framework. I was able to expand my tests here using generics and conditional compilation. I had to do this because the OPC UA framework does not use a lot of interfaces or abstractions, so I ended up having to work very hard to test some parts of the code. A lot of the tests here are for the special queue used for subscriptions.

Vertical (multiple layers)


Tests of the JNI interface


Here I use a separate DLL that does not use the C++/CLI layer. This allows me to test the JNI part of the code in isolation, so that if I have a bug I know that the problem is in the pure native C++ layer. I could have used horizontal tests here, but they would have been very limited since the code is mostly JNI mechanics.

Tests of all upper layers


These tests go from the logical layer down to the low-level layer. The low-level layer, however, is mocked, so this group of tests is mainly a test of the C++/CLI mechanics. Of course there is a small amount of redundant testing of the JNI code here. This is unavoidable, but because the JNI code is tested in isolation elsewhere it is not a problem. I know that if I have a bug here, there is a high probability that it is in the C++/CLI mechanics.

Conclusion


Structuring the code in layers makes it easier to test more of the code. Having different groups of tests makes it quick to find the source of a bug; you avoid a lot of debugging with this modular approach. You can also test more things as part of the build, because you can use mock implementations of key components and avoid having to run an actual OPC UA server on the build machine.
The code is tested with an actual OPC UA server as part of manual tests. These are JUnit tests that I run manually on my development machine and that use all real layers. Finally, system and integration tests close the loop.

Saturday, September 5, 2009

Unit tests and the Layers Pattern (part 3)

Last time we looked at the layers for my OPC UA client project without going into too much detail. For convenience I have repeated the diagram below.


============================================
Interface (Java) : (Unit tests)
======================================|=====
Logical (Java) : Unit tests
======================================|=====
Low-level access : |
......................................|.....
Java : |
......................................|.....
JNI (ANSI C++) : V
......................................|.....
C++/CLI : |
......................................V.....
C# : Unit tests
============================================


Let's describe the layers in a little more detail:

Interface


As described in part 1, this is where you define the public API for the module. In my Java code this is made up mostly of Java interfaces; in C++ I would use pure abstract classes. The interface also defines things like enums and constants that are part of the API. In my project this is in a separate group of packages (namespaces), and one could go as far as putting it in a totally separate project. Putting the interface in a separate project makes the separation between the interface and the rest of the code even more explicit, and this helps avoid the type of error where the interface is contaminated with implementation elements from other layers. In my case I kept all the Java code in the same project and it went fairly well.
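As a purely hypothetical illustration (the names are invented, not the project's actual API), the interface layer is essentially made up of declarations like this:

// Hypothetical example of what lives in the interface layer:
// Java interfaces, enums and constants, no implementation code.
package acme.acquisition.api;

public interface AnalyzerClient {

    enum ConnectionState { DISCONNECTED, CONNECTING, CONNECTED }

    int DEFAULT_TIMEOUT_MILLIS = 5000;

    void connect(String endpointUrl);

    ConnectionState getConnectionState();

    double readMeasurement(String parameterId);
}
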

Logical


The logical layer is the part that uses the low-level access layer to implement actual business logic. Things like:

if (parameterX.value == aSpecificValue)
{
    // Do something
}
else
{
    // Do something else, e.g. call a UA MethodY()
}

In this layer I actually have a state machine that switches state and takes different actions based on parameter values. The logical layer uses other sublayers (configuration persistence, ...) but we won't go into those details here because it would make things too complicated. This layer contains a good number of unit tests (horizontal).
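As a rough, hypothetical sketch of the shape of that state machine (the states, parameter names and the low-level access interface here are all invented for illustration):

// Hypothetical sketch of an enum-based state machine driven by parameter values.
public class AcquisitionStateMachine {

    enum State { IDLE, MEASURING, ERROR }

    // Simplified, hypothetical view of the low-level access layer.
    public interface LowLevelAccess {
        void callMethod(String methodId);
    }

    private State state = State.IDLE;
    private final LowLevelAccess access;

    public AcquisitionStateMachine(LowLevelAccess access) {
        this.access = access;
    }

    // Called whenever a new parameter value arrives.
    public void onParameterUpdate(String parameterId, int value) {
        switch (state) {
            case IDLE:
                if ("status".equals(parameterId) && value == 1) {
                    access.callMethod("StartMeasurement");  // a UA method call
                    state = State.MEASURING;
                }
                break;
            case MEASURING:
                if ("status".equals(parameterId) && value == 2) {
                    state = State.ERROR;
                }
                break;
            case ERROR:
                // Wait for an operator reset, for example.
                break;
        }
    }
}
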

Low-level access


This layer defines an interface of its own. In my case this interface is not visible from outside the module. It defines the following methods:

  • read one or more parameters

  • write one or more parameters

  • call UA methods

  • subscribe for update notification for one or more parameters

  • fetch data updated through the subscription mechanism


The only code in this layer is the code necessary to use the UA framework to perform the tasks listed above. In fact, the only part that contains more complicated logic is the part that manages the subscriptions, and this is mainly a kind of smart queue mechanism. This code accounts for most of the unit tests located directly in the layer (horizontal).
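As a hypothetical sketch (simplified names and signatures, not the actual project code), the Java side of that internal interface could look something like this:

// Hypothetical, simplified sketch of the internal low-level access interface.
import java.util.List;
import java.util.Map;

public interface ParameterAccess {

    // Read one or more parameters.
    Map<String, Object> read(List<String> parameterIds);

    // Write one or more parameters.
    void write(Map<String, Object> values);

    // Call a UA method.
    void callMethod(String methodId, Object... arguments);

    // Subscribe for update notifications for one or more parameters.
    void subscribe(List<String> parameterIds);

    // Fetch data updated through the subscription mechanism.
    Map<String, Object> fetchUpdates();
}
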

Parting comments


I want to emphasize that, except for the sublayers in the low-level access layer, the layers have nothing to do with the use of different languages. The same layers would have been present in an all-Java module. In other words, if a Java OPC UA framework had been available in a sufficiently advanced state for my project, the layers would have been the same.
Next time we will keep exploring the layers and how the unit tests were structured.

Sunday, August 30, 2009

Unit tests and the Layers Pattern (part 2)

I have used the Layers Pattern on my last project. This was an OPC UA client module for data acquisition in an industrial continuous data acquisition software suite. This suite of applications was already using a proprietary plug-in framework to allow the integration of different analyzers. The idea was to take the generic approach one step further and use a standard technology (OPC UA) for interfacing with analyzers.
On this project, OPC UA can be seen as providing the low-level data access module. Using it, the program could read and write parameters synchronously or use a subscription mechanism to get notifications when parameters were updated. Starting the project, I also faced the problem of not having much documentation available and not much in terms of sample code. It turned out that the only source of significant code examples was the C# framework, which is a problem for a Java application. While I started the project thinking that I would use an ANSI C library and JNI, I ended up having to use JNI, C++/CLI and a C# framework. In the end I had something like the following layers for my new module:


============================================
Interface (Java) : (Unit tests)
======================================|=====
Logical (Java) : Unit tests
======================================|=====
Low-level access : |
......................................|.....
Java : |
......................................|.....
JNI (ANSI C++) : |
......................................|.....
C++/CLI : |
......................................V.....
C# : Unit tests
============================================


You can see that the low-level layer contains four sub-layers, one for each technology involved. Each of these sub-layers is fairly simple except for the C# sub-layer, because of the need to support the subscription mechanism. The diagram above also shows how the unit tests are distributed: some tests are restricted to a single layer (horizontal) and some tests span all layers (vertical).
Next time we will look more closely at each layer and at how the unit tests are structured, in more detail.