Monday, April 23, 2018

The Curious Case of Tracing Java Serialization NotSerializableException

I say curious, because Java Serialization has been around for a very long time, and yet I could not find any tools to help track down exactly where the NotSerializableExceptions were coming from.

Yes, of course, the stack trace tells you which class has caused the offence, but even in a mildly complex object graph, the source of the error can be tough to find.

So in the end, I wrote SerializationTracer, which walks through the object graph being serialised and reports which objects serialise successfully and which fail.  Null fields and empty collections are assessed with some static analysis.

The output looks like this:

UIScope -> SerializationResult(outcome=FAIL, info=java.io.NotSerializableException: com.google.inject.Key)
UIScope.cache -> SerializationResult(outcome=FAIL, info=java.io.NotSerializableException: com.google.inject.Key)
UIScope.cache.comparator -> SerializationResult(outcome=NULL_FAILED_STATIC_ANALYSIS, info=Comparator is NOT Serializable. ? super K is NOT Serializable.)

The first part of each line points to the exact field causing the problem, using an object 'path' - pretty trivial in the example above, but of course that could extend a long way down into a deep object graph.
It was designed with testing in mind, but we are now thinking of putting it into 'production' - that is, trapping a failed session serialisation, and using SerializationTracer to give us a more coherent understanding of the source of failure.
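
The core idea is simple enough to sketch.  The outline below is illustrative only - it is not the SerializationTracer code, the names are made up, and a real implementation also has to cope with superclass fields, static fields and cycles in the graph - but it shows the basic technique: walk the fields reflectively, try to serialise each value on its own, and build up the 'path' as you go:

 import java.io.ByteArrayOutputStream;
 import java.io.IOException;
 import java.io.ObjectOutputStream;
 import java.lang.reflect.Field;

 // illustrative sketch only - not the real SerializationTracer implementation
 public class SimpleSerializationWalker {

   // try to serialise each field of the target, printing an object 'path' with the outcome
   public void trace(Object target, String path) throws IllegalAccessException {
     for (Field field : target.getClass().getDeclaredFields()) {
       field.setAccessible(true);
       Object value = field.get(target);
       String fieldPath = path + "." + field.getName();
       if (value == null) {
         System.out.println(fieldPath + " -> null (static analysis needed)");
       } else if (serialises(value)) {
         System.out.println(fieldPath + " -> PASS");
       } else {
         System.out.println(fieldPath + " -> FAIL");
         trace(value, fieldPath);       // recurse to find the offending field further down
       }
     }
   }

   // a value 'serialises' if writing it to an ObjectOutputStream does not throw
   private boolean serialises(Object value) {
     try (ObjectOutputStream oos = new ObjectOutputStream(new ByteArrayOutputStream())) {
       oos.writeObject(value);
       return true;
     } catch (IOException e) {
       return false;
     }
   }
 }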

Thursday, February 2, 2017

RestEasy with embedded Undertow

Keeping it really easy

There is pretty good documentation for RestEasy and Undertow, but piecing together the Gradle dependencies for that combination eluded me for a while.

So very simply, this is it ... the Gradle dependencies, and a test (the test is translated to Spock from the example in the RestEasy docs)

build.gradle dependencies:

   compile 'org.jboss.resteasy:resteasy-jaxrs:3.1.0.Final'
   compile 'org.jboss.resteasy:resteasy-undertow:3.1.0.Final'
   compile 'io.undertow:undertow-core:1.4.8.Final'
   compile 'io.undertow:undertow-servlet:1.4.8.Final'
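
The test below is a Spock specification, so the test classpath also needs Groovy and Spock.  The exact versions here are an assumption on my part - use whatever matches the Groovy version you are on:

   testCompile 'org.codehaus.groovy:groovy-all:2.4.7'
   testCompile 'org.spockframework:spock-core:1.0-groovy-2.4'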

the test

 import io.undertow.servlet.api.DeploymentInfo
 import org.jboss.resteasy.plugins.server.undertow.UndertowJaxrsServer
 import org.jboss.resteasy.test.TestPortProvider
 import spock.lang.Specification

 import javax.ws.rs.ApplicationPath
 import javax.ws.rs.GET
 import javax.ws.rs.Path
 import javax.ws.rs.Produces
 import javax.ws.rs.client.Client
 import javax.ws.rs.client.ClientBuilder
 import javax.ws.rs.core.Application

 class MyAppTest extends Specification {
   
   @Path("/test")  
   public static class Resource  
   {  
     @GET  
     @Produces("text/plain")  
     public String get()  
     {  
       return "hello world";  
     }  
   }  
   
   @ApplicationPath("/base")  
   public static class MyApp extends Application  
   {  
     @Override  
     public Set<Class<?>> getClasses()  
     {  
       HashSet<Class<?>> classes = new HashSet<Class<?>>();  
       classes.add(Resource.class);  
       return classes;  
     }  
   }  
   
   static UndertowJaxrsServer server  
   
   Client client  
   
   def setupSpec() {  
     server = new UndertowJaxrsServer().start();  
   }  
   
   def setup() {  
     client = ClientBuilder.newClient();  
   }  
   
   def cleanup() {  
     client.close();  
   }  
   
   def cleanupSpec() {  
     server.stop();  
   }  
   
   def "testApplicationPath"() {  
     given:  
     server.deploy(MyApp.class);  
   
     when:  
     String val = client.target(TestPortProvider.generateURL("/base/test")).request().get(String.class);  
   
     then:  
     val == "hello world"  
   }  
   
   def "testApplicationContext"() {  
     given:  
     server.deploy(MyApp.class, "/root");  
   
     when:  
     String val = client.target(TestPortProvider.generateURL("/root/test"))  
         .request().get(String.class);  
   
     then:  
     val == "hello world"  
   }  
   
   def "testDeploymentInfo"() {  
     given:  
     DeploymentInfo di = server.undertowDeployment(MyApp.class);  
     di.setContextPath("/di");  
     di.setDeploymentName("DI");  
     server.deploy(di);  
   
     when:  
     String val = client.target(TestPortProvider.generateURL("/di/base/test"))  
         .request().get(String.class);  
   
     then:  
     val == "hello world"  
   }  
 }  
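
Outside of a test, the same two calls are all that is needed to bring the server up.  A minimal sketch - the Main class is mine, and it assumes the MyApp application class shown above is on the classpath and relies on the same default port:

 import org.jboss.resteasy.plugins.server.undertow.UndertowJaxrsServer;

 // illustrative only - runs the same MyApp application outside of a test
 public class Main {
   public static void main(String[] args) {
     UndertowJaxrsServer server = new UndertowJaxrsServer().start();
     server.deploy(MyApp.class);
     // stop the embedded server cleanly when the JVM exits
     Runtime.getRuntime().addShutdownHook(new Thread(server::stop));
   }
 }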

Saturday, March 12, 2016

Spock the Difference

Back to Testing


It is quite a while since I blogged about testing (or about anything else ...), but my last post on the subject concluded that AssertJ was the best option, for me at least.

I was using AssertJ with Mockito and the Mycila runner to support Guice. I was generally happy with that combination, although there were one or two odd syntactical problems in the transition from Java 7 to 8.

And then there was [Spock](http://spockframework.github.io/spock/docs/1.0/index.html).

First thoughts

I'm not even sure why I tried Spock in the first place - I didn't think there was anything especially wrong with the libraries I was using.

I also have to admit I am not a huge fan of Groovy in itself, although I am using Gradle (with its Groovy-based build DSL) more and more.  The power of Groovy, though, is when it is used to provide a well-designed DSL.

Spock demonstrates this really well, so I gave it a try ... this post is a quick overview of my experience, based on several months of use.  I am not trying to cover all of Spock's functionality here, but I can say that I have not found anything yet which the AssertJ/Mockito combination could do that Spock cannot do.

The "classic" approach

Using the AssertJ/Mockito approach, a trivial test might look like this

 @Test
 public void doTest() {
 
   //given
   objectUnderTest.prepare(5);
   when(mockCollaborator.getStatus()).thenReturn("x");
 
   //when  
   objectUnderTest.doSomething();
 
   //then
   assertThat(objectUnderTest.getState()).isEqualTo("OK");
   assertThat(objectUnderTest.isActive()).isTrue();
   verify(mockCollaborator, times(2)).getStatus();
 }
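
(For completeness, this assumes the usual surrounding boilerplate - a mock declaration and the static imports - which I have omitted above.  Roughly something like the following, although the exact runner will depend on your setup, and Collaborator / ObjectUnderTest are just placeholder names:)

 import static org.assertj.core.api.Assertions.assertThat;
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;

 import org.junit.Before;
 import org.junit.runner.RunWith;
 import org.mockito.Mock;
 import org.mockito.runners.MockitoJUnitRunner;

 @RunWith(MockitoJUnitRunner.class)
 public class ClassicTest {

   @Mock
   Collaborator mockCollaborator;       // the collaborator being mocked

   ObjectUnderTest objectUnderTest;     // the class being tested

   @Before
   public void setup() {
     objectUnderTest = new ObjectUnderTest(mockCollaborator);
   }
 }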


The Spock approach

The equivalent in Spock would look like this:

 def "prep value 5, Out returns OK and calls collaborator twice"(){  
   given:  
   objectUnderTest.prepare(5);  
     
   when:  
   objectUnderTest.doSomething();  
     
   then:  
   objectUnderTest.getState().equals('OK')  
   objectUnderTest.isActive()  
   2 * mockCollaborator.getStatus() >> 'x'  
 }  


There are a few things to note:

  1. The name of the test, which also appears in the test results, is far more expressive
  2. The given / when / then clauses are part of the DSL, not just comments, and will tell you if you have got it wrong
  3. The static imports have all gone
  4. The comparison methods in the 'then' clause are all 'native'
  5. The last line of the 'then' clause is an example of the one thing that may trip you up.  It is saying that the mock is expecting to be called twice, and will return a value of 'x' on both occasions.


Spock processes mock interactions in a different way to Mockito, and that can cause some misunderstandings when migrating from one to the other.  It is worth reading that part of the documentation thoroughly.
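
As a flavour of the difference: a Spock mock is declared directly in the specification, a pure stub (with no cardinality check) goes in a given: block, and an interaction in a then: block combines the expected number of calls, the verification and the stubbed return value in one line.  A small sketch, again with Collaborator and ObjectUnderTest as placeholder names:

 import spock.lang.Specification

 class InteractionSpec extends Specification {

   Collaborator mockCollaborator = Mock()       // no runner or annotations needed
   ObjectUnderTest objectUnderTest = new ObjectUnderTest(mockCollaborator)

   def "verify and stub in one interaction"() {
     when:
     objectUnderTest.doSomething()

     then:
     2 * mockCollaborator.getStatus() >> 'x'    // expect exactly two calls, returning 'x' each time
   }

   def "stub only, no cardinality check"() {
     given:
     mockCollaborator.getStatus() >> 'x'        // plain stub - any number of calls allowed

     when:
     objectUnderTest.doSomething()

     then:
     objectUnderTest.isActive()
   }
 }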

Not much difference?

At a quick look, apart from being neater, there does not seem to be a massive difference.  In practice, though, I have found that neatness to be very productive ... even with code completion, 'assertThat .. isEqualTo' is a bit cumbersome, whereas using the native comparison methods is far more fluent.

The failure output from Spock is also an improvement on the classic approach, giving a better indication of which part of a check has failed:

Condition not satisfied:
stack.size() == 2
|     |      |
|     1      false
[push me]

The only things I miss are the collection methods of AssertJ - containsOnly() and containsExactly(), for example.  But if I miss them too much, I can still use them in Spock.

Oh, and the Spock data driven tests are neat ...
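
For example, a single feature method can be run once for each row of a where: table:

 import spock.lang.Specification

 class MaxSpec extends Specification {

   def "maximum of two numbers"() {
     expect:
     Math.max(a, b) == c

     where:
     a | b || c
     1 | 3 || 3
     7 | 4 || 7
     0 | 0 || 0
   }
 }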

Conclusions

Spock seems to be one of those tools which grows on you very quickly.  At first, it seemed that its advantages were not that great, but after just a few days, the increase in fluency and productivity convinced me that the change would be worthwhile.

I don't generally convert existing tests unless they need major modification, but conversion is not difficult either.

Now, Spock is all I use in [Krail](https://github.com/davidsowerby/krail) for new unit tests, and usually integration tests ... and on the rare occasion when I add a test method to existing JUnit tests, the "classic" method feels very clunky.

The learning curve for Spock is not steep - I just found that the way interactions work takes a little getting used to, but the end result is extremely powerful.

If you are writing tests - I would strongly recommend that you give it a go.  And if you are not writing tests ... well, it might be a very good idea to start :-)

Thursday, January 8, 2015

Gradle Multi-Project structure

I am using a combination of Gradle, Github and Bintray for Krail ... probably a popular combination.

I originally had a single Github repository with a Gradle multi-project setup for a number of libraries.

I had been thinking of breaking out some of the libraries anyway, but Bintray really needs a Github repository per library, so with some regret, I broke up the multi-project into single projects.  The regret was because I thought I needed a directory hierarchy to support a Gradle multi-project.

Now I didn't really want to maintain several build files, all very similar and - just as I expected - a pain to keep up to date.

So I went back to the excellent Gradle documentation looking for inspiration, but it still took a while to work out what I needed to do.

I read the Multi-project Java build and Multi-project Builds sections again, but still couldn't find an answer to making the build work across a flat directory structure, with the root project at the same directory level as its subprojects.

Eventually I stumbled on the answers, and they are in the documentation - just that I didn't find them the easy way.  So if you are using Gradle, Github and Bintray together, for a multi-project build, this is what you need:

settings.gradle

In the settings.gradle file, which must be in your root project directory, you define your sub-projects with include statements ... but you are not limited to a hierarchical structure.  You can code any structure you like (coding in a settings file is a bit unusual, too).  Ironically, the first example I found was in the Gradle repository on Github (not in the structure I wanted, but it demonstrates the feature).

My own example looks like this:


 include 'krail'
 include 'krail-jpa'
 include 'krail-demo'
 include 'krail-testApp'
 include 'krail-testUtil'
 include 'krail-quartz'
 include 'krail-bench'
 include 'q3c-testUtil'

 rootProject.children.each { project ->

   String fileBaseName = project.name
   String projectDirName = '../' + fileBaseName
   project.projectDir = new File(projectDirName)

   // for example, makes it krail.gradle
   project.buildFileName = "${fileBaseName}.gradle"

   // make sure it exists
   assert project.projectDir.isDirectory()
   assert project.buildFile.isFile()
 }

All this does is map each of the sub-projects to a directory at the same level as the root project - it also points each one at a *.gradle file named after the project (so krail-jpa.gradle for the krail-jpa project).  This is to avoid having multiple "build.gradle" files open in the IDE, which can be confusing.

Now this almost worked, except that I couldn't run the sub-projects from their own directories - I had to invoke Gradle from the root project directory.

There must be a master

This time I did find the answer in the documentation, but in the Build Lifecycle section, when I was looking for something else!

....if you execute Gradle from within a project with no settings.gradle file, Gradle looks for a settings.gradle file in the following way:
  • It looks in a directory called master which has the same nesting level as the current dir.
  • If not found yet, it searches parent directories.
  • If not found yet, the build is executed as a single project build.
  • If a settings.gradle file is found, Gradle checks if the current project is part of the multiproject hierarchy defined in the found settings.gradle file. If not, the build is executed as a single project build. Otherwise a multiproject build is executed.

For my scenario, all I needed to do was rename my root project directory to master ... and now I can execute tasks in the sub-projects from their own directories.
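
So the layout ends up roughly like this (the project names are just my own, and nothing is special about master beyond the name and the fact that it holds settings.gradle):

 workspace/
   master/            <- root project, contains settings.gradle
   krail/             <- built by krail.gradle
   krail-jpa/         <- built by krail-jpa.gradle
   krail-quartz/      <- built by krail-quartz.gradle
   ...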

Sunday, November 9, 2014

Moving sub-projects in Git

When I started this project I didn't actually know I was starting a project - it was going to be a template application for Vaadin 7, and the sandbox name I gave it was "V7".

I finally got round to changing the name, and decided to go through a major refactoring exercise at the same time.

I had a multi-project set up with a single Git repository and a master repo on Github.

Managing a multi-project set up is simple with the combination of Gradle, Git and IDEA that I use.  There's a Gradle plugin for Vaadin, too, to make life really easy. 

However, I decided I would separate sub-projects into their own repositories .. mainly to provide logical separation, but also because some of the experimental sub-projects will never be completed.

I wasn't looking forward to the task, thinking it would end up in some complex Git manipulation, but I was pleasantly surprised.  

The method described here works extremely well - and it keeps your Git history.  The only issue I had was IDEA occasionally re-creating references to sub-projects that I had deleted - I'm not sure what was going on there, but I eventually cleared them.
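
I won't repeat the details here, but the general shape of that kind of split (not necessarily the exact commands from the linked method) is to rewrite a fresh clone so that only one sub-project's directory and its history remain, then push that to a new repository.  A rough sketch, with made-up repository names:

 # work on a fresh clone so the original repository is untouched
 git clone https://github.com/example/v7.git krail-jpa
 cd krail-jpa

 # rewrite history so only the krail-jpa sub-directory and its commits remain
 git filter-branch --prune-empty --subdirectory-filter krail-jpa master

 # point the clone at a new, empty Github repository and push
 git remote set-url origin https://github.com/example/krail-jpa.git
 git push -u origin master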

The final step was to "promote" one remaining sub-project to become the main project, in the existing repo.  This again proved straightforward.  Git is quite happy for you to move files around, or rename the local repo, as I did, as long as you don't modify the content of the files at the same time.

And finally, Github makes it really easy to change the name of your repo, even re-directing your original URL to the new one.

It makes a change for something you expect to be complicated to turn out to be easy ...


Friday, September 19, 2014

Eclipse to IDEA

Why consider a change?

I had been using Eclipse for many years, and still really only used a small part of its capability.  I had always been reasonably happy with it despite the occasional issues, and I fully appreciated the fact that I was using free software (although I did make a couple of small donations).

But a couple of years ago I started getting problems with its interactions with Ubuntu.  Now to be fair to Eclipse, some of the underlying issues were almost certainly outside Eclipse itself  ..  but I had been losing time from menus disappearing and SWT related random crashes for a long time.

At a suitable break point in developing V7,  I decided to upgrade to Eclipse Luna.  Initially I thought all was well, but when some (but not all) menus stopped working I admit I just lost patience.

So I tried Intellij IDEA.

Making the Change

I knew of course that changing from something very familiar would be a bit of a challenge, so naturally I started with the Community Edition of IDEA.  

Trying to do some very simple things took time, because I had to find out how to do them.  Now the IDEA documentation is pretty good, but sometimes you just have to know the right words to find what you are looking for.

It wasn't helped because I only code in my spare time - my current day job doesn't require it - so I was forgetting new things between sessions.  But I resolved to be patient and give myself a chance to find my way around ...

For reasons I can't explain, I find I use keyboard shortcuts more than I did on Eclipse.  Ironically, if I had used shortcuts more on Eclipse the changeover may have been easier as you can select an Eclipse key mapping for IDEA.

Anyway, after a bit of early frustration just getting used to the differences, I came to like IDEA, and I've been using it for about 3 months now.  The biggest single reason is still that it doesn't crash, or have quirky menu issues.  

My favourite feature is the Live Templates - much more powerful than the Eclipse equivalent and can be a real time saver by generating code for your common coding patterns.

I like the Gradle integration too .. with Eclipse I always used the command line for Gradle, it was just easier.  No great problem, but Gradle actually just feels like part of IDEA.

Of course there isn't the huge array of plugins that Eclipse has available, but even so, there are quite a few;  it seems that most, if not all, of the common requirements would be covered.

Paying the Price

Now the Community Edition is only for pure Java, and for a while that was enough.  But it doesn't take long before you stray into needing the licensed version (called 'Ultimate').  In my case, I wanted to run a web app on Tomcat, which isn't supported by the Community Edition.

So then I was faced with buying a licence, or returning to Eclipse ... and the licence would cost more than I had contributed to Eclipse. 

I decided to go ahead, still mainly because of stability, but also because I keep finding features, often quite small in themselves, which just make coding a bit easier ... and I am fortunate enough to be able to afford a personal licence.

I must admit though, that I still feel a bit guilty about leaving Eclipse  ... I've always been a keen supporter of the open source movement.



Wednesday, March 26, 2014

Vaadin, Guice and Quartz

Progress

I can't believe it is so long since I posted to this blog.  I guess I must have been busy!

Those who have read previous posts will know that I have been working on a project called V7 to provide Vaadin, Guice and Shiro integration, with some I18N and navigation support thrown in.

Making Time(r)?

Quartz has become a popular scheduler for Java, and I wanted something with which to test my V7 design, to make sure I could integrate an additional library without too much difficulty.

I have recently committed an implementation for this, so there is now also an optional V7 library for Quartz, already integrated with Guice.

Observations

I do find some of the Quartz API a little strange, but despite that the integration seems to be fairly robust.  And to be fair, I guess Quartz is popular for a reason - it works!

One thing which I find strangely missing (and I know I am not alone) is the ability to define dependencies between jobs.  
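
The usual workaround - and this is a plain Quartz sketch, nothing to do with the V7 integration itself - is a JobListener which triggers the next job when the first one completes successfully:

 import org.quartz.JobExecutionContext;
 import org.quartz.JobExecutionException;
 import org.quartz.JobKey;
 import org.quartz.SchedulerException;
 import org.quartz.listeners.JobListenerSupport;

 // chains one job to another: when the listened-to job succeeds, trigger the next one
 public class ChainingJobListener extends JobListenerSupport {

   private final JobKey nextJob;

   public ChainingJobListener(JobKey nextJob) {
     this.nextJob = nextJob;
   }

   @Override
   public String getName() {
     return "chain-to-" + nextJob;
   }

   @Override
   public void jobWasExecuted(JobExecutionContext context, JobExecutionException jobException) {
     if (jobException == null) {                      // only chain on success
       try {
         context.getScheduler().triggerJob(nextJob);
       } catch (SchedulerException e) {
         getLog().error("Could not trigger " + nextJob, e);
       }
     }
   }
 }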

But .. it works, and is now integrated with V7.