Classpath too long… with Spring Boot and Gradle

Java applications get more and more complex and we rely on more libraries than ever before. But command lines have length limits and eventually you can get into trouble if your classpath grows too long. There are ways to dodge the problem for a while – like keeping your libraries on shorter paths; neither Gradle nor Maven helps here with their repository formats. But this is still just a pseudo-solution.

Edit (Oct 29): I added a section about the plugin that makes the whole solution automatic – it would work if it treated paths as URLs.

When you suddenly can’t run the application

On Windows 10 we hit the wall when the command line of our Spring Boot application went over 32 KB. On Linux this limit is configurable and in general much higher, but there is often some hard limit for a single argument – and the classpath is just that, a single argument. The question is whether we really want to bloat our command line with all those JARs or whether we can do better.

Before we get there, though, let’s propose some other solutions:

  • Shorten the common path (as mentioned before). E.g. copy all your dependencies into something like c:\lib and make those JARs your classpath.
  • With all the JARs in a single place, you may actually use the Java 6+ feature -cp "lib/*". That is, a wildcard classpath using * (not *.jar!) and quotes. This is not a shell wildcard (that would just expand into a long command line again) but an actual feature of the java command (here are the docs from version 7). This is actually quite usable and it also scales – but you have to copy the JARs.
  • Perhaps you want to use the CLASSPATH environment variable instead? This does not work, the limit is 32 KB as well. So, no solution.
  • You can also extract all the JARs into a single tree and then repackage it as a single JAR. This also scales well, but involves a lot of disk operations. Also, because the first appearance of a class wins, you have to extract the JARs in classpath order without overwriting (or in reverse order with overwriting).

From all these options I like the second one best. But there must be a better one, no?
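The quoting in the wildcard option is worth a quick demonstration. Using printf as a stand-in for the java command (the lib directory and JAR names are made up), you can see who expands the wildcard:

```shell
mkdir -p lib
touch lib/a.jar lib/b.jar

# unquoted: the shell expands the glob into separate arguments
printf '%s\n' lib/*

# quoted: the command receives the literal lib/* unexpanded -
# java would then expand it itself into all JARs in that directory
printf '%s\n' "lib/*"
```

This is exactly why the quotes matter: with them, java gets one short argument instead of a shell-expanded list.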

JAR with Class-Path in its manifest

I’m sure you know the old guy META-INF/MANIFEST.MF that contains meta-information about the JAR. It can also contain a classpath, which will be added to the initial one from the command line. Let’s say the MANIFEST.MF in some my-cp.jar contains a line like this:

Class-Path: other1.jar other2.jar

If you run the application with java -cp my-cp.jar MainClass, it will search for that MainClass (and other needed classes) in both “other” JARs mentioned in the manifest. Now I recommend experimenting with this feature a bit, and perhaps Googling around it, because it seems easy but has a couple of catches:

  • The paths can be relative. Typically, you have your app.jar with classpath declared in manifest and deliver some ZIP with all the dependencies with known relative paths from your app.jar. You can still run your application with java -cp app.jar MainClass, or, even better, java -jar app.jar with Main-Class declared in the manifest as well.
  • The paths can also be absolute, but then you need to start with a slash (natural on Linux, not so much on Windows). On Windows it can actually be either kind of slash; I guess it works the same on Linux (compile once, run anywhere?).
  • If the path is a directory (like exploded JAR) it has to end with a slash too.
  • And with spaces you get into some escaping troubles… but by that time you’d probably figure out that the paths are not paths (as in -classpath argument) but in fact URLs.
  • Now throw in the specifics of the MANIFEST.MF format, like the maximum line length of 72 bytes, continuation lines with a leading space, CRLF, … Oh, and if you write your manifest manually, don’t forget to add one empty line – or, in other words, don’t forget to terminate the last line with CRLF as well. (Talking about line separators and line terminators can get very confusing.)

Quickly you wish you had a tool that does this all for you. And luckily you do.
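Even at the JDK level, java.util.jar.Manifest handles the wrapping and terminator rules for you, and File.toURI() yields the URL form with the trailing slash for directories. A minimal sketch (the class name and the temp-dir choice are just for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

public class ManifestDemo {

    // Renders a manifest with the given Class-Path value; java.util.jar.Manifest
    // takes care of the 72-byte line limit, continuation lines and the final CRLF.
    static String manifestText(String classPath) {
        Manifest manifest = new Manifest();
        Attributes attrs = manifest.getMainAttributes();
        attrs.put(Attributes.Name.MANIFEST_VERSION, "1.0"); // required, or nothing is written
        attrs.put(new Attributes.Name("Class-Path"), classPath);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try {
            manifest.write(out);
        } catch (IOException e) {
            throw new UncheckedIOException(e); // cannot happen with an in-memory stream
        }
        return new String(out.toByteArray());
    }

    public static void main(String[] args) {
        // File.toURI() produces the URL form, including the critical
        // trailing slash for an existing directory
        File dir = new File(System.getProperty("java.io.tmpdir"));
        System.out.println(dir.toURI());

        System.out.print(manifestText("other1.jar other2.jar"));
    }
}
```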

Gradle to the rescue

We actually had specific needs for our classpath too. We ran the bootRun task with the test classes as well, for development reasons. In the end, bootRun is not used for anything but development, right?

Adding the test runtime classpath to the total classpath “helped” us over that command-line limit too. But we still needed it. So instead of just having classpath = sourceSets.test.runtimeClasspath in the bootRun section, we needed to prepare the classpath JAR first. For that I created a classpathJar task like so:

task classpathJar(type: Jar) {
  inputs.files sourceSets.test.runtimeClasspath

  archiveName = "runboot-classpath.jar"
  doFirst {
    // If run in the configuration phase, some artifacts may not exist yet (after clean)
    // and File.toURI can't tell what is a directory to add the critical trailing slash.
    manifest {
      def classpath = sourceSets.test.runtimeClasspath.files
      attributes "Class-Path": classpath.collect { f -> f.toURI().toString() }.join(" ")
    }
  }
}

This code requires a couple of notes, although some of it is already in the comments:

  • We need to treat the files (components of the classpath) as URLs and join them with spaces.
  • To do that properly, all the components of the classpath must exist at the time of processing.
  • Because after clean, at the time of task configuration (see Gradle’s Build Lifecycle), some components don’t exist yet, we need to set the classpath in the task execution phase. What may not exist yet? JARs of other projects/modules of our application, or the classes directories of the current project. Important stuff, obviously. (If you run into seemingly illogical class-not-found problems, this may be the culprit.)
  • Another reason why these artifacts may not exist is missing proper dependencies. That’s why I mention all three concatenated components of the classpath in the inputs.files declaration.

EDIT: For the first day of this post I had dependsOn instead of inputs.files. It was a mistake causing unreliable task execution when something upstream changed. Sorry for that. (I am, I suffered.)

And that’s it

Now we just need to mention this JAR in the bootRun section:

bootRun {
  classpath = classpathJar.outputs.files
  //…other settings, like...
  main = appMainClass // used to specify an alternative "devel" main from the test classpath
}

I’m pretty sure we could do this in other build tools, and we could make a plugin for it too. It would probably also be possible with some doFirst directly in bootRun, but I didn’t want to mix it in there.

But again, this nicely shows that Gradle lets you do what you need to do without much fuss. It constantly shouts: “Yes, you can!” And I like that.

But wait, there’s a plugin for it!

EDIT: October 28th, 2018

Weeks after I solved the problem myself, I decided to search the internet again with a fresh head. Through this issue I found gradle-utils-plugin and its recently updated clone. I decided to use the latter, and when I removed my original solution (the bootRun task in my case has classpath = sourceSets.test.runtimeClasspath), all I needed to do was add the plugin declaration at the beginning of the build script:

plugins {
    // solves the problem with long classpath using a JAR instead of classpath on the command line
    id "ua.eshepelyuk.ManifestClasspath" version "1.0.0"
}

(EDIT: November 2nd) However – there’s a twist. This plugin does not work with spaces in paths – that is, it’s not tested for them and it’s broken. It does not treat the paths as URLs, which is critical for Class-Path in the manifest. The repository does not offer “Issues”, so we’re back to our own solution.


Self-extracting install shell script with Gradle

My road to Gradle was much longer than I wanted. Now I use it on a project at the company and I definitely don’t want to go back. Sure, I got used to Maven and we got familiar – although I never loved it (often quite the contrary). I’m not sure I love Gradle (yet), but I definitely feel empowered by it. It’s my responsibility to factor my builds properly, but I can always rely on the fact that I CAN do the stuff. And that’s very refreshing.


Gradle is not easier to learn than Maven, I guess. I read Building and Testing with Gradle when it was easily downloadable from Gradle’s books page (not sure what happened to it, but you probably can still get it somehow). The trouble with Gradle is that sometimes the DSL changes a bit – and your best bet is to know how the DSL relates to the API. The core concepts are more important and more stable than the DSL and some superficial idioms.

Does it mean you have to invest heavily in Gradle? Well, not heavily, but you don’t want to merely scratch the surface if you want to crack why some StackOverflow solutions from 2012 don’t work anymore out-of-the-box. I’m reading Gradle in Action now, nearly finished, and I can just say – it was another good time investment.

My problem

I wanted to put together a couple of binary artefacts and make a self-extracting shell script from them. This, basically, is just a zip and a cat command with some head script. Zip puts all the binary artefacts together and cat joins the head script with this ZIP file – both separated by some clear separator. I used this “technology” back in the noughties, and even then it was old already.

What can such a head script look like? That depends on many things. Do you need to unzip into a temporary directory and run some installer from there? Is unzipping itself the installation? Let’s just focus on the “stage separation”, because the rest clearly depends on your specific needs. (This time I used this article for my “head” script, but there are probably many ways to unzip the “tail” of the file. Also, the article used TGZ; I went for ZIP as people around me are more familiar with it.)

set -eu

# temporary dir? target installation dir?

echo "Extracting (AKA installing)..."
ARCHIVE=`awk '/^__ARCHIVE_BELOW__/ {print NR + 1; exit 0; }' $0`
tail -n+$ARCHIVE $0 >
unzip -q -d $EXTRACT_TO
rm -f

# the rest is custom, but exit 0 is needed before the separator
exit 0
__ARCHIVE_BELOW__


Now, depending on how you connect both files, you may or may not need an empty line under the separator.

To be more precise and complicated at the same time: there may or may not need to be an LF (\n, ASCII 10) at the end of the separator line. Beware of differences in the meaning of the “last empty line” in various editors (Windows vs Linux), e.g. vim by default expects a line terminator at the end of the file but does not show an empty line, while Windows editors typically do (see the explanation).
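Before handing the concatenation over to a build tool, the whole two-stage trick can be sanity-checked end to end with a throwaway text payload instead of a real ZIP (all file names here are made up):

```shell
# generate a minimal "installer": head script, exit 0, then the separator
cat > installer.sh <<'EOF'
#!/bin/sh
ARCHIVE=`awk '/^__ARCHIVE_BELOW__/ {print NR + 1; exit 0; }' $0`
tail -n+$ARCHIVE $0 > payload.out
exit 0
__ARCHIVE_BELOW__
EOF

# append the "archive" - in real life: cat installer.zip >> installer.sh
printf 'payload-bytes\n' >> installer.sh

sh installer.sh
cat payload.out   # prints: payload-bytes
```

The exit 0 before the separator is what keeps the shell from ever trying to interpret the appended binary tail.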

Concatenation… with Gradle (or Ant?)

Using the cat command is easy (and that one requires a line feed after the separator). But I don’t want to script it this time. I want to write it in Gradle. And Gradle gives me multiple superpowers. One is called Groovy (or Kotlin, if you like, but I’m not there yet). The other is called Ant. (Ant? Seriously?! Yes, seriously.)

Now I’m not claiming that the built-in Ant is the best solution for this particular problem (as we will see), but Ant already has a task called concat. BTW: an Ant task is just a step or action you can execute – when you think Gradle tasks, think Ant targets – and Ant’s targets are not of our concern here.

Ant provides many actions out of the box, and all you need to do is use the “ant.” prefix to get to Gradle’s AntBuilder. But before we try that, let’s try something straightforward, because if you can access a file, you can access its content too. One option is to use File’s text property, something like in this answer. The Groovy script looks like this:

apply plugin: 'base' // to support clean task

task createArchive(type: Zip) {
    archiveName ''
    from 'src/content'
}

task createInstallerRaw(dependsOn: createArchive) {
  doLast {
    file("${buildDir}/").text =
      file('src/').text + createArchive.outputs.files.singleFile.text
  }
}

OK, so let’s try it:

./gradlew clean createInstallerRaw
# it does its stuff
diff build/distributions/
# this prints out:
Extracting (AKA installing)...
Binary files build/distributions/ and differ

Eh, that last line is definitely something we don’t want to see. I used a couple of empty files in src/content, but with realistic content you’d also see something like:

  error:  invalid compressed data to inflate out/Hot Space/01 - Staying Power.mp3
out/Hot Space/01 - Staying Power.mp3  bad CRC 00000000  (should be 9faa50ed)

Let’s get binary

File.text is for strings, not for grown-ups. Let’s do it better. We may try the bytes property, perhaps joining the byte arrays; eventually I ended up with something like:

task createInstallerRawBinary(dependsOn: createArchive) {
  doLast {
    file("${buildDir}/").withOutputStream {
      it.write file('src/').bytes
      it.write createArchive.outputs.files.singleFile.bytes
    }
  }
}

Now this looks better:

./gradlew clean createInstallerRawBinary
# it does its stuff
diff build/distributions/

And the diff says nothing. Even the Hot Space mp3 files play back flawlessly (well, it’s not FLAC, I know). But wait – let’s try a no-op build:

./gradlew createInstallerRawBinary
2 actionable tasks: 1 executed, 1 up-to-date

See that 1 executed in the output? This build zips the stuff again and again. It works, but it definitely is not right. It’s not Gradlish enough.

Inputs/outputs please!

Gradle tasks have inputs and outputs properties that declaratively specify what the task needs and what it produces. There is nothing that prevents you from using more than you declare, but then you break your own contract. This mechanism is very flexible as it allows Gradle to check what needs to be run and what can be skipped. Let’s use it:

task createInstallerWithInOuts {
  inputs.files 'src/', createArchive
  outputs.file "${buildDir}/"

  doLast {
    outputs.files.singleFile.withOutputStream { outStream ->
      inputs.files.each {
        outStream.write it.bytes
      }
    }
  }
}

Couple of points here:

  • It’s clear which code configures the task (the first lines declaring inputs/outputs) and which is the task action (the closure after doLast). You should know the basics of Gradle’s build lifecycle.
  • With both inputs and outputs declared, we can use them without any need to duplicate the file names. We foreshadowed this in the previous task already when we used createArchive.outputs.files.singleFile… instead of “${buildDir}/distributions/”. This works its magic when you change the archiveName in the createArchive task – you don’t have to change anything in downstream tasks.
  • No dependsOn is necessary here; just mentioning the createArchive task as an input (Gradle reads it as “outputs of the createArchive task”, of course) adds the implicit, but quite clear, dependency.
  • With inputs.files we can also iterate over the files. Here I chose the default it for the inner closure and had to name the outer closure’s parameter outStream.

Does it fix our no-op build? Sure it does – just try to run it twice yourself (without clean of course).

Where is that Ant?

No, I didn’t forget Ant, but I wanted to use some Groovy before we got to it. I actually didn’t measure which is better; for archives in the tens of megabytes it doesn’t really matter. What does matter is that Ant clearly says “concat”:

task createInstallerAntConcat {
  inputs.files 'src/', createArchive
  outputs.file "${buildDir}/"

  doLast {
    // You definitely want binary=true if you append a ZIP, otherwise expect corruption
    ant.concat(destfile: outputs.files.singleFile, binary: true) {
      // ...or multiple single-file filesets
      inputs.files.each { file ->
        fileset(file: relativePath(file))
      }
    }
  }
}

This uses the Ant task concat – it concatenates the files mentioned in the nested fileset. This is the equivalent Ant snippet:

<concat destfile="${build.dir}/" binary="yes">
  <fileset file="${src.dir}/"/>
  <fileset file="${build.dir}/distributions/"/>
</concat>

It’s imperative to set the binary flag to true (the default is false), as we work with binary content (ZIP). Using single-file filesets ensures the order of concatenation. If we used something like this (in the doLast block)…

ant.concat(destfile: outputs.files.singleFile, binary: true) {
  fileset(dir: projectDir.getPath()) {
    inputs.files.each { file ->
      include(name: relativePath(file))
    }
  }
}

…we might get lucky and get the right result, but just as likely the ZIP will come first. The point is, fileset does not deliver files in the order of the nested includes.

We may try filelist instead. Instead of include elements it uses file elements. So let’s do it:

ant.concat(destfile: outputs.files.singleFile, binary: true) {
  filelist(dir: projectDir.getPath()) {
    inputs.files.each { file ->
      file(name: relativePath(file))
    }
  }
}

If we run this task, the build fails with a runtime error (during the execution phase):

* What went wrong:
Execution failed for task ':createInstallerAntConcatFilelistBadFile'.
> No signature of method: is applicable for argument types: (java.util.LinkedHashMap) values: [[name:src\]]
  Possible solutions: wait(), any(), wait(long), each(groovy.lang.Closure), any(groovy.lang.Closure), list()

Hm, file(…) tried to create a new file, not Ant’s file element. In other words, it did the same thing as anywhere else in the Gradle script – we’ve already used the file(…) construct before. But it doesn’t like maps and, most importantly, it is not what we want here.

What worked for include previously – although the build didn’t do what we wanted for other reasons – does not work here. We need to tell Gradle explicitly that we want Ant – and all we need to do is use ant.file(…).

Wrapping it up

Now that I’ve tried it, I have to say I’m glad I learned more about the Gradle-Ant integration, but I’ll just use one of the non-Ant solutions. It seems that ant.concat is considerably slower.

In any case, it’s good to understand Gradle’s build phases (lifecycle) and to know how to specify task inputs/outputs.

When working with files, it’s always important to realize whether you work with text or binary content, whether it matters and how it’s supported. It’s also important to know if/how your solution preserves the order of the files when it matters.

Lastly – when working with shell scripts it’s also important to ensure they use the right kind of line terminators. With Git’s typically automatic line-ending conversion you can’t just pack a shell script with CRLF and run it on Linux – this typically results in a rather confusing error that /bin/bash is not the right interpreter. Opening the file in some binary mode helps to discover the problem (e.g. vi -b). But that is not a Gradle topic anymore.
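The CRLF trap itself is easy to reproduce and fix with standard tools (throwaway file names again):

```shell
# a script saved with Windows (CRLF) line endings
printf '#!/bin/sh\r\necho hello\r\n' > crlf.sh

# strip the carriage returns for the whole file
tr -d '\r' < crlf.sh > lf.sh

sh lf.sh   # prints: hello
```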

I like the flexibility Gradle gives me. It pays off to learn its basics and to know how to work with its documentation and API. With that, you mostly reap your rewards.

Why Gradle doesn’t provide “provided”?

EDIT May 10, 2016: Gradle 2.12 finally brings the compileOnly dependency configuration for the Java plugin (until then available only with the WAR plugin). It does not model exactly what provided means in Maven, but it covers most cases, like using Java EE compile-time dependencies in libraries, etc.

EDIT Jan 15, 2016: Gradle itself recently recommended the nebula.provider-base plugin that introduces the provided scope. I added build-nebula.gradle to the repo, check it out! Too bad I can’t link the resource – I simply cannot find it anymore, but the plugin works.

Honestly, I don’t know. I’ve been watching Gradle for around 3 years already, but except for primitive demos I didn’t have the courage to switch to it. And – believe it or not – the provided scope was the biggest practical obstacle in my case.

What is provided, anyway?

Ouch, now I got myself too. I know when to use it – or better said, I know with what kind of dependencies I use it. Use it with any dependency (mostly an API) that is provided (hence the name, I guess :-)) at the runtime platform where your artifact will run. A typical case is javax:javaee-api:7.0. You want to compile classes that use various Java EE APIs. This one is kind of “javaee-all”, and you can find separate dependencies for particular JSRs. But why not make your life easier when you don’t pack it into your final artifact (WAR/EAR) anyway?

So it seems to be like compile (Maven’s default scope for dependencies), except that it should not end up in the WAR’s lib directory, right? I guess so, except that provided is also not transitive, so you have to name it again and again, while compile dependencies are inherited from upstream projects.
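To sketch the distinction in build-script terms – assuming a Gradle version with compileOnly (2.12+, as the edit above mentions) and using Guava coordinates purely as an example:

```groovy
dependencies {
    // packaged into the artifact and visible to downstream projects
    compile 'com.google.guava:guava:18.0'

    // compile-time only: not packaged and not passed on transitively,
    // so consumers must declare it themselves - like Maven's provided in that respect
    compileOnly 'javax:javaee-api:7.0'
}
```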

BTW: This is why I like writing blog posts – I have to make it clear to myself (sometimes not for the first time, of course). Maven’s dependency scopes are nicely described here.

But Gradle has provided!

Without being strict about what Gradle is and what its plugins are: when you download Gradle, you can use this kind of scope – if you use the ‘war’ plugin, just like in this simple example. If you want to run it (and other examples from this post), just try the following commands in git-bash (or adjust as necessary):

$ svn export
$ cd gradle-provided
$ gradle -b build-war.gradle build

Works like a charm – but it’s a WAR! The good thing is you can now really check that the provided dependency is not in the built WAR file; only Guava sits there in the WEB-INF/lib directory. But often we just need JARs. Actually, when you modularize your project, you mostly work with JARs that are put together into a couple of final artifacts (WAR/EAR). That doesn’t mean you don’t need Java EE imports in these JARs – on the contrary.

So this providedCompile is dearly missed in Gradle’s java plugin. And we have to work around it.

Just Google it!

I tried. Too many results. Various results. Different solutions, different snippets. And nothing worked for me.

The main reason for my failures must have been that I tried to apply various StackOverflow answers or blog advice to an existing project. I should have started with something super-simple first.

Recently I created my little “litterbin” project on GitHub. It contains various tests, demos and issue reproductions I need to share (mostly with my later self, or when I’m on a different computer). And today, finally, I tried to prove my latest “research” into the provided scope – you can check the various build files using the vanilla approach or the propdeps plugins (read further). You can also “svn export” (download) the project as I showed above and play with it.

My final result without using any fancy plugin is this:

apply plugin: 'maven'
apply plugin: 'java'
apply plugin: 'idea'

repositories {
}

configurations {
    provided
}

sourceSets {
    main {
        compileClasspath += configurations.provided
    }
    test.compileClasspath += configurations.provided
    test.runtimeClasspath += configurations.provided
}

// if you use 'idea' plugin, otherwise fails with: Could not find method idea() for arguments...
idea {
    module {
        /*
         * If you omit [ ] around it, it fails with: Cannot change configuration ':provided' after it has been resolved
         * This is due to Gradle 2.x using Groovy 2.3, which does not allow += for single-element addition.
         * More:
         */
        scopes.PROVIDED.plus += [configurations.provided]
        downloadJavadoc = true
        downloadSources = true
    }
}

dependencies {
    compile ''
    provided 'javax:javaee-api:7.0'
}

In the comments you can see the potential problems.

With a strictly contained proof-of-concept “project” I can finally be sure what works and what doesn’t. If it works here and doesn’t work when combined with something else, the problem is elsewhere (or in the interaction of the various parts of the build). Before, I always tried to migrate some multi-module build from Maven, and although I tried to do it incrementally, it simply got over my head when I wanted to tackle provided dependencies.

Just use something pre-cooked!

If you want the provided scope you can also use something that just gives it to you. The Spring Boot plugin does, for instance, but it may also add things you don’t want. In this StackOverflow answer it was suggested to use the propdeps plugin managed by Spring. This adds just the scope you may want – and nothing else. Let’s try it! I went to the page and copied the snippets – the build looked like this:

apply plugin: 'maven'
apply plugin: 'java'
apply plugin: 'idea'

repositories {
}

buildscript {
    repositories {
        maven { url '' }
    }
    dependencies {
        classpath ''
    }
}

configure(allprojects) {
    apply plugin: 'propdeps'
    apply plugin: 'propdeps-maven'
    // following line causes Cannot change configuration ':provided' with Gradle 2.x (uses += without [ ] internally)
    apply plugin: 'propdeps-idea'
    apply plugin: 'propdeps-eclipse'
}

dependencies {
    compile ''
    provided 'javax:javaee-api:7.0'
}

As the added comment suggests, it wasn’t a complete success. Without the IDEA plugin and the related section, it worked. But the error with the IDEA parts was this:

Cannot change configuration ':provided' after it has been resolved.

You google and eventually find this discussion, where the key message by Peter Niederwieser (a core Gradle developer) is:

Gradle 2 updated to Groovy 2.3, which no longer supports the use of ‘+=’ for adding a single element to a collection. So instead of ‘scopes.PROVIDED.plus += configurations.provided’ it’s now ‘scopes.PROVIDED.plus += [configurations.provided]’.

The funny part is that it is actually fixed in spring-projects/gradle-plugins version 0.0.7 – they just forgot to update the examples in the README. 🙂 So yeah, with 0.0.7 instead of 0.0.6 in the example, it works fine.

How can this stop you?

Maybe the provided scope is not that trivial. “Scope” is actually not the right word in the Gradle world, but my mentality and vocabulary are rooted in the Maven world after all these years. If provided was obvious and easy, they’d probably have resolved this never-ending story already. Now the issue is polluted with advocates for the scope (yeah, I didn’t resist either) and it’s difficult to understand what the problem is on the Gradle team’s side, except that they seem to have been ignoring it for a couple of years.

The original reporter claimed it doesn’t make sense to stay with Maven just for this – and he is right. He is also right that many developers don’t understand how Configuration works (true for me as well) and how it relates to ClassLoader (true again). I’ve read a Gradle book and many parts of the manual; the trouble is that my problems were always about migrating existing Maven builds. Not big ones, but definitely multi-module ones with provided dependencies. And it really is not easy from this position.

I successfully used Gradle for one-time projects, demos, etc. Every time I try to learn something new about it. I acknowledge that building is a hard domain. Gradle has good documentation, but that doesn’t mean it’s always easy to find the right recipe. I never worked in a team where someone was dedicated to this task, and I was mostly (and sadly) the most knowledgeable member when it came to builds, with tons of other stuff on my hands. (Sorry for the rant. It springs from the fact that builds are considered a secondary matter, or worse. And there are too many primary concerns anyway.)

When one doesn’t know how to get the “provided” scope – which was available “for free” in Maven – any obstacle seems much bigger than it really is. There is simply too much we don’t know when we tackle Gradle for the first time. Nobody tells you “don’t use propdeps-plugin:0.0.6, try 0.0.7 instead”.

Or you get a Gradle-level message “Cannot change configuration ‘:provided’ after it has been resolved”, which is probably perfectly OK from Gradle’s point of view – it nicely covers the underlying technology. But it also covers the root cause: Groovy 2.3 simply doesn’t support += without wrapping the right side into […] – and even that only in some cases:

// correct line, but fails without [ ]
idea { module { scopes.PROVIDED.plus += [configurations.provided] }}

Even --stacktrace --debug will not help you find the root cause. Maybe if you debugged the build in an IDE, but I’m definitely not there yet (not with Gradle; I do debug Maven builds sometimes).

I hope you can now appreciate how subtle the whole problem is and how much difficulty it may cause.

provided or providedCompile?

And that is another trick – people call it differently. “providedCompile” is probably more Gradle-like (and available with the war plugin), “provided” is what we are used to from Maven. Now imagine you experiment with various solutions for introducing this kind of scope – that is, you test different plugins. And they all call it differently. Every time, you have to go to your dependency list and fix it there, or wonder why it doesn’t work when you forget. It just adds to the chaos when you are already navigating unknown territory.

And it also nicely underlines how much this is missing from the java plugin out of the box. Because “it is supported in the ‘war’ plugin” is not a satisfactory answer. I want to use Java EE imports in my JAR that may later be put into a WAR. Or I may run it in an embedded container that will be declared with different dependencies. “This mostly affects only library developers” is also not true. Sure, it affects my Java Simon (which is a library), but I used the provided scope for JAR modules on every single project in my past.

Now imagine this is your first battle with Gradle (which it more or less was in my case). How should I be confident about releasing to Maven Central? It reportedly works, but then, for experienced Gradle users everything is easy…


During my research I also found the article Provided Scope in Gradle. I don’t know how accurate it is for Gradle 2.x, or whether the Android guys haven’t already solved it somehow. The author added nice pictures and also started with the “What is provided anyway?” question (I swear it was a natural choice for my first subheader too :-)). And again, it just shows how complicated Gradle builds get when something is not available out of the box.

That doesn’t mean I don’t want to get to a Gradle build. I don’t like Maven’s rigidity – although I appreciate the conventions, and I’ll follow them in my Gradle builds too. But sometimes you just want to switch something from false to true – and it takes 10 lines of XML. You may say, meh! But it means you see less on the screen, builds are less readable, etc. And we already agreed, I hope, that building is a (potentially) complex domain. Readability is a must.

Sure, there is something like polyglot Maven, but the lack of flexibility remains. I’m absolutely convinced that Gradle is the way to go. I tried it for simple things and I liked it, and I have no doubt I’ll learn it well enough to master bigger builds too.

Hopefully, provided will not be problem anymore. 🙂

Happy New Year 2013!

Happy New Year, of course! My last year was a bit poorer blog-wise. For some reason I was lazier about writing things up. Heck, sometimes I think I was less lucky with new technology overall. I achieved some nice results with testing in our company during the previous year. This year I wanted to push Continuous Integration and testing a bit further, maybe Gradle – but the results in the CI area are mixed and the rest brought no real results at all.

On the brighter side, I managed to finish my quest for a system time shifter on the JVM usable for testing purposes – all documented in my post. Blogging is not everything, of course, and I am quite happy how topics around Clean Code got some attention around me. We pushed the Java Simon project a bit further too, I learned a few interesting things around Spring, MVC and jQuery… Add that beautiful Scala class on Coursera, and the year was more than fun after all.

Still, I’d like to make some resolutions. I discovered QueryDSL (thanks to a colleague of mine) and this seems to be the answer to readable and compile-time-safe Criteria – because those shipped with JPA2 are simply horrible to read. It works well with IDEA’s annotation processing and Maven, and it should be no problem with Gradle either. Ah, Gradle! For around two years I’ve been watching this guy, but for whatever reason I was not able to use it for anything more than a few tests – and that is not Gradle’s fault. I like it, I like the idea, I like the language – and I think this year it is time to switch Java Simon from Maven to Gradle. After that I’ll go on with projects in our company, although the battle there will be more difficult, I guess.

Outside of technology, I managed to put together a few songs with my colleagues and it was fun – the first time I played in something close to a band. We played only at our company party, but that doesn’t change anything… it was real fun. We didn’t have a drummer, so I used my Native Instruments Maschine Mikro and pre-programmed our songs – and I was really happy with the results. I’ll probably dedicate a post to Maschine Mikro, because it is one really interesting controller (and software too!).

Maschine Mikro controller

Talking about music, I managed to upload two full-blown tracks to my Soundcloud and later added two simple guitar+voice tracks. While mixing/mastering is still my weakness, I’m happy I was able to pull it through recording-wise. And just as I imagined – songs I composed with paper, pen and acoustic guitar many years ago can really work as rock recordings too.

So what about this year and those resolutions? Gradle – sure. More testing methodology on our projects – maybe I’ll even manage to document it here on the blog. Pushing Continuous Delivery just a bit further again. Scala or another JVM language? I don’t know. Maybe, maybe for tests. And a bit of my music – I need to practice more with keyboard, guitar and bass guitar (yeah, I bought a lovely Yamaha bass too).

Bass guitar Yamaha RBX375

The last resolution is no resolution at all – we have to somehow survive the “socialistic” experiments of our government here in Slovakia (although there is nothing social about them). Europe has its own share of problems – and the USA? Well, they saved themselves from falling down that fiscal cliff or whatever – just a few hours ago. And it probably means the cliff will be a bit higher next time. So we may have escaped one doomsday at the end of 2012, but who knows how our civilization will fare in the future.

Then I remember those who are really poor and I know we have nothing truly horrible to complain about. So once again – Happy New Year – and a whole happy year of 2013!

Honestly… I hate Maven

And I don’t give a damn that I don’t know it well enough. Why is “good enough” in Maven so difficult when it was so easy with Ant? I remember how we came from make to Ant for our projects. I remember what we tried with Ant. Sometimes we failed when we wanted too much.

And then I remember trying Maven. The Next Big Thing (was it 2005? earlier?), a revolution in builds (and dependency management, and… everything, right?), probably the next best thing after the wheel. So I tried it. Maybe I was one day from our final goal, maybe just an hour. But I eventually gave up. I failed. Maybe I was just plain stupid. Or Maven too smart.

I did my best to forget about it until I was asked to provide Java Simon in some Maven repository. It was a pain again. Not just restructuring our modules, but understanding all that magic. And deployment. And plugins. And dependencies, tons of documentation. Maybe Maven makes complex things simpler. But Maven also makes simple things complex. And then the repository of my choice changed their configuration and I decided to move on to Maven Central.

Documentation again, javadoc generation (I still have to figure that out), …a lot of learning for such an obvious goal. Because people want everything in a Maven repository. Understandably, of course. I, too, want our library to be used – Maven is our standard, our salvation.

Yes, I don’t understand Maven. I understand the concept, but I don’t understand why it has to be so complicated when one needs something very simple. Why things just don’t work. The whole infrastructure around Maven is crazy. If something isn’t right with the build, most of my colleagues just try to ignore the problem because they don’t want to mess with Maven. Yet we use it.

Recently I checked out Gradle. It starts where Maven ended. I switched one of my older projects from Ant to Gradle. To do so, I had to:

  • switch to the Maven-like directory structure,
  • call Ant’s native2ascii task (Gradle has a simple facility for that), because my project has resource bundles in ISO Latin 2,
  • and… that was it!
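The steps above boil down to a build file of roughly this shape – a minimal sketch, not my actual build: the paths, the task name and the bundle location are assumptions, and the native2ascii call goes through Gradle’s built-in AntBuilder:

```groovy
// build.gradle – minimal sketch (assumed paths and names)
apply plugin: 'java'   // expects the Maven-like src/main/java layout

repositories {
    mavenCentral()
}

// Convert ISO Latin 2 (ISO-8859-2) resource bundles to escaped ASCII,
// delegating to Ant's native2ascii task via Gradle's AntBuilder ("ant." prefix)
task convertBundles {
    doLast {
        ant.native2ascii(src: 'src/main/resources',
                dest: "$buildDir/native2ascii",
                encoding: 'ISO-8859-2',
                includes: '**/*.properties')
    }
}

processResources.dependsOn convertBundles
```

A wiring step to pick the converted bundles up as resources would still be needed, but the point stands: the whole build fits on one screen.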

It was just so much more satisfying. The second step was a bit troublesome, but I was just happy when it all worked and my build file was only a few lines long. I also noticed that Gradle offers not only a declarative approach – you can also say what and how you want things done when you need it. Right now I’m not doing any further research, but I know I will carry on with Gradle later when necessary.

Right now I have some Maven work to do. And I’m biased, I know it; I’m also frustrated, and it all came together – I know that I just hate Maven. Not because it is bad – I actually don’t care. But because it’s everywhere, like a plague; it’s too complex (is a parent with six sub-modules so difficult to comprehend? yes, with Maven) and you have to live with it if you want to offer anything that looks like a library to other people. And worst of all, you have to follow tons of additional rules when you want the stuff hosted somewhere. Maybe it’s a necessary evil – but still, evil it is. There is no beauty, there is no elegance, there is just… POM. And XML, of course.