The Blog
Sep 29, 2007
Maven 2 Property References
I finally figured out how Maven 2 property references work today. I'm sure this is covered in the documentation somewhere, but since I just sort of jumped in head first I've been discovering a lot about Maven through trial and error (and looking at other people's build scripts).
For the Flex build script I've been working on, I'm using the maven-antrun-plugin to run an Ant task that executes FlexUnit during the test phase of the build. I'm building the SWF into the standard build area for Maven (the "target" directory) and wanted to reference this in the embedded Ant build. Here's the POM file that I'm using so far, for reference.
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>net.israfil.examples</groupId>
<artifactId>israfil-swf-example</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>swf</packaging>
<properties>
<flex.home>/opt/flex_sdk_2</flex.home>
</properties>
<build>
<finalName>UserManager</finalName>
<plugins>
<plugin>
<groupId>net.israfil.mojo</groupId>
<artifactId>maven-flex2-plugin</artifactId>
<version>1.2-SNAPSHOT</version>
<extensions>true</extensions>
<configuration>
<flexHome>${flex.home}</flexHome>
<useNetwork>true</useNetwork>
<strict>true</strict>
<warning>true</warning>
<main>Test.mxml</main>
</configuration>
</plugin>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>test</phase>
<configuration>
<tasks>
<taskdef
resource="com/adobe/ac/ant/tasks/tasks.properties"
classpathref="maven.compile.classpath" />
<echo message="${project.build.outputDirectory}" />
<echo message="${project.build.finalName}" />
<flexunit timeout="0"
swf="${project.build.directory}/UserManager.swf"
toDir="${project.build.directory}/reports" haltonfailure="true" />
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<!-- Add other SWC dependencies for JUnit-style test with output -->
<dependency>
<groupId>actionscript3.flexunit</groupId>
<artifactId>flexunit</artifactId>
<version>0.85</version>
<type>swc</type>
</dependency>
<dependency>
<groupId>com.adobe.ac.ant.tasks</groupId>
<artifactId>flexunitant</artifactId>
<version>1.0</version>
</dependency>
<!-- Required by the FlexUnit Ant Task -->
<dependency>
<groupId>org.dom4j</groupId>
<artifactId>dom4j</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>org.jaxen</groupId>
<artifactId>jaxen</artifactId>
<version>1.1-beta-6</version>
</dependency>
</dependencies>
</project>
I couldn't figure out how to reference the Maven property for the project build directory, and searching the web for "maven properties" just came up with the properties from the Maven 1 implementation which didn't help.
Then, it hit me: the properties are just dot-notation references for the path to the items in the POM. Taking a look at the
Super POM, you can see that the tags wrapping the project build directory (target) are accessible through descendant dot notation.
So, the following XML snippet from the Super POM:
<project>
  ...
  <build>
    <directory>target</directory>
    <outputDirectory>target/classes</outputDirectory>
    <finalName>${pom.artifactId}-${pom.version}</finalName>
    ...
  </build>
  ...
</project>
... would enable you to reference the build directory as ${project.build.directory} ("target"), ${project.build.outputDirectory} ("target/classes"), and ${project.build.finalName} (holding the output of the referenced values "${pom.artifactId}-${pom.version}").
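In other words, anything you could point to in the POM (or the Super POM it inherits from) is fair game. Here's a quick hedged sketch - the deploy.dir property name and its path are made up for illustration - showing a custom property alongside the built-in dot-notation references inside an antrun task:
<properties>
  <deploy.dir>/opt/deployments</deploy.dir>  <!-- made-up custom property -->
</properties>
...
<tasks>
  <!-- project.* walks the POM structure; deploy.dir resolves against <properties> -->
  <echo message="Copying ${project.build.finalName}.swf (${project.version})" />
  <echo message="from ${project.build.directory} to ${deploy.dir}" />
</tasks>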
I'm sure this is something stupid that anybody familiar with Maven would know, but since it took me a while to figure it out I thought I'd share with the world.
Sep 28, 2007
Making the Most (or Too Much?) of Flex Components
One of the first things that you learn to do as a Flex developer is create custom components by extending those that come with the core MXML library. Usually you'll bundle a few items together, create a custom event or two, and maybe override some functionality.
However, we've been experimenting with what else you can do with components without crossing the line into bad practice. Frankly, we're not sure where the line is yet, but hopefully somebody with more professional Flex experience will stumble upon my blog and set us straight if we're venturing beyond the realm of nice-smelling code.
One of the things I'd really like to do is loosely pair the services that will support our business domains with a set of reusable components that can utilize them. The goal would be that as a developer, you could drop a component tag on a page and have both data and functionality appear immediately.
This probably makes more sense for us than the average Flex development shop because we don't build a lot of one-off Flex apps. Instead, we're building custom software to support specific lines of business within CFI, many of whom consume similar information and perform common functionality in differing business process flows.
Let me explain.
We have a lot of software user interfaces that essentially do the same thing, but slightly differently for different business units. So, for example, our sales floor might search hotel availability when making a sale of a vacation, while our customer service department might search hotel availability to add extra nights to an existing vacation. The business rules for what sort of availability should be searched might be different for each business unit.
Another thing our business really likes to do is the same thing, slightly differently, a lot. So, we might start a sales program that six sales agents test market, where we offer a limited set of hotel availability or adjust the pricing of the hotels somehow to match a test marketing program.
So, we have a few options to make this work in Flex.
1) Write a custom component set, and hook it up to one or more services that handle the variances in business rules. Do this using a framework, writing essentially the same glue code over and over again for each separate application.
2) Write a custom component set, and design it so that each component can invoke its own data service directly (i.e. embed the calls to the remote service directly in the component). Make the bound services implement an interface of some kind so that the component can be wired up with different services.
So, what we'd like to explore is #2. For each application, we'd skip the repetitive glue code, and instead would write a base component and extend/compose it in to a custom component for each front end application. The extension/composition would simply pair the component with the implementation of the service interface containing the custom business logic.
As varying events happen in the component, they can be emitted for handling in the full UI implementation, which would use a standard Flex framework to process the results as necessary. So, for example, a hotel search might take place entirely inside the component with a direct call to the service backing the component, and then when the user double clicks the hotel they want to book, a hotelSelected event is announced that can be trapped by the UI and processed accordingly with the flow of the particular business process for that business unit.
If we go this direction, we'll essentially end up with a set of base components backed by directly called services, with both services and components closely related to a particular problem domain (such as searching hotel availability, charging credit cards, etc.). For each specific UI implementation, the base service could be used (or a custom service implementation associated appropriately), and bound to the business process through handling of events dispatched by the service-backed components.
We've already prototyped out some components along these lines, and so far we can't see anything we're scared of from a practice or maintainability standpoint. Of course, it's early days, and we have plenty of time left to see this crash and burn in practice.
If you have any thoughts on this approach, please share them. Are we setting ourselves up for total disaster, or is this something that might actually make a lick of sense?
Maven Test Problem with Surefire
I've started porting a few apps at work into Maven to do the final test run. One of the first systems I am importing is our credit card processing system.
Everything went fine until I tried to run the
mvn package command to bundle up the web app. I got the following error:
org.apache.maven.surefire.testset.TestSetFailedException: Unable to instantiate POJO \
'class test.com.westgateresorts.paymentprocessing.creditcard.testdata.TestDataFactory'
Obviously, the surefire test plugin couldn't instantiate the class, but I couldn't understand why this would be a problem since the class was on the test classpath. I looked on Google and didn't find anything interesting.
Then, I remembered something about the surefire plugin running every class named "**/Test*.java", "**/*Test.java", or "**/*TestCase.java". What was happening was that the plugin was trying to instantiate this class as a test case, when in fact it's just a convenience class we use to provide test data. Unfortunately, this was one of our early Java projects, and we didn't follow all the proper naming conventions for our test classes and their support classes.
Not a problem, though - there is a way to
configure inclusion and exclusion of tests in the surefire plugin. I'm going to leave this until next week, though, since it's FRIDAY!
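For what it's worth, the fix should just be an excludes pattern in the surefire plugin's configuration - something like this hedged sketch (the pattern is my guess at what we'd use for our testdata package):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <excludes>
      <!-- helper classes that match the Test* naming pattern but aren't test cases -->
      <exclude>**/testdata/*.java</exclude>
    </excludes>
  </configuration>
</plugin>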
Flex 3
Man, Flex 3 is looking
cool as balls. I'm definitely going to hold off buying my Flex Builder for Mac license until this thing goes gold.
And when's
AIR coming out of beta? We want our Windows Taskbar/Mac Dock integration for our new Flex sales app, damn it. :)
Either way, I'm psyched about the direction and growing maturity of the platform. It looked like it had potential two years ago when we bought it, and it's nice to see Adobe coming through for us.
Sep 26, 2007
cf.objective() 2008
I've been invited to speak again at
cf.objective() 2008 in the "Architecture & Design in Software/Integration/EAI/SOA/Java" track.
Last year's conference was awesome, so I can't wait to get back there again and blather on about something or other. Topic submissions are going to be requested soon, so I'll let you know what I decide to pitch when the time comes.
How To Change A Corporate IT Department (a.k.a. Beating Your Head Against A Wall, But Not In Vain)
We have a large and incredibly significant Flex project taking place at CFI right now: we're rewriting the front end of one of our core applications, which handles sales for the front end of our business in the call center environment. If you'd told me five years ago that this is what we'd be doing in Q3 of 2007, I would have called you crazy.
I can safely say that this project is the culmination of the beginning of everything I have been trying to do with our IT department since I joined it as a lowly junior developer in application support. This application, upon successful delivery, will be the fledgling model around which the next several years of software development at CFI will be based.
In total, getting us to this point took the following:
1) Three years for me to fly in the face of all of our departmental practices, building our web development team to the point where the processes and technologies we were using showed some significant benefits over our traditional methods,
2) another year to convince our Director and CIO that we needed to change direction for the rest of the department,
3) two years of repeating myself to the business and negotiating ad nauseam to focus our efforts on re-engineering our core systems to bring them in line with their needs,
4) the last nine months or so getting all of our software development teams moving toward the new direction (some faster than others), and
5) the tireless work of countless people who have engaged in, supported, and led the transition.
Total Underestimation
Paul, who was our Director when we decided to make the switch, had always been wary of such a large transition in process and technology. He certainly wasn't scared of it, but I think he had a much better sense of just how much work it was going to be and how long it would take (maybe that's why he left right when it started? :) ). Having sat firmly in the driver's seat for all of 2007, I must say that I had absolutely no idea what I was getting myself into.
Not that I regret it. I do regret the fact that I don't get to mess around with all the cool technologies we're now using as much as the developers do (lucky bastards).
However, I'm somewhat obligated to make the change happen, since I ended up in a unique position to do so. That's not a statement born of conceit, but the state of affairs due to my 11-year tenure with the organization and my career path through it. The unique blend of positions and experiences I've had at CFI has given me a solid understanding of our business challenges, and my delivery record and the specific mix of relationships that I maintain from prior positions allow me to say and do things that are (usually) diametrically opposed to our prior IT model. People have always joked that I could get away with anything at CFI, and while I think that's a gross exaggeration, I've certainly been able to facilitate some radical changes in direction that nobody else wanted (or was obtuse enough :) ) to champion.
Changing Culture
Prior to this experience, I had no expertise of any sort with changing an organizational culture. My first management role put me in the enviable position of being handed a small department (three people, myself included) and being given carte blanche to build it up from scratch. The fact that we had nothing mission-critical to support at the time made my job all the easier.
Honestly, our first project failed miserably, after which we spent about six months to a year just figuring out what we wanted to do differently to prevent that from happening again. We changed our SDLC, implemented Fusebox and the FLiP process, bought some development tools, and had great success with our next project immediately - a trend which continued for quite some time. We certainly did a lot of things wrong and incrementally improved things over the years, but by the time I left the department we had a pretty good thing going (at least philosophically - I can't defend most of the code we wrote :) ).
After leaving the web team, I had the opportunity to run another development team (one of our Oracle teams), and that gave me a clear perspective on a few things.
1) First, it was obvious that everything I had been told about the business units for our internal apps being harder to manage than the ones I dealt with for web projects was utter nonsense. For some reason, the other department managers seemed to think that the way we did things on the web team would never work with their teams, which was bunk. The same analysis techniques and communication strategies worked just as well with my newly inherited business units as they had with everybody else. As with anything work related, setting the attitude of the team appropriately defined its ability to succeed or fail at its goals.
2) Most of the things I had suspected were challenging with our legacy internal application architecture (technology selection, coupling issues, scalability challenges, etc.) were legitimate concerns. This explained a lot about the pain some of our teams were feeling.
By enabling the resources on my new team and making some changes to our interactions with the business, I was able to facilitate a lot of improvements, which I felt would prove the purpose behind some of the changes we were trying to make department-wide. However, I still found myself dealing with a lot of resistance in mindset and attitude to making the switch to newer technologies and more modern software development processes. The state of affairs and what needed to be done to improve them had become abundantly clear in my mind due to my successes and failures in varying development paradigms; but my audience, having only ever done things one particular way, often lacked this perspective.
As a result, I found myself repeating the same points for two years straight. Literally, almost every day, I was having the same conversations with different people about our new direction, answering the same questions, countering the same arguments, so on and so forth, until finally the light would turn on and they would buy into it. It's been an utterly draining experience that I've felt like giving up on at least once every three months since I started doing it. Thankfully, I'm amazingly stubborn and a glutton for punishment, so I kept at it no matter how painful it became.
Getting Support
If I were to advise somebody on making a similar IT organizational culture change in the future, I would recommend a few things.
Item #1: Fresh Perspective
First, you have to hire people that are of a similar mindset to you. There is a great set of audio books by Jack Welch (called "Winning"), and he has some thoughts on organizational change that are 100% accurate - one of which is identifying who from the "old way" is going to make it, and who isn't. There's simply nothing you can do to change a person's sense of nostalgia for the old ways, and if they can't get on board and change to meet the new ways, they unfortunately become less useful to the organization. Oftentimes, these people have to be let go - something we certainly didn't want to be forced to do with anybody.
Personally, I was completely unsuccessful at showing some of our old resources the benefits of the new practices. However, the great thing about hiring new people from outside the company is that they breathe fresh ideas and attitudes into the organization. I stepped on a lot of toes during my progression at CFI (often due to a lack of professional maturity), and many times, I believe past grudges got in the way of people hearing (and believing) what I had to say. By putting a few new people in the organization and highlighting their results, I was able to win over some of the doubters.
The first example I'll use was a new project manager I hired shortly after taking over the Oracle team. Before this hire, the business analysts on this team really weren't on board with where I was trying to take them. So, I brought somebody in who had experience with RUP and a contagiously positive attitude. I put him in the same box as the other guys, gave him a project that the team had failed to deliver on for two years, and let things happen. Within four months, the project (the first RUP project for the team) went to production. The combination of this success plus the interactions with the new hire had changed the perspectives of the other business analysts. We were also able to see improvements in the attitude of the business. They were so happy with the quick results, they asked for this new hire to lead all of their future projects.
The second example I'll use was a hire made on another team, in this case a development resource. Typically, when we hired a new development resource into the old way of doing things, they would take four to six months to become useful since we had no persistent system documentation for them to refer to (we referred to this lull in productivity as "learning the system"). A lot of people fought me on the principle of creating persistent documentation, a key tenet of our new SDLC. When the new development resource was hired, he was productive on a high profile project in about three days' time, having reviewed the system documentation we'd created as part of the project and immediately gotten his bearings. This gave me yet another case to point to of the new ways working where the old ways had failed.
A key to hiring effectively is standardizing your hiring practices. We implemented some rules in hiring that allowed us to have some more consistent metrics for bringing people in. It's not perfect, but it's better than the free-for-all we had beforehand.
Item #2: Dedicated Change Management
The second item that is critical to the success of an organizational change is a set of dedicated resources to facilitate and formalize all the changes. This was supposed to be me and my architecture team when I originally left the web team, but since I had to immediately run a troubled development team, and my architectural resources were 100% committed to a high profile project, we didn't get much done. This year, I inherited several newly formed departments as a result of a position change, and had to direct development for our PCI compliance initiatives, so I was no more effective in promoting organizational change in my new role than I had been in my previous one.
So, while we've made a lot of progress on our transition this year, it's only been because of other people taking ownership and making things happen. I've directed and guided a lot of what we've come up with, but the real work was done by developers and project management resources getting their hands dirty. The largest benefit of this state of affairs is that the quality of the solutions has been very high, because they were born of real world challenges, and battle-tested by the internal developer and project manager communities. I always say that design by community is far better than design by committee.
Item #3: Support from the Boss
The third item that you absolutely have to have is support from your superior(s). My boss gave me carte blanche to make changes (this time with the entire department). While she plays devil's advocate often enough, and we butt heads occasionally, it just makes me validate my ideas before executing on them. If I didn't have her supporting me, I would have given up on this initiative a long time ago.
Knowing When It's Working
The last twelve months have been pretty positive. We've hired a lot of really smart people who understand where we're going, and have helped us get to the next level. We've also had a lot of people participate in a new technology/process project, and most of them have really enjoyed what they have gotten to do while adding significantly to their skill sets (some for the first time in years). There have been a lot of challenges and tribulations, but an equal number of successes, and everybody has learned from their experiences.
I've also noticed a distinct attitude shift across the department in the last two months. Pretty much all of the prior naysayers have accepted the change in direction, while others are actively promoting it in everything they do. For many, there's the feeling that the transition can't finish soon enough. There's a hopeful and positive feeling to the new projects that are in progress, and a cohesion forming between our teams as they find themselves getting on the same page with their processes, technologies, and direction. This cohesion has never existed in our department in the six years I have worked there.
Best practices are being shared regularly through natural collaboration, occurring as our expertise with each new technology and tool matures. Small groups of motivated and responsible parties have been forming automatically to solve common problems, after which the solutions are presented to me for blessing and standardization.
The most interesting thing to me is that this cultural attitude shift occurred over a very short period of time. We had years of staunch resistance and disinterest, and then suddenly, things have started to gel as though somebody snapped their fingers. I like to think of this as the "event horizon" of organizational change. I'm not sure if this is a common and/or legitimate perception of events of this nature, but that's the way it has felt to me after almost five years of pushing for it to happen.
In Closing
I've been rambling on for way too long, so I'll wrap this up. I want to make this point clear for anybody else who is trying to change the organizational mindset and culture of their corporate IT department.
The bottom line is this: the entire period of culture shift required a near constant level of effort, during which I found that I had to maintain an unwavering vigil of positive reinforcement for our new direction and technologies. I often felt like giving up on the draining and repetitive process of convincing people to do something that they usually didn't want to do; after all, it would have been far easier to just go and work somewhere where the organizational values I yearned for were already in place (although truth be told, I'm not sure if such a company exists on the East Coast).
But now that the switch seems to have flipped, things have changed so drastically in terms of culture and attitude that I have to remind myself that it used to be the other way around. We've got really interesting work happening in exciting technologies, and the rewards gleaned from watching resources apply their newfound skills to a successful project can't be measured in words.
Sure, we have a crapload of work left to do to get it all finished. I'm currently wrapping up the departmental development process changes our web team pioneered, while another pair of resources irons out the kinks in our SDLC and requirement/project management tools. I'm going to spend all of next year getting our QA processes straight, and pushing hard to promote the transition across the teams that are just now getting comfortable with making the jump.
Any change in corporate IT culture is going to fail if you aren't doing it for the benefit of the people that will live with it. My motivation for making things different stemmed entirely from the pain of failure after our first web team project crashed and burned, and the strong desire to make sure that situation never happened again. I can honestly say I've failed more than I've succeeded in my career, but having the opportunity to see things turn around at CFI and play a role in them doing so has been both tiring and thrilling in equal measure.
Sep 25, 2007
Build Tools: Maven Flex Plugins
UPDATE 2007.09.28: as Christian Gruber (the maven-flex2-plugin's author) graciously pointed out, the net.israfil Flex 2 Maven plugin **is** available on the public Maven repo. You can get to it here: http://repo1.maven.org/maven2/net/israfil/mojo/maven-flex2-plugin/. I've left my original post as is for posterity's sake, but you can check the Google Group for the plugin to see why the integration build was failing (pending a reply from Christian after I posted my build file).
So, we have a
build tool shoot-out coming up next week, which means I need to get my crap together and get a CF/Flex build working in Maven.
Naturally, the easiest thing to do would be to get a Maven 2 Flex plugin and let that build my code for me. However, I had a few small issues getting it going, so I thought I would share my experiences in case anybody else has the same problems.
Turns out there are two Maven plugins for building Flex apps: the maven-flex2-plugin option from
israfil.net and another one I found today by accident on
ServeBox.com (the flex2-plugin). Besides this, the ServeBox guys seem to have all sorts of Flex goodness (including a
Java/Flex framework, apparently - who knew?). Anyway, I had been planning on using the one from israfil.net, so I started there.
Installing the Plugin
First off, I needed to install the plugin in my local Maven repository. Usually, installing Maven plugins is a snap, since they are hosted on global repositories that your local Maven installation knows how to talk to, and Maven installs them automatically just by declaring them in your build. However, since the israfil.net plugin is not hugely popular, it's necessary to download it and install it from source.
First step, check the code out from SVN. We can find the SVN repo on the
Project Information page (the Maven site plugin generates this for you automatically if you tell it where your repo is located in your POM).
svn checkout \
http://israfil-mojo.googlecode.com/svn/tags/israfil-mojo-all-9 \
israfil-mojo-all-9
If you look at what you checked out, you'll see that you have a tiered Maven project. There is a top level project for the israfil-mojo family of projects, and beneath that is each of the sub-projects for compiling flex2 (maven-flex2-plugin), generating ASDocs (maven-asdoc-plugin), etc.
Once you have the project code, you need to install it in to your local repository. Usually, you just invoke
mvn install, but I had some issues with the build process.
First, due to the way that the maven-invoker-plugin was declared in the build file, Maven could not locate the plugin in the global repository to download it. So, I found the plugin on the appropriate
Project Information page again, and checked it out from SVN.
svn checkout \
https://svn.apache.org/repos/asf/maven/plugins/tags/maven-invoker-plugin-1.0 \
maven-invoker-plugin
Next, I ran
mvn install from the root of the downloaded
maven-invoker-plugin directory, and Maven performed its usual magic of downloading all the dependencies and installing the plugin for me.
Once that was resolved, I went back to the Israfil Flex plugin. I ran
mvn install from the root of the
israfil-mojo-all-9 directory. Everything ran fine (compilation, tests, etc.) until Maven hit this point:
...
[INFO] Installing /Users/.../israfil-mojo-all-9/maven-flex2-plugin/pom.xml to /Users/.../it-1.2-SNAPSHOT/maven-flex2-plugin-it-1.2-SNAPSHOT.pom
[INFO] [invoker:run {execution: default}]
[INFO] Building: maven-flex2-actionscript1-test/pom.xml
[INFO] ...FAILED[code=1]. See /Users/.../build.log for details.
...
What was happening was, while running the invoker plugin, something was failing. It looked like some sort of ActionScript test was failing. I went in to the directory referenced in the error (/Users/porgesm/Desktop/israfil-mojo-all-9/maven-flex2-plugin/src/it/maven-flex2-actionscript1-test/) and found a pom.xml. Looking in the pom.xml, there was an integration test of some kind being run. Rather than trying to figure out why this was failing, I took the brute force approach and simply commented out the invoker plugin's invocation in the pom.xml under the
maven-flex2-plugin directory (under the downloaded
israfil-mojo-all-9 directory) as follows:
<!--
<plugin>
  <artifactId>maven-invoker-plugin</artifactId>
  <configuration>
    <debug>true</debug>
    <projectsDirectory>src/it</projectsDirectory>
    <pomIncludes>
      <pomInclude>*/pom.xml</pomInclude>
    </pomIncludes>
  </configuration>
  <executions>
    <execution>
      <phase>integration-test</phase>
      <goals><goal>run</goal></goals>
    </execution>
  </executions>
</plugin>
-->
Running
mvn install again now produced the following output:
[INFO] ----------------------------------------------
[INFO] Reactor Summary:
[INFO] ----------------------------------------------
[INFO] Israfil Mojo: All ............................... SUCCESS [1.852s]
[INFO] Flex 2.0 maven plugin support classes ........... SUCCESS [2.834s]
[INFO] Flex 2.0 maven plugin ........................... SUCCESS [0.913s]
[INFO] ASDoc API documentation generator plugin ........ SUCCESS [0.604s]
[INFO] ----------------------------------------------
[INFO] ----------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ----------------------------------------------
[INFO] Total time: 6 seconds
[INFO] Finished at: Tue Sep 25 22:13:27 EDT 2007
[INFO] Final Memory: 11M/21M
[INFO] ----------------------------------------------
Nice - we're in business.
Creating A Sample Flex Project
Next, I configured a simple Maven project in Eclipse with a Main.mxml containing a button that said "Hello" when clicked. This .mxml file goes into src/main/flex under the project, the default location where the plugin expects code to be.
The pom.xml file for this project was really simple. I took the sample from the israfil.net web site and modified it slightly for my needs:
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>net.israfil.examples</groupId>
<artifactId>israfil-swf-example</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>swf</packaging>
<properties>
<flex.home>/opt/flex_sdk_2</flex.home>
</properties>
<build>
<!-- if you want this vs. artifactId.swf use finalName -->
<finalName>exampleApp</finalName>
<plugins>
<plugin>
<groupId>net.israfil.mojo</groupId>
<artifactId>maven-flex2-plugin</artifactId>
<version>1.2-SNAPSHOT</version>
<extensions>true</extensions>
<configuration>
<flexHome>${flex.home}</flexHome>
<useNetwork>true</useNetwork>
<strict>true</strict>
<!-- ... other options ... -->
<warning>true</warning>
<!-- if you use dataservices, make sure you overri... -->
<!--
<dataServicesConfig>
src/main/resources/services-config.xml
</dataServicesConfig>
-->
<main>Main.mxml</main>
</configuration>
</plugin>
</plugins>
</build>
</project>
Basically, I commented out the <dataServicesConfig/> element since I'm not using dataservices, and added a <properties/> element at the top of the file with the path to my Flex 2 SDK directory (containing the frameworks, player, bin directories etc. that come with the Flex 2 SDK).
I ran
mvn compile, but had another issue, this time with Flex compilation.
...
[INFO] defaults: Error: unable to open './macFonts.ser'
...
Duh. Checking the
Usage page in the plugin's project site, I see that I have to uncomment the <local-fonts-snapshot>localFonts.ser</local-fonts-snapshot> element in my flex-config.xml file. RTFM, fo' sho'.
I ran
mvn compile again, and this time (after a brief compilation pause) a file called
exampleApp.swf appeared in my
target directory. Sure enough, double clicking the SWF opened the Flash projector and produced a running Flex app.
Thoughts
This was certainly a bit more of a pain than I had expected, but the build issues were all resolvable. In a controlled development environment, you would build and deploy a lesser-known plugin like this to your corporate Maven repository to save your developers the hassle of doing so manually.
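As a rough sketch of what that would involve (the repository ids and URLs below are placeholders for an internal Artifactory instance, not real addresses), you'd add a distributionManagement section to the plugin's POM and run mvn deploy instead of mvn install:
<distributionManagement>
  <repository>
    <!-- placeholder id/URL; the id must match a <server> entry in settings.xml for credentials -->
    <id>internal-releases</id>
    <url>http://artifactory.example.com/repo/libs-releases</url>
  </repository>
  <snapshotRepository>
    <id>internal-snapshots</id>
    <url>http://artifactory.example.com/repo/libs-snapshots</url>
  </snapshotRepository>
</distributionManagement>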
The plugin seems to work nicely once configured, although I've only put it through simple paces so far. The other plugin I mentioned from ServeBox supposedly has FlexUnit integration, so I'll definitely need to check that out - otherwise, I'll need to invoke the FlexUnit Ant tasks from Maven in order to run the unit tests for the sample application we're building for next week's user group meeting. ServeBox also has their own public Maven repo from which Maven can retrieve the plugin, so the install process should be simpler once I add this repo location to my Maven settings file.
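I haven't tried it yet, but pointing Maven at a third-party repository should just be a pluginRepositories entry along these lines (the URL here is a placeholder, not ServeBox's actual repository address), either in the POM or inside a profile in settings.xml:
<pluginRepositories>
  <pluginRepository>
    <id>servebox</id>
    <url>http://repository.example.com/maven2</url>  <!-- placeholder URL -->
  </pluginRepository>
</pluginRepositories>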
Stay tuned and I'll post more info on my ongoing experiences with the israfil.net plugin, as well as the one from ServeBox.
Sep 22, 2007
MG:Flex or PureMVC?
We've got a major Flex app cooking at CFI right now, the largest one we've done to date. We're re-writing the front end (and some of the back end) for the sales application used by our call center agents. This app has high visibility, since it drives the sales and marketing department that produces over $200 million in annual timeshare revenue.
When we started off with Flex 1.5 a few years ago, we naturally used Cairngorm because... well, there wasn't really anything else. Since then
PureMVC has popped up, as well as
Model-Glue for Flex.
Overall, we found Cairngorm a little heavyweight and too web-centric to fit naturally with Flex. I don't mean to knock the framework, since it has its merits, but we're taking the approach of treating our Flex apps as desktop apps distributed over the web, with all the stateful goodness that an app of that nature enjoys. Using a framework based on Front Controller doesn't really jibe with our philosophy.
So, what do we want? I'd say two things: minimal configuration and framework boilerplate, and something that leverages the built-in event model and bindings that make Flex so cool.
Overall, I think we're equally satisfied with the job that PureMVC and MG:Flex do, but we're nervous about MG:Flex's alpha status. That being said, if the code
Joe has cooked up so far works, and does what we need it to, then I don't see why we wouldn't use it.
Anyway, we have a few POCs going for both frameworks as we try to solve a few challenges with our application. Outside of the call center sales flow, we have some enterprise-wide considerations to tackle, including role-based security and some other global application management concerns. As always, I'll be blogging the outcome with fervour.
Abandoning ColdFusion?
Following several discussions at work recently, it looks like we might be abandoning ColdFusion.
I've said it before and I'll say it again: I think ColdFusion is a great front end complement to back end Java. However, at this point in time, I'm not convinced it's the
best complement.
Most of the web sites we build at CFI are simple CRUD apps, or front ends to our enterprise applications with either a local database or nothing at all. For all of these scenarios, we get to design our web application databases as we see fit, and pretty much everything is based on Active Record. For something like this, ColdFusion looks seriously long in the tooth compared to frameworks like Rails or Grails.
Besides that, CF is just plain verbose. I was critical of CFCs in their early days due to the tag-based syntax. This didn't really bother us as much when we started writing our back ends in Java, but it has slowly worn away at our web developers to the point where they're ready for something with a terser syntax. You just can't justify the length of a CFC compared with a class from a scripting language.
Wearing my business hat for a second, I've got the option of either upgrading our twelve CF 7 server licenses to CF 8 for around $45,000, or getting a competing platform for zip/zilch/nada.
So, we've been discussing switching to Groovy and Grails or Ruby on Rails. Since Grails is still figuring out its place in the world, and with Ruby being a pretty awesome language and JRuby being available, we've decided to start with JRuby. If the Rails on JRuby POC is successful, we're not looking at it as a supplement to CF; it will be a wholesale replacement.
On the one hand, I'll be a little sad to see CF go. After all, I built my career on CF development, and was able to combine the language of my profession (CF) with the language of my passion (Java) until we could actually transition our technology environment over to a more Java-centric model. However, like all technologies, CF has to be constantly evaluated against its competition for its merits from a productivity and capability standpoint. Unfortunately, we're guessing that the productivity aspect just won't stand up to the alternatives, especially Rails.
We've got some pretty critical items that the POC needs to address.
1) Will Rails-based apps plug neatly in to our forward-looking enterprise service bus architecture?
2) How does RJS compare to the new Ajax features in CF?
3) Is WAR-based deployment with JRuby going to play well with our current server configuration, or require a substantial rework of our web server infrastructure?
4) Are Ruby builds going to be as seamless with Maven as our CF builds would have been?
5) Is there really more productivity in a Ruby-based environment over a CF-based one?
6) Performance/scalability?
There were a couple more items, but they are on my whiteboard at work. In any event, I'll be blogging on the outcome, so stay posted for more.
Sep 19, 2007
Code is the New Art
I couldn't have said it better myself, so I'll
link to somebody who already did and
somebody else who thinks we're both full of crap (in an amusing kind of way).
Build Tools: Buildr
Dan's been spending the last week or so (on and off) porting my sample Maven build script over to Buildr. He and I got together today to review his work and compare Maven and Buildr side by side.
Hopeful Speculation
Going into the evaluation, I was hopeful that Buildr would present a better alternative to Maven for a few reasons.
The first reason is that I see us scripting our deployment process to production in a scripting language, and that might as well be Ruby. Being able to have our entire build script and deployment process in the same language would give us a complete package.
The second reason is more forward-looking. Although I have found Maven to meet all of our needs so far, we're playing by the rules Maven defines, which doesn't really push Maven to its limits. I wrote
a post on a ColdFusion Maven build script, and while I got it working using a modification of the resource bundling built into Maven's build process, I wasted an hour or so before that trying to get Maven's Assembly plugin to do the same thing.
Some of the challenges people have had with Maven became a bit clearer to me after this exercise, but my hypothesis that those challenges stem from using Maven for purposes it wasn't designed for seems to hold true. The reality is, Maven is very tightly coupled to Java build processes, and anything else really requires a custom plugin. Trying to bend Maven-standard scripts to your will to do something out of the ordinary seems to be an exercise in futility. Of course, one could argue that Maven is not designed as a scripting language, and this would be accurate - but limiting Maven to the scope of Java applications seriously limits its usefulness for new technologies that we're not using yet. This wouldn't be an issue with Buildr, since its Ruby-based nature would make it super easy to script up something unexpected in a jiffy. I would hate to set up an infrastructure based upon Maven only to find that we're stuck when we try to introduce something else later on.
Back to Buildr
So, Dan's port basically did everything that my Maven build did, with a few exceptions. Some nice things about Buildr include the following.
1) The scripting syntax is very terse, which makes sense since the guys who wrote Buildr wanted a DSL for building software. Maven's XML files are lengthy, and hardly complicated by any stretch of the imagination, but it is nice to have a short and sweet file like Buildr's to review when you're looking over something.
2) Buildr has all the power of Ruby, so the things that were built in to Maven but didn't exist in Buildr were pretty easy to code. Dan found a way to hook in to the Maven PMD plugin, to simulate profiles, and to allow for inheritance-based overrides, as well as writing some code to resolve transitive dependencies (something Buildr lacked, and which Dan submitted to the Buildr project for the next release). The ease with which Dan was able to add these items speaks well to the future applicability of Buildr to unforeseen requirements in our build process.
3) Buildr conforms to all the Maven conventions, so taking a Maven project and porting it over (or vice versa) is pretty straightforward. There's even a tool for porting POMs to Buildr scripts, which makes sense since Maven POMs are simple XML files and can be transformed to any format like any other XML document.
4) Buildr has Eclipse project integration, where you can check out your project from SVN and run a single command to create .project and .classpath files. The .classpath file uses an M2_REPO classpath variable to reference your Maven repository, so your library dependencies get resolved to your local Maven repository, removing the need to add them to your project as copied JARs.
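For illustration, the generated .classpath entries reference libraries through the M2_REPO variable, something like this (the artifact shown is just an example):
<classpathentry kind="var" path="M2_REPO/junit/junit/3.8.1/junit-3.8.1.jar"/>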
Of course, there were also some cons.
1) Buildr doesn't work with Artifactory. Artifactory is basically the best option for a Maven-style repository that is out there, and I was actually surprised that it doesn't work out of the box as a default option. Dan has sworn to fix this and submit the code to the Buildr project, since this was a bit of a show stopper for us.
2) Although Buildr can create Eclipse projects and set up initial dependencies, it didn't really have a nice model for adding dependencies on the fly while developing. Maven's Eclipse plugin reacts to changes to the POM file, automatically adding libraries to the classpath and downloading dependencies if necessary. In the absence of a Buildr Eclipse plugin, a developer would be forced to run a command from the command line in order to simulate this functionality. This isn't a huge issue, but struck me as something that would quickly become an annoyance during development.
3) Buildr lacks source control management connectivity out of the box. We could certainly have scripted integration with SVN by issuing SVN commands to the command line from Ruby, but this would tie us to SVN. The nice thing about Maven's SCM plugin is that it is tool-independent; so long as you use
an SCM supported by the plugin (which includes most popular ones), your build file requires no changes other than the protocol in the SCM URL pointing to your repository (see the sketch after this list). Although I don't see us moving off of SVN any time soon, the independence of the plugin is a nice little bonus, especially since we might decide to go to a more mature commercial product such as Perforce at some point.
4) Maven has oodles of plugins for almost anything you can think of. Although Dan was able to invoke a few of the plugins from the command line, it required extra thought to put this in place, whereas Maven provides this capability out of the box.
5) There was no equivalent to Maven's SQL plugin in Buildr. Being able to stage SQL scripts during development, QA, and production staging is going to be a really neat feature. I don't have this working completely in the Maven build yet, but it shouldn't be a far stretch from
the POC I did a few days ago.
6) There's no equivalent to Maven's site generation plugin in Buildr. Honestly, this would not have been a show stopper if it were the only item, but it was a detractor. In the early days, we had discussed adding site generation to our automated build process, and the fact that Maven has it is a nice bonus.
7) Finally, the fact that Buildr is in Ruby is a small challenge for us. We only have a handful of developers working regularly with Ruby outside of work, whereas everybody on the team is familiar with XML from Spring and Hibernate configuration and/or writing Ant builds. Adding a new language to the mix, while not the end of the world, just creates an additional item for us to train our developers on. Ultimately, I think this is going to be a moot point, since I'm very interested in exploring Ruby and Rails as platforms for developing our web sites and services in the future.
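Incidentally, the SCM setup mentioned in item 3 is just a couple of URLs in the POM; here's a hedged sketch (the repository paths are made up), and switching SCM tools would only change the scm:svn portion of each URL:
<scm>
  <!-- made-up repository locations for illustration -->
  <connection>scm:svn:http://svn.example.com/repos/myproject/trunk</connection>
  <developerConnection>scm:svn:https://svn.example.com/repos/myproject/trunk</developerConnection>
</scm>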
Summary
Ultimately, Maven won out between the two tools because it works with Artifactory, builds all the technologies in our environment pretty easily, has the SQL integration, and has lots of bells and whistles out of the box.
Most of the items in Maven's favor were simply a function of its maturity and the fact that we are doing a lot of Java development. In the absence of Java, it would be somewhat useless due to the narrow scope of the tool. I can very easily see us moving to Buildr once the gap closes in functionality, and I think this is going to happen relatively soon (as in, within a year or so). Dan was telling me there is a version 2.0 of Buildr on the horizon, and I imagine with all the Railsing and JRubying going on that more people will be jumping on the Buildr bandwagon as they look for a great build tool written in Ruby to supplement their Ruby-based development.
Since there is an easy migration path to Buildr from Maven, I don't have any concerns about a switchover in the future. All in all, it gives us a nice option to keep in our back pocket if the Maven situation should end up backfiring on us.
Sep 18, 2007
Thermo-Nuclear War?
Sep 17, 2007
Book Review: Enterprise JMS Programming
I recently bought a copy of
Enterprise JMS Programming to assist with my
Mule tinkerings.
I was looking at JMS books for a while before I took the plunge and bought one. For some reason, bookstores in Orlando, FL just weren't carrying any JMS love, which pisses me off because I hate buying books I haven't thumbed through. There are a lot of factors that go in to the quality of a book, especially a programming book, and it's just not possible for me to get a good feel through online samples. Also, with regard to JMS, all the books worth having seem to be on older-than-current versions of JMS (this one included). I settled on Enterprise JMS Programming, and hoped for the best.
The book has been pretty awesome. The author did a really nice job of packaging the chapters in to easily consumable chunks focused on specific areas, and was able to deliver both code samples and best practices on almost every page without providing so much that the reader became bored, or so little that the material was unhelpful. Also, best practices and "gotchas" in real-world implementations are described whenever possible within the context of the item being discussed.
The fact that the book covers an earlier version of JMS has not been a challenge so far, partially because the JMS API hasn't changed that much, and partially because of the way the content is presented - the concepts are still relevant regardless of the final implementation.
I like that the code samples are simplified without being devoid of useful content, and tightly focused on the topic at hand. Some books try to follow a theme of building an application or framework through the chapters, which pisses me off because I can't jump around the book at random without having to decipher which parts of the sample code are boilerplate for the sample app, and which are related to the technology I'm trying to learn.
That being said, having a soup-to-nuts sample app to look at is certainly useful, and the end of the book shows three system architectures for varying applications. The author has done a nice job of keeping the requirements for these applications simple while still introducing enough differentiation and quirks to demonstrate varying models and architectural styles. Each application case study is short enough to be coherent while providing enough sample code to build a picture of the full implementation in the reader's mind.
Another nice aspect of the book is its chapters on JMS administration. The discussion of JMS administration is useful because you can't really have a good messaging architecture if you don't understand the challenges that must be tackled to administrate it. There are also some walk-throughs of two different JMS products, but unfortunately these are so old that they're not relevant any more (all the screenshots are in Netscape 4, if that gives you an idea :) ). Since the high level discussion on JMS administration is implementation agnostic, these chapters can be safely skipped without detracting from the value of the book.
So, kudos to Shaun Terry for his efforts. This book is quickly becoming one of my favorites, and I highly recommend it to anybody planning on learning JMS.
Sep 15, 2007
Build Tools: Adding SQL to the Maven Mix
Something I haven't tackled yet is adding SQL executions to the mix for our automated builds. There are two potential use cases for this:
1) Setup/teardown of test databases during unit testing
2) Publishing SQL changes when we publish our new CF/Flex/Java code
Since there is a SQL plugin for Maven, I decided to give it a go with my local MySQL instance. I logged in as my MySQL root user and created a database to play with and gave myself permissions for it.
CREATE DATABASE maven;
GRANT ALL ON maven.* TO 'porgesm'@'localhost';
Next, I created a Maven build to execute some SQL. I wanted to use the MySQL Connector/J library, so I went to the public Maven repository at
http://repo1.maven.org and found the details on the available MySQL dependencies under
http://repo1.maven.org/maven2/mysql/mysql-connector-java.
<?xml version="1.0" encoding="UTF-8"?><metadata>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.0.5</version>
<versioning />
</metadata>
Now I needed a Maven build that would make use of this library. I found the details on the
SQL plugin for Maven and created a modified version of the sample build files they showed on their web site.
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>MavenSQL</groupId>
<artifactId>MavenSQL</artifactId>
<version>0.0.1</version>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>sql-maven-plugin</artifactId>
<!-- JDBC Driver -->
<dependencies>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.0.5</version>
</dependency>
</dependencies>
<configuration>
<driver>com.mysql.jdbc.Driver</driver>
<url>jdbc:mysql://localhost:3306/maven</url>
<username>porgesm</username>
<password>password</password>
<autocommit>true</autocommit>
<onError>continue</onError>
<srcFiles>
<srcFile>src/../sql/${project.version}_drop.sql</srcFile>
<srcFile>src/../sql/${project.version}_create.sql</srcFile>
</srcFiles>
</configuration>
</plugin>
</plugins>
</build>
</project>
Usually when developing, I find it easiest to create two scripts: one that will reverse any changes I intend to make (a "drop" script), and then one to stage my new changes (a "create" script). To do this, I created two files: ${project.version}_drop.sql and ${project.version}_create.sql. I use the token ${project.version} so that the plugin will automatically reference a SQL script based upon the version of the project I am on. This will give me something akin to a poor man's version of migrations in Rails, where only the SQL scripts for the current version will execute when the build file is run.
Of course, the first time you run the drop script, it will fail since the changes don't exist yet. To make Maven ignore errors due to the drop, I threw in the <onError>continue</onError> declaration.
I then created two files that held simple SQL statements to be executed in the build.
0.0.1_drop.sql:
DROP TABLE maven_table;
0.0.1_create.sql:
CREATE TABLE maven_table (stuff NUMERIC);
You will notice that there is a username/password entry in the build file. I can move this to my global settings.xml file inside my Maven home directory, where I keep all my other passwords. If I were to do this, the settings entry would look like this.
<?xml version="1.0"?>
<settings>
<servers>
<server>
<id>settingsKey</id>
<username>porgesm</username>
<password>password</password>
</server>
</servers>
</settings>
I would then update the plugin configuration as follows.
...
<configuration>
  ...
  <settingsKeys>settingsKey</settingsKeys>
  <driver>com.mysql.jdbc.Driver</driver>
  <url>jdbc:mysql://localhost:3306/maven</url>
  <autocommit>true</autocommit>
  ...
</configuration>
...
I ran the build, and it executed flawlessly. The output lists the SQL scripts that were run:
mac:~/Documents/code/javaprojects/MavenSQL porgesm$ mvn sql:execute
[INFO] Scanning for projects...
[INFO] Searching repository for plugin with prefix: 'sql'.
[INFO] -------------------------------------------------
[INFO] Building Unnamed - MavenSQL:MavenSQL:jar:0.0.1
[INFO] task-segment: [sql:execute]
[INFO] -------------------------------------------------
[INFO] [sql:execute]
[INFO] Executing file: /Users/.../sql/0.0.1_drop.sql
[INFO] Executing file: /Users/.../sql/0.0.1_create.sql
[INFO] 2 of 2 SQL statements executed successfully
[INFO] -------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] -------------------------------------------------
[INFO] Total time: < 1 second
[INFO] Finished at: Sat Sep 15 15:20:27 EDT 2007
[INFO] Final Memory: 2M/3M
[INFO] -------------------------------------------------
One case that this build file doesn't handle is when there are no SQL scripts to be run. If no SQL scripts exist for the current version being deployed, the plugin fails and the build dies. There are two possible solutions. One would be to require a set of SQL scripts for every build, and just have them be blank if there is no SQL to execute (a little kludgey). Another would be to set execution logic in the build so that we only execute this plugin if SQL scripts exist, or to control it with a command line parameter.
There is a <skip/> option in the SQL plugin that lets you conditionally disable execution, so this wouldn't be too hard to do. Here is an example that skips the SQL execution whenever tests are being skipped.
<configuration>
  <username>postgres</username>
  <password>password</password>
  <settingsKeys>key</settingsKeys>
  <url>jdbc:postgresql://localhost:5432/yourdb</url>
  <skip>${maven.test.skip}</skip>
</configuration>
So, I'm still finding some simple and useful additions to the build process with Maven, and they're still working fine. I saw something on a web site saying that Maven's goal was to handle 95% of Java development use cases out of the box, and that rule still seems to apply in my experience so far.
Build Tools: Maven and ColdFusion
We had some discussions at work today surrounding Maven and Buildr, and the build strategy I've been working on. One item I hadn't covered yet was dependency management and build automation for ColdFusion projects.
The thing with ColdFusion is that there isn't really any code to compile; you just have a bunch of .cfm files, and usually a set of CFCs that you'll want to invoke. Some of these CFCs will be in libraries that you have written, while others will be third party libraries (such as ColdSpring and Fusebox).
The issues with resolving dependencies in CF aren't really that much different from resolving them in Java. The biggest difference is at deployment time: Java web applications bundle up a WAR file with all the library dependencies in the WEB-INF/lib directory, whereas CF apps just need a "web root" directory with all the CF code, with the root of the package(s) for any CFCs that the app needs at the same level of the directory tree.
There is, of course, another option for CF apps, which is to use WAR-based deployment. We're thinking about this for a few of our apps, but the problem with this option is that you have to bundle the entire CF server with the application. The advantages are that we can have a pristine classpath for the application, and use Maven/Buildr to resolve library dependencies for Java libraries, but the disadvantage is that the resultant WAR ends up very large (CF by itself is about 100 MB, slightly less without the CF administrator).
Since both Maven and Buildr support WAR-based builds and deployment, WAR-based CF apps would be the easiest to take care of. All we'd need to do is follow Maven's standard convention for the project file system layout, and put the .cfm files and CFC packages in the directory where Maven expects JSP files (in the root of the web app). However, since we're not sure if we want to go this route yet, I wondered how hard it would be to have Maven resolve dependencies for ColdFusion and build the entire app without having to resort to the WAR-based approach.
It turns out it wasn't that hard. The first thing I needed to do was create a few projects: two "library" projects to hold CFCs to be used as dependencies, and another "application" project to rely on the library dependencies.
In order to have Maven resolve the dependencies, I needed to install the libraries in to Artifactory. Artifactory can store any archive format, so to keep things simple I just left them as JARs. I set up a basic Maven build for a JAR. However, since there is no need to compile any code, I overrode the location of the "resources" directory and just put my top-level CFC package inside. In Maven, anything you put in this "resources" directory just ends up in the root of the JAR, so by doing this, Maven would JAR up my CFCs, putting the top-level CFC package in the root of the JAR file, just like it would put the top-level Java package in a JAR to be picked up by the classpath.
So, the directory structure for all of my ColdFusion Maven projects (for both the libraries and the application) looks like the following:
-- project root
-- pom.xml
-- src
-- com
-- packagename
-- packagename
-- Component.cfc
The build file looks like this (ignore the package and artifact naming referencing "coldspring", I was just pretending to have a real CF library of some kind).
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>coldfusion.library</groupId>
<artifactId>coldspring</artifactId>
<version>0.0.1</version>
<packaging>jar</packaging>
<build>
<resources>
<resource>
<directory>src</directory>
<includes>
<include>*/**</include>
</includes>
</resource>
</resources>
</build>
<distributionManagement>
<repository>
<id>localArtifactory</id>
<name>Local Artifactory Repository</name>
<url>http://localhost:8081/artifactory/cfi-releases</url>
</repository>
<snapshotRepository>
<id>localArtifactory</id>
<name>Local Artifactory Repository</name>
<url>http://localhost:8081/artifactory/cfi-snapshots</url>
</snapshotRepository>
</distributionManagement>
</project>
The <resources/> section simply says "all my code is in a directory in the root of the project called src." The <distributionManagement/> section defines my local Artifactory instance, where I want to deploy my libraries. I run the command mvn deploy to package my JAR and deploy it to the repository. I created a second mock library for Transfer, used the same Maven build file, and deployed that to the Artifactory repository also. The build file for this library looks the same as the other one, but has the following opening block:
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>coldfusion.library</groupId>
<artifactId>transfer</artifactId>
<version>0.0.1</version>
<packaging>jar</packaging>
...
Next, I needed to create an application to hold some .cfm pages that would make use of my libraries. I set up the same build file again, but this time, added dependencies for the two libraries. I also added an override to the assembly plugin so that it would resolve the dependencies at build time and bundle them in to the final JAR. This means that the final JAR will contain both the application code pages, and the dependent libraries, in the root of the JAR file. Simply unzipping this JAR file would result in a directory ready to be uploaded to a CF server and executed.
This build file looks like the following. To package up my application in to a JAR containing all of the source for the app and the dependencies, and deploy it to Artifactory, I simply run the command mvn deploy again.
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>coldfusion.application</groupId>
<artifactId>cf-application</artifactId>
<packaging>jar</packaging>
<version>0.0.1</version>
<build>
<resources>
<resource>
<directory>src</directory>
<includes>
<include>*/**</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<configuration>
<descriptorRefs>
<descriptorRef>
jar-with-dependencies
</descriptorRef>
</descriptorRefs>
</configuration>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<distributionManagement>
<repository>
<id>localArtifactory</id>
<name>Local Artifactory Repository</name>
<url>http://localhost:8081/artifactory/cfi-releases</url>
</repository>
<snapshotRepository>
<id>localArtifactory</id>
<name>Local Artifactory Repository</name>
<url>http://localhost:8081/artifactory/cfi-snapshots</url>
</snapshotRepository>
</distributionManagement>
<dependencies>
<dependency>
<groupId>coldfusion.library</groupId>
<artifactId>coldspring</artifactId>
<version>0.0.1</version>
</dependency>
<dependency>
<groupId>coldfusion.library</groupId>
<artifactId>transfer</artifactId>
<version>0.0.1</version>
</dependency>
</dependencies>
</project>
You'll notice that there is quite a bit of replication in these files for the <distributionManagement/> and <resources/> sections. This is easy to remove with Maven's project inheritance model, but since this is just a POC I replicated the code blocks between build files.
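As a rough sketch of how the inheritance would remove that duplication (the parent coordinates below are made up), the shared <resources/> and <distributionManagement/> sections could move into a parent POM:
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>coldfusion</groupId>
  <artifactId>cf-parent</artifactId>
  <version>0.0.1</version>
  <packaging>pom</packaging>
  <build>
    <resources>
      <resource>
        <directory>src</directory>
        <includes>
          <include>*/**</include>
        </includes>
      </resource>
    </resources>
  </build>
  <distributionManagement>
    <repository>
      <id>localArtifactory</id>
      <name>Local Artifactory Repository</name>
      <url>http://localhost:8081/artifactory/cfi-releases</url>
    </repository>
    <snapshotRepository>
      <id>localArtifactory</id>
      <name>Local Artifactory Repository</name>
      <url>http://localhost:8081/artifactory/cfi-snapshots</url>
    </snapshotRepository>
  </distributionManagement>
</project>
Each library or application POM would then shrink to its own coordinates plus a <parent> reference, something like:
<project>
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>coldfusion</groupId>
    <artifactId>cf-parent</artifactId>
    <version>0.0.1</version>
  </parent>
  <groupId>coldfusion.library</groupId>
  <artifactId>coldspring</artifactId>
  <version>0.0.1</version>
  <packaging>jar</packaging>
</project>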
The advantage of using Maven's POMs for the build process is that I get all the benefits that Maven provides out of the box: I can create "snapshot" builds for code under development, and easily manage multiple versions of libraries, regardless of whether I created them or I have downloaded them from third parties. I can also integrate with SVN or any other source control system that Maven supports, and take advantage of multiple profiles for staging to different environments or modifying the build process in some way (such as copying in differing configuration files for each environment, like we need to do with our Java apps).
It's also conceivable that if I wanted to, I could write Maven plugins in Java to manipulate or scan my CF code. For example, I could write a CF code coverage plugin, and run it when building my code to create a report of inconsistencies. Alternatively, I could add a plugin that runs the cfcompile executable to precompile my code. I have no plans to write either of these plugins right now, but it's nice to know the facility is there in Maven if I want it.
One item we'll definitely want to do is have the code stage itself to a local development environment, or to an integration testing environment. We can use the ability to switch profiles to set up a different distribution method for integration builds, during which the code will be compiled and then FTP'd to the appropriate server for testing to begin.
Dan's going to be porting my Maven scripts over to Buildr next week, so hopefully by week's end we'll have an idea of where we're headed next. Assuming we get all the items migrated, I'd really like to make a decision on a build platform and set a new standard for builds in our environment the week of September 24th.
Sep 11, 2007
When Languages Become Religions
So, I guess they stopped printing CFDJ recently, and the CF community is either (a) bouncing with joy over its demise (the magazine was far from impressive), or (b) really upset about what this means for the CF community's ability to bring people in to the language.
I'm going to harp on the latter part for a second. I've never really understood why people get sentimental about languages or technologies. People who give a crap about this industry update their skills often and enable themselves to be more productive in their niche area, solving problems faster/more effectively. It's not like getting more people in to CF is going to make the language any better or worse at solving computing problems; only the technology itself can do that, and those attributes will be the ones that draw a crowd or drive them away.
If you ask me, the writing's been on the wall for CF for a long time. I think Macromedia said it best when they referred to CF MX in a marketing video as "the productivity layer for J2EE." That's basically how I see it. If you own the CF license already and you're using Java on the back end, CF is a great replacement for the de-facto JEE front-end standard of JSP. If you don't own CF, you might be doing things differently with JSP, but you probably aren't really missing it much.
Let's face it; Rails is dominating the web development arena. It's free, it's fast, and it meets 98% of use cases that CF, JSF, ASP.NET, etc. were intended for. Rails has great support for Ajax, and it's become such a de facto platform for technology startups that it's almost a cliché. While I still think CF is great for certain use cases, I certainly wouldn't be writing any standalone web applications in it if I started my own company. Heck, I'd even think twice about Java, and I love Java. Now, if I was writing a core system that had to do a lot more than serve dynamic web pages, Java's still my first choice. As always, the right tools for the right jobs.
So, the bottom line is, if you love a language and technology but people are moving away from it, you need to realize that you are becoming part of a minority that will eventually be relegated to the fate of COBOL and FORTRAN. No manner of press hype or shiny marketing will change that. Death is sure to come to Java one day, and Flex, and all the other things I love right now - but guess what? I'll let them go when their time comes if the replacement solutions are clear winners.
Paul Graham said it best when he wrote in "Hackers and Painters" about the "perfect language." Since it has yet to be created, there's clearly room for more change in the industry, and thank God for it since it's one of the things that makes me love what I do.
Build Tools: More Experiences with Maven 2 and Buildr
Well, I'm finished evaluating Maven; it does everything I need, and did so flawlessly, with one exception. The exception was a slight inconsistency in how command line parameters get passed to the SCM plugin, which was very easy to work around.
Everything else did what it said on the box, and altogether Maven and its army of plugins has allowed me to set up a proof of concept build automation environment for our Java applications. I was able to do all this in the same amount of time it took me to get 20% through the same proof of concept when using Ant and BeanShell as the build tools.
I've found a lot of people blogging about how they are really pissed off with Maven, that it doesn't work, and that they will never use it again. To be honest, I'm really not sure why they had so much grief. The only thing I can think of is that they tried to use Maven for use cases for which it wasn't intended; maybe they were trying to write build scripts instead of follow Maven's conventions, or maybe they weren't up to the task of digging for documentation when needed. I can certainly say that after working with several open source projects with spotty documentation, I've developed a pretty keen sense for where to look (hint: mailing lists and sample code are your savior). Maven's documentation is spotty in areas, and I did have to dig a little to understand some of the plugins properly, but it certainly wasn't as intense as the digging I had to do to get in to the depths of other projects such as Mule or ActiveMQ.
I've also seen that large, popular open source projects such as Spring and Hibernate use Maven for their builds. If Maven was really such a pain in the ass, I'm sure people with minimal patience for tedium such as Rod Johnson and Gavin King would have avoided it for another solution (or rolled their own). As late as 2.0.5, Spring was still using Maven 1.0. I've been using Maven 2, which is supposed to be a drastic improvement, so I can only imagine how building a project as complex as Spring would have been with the previous version. Apparently, they managed.
Anyway... experience mismatch aside, I'm ready for the next step, which is to evaluate Buildr and potentially port what I have in Maven over. I looked through Buildr's manual (very nice docs, by the way), and it looks dead simple. I'm no Ruby slinger, but Dan is, so between the two of us I'm sure we'll figure it out in short order.
First impressions of Buildr indicate that it does everything Maven does, but in script instead of XML. I'm not sure if I like this better or not. Script has its advantages; for example, Maven would be a total pain in the ass to script anything in. Your options are (a) write a Java plugin (read: learning curve), or (b) jump in to an Ant task and try scripting there (read: slow, painful, and unrewarding).
However, a lot of the stuff I've done in Maven makes perfect sense as configuration info in XML, since it's all convention based. All I've had to do is indicate which plugins I want, where my SVN repo is, what my project is called, and other info that is static and non-programmatic. However, an example of where Maven would fail me is when we deploy one of our web apps which requires symbolic links to be set up in the deployment directory after the code is staged. There would be no easy way to do this in Maven, whereas in Buildr I'm sure we'd just script the commands from Ruby over ssh.
After skimming the Buildr documentation, there are a few things I think it might have a hard time matching when compared with Maven.
1) Plugins. Maven's got them by the truckload, and the ones I played with were all really good and easy to configure. There doesn't appear to be a bridge for using Maven plugins in Buildr; all Buildr replaces is the Maven build script/config file.
2) Overridable project configuration inheritance. Maven makes setting up a global configuration for a development team to share a total breeze. When said developers create a project, they just create a minimal config file with the differences between their build and the global one. In my POC, all you need to do is copy down the pom.xml file for a project from Artifactory and run a command from the command line, and so long as you have the global config file in your local Maven repo, Maven will checkout source, resolve dependencies, build, test, and deploy the project with no further intervention.
On the flip side, there are a few things Buildr clearly has over Maven.
1) Handling of non-JAR/WAR/POM resources. I think there might be a way to do this in Maven, but Buildr makes it super simple. We need a way to manage non-Java resources in our build lifecycle (such as pure ColdFusion projects, or CFCs as dependencies), and Buildr seems to have elegant solutions for these problems.
2) We talked about scripting functionality. It's important enough that I'll mention it again in case you're not getting the point. :)
Dan and I should be tackling the full comparison this week, so I'll post back with the results once they are in.
Sep 9, 2007
Build Tools: Maven 2
I'm evaluating Maven 2 as a build tool for our environment. Here are my findings so far.
Adam had been assigned the task of doing a high level evaluation of Maven 2 for us, and wrote a page on our internal Wiki on his findings. I was really intrigued by Adam's evaluation, and decided to play around with Maven to see if it really was as convoluted and error-prone as some of the other build solution communities had led us to believe. I was also interested in why the guys at the Apache group would use this tool regularly if it was so troublesome, especially when they had created it to simplify and empower their Ant-based builds - exactly the reason we were looking for new tools ourselves.
Giving Maven a Chance
I had looked in to Maven in the past (over a year ago) and found the documentation painfully lacking. On this, my second tour, I found Apache's Maven 2 web site very useful with quick start guides and comprehensive examples and documentation for the wide array of pre-existing plugins available. I did have to do some digging on Google for some specifics and examples, and found some of the build files in public SVN repositories useful for the less-well-documented plugins. I also found the book Better Builds with Maven extremely useful, even though it requires you to sign up to download it, and you can't copy/paste text out of it for some strange reason (I've only worked with it on the Mac so this might not be an issue on Windows or Linux).
I've probably sunk a sum total of 8-10 hours in to learning Maven and putting together my POC build, and I'm somewhat surprised to say it's doing just about everything I need it to, and did so the first time around with no issues. I've got objectives #1, #2, and #3 from the bottom of my previous blog post on the topic nailed, with a few bells and whistles on top.
With regard to objective #3 (generate documentation), Maven spoils the user for choice. There are 11 reporting/code quality plugins listed in Better Builds with Maven, handling everything from code quality/standards enforcement, to copy/paste detection, to todo listing, to test coverage/result reporting, to change logging. Maven builds JavaDocs out of the box, and allows you to easily link to standard JavaDoc documents such as those from Sun for Java and JEE. Maven also has the built in ability to generate a documentation web site, which you can customize through creating templates utilizing a simple markup language, and then publish to the server of your choice using a variety of methods including FTP, HTTP, and several others.
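To give a sense of how little configuration some of this takes, a <reporting/> block along these lines in the POM (the JDK API link is just an example) is enough to get JavaDocs, linked against Sun's published API docs, included in the site that mvn site generates:
<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-javadoc-plugin</artifactId>
      <configuration>
        <links>
          <!-- link our generated JavaDocs to the published JDK API docs -->
          <link>http://java.sun.com/j2se/1.5.0/docs/api</link>
        </links>
      </configuration>
    </plugin>
  </plugins>
</reporting>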
Today, I leveraged Maven's inheritance model to set up a common build file that will work for all of our Java-based projects, requiring almost no configuration by developers as they start new projects. Here's a sample of the XML file a developer would have to write to build any of our Java projects:
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.westgateresorts</groupId>
<artifactId>parent-pom</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>sample-maven-project</artifactId>
<name>subproject</name>
<packaging>jar</packaging>
</project>
With the exception of following a very simple pattern for indicating project dependencies, that's the entire file. Without expecting you to understand Maven 2's POM files, this block of XML simply defines the parent config file, the name of the component for the project being described, and that the project should be packaged as a JAR file. Besides this boilerplate block, the developer requires no knowledge of Maven unless they need to customize the build. This should be a non-issue in 95% of cases since all of our Java apps follow a similar convention and are (at present) either JAR'd as a library or wrapped in a servlet-based application (both of which Maven supports out of the box). If we need to accommodate different builds in the future, pretty much all the changes can be encapsulated in the parent build file, removing the need to edit the individual build files as much as possible.
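That "very simple pattern" for dependencies is just Maven's standard block; declaring one in a child POM would look something like this (the coordinates are purely illustrative):
<dependencies>
  <dependency>
    <groupId>commons-lang</groupId>
    <artifactId>commons-lang</artifactId>
    <version>2.3</version>
  </dependency>
</dependencies>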
Also note that this same build file inherits the ability to bootstrap an Eclipse project for the developer, allowing almost instantaneous access to a working project for them to get to work on.
Maven has the ability to customize builds for differing profiles, and to store server access information in a settings file in the user's home directory. Combining these features with an Artifactory repository and SVN server, it was a snap to set Maven up to build differently for varying environments. Essentially, each role in our development lifecycle (developer, QA analyst, buildmaster) would have an appropriately configured settings file, and that would take care of indicating what sort of build should be run, including customized deployment options and the addition and removal of custom steps. Artifactory also allows restricted read/write/delete access to repositories, so we can lock down who can publish builds and to where, allowing us to enforce process and responsibilities in our build lifecycle.
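As a sketch of what one of those role-specific settings files might contain (the profile id and property name here are made up for illustration), a profile can activate itself and set properties that the build then reads:
<settings>
  <profiles>
    <profile>
      <id>qa</id>
      <properties>
        <!-- hypothetical property the build could read to decide where to stage artifacts -->
        <deploy.url>scp://qa-server/var/www/apps</deploy.url>
      </properties>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>qa</activeProfile>
  </activeProfiles>
</settings>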
Migration
As it turns out, migrating from our Ant-based builds would be a snap.
1) Two directories full of existing code would need to be moved to new locations meeting Maven's conventions.
2) Two new directories would need to be created and populated with production and QA resource files.
3) A pom.xml file very similar to the code sample listed above would need to be added to the project directory.
4) Any customizations would need to be added to the pom.xml file.
For all the projects I have worked on so far, the only item I can see potentially needing tweaking in the project's pom.xml file is the target Java version, which is a simple addition to the <build/> area of the POM file.
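That tweak is typically just the compiler plugin's source/target settings, something like this sketch (1.5 used as an example version):
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <!-- compile for the JDK version the project targets -->
        <source>1.5</source>
        <target>1.5</target>
      </configuration>
    </plugin>
  </plugins>
</build>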
Favorite Features So Far
- Being able to inherit from a standard build file that can be easily distributed as our process changes.
- Loads of plugins, all of which (so far) worked right out of the box.
- Lots of documentation online and sample applications.
- Good choices for Maven project conventions, which would make sense to our developers.
- Ability to modularize larger projects, providing additional convenience through the inheritance model.
- The build goals that Maven follows meet best practices for software development out of the box.
- Abstraction over our source control implementation (see the sketch after this list). With a little more work I'll have the builds able to branch/tag code, and auto-increment version numbers too.
- Tight integration with Artifactory, which has been a pleasure to use and configure.
- Archetypes can be created as project templates, so that developers can start new Maven projects using the customized template for our environment that I have created.
- There's a pretty nice Eclipse plugin for Maven that hooks in to Eclipse's build process and really makes the build and dependency resolution seamless to the developer.
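The SCM abstraction mentioned above boils down to an <scm> section in the POM that the SCM-aware plugins read; a sketch (with a made-up repository URL) looks like this:
<scm>
  <!-- the URL below is hypothetical; it just shows the scm:svn: connection format -->
  <connection>scm:svn:http://svn.example.com/repos/sample-maven-project/trunk</connection>
  <developerConnection>scm:svn:http://svn.example.com/repos/sample-maven-project/trunk</developerConnection>
</scm>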
Unexplored Options and Gotchas
- I haven't looked at scripting at all yet. I have a feeling that this will be a major weakness for Maven, considering that it is XML-based and plugins must be written in Java.
- The learning curve for Artifactory and Maven together wasn't too steep for me personally, but wasn't short either.
- There's a Flex plugin out there, but I have yet to use it. Adam said it worked fine, but I need to validate that.
- There's nothing by way of a plugin for ColdFusion yet; we'll have to write our own or script something. Again, I see this being more complex for Maven than for Buildr. That being said, this would be a one-time task that I doubt would change much after it was completed.
- I haven't tackled deployment to production or post-deployment scripting. To be honest, at the present time I could easily see us doing builds with Maven and deployments with a scripting language such as Ruby or Python. But if you're going to jump in to Ruby, you might as well just use Buildr for the whole thing, which gives me yet another reason to investigate Buildr as an alternative.
On to Buildr
I've gotten the impression from Dan that Buildr basically does all the same stuff as Maven but with a hell of a lot less effort. That being said, we've already identified that the SVN integration isn't as immediate as it is with Maven, and I'll be surprised if Buildr's documentation options are as comprehensive out of the box.
However, I'm open minded to what Buildr offers over Maven, specifically Ruby scripting (which should hopefully make the missing SVN integration almost a non-issue, since we can invoke the command line from Ruby). I'm also hoping for a more pragmatic approach to builds, considering Buildr was introduced by a team of people trying to make what is essentially a simpler and better version of Maven.
Dan and I are getting together next week to look at my Maven POC and see what sort of effort it is to implement the same thing in Buildr. I'd love nothing more than to get the same benefits I've found with Maven with a terser build language, and what is hopefully a shorter learning curve to boot.
Sep 8, 2007
Build Tools
We've been messing around for a while at work with various options for automated build tools. Naturally, our journey started with Ant, but in recent days we've been veering down the path of more complete solutions.
The challenges we've been facing are as follows.
1) Dependency resolution is a pain in the ass. We're trying to standardize the libraries our Java apps rely upon to a certain extent but allow exceptions where necessary, which has resulted in variation in the build process for each project (precisely what we want to avoid).
2) Our Ant scripts for many of our projects basically all look the same. This breeds repetition in the builds, which could be a real problem if we change our build process significantly or decide to add new requirements later on.
3) Ant is painful to script in when you need to step outside the box. You can certainly do it using a variety of tools and extensions; I had the most luck incorporating BeanShell scripting in to Ant to automate SVN tasks, and was able to use some module extensions (from Antelope if I recall) to get something approaching reusable functions. However, you end up with blended XML tags and script code, and it's certainly much clunkier than pure scripting.
4) We need to be able to customize our build process for varying environments; specifically development, unit testing, quality assurance testing, and production. For each environment, we need to be able to specify different settings, and in some cases get around inconsistencies in our hardware, app server, or OS environments (which while very undesirable are currently unavoidable).
5) Initially, we're just trying to tackle 100% build automation and deployment, but over the long term we want to add a lot of bells and whistles, such as automatically generating and deploying a common documentation repository with JavaDocs and other useful artifacts. Whatever tool we choose needs to be able to support this capability.
6) We have several technology choices to support at CFI: Java, ColdFusion, and Flex. We will likely be adding more in the future, and although there is little infrastructure to support it today, it would be nice for us to automate the build process for our legacy Oracle Forms and PL/SQL apps as well (so long as we are maintaining them).
To meet these challenges, LeGros kicked off a little project to evaluate several tools that would potentially fit in to our environment: Buildr, Ivy, Raven, and Maven 2. For each tool, a developer was assigned to review the functionality and build a small sample project as a proof of concept.
Ivy
We'll start with Ivy, since it's the simplest of the tools that we evaluated. Ivy essentially adds dependency resolution capabilities to Ant. It can use a variety of dependency resolution methods, including Maven repositories (something we found all the tools were able to hook in to - pretty much the only common thread).
Ivy is attractive because we can plug it in to the existing Ant scripts that we have, so it carries the shortest learning curve for our development teams. Unfortunately, it doesn't really address the other challenges that we have to face in any way; we'd have to continue to script solutions for these problems.
Raven
Raven is a build tool for Java based upon Rake, which I've never used but I understand is very popular for Ruby/Rails projects. Raven essentially brings Rake-style builds to Java projects, and resolves dependencies through a Gem repository, which (again) I'm unfamiliar with but seems akin to the Maven repositories.
Raven looks promising because you script it with Rake tasks and Ruby, and Ruby is clearly a fantastic and terse scripting language. Unfortunately, when Dan evaluated Raven, he felt like there wasn't a lot of support on the project when compared to the others (Ivy has been adopted by Apache and both Buildr and Maven have active communities supporting them). We don't really want a tool that isn't growing and changing with fervour, so I'm doubtful we'll end up adopting Raven as a result.
Buildr
Buildr is essentially a Ruby implementation of Maven, which makes use of a Ruby-based DSL for builds instead of Maven's verbose XML syntax. Buildr was started by a team that found Maven incredibly frustrating to use for builds, which bodes well; in my experience, projects born of frustration tend to be very practical and a pleasure to use (Spring and Mule come to mind).
That being said, something I have often found discouraging in the development community is the ease with which people dismiss projects and technologies in favor of fads. Java and its community have taken a real beating from people in recent years, and while much of the criticism is well deserved, a lot of it is delivered with logic applied painfully out of context. I've learned a lot of good things from the lighter, more agile approaches and technologies favored today, but I've also seen many of these ideas work beautifully on a project supporting a single small application but fail horribly or scale inappropriately in a corporate setting that is much larger and more complicated. I'm hoping the team behind Buildr hasn't fallen foul to this dismissive attitude, and that the tool fits well and scales appropriately across any scenario.
Buildr's benefits of Ruby scriptability and a terse build file format look appealing indeed, plus the fact that Buildr plugs in to Maven-style repositories. The ability to use a Maven-style repository would allow us to use Buildr and Ivy together, providing a clean crossover from (or backwards compatibility for) our existing Ant builds and solving the dependency problem across the board. Maven's the only build tool I've been able to play with so far, but Buildr is the next one I want to dig in to for building a fully-fleshed proof of concept.
Maven 2
The current release of Maven is claimed to be a massive improvement over the former version, and seems to be the de facto standard for building projects created by the Apache group. Maven sprouted after an Apache developer tired of scripting huge builds in Ant, so it has that "born of frustration" component that I tend to find promising, as well as being something of a natural evolution from Ant. Maven is based on an XML build syntax and a convention-over-configuration approach, and has some pretty cool features such as project inheritance and modularization. There's a rich library of plugins for Maven, and the repositories that most of the tools mentioned so far support are based on an original Maven standard.
On the downside, Maven has a reputation for not working as expected and/or becoming painful to work with when you step outside of the convention box. As a result, we had a bad taste in our mouths for Maven, felt that it would have a steeper learning curve for our developers, and would have a high level of impact on our projects, since they would have to be completely converted to Maven's conventions in order to get the benefits of the convention-over-config approach.
Summary
Well, that's it for the round up on our build tools. I've decided to play with Maven and Buildr. Raven seems ruled out based upon the relative lack of maturity and community support, and Ivy basically does what it says on the box and doesn't require further investigation in my mind.
I'm going to add some separate blog posts for Maven and Buildr, documenting my experiences with them and overall impressions. The build process I'm using as the POC has to be able to do the following:
1) Build a complete project as follows: checkout from SVN, resolve dependencies, compile, test, package artifacts, and deploy final artifacts to a repository. Build options, compilation/test options, and final repository location must be configurable for our multiple target environments.
2) Bootstrap a project for development: checkout from SVN, create an Eclipse project with all appropriate builders and configurations in place, set up library dependencies, and import the project in to the developer's workspace ready for action.
3) Generate documentation: at the bare minimum, create JavaDoc files and publish them to an appropriate destination (such as a development staging area for code under development, or a production instance for live code).
4) Support extension and scriptability. One item that immediately comes to mind is the ability to execute database scripts at deployment time, but I'm sure there will be other cases in the future. We need something that is flexible to our current and future needs.
I hope you will come along with me for the ride and post your questions and thoughts. We've got an ongoing thread of sessions at Adogo right now for configuration management and build tools, kicked off at this month's session with LeGros's excellent talk on Subversion, so if you're in the Orlando area I recommend you join us if this is a topic in which you have an interest.