The Blog

Jul 29, 2005

Extolling the Beauty of VirtualPC 

by Maxim Porges @ 5:49 PM | Link | Feedback (2)

I was just chatting with Sean Tierney about how cool VirtualPC is. Sean uses VirtualPC to set up instant development and production servers on his laptop for development and testing, and I use VirtualPC at work to run our Windows-only apps on my Mac.

The premise is simple. You start by creating a virtual computer instance on your existing hard drive, which is nothing more than a simple .vpc file. This file starts at about 40 MB, and will expand dynamically as you install an OS on it and fill it with files.

Next, you boot your virtual PC instance, which greets you with BIOS prompts. You then install your OS, and from there you can install software and add files just like on a regular PC. Of course, the best thing to do is to make a backup copy of the PC's .vpc file AFTER the vanilla OS installation, but BEFORE you add any more software - that way you always have a fresh copy of your chosen OS ready to go (which is how Sean gets his instant server environments).

VirtualPC really has saved my life, since it enables me to be a Mac user in a Windows world. And, of course, because it allows me to install that copy of MechWarrior II (which I found in the back of the closet today) on my G4 in a Windows 98 environment - bwa ha ha!

Most of all, I'm interested to see how MechWarrior II reacts to having 50 times the minimum hardware requirements at its disposal... :)

The Mystery of the Missing White Paper 

by Maxim Porges @ 12:18 AM | Link | Feedback (0)

If you're wondering what happened to my white paper on using Java objects with Spring and ColdFusion instead of CFCs, then fear not: it's still on its way. I've been super busy with the J2EE project at my day job, which is finding its way into my out-of-work hours as well as consuming most of my at-work hours.

In the interim, your patience is appreciated.

Jul 28, 2005

Opinions on JDeveloper, Oracle's J2EE Strategy 

by Maxim Porges @ 11:28 PM | Link | Feedback (7)

We hosted Oracle's technical sales team this week for a demo of their JDeveloper tool, a tour of the capabilities of their J2EE application server, and an overview of their long term J2EE strategy.

To be honest, the meeting was somewhat of a wash. We're so heavily invested in Oracle that the likelihood of us not using their J2EE server moving forward is pretty remote, especially when we're already running an earlier version of the same application server to host our Oracle Forms apps. However, I wanted to see what their strategy was, and better understand some of the more proprietary solutions that they offered, so that I could make intelligent decisions as we flesh out our own J2EE strategy.

JDeveloper was more impressive than I had expected. I'll summarize my thoughts below.


  • Integration with TopLink (Oracle's object-relational mapping solution) looked good. We're probably going to go with TopLink instead of Hibernate, because (a) it comes bundled with the app server, (b) it's one of the oldest and most mature ORM solutions on the market, and (c) it has dedicated support.

  • TopLink works with databases other than Oracle - very surprising (to me). I was expecting vendor lock-in.

  • JDeveloper had a Java editor that looked like it was trying to catch up with Eclipse. Eclipse has the best Java editor I've ever used, and the refactoring tools are superb, so Oracle still has lots of work to do.

  • There was an XML editor in JDeveloper that I liked a lot. Based upon what I saw, it allowed you to create graphical representations of your XSDs, and extract XSDs from raw XML documents.

  • There's a database querying tool built into JDeveloper, which makes me happy since there aren't many good, free database manipulation tools for the Mac (from what I have seen).

  • If you're supporting Java and stored procs in your environment, JDeveloper has a PL/SQL editor, which makes it a decent all-in-one solution for Java and PL/SQL editing.

  • UML and round-trip engineering support looked decent. Not as solid as that available in Rational, but decent. One thing I did like in particular is that your UML diagrams are exported into your JavaDocs from JDeveloper automatically.

  • Sequence diagrams are not supported in JDeveloper, which is a shame (considering that they are so important to the design phase).

  • Oracle plans on creating Eclipse plug-ins for a number of JDeveloper's features. If Oracle is smart, they'll just move JDeveloper to the Eclipse platform altogether. It's time for the straggling tool vendors/developers to get with the program and jump on the Eclipse bandwagon 100%.

  • The JSP/HTML editor did what it was supposed to, but I can't say I'd ever plan on using those features. The base looks and feels for the JSP apps were uninspiring to say the least.

  • JDeveloper is now free, which makes it hard to argue against (regardless of your opinion of the tool).



Oracle's J2EE server was also quite rich in features and capabilities.


  • The server itself is small and light enough to be embedded in JDeveloper for running test apps. I was expecting the server to be a huge beast, so this was a pleasant surprise.

  • Oracle's J2EE strategy seems to be quite open, especially when compared with their traditional approach to technology offerings (which tie you to their platform). ADF (Oracle's data binding framework and JSF component library) and TopLink are both available as standalone products if you need them - for a fee, of course, but at least you aren't trapped on Oracle's platform forever if you build something using those technologies.

  • Oracle has a Business Process Execution Language (BPEL) offering that looks quite interesting. Essentially, BPEL allows you to construct applications declaratively using service components from a service-oriented architecture. BPEL is an open standard, but it was nice to see Oracle offering it with tight integration to visual editing of BPEL processes in JDeveloper. The BPEL engine must be purchased as an add-on to the server; this was the only item that fell into this category (everything else came bundled with the app server).



All in all, I was relieved at the outcome of the meeting. I had expected to see Oracle present a strategy along the lines of "here's how we intend to force our customers to migrate to J2EE using our proprietary solutions". Instead, what I saw looked like a solid tool/server offering with plenty of flexibility, along with implementation choices for the customer baked in. To be honest, that sort of approach is much more likely to result in me purchasing a vendor's products than the traditional strong-arm tactics so favored in the days of yore by the blue chip tech firms.

I'm looking forward to Monday, when I'm meeting with our Director to discuss our options. We've seen tools and products from Oracle and Rational, have invested some time learning about Flex, and have set our sights on a long term layered development strategy using J2EE. I know exactly which offerings I'd like to see implemented, but I won't know the realistic possibilities until after we meet.

Either way, the future's bright, and I've got my shades on.

Jul 21, 2005

Flexing Flex 

by Maxim Porges @ 11:48 PM | Link | Feedback (2)

I spend every Thursday in our Ocoee office, which is our building for call center operations. Basically, almost every marketing effort prior to a customer arriving at one of our properties originates at the Ocoee building. Since our most successful marketing campaigns are web-based, I mosey on over to Ocoee each week to spend an hour with the project stakeholders for some face time and issue discussion. We then have breakout sessions for items that require additional attention.

After this week's meeting, I scheduled a half hour to take Paul (my boss, our Director of Software Development) through a very brief demo of Flex. It just so happened that we held the meeting in the office of one of our Oracle Forms teams, and before I knew it I was presenting to the entire team on a projector!

I certainly wasn't prepared to go into great detail, but we had an excellent run-through using the Flex Explorer application. We looked at the examples and discussed the code implementation, as well as the viability of the Flash platform and the benefits of a plug-in based model over a traditional web architecture. We also talked about the capabilities of an RIA compared to those of a web site or fat client, especially the benefits of having Flash as the driving technology.

Some excellent questions were raised during the demo. Would the product work in Oracle's app server? Yes, it was recently certified to do just that. How would we talk to the database? Through logical layers of OO code, of course! What's the scalability like? Better than Oracle Forms... :)
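
To put a little flesh on "logical layers of OO code," here's roughly the shape I have in mind: a thin facade that the front end talks to, delegating to a business service, which delegates to a data access object. Every name below (Customer, CustomerFacade, CustomerService, and so on) is invented for illustration - this is a sketch of the layering, not code from our project, and it ignores remoting configuration entirely.

    // Hypothetical layering sketch; each type would live in its own file.
    // The presentation tier (Flex, JSP, whatever) only ever sees the facade.

    public class Customer {
        private String accountNumber;
        private String name;

        public Customer(String accountNumber, String name) {
            this.accountNumber = accountNumber;
            this.name = name;
        }

        public String getAccountNumber() { return accountNumber; }
        public String getName() { return name; }
    }

    // Data access lives behind an interface, so the persistence choice
    // (TopLink, Hibernate, straight JDBC) is invisible to the layers above.
    interface CustomerDao {
        Customer findByAccountNumber(String accountNumber);
    }

    // Business rules live here, not in the UI and not in database bindings.
    class CustomerService {
        private final CustomerDao customerDao;

        public CustomerService(CustomerDao customerDao) {
            this.customerDao = customerDao;
        }

        public Customer lookUpCustomer(String accountNumber) {
            if (accountNumber == null || accountNumber.trim().length() == 0) {
                throw new IllegalArgumentException("accountNumber is required");
            }
            return customerDao.findByAccountNumber(accountNumber);
        }
    }

    // The thin entry point the front end calls (via remoting, in Flex's case).
    class CustomerFacade {
        private final CustomerService customerService;

        public CustomerFacade(CustomerService customerService) {
            this.customerService = customerService;
        }

        public Customer getCustomer(String accountNumber) {
            return customerService.lookUpCustomer(accountNumber);
        }
    }

The point of the sketch is simply that the Flash layer never sees SQL or vendor APIs; it sees one facade method, and everything behind it can change without the UI knowing.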

My next step is to coordinate with our other team members to round out a formal Flex presentation for Paul and the rest of the team. I have about five pages of outlines for a PPT presentation, and I'll be applying my usual level of slide-fu to drive the points home. I see fantastic potential in this technology, and a solid match for what I believe we should be doing with our software. Now, it's just a matter of investigating and justifying the ROI, and communicating the facts to senior management so that an educated decision can be made.

I must say, I'm really looking forward to the outcome of this process. I've wanted our entire software team to move to Java development for over three years; the benefits of J2EE development over our traditional technology choices are vast and undeniable. The way things are going, it might finally have a chance of happening, and that gives me great hope for the future.

Rational Rose to the Occasion 

by Maxim Porges @ 11:10 PM | Link | Feedback (0)

Okay, so Rational Rose may be the old version of what Rational is doing these days - but I couldn't resist the play on words. :)

The event I am referring to in this blog title is, of course, the on-site demo we had today from Rational Software (who are now a division of IBM). As part of our J2EE project, I've been tasked with finding tools that will support better requirements gathering for OO development, the creation of appropriate architecture documentation, and tighter traceability and integration between the two.

There's really only one company that comes to mind in this problem space - and that's Rational Software. Not only did this company stem from the formal bodies that invented the UML, but they've also created a very logical, well organized, and customizable SDLC for any type of software development (called the Rational Unified Process, or RUP). While this is an achievement in itself, it would be somewhat difficult to implement without the appropriate tools. Thankfully, Rational has also set the standard for tool integration and compliance with the techniques in their SDLC by creating their own tool suite.

Today's on-site lasted about two and a half hours, and covered a top-level look at RUP and Rational's product line: a soup-to-nuts, tightly integrated suite that creates automation and traceability at every level of the process.

A few items I found particularly interesting are listed below.

1) Prior to the demo, I was under the impression that you either adopted RUP 100% or you didn't adopt it at all. This was a false assumption. Although RUP is incredibly comprehensive, and comes with tons of how-tos, samples, and process documentation, you can take or leave all the parts of it as you see fit. In fact, Rational encourages you to customize it to your needs.

2) Rational's requirements gathering tools begin with business process modeling (whether those processes are manual and you just want to document them, or if you are planning on automating them through software), and then move seamlessly through requirements gathering, tracking changes to requirements, implementing requirements in architecture, and then associating those requirements with the final code. You can literally track the origin of a system code artifact all the way back to the requirement that spawned it - astonishing.

3) Their entire toolset is moving to Eclipse (duh - they're owned by IBM). My surprise was that their tools are being adjusted to run in Linux, which means they might also work natively in OS X (yay!).

4) The requirements tools sync up with Word, but save the atomic items (such as the individual requirement bullets) to a relational DB. This all happens seamlessly through macro integration inside Word. So, you get the flexibility of a rich word processor with the robust data tracking and querying of an RDBMS. Another solid win for the product.

Right off the bat, I can see the advantage of using Rational's business process modeling tools. If there is one thing I wish we had, it would be comprehensive business process documentation. Westgate Resorts is in the timeshare industry, and like most industries, it comes with jargon and terms that you'd never understand fully unless you had worked in the business. After 8 years with the company, I've been lucky enough to work in a lot of different areas and bring that knowledge with me to our IT department, but our newly hired programmers are not so lucky. Being able to give them business docs to study would be a great orientation exercise.

I'm also a huge fan of tools that integrate. This is one of the reasons I liked Adalon when we first started using it; work on one part of the process acts as a springboard for the next part of the process, and Rational's tools follow suit. To be honest, if I were not so fortunate as to have good tools, and had to sync all my own documents, I'd go mad. Imagine doing process flows in Visio, requirements gathering in Word, project management in MS Project, UML modeling in another tool, and development in Eclipse with no automated links between tools/steps. If I had to do that, I probably wouldn't even bother.

While this might sound like a prima donna stance, the bottom line is that all of those steps take a lot of effort, and there's little ROI if you're starting from scratch again at each step. If I can't use my Visio diagram as a springboard to my UML modeling, and my UML model to springboard my code, and I don't get back-and-forth integration between all the documents as things change, then I'll end up spending more time syncing my documents than actually enjoying the benefits of having them in the first place. I'd end up getting less work done at the end of the day, which would be wasteful and counterproductive. Tools like Rational might seem pricey, but in a team environment of market-priced software engineers, the ROI of an integrated suite quickly becomes apparent.

So, while no decisions were made today, I definitely liked what I saw (and while I'm leading the process, I'm not the only decision maker). Also, the fact that Rational provides such tight links between requirements and code got me to thinking that we could potentially shorten our architecture process (as I was griping about earlier this week) by using more visual representations and fewer written descriptions of requirement implementation. To quote the cliché, a picture is worth a thousand words - especially when the picture is a UML sequence diagram! :)

I'll be posting more thoughts on Rational tools and RUP as we delve deeper. We've also got an Oracle demo of JDeveloper and Oracle's 10g J2EE Application Server next week, so keep your eyes peeled.

Jul 20, 2005

How Much Architecture Is Too Much Architecture? 

by Maxim Porges @ 10:51 PM | Link | Feedback (2)

My team has been evolving our system design and technology strategy over the last nine months. We've been going from CF/Fusebox 3 with CFCs to CF/Fusebox 4 with Java.

Overall, I'm extremely happy with the decision, and my developers have echoed that sentiment. However, we've been having discussions recently about how much architecture we do prior to development, and how much detail we put into that architecture.

Here's a rundown of our development process once requirements gathering comes to an end. Our SDLC is based upon the Fusebox Lifecycle Process, or FLiP (more about this in my blog post on my FLiP presentation at CFUNITED 2005).

1) An architect sits down and goes over the prototype and requirements document, and begins to explore their ideas for implementation.

2) The architect decomposes the wireframe in Adalon (our Fusebox design tool) and creates a skeleton for the Controller in the Fusebox app. This is subjected to peer review by the team, during a presentation by the architect.

3) The architect fleshes out the Fusebox skeleton with Fuses and FuseActions, while simultaneously developing the API for the Java delegate that the web site will use to interact with any APIs (system specific or third party) needed by the application. (We use the delegate object to hide the actual APIs being used, which allows us to switch APIs on the fly with minimal impact on the site itself). This is also briefly reviewed by peers.

4) The architect develops the object model. Right now, we're doing this by stubbing out the interfaces and concrete classes, and documenting their functionality and responsibilities in JavaDocs. Once again, peer review takes place when this is complete. (There's a brief sketch of what these stubs look like - using step 3's delegate as the example - right after this list.)

5) Finally, the architecture documentation is delivered to the developers for implementation. They code off of the docs with speed and ease.
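
To give a feel for what comes out of steps 3 and 4, here's a minimal sketch of a stubbed, JavaDoc'd delegate interface and a concrete class behind it. ReservationDelegate and the vendor name are made up for illustration; this isn't our actual API, just the general shape of the artifacts the developers receive.

    // Each type below would live in its own file; shown together for brevity.

    /**
     * Hypothetical delegate from step 3: the Fusebox Controller talks only to
     * this interface, so the API behind it (system specific or third party)
     * can be swapped with minimal impact on the site.
     */
    public interface ReservationDelegate {

        /**
         * Books a stay for the given guest and returns the confirmation
         * number issued by the underlying reservation system.
         *
         * @param guestId   unique identifier for the guest
         * @param checkIn   check-in date
         * @param checkOut  check-out date
         * @return the confirmation number for the new reservation
         */
        String bookStay(String guestId, java.util.Date checkIn, java.util.Date checkOut);
    }

    /**
     * One possible concrete class from step 4: stubbed out and documented
     * during architecture, implemented later by the developers. If we ever
     * switch vendors, we write another implementation and the site never
     * notices.
     */
    public class AcmeReservationDelegate implements ReservationDelegate {

        public String bookStay(String guestId, java.util.Date checkIn, java.util.Date checkOut) {
            // Stubbed for the architecture phase; the real vendor API call
            // goes in here during implementation.
            throw new UnsupportedOperationException("bookStay not implemented yet");
        }
    }

The interesting part is that the JavaDocs and method signatures are the deliverable at this stage - the developers can code against (and behind) the interface without ever having sat in the architecture meetings.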

If this sounds like a lot of work up front, it is - and with good reason. As a manager of a development team, I have to strike a careful balance between total chaos and over-documentation/over-engineering. Imagine these two scenarios, at opposite ends of the spectrum.

I call the first scenario "Developer Code-Fest." In this scenario, the development team is made up of top-flight developers, all hopped up on caffeine and Slashdot, and spewing catch phrases like "agile" and "XP" when they really mean "no documentation" and "by the seat of my Jolt-stained pants." The business requirements are timidly slid under the door of this rabid geek party by the Project Manager. All hell breaks loose. As the sound of clacking keys reaches a crescendo, the developers froth at the mouth uncontrollably, with no thoughts of forward engineering having ever crossed their minds. Upon completion of their masterpiece, they ritualistically torch the requirements document with lighter fluid in a metal trash can, and throw the application over the wall to QA. When the project manager asks them where the technical documentation is, they go back and add just enough to satisfy "the pointy-haired bosses" - after all, they can't remember what the hell they programmed, and if they needed to make changes, they'd just go back and read the code, right?

I call the second scenario "Architect Yourself Into Oblivion." In this scenario, the project arrives fully documented to the thoughtful and bespectacled architect, who regards it with the same level of interest as a biologist who has just discovered a new strain of llama. After three weeks of reviewing the requirements and quietly sipping chamomile tea, the architect decides that they need an additional six weeks to investigate the cleanest design pattern for implementing the "log out" process. A further fifteen business days are lost discussing the benefits of implementing persistence using strict ANSI-92 compliant SQL, as opposed to Oracle's slightly specialized version, which (horror!) would never be portable to DB2. Eight months later, the architect is halfway through the design process and has written seven tomes of documentation, which, while resplendent in its undeniable design simplicity and elegance, can't actually be implemented in any of the languages presently available to mankind. The project runs out of money, everybody gets fired, and the entire island of Puerto Rico is able to power itself for six months off of the BTUs generated by burning the architecture documentation.

Of course, I slightly exaggerated both scenarios, but the point gets made: where do you draw the line between the no documentation/design scenario and over-architecture?

I guess I'll state the case another way to reduce the scope of that question. Here are the things that are important to me as a manager, with the reasons why.

Documentation Before Coding
I've never seen a developer do a decent job of documenting a system after it was programmed. The docs always assumed a level of knowledge of the code that you could only have had if (a) you programmed the thing in the first place, or (b) you read all the code. As a result, I would prefer that the system be documented ahead of time, and (preferably) actually implemented primarily by people who did not write the design documentation. The theory behind this is that if the docs don't make sense by themselves, the system could never be implemented in the first place.

Peer Review of Design Before Coding
I want everything to be looked at by at least two people. Nobody is as good as everybody put together. I know this from experience, because my team has never left a peer review session without a cleaner design coming out of the conversation. (This also applies to post-implementation code review in equal measure.)

Ease of Use, Ease of Change
If my developers finally win the lottery pool, split the winnings, and move to Tahiti, I need sufficient documentation for my new development team to understand the old systems quickly at the usage level (i.e. make this work) and the implementation level (i.e. add this feature without breaking what's already there). I also need the ability for different developers to work on each other's code, without one person being the guru on specific pieces that they built; gurus become bottlenecks to their creations, and end up being bound to the system they built rather than evolving their skills with new systems. Finally, I need third party teams to be able to interact with our APIs without needing to understand the complex intricacies of the API's innards (again, the usage docs satisfy this criterion).

To date, I haven't found a good way to meet these goals without doing a decent amount of up-front design work. On the flip side, most of our apps built this way have been great successes, and have been relatively well-designed and easy to maintain, so the payoff has been there. As for development timelines? They aren't always the shortest, but they are far from unreasonable - and after all, the biggest overall system cost/wasted future development effort comes from bad design or lack of documentation, so ultimately you save time and money by doing more work on the front end.

So, you might wonder why the hell I'm looking to change anything if what we have works. The answer is simple: if there is a shorter way to get the same results, I'd like to be using it. I'm a huge fan of process improvement, and finding new ways to do old things better. I guess what I would like to see is more agile approaches to our development process, more parallel architecture, and the same level of forward documentation and design as we have at present - without the same volume of work and time required to produce it. To quote the cliché, time is money.

To that end, I'm looking at a lot of different avenues, and my team is having a lot of discussions about where we can create efficiencies and more parallel work. I'll be sharing my thoughts here as we progress, and would love to hear your thoughts on the subject if you have them.

Flex vs. JSF 

by Maxim Porges @ 9:25 PM | Link | Feedback (2)

As part of the J2EE project I'm heading up, we're evaluating front end options. Presently, our Oracle applications are developed using Oracle Forms 6i for front end interaction.

Personally, I have never been a fan of Forms for the following reasons:

1) With Forms, you're bound to Oracle - plain and simple. Lack of vendor/tool flexibility is bad for enterprise software development.

2) The UI functionality is restrictive. It does data input/validation pretty well, but an interactive, easy-to-use UI can be a challenge or an impossibility depending upon what you want to do with it.

3) I find the client/server model in Forms extremely strange: you get a heavy client (Java applet) and a heavy server process (in-memory representation of the form on the server). Shouldn't at least one side of the connection get a break? (I won't share any numbers, but the scalability hasn't been stellar based upon our user-to-server ratio).

4) The Forms framework seems to thrive on absolutely no abstraction between your view and your data model. The tools for building Forms literally lead you to create direct bindings between UI elements and database columns, and to a certain degree require business logic code to be embedded in the front end.

As a result, we've been looking elsewhere for front end choices. Oracle is heading toward a JSF-based strategy for their tool and application suite, so that naturally arose as an option. As for myself, I believe very strongly in the future of RIAs (Rich Internet Applications), so I threw Flex into the mix.

So, why don't I like JSF? A few reasons, backed up by many years of web development experience:

1) Primarily, JSF relies on the following technologies: HTML, CSS, and JavaScript. All of these technologies are controlled 100% by the browser, which I (as an application developer) have no control over. To boot, none of these technologies behave identically in any two browsers, no matter how "standards compliant" they may claim to be. That's just a fact.

2) JSF is supposed to figure out your browsing environment and render the components accordingly. This is a nice goal, in the same way that contestants at beauty pageants wishing for world peace is a nice goal. I say this because I've never seen a tool that actually supported multiple browsers seamlessly. (I even have it on good authority from a vendor using JSF that he has already discovered that this goal is a fallacy; this vendor only supports two browsers for his app, and has had issues with some simple JSF controls that he still hasn't figured out - and he used the standard JSF tag set.)

3) You can't build custom components in JSF and expect them to work in every browser. Not only would you have to debug in every environment (read: time consuming and error prone), but you can't implement features like drag and drop (surely one of the best UI features in existence) on browsers that don't support advanced CSS and JavaScript.

The bottom line is, there is no way to guarantee the enforcement of web standards. By basing a framework on such a house of cards, I almost feel like JSF is a lost cause before it even begins. There are some nice documents on the W3C web site describing how browsers should implement web technologies, but no browser actually works the same way. They can't, because even specifications are open to interpretation by the developers implementing them, so I doubt that the situation will ever be resolved. (If you want a great dissertation on this, I suggest that you check out Designing with Web Standards by the great Jeffrey Zeldman.)

With Flex, there is one dependency: the Flash player. It's been around for years, and everybody has it. It works in old browsers. It works in new browsers. It works on every major OS on the planet. It runs on phones. It runs on PDAs. It is developed by one company, to one specification. This means that if I develop Flex applications, with custom UI controls and all the other bells and whistles, they will always work the same way, whether my client is Netscape 4 or IE 8 (due in 2047), running on Slackware on a cellphone or Longhorn on a laptop.

Some of you may be crying foul: "But Max, you just said that vendor reliance is a bad thing! Surely you would be relying on Macromedia if you went with Flex?" And my response would be, you are right. However, Flex has a few things on its side: a very loosely coupled programming model, and an excellent set of capabilities that would serve us well for many years. With the programming model alone, it would be simple to retire Flex in the presentation tier after many years of service (if such a necessity arose), while preserving our development investment at the business tier and beyond. And let's be honest: the business tier is where the money is.

Ultimately, you have to accept the fact that your front end is destined to change as new UI technology emerges; the trick is to get the most mileage out of it before that day comes. With Flex offering J2EE and .NET integration out of the box, plus an incredibly attractive and powerful UI layer, I would bet my bottom dollar that we'll get plenty of use out of Flex before we have to find something else for our users to poke with their mouses.

And why not go with OpenLaszlo (Flex's open source second cousin)? Well, I've been playing with Laszlo for some time, and I like it - but it's not vendor supported, and it lacks the polish that Macromedia has lovingly applied to Flex. Plus, I wouldn't be entirely comfortable building an enterprise around a product that does not have a team of paid support engineers behind it. Not to mention, Flex feels much more mature than OpenLaszlo, has nice development tools (sorry, IBM - Flex Builder knocks the pants off of the Laszlo IDE for Eclipse), and there are even excellent books about Flex.

Oh, and Flex's default component set is much prettier. Sure, you could spend time creating custom look and feels for Laszlo, but why bother when the Halo LAF in Flex looks so darn good?

Have some thoughts on JSF, Flex, Laszlo, or this blog post? Hit me up on the comments, my friend.

Jul 17, 2005

Getting Started with Subversion 

by Maxim Porges @ 11:20 AM | Link | Feedback (5)

I've been working in the software development industry for some time now, and source control is a huge piece of the puzzle for effective change management and development. However, I've never actually had to administrate source control - that was always within the realm of our sys admin team.

However, we're now evaluating source control systems for our new J2EE project at Westgate Resorts, and I wouldn't mind having something set up on my laptop to use on my many personal projects. Naturally, I've been looking at Subversion for a while, since it's essentially CVS with all the bugs fixed and the features rounded out (and CVS sure wasn't bad). I've also worked with development teams who used the Eclipse CVS plug-in, and it makes life very easy indeed.

So where do I start? With Google, of course! For those of you who may be new to Subversion or in the same predicament as I found myself, I thought I'd post the links and resources I found useful.

Subversion Book (Free)
This is an O'Reilly Subversion book that is being given away for free. I downloaded the PDF version and stored it with all of my other PDF manuals on my laptop. I read the first four chapters in about an hour, and I must say that it is excellently written. It also covers the concepts for trunks, branches, merging, and repositories for those who are new to source control. I'd even recommend it as an excellent primer on these concepts for any source control implementation.

Eclipse Plug-In for Subversion
I haven't installed this yet, but here it is. I'm sure that (as with most Eclipse plug-ins for popular software) it will be excellent.

SvnX
Not using Eclipse, but need a GUI client for OS X? I suggest SvnX (I think this is pronounced "sphinx"), a GUI tool for SVN (the commonly used short name for Subversion). I also found that SVN installation is pretty darn easy if you use Fink (a package and dependency management tool for open source software utilities on OS X, similar to rpm).

That should be enough to get anybody started. The book covers history, concepts, software installation, administration, and extension through development APIs, so it's the best place to start.

Good luck and let me know what you think of Subversion once you get it running!

Jul 16, 2005

Apple (or AAPL, as I Like To Call Them)

by Maxim Porges @ 8:43 AM | Link | Feedback (1)

Apple released its quarterly report this week, and it was met with investor enthusiasm across the market.

If you ask me, it's about frickin' time.

Sure, Apple's stock has skyrocketed over the last two years, but it was always based on industry analyst opinions of the iPod. Nobody will deny that the iPod is the driving factor in Apple's success - they split off an entire business unit for it in their quarterly reports, no less - but for so long, that's been the only factor the industry has measured Apple on.

For example, last quarter, Apple had a stunning set of results across the business. However, since the iPod didn't do quite as well as the analysts had hoped, the stock dropped. I thought this was ridiculous. The rest of the business was doing incredibly well, and some of the obvious results of their strategy for the last three years were swinging into motion.

It seems that this quarter, analysts looked at the big picture. The "halo effect" showed enough oomph for the analysts to admit that it was viable (duh). Mac sales were incredible. Laptop sales were incredible (I contributed to one of them :) ). iTunes is set to hit half a billion songs in short order. Retail stores are growing and continue to be profitable. And there was a huge R&D line item that indicates great things to come (of course, Apple remains tight-lipped about future product strategy, as they should).

Personally, I'm hoping for a few gems from Apple in the next few years. None of these suggestions is based in actual reality; they're more a product of my overactive imagination.

1) Apple becomes a mobile carrier, and offers top-notch wireless broadband service. Fees will be extortionate, but customers will flock to Apple's brand like the iPod-crazed monkeys that Apple's marketing department has trained them to be.

2) The iPod is reborn as a phone, PDA, video player, Sirius satellite music player, and game machine (in addition to its present feature set). It connects to Apple's broadband network, from which you can download movies, games, and whatever else Apple decides you need to buy. It has Bluetooth so that you can connect to the Internet wirelessly from your laptop using Apple's broadband cellular network without even needing to take the iPod off of your belt buckle.

3) Some kind of evolution happens to the Mac Mini which turns it into the ultimate home entertainment system. I'll leave the details to the Apple engineers, but undoubtedly it will be very cool, integrate seamlessly with the Internet buying experience, and will trounce the pants off of the embarrassing failure known as the Windows Media Center PC.

Hey, I can dream...

Jul 14, 2005

Adalon on Mac OS X Tiger: Update 

by Maxim Porges @ 10:17 PM | Link | Feedback (0)

I still haven't had a chance to test Adalon on OS X Tiger, but I'm pretty sure my friend Jeff in South Florida is using it and hasn't had any problems. I blogged previously that Sean Corfield was having issues with it, and that I was going to investigate it and post my findings.

I'm emailing Jeff now, so maybe his response can shed some light on the issue.

UPDATE: Well, I just tried it out, and it looks like Adalon runs fine on Tiger. I copied my installation on OS X Panther (on my laptop) over to Tiger 10.4.1 (on my desktop), and it ran without issue. I emailed Sean to see if he gets any output to Console when booting Adalon, which might help diagnose the problem. I'm also using the 1.4.2_07-215 JVM, which may be different to Sean's if he's using the newer version 1.5 JVM (which I haven't downloaded yet).

Java/Spring/ColdFusion White Paper Update 

by Maxim Porges @ 9:49 PM | Link | Feedback (2)

How time flies.

Last weekend was spent in South Florida with the girlfriend's family, which more or less completely destroyed the possibility of any work on the white paper getting done. I did manage to squeeze out the outline for the Java/Spring/CF white paper, so with any luck I'll crank it out this weekend (maybe without 100% tested code samples) and then I can clean it up after posting it if necessary. I'm more interested in time-to-market than total accuracy at this point.

Otherwise, the schedule has been the usual mix of completely hectic/exhilarating. I'm heading up the technical side of our Oracle team's first J2EE-based application, which I more or less put the brakes on until we decide upon (a) a development/architecture tool, (b) an SDLC/requirements gathering process, and (c) a front end platform. It's pretty interesting, since we're taking a group of primarily Oracle Forms developers, and matching them up with the Java/OO experience of myself and one of our sharpest developers, with the intent being that we build the first fully layered OO system to be deployed by the Oracle team. While the Oracle team is highly proficient and has been developing Forms/PL-SQL apps for many years, this is their first foray into OO. It's a sure sign of the level of intestinal fortitude of our Director (my boss) and the openness of the team to investigate the implementation of new technologies.

As far as the SDLC goes, we've been using FLiP very successfully on the Web Team at Westgate Resorts going on three years now, but we're the only software development team in the department using FLiP full force. The waterfall-based methodology used by the Oracle team is not FLiP. It works, but I'd like to tweak it a bit before we try to do anything OO with it; maybe a bit more forward engineering, and a slightly different tack on gathering business requirements.

From the tool standpoint, we've got demos from Rational and Oracle next week to look at OO modeling solutions, which I'm looking forward to. Between something like Rational Software Architect and Adalon, we'll have a pretty formidable tool set on the Web Team. To date, there is very little from a modeling standpoint being used by our Oracle teams, and they have expressed an interest in changing that situation.

I'm interested to see how the product demos go, especially when we do some short term trials. I'm not sure what it is about modeling tools, but they always seem to be slightly undercooked in my experience (read: buggy). I won't complain, though - they're certainly better than writing out your system documentation/forward engineering by hand.

The Oracle tool option we're considering is JDeveloper, with a potential front end option of JSF/ADF. I'm not too concerned about which front end option we go with, but I must say that after 5 years of web development, I'd rather stay away from JSF. I'm automatically wary of anything that requires a browser to render HTML and execute JavaScript as a platform for enterprise system development; there are just too many areas for "interpretation" at the client level. As an alternative, I'm extremely interested in Macromedia Flex, for a number of reasons that I won't list in their entirety - but essentially it comes down to (a) consistent performance and development techniques across all platforms/browsers (including backward compatibility), and (b) fantastic flexibility in user interface design. I'm looking forward to our front end debate to see how these points stack up, especially when we see JDeveloper in action - Oracle tends to build tools that enable rapid development, which is always a plus. Then again, when I explained our goals and needs to our Oracle rep, she told me that Rational would probably be a better option; and how often do you get that kind of candor from a sales rep! Nonetheless, my mind remains open until I see the tools in action.

Of course, I'll update you all on the outcome as the project progresses.

Jul 5, 2005

Brief Post on An Architect's View, Adalon 3.0 Broken on Tiger 

by Maxim Porges @ 11:28 PM | Link | Feedback (2)

I just noticed that Sean Corfield posted briefly on his web site about my FLiP presentation at CFUNITED. Thanks Sean!

Sean also emailed me earlier today letting me know that Adalon 3.0 does not work on Mac OS X Tiger (a.k.a. OS X 10.4). Not having Tiger on my laptop, I haven't run into this, but I'll be investigating the cause and posting the issue/resolution (if found) once I have it.

FLiP Presentation, Max On Tape, Agile Development 

by Maxim Porges @ 11:10 PM | Link | Feedback (4)

Just a quick update to let you all know that I am still alive. On my return to work, I was buried under a metric assload of email and other such niceties, which of course dominated my day.

I supplied a link to the resources for my CFUNITED FLiP presentation on my previous post (look for the link at the end of the post CFUNITED Day 2). (I would have linked directly to the resources again here, but I'm trying not to scatter resource links on too many posts until I finish migrating my blog to its final host on a hybrid Fusebox/Blogger site.)

Sean Tierney posted an audio recording of me and a bunch of other CF-nerds going off about interfaces and OO techniques in CFCs. It turns out from Sean Corfield's comment on the post that he and I disagree somewhat about this topic, which I expect will result in some interesting discourse as I post my white papers in the coming weeks. I'm going to work on the first part of the CFC/Java series this weekend, assuming I can squeeze it in around a trip to South Florida that will be happening simultaneously.

I had an interesting discussion today with Michael about changing our architecture process to being more agile. Our architecture process is currently based upon procedural Fusebox 3.0 development, even though we're using a quite different toolset now: Java on the back end and Fusebox 4.1 MVC on the front end. Now that we're doing pure OO on the back end, we should be able to break our architecture process up a bit more and get some more throughput. In tune with my usual eagerness for constant process improvement and experimentation, we'll be experimenting with this over the next few projects. I'll let you all know what transpires.