Sunday, October 22, 2006

Sourceforge download problems

Just wanted to download the updated version of TortoiseSVN, which I like a lot better than the command-line svn tool when importing projects from different folders. The only problem was that the only available download site was a server in Kent, which for some reason was down.
I inspected the download URL, which is of the form http://prdownloads.sourceforge.net/tortoisesvn/TortoiseSVN-1.4.0.7501-x64-svn-1.4.0.msi?use_mirror=kent
Well, I assumed that the use_mirror request parameter could probably be replaced by something else. I tried Google to see if I could get a list of all these alternate servers, without luck. I saw that the download server had a DNS name of the form http://kent.dl.sourceforge.net/, so I turned to netcraft.com and entered *.dl.sourceforge.net, and it immediately gave me back a list of servers matching the DNS query. Voila, I replaced use_mirror=kent with use_mirror=belnet and there my binary dumped down on disk :-)
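For the record, here is a rough Java sketch of the same trick: loop over a hand-made list of mirror names (the ones beyond kent and belnet are just examples of names you might find via the netcraft query) and retry the download with a different use_mirror value until one responds. The real download may involve an extra redirect, so this is an illustration, not a polished downloader.

import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;

public class MirrorFallback {
    public static void main(String[] args) throws Exception {
        String file = "TortoiseSVN-1.4.0.7501-x64-svn-1.4.0.msi";
        // Example mirror names; in practice, take them from the *.dl.sourceforge.net list.
        String[] mirrors = {"kent", "belnet", "heanet", "switch"};
        for (String mirror : mirrors) {
            URL url = new URL("http://prdownloads.sourceforge.net/tortoisesvn/"
                    + file + "?use_mirror=" + mirror);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setConnectTimeout(5000);
            try {
                if (conn.getResponseCode() == HttpURLConnection.HTTP_OK) {
                    System.out.println("Mirror " + mirror + " is up, downloading...");
                    copy(conn.getInputStream(), new FileOutputStream(file));
                    return;
                }
            } catch (IOException e) {
                System.out.println("Mirror " + mirror + " is down: " + e.getMessage());
            }
        }
    }

    private static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);
        out.close();
        in.close();
    }
}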

Saturday, October 21, 2006

Saturday night in

It's Saturday night and I think I will spend most of it at home together with my computer.
blogger.com has gone live with a new version of its blogging software, so I have just downloaded the new Firefox 2.0 RC3 browser and will give both the facelifted browser and the new Blogger a try.

The first thing I noticed was that my del.icio.us plugin doesn't work anymore. I have really gotten used to adding sites and bookmarks to del.icio.us using the extension, especially since I am using multiple computers, and being able to centralize the management of my bookmarks is really a good thing. Fortunately, I found a replacement, del.icio.us post, that works fine with 2.0.
The 2.0 browser now comes with a built-in spell checker, which is nice. Text that you enter in form fields can be validated on the client side and does not rely on external services. I have used it already.

Other things that I have been using and spending my time on lately are:
* IDEA IntelliJ 6.0 that was released last week
* Google Web Toolkit, GWT
* Directory Opus, a commercial alternative to Total Commander

And if you haven't moved to the new beta version of blogger.com, do it, it's better than before. I guess the Google guys have been busy lately.

Thursday, September 14, 2006

JavaZone closeup and end-comments

The last day of JavaZone has been just as exciting as the first one. I tried to add a nice mix of high-level process-oriented stuff and hard-core technical sessions to keep me from falling asleep today. Didn't get much sleep last night, huh.
Mary Poppendieck had a great session on Lean software development. I am a great fan of the Poppendiecks and especially their first book, Lean Software Development: An Agile Toolkit.
Conveniently, she announced that her new book is shipping as we speak. I am also in the process of entering my credit card number on Amazon to get a copy.
Thereafter I dived into a session on DTrace, a feature of Solaris that enables full tracing of the entire software stack, from a thread in the Java VM down to system calls in the kernel. This approach to tracing will make searching for bottlenecks in the system much easier. The talk was accompanied by some nice live demos.
Simon Ritter was another speaker that made JavaZone 2006 worth visiting. He has been experimenting with SPOT and robots, talking about RTSJ and esoteric VM options on the Solaris VM. Did you know that there are more than 400 different -XX options available in the Solaris VM as of Java 6? Well, now you do. I am currently back at work (in the middle of the night) because we are going into production with a system for a client. Suddenly I realized that it is actually a year until the next time I can spend two great days @ JavaZone.

Have something on your mind? post-it!

What is new at JavaZone 2006

I am currently attending a session at JavaZone 2006 called Are you a plumber? How to avoid plumbing for business-enriched remote clients. It is a co-talk by Ole Andre Ranvik, who is a co-worker of mine, and Bjørn Nordmoen from Western Gecko. They are basically speaking about a project they have been doing together for the last year and how they solved some of the issues they encountered.

The thing about the program this year, as far as I can tell, is that we do not have too much stunning news, new technology, new releases and all-new specs. Much more of the material this year is based upon how we actually use our technology in day-to-day work. Agile tracks and technology-driven tracks share a common property with respect to this: they cover what developers have been working with, not what someone thinks they should work with. With some exceptions, EJB 3.0 being one of them.

Wednesday, September 13, 2006

Backwards compatibility, why?

This is quite an interesting topic. I am currently attending a session on EJB 3.0 and JPA given by Patrick Linskey, who has participated in the spec work. An issue came up related to backwards compatibility and some things introduced in JPA (global named queries), and I asked if Patrick sees it as much of a problem to break backwards compatibility. He said, "we did it with EJB 3.0 and we do not want to do it again". Well, I have never heard of anyone actually complaining about this compatibility break. In my opinion, in many cases the effort spent not to break backwards compatibility is just a waste. We would be much better off getting new specs and implementations that improve faster, as opposed to versions that do not break backwards compatibility.

Meeting the speakers and authors

I am currently staffing the javaBin stand at JavaZone 2006, so I cannot attend any sessions as of now. However, staffing the stand gives me a perfect opportunity to talk to attendees, partners and speakers. Seems like everybody is having a good time. Some of the sessions have been so popular that we had to schedule them twice, like Patrick Linskey's EJB3 Java Persistence API: The Good, The Bad, and The Ugly.

So far the conference has been running smoothly. I am soon off to the next session when relieved of duty. Think I will head along to a REST vs. SOAP smackdown to see what all the fuss is about :-).

Second Session at JavaZone at 2006

The first session I attended was Effective Java Reloaded with Joshua Bloch. It was packed, so I had to stay in the back. Anyway, I got a picture of what it was all about.

I have always been fond of GUI development, both web and standalone, although I have never done too much work on GUIs during projects, mostly on my spare time.

I am currently attending First Aid for Swing by the founder of JGoodies, Karsten Lentzsch. Karsten is guiding us through the process of doing a project where you have inexperienced developers, no time for usability design and a customer who won't pay for anything. Cool, what a pragmatic attitude, this is how it works in most cases :-)

There are a lot of dos and don'ts in developing Swing applications. One is to use consistent colors and always to use native fonts, simply because they render better. Karsten often does a test on applications with excessive use of borders by imagining that the GUI is a physical three-dimensional artifact and dragging his fingernails over it: does it make any sound? "Rattcchhhhhhhhh", not good, too many borders. Good tip? :-)

Is your GUI application symmetric? It should be; use gradients and weights to achieve symmetry across the interface, subtle but effective! Use up to three, maybe five fonts at a maximum as a guideline. I guess a lot of these concepts apply to GUIs in general, not only to Swing applications.
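As a small illustration of the native fonts advice (not Karsten's own code, just a minimal sketch): install the system look and feel before creating any components, so Swing picks up the platform's fonts and colors instead of the cross-platform defaults, and keep the borders to a minimum.

import javax.swing.*;

public class NativeLookDemo {
    public static void main(String[] args) throws Exception {
        // Use the platform look and feel so text renders with native fonts.
        UIManager.setLookAndFeel(UIManager.getSystemLookAndFeelClassName());
        SwingUtilities.invokeLater(new Runnable() {
            public void run() {
                JFrame frame = new JFrame("Native look");
                JPanel panel = new JPanel();
                // One quiet empty border instead of a pile of etched ones.
                panel.setBorder(BorderFactory.createEmptyBorder(12, 12, 12, 12));
                panel.add(new JLabel("Rendered with the platform font"));
                panel.add(new JButton("OK"));
                frame.getContentPane().add(panel);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            }
        });
    }
}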

Depot Lineup at JavaZone

At the javaBin boardroom we are currently in the process of getting the last small details in place for the conference. Yesterday we had the speakers' dinner with almost all the speakers for this year's venue. I finally got to meet Mary Poppendieck, whom I had been emailing for the last six months regarding her talk at the conference. At the table were also Tom Poppendieck and Gregor Hohpe. We all had a nice chat and were looking forward to a couple of hectic and nice days at JavaZone in Oslo.

I will try to get up-to-date news published along the way.

Monday, September 11, 2006

Heading to JavaZone 2006

Currently planning the various sessions to attend at JavaZone 2006 in Oslo, 13-14 September. The lineup and two-day agenda is impressive, I must say.
I have myself been part of the program committee, which has been working for almost a year now to get the program in place. I hope all of you developers out there look forward to joining up at JavaZone 2006 as much as I do. A mighty combination of well-known names like Rod Johnson and Mary Poppendieck and local heroes like Eirik Torske and Ole Andre Ranvik are some of the speakers this year.

Please also spend some time looking through the agenda and don't join the crowd. Maybe you are in for a surprise when attending the not so well-known speakers?

See you there!

Saturday, June 24, 2006

TSSJS Barcelona closeup

Well, all good things come to an end, including this conference. I will use the spare time tomorrow to reflect upon what I liked and what could be improved about the sessions at TSSJS in Barcelona, which all in all was a pleasant experience, hola!

Friday, June 23, 2006

Model generation and visualization

With MDA being totally out of the picture, Gregor Hohpe (energetic and excellent speaker) and Erik Doernenburg are getting down to what is really important when it comes to modeling: not to create pictures that you can generate code from, but to generate a graphical presentation from an existing system and an existing codebase.

I wrote a blog entry called "Feeding the architect" that dealt with this topic, in that case using Maven, dotuml and GraphViz to generate diagrams (static and dynamic) from an existing codebase. The main problem is to combine data from static code analysis and from a running system, exactly what Gregor and Erik are presenting.
Several different models are suggested:

  • DAGs
  • Metrics
  • Petri nets
  • Trees
Try to extract data from static and dynamic analysis in a simple form, preferably in a text file, and then extract data from the text files and map it into whatever model you see fit for the given analysis. A good example of this may be to get a representation of message flow through a workflow system.

The first example uses a 95-line XSLT to generate a dot file that is fed into GraphViz to produce a dependency diagram from a Spring config file. Well, since I am lazy, I would rather just use BeanDoc. Remember that lazy is good here, a do-more-with-less kind of thing :-) But I think the audience gets the point.
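For the curious, here is a minimal sketch of the same idea without XSLT: read a Spring XML config and emit a GraphViz dot file of bean-to-bean references. This is my own illustration, not the code from the session, and it ignores namespaces, anonymous beans and proper error handling.

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;

public class SpringConfigToDot {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(new File(args[0]));
        StringBuilder dot = new StringBuilder("digraph beans {\n");
        NodeList beans = doc.getElementsByTagName("bean");
        for (int i = 0; i < beans.getLength(); i++) {
            Element bean = (Element) beans.item(i);
            String id = bean.getAttribute("id");
            if (id.length() == 0) continue;
            // Every nested element with a ref attribute (property, constructor-arg) becomes an edge.
            NodeList children = bean.getElementsByTagName("*");
            for (int j = 0; j < children.getLength(); j++) {
                String ref = ((Element) children.item(j)).getAttribute("ref");
                if (ref.length() > 0) {
                    dot.append("  \"").append(id).append("\" -> \"").append(ref).append("\";\n");
                }
            }
        }
        dot.append("}\n");
        System.out.println(dot); // pipe into: dot -Tpng -o beans.png
    }
}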

The next demo is much more interesting. By doing some modification of the jar packaging and adding some metainformation, the build step creates metainformation in text files that lets us create a graphical presentation of dependencies between jar files and components, and clickable images drilling down to classes and source code. This may be a very effective means to get an overview of the system and to see how well it really aligns with the architecture that you anticipated. This is good stuff, and it should be added to the tool portfolio of every architect and developer.

Gregor now gives a presentation of how he has generated an SVG from a running messaging system by instrumenting his own messaging library.

Process mining is interesting in the sense that we do not only present a model that can be generated from a simple dynamic and static analysis, but take a heuristic approach to get a picture of what kind of processes execute and in what order. Gregor presents a really nice example using ProM to visualize processes with Petri nets. This is a tool that accepts XML files and presents them nicely in a Swing GUI, really cool.

JCP Panel discussion

Onno Kluyt is leading the panel with the following guys:

  • Jon Bostrøm
  • Tom Baeyens
  • Mike Keith
  • Cameron Purdy
  • Gavin King
With such a lineup, most (all?) of the representatives on the panel are also either JSR members or spec leads, which makes it a little bit difficult to get some interesting and heated arguments, but anyway. Having the experts in the field, or at least some profiles within the community, lead the work is a good thing to start with. It is not sufficient to get the right technology out to the people, but it might be a start.

The time-to-market problem, especially with the Java platform releases, is being discussed, picked up from a comment by Gavin.

Companies like Microsoft probably implement similar processes internally, although they are most definitely not public!
Kirk Pepperdine claims that you pretty much need a lawyer to get through the initial process of signing up and getting up to speed. This is quite different from entering an open source community, where you are judged on what you have achieved (or are achieving) rather than on being able to sign off legal documents.

Cameron encourages the members of the audience to participate and contribute instead of, as Gavin King says, just complaining about stuff that doesn't work. Some (around four) of the JSRs currently running are led by individual members, with Groovy (scripting for Java) and Concurrency as good examples. Surprisingly enough, more than 50% of the members of the JCP are actually individual members.

Grid based banking

John Davies from C24 has been one of the profiles at this convention. I missed the keynote, so I wanted to drop in at the banking and grid session to see what the fuss was all about.
In a lot of cases where we deal with applications, we don't really need databases as we know them, John claims. Rather than viewing the database as the central repository of all information, you can view the grid as the repository of the information.

Typical cost is $1 per CPU per hour, which is interesting when several of the banks John is working for have as many as 10K CPUs; at that rate, keeping 10K CPUs busy runs to roughly $10,000 an hour, so utilizing them effectively is important with respect to ROI.

C24 wants to avoid using XML over the wire when transferring information because it is not efficient enough, and uses Java data binding and efficient object serialization instead. They do this through something they call Integration Objects. The reason John discourages the use of XML in the grid computing space is exactly that it is not efficient enough. This is also an important point in the SOA space discussions.

How do you do this when you don't have any data stored in a central repository, only in a grid where you don't have a query language?
John is referring to a case study where everything is stored and manipulated in two different grid spaces: one that receives the feeds and organizes them, and one replicated space that is optimized and used solely for querying.

The technology is based upon JavaSpaces and has some quite interesting attributes, one of them being that querying in JavaSpaces is done based on what kind of interface(s) are implemented, plus some matching logic. The thing here is that the searching seems to be quite procedurally oriented, like searching for some coarse-grained items and then doing a new query on that subset. I would prefer this to be more declaratively oriented: don't tell the system how to do it, tell it what to get! This is what has been working well with RDBMSes for the last 15 (maybe even more) years. Maybe it is just because I didn't understand the example?
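To make the template-matching style concrete, here is a hypothetical sketch using the standard JavaSpaces API. The Trade entry type and its fields are made up for illustration; null fields in the template act as wildcards, non-null fields must match exactly.

import net.jini.core.entry.Entry;
import net.jini.space.JavaSpace;

public class TradeQuery {

    // Entries must be public classes with public object-typed fields and a no-arg constructor.
    public static class Trade implements Entry {
        public String currencyPair;   // e.g. "EURUSD"
        public String counterparty;
        public Double amount;
        public Trade() {}
    }

    public static Trade findAnyEurUsdTrade(JavaSpace space) throws Exception {
        // Build a template: only currencyPair must match, everything else is a wildcard.
        Trade template = new Trade();
        template.currencyPair = "EURUSD";
        // read() returns a matching entry without removing it; take() would remove it.
        return (Trade) space.read(template, null, 2000L /* ms to wait */);
    }
}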

Rickard Öberg in the audience asks what the main gotchas are. One of the gotchas with JavaSpaces and the way C24 applies the technology is classloading, along with understanding the nature of the distribution and multithreading issues in grid environments.

What kind of knowledge will you need to implement this technology effectively? Knowledge of grid computing and of products and technologies that may have been around for a while without gaining too much momentum is definitely an issue. This might be a major challenge in implementing grid computing effectively.

Productive coding

Dr. Heinz Kabutz, author of the Java Specialists' Newsletter, is talking to us about coding productivity and about using the IDE and your tools effectively. This is also something I have been thinking a lot about, since I am often in a position where I do pair programming and coaching. Then you tend to notice how other people work, whether they use keyboard shortcuts, how they navigate and so on.

Being able to use the IDE effectively, especially using keyboard shortcuts, is really a differentiating factor in productivity. How fast you type, and whether you touch type or not, may also be a factor, because we still think faster than we type.

It is important to actually make sure that you know what code you actually need, i.e. which code gets executed in production. Important, since most money goes into maintenance and you don't want to maintain unused code.

Heinz also talks about how he has made a doclet that finds things like unused methods, fields and so on. I use IDEA to locate stuff like this instead of running a doclet.
More discussions on exception handling, increasing the use of private/protected scoping and so on. Many developers should read up on this; reasonable scoping will make your classes easier to use and less confusing, and actually reduce the chance of anything being duplicated if the API uses a reasonable naming strategy and relevant javadoc where needed. Still, Heinz has not told me anything that I was not already aware of, except for a couple of funny comments in the J2SE core AWT code and how the actual enforcement of access to final fields has changed back and forth from 1.1 to 1.6.

Development environment of the future

This is a really interesting topic; the topic of the discussion is development environments. IDEs have been one of the most important factors related to developer productivity and are often the starting point for religious debates amongst developers.

Panelists

  • Wayne Beaton (eclipse community)
  • Erik Doernenburg (Thoughtworks)
  • Bruce Tate
  • Cedric Beust (Google)
Cameron Purdy is currently moderating the panel and he is collecting questions from the audience in writing. I think the fact that he collects questions in writing prevents some effective dialog. People want to talk, not to write so that someone else can talk :-)
Johannes Brodwall (sitting next to me) brought up a question related to refactoring and whether we are going to see continued support for refactoring. The panel is not in consensus: Erik claims that we will not see too many new refactorings being developed, whereas Bruce points out that the diversity of languages we are actually using, especially in the AJAX space, will require IDE support and continuous effort in refactoring support.

Large team support and tool support for collaboration is a very important aspect. Improved integration with communication tools and XP planning tools might be something that can keep us focused on what we do. This is a major problem on projects I have been working on; developers who are unfocused and not aligned with fine-grained priorities are something that puzzles me. Continuous effort in coaching development teams is something that might be facilitated through the use of an IDE integrated with planning tools (I am not sure if I want a project manager in my IDE ...).

AOP support and support for auxiliary functionality that is not represented as Java class files, like scripting languages, navigation, debugging and profiling, is missing. Getting first-class support for multiple languages integrated more easily would be a neat feature.

Cedric points out how neat it is to use a debugger now and then. He urges people that are not using debuggers in Java to start doing so instead of using System.out.println statements. Not much has happened in the debugging space for the last 10 years.

As it seems now, the Java community with IDEA and Eclipse is by far ahead of MS DevStudio, while this was quite different only a few years back. This seems to be swinging back and forth, with MS DevStudio now getting refactoring support and so on. Myself, I am a strong believer in competition and I really think that this will only improve the quality of the tools we are working with.

One reason why Java has been successful is that backwards compatibility has been supported to great lengths. This means that leveraging existing systems and migrating from one JDK to the next is easier than it might have been. There are a lot of language issues that need to be fixed if we should be able to support stuff like continuations. Thus JVM support for running new languages might be a nice feature.

For those of you that didn't know, Cameron is using (as am I) IDEA.

Erik Doernenburg from Thoughtworks answers a question related to BPEL support and workflow support in IDEs by saying that SQL was supposed to be the language that would enable business analysts to write code. Now not even developers use SQL anymore (a pity in my opinion, because the declarative approach in SQL addresses problems, especially in batch applications, that cannot be solved in middleware), so enhanced support for business modelling languages is a dead end, at least according to Erik.

Closing up the panel, the ultimate question is thrown out, in which direction are we heading:
  • Java is still going to stand strong
  • tools that enable us to work with multiple languages in an integrated manner
  • quick feedback related to catching error situations to reduce roundtrip time
  • higher abstractions like AOP and annotations, maybe also metaprogramming (like GWT), will be even more widely supported
  • domain specific languages
  • lessening the need for code that has to be written, which means less code, and even less generated code, because generated code will also have to be managed

Thursday, June 22, 2006

Quest for effective Web frameworks

We had a discussion during lunch here related to web frameworks: how many there are, how they compare, and how web development in the Java space is actually quite difficult. Creating a nice-looking web interface with good interaction capabilities and dynamic content is difficult and costly, and I am not particularly happy with what I have seen so far. I have done some work using Spring MVC, JSP 2.0 and JSTL lately, and it works fairly well, but for the presentation part there is still a lot missing. Spring 2.0 will have a new set of tags that will make life easier, but anyway. Wicket seems appealing.

What appeals to me is the underlying component model, the fact that you are able to work with the page itself as a first-class object, and being able to have a simple databinding approach.
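Just to illustrate what that page-as-object model looks like, here is a minimal, hypothetical Wicket page; the component id and text are made up, and a matching HelloPage.html markup file with a wicket:id="message" element is assumed.

import wicket.markup.html.WebPage;
import wicket.markup.html.basic.Label;

// The page is a plain Java object; components bind to ids in the markup,
// so there is no separate template language to learn.
public class HelloPage extends WebPage {
    public HelloPage() {
        add(new Label("message", "Hello from a first-class page object"));
    }
}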

JPA and caveats

Patrick Linskey is giving his talk "The Good, the Bad and the Ugly" related to EJB 3.0 persistence. Interesting to get a more detailed picture of the JPA specification. I really find it more interesting today compared to last year. One of the reasons is that we have more or less nothing but runtime dependencies and some wrapper code around a generic repository implementation in my current project. This may make it feasible to actually switch persistence providers, since several good open source ones are available already. The only thing that will require some work is getting rid of the implementations we have related to searching that use the Hibernate Criteria API. Maybe JPQL will do the job? I don't know; I have never used that part of the API.
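As a hedged sketch of what one of those Criteria-based searches might look like rewritten against the JPA API: the entity and field names (Customer, lastName) are hypothetical, and whether JPQL covers all our dynamic search cases is exactly the open question above.

import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.Query;

public class CustomerSearch {
    // A simple parameterized JPQL query in place of a Hibernate Criteria lookup.
    public List findByLastName(EntityManager em, String lastName) {
        Query query = em.createQuery(
                "select c from Customer c where c.lastName like :pattern");
        query.setParameter("pattern", lastName + "%");
        return query.getResultList();
    }
}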

One thing Patrick went into depth explaining, related to POJO-based persistence, is really interesting. It highlights some subtleties related to designing APIs and access to fields and properties. This applies to most mapping frameworks, including Hibernate, and is relevant when it comes to discussing domain models and where to put the actual business logic.

Consider the following code fragment representing the setter for the age property in the Customer class:

public void setAge(int age) {
    if (age < 18) {
        throw new JuvenileException("We don't accept young customers");
    }
    this.age = age;
}

This might seem like a business rule to implement in the Customer class. Why? Because principles of cohesion should be the rule, and the business logic should, for reasons of clarity, be placed in the Customer class implementation. But what if this is a new rule that did not previously exist? You have provided this rule such that the API will prevent you from entering illegal data. But that also means that the persistence framework, which will use the setter for populating objects from a search, will get the same exception as a side effect on data that was valid last year. Hence using setters as a mechanism for managing state in persistent entities is dead wrong. It is a good way to enforce integrity for new (transient) entities, but not for dealing with the persistence lifecycle. After all, APIs and business rules tend to evolve, but historic data tends to stay the same!
The irony is that you will probably be pretty happy with the first release of the system, but after the system goes into maintenance, errors will start coming in after the first upgrade. Consultants will probably not even notice; they are on their way implementing setX and setY with heaps of business logic in new domain objects for new customers, and by habit using property (instead of field) based access in entities :-)
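A minimal sketch of the field-access alternative, reusing the hypothetical Customer and JuvenileException from the fragment above: in JPA the access type follows where the mapping annotations are placed, so annotating the field (rather than the getter) lets the provider populate age directly and bypass the business rule when loading historic data.

import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class Customer {

    @Id
    private Long id;      // annotations on fields => field access for the whole entity

    private int age;      // populated reflectively by the provider, no setter involved

    public void setAge(int age) {
        // business rule applies only when application code changes the state
        if (age < 18) {
            throw new JuvenileException("We don't accept young customers");
        }
        this.age = age;
    }

    public int getAge() {
        return age;
    }
}

// Hypothetical exception type from the fragment above.
class JuvenileException extends RuntimeException {
    public JuvenileException(String message) { super(message); }
}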

Geronimo lineup

Who is interested in Geronimo? Obviously a lot of people, although I don't know too many people actually using it apart from using it as Maven dependencies for the J2EE APIs.

I attended Bruce Snyder's presentation on ServiceMix and found it interesting because I have been reading up on Mule a little bit, so I attended to get a comparison. I am currently also working on a project where we are well on the way to actually implementing small parts of an ESB, so the topic is really interesting. With that said, I am not necessarily an advocate of adding a dependency on a third-party library just because it contains an out-of-the-box implementation of something I can do with 10 lines of Java code.
Managing 100 different dependencies and possible incompatibilities might be just as much of a nightmare as writing the 1000 lines of code that those frameworks help me to replace.

The panel lineup of the Geronimo panel discussion is:

  • Bruce Snyder
  • Matt Hogstrom
  • James Strachan
  • Aaron Mulder
  • David Jencks
The panel is moderated by Ted Neward, a great speaker in my opinion. The first thing discussed is the role of OSGi and XBean and how they may be used to complement each other in Geronimo. OSGi is not supported in Geronimo, though; one of the reasons is complexity, the panel claims.


Why should we select a commercial vendor instead of going for an Open Source alternative, and actually end up paying for a J2EE server, here are some of the points from the panel:
  • Commercial vendors often provide entire suites of products that may complement the J2EE server
  • Support and product roadmaps
  • Performance and scalability
  • Whether the product has been subject to rigorous testing (is this a way of saying that the test suite accompanying Geronimo could be improved?)
  • Service pack/fixpack approach, most commercial vendors provide support for incremental updates
There are a couple of guys from IBM up on the panel, and they are currently debating how developing and embracing Geronimo may affect WebSphere and vice versa. An IBM representative says that they will try to get to a point where changes are merged and updated bidirectionally. Nothing is said about whether they want to establish a common codebase for Geronimo and WebSphere.

Spring 2.0 Highlights

Rod starts the presentation by saying that the final 2.0 release will probably be scheduled for the 5th of July this year. This is good because it enables a lot of projects to conclude the upgrade to Spring 2.0 during the summer. A lot of development environments enforce summer freeze periods, which makes the summer an attractive period for experimenting with such things. Currently I am busy getting things production ready for a big client, so I will have to do this in my spare time when I am not hanging out with the guys at PhatMC.com

There are a lot of nice new features that are of particular interest to me, as we are using Spring on the project I am currently working on. Especially the support for asynchronous messaging without MDBs is really good. By replacing the message activation infrastructure we are currently using with POJO-based messaging, we will probably be able to:

  1. simplify packaging and deployment, no longer any need to package EJBs (we only use MDBs)
  2. move some of the testing from the container-based integration environments into the continuous testing environment and JUnit tests
Also, simplified configuration and XML Schema support will probably enable us to reduce the amount of Spring config we are using. The support for dynamic languages will probably also enable us to experiment with new ways of handling Spring configuration and the proliferation of configuration files. And yeah, it's fully backwards compatible.
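Here is a rough sketch of the message-driven POJO approach mentioned above, wired up programmatically rather than in XML for brevity. The destination and handler names are hypothetical, and in a real Spring 2.0 setup the listener container would normally be declared as a bean in the application context.

import javax.jms.ConnectionFactory;
import org.springframework.jms.listener.DefaultMessageListenerContainer;
import org.springframework.jms.listener.adapter.MessageListenerAdapter;

public class OrderListenerSetup {

    // Plain POJO, no MDB or MessageListener interfaces; the adapter maps incoming
    // messages onto this handler method and converts the text payload for us.
    public static class OrderHandler {
        public void handleMessage(String orderXml) {
            System.out.println("Received order: " + orderXml);
        }
    }

    public static DefaultMessageListenerContainer start(ConnectionFactory cf) {
        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(cf);
        container.setDestinationName("orders.queue");
        container.setMessageListener(new MessageListenerAdapter(new OrderHandler()));
        container.afterPropertiesSet();  // normally done by the Spring container
        container.start();
        return container;
    }
}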

One of the reasons why Spring 2.0 has been delayed has been the support for JPA, which seems to be gaining momentum and is supported by TopLink, Hibernate and Kodo.

Understanding open source

A guy from Sun responsible for managing http://www.sunsource.net/ and participating in developing Sun's open source strategy is here to explain to us different strategies and approaches for dealing with open source. He starts by explaining the shift from the consumer age to the participation age. The key is to understand that certain events change the way we think about stuff; I think the point he is getting at is the way that open source has changed the way we think about software. Various patent models and how large corporations think regarding licensing and patents are also interesting.

Simon Phipps also believes that we will stop paying for using software in advance, and shift towards paying when it adds value. It would be interesting to see how such policies might be enforced. As a consultant I play a role in providing business value through adapting and creating software, not through the software itself, so this is an interesting thing.

Open source communities are based on a casual alignment of interests, Phipps concludes.

It would be interesting to discuss further how it is possible to actually charge for software when it adds value, beyond simple scenarios like the number of customers or the number of employees in the company that bought the software. Neither of these marketing models is sufficient to define market value.

Day II TSSJS

Looking briefly through the agenda regarding web application frameworks, there are only two presentations on web frameworks this year, Wicket and RIFE. Not too many people had even heard of those last year. The proliferation of frameworks in the web space really says something about the maturity, or lack of such, in this space. RIFE has continuations, which enable a really interesting programming model that reaches far beyond being able to support the back button in the browser.

Enabling productivity, effective development environments and a component-based model like JSF, maybe where everything is expressed in Java, will be the way to go, I believe. I have been doing a bit of JSP programming and JSTL 2.0 lately. In combination with clever use of CSS, like Maxdesign, it makes web development much simpler, even if not as simple as I would like.

Wednesday, June 21, 2006

Gregor on SOA

I attended Gregor Hohpe's session on SOA in Las Vegas a little over a year ago and liked it. I wanted to drop into his session in Barcelona to see if the presentation has been updated.

Gregor now works for Google, let us see if that has changed his attitude and presentation form.

Gregor gives the following definition of a service:

  • Self contained
  • Independent of consumer context (makes few assumptions on use)
  • universally accessible
Seems as if Gregor is pretty much doing the same presentation as he did at TSSJS in Vegas 2005; he is even cracking the same jokes. If he had put up a warning sign, I could have been listening to Ross Mason talking about using Mule as an ESB instead.


Architectural intent: Gregor talks about architectural styles and what architectural artifacts and specifics define SOA as a concept and architectural style. To define SOA, he claims it is just as important to state what SOA is not in order to give a proper definition.

Gregor uses Starbucks as an example in his discussions of the SOA space. At Starbucks (at least in the U.S.) the staff perform highly specialized tasks: one collects the order, one makes the coffee and one processes the payment. This is throughput optimization, but it requires a lot of staff, at least three people in this case. If throughput is not the issue, you might be just as well off having one effective employee that does a good job; it is simpler, more general purpose and less expensive. The point is, maybe you can stick with simpler. Are loose coupling and high cohesion always for the best, even if you can implement everything in one small class instead of three? I think the problem with metaphors like the Starbucks one is that I find it rare that people actually ask what the reason for having this design is, and why not make it simpler. Developers tend to embrace new technology and paradigms regardless of their applicability. This may be good or bad. Bad in the sense that we may choose the wrong technology for a specific task. Good in the sense that we build knowledge on what works and what doesn't.

For written definitions of what SOA is that you can memorize before you attend JavaZone 2006 and want to ask questions, Gregor recommends Orchestrationpatterns.

Q & A
The difference between REST and SOAP is briefly discussed. Gregor points out that the important part of SOA is really the architectural style, not implementation-specific issues.

Where is the coffee and WIFI

Hey, we have to pay for WIFI here at Fira Palace. I am staying at the Vincci Arena nearby, where WIFI is free. I miss free WIFI, but to feed the community with information I am more than happy to pay 17 Euros for 24 hour access.

We didn't get coffee when we arrived. I would like to get coffee when I want and as much as I want.

Anyway, the conference is actually quite intimate: 250 participants, 275 with speakers.
My current session clashed with Gavin King's session about Seam, so I went to listen to Jonas Boner instead.

Clustering in Spring

Jonas Boner is talking about clustering in Spring, a topic of interest since I am currently developing applications that are deployed into a clustered environment, and I want the fact that applications are deployed into a clustered environment to be transparent. It should be possible to deploy applications that are clusterable also into non-clustered environments, such as an embedded Jetty container, because that facilitates effective unit testing.

Jonas comes up with a problem definition: scaling, and the need to share data throughout multiple JVMs. I think that in most cases this should not be necessary. In stateless applications you might just as well keep multiple copies of reference data (which constitutes what Martin Fowler refers to as the knowledge tier), which is most likely read-only anyway.

Jonas starts to go into an example, a Store domain class that is subject to CRUD operations, and how to replicate state changes across a cluster. Hmmmm, this sounds more like how to replicate state across a cluster; well, before you do that, ask the question: do you need to? If you want to store the data, put it in the database, you will have to put it there eventually anyway.
Now he continues to discuss aspects related to tracking changes. Well, I have been using Hibernate for that for some time, so these are for the most part just problems from an OR-mapping perspective.

Now some stuff that actually relates to the Spring clustering topic and Terracotta of course, which is the product you will need to support directly what Jonas is referring to. Hani has some viewpoints on this.

In my perception, try to design your application so that you have uniform access to data, whether read-mostly/read-only reference data or operational data. By designing a data layer effectively, I see no reason whatsoever to associate Spring with clustering through Terracotta. If clustering relates to sharing data in a cluster, isolate this in the data layer; you will most probably be using an OR-mapping tool anyway. Seems as if Jonas is taking areas related to data synchronization from the data layer and moving them higher up in the application tiers, where they do not belong. Synchronizing data using the database is probably the most effective solution anyway.

Continuing the TSSJS venue

I've just been relaxing in the back at a session with Geert Bevin, the founder of the RIFE framework, and he has been explaining to us about continuations and how they are being used in RIFE, a web framework that uses continuations heavily.
I was missing some information related to how continuations may be used decoupled from RIFE, but apart from that it is a very powerful concept. I am not really sure how it will affect the programming model that I am currently using on my projects, since all the classes that may contain state in continuations must be cloneable, but I will have to dig into that.

Sunday, June 18, 2006

Preparing for TSSJS-Europe

Just been browsing through the agenda and preparing myself for the Barcelona trip. I have been planning the agenda online as well, using the online agenda planner available at http://javasymposium-europe.techtarget.com/ . The slots available in the online tool are not the same as in the online agenda, and it does not present the slots side by side when they overlap either. Guess I will stick to a printed version and just put circles (with a pen) around the slots I want to join.

The agenda is not overly impressive. I counted 43 sessions plus some keynotes and some BOFs. But given the lineup, I will not have any problems finding something interesting to attend.

Compared to what we are planning for JavaZone 2006, the venue in Barcelona is small; we are currently concluding an 80+ session programme with an even more impressive lineup.

Friday, June 02, 2006

Seaview is Back in business

After the hacker havoc at svitjod.rymdweb.com, we never got our b2evolution installation up again. A pity, but that's life. I might as well find somewhere else. I am soon heading to TSSJS in Barcelona as well, and I want to get ready for some hardcore blogging from the conference, so get ready, at least I am (will?)