Monday, April 21, 2008

ALT.NET Seattle Summary

I attended the ALT.NET Seattle event this weekend. I had mixed emotions about attending and didn't have very high expectations. I am fairly new to ALT.NET (3 months). My first couple of weeks on the ALT.NET discussion list just about soured me on the group. There were a bunch of petty, uncivil discussions, and not much useful information was coming out of them. Luckily, I was patient, and the signal-to-noise ratio has improved recently. The ALT.NET Seattle event greatly exceeded my expectations. A lot of intelligent, passionate people attended, and a couple of the ALT.NET leaders put a lot of effort into keeping everyone civil. I think those efforts made a huge difference. Thanks to the organizers for making this a memorable event.

Friday evening started with individuals proposing topics. The heavy posters on the ALT.NET discussion list and the prolific ALT.NET bloggers dominated this activity. That wasn't surprising, nor is it a criticism; I just found it interesting to watch. Next, there was a fishbowl conversation on polyglot programming. Fishbowl conversations were new to me, so it was interesting to observe one. I think it was the right format because there were some people who would have and could have dominated the conversation. The result was that a lot of people got to say a little bit about polyglot programming, but no one dominated. The downside is that there wasn't a lot of substance to the conversation. That's OK because it set the mood for the weekend: civil conversations where everyone gets an opportunity to be heard.

The first session that I attended on Saturday was led by John Lam and was about IronRuby and the DLR. He talked a bit about IronRuby progress. They have a ways to go. He mentioned they are using the Rubinius tests as one measure of done-ness and they have passed 89% of the tests. He talked about working with code at the meta and meta-meta level and how quickly things can get complex and hard to maintain.
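To give a rough idea of what he meant, here is a minimal sketch, written in Python rather than IronRuby (the names and example are my own invention, not from his talk), of code that generates methods at runtime and code that generates the classes that generate those methods:

```python
# Meta level: a function that manufactures a getter at runtime.
def make_accessor(name):
    def getter(self):
        return self._data[name]
    return getter

class Record:
    def __init__(self, **fields):
        self._data = fields

# Attach generated properties to the class after the fact.
for field in ("title", "author"):
    setattr(Record, field, property(make_accessor(field)))

# Meta-meta level: a factory that generates classes whose methods are
# themselves generated. Two levels removed from ordinary code, and already
# harder to read, debug, and maintain.
def make_record_class(class_name, *fields):
    def init(self, **kwargs):
        self._data = kwargs
    attrs = {"__init__": init}
    attrs.update({f: property(make_accessor(f)) for f in fields})
    return type(class_name, (object,), attrs)

Book = make_record_class("Book", "title", "author")
book = Book(title="Refactoring", author="Fowler")
print(book.title, book.author)   # -> Refactoring Fowler

record = Record(title="IronRuby", author="Lam")
print(record.title)              # -> IronRuby
```

Even at two levels of indirection it is noticeably harder to see where a method like `title` actually comes from, which is the maintenance cost he was describing.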

The second session that I attended on Saturday was led by Dustin Campbell and was about functional programming. Most of the people in the room had very little experience with functional programming, so the conversation didn't go very deep. Functional programming conversations very quickly turn to how much better suited FP languages are than non-FP languages for handling concurrency. I understand how FP languages could handle concurrency well, since immutable data and pure functions remove much of the need for locking around shared state, but I haven't seen a lot of real-world examples to prove it. Concurrent computing is complex. FP languages may help with the complexity, but they aren't going to eliminate it any time soon, and they certainly aren't going to push concurrent computing into the hands of junior/intermediate programmers any time soon.
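To make the claim concrete, here is a minimal sketch of the usual argument, in Python (my choice of language for illustration, not something from the session): a pure function over immutable inputs can be mapped across a pool of workers with no locks, because nothing is shared or mutated.

```python
# A sketch of the "FP helps concurrency" argument: a pure function
# (depends only on its argument, mutates nothing shared) can be mapped
# across inputs in parallel without locks or coordination.
from concurrent.futures import ProcessPoolExecutor

def word_count(text):
    # Pure: same input always gives the same output, no side effects.
    return len(text.split())

if __name__ == "__main__":
    documents = [
        "the quick brown fox",
        "jumps over the lazy dog",
        "functional style makes this trivially parallel",
    ]

    # The sequential and parallel versions give identical results; the
    # parallel one needs no locking because nothing is shared between workers.
    sequential = list(map(word_count, documents))
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(word_count, documents))

    assert sequential == parallel
    print(parallel)  # [4, 5, 6]
```

The two runs agree by construction; the catch, as noted above, is that real workloads rarely decompose this cleanly.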

The third session that I attended on Saturday was about ASP.NET MVC. Many of the attendees use ASP.NET MVC. Phil Haack and Brad Abrams from Microsoft attended and fielded a lot of questions. We talked a bit about Microsoft's comments that MVC will see only 5% adoption. The number was just pulled out of someone's butt, and Microsoft has no idea how much MVC will be used. I think Microsoft is also very nervous about upsetting their ISVs and large corporate partners. If MVC is widely adopted, then it will be very disruptive to the web control vendors. Microsoft is always fearful of incurring the wrath of large corporations who will ask, "Why am I building applications with X when you are moving to Y?" The attendees asked Microsoft to stop using the 5% adoption number because it will scare a lot of managers away from adopting MVC. Some attendees also asked Microsoft to make it as easy as possible to leverage open source JavaScript libraries and client-side controls. They seemed receptive to the message.

The fourth session that I attended on Saturday was led by Scott Bellware and was about Behavior Driven Development (BDD) / Context Specification. Part of the discussion was about BDD and whether Bellware had hijacked the term to push something else. He admitted that may be the case. He presented an interesting way of testing software against specifications. While I liked some of the things I saw, I doubt that what he is proposing will gain any traction. His ideas were interesting, but I don't think he presented a strong case for improving anything. His approach was different, not better.

The fifth session that I attended on Saturday was led by Scott Hanselman and was about whether the .NET community innovates or contributes anything back to the open source community. It seems to me that a fair number of ALT.NETers are very bothered by the perception that all of the innovation is happening outside of the .NET platform or only in the open source community. I don't understand the insecurity. Why does it matter where innovation happens? I contributed two comments to the conversation. First, I think open source is not as prevalent in the Microsoft space because there is a rich ecosystem of commercial companies, whereas in the non-Microsoft space there isn't. Many of these commercial companies provide source code with their products, but they aren't open source companies. Second, while the .NET community may not have invented these things, it has a long list of things it has greatly improved. I used JUnit vs. NUnit as an example. Scott Hanselman translated my point into "innovation doesn't necessarily equal invention," which accurately summarizes it. Someone else made the point that much of the innovation claimed by the open source community happened some time in the past. What innovation has there been recently?

I skipped the two Sunday sessions. My son and two of my grandchildren live in Seattle, so I decided to spend the morning with them instead. There were a couple of sessions that looked interesting, but nothing I couldn't live without.

The best part of the event was the socialization. Unlike a conference, user group, or Code Camp where people are expecting to be passively taught, most people came to this event to talk/socialize. I had a number of interesting conversations with people in between sessions, at dinner, and at the bar.

4 Comments:

At 5:04 PM, Blogger Unknown said...

"I doubt that what he is proposing will gain any traction"

It's gaining quite a bit of traction amongst influencers, which suggests to me, at least, that it stands to gain quite a bit of traction more broadly.

"I don't think he presented a strong case for improving anything."

I've seen dramatic improvement in knowledge flow on the projects where I've used this, as well as better designs.

"His approach was different, not better."

Better than raw TDD practice without the supporting collection of practices called BDD, or better than other BDD practices?

 
At 1:17 PM, Blogger Kevin Hegg said...

Scott,

I should have been clearer. What I was referring to was the translation of specifications into tests as you demonstrated. I was not commenting on TDD or BDD practices themselves.

I understand what you are trying to accomplish by adopting a style for writing specifications so that they lend themselves to being automatically translated into tests. This is an admirable goal, and if you can succeed in doing so on your projects, then that is wonderful for you. However, I believe that trying to apply this to all projects is going to be very difficult. Efforts at making specifications machine-readable (and therefore more testable) are not new. People have attempted to do this for almost as long as computers have been around. Human languages are tricky things to deal with. In some domains, it is very difficult to keep ambiguity out of specifications. It is very difficult to write specifications in a manner that translates 1:1 with tests. In some domains, it is very difficult to write specifications that can be translated into classes and methods.
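For readers who weren't at the session, here is a generic, hypothetical sketch in Python of the broader "specification as executable test" idea I'm referring to. It is not your approach or tooling, and the names are invented; the point is only that the class and method names read as the specification, and the assertion makes it checkable:

```python
# A generic context/specification-style test: the naming reads as a
# specification, and the assertion enforces it. Invented example domain.
import unittest

class ShoppingCart:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

    @property
    def total_items(self):
        return len(self.items)

class WhenAnItemIsAddedToAnEmptyCart(unittest.TestCase):
    def setUp(self):
        # Context: the situation the specification applies to.
        self.cart = ShoppingCart()
        self.cart.add("book")

    def test_the_cart_should_contain_exactly_one_item(self):
        # Specification: an observable statement about behavior.
        self.assertEqual(self.cart.total_items, 1)

if __name__ == "__main__":
    unittest.main()
```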

For projects where it is possible to manage the specification writing so that the specifications are machine-readable, I see your tools as being useful. It is definitely something that should be in developers' toolbags. But as something that can and should be used on every project, I just don't see that happening any time soon.

Kevin

 
At 1:10 PM, Blogger Unknown said...

The point that I made is that these are not specifications that can be "automatically translated into tests" or made "machine-readable".

I'm quite sure I emphasized this point in the discussion because it's a point I've been making about context/specification since I started using it and teaching it last year.

Context/Specification and BDD are efforts at flowing understanding and knowledge. The goal is absolutely not to further the cause of requirements automation - although sometimes that can be a side-effect.

 
At 8:01 PM, Blogger Kevin Hegg said...

Scott,

I must have missed that part of your presentation. If so, I apologize and we are in sync on Context/Specification and BDD.

Kevin

 
