Apr 09 2011

PingTags and LinkedIn…

Category: Blogging, Misc | Phil @ 7:00 am

I recently attended a couple of webinars that focused on the relationship between social media and recruiting. It was very interesting how much emphasis was put on your Internet presence. LinkedIn was the obvious leader in professional networking. The speaker actually said your LinkedIn profile is ten times more important than your resume. I’m not sure I completely believe that point, at least not until I can apply for a job opportunity with my LinkedIn profile URL! However, there must be some truth to the statement; they also said less than 10% of jobs are filled by resumes randomly submitted via job websites. Most jobs are obtained through your professional network, almost eliminating the need for an actual paper resume. With so much information available about you on the Internet, it is just easier to Google someone. If a prospective hiring manager Googles your name, what exactly will they find? An interesting question… Are you completely anonymous, with no Internet presence at all? Or do you have a LinkedIn profile that is essentially empty, with a profile picture of Rover, your favorite pet? What exactly does that say about you, professionally? Internet presence is the exact reason I started this blog. In the back of my mind, I thought that blogging would be a great personal communication tool. How better could I share my thoughts, ideas, and lessons learned? It hopefully communicates what I believe is important, from a typically professional perspective.

On to PingTags, a new service which I believe started earlier this year. It is pretty simple: generate a QR code and link it to your LinkedIn profile. Give it a try, scan the code to the left! After spending way too much time on my resume and LinkedIn profile, I was rather intrigued by this idea. I have seen QR codes in magazines, but was never excited by them. They are obviously bound to a paper world, and I’m not really much of a paper person. At least 95% of my reading is done through my phone or Kindle, which essentially eliminates the need for scanning QR codes; I just click the link! So, what is the point of PingTags? What about your business card? That was another interesting suggestion from the webinar; they recommended that everyone have a personal business card, something you can easily share with the people you meet. I honestly have not had a business card in over 15 years. A personal business card is something that I really never thought about, but it does seem like a good idea; and that is exactly what PingTags wants you to do… put the QR code on the back of your business card. I actually thought that was kind of cool, in a geeky way! Flip over the card, scan it, and navigate to a nicely formatted mobile version of your profile. Not sure where I will go with this, just good information to know!
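If you are curious what is behind the magic, a QR code is really just an encoding of a URL, and you can generate one yourself in a few lines of Java. Here is a minimal sketch using the open-source ZXing library (core plus javase modules); the profile URL, image size, and output file name are my own assumptions for illustration, since PingTags does all of this for you:

    import com.google.zxing.BarcodeFormat;
    import com.google.zxing.common.BitMatrix;
    import com.google.zxing.qrcode.QRCodeWriter;
    import com.google.zxing.client.j2se.MatrixToImageWriter;
    import java.nio.file.Paths;

    public class ProfileQrCode {
        public static void main(String[] args) throws Exception {
            // Hypothetical profile URL; substitute your own public LinkedIn address.
            String profileUrl = "http://www.linkedin.com/in/your-profile-name";

            // Encode the URL into a 300x300 QR code matrix.
            BitMatrix matrix = new QRCodeWriter().encode(profileUrl, BarcodeFormat.QR_CODE, 300, 300);

            // Write the matrix out as a PNG, suitable for the back of a business card.
            MatrixToImageWriter.writeToPath(matrix, "PNG", Paths.get("profile-qr.png"));
        }
    }

Any barcode scanner app on a phone will decode the resulting image straight back to the URL.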



Mar 29 2011

Tired of reading blogs? Try Listening to some podcasts…

Category: Blogging, Software Development | Phil @ 12:01 am

During my time off, I decided to start listening to some technical podcasts. I assume this makes me a complete geek, but since I increased my running distance to three miles per day, it turned out to be a very good use of my time. I believe podcasts can be an excellent source of information and even entertainment. The best strategy is to listen to different topics, topics that you don’t encounter every day. A person can only read so many blogs! If you are like me, you probably only read blogs which are related to your personal interests. That seems completely normal, but how do you ever get exposed to different technologies or techniques? Unless you change companies or completely refresh your team every 6 to 12 months, your learning environment can become very stale. Listening to different topics is a great way to re-energize your brain and kick-start your creative thought process; at least it works for me!

Several months ago, I discovered Scott Hanselman. He happens to work for Microsoft, but don’t hold that against him. He records several podcasts each week; I subscribe to two of them and they are both excellent. Scott focuses on a variety of software development issues and technologies. Hanselminutes is more technology oriented, while This Developer’s Life covers a variety of personal issues, from motivation and drive to the Egyptian Revolution. I highly recommend going back and listening to all of the old This Developer’s Life episodes; I can’t tell you how insightful and reflective they are. As an added bonus, you get to hear some rather interesting musical choices! The people that he interviews are so interesting and have some really funny commentary. You will hear stories from different types of developers, both famous and infamous. You will learn how they navigated through their careers, sometimes successfully and other times, not so well. All I can say is that they are definitely worth the time! I hope you enjoy them as much as I did.

Interesting Podcast Topics from Hanselminutes…



Mar 28 2011

Continuous Testing Plug-ins versus Continuous Integration

Category: Continuous Integration, Eclipse, Testing | Phil @ 12:01 am

This is an extremely interesting concept, but I really wonder how it works in the real world. Toy projects are one thing, but real production-size applications are a different story! The principle is very simple: modify a piece of code in your project and the IDE automatically executes all of the unit tests. Pretty cool concept. But what if your unit test suite runs for several hours? I believe this is an all too common problem. I have seen many projects dedicate time to building their unit tests and, over time, end up with test suites that can run for multiple hours. There is a very negative side effect associated with this scenario: it can change a developer’s attitude toward checking in code; it actually discourages it. If developers know their commits will not be validated for several hours, they tend to check in very infrequently and only when they feel it will not impact other developers. Many factors can contribute to this scenario. The primary factor is that many of these tests are not really unit tests; they typically do not provide the proper isolation (via mocking) and take too long to execute. I’m not going to address those issues directly, but the concept does support and encourage the idea of quick test execution.
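For contrast, here is a minimal sketch of the kind of fast, isolated unit test that continuous-testing tools are built around, using JUnit and Mockito; the OrderService and PaymentGateway types are hypothetical names invented for the example:

    import static org.junit.Assert.assertTrue;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import org.junit.Test;

    public class OrderServiceTest {

        // Hypothetical collaborator that would normally hit a slow, remote system.
        interface PaymentGateway {
            boolean charge(String orderId, int amountInCents);
        }

        // Hypothetical class under test; the dependency is injected through the constructor.
        static class OrderService {
            private final PaymentGateway gateway;

            OrderService(PaymentGateway gateway) {
                this.gateway = gateway;
            }

            boolean approve(String orderId, int amountInCents) {
                return gateway.charge(orderId, amountInCents);
            }
        }

        @Test
        public void approvesOrderWhenPaymentSucceeds() {
            // Mock the external collaborator instead of spinning up a container.
            PaymentGateway gateway = mock(PaymentGateway.class);
            when(gateway.charge("order-42", 1000)).thenReturn(true);

            assertTrue(new OrderService(gateway).approve("order-42", 1000));
        }
    }

Because the collaborator is mocked, the test runs in milliseconds, which is exactly what keeps a continuous-testing plug-in usable.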

Wouldn’t it be cool to run all of the unit tests in your IDE before checking in your changes? Obviously, this action is expected of the developer and is completely possible without any special tools. However, it is not automatic and requires extra effort by the developer. Usually, it is easier to let the Continuous Integration server run the tests and fix the fallout later. It seems like a simple problem to solve: create a plug-in that automatically executes all of the unit tests when a file is saved, requiring no additional effort or interaction from the developer. If you are an Eclipse user, you have two good options, JUnit Max and Infinitest. I would really like to try Kent Beck’s JUnit Max, but you need to purchase a subscription to use (and even try) the plug-in. That is a show-stopper for me, as there are free alternatives such as Infinitest. I evaluated the Infinitest plug-in last year, after it was open-sourced. I did encounter some problems with the plug-in; my Spring-based projects did not work. I was quickly discouraged and abandoned the tool. Months later, I was still intrigued by the idea, so I gave it a try last week on a different project and achieved success!

Unfortunately, Infinitest is not currently installable from the Eclipse Marketplace. It can be quickly installed the old-fashioned way from the Eclipse update site. There is only one setting you may want to tweak, the Slow Test Warning threshold. If you have unit tests that execute for longer than half a second, Infinitest will add a warning to your Problems View for each one; this could be problematic if you have a large number of long-running unit tests. You should also notice a huge new “Continuous Test” status bar in your Eclipse footer. It will update automatically based on your coding activity. JUnit Max is a little more subtle, but provides the same type of feedback mechanism. That’s it, you are now ready to write code and auto-test!

I’m not exactly sure how intelligent these plug-ins are about determining which unit tests to execute based on what code was changed. Hopefully that is the next level of continuous testing evolution. However, if you modify a single unit test, Infinitest will only run that single test; beyond that, I don’t believe there is much intelligence. If your code modification causes any unit tests to fail, the status bar will turn red. Unfortunately, it does not tell you how many tests failed; it only provides the number of tests which it executed. If you mouse over the status bar, it will display the names of the tests which failed, but you cannot click on them; they are just there for feedback. You will need to look in the Problems View to discover which test broke; the view will now contain markers for each failed test.

The only issue I have with the plug-in is around user interaction. It does not provide its own view for test results; it simply adds the failures to the existing items in the Problems View. I assumed that the Unit Test View would be updated with the test results, similar to manually running a unit test; however, this was not the case. It appears that both Infinitest and JUnit Max use the same strategy, providing minimal feedback to the developer. Maybe this is a good approach, but I was expecting more visual feedback and confirmation. I associate static problems with the Problems View, such as compiler and code quality errors, not dynamic problems such as test failures. The developer should be able to click on a marker (problem) and be navigated to the exact line of code indicated by that marker, giving them the opportunity to resolve the issue. Here is my problem with this approach: if the developer clicks on the failed JUnit marker, where will they be navigated? The only possible location is the unit test which failed. Generally speaking, this is not where the problem would be found; it would lie in the code that was just changed. Given this issue, I would rather use a different view for continuously executed tests; maybe even better, find a way to reuse the existing Unit Test View.

This does bring up an interesting question: do these plug-ins minimize or eliminate the need for Continuous Integration? I don’t believe this is the case, but they would seem to eliminate the need for some of the more advanced features the vendors are implementing. One such example is TeamCity’s remote build, pre-commit test feature. There are many factors to consider, but if unit tests execute quickly, I do not see the need for this functionality to be provided by the CI server. If the tests execute for multiple hours, we could be adding more risk into the process by not committing changes when they are ready to integrate. Remote build, pre-commit testing sounds valuable on paper, but I am struggling to see the real benefits. I tend to think this functionality overcomplicates the Continuous Integration process. One of the main benefits of Continuous Integration is that it is seamless and automatic. Nothing is required of the development team to reap the CI benefits; simply check in your code early and often. As more education and process overhead are required by the CI process, it is no longer simple nor free. The focus should be on doing what makes sense, at the right time, in the right place. These IDE plug-ins seem to have the correct perspective: pre-check-in testing on the developer’s desk; simple, automatic, and efficient.

I hope that you will try to integrate one of these plug-ins into your personal development process. I believe they will help create a more robust development process, with fewer Continuous Integration failures and ultimately a more efficient, faster-executing test suite. If you have real-world experience with one of these tools, please provide some comments. I am still curious: does this approach work on real projects?



Mar 15 2011

Upgraded Web Browsing…. Firefox 4 and Chrome 10…

Category: Blogging, Software Development | Phil @ 11:49 am

This might seem a little off topic, but the web browser could be considered one of your primary development tools. Just think how much time you actually spend in a browser:  researching, reading, testing, debugging, or simply wasting time! After playing with a few Chrome Extensions, I realized just how much more efficient I could be.  I liked the way Chrome integrates information and makes it useful, which is essentially Google’s Mission.

I had been a loyal Firefox user for many years. Firefox provided a nice cross-platform experience between Windows and Linux, and was highly extensible. This extensibility was one of the early benefits of Firefox, the ability to add new behavior to the browser through “Add-ons”. Amazingly, I was never a big “Add-on” user; I used a few of them, but did not take advantage of or even explore what they could do for me. I used the Delicious Add-on for my bookmarking needs, but recently moved to the Diigo Add-on. I also use Firefox Sync for browser synchronization, including my tabs, bookmarks, history, etc. For posting to my blog, I’m a huge fan of the ScribeFire Add-on. If you happen to do web page development, you have to try the Firebug Add-on. That is basically the extent of my Add-on usage; I did not ask too much from my browser!

Before jumping ship, I upgraded all of my machines to the Firefox 4 Beta. There were numerous technical improvements, but I was primarily focused on pure usability and how the browser could help me be more efficient. Start-up time was one of my biggest Firefox complaints; the browser seemed to have a tendency to bog down over time. The new version seems to have gone through a pretty dramatic user interface overhaul and addressed multiple performance issues, including start-up.

I was pretty happy with the UI changes, preferring the new, but controversial, tab location. The tabs are now located on top of the navigation toolbar; there was apparently quite a bit of debate on this little change! I prefer having two control rows at the top of the browser window, one row for tabs and another row for navigation, apps, and widgets. I have seen a lot of content about these “web apps”, but it seems a little like pure marketing to me! The Firefox implementation, App Tabs, appears to be little more than a space-saving shortcut; however, I can see them providing value for heavily used web sites.

I had installed Chrome a couple of years ago, but was not too excited by it; I saw no compelling reason to change browsers. Wikipedia has an interesting graph of web browser usage; I was really amazed to see how the Chrome market share has taken off in the last twelve months. Even on my own blog, Chrome accounts for almost 25% of the traffic. I installed the newest version of Chrome last week and was immediately hooked. Unfortunately, I have become a true Google convert. It started with the purchase of my Android phone and there was no looking back. I am not saying that the following activities can or cannot be done in Firefox, I’m simply saying that I like everything better in Chrome!

A simple, but extremely cool feature is the “New Tab” behavior. It obviously opens a new tab, but its contents are quite different than you would expect. It is basically divided into three sections: Apps, Most visited, and Recently closed. Under the Apps section, you will see the Web Store icon; does everyone need to have their own app store these days? Anyway, the Web Store is a very well done site that makes searching for and installing new behavior extremely easy, using either applications or extensions.

Applications seem like fancy bookmarks, but from my reading, they can be (or are) a lot more sophisticated. I looked at the SlideRocket app and it was genuinely cool… however, you can also run the app in Firefox! The app concept seems analogous to a rich user interface experience, one that performs like a real desktop application rather than a collection of old-fashioned HTML pages.

My favorite feature of Chrome has to be the Extensions. Extensions add additional behavior to the browser itself. You can see from the picture to the right, I have added quite a few of them! They integrate into the navigation bar and look very nice, consuming minimal space while providing significant functionality. They look similar to the icons found on cell phones; many of the extensions have little indicators that track the number of items you need to address. You can find extensions for all of the standard Google tools: Gmail, Reader, Calendar, and even eBay. The Calendar extension is extremely helpful; it will tell you how long until your next appointment, and when you mouse over it, it shows you the event details. The WP Stats extension is another personal favorite; it tells me how many people have looked at my site! Clicking on some icons will navigate you to the corresponding website, much like a shortcut. Other icons have specific behavior, such as showing you detailed web site access statistics or an enhanced view of your search history. My favorite blogging tool, ScribeFire, is also available in Chrome, but the spell checker is not working! I like the placement and interaction of the Chrome extensions much better than the traditional Firefox “Add-on” view, which is typically at the bottom of the browser window; Chrome makes the extensions feel more integrated with the browser and part of the actual user experience.

My final Chrome praise is the synchronization with my Google account. It is pretty cool to watch an extension get automatically installed on my Windows machine, simply by installing it on my Linux machine. No restart or refresh required, it just automatically shows up! I did notice one small oddity: I still had to configure the extension on the Windows machine. This seems rather strange; maybe it is a bug… I assumed that Chrome would save each extension’s settings and synchronize them too. Even with this little shortcoming, there is no going back to Firefox for me; I hope you give it a try too!



Mar 03 2011

TDD is too hard…

Category: Software Development, Testing | Phil @ 2:11 pm
Earlier this year, I participated in a meeting to recommend testing best practices. Being fairly passionate about TDD, and working in an environment where unit testing is fundamentally an afterthought, I assumed the group would certainly recommend TDD as a best practice. Amazingly, TDD failed to get the official endorsement. Many of the committee members stated that TDD was too hard and required too much up-front work. Too hard? Really? I was very surprised by their rationale, especially since many of them were seasoned developers. The image to the right has nothing to do with software development, but I love the message! I thought there was a rather subtle correlation to TDD: thinking first can prevent problems. “Think First” is probably a little misleading; surely everyone claims to think before embarking on a task. For me, the real value behind TDD is the thought process; I like to connect thinking with the design process, not the coding process.
I had the opportunity to participate in a SD Times webinar on Wednesday called “Leaders of Agile: Practical Agile with Test-Driven Development” by Kent Beck and Mark Seemann.  I took some notes on points I thought were relevant to truly adopting TDD.

 

Concept / Observation
Concept: Write a little test, write a little code, iterate. Don’t spend hours writing code; TDD tells you where you stand.
Observation: I have worked with a lot of developers who literally spent hours writing code without ever executing it. Once they feel like they are done coding, they start to think about testing. Many times, this testing is done in the same manner as their development, which I will describe as “without a plan”. Some even avoid structured, assertion-based testing frameworks; they simply spin up the container and execute the code. This approach will certainly work, but would you really say it is the most effective approach? You end up with a huge pile of un-validated, un-executed code. Unless you are perfect and never make mistakes, you have no clue how many bugs are buried in the code. Even worse, if you discover a design problem or a more elegant approach, how much code will you have to change or toss away? This is not to say that refactoring or eliminating code is bad. The point is, this effort could have been minimized or even prevented by thinking first. Another benefit of starting from the test perspective is that you always know where you stand in the process; the test works or it does not. If the test does not work, you are probably not done. (A short test-first sketch follows these notes.)
Concept: TDD allows you to defer design decisions. Refactoring and retrofitting; working test-first, designing as you go.
Observation: If you think about TDD from an iterative perspective, you are basically working from the outside in. You can almost compare it to the object-oriented principle of encapsulation: hiding or postponing design decisions. Unit tests should focus on the behavior of the class, not the actual implementation. Because the unit test is focused on behavior, you have the opportunity to defer some of the design and implementation decisions to a later iteration. This later iteration might only be a couple of hours later, but by thinking first about the desired behavior and interaction, you are not required to commit to an actual design or implementation. Obviously, refactoring can be a big part of TDD and is hopefully not considered a deterrent. Refactoring does have a cost; I believe this cost is justified and minimized by the confidence provided by a good unit test suite and the long-term supportability of an elegant design.
Concept: TDD = API design.
Observation: For me, TDD is more about the design process than the actual unit test. TDD forces me to concentrate on how the class will be utilized, enabling me to focus on the requirements of the class and implement no more functionality than is actually necessary. Implementing the class first, you are basically guessing at the API, only imagining how the class could be used, adding and subtracting methods as you go based on how you feel the class might be invoked. I have to believe that this approach will generally not result in the cleanest API. Focusing on the external interaction and dependencies provides a simpler and more exact view of the actual implementation requirements.
Concept: Tests are first-class citizens.
Observation: This one is very near and dear to my heart. I can’t tell you how many times developers have pushed back on the enforcement of coding standards and quality as they pertain to the unit test suite. Why would anyone want to allow or encourage bad coding practices in the unit test code that are not allowed in the real code? Do developers have a quality switch that can be turned on and off, writing good, clean code over here and sloppy code over there? It should be all about consistency and supportability. Unit tests need to be as understandable and maintainable as the mainline code; they will be maintained as long as the mainline code and, over time, could provide more value (comprehension) than the mainline code.
Concept: Testable = loosely coupled.
Observation: Testability is another topic I find very interesting. I just love the Testability Explorer tool, but was never able to get others excited about this concept. If not designed effectively, unit tests have the potential to become a roadblock to change. Unfortunately, the same point can be made about the actual code; there are certain designs and implementations that can impede future change. The goal is to use patterns and tools that support change, such as dependency injection. Some people think that TDD can encourage coding without thinking; I have the completely opposite view: thinking first, to establish the design. It is my opinion that TDD is just one small tool that helps flush out designs, doing so without significant overhead while simultaneously building a safety net.
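To make the “write a little test, write a little code” rhythm concrete, here is a minimal sketch of a first iteration; the ShoppingCart class and its behavior are invented purely for illustration, not taken from the webinar. The test is written first and effectively defines the API; the class below it is just enough code to make the bar green, ready to be refactored in the next iteration.

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class ShoppingCartTest {

        // Step 1: the test is written first and describes the desired behavior (and the API).
        @Test
        public void totalIsSumOfItemPrices() {
            ShoppingCart cart = new ShoppingCart();
            cart.addItem("book", 2500);   // prices in cents
            cart.addItem("pen", 150);
            assertEquals(2650, cart.total());
        }
    }

    // Step 2: just enough implementation to turn the bar green; refactor in the next iteration.
    class ShoppingCart {
        private int total;

        void addItem(String name, int priceInCents) {
            total += priceInCents;
        }

        int total() {
            return total;
        }
    }

Because the test came first, the addItem and total methods exist only because a caller needed them, which is exactly the API-design benefit described above.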

 

I was not sure how these “virtual conferences” actually worked, but this one was very well organized and seemed to go off perfectly. I was very impressed with both of the speakers and recommend subscribing to their blogs. They each made numerous points, but I wanted to share a point from each of them that I found encouraging.  I will probably paraphrase them poorly, but hopefully you get the point!

 

Speaker Thoughts…
Kent Beck – “Everything is hard. You need to be eager and willing to learn.” I thought it was refreshing to hear the words, “eager and willing”.  Unfortunately, we are not always eager and willing! You can’t force anyone to learn or change. There has to be a willingness; we need to be open-minded to absorb and process new thoughts. The benefits are not always immediate and may cause some pain, but in the end, learning is what makes us better.
Mark Seemann – “I tend to do a lot of thinking.” I thought this was great! I never hear people say they do a lot of thinking. Many of us tend to jump into the fire, probably because we are not given sufficient time to evaluate the situation. My challenge to you: demand the time, take the time. Think First!


Mar 01 2011

Build Evolution: Ant to Maven – A simple exercise…

I typically end up taking the role of “build guy” and I have no idea why! It is probably a control thing; I like to be able to fix problems and make improvements. I have been an Ant-man for many years, and was proficient in the “Art of Make” before that. I never gave Maven a chance, as it always seemed like it was too complicated for adoption by corporate projects. Ant can be very simple and quick; some might even throw the word dirty in there! Unfortunately, I have seen (hopefully not created!) some of the most horrific Ant build systems imaginable. Many times, it is a developer’s “first time effort” that becomes the project’s permanent build system. Because I had the opportunity to work on many different Ant-based projects, I gained experience by using and evaluating many different approaches. I believe that the real problem with Ant is not necessarily Ant itself, but the lack of standards and structure it provides. Ant allows each developer (or project) to invent their own Ant philosophy. Once you are familiar with the Ant “programming style”, you can script just about anything, any way you want, manipulating files located in any directory structure, anywhere on the network. Ant is a very powerful and flexible tool. Unfortunately, the more inventive the Ant philosophy is, the more brittle and complicated the scripts can become. Ant XML also seems to be an excellent “cut and paste” candidate. New team members, with little time or understanding of Ant or the specific philosophy implemented, find duplicating targets the safest way to implement change.

Another shortcoming of Ant is dependency management. This problem is easily solved, and I have blogged about it numerous times in the past. I believe that the use of a dependency management tool, such as Ivy, is an evolutionary step taken by “build guys”. Sadly, many developers don’t seem to be interested in dependency management. They don’t like the dependency on the dependency management tool and would rather just manage the libraries within the project. For me, the value of the central repository and structurally documented, versioned dependencies cannot be overstated.

I have probably described many of the same experiences you have encountered with Ant-based projects; they can be even more exaggerated when you start managing multiple projects. One possible evolution path is to define a “standard” directory structure and develop a collection of “shared” Ant scripts that work with the standard directory structure. This allows each individual project’s build XML to be little more than a set of properties. This is exactly where we ended up, utilizing an Ivy-like externalized repository for build scriptlets and Ant’s URL import capability. Hold on, a standard directory structure with reusable build components? Almost sounds like some other tool…

The Experiment…
Being an Eclipse guy, I was curious how to combine an Eclipse Web Tools Platform (WTP) project with a Maven project. My ultimate goal was to set up a Jenkins Continuous Integration server that monitored a Maven-based project on GitHub, and to develop the code within Eclipse utilizing a Dynamic Web Project facet. Historically, my projects were managed by Hudson using free-style jobs; they were always Ant-based and extracted from a Subversion repository. I had worked on quite a few web applications using this pattern, so it seemed like a pretty good starting point! I had started a toy project to manage encrypted data a couple of years ago and never put it under version control; it was a perfect candidate for this experiment.

Changing a project’s build tool has a rather large impact and needs to be considered from multiple perspectives. I have approached the problem by breaking it down into five (5) different spaces. This might not be the optimal or preferred approach, but it does show the high-level differences and addresses the major build challenges. Surely, my next project will be done completely differently, but you have to start learning somewhere!

Perspective / Changes
Dependencies: To end up with an application that compiles in Eclipse, this change and the next have to be done in parallel. The easiest approach is to create a POM and focus on the dependencies section. If you are already using Ivy to manage your dependencies, this will not be that big of a change. Using a public Maven repository like MVN Repository to locate dependencies is very simple and saves a ton of time. The easiest way to create a POM is to let Eclipse make it for you; you will need to install the Maven plug-in first. Once you have your project loaded into Eclipse, simply “Enable Dependency Management” in your project’s properties. You can also add dependencies directly from the plug-in; I’m a little old school and prefer to edit the POM manually, but either approach works fine.
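A dependencies section is little more than a list of coordinates resolved from the central repository. A minimal sketch (the specific artifacts and versions here are only examples, not the ones used in my project):

    <dependencies>
      <!-- Compile-time dependency resolved from the central Maven repository. -->
      <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-context</artifactId>
        <version>3.0.5.RELEASE</version>
      </dependency>
      <!-- Test-only dependency; the scope keeps it out of the WAR file. -->
      <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.8.2</version>
        <scope>test</scope>
      </dependency>
    </dependencies>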
Local Development: I never liked the required Maven directory structure; I must have subconsciously adopted the Eclipse project structure as my personal standard. Converting the project’s directory structure was trivial, and I soon realized it was not that big a deal after all; I always separate my source and test files anyway. The main difference is that everything lives under the source root, including the WebContent directory. I’m not sure what the “preferred” process is for loading a Maven project into Eclipse, probably using the Maven Eclipse plug-in, which will create the appropriate Eclipse property files. Since I was doing a conversion, I don’t think this plug-in would be too helpful. Because I’m fairly Eclipse savvy, I simply wired everything up myself. Having the Eclipse Maven plug-in will enable your dependencies to be automatically added to the build path. With the addition of the Eclipse “Deployment Assembly” options dialog, it is extremely easy to manually configure your project. You need to have a Faceted Dynamic Web Module for the “Deployment Assembly” option to be visible, but that should be a fairly simple property change as well. At this point, you should have a fully functional, locally deployable web application.
The Build: Now we can ignore Eclipse and focus on building our WAR file using Maven. Even with the minimalist POM shown to the left (plus your dependencies section), it is possible to compile the code and create a basic WAR file. Use mvn compile to ensure that your code compiles. Using the Maven package goal, the source code is compiled, the unit tests are compiled and executed, and the WAR file is built. All of this functionality, without writing one line of XML! One of the more time-consuming parts of an Ant-based build is integrating all of the “extras” typically associated with a project, making them available to the continuous integration server. The extras include unit test execution, code coverage metrics, and quality verification tools such as Checkstyle, PMD, and FindBugs. These tools are all typically easy to set up, but every project implements them slightly differently and never puts the results in a standard place! The general process for adding new behavior (tools) to a build appears to be the same for most tools: you simply add the plug-in to the POM and configure it to fire the appropriate goals at the appropriate time in the Maven lifecycle. Ant does not have this lifecycle concept, but it seems like a very elegant way to add behavior to the build. In the following example, I added the Checkstyle tool to the POM. The <executions> section controls what will be executed and when. In this example, the check goal will be executed during the compile phase of the build process. Simply executing the compile goal will cause Checkstyle to be invoked as well. This seems like a very clean integration technique, one that does not cause refactoring ripples.
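The POM snippet from the original post did not survive here, so the following is my reconstruction of what such a Checkstyle configuration typically looks like; the plug-in version and ruleset location are assumptions:

    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-checkstyle-plugin</artifactId>
          <version>2.6</version>
          <configuration>
            <!-- Project-specific ruleset; adjust the location to taste. -->
            <configLocation>checkstyle.xml</configLocation>
          </configuration>
          <executions>
            <execution>
              <!-- Bind the check goal to the compile phase, so "mvn compile" also runs Checkstyle. -->
              <phase>compile</phase>
              <goals>
                <goal>check</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>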

The Cobertura integration is another very good example. I have been a fan of the Clover code coverage tool for many years. Since losing access to Clover, I needed to look for an open-source alternative, and had never tried EMMA or Cobertura. They both seemed like capable tools, but I had more success integrating Cobertura with Jenkins. I’m highlighting this point because doing code coverage with Ant and Clover can sometimes be a little tricky and usually messy. The Cobertura plug-in takes complete responsibility for instrumenting the code, executing the unit tests, and generating the report; it is completely non-intrusive.
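For reference, the Cobertura wiring is a similarly small POM addition; a sketch of a typical setup (the plug-in version and report formats are examples, not the exact values I used):

    <reporting>
      <plugins>
        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>cobertura-maven-plugin</artifactId>
          <version>2.4</version>
          <configuration>
            <!-- XML output is what the CI server's coverage plug-in consumes; HTML is for humans. -->
            <formats>
              <format>xml</format>
              <format>html</format>
            </formats>
          </configuration>
        </plugin>
      </plugins>
    </reporting>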

Continuous Integration: Most development teams also want their project monitored by a Continuous Integration (CI) process. Modern CI tools such as Hudson/Jenkins provide excellent dashboards for reporting a variety of quality metrics. As I previously stated, it is rather time-consuming to develop and test the Ant XML required to generate and publish these metrics; combine that with configuring each CI server job to capture these metrics, and you have added a fair amount of overhead. I knew there was support for Maven-based projects within Hudson/Jenkins, but never took the time to understand why it would be beneficial. The main benefit is right there in the description, had I bothered to read it! Configuring a Maven-based job is little more than clicking a few check boxes. There is no need to configure the metrics in Jenkins; using the information provided by Maven, it is basically automatic. This is one interesting aspect of the Hudson-Jenkins fork. Sonatype, the creator of the Nexus Maven repository manager and the Eclipse Maven plug-in, has chosen the Hudson side of the battle. I wonder what this means for Maven support on the Jenkins side. Obviously, it will not go away, but it might end up being a Hudson advantage in the long run. I still believe that the Jenkins community will quickly outpace the Hudson community.
Deployment: I’m not going into much detail on this point, other than to say that I would like to see the deployment process completely separated from the build process, maybe even done by two different people or organizations. I have seen too many projects combine building and deploying into one system, creating artificial dependencies and time-consuming, unreliable, unmaintainable deployment processes. I have also experienced complete deployment overkill, causing simple deployments to take over thirty (30) minutes, with no real guarantee that they would be successful. Hopefully teams (development and operations) will strive for a minimal deployment overhead and process, one that provides just enough security and control to enable continuous, rapid deployments.

I have barely touched upon the real power of Maven and know that some people are going to say my requirements are too simple to be valid. I’m sure there are projects far more complicated than mine, but I do feel that this project is highly representative of the 80-20 rule. If developers can have a little self-control, conform, and do what is required versus what would be really cool, simple Maven POMs should satisfy a large number of corporate projects with very little overhead. That is the goal, correct? I can even see using Maven for my toy projects; why do I want to waste time writing and maintaining a bunch of useless build scripts? I never realized that you could ease into Maven; I think that is the real point of this post! And it is not actually a huge investment.

When I was Googling for this topic, I found many interesting articles. I truly believe the biggest reason preventing teams from evolving and adopting new approaches is simply fear, fear of change and the unknown. I still think Maven is a rather large “pill to swallow”, but I have now experienced enough value to continue investing time in this technology. I took the typical developer approach and dove into Maven with little preparation or understanding… I would not suggest this approach, unless you have lots of time to waste! If you are unfamiliar with Maven, I hope the following collection of articles will provide a quick overview of the basic concepts. I realize that some of them are a little old, but you should start with the Maven basics, and those really have not changed.

An Introduction to Maven2
Should you move to Maven2
Maven vs Ant: Stop the Battle
How to Migrate from Ant to Maven: Project Structure
Top Ten Reasons to Move to Maven 3
Maven over Ant + Ivy
Automated Deployment with Maven – Going the whole nine yards



Feb 04 2011

Continuous Integration Saga – The move to Jenkins…

Category: Continuous Integration | Phil @ 2:35 pm

If you have ever looked at my blog, you know that I’m a huge advocate of Continuous Integration and have been pushing Hudson for several years now. The end of SUN unfortunately had a significant impact on many open-source projects, including Hudson. Hudson continued to work pretty smoothly from a user perspective until last December. I don’t think the quality of Hudson changed, but the quality of the releases seemed to change. The Hudson team had been extremely consistent with their weekly releases; the release notes were always updated, the download links always worked, life was good. As 2010 came to an end, something seemed to change: links pointed to the wrong versions, releases seemed inconsistent, release notes were out of date or lagged days behind. The most obvious indicator was that the community was very quiet… something was happening behind the scenes.

Fast forward to last week: an important vote was taken by the existing Hudson community. It overwhelmingly decided to break away from ORACLE, essentially forking Hudson into a new product called Jenkins. ORACLE apparently holds the rights to the Hudson name and has already revamped the Hudson website. It will be very interesting to see what happens next. I don’t think I’m a fan of forking. I lived through the remnants of the emacs/XEmacs split; where did that really get us? In the end, I think it is mostly about ego and control; as users, we ultimately end up being split into two camps, typically incompatible, with lots of petty sniping. You can read this recent announcement from ORACLE and the follow-up responses; it is pretty obvious that this was not a friendly breakup!

Question of the Day – Which side are you picking?
I’m going with Jenkins.

  • First, the community was very committed to the split. Considering they have been leading the product for a very long time, I don’t predict a major change in direction.
  • Second is Kohsuke Kawaguchi. This is not to take anything away from the rest of the Hudson developers; they all seem to do an amazing job and I anticipate this will not change. I am just extremely impressed with what KK has produced over the years.
  • Finally, where will ORACLE go with Hudson? Will there be anyone actually using it? I predict that Hudson will quickly fall behind in functionality to Jenkins and be a “good” choice for large companies that are only concerned about “supportability”.

Alternate Question of the Day – Which butler image will change first?
They can’t be the same, right? I think Jenkins should get a new mascot, after all, it is a brand new world! And one small favor, please don’t make it Chuck Norris!

I recently had the opportunity to create a couple of Hudson plug-ins; this was a very educational experience! There is a lot of information available on the process, but unfortunately, I found it more valuable to walk through the code of existing plug-in implementations. I believe that one of the major advantages of the Hudson/Jenkins architecture is the plug-in infrastructure. With just a little bit of vision and support, teams (and organizations) could extend Hudson into so much more than a basic CI server. All things take time…
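To give a sense of how small the entry point is, here is a minimal sketch of a Hudson/Jenkins build-step plug-in; the class name and display name are invented, and real plug-ins add a config.jelly form and data binding on top of this skeleton:

    import hudson.Extension;
    import hudson.Launcher;
    import hudson.model.AbstractBuild;
    import hudson.model.AbstractProject;
    import hudson.model.BuildListener;
    import hudson.tasks.BuildStepDescriptor;
    import hudson.tasks.Builder;

    // A trivial build step: it just writes a line to the build's console log.
    public class HelloWorldBuilder extends Builder {

        @Override
        public boolean perform(AbstractBuild<?, ?> build, Launcher launcher, BuildListener listener) {
            listener.getLogger().println("Hello from a custom plug-in!");
            return true; // returning false would mark the build as failed
        }

        // The descriptor makes the build step show up in the job configuration page.
        @Extension
        public static final class DescriptorImpl extends BuildStepDescriptor<Builder> {
            @Override
            public boolean isApplicable(Class<? extends AbstractProject> jobType) {
                return true;
            }

            @Override
            public String getDisplayName() {
                return "Say hello";
            }
        }
    }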

I already have my Jenkins server up and running and hope to get it integrated with my GitHub account in the near future… I want to finish documenting my experience creating a custom plug-in, which I will hopefully share along with my sample implementation.



Feb 04 2011

Interview Question – Name Five reasons Java is not Object Oriented?

Category: Java, Software Development | Phil @ 12:53 pm

I thought it would be a really good idea to share interesting interview questions, especially the ones that caught me off guard!

I found this title to be a very interesting question… I’m not sure it was intended as an open-ended question, as I don’t think there is a right or wrong answer. It really depends on your perspective. Are you looking at it from a purely technical or a logical (modeling/solution) perspective? Even though I might be considered very technical, I prefer to stay out of the really low-level stuff. Sure, it is fun to get down to the bare metal once in a while, but I get more enjoyment out of the thought process. Claiming to be a very object-oriented person and stumbling on this question really bothered me… I came home and did some Googling, but never really found answers that lived up to my expectations. Next, I sent an email to one of my pals who just loves the nuts and bolts… In his typically witty way, he rattled off: int, float, boolean, double, long, and char. A completely valid answer! He then gave me six other bullets, which I have merged into the table below and tried to expand upon.

Reason / Explanation
Primitives: The interviewer led me with this one, and it is actually pretty obvious when you think about it! This is really not disputable; primitives are not truly objects. I think for historical reasons, namely performance and ease of use, developers were pushed to primitives. With modern JVMs and Java 1.5 autoboxing, I feel like this flaw is slightly minimized. However, from a purist OO perspective, this is a valid reason. (A small sketch after these notes shows where the primitive/wrapper seam still leaks through.)
Constructors: The interviewer also talked about the limitation where a constructor can only return an instance of its own class, which has led to the creation of factory patterns and DI frameworks. Having been a Spring Framework fanatic for so many years, I have probably developed a different style of programming which has shielded me from this issue. With some of the more dynamic OO languages, you can apparently return a different type of instance from the constructor. I need to study some more on this, but did find an interesting Ruby thread. Ruby is definitely on my learning list, so I have to stop blogging!
First-Class Objects: This reason takes us to pure object orientation. I’m going to gloss over this a little and let you read more on your own, as I did today; try this simple search. One of the more obvious distinctions would probably be the metadata / class model provided by Java.
Statics: This is actually one of my pet peeves. I am not a fan of static methods. For me it is simple: statics equal functions, which equal non-object-oriented. I have always bought into the DI propaganda which states that static objects and methods are untestable, and I fundamentally agree with that. However, I did recently discover that there are several mocking frameworks that can actually mock out static classes and methods. This is pretty cool, but unnecessary in my preferred design and implementation style.
No Multiple Inheritance: Coming from a C++ background, this was always a big discussion point. I generally did not recommend multiple inheritance, as it was tricky and usually implemented (and even modeled) inappropriately. I was much more interested in inheriting behavior rather than state. If I remember correctly, this is where it got interesting using C++. Java interfaces do a little bit to address this issue, but there is typically no way to elegantly solve the problem without a little code duplication and/or helper classes.
Strings: My nuts-and-bolts friend gave me this one; I am not sure where he was going, but obviously the Java String class is its own unique creation. Personally, I was always bummed by the lack of a mutable String class. I might have preferred to see immutability implemented via subclasses, or something like the Collections.unmodifiableCollection() method. How about that, another static method reference! Maybe this would have been a little better: stringInstance.getImmutable()!
Limited Operator Overloading: Having done a lot of C++ in my early programming career, I found elegance in operator overloading and saw it as a void in Java. But I do have to agree with the reason it was left out of Java: “it was used incorrectly too many times in C++”. Unfortunately, you can use any tool incorrectly; it might have been nice to let us decide if and how to use some of these features, rather than eliminate them. I did find an interesting thread on StackOverflow, so I will keep my thoughts short!
JNI: This was another bullet from my nuts-and-bolts friend. In my opinion, JNI was more of a Java afterthought, “bolted on” simply to take advantage of existing solutions and provide some level of extensibility. I have not used JNI for many years, and all I can remember is the procedural nature of our solution. I don’t remember the details, other than that it was ugly! You could also say that ugly is not object oriented!
Package-Level Scope: A late-breaking addition. I personally never liked the “package” level scoping provided by Java. How is this object oriented? You are basically breaking the class-level encapsulation, allowing all classes in that package access… This always felt like a little implementation hack to me; not one of my favorites!
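As promised above, here is a tiny sketch of the primitives point; the numbers are arbitrary, but the value-versus-reference behavior is standard Java and shows how the primitive/wrapper seam still leaks through autoboxing:

    public class PrimitiveSeams {
        public static void main(String[] args) {
            // Primitives compare by value, as expected.
            int a = 1000;
            int b = 1000;
            System.out.println(a == b);                // true

            // Autoboxing hides the conversion, but wrappers are objects:
            // == compares references, so equal values are not necessarily "the same".
            Integer boxedA = 1000;
            Integer boxedB = 1000;
            System.out.println(boxedA == boxedB);      // false (values outside the small Integer cache)
            System.out.println(boxedA.equals(boxedB)); // true

            // A primitive can never be null; a wrapper can, and unboxing null throws NullPointerException.
            Integer maybeNull = null;
            // int broken = maybeNull;  // compiles, but would blow up at runtime
        }
    }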

My thoughts might not be completely baked, but I need to keep on moving, so please let me know if you think I missed the point! Generally speaking, I have been very happy with Java as an implementation language. All languages have their warts, and Java is no different. The vast number of tools and frameworks allows Java applications to be effectively developed, tested, and deployed. I also believe that Java is pretty forgiving, allowing developers of all levels to build solutions, from barely object-oriented to insanely obscure. Hopefully, we end up in the middle with simple elegance. I hope to have the opportunity to explore some of these Java flaws in Ruby, so please check back and see what I have learned!



Feb 04 2011

Social Bookmarking… Migrating to Diigo

Category: Android, Misc | Phil @ 12:01 am

The pending demise of Delicious was blogged to death a couple of months ago. I too was saddened by Yahoo’s decision, having become quite dependent on this service. Like many, I began a search for a new solution, but nothing seemed to be the perfect replacement. It now appears that Delicious will be sold off, but who really knows what direction it will take. I discovered Delicious a couple of years ago; I blogged about it, showed it to my friends, and used it as a “sharing” solution whenever I had the chance. I’m always looking for ways to be more effective and efficient, both in remembering and in sharing information. I’m not exactly sure why, but none of my friends were ever really jazzed about the tool or concept… I could only come up with a couple of possible reasons:

  1. Many of my friends are Apple Fanboys… There must be something in that space that performs a similar function, maybe they have their own club!
  2. None of my friends blog. Many of them read multiple RSS feeds, I guess they just read and run! They must be able to retain a lot more information than I can!

My environment and thought process must be a little different.

  1. I live in an Apple free world.  I look for tools that work across platforms.  Dropbox is one of my favorites. It runs on my Android phone, my Ubuntu machines, and even those pesky Windows machines I have to support for the family.  Delicious provided effective plug-ins for most of the web browsers, and there were multiple options in the Android marketplace; this solution pretty much guaranteed I could find what I was looking for!
  2. The more RSS feeds I read, the more topics I find to blog about! This is a vicious cycle and quickly creates a backlog! It was so easy to create a simple bookmark in Delicious and tag it for blogging.
  3. Many times, I start reading an article and determine that it is too heavy and need some extra time to comprehend it. I just create another Delicious bookmark, and tag it for follow up.

After a lot of Googling for Delicious replacements, I landed on Diigo (My Library). I like the free stuff best, and they also had a pretty decent Android application. Diigo has one feature that I have really grown to appreciate, the “Read Later” option. If you find a web page that is interesting, but don’t have the time to read it, you simply click the “Read Later” button on the Diigo toolbar. Pretty simple! When you find time and want to catch up on your reading, you just go to your library and browse the Read Later items. I have not actually taken the time to use the notes and highlighting features, but am completely sold on tagging. Tagging seems like such an efficient way of classifying and searching for information; at least it works for me! Diigo almost seems like a cross between Evernote and Delicious. It has many more features than Delicious, but the question is… how will they survive? I can only hope, and might actually have to give them some $$$ to support their efforts. My only rant is that they do not provide a “tag cloud” view in their Android client… If they implement this view, I promise to become a paid member!

I still believe there is real value in the community; I’m not sure I will find one that is receptive, but I can always hope! There has to be value in exploring what your friends and coworkers are discovering and recording as important. I have to assume that much of that information is the bond which brings people together in the first place and ultimately enables them to communicate in a much more efficient and effective manner.

I highly recommend that you give Diigo a chance, and better yet, add me to your network!



Feb 03 2011

Where to start….

Category: Blogging, Misc | Phil @ 2:52 pm

Everyone keeps asking me why; what happened to your blog? Why did you stop posting? Unfortunately, it is a very complicated answer… I really don’t want to go into the details. Chances are, you have been in the same boat as me: placed on a high-profile, transformational project which needed to be done by a specific, non-movable date. It was a rather large project when you consider how it was staffed: project managers, agile coaches, testing teams, analysis teams, business liaisons, business consultants, architects, and a few developers. Combine this sizable staff with a new business vision, introduce some new technology requirements (ESB, BPM, Rules), and I think you will get a pretty clear picture of why I stopped blogging! This was a very unique opportunity to finally “do it right”; we were able to complete the system last November/December and successfully migrate to production. It was a very stressful period of time for everyone on the team, spanning many months, including canceled vacations, working weekends, and extremely long days. This is not really a rant, as we all have to go through these experiences at different times in our careers; this is really more about what you can learn and take away from the experience. Fortunately, I was able to be a student of the process; I observed and learned many valuable lessons, and even saw many of my fundamental beliefs reinforced. Unfortunately, I just did not have the energy to share those thoughts and gave myself a break from blogging. There are several topics that I do hope to go back and revisit, so you will have to keep on reading!

Finally, the biggest change for me occurred last Friday. I was given the opportunity to take a package from my employer of eighteen years. I want to thank all of the people that guided and supported me during my time there; you know who you are. I have so many fond memories and met so many great people; it was truly a wonderful time. You would not even believe some of my stories from the early days: being chased by security guards (it was not my fault, really), directing traffic on Wisconsin Ave, flying flight simulators in Silicon Valley with SUN engineers, and almost sitting in Steve Jobs’ office chair! Combine this with the large number of projects and unique people that I was given the chance to work with, and it was a great place to be. Fortunately, I did not think of my job as work; I was always given the opportunity to learn new things while continually working to innovate and improve the environment. Anyway, times change, companies change, and people change. The bottom line is that I am very happy, happier than I have been in a long time. I look at this as a blessing and something that will lead to even better opportunities… I have no idea what those are, but I have a little time to explore the landscape. Thanks again for all of the personal messages…

What’s next?
Fortunately, I had been working on my resume for the last month, and hope to have it in a marketable state in the very near future. It is amazing how much time it takes to get this going! I have written more emails in the last few days than I did in the last year! I’m also doing a lot of thinking about what type of place I really want to work at; I hope I have a choice! I don’t really want a job; I want to be part of a company where I can contribute. I think there is a subtle, but significant, difference… I worked in a factory during college making all types of flexible steel hose; this was production, not contribution… Contribution is being part of a team, making a difference, and having pride in what you do, while producing and meeting your goals. I hope this is not too much to ask!

I am going into learning mode, which makes me very happy… I plan to ramp up my blogging and get my writing skills back into shape. I really enjoy writing; weird! I have already rebuilt my Linux box and have it wired up to a domain, Share My Progress. I have video conferencing set up via Skype and have switched over to my Google Voice number for almost all of my phone calls, all running under Ubuntu 10.10. I had a really good idea yesterday: I’m going to blog about interview questions that I think are very interesting or that tripped me up… I already have one in the hopper!


