Tuesday, November 24, 2009

Reviewing the Greendepot Wicket Web Application

The application that I worked on as described in my last blog post was a class project in which each group of 3-4 members created their own application. Here, I present a coding review of the Greendepot web application, which can be found on Google Code here.

My full review of the project can be downloaded here, but in this blog post, I would like to make comments on a few key points regarding the project.

First of all, the application simply needs more time and work. The main issue is that results are printed to the command line rather than on the web page, which definitely needs to be cleaned up before the next release, as it is not very user-friendly. Additionally, although the application provides the carbon content for each hour of a day, it does not analyze those numbers to say whether it would be a good idea to use electricity at a specific time.

Second, I found one area in the code that would be worth refactoring: the setDate method in the CarbonCalculator class. Briefly, this method does a lot more than its name implies, so at the very least the method should be renamed, although I think it would be best to split up its tasks. Rather than setting the date, creating a list of data, and saving the string of data (for output purposes) all in one place, these three tasks should be split into three separate methods.
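To illustrate, here is a rough sketch of what that three-way split might look like. The field and method names below are my own guesses based on the behavior described, not the actual Greendepot code, and the data-loading step is stubbed out:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: the original setDate responsibilities split into
// three single-purpose methods. Names and fields are assumptions.
public class CarbonCalculator {
  private String date;
  private List<Double> hourlyCarbon = new ArrayList<Double>();

  // 1. Only store the date.
  public void setDate(String date) {
    this.date = date;
  }

  // 2. Only build the list of hourly data (placeholder values here;
  // the real code would query WattDepot for the stored date).
  public void loadHourlyData() {
    hourlyCarbon.clear();
    for (int hour = 0; hour < 24; hour++) {
      hourlyCarbon.add(0.0);
    }
  }

  // 3. Only format the data as a string for output.
  public String formatOutput() {
    StringBuilder sb = new StringBuilder();
    for (int hour = 0; hour < hourlyCarbon.size(); hour++) {
      sb.append(hour).append(": ").append(hourlyCarbon.get(hour)).append('\n');
    }
    return sb.toString();
  }
}
```

Each method now does exactly what its name says, which also makes the three steps individually testable.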

Finally, with regard to Continuous Integration, I would like to point out that the Hudson build for the greendepot project does not seem to execute any Ant tasks when changes in the code are detected. This is important for monitoring the project and should be fixed as soon as possible.

Again, these are the main problems that I found. Please refer to my full review, referenced above, for complete comments.

Monday, November 23, 2009

Introducing Eco-Depot!

For my software engineering class, this past week has involved working on another aspect of interaction with the WattDepot client. Whereas before we focused on creating a command line interface for the system, this past week we began to look at designing a web application for it, which is intended to make the system more user-friendly and accessible to the general public. In particular, we were to design a web application that would take a date as input, and output a table showing the carbon content of the simulated Oahu grid per hour, analyzing if the carbon output is low, medium, or high. My group chose to name our project Eco-Depot.
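As a rough sketch of the low/medium/high analysis the assignment calls for, a simple threshold classification would do the job. The thresholds below are placeholders I made up for illustration, not the values Eco-Depot actually uses:

```java
// Sketch of classifying an hour's carbon output as low, medium, or high.
// The cutoff values are made-up placeholders, not Eco-Depot's real ones.
public class CarbonLevel {
  public static String classify(double lbsCarbon) {
    if (lbsCarbon < 1000) {
      return "low";
    } else if (lbsCarbon < 2000) {
      return "medium";
    } else {
      return "high";
    }
  }
}
```

The web page could then color each row of the hourly table based on the returned label.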

This time, we were assigned to groups of four, generally with two pairs being combined into one larger group. At first, it was pretty difficult for me to work with a larger group. One problem was that we did not meet face-to-face much and were not consistent with virtual communication. As a result, we ended up with a few instances of group members not realizing that tasks had not been completed, or wondering exactly what they were supposed to do.

My main task for the development of the web application was to write the Java code for the main interaction with the WattDepot client. The hardest part of this was anticipating what would be necessary for the people who were mainly writing Wicket. To tackle this problem, I first tried to extend the command line interface that Kendyll and I had written last week, but found that I was second-guessing myself, especially when trying to add things to the Wicket application. As a result, I wrote several key methods, let my team members work with them, and then used their feedback to determine what else I needed to write. The biggest disconnect I noticed was that they used arrays to handle the data, while lists were the most convenient data structure on my end.
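One way to bridge that list/array mismatch is to keep the list internal and expose an array at the boundary. This is only a sketch of the idea; the class name is hypothetical, not something from our actual code:

```java
import java.util.List;

// Sketch of bridging the disconnect described above: the back-end keeps
// a List internally and exposes an array for the Wicket-facing code.
// The class name is illustrative, not from the real project.
public class HourlyCarbon {
  private final List<Double> values;

  public HourlyCarbon(List<Double> values) {
    this.values = values;
  }

  // Array-friendly accessor: convert the list on demand.
  public Double[] asArray() {
    return values.toArray(new Double[values.size()]);
  }
}
```

Agreeing on one such boundary up front would have avoided most of the back-and-forth.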

Overall, I think that although my group got the assigned task done, we could have worked better together. The main thing that I think we needed to do was to communicate better; Kendyll and I had a good system of relying on Google's chat to talk while we were programming, and we knew that we would check our e-mail multiple times a day, but this didn't seem to be the case for our other two group members. Getting things like this straightened out from the start probably would have saved a lot of hassle and last-minute worry on our part.

Here is the ICU for our Eco-Depot project over the last week:

As you can probably tell from it, we did most of our coding over the weekend. I think this is an indication of an increased workload for all of our classes as the semester ends. Personally, I simply didn't have the time to dedicate to the project until the weekend because of projects and a take-home midterm due throughout the week.

Our initial version of the code can be downloaded from Google Code here. Please refer to the Wiki page for additional information regarding the use of the product.

Monday, November 16, 2009

WattDepotCLI Version 2.0: A second look at the system

Over the past week, I have worked with my partner to implement a version 2.0 of the WattDepot CommandLineInterface that we have been working on for approximately three weeks. This process involved taking the comments from the code review completed a week ago, and incorporating the appropriate updates to the code, in addition to creating implementations for three additional methods.

The main comments from the code review essentially said to spend more time on the project to add functionality, so this was one of my main focuses for the week. I implemented test cases for the commands that I had originally implemented, and I also split all of the files into two separate packages, to separate the parsing of commands from the methods that carry out those processes. The test cases took some time, but I found that once I got the general structure of one completed, it wasn't as difficult to implement additional test cases for my methods. As for the reorganizing of files, I really took advantage of the Refactoring tool in Eclipse to make it a quick and painless process. Also, one item that we didn't get in our code review but that concerned us was the consistency of our code. We worked to refactor some methods so that the method signatures would have a similar style.

I did not regularly meet in person with Kendyll to work on the code since we both had busy weeks with our other coursework, but we kept in touch regularly via e-mail and instant messaging. This worked well because we were able to work on our own time when we had the chance, but also always had an idea of what we were supposed to be working on. We tried to partition the workload evenly, but found that at times we fell into a pattern where one person would work on the code for a while, make a commit, and then let the other person take over, simply because our schedules didn't allow for a more consistent division of work.

As part of the adaptations we made for version 2.0, we also began using the Hackystat ProjectBrowser to monitor the "health" of our project using nine different key areas. The overall ICU for our project over the past week is:



For the most part, you can see a trend of increasing coverage and decreasing complexity, which are both signs of a healthy project. You can also see a clear increase in work done as time progressed, which occurred because both Kendyll and I were busy for most of the week and had to wait until the weekend to get work done, although we both tried to do some minor work on the system each day. Probably the only worrisome trend in the ICU was coupling, which indicates that our files were becoming more tied together. I believe the increase was caused by the fact that many classes either implement or extend other classes, and we could not get around this.

I would like to address the "churn" factor, which looks ridiculous at first glance. The reason it is so high is that I did not refactor the system into different packages until yesterday. Thus, the system registered this as hundreds of lines of code being deleted and hundreds of lines of code being written in one commit, when it really only involved files being moved around. If you exclude this day, the churn falls back to a much more reasonable level.

Finally, we were asked to use our system to generate the answers to several commands. Kendyll did this part, so please refer to his blog post for the full details. However, the answers were:
  • What day and time during the month was Oahu energy usage at its highest? How many MW was this?
    > power generated SIM_OAHU_GRID timestamp 2009-11-26T20:00:00.00-10:00
    4.96E8
  • What day and time during the month was Oahu energy usage at its lowest? How many MW was this?
    > power generated SIM_OAHU_GRID timestamp 2009-11-28T02:45:00.000-10:00
    9.95E2
  • What day during the month did Oahu consume the most energy? How many MWh was this?
    > powerstats generated SIM_OAHU_GRID day 2009-11-26 sampling-interval 60 statistic max
    9.95E2
  • What day during the month did Oahu consume the least energy? How many MWh was this?
    > powerstats generated SIM_OAHU_GRID day 2009-11-26 sampling-interval 60 statistic min
    4.93E2
  • What day during the month did Oahu emit the most carbon (i.e. the "dirtiest" day)? How many lbs of carbon were emitted?
    > total carbon generated SIM_OAHU_GRID day 2009-11-04 sampling-interval 60
    29959
  • What day during the month did Oahu emit the least carbon (i.e. the "cleanest" day)? How many lbs of carbon were emitted?
    > total carbon generated SIM_OAHU_GRID day 2009-11-07 sampling-interval 60
    22908

My group's completed version 2.0 can be downloaded from Google Code here.

Wednesday, November 11, 2009

My Code Review Experience

My software engineering class implemented a code review over the past week, to help each of the pairs get feedback on our implementation of the WattDepot command line interface. A code review is a systematic examination of source code intended to find mistakes overlooked in the development phase; it improves both the overall quality of the code, and the skills of all developers involved in the review.

I was assigned to review two other systems. In doing this reviewing, I noticed several things that could be implemented or improved in my own system. For instance, I found a way to break one system (by putting in an incorrectly formatted timestamp), and I knew that this error was not caught in my own implementation. I also found more efficient ways to do things, such as creating a single method to check that a timestamp is correctly formatted, since this is something that almost every command-executing method requires.
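A shared timestamp check of the kind mentioned above might look something like the sketch below. To keep the example self-contained, it uses a plain SimpleDateFormat check rather than the WattDepot/XMLGregorianCalendar types our real code deals with, and the class name is hypothetical:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

// Sketch of a single, reusable timestamp-format check. Uses SimpleDateFormat
// for self-containedness; the real CLI works with WattDepot's own types.
public class TimestampChecker {
  public static boolean isValidDay(String input) {
    SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd");
    format.setLenient(false); // reject impossible dates like 2009-13-40
    try {
      format.parse(input);
      return true;
    } catch (ParseException e) {
      return false;
    }
  }
}
```

Every command method can then call this one check instead of duplicating the validation logic.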

As the coder whose system was reviewed, I received good feedback from all four reviewers. The basic message was that the code already written was good, with a few minor things to fix, and that we mostly needed to continue working on the code to implement test cases and refine error messages. I also got good feedback on errors that we need to catch because they currently cause the system to stop working properly.

Overall, this code review gave me a lot of ideas on things that we can improve in our system, and I think that we'll be doing a lot of improvement in the next five days before version 2.0 of our system is due.

Saturday, November 7, 2009

Code Review of WattDepot-CLI Branches

After spending the first half of last week working hard to finish our version 1.0 of the WattDepot command line interface, my software engineering class took a break of sorts from straight programming to conduct a code review for our various branches of the code. We were each assigned two other implementations to review, meaning that each group would receive feedback from four or five other people. I was assigned to review the Elua and Eha branches of the project.

To conduct this review, each of us used the template that was provided by my teacher. Essentially, I went through the code, first ensuring that it compiled and passed the automated quality assurance tools and doing basic testing. After that, I looked through the JavaDocs for the code, as well as the code itself, to determine if there were any additional problems.

My full reviews for the two branches can be found here:

I will be giving a few of the main points for each branch here, but if you are interested in my full comments, please refer to the PDF files linked to above.

Elua branch
Overall, I found few major problems with this branch. All of the commands work, although some fine-tuning is in order to ensure that no incorrect commands are accepted. I would say that the biggest area for improvement is in the JavaDocs, some of which are missing @param or @return tags, and all of which would feel vague to a new user of the system. Testing of the branch was thorough, although it would be a good idea to test one or two of the cases in which errors occur, since these are the main reason for the lack of coverage in the tests provided.

Eha branch
This system worked well, but I feel there are two major points that would benefit from additional work before future versions are released. First, the organization of the classes and methods could use some work. As detailed in my comments, the UserCommandInterface class should be broken up into several new classes based on the type of command being called. Second, the test cases could use improvement, as they only check the handling of correct vs. incorrect input, not the actual information output by the system.

Wednesday, November 4, 2009

Group Work Time! The WattDepotCLI

As my software engineering class enters the second half of the semester, we have begun work on the WattDepotCLI. WattDepot is a service that collects electricity data so that the data can later be used for visualization and analysis. My class also began working in pairs for this assignment, so I also had to take advantage of tools like Subversion and Hudson, as discussed in earlier posts.

The assignment to implement a command line interface (CLI) for WattDepot turned out to be a lot more intensive than I had originally expected. The WattDepot library includes an API, and this was helpful once I really got into writing the necessary classes and methods for the CLI, but it was daunting to just begin the project.

Also, I feel that we didn't have a good design strategy in place from the beginning of the project, which meant that for several days we were stuck on how to write the code and were left with several major changes to make. One of these was implementing commands by creating an interface and using a Map to match command names with instances of the interface. I feel that I'm somewhat rusty in Java, since the last Java-focused course I took was well over a year ago, so these ideas took some time to figure out.
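The interface-plus-Map idea can be sketched briefly. This is a minimal illustration of the pattern, not our actual code; the names (Command, CommandDispatch, the "help" command) are all made up for the example:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the command pattern described above: each command
// implements a common interface, and a Map dispatches the first word of
// the input line to the matching instance. All names are illustrative.
public class CommandDispatch {
  public interface Command {
    String execute(String[] args);
  }

  private final Map<String, Command> commands = new HashMap<String, Command>();

  public CommandDispatch() {
    // Register each command once; adding a command means adding one entry.
    commands.put("help", new Command() {
      public String execute(String[] args) {
        return "Available commands: help, quit";
      }
    });
  }

  public String run(String line) {
    String[] tokens = line.trim().split("\\s+");
    Command command = commands.get(tokens[0]);
    if (command == null) {
      return "Unknown command: " + tokens[0];
    }
    return command.execute(tokens);
  }
}
```

The payoff is that the parsing loop never grows: new commands are new classes plus one Map entry, rather than another branch in a giant if/else chain.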

Finally, this was the first time that I've really had a need to use Ant's verify command to do automated quality assurance. For the most part, I've only had to deal with Checkstyle, and rarely, if at all, FindBugs and PMD. With this project, though, I found it really frustrating to make all three tools happy. The worst case was when PMD told me to combine multiple lines into one, which would have made for a line of code close to 300 characters - definitely outside the range that Checkstyle allows! Although the tools were useful, it was sometimes difficult to satisfy them while trying to commit a file that was only partially complete and still had minor issues that PMD or FindBugs would complain about. In this sense, I feel like they almost inhibited my productivity, because I was constantly having to go back and comment out code just to satisfy one of the tools.

Working with a partner was a new experience in my career as an ICS student at UH. In all previous classes I've taken, we were told to not work with others on any assignments, and all assignments were relatively small and self-contained. Thus, it was very different to have to schedule meetings and coordinate roles in the group. Kendyll and I switched back and forth on how to break up the list commands. First, we decided to break it up by type (Power, SensorData, etc.), then we broke it up by subtype (day vs timestamp) and finally switched back to the first idea.

While Kendyll and I technically finished the entire command specification for the assignment, there were several items that we wanted to complete but didn't have time for. The one major part of the project that Kendyll and I were unable to complete was the test cases. Although I believe that our methods were designed correctly to facilitate testing, we simply ran out of time to write the test cases. The other thing that I didn't have time to do was to break up the command line processing and the methods themselves into two different packages.

Overall, I felt somewhat rushed and unprepared for the assignment in general. That being said, I did enjoy getting the chance to work on a project with someone else and on a project with real-world implications. I'm looking forward to getting the chance to explore the WattDepot project further.

My completed project can be found on Google Code here. Select the Ehiku version.

Monday, November 2, 2009

My Introduction to Continuous Integration

As my software engineering class moves on to the second half of our semester, we have begun focusing on working in groups to efficiently build a new program, the WattDepot Command Line Interface. The details of this project will be discussed in a blog post over the next few days, but here I wanted to address the idea of Continuous Integration, a practice that we are using to facilitate our teamwork.

Continuous integration is a software development practice where team members integrate their work frequently, to decrease the time required for integration at the end of a project. This usually involves team members updating at least once a day, if not more often, leading to numerous integrations by the team each day. In addition, each integration is verified by an automated build that detects errors and notifies the developers of these errors as soon as possible.

For my class, we are using Hudson as the tool to facilitate continuous integration. During setup, you can configure Hudson to poll for SVN updates at a specific frequency and build the project each time there is an update. My project is hosted on Google Code, so I have Hudson check the site every five minutes to see if there is an update. Additionally, Hudson is set up to e-mail the relevant Google Group when there is a failed build; ideally, Hudson would also e-mail the developer who broke the build, but that functionality was not working when we tested it last Wednesday.
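For reference, the "Poll SCM" schedule in Hudson uses standard cron syntax, so a five-minute poll like the one described above would be entered as something like:

```
# Hudson job configuration -> Build Triggers -> "Poll SCM" schedule
# (cron syntax: minute hour day-of-month month day-of-week)
*/5 * * * *
```

This only controls how often Hudson checks the repository; a build is triggered only when the poll actually finds a new commit.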

In class, I tested Hudson by building the project manually, by committing a small change and ensuring Hudson noticed it and ran the build, and by breaking the build and checking that the appropriate e-mail was sent. Overall, I feel that Hudson was a little tedious to set up, and this would have been very confusing if not for my teacher's walkthrough of the process. However, since this is a one-time cost for the project, it was worth it. Even though I tried to run ant verify before committing my changes through SVN, a few minor errors still slipped through the cracks (for instance, making all of the changes Checkstyle requested and then forgetting to run verify again, which would have caught the fact that I had not updated references to a method I had renamed).

Although I have not yet completed my project, I can already see the advantages of Continuous Integration, because it makes it easier to catch issues quickly. Also, Continuous Integration encourages me to always keep the project in a working state, because I don't want Hudson to have my project flagged as having a failed build for long periods of time. Other than the setup time involved (which was maybe about 30 minutes to ensure everything was working properly and was just a bit confusing), I see no immediate disadvantages to using Hudson. I'll continue to use this tool throughout the semester and will be refining my observations about it.