Sunday, December 23, 2007

Install Kalva in Kubuntu Gutsy

I record TV shows on Linux using Kalva. It's much lighter than something like MythTV or other similar applications, and I like the fact that Kalva behaves like a typical computer application instead of operating like some kind of TV or VCR.

Anyway, now that I have upgraded my home PC to Gutsy, I need to hand-build it again, since it's not an app that is available through the standard repositories. It took me some time to gather and redo the things needed to make it work again, so I figured this time it would be useful to summarize some notes.

Some prerequisites to get the source built and running:
  • The most obvious one: a working TV tuner. I have kdetv running, so no problem here.
  • The second obvious one: a development environment with gcc, make, etc.
  • CheckInstall, to make installing from source less dirty.
  • Qt and KDE development files.
  • MPlayer and MEncoder.
  • The Perl module that handles cron automation (Config::Crontab); use CPAN for an easy setup (the standard way is something like perl -MCPAN -e 'install Config::Crontab'). See the details here; the page is in German, but you only need to look for the Perl CPAN command below.
Post-installation (running) quirks:
  • The configuration page does not seem to store changes. For example, I've set up the video to use v4l2, but it still uses v4l in the command debug window. However, changing the config files under "~/.kde/share/apps/kalva" manually in a text editor works for now.
  • The Movie List window (one-time recording) does not show the scheduled recording, although "peeking" with atq shows that it has already been added to the system. After some cursing, hair-pulling, debugging, and fixing, changing part of the file to the following fixed it.
235:    foreach ( @jobs ) {
236:        $job = $_;
238:        $jobid = $job;
239:        $jobid =~ s/\s.*//g;    # keep only the leading job id
240:        #print "$jobid";
242:        #$cmd = `at -c "$_" | tail -2 | head -1`;    # old line: passed the whole job line to at
243:        $cmd = `at -c "$jobid" | tail -2 | head -1`;
244:        $cmd =~ s/\n//g;
Kalva is not a widely supported app, so using it requires getting my hands a little dirty. I think it's worth it though.

Wednesday, December 19, 2007

UML for Coding

UML can be quite helpful in coding, although it is not meant to substitute for it. I don't think UML is suitable for use in parallel with, or on the same level as, coding; I prefer to use it as a support for coding. Here are the areas where I find it most useful:

  • Reverse-engineering code. If you work with other people's code, or code you wrote years ago, you'll most likely need to get a big picture of it, especially if it's written in a manner that is not intuitive to you. Extracting the important classes, their hierarchies, and their relations with other classes can help us understand that code.
  • Getting past a block in coding. There are times when we feel lost in the middle of our coding and it feels like there's nowhere else to go. We get buried in micro problems and details, losing the big picture of what we are doing. This is the time when some high-level review helps and some diagrams are most welcome. Once we close the gap between the high-level view and the current problem, we can start again with a clearer head and, hopefully, break through the dead end we faced earlier.
  • Documenting critical architecture decisions. Occasionally there are parts of a program that are hard to make self-documenting, no matter how hard we try to find names and structures that could reveal them. The code still won't reveal its intentions clearly. This is the time for an external "patch" to the code, in the form of words and diagrams.
As support, this is done in a just-in-time manner. I fire up a UML tool when I need to do any of the above. Once things are clearer or unblocked, I switch back to code and leave the diagrams alone until I find it necessary to open them again. No syncing necessary; I just sync them the next time I touch them, on whatever part is relevant at that time.

UML can be helpful when used with a clear purpose, i.e. not to replace code, and with the right amount of effort.

Tuesday, December 18, 2007

The Seam

One interesting thing I picked up from the book "Working Effectively with Legacy Code" (Michael C. Feathers) is the concept of the seam. In more everyday words, you could call it a flexibility point.

The seam is basically this: if you have a part of the code that you want to make more manageable, i.e. easily tested, traced, or modified, you start by making that part interchangeable. For a class, that could mean introducing an interface; for a function, you could extract a wrapper function. Once you have that flexibility in place, you can make an alternate function for testing, or mock an object or function.

The book has lots of detail on how to do this, but the basic theme is pretty much the seam. It seems I have already been using some of the techniques in it (although without the fancy name :) ), some are still new to me, and some others bend the language a little, although they are still understandable.

It's nice to finally have one term to name this kind of thing.

Sunday, December 16, 2007

Software Assimilation

Software is not just installed, it is assimilated. It is intrusive and defines the way we work in a certain way. Some software introduces different ways of doing things, or initiates completely new ones, and it integrates with people's minds. Software "installation" goes beyond installation on the computer; it includes the assimilation of the software's workflow, and of the assumptions on which the software is based, into the user's workflow. Some software goes even further, shaping the perspective of its users in seeing the problem.

It's not strange, then, that we see people react differently to different programs. The style, assumptions, and behavior of a program can conflict with the user's inner mental model, and adapting a mental model is not a convenient process for most people.

The trick is then to find the balance: making software that can still get an "entry point" into the user's mental model, but at the same time offers significant and interesting new things for the user to experience. For the user, the homework is to find software that fits his level of knowledge and understanding, and the future growth he would like to reach. A feature that feels helpful to one person can be seen as annoying, or even as a mockery of his intelligence, by another.

Friday, December 14, 2007

Using Build System in Visual C++ the Modular Way

Managing builds and projects in Visual C++ can be confusing and tiring. The amount of detail you need to manage is large, and the interface does not help much either. The problem gets even more complicated as your project becomes modular, since you have lots of .vcproj files to take care of. At this stage, making the build system modular will help it feel more aligned with the modular nature of the code.

There's a very useful feature in Visual Studio (VS) 2005 for this (although it still has an annoying problem I'll discuss below). I came across it when I finally got fed up with copying and pasting include and link parameters between .vcproj files and did some searching. It turned out the feature already existed, only it was not yet on my radar at the time. This feature is called the Property Sheet (not to be mixed up with the Windows dialog type of the same name).

A Property Sheet (.vsprops) is a "neutral" set of project properties that you can create and have inherited/linked by several project files (.vcproj). What I mean by neutral is that it exists for the sake of its values and is not meant to be executed directly as a build file.

Visually, it looks like several .vcproj files all linking to, and inheriting settings from, one shared .vsprops file.
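To give a feel for what such a sheet holds, here is a minimal sketch of a .vsprops file. The sheet name and directory values are made-up examples; a real sheet is normally created and edited through the Property Manager rather than by hand:

```
<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioPropertySheet
	ProjectType="Visual C++"
	Version="8.00"
	Name="CommonSettings"
	>
	<Tool
		Name="VCCLCompilerTool"
		AdditionalIncludeDirectories="..\common\include"
	/>
	<Tool
		Name="VCLinkerTool"
		AdditionalLibraryDirectories="..\common\lib"
	/>
</VisualStudioPropertySheet>
```

Each .vcproj configuration that links this sheet (via its InheritedPropertySheets attribute) picks up these values without repeating them.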

This is nothing revolutionary, of course. Many build systems have already done this, and a lot of them do it in a more flexible and lightweight manner. However, for those who are stuck using the integrated build in Visual Studio, this feature is really helpful. The visual editor for it (the Property Manager view) is interesting too, and can let you make a library that can be linked the way you add a Reference in a .NET project, although with more work.

Now, for the negative side: it is still a lot of work to do this with the interface VS provides, and the XML format does not look safe to modify by hand. But worst of all is the saving mechanism: it feels unintegrated with the rest of VS's file-saving mechanism. I found myself losing changes to the .vsprops file several times. When you make changes to a .vsprops, be sure to click the tiny save button in the Property Manager view every time.

The build system is crucial in maintaining a C++ project's agility. If you are using the Visual Studio build, then exploiting .vsprops could keep you from being exploited by the project instead.

Friday, December 07, 2007

Essential Software Project Tools

Here is a summary of the tools I find essential for doing a software project:

  • Version Control System : Subversion
  • Roadmap, Issue/Bug Tracker : Trac
  • Iteration, Schedule, Progress Tracking : XPlanner
Why bother with any of those? Any production-level software project requires the team to work incrementally and to have feedback on how they are progressing toward the direction they want to go.

An experimental or small program has only a handful of issues to develop, a small time window to manage ("let's just get this working and see what happens; we'll think about what to do with it later"), and few constraints to fulfill and balance. Our brain can handle that amount, maybe with the help of a text file or a spreadsheet or two. However, a production-level project would explode your brain if you tried to manage it all in your head (or at least leave so little space that you couldn't effectively code anything).

Those tools help the team to:
  • Focus on the now (progress tracker) without losing touch with the past (version control system) or worrying about tomorrow (roadmap, issue/bug tracker).
  • Focus on the work at hand without losing touch with the big picture.
  • Separate committed issues from the pool of incoming ones; without that there would be no milestones/releases/versioning ("let's start from scratch straight to version 5.0!"), which is a sure recipe for a soon-to-be-abandoned project.
In short, they let us actually do the coding.

Once we've gotten past code samples and toy problems, and it's time to build something worth talking about, it's time to roll out some tools to help us achieve it, version by version.

Thursday, December 06, 2007

The Why Behind Agile Movement

I read the book "Agile Software Development Ecosystems" not long ago. It has a good answer to the question: why does it (somehow) work? The reason lies in the nature of software development.

Developing software is a creative process. The fact that a lot of tools, languages, and components exist today does not make it a deterministic process, and it probably won't become one in the near future. There is no step-by-step procedure guaranteed to produce a working program. Every project seems to pose a significant amount of uncertainty.

I find it interesting that the book calls this kind of activity Exploratory, creating new ways and mechanisms, as opposed to an Optimizing problem, which deals mostly with enhancing an existing mechanism. An exploration problem is one where unpredictability is common and change is nothing special, welcome to happen at any time. The strategy, then, is to spend effort on being change-accommodative instead of wasting it trying to avoid the unavoidable change. Applying an optimizing approach to an exploration problem is not a good fit.

Agile methodologies propose practices that let a development team respond to change more effectively. It seems the most suitable way to deal with software development for now, at least until software development becomes something more or less mechanistic, which is unlikely to happen anytime soon.

Wednesday, December 05, 2007

Bottom Up Modularizing

Many times when I am coding and think about a part of the logic that would go well as a function, I just code it inline. Later, when things are already working, I extract it as a function. It feels much easier that way.

It seems the ideal way to define a program is top-down: we define modules, classes, and functions, and then fill in the blanks with code. However, I find that it's not always practical to do so. In reality, the situation does not require exclusively top-down or bottom-up; sometimes one feels more natural to do than the other.

So, when to do each? I find that I go bottom-up when a closely similar function does not exist yet, or when the signature of the function I have in mind is still vaguely defined. It's easier to just code it inline and extract it later. I think the bottom-up approach helps the brain learn more about the problem than when it is forced to solve it from one direction only.

The case is quite similar at the level of classes and libraries. Sometimes it's just more convenient to add a class to the same project, even though it doesn't seem the best fit there, or it looks like a more general utility. Later, when we feel like it or conditions require it, we can extract it into a library.

So, when it feels right, just code it inline/directly and let things run first, then modularize later when needed.

Tuesday, December 04, 2007

TCLAP for developing CLI in C++

I am faced with the task of parsing command line arguments quite often, probably almost as often as I make programs. Even when we don't intend to pass anything to a program, e.g. a GUI program, we usually end up coding an argument-handling part eventually. So I guess it's homework for every professional programmer to find a CLI argument parser library/tool he can rely on.

For me, in C++, TCLAP is that tool. It took some searching and exploring to finally find and settle on it. I tried things, from emulating other existing code to trying several libraries that each had a couple of problems (license, ease of use, maintainability, maturity). When I tried TCLAP it was like: "this is it, it's quite good, I'll stick with it". Now, years have passed since I first used it and I haven't needed anything else so far, so I guess it's not just a "quite good" library but a really good, easy-to-use one.

Here's a brief snippet to give a picture of how it works:

#include <tclap/CmdLine.h>
#include <iostream>

int main(int argc, char** argv) {
    try {
        TCLAP::CmdLine cmd("myProgram", ' ', "1.0" /* VERSION */);
        TCLAP::ValueArg<int> arg0("", "arg0", "Description", true, 0, "Type Description");
        cmd.add( arg0 );
        // NOTE: more params
        cmd.parse( argc, argv );
        // NOTE: use any arg.getValue() here like any variable
    } catch (TCLAP::ArgException& e) { // NOTE: catch any exceptions
        std::cerr << "error: " << e.error() << " for arg " << e.argId() << std::endl;
        return 1;
    }
    return 0;
}

The code is really straightforward: make a CmdLine object, add argument objects to it, call parse(), then use the values. It saves us from doing nasty parsing or converting that takes up space and adds complexity to our code.

I think this one will still be on my toolbox for a long time.

Monday, December 03, 2007

Maintaining Sense of Mission

There's a large difference when we do things with a sense of mission. It gives an exhilarating sensation that puts what you do in a context. It generates a very different level of focus, and suddenly we have a very different way of deciding what to do first, what to discard, and what should be set aside for later. Decisions happen much more instantly than usual. Insights seem to flow easily.

It's a tricky thing to do, though. It seems humans have a tendency to forget their mission :). This results in a "drifting" mode where things go slowly and inefficiently, with low productivity (if not counter-productivity).

What makes this sense strong, and what makes it weak? I think it's a hard question to answer. It could be leadership, urgency, honor, and many other things. But ideology aside, on the practical side it could be this: the longer the feedback cycle, the weaker the sense of mission becomes. Put a person on a never-ending task and he'll probably go numb; then pile on more impossible tasks, and he'll probably strangle himself to death :). What we need to do to strengthen it, then, is to shorten the feedback cycle to keep the fire up.

Shortening the feedback cycle takes many forms; for example, in XP there are practices like the standup meeting, iterations, and TDD. The baseline is to connect what we do with the larger picture on a regular basis, regularly enough (and with enough discipline) to keep us from drifting away. If we feel adrift, it's a sign there's a feedback loop that needs fixing.

When a person or team loses direction, what needs to be done in the short term is to place achievable milestones. When coding loses direction, pick the shortest path to a working state and get there; if that's not possible, just rewrite a small piece or do whatever you can that actually makes something run.

A sense of mission takes what we do to a different level of effectiveness. It's like shifting gears in our actions, or in the team's actions. Shortening the feedback cycle can help us maintain it, while letting things drift would drag us away with them.

Sunday, December 02, 2007

Code as a Personal Reflection of Its Writer

Once you have looked at a large amount and variety of code, you'll see code as having some kind of personal style to it. You'll start describing code with words like: organized, confusing, pretty, elegant, dirty. So how come this thing we commonly perceive as very technical has such a "human" property?

Code is actually a means of communication. It's the chip that is "technical"; how the code is organized and programmed is actually a human activity with a large communication goal in it (in a way, even a chip is not just technicalities, if we consider that it is a product of human design). It is then no surprise that we find traces of our human side in it.

Code is the way we tell ourselves and other people how we perceive pieces of building blocks forming a coherent, interrelated whole. In it we describe how we think certain pieces logically belong in one group, how we divide some part into several more independent parts, how one piece should be extracted into a single entity (after we find its redundancy). We pick the representation and structure that we think best describe how the system should work, step by step, function by function.

Pay attention to the words in the paragraph above. Organizing, perceiving, describing, coherence: these are words connected to who we are as a person. In addition to our knowledge of programming languages and libraries, we organize things based on our personal preferences and our skill in organizing things in general. The way we label things and make abstractions in life plays an important role in determining the choices we make in our code.

What makes this even more important in coding is that coding is relatively complex, and this makes our skills and behavior come out in a more reflexive, unconscious form than usual. We only realize it when things have already reached their final state: "Wow, this looks like a mess!", "This code ended up looking really nice and easy to read." What we are inside is somehow reflected in what we do.

What I take from this personally is: coding is related to other parts of our life. A good programmer is not one who just reads programming language books every day, all year. We can benefit a lot from a variety of sources not directly related to programming. Books about graphic design, a movie, art, music, spirituality, news, business, statistics, comics, social events, parenting: these are as valuable and important sources as any of the more direct material.

Another interesting observation I've made recently is that with the practices of shared code and pair programming, the code that emerges is rather like what comes up when we make art together or write a document or article with someone else. It looks like a blend of each contributor, but in a more interesting way; call it synergistic if you like.

Realizing that code is not made by our programmer side alone can open our eyes to room for improvement that we probably would not see otherwise. I find that it makes the work less boring and stressful too, since when we find that we can bring our whole self to a problem, we have much wider resources to use and a more dependable consolation to fall back on when there's trouble.

Thursday, November 29, 2007

Firefox Setup for Heavy Research

I use Firefox quite routinely for searching things. However, it can still be quite confusing when I do heavy research. I sometimes feel lost when the number of links and follow-ups explodes. A more structured approach and workflow can help here. After quite some time trying things out, I have currently settled on the setup below.

The extensions used:
  • Advanced Bookmark Search and Locate in Bookmark Folders. I think without these, folders in Firefox's bookmarks are only half useful (or half useless, depending on your preference), since you cannot trace a search back to the folder hierarchy. I stopped storing things in Firefox bookmarks when I found this out. These extensions enable me to depend on them again.
  • Scrapbook. I rarely save any page conventionally anymore. I can save and organize pages just by dragging things around with this extension.
The mechanism used:
  • Something that needs to be read is saved in Scrapbook, while useful links are gathered in bookmarks. It's not exclusive though; a link can go to both Scrapbook and bookmarks.
  • Set up designated folders, in both Bookmarks and Scrapbook. Some folders that get used a lot in research: project, task, and tmp. Project is for long-term (more than a day) work, while task is for the immediate tasks I do. Both map almost one-to-one from Project and Task in GTD. If a project or task needs research, I make a folder with its name in Bookmarks and Scrapbook. However, I usually only add one when there's at least one link/page. The tmp folder is to dump everything else.
  • Drag links and pages into the relevant folders to keep the tabs to a reasonable number. If a link/page can't instantly be assigned to a project or task folder, drag it to tmp to be followed up later.
  • Archive links and pages when done. Move them to more "stable" folders for future reference.
I find this setup and mechanism makes my browsing more focused and helps a lot in avoiding mindless wandering. It also makes it easier to preserve the resources I find along the way.