Scrum is an iterative and incremental process for product development and the organization of teams. With the aid of the Scrum framework, tasks are completed faster and with higher quality. This is possible because of the team's high degree of self-motivation: the team itself chooses how its tasks will be executed. Customer demands are prioritized and realized quickly, in iterations.
Millions of people today still go through life untouched by LCD screens
and laser mice, and all the Bachs and Shakespeares of history did
reasonably good work without them, so it must be possible. Is it?
The make build tool was (and still is) very influential in the sphere of
software development tools. Its influence is so powerful that even bad
aspects of its design survive in the next generation of build tools.
Many generations of developers grew up in the school of make. Like the
frog in the slowly heating pot, they got used to its quirks to the
point of not feeling the pain anymore. But they shouldn't be too quick
to conclude that make's way is the one and only way; the punishment for
doing so is missing the opportunity for significant improvement.
Scripting languages are ubiquitous. They are used everywhere: in log
parsing, in triggering applications, and in volatile operations whose
logic changes frequently. Shell script was one of the most popular
languages through the end of the 1980s. Then came Perl, which
revolutionized the world of scripting. More recently, we have Python
and Ruby, both quite popular. In an organization with primarily Java
skills, is it worthwhile to have your developers learn these languages?
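To make the trade-off concrete, here is the kind of throwaway job where a scripting language shines: counting hits per client address in a web server log. This is a minimal sketch in Python; the file name and the assumption that the client address is the first whitespace-separated field are illustrative, not prescriptive.

    # Count hits per client address in a hypothetical access log
    # whose lines begin with the client address, e.g.:
    # 192.168.0.5 - - [10/Oct/2004:13:55:36] "GET / HTTP/1.0" 200
    from collections import Counter

    hits = Counter()
    with open("access.log") as log:        # hypothetical file name
        for line in log:
            fields = line.split()
            if fields:                     # skip blank lines
                hits[fields[0]] += 1

    # Print the ten busiest clients.
    for addr, count in hits.most_common(10):
        print(f"{count:6d}  {addr}")

A Java shop can of course write the same program, but the scripting version is short enough to be disposable, which is precisely the property these languages are prized for.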
We all know about the benefits of digitally signing email messages using
OpenPGP-based software like GnuPG (or its older commercial counterpart,
PGP). Imagine the same benefits applied to the world of the World Wide
Web.
Web service tools in Java are into their third generation now. Web
services were introduced with the hype of a loosely coupled technology
for interconnecting disparate endpoint systems. But they are, in fact,
suffering from tight coupling to the WSDL data specifications and data
types. Most of the current tools offer quick solutions for exposing
existing code as web services, but none offers a simple yet intuitive
and full-featured client.
Mirrors are extremely useful when used to their full potential, but
this rarely happens. There is nothing wrong with mirrors, only with the
way we use them. I want to make it possible for average users who don't
(and shouldn't need to) know many technical details to automatically
make the best use of mirrors.
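As a sketch of what "automatic" might mean, consider the naive strategy of timing a small probe download against each candidate and picking the fastest responder. The mirror URLs and probe file below are placeholders; a real selector would also weigh mirror freshness and geography.

    # Naive mirror selection: fetch a small file from each candidate
    # and choose the mirror that responds fastest. The URLs below are
    # placeholders, not real servers.
    import time
    import urllib.request

    MIRRORS = [
        "http://mirror1.example.org/pub/",
        "http://mirror2.example.org/pub/",
        "http://mirror3.example.org/pub/",
    ]
    PROBE = "README"  # hypothetical small file present on every mirror

    def probe(url, timeout=5.0):
        """Return seconds taken to fetch the probe file, or None on error."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url + PROBE, timeout=timeout) as resp:
                resp.read()
        except OSError:
            return None
        return time.monotonic() - start

    reachable = [(t, m) for t, m in ((probe(m), m) for m in MIRRORS)
                 if t is not None]
    if reachable:
        best_time, best = min(reachable)
        print(f"fastest mirror: {best} ({best_time:.2f}s)")

Latency probing is only a heuristic, of course: the fastest responder for a one-kilobyte probe is not necessarily the best source for a 600 MB ISO.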
Evolution is a slow process. Getting rid of old bad habits is never
easy. This article is a critique of the Make build tool. I'll list its
shortcomings this week and suggest a few more modern alternatives next
week.
A study of the productivity of software programmers shows the most
talented coders to be over 100 times more efficient than the least
capable. It is clear, however, that there is nowhere near a
commensurate increase in their compensation.
XP is a lightweight software process that values people and
communication over processes and tools. It exploits the benefits of
tight collaboration in an environment in which all stakeholders sit
within talking distance of one another and work in pairs. Extreme
programming works extremely well in delivering great software because,
first, it is fun to do, and second, it eliminates fear, promotes
collective code ownership, and encourages frequent small releases. It
also values working code over detailed designs. It is quite amazing
that the last three or four of these values are in fact shared with the
Free/Open Source movement (see "The Cathedral and the Bazaar").
This article looks at current network management system (NMS) offerings
and considers what would make a "real" NMS.
Recently, I was approached by a company that was planning to release its
core code into the Open Source community. Unfortunately, they didn't
have a lot of experience with the Open Source community, nor did they
have an understanding of what draws developers to work on a specific
project. They were aware that word-of-mouth seemed to be the main
method for building interest in the Open Source world, but they wanted
some advice on how to get the word out in the first place.
The usefulness of the Internet has been severely compromised by a
proliferation of spam, worms, crackers, and viruses. The Internet
has been stifled by harmful traffic (and its related expenses), which
has increased to a now intolerable level. According to the U.N.'s
UNCTAD, Symantec, F-Prot, MessageLabs, and several market analysts,
the financial burden of dealing with harmful Internet traffic reached
tens of billions of Euros this year. Next year will see this increase
to hundreds of billions of Euros if the problems worsen as forecasts
predict. What is destroying the Internet, and can its collapse be
prevented?
The purpose of this essay is to explain why I believe Perl 6, the way it
currently seems to be progressing, is the wrong thing at the wrong time,
and why I predict (with all the expected caveats of predicting anything)
that it won't be successful. I will also suggest a better alternative
for the future of Perl, one which makes more sense at this point.
There are many GUI components available for displaying vector graphics
and animation. Most of them are dedicated to particular classes of
vector images, such as graphs, business diagrams, SVG images,
geographic maps, technical drawings, and financial charts.
Unfortunately, it is not yet always possible to find a component for a
particular application that satisfies its price, licensing, scalability,
performance, stability, feature, and other requirements. And if one
wants to add vector-based visualization to an existing application,
additional integration requirements come into play.
Intrusion detection is one of the major challenges in information
security. In this article, we will consider network intrusion
detection: the analysis of network traffic for suspicious behavior. I
base my argument on my experience with a popular network intrusion
detection system (NIDS) and on informal discussions with other network
administrators.
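As a toy illustration of the simplest thing a NIDS does, signature matching, consider scanning packet payloads for known-bad byte patterns. The signatures below are contrived examples; a real system uses a much richer rule language and full protocol decoding.

    # Toy signature matcher: flag payloads containing known-bad byte
    # patterns. These signatures are made up for illustration only.
    SIGNATURES = {
        b"/bin/sh": "possible shell command injection",
        b"GET /default.ida?": "Code Red-style worm probe",
    }

    def inspect(payload):
        """Return descriptions of every signature found in the payload."""
        return [desc for sig, desc in SIGNATURES.items() if sig in payload]

    # Example: run the matcher over one captured payload.
    for alert in inspect(b"GET /default.ida?NNNN HTTP/1.0"):
        print("ALERT:", alert)

Even this toy hints at where the challenges begin: matching raw bytes says nothing about evasion, fragmentation, or false positives.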
There is a fundamental sea change happening in the industry around us, a
move away from prescriptive top-down mandates to implement a methodology
to a developer-led "viral adoption" of Agile Practices. But exactly how
widespread is this movement?
The GNOME Desktop Environment is a dynamic, young project of which much
is expected. I regret that my own expectations are somewhat higher still.
Over the (fairly) recent past, software developers have been offered a
plethora of panacean remedies that seek to address inherent inadequacies
and observed problems in traditional software development methodologies.
More often than not, however, these remedies come bundled with their own
variants of the same inadequacies and problems, and (at least as far as
software developers are concerned) basically replace an old devil with a
new one. But not all do this. One particular credo that has seen some
well-deserved success in this regard is Extreme Programming.
A new software hosting platform is available, and we are spreading the
word.
A comment on a bug I submitted recently spurred me to provide some
feedback from an application user's perspective on bug reports. There are
ways of responding to a bug report that encourage the types of responses
that are helpful to developers, and there are ways of responding that
only produce anger and frustration, without getting anything fixed. My
hope is to encourage good communication between bug reporters and
developers to enable better, quicker bugfixes.
The history of software development is full of controversies. One of
the oldest is the controversy about modular vs. monolithic software
design.
The GnuCash installation instructions warn non-programmers against even trying to install it. The word “nightmare” is used. Ideally, the process should be quite simple. If the project were distributed using Zero Install, users could safely fetch and run it, with all its required dependencies, using a single command.
Back in the heady days of Macintosh System 7.5, Greg Landweber released Aaron, which changed the system’s windows and buttons to match the “Platinum” appearance of the upcoming MacOS 8 (codenamed “Copland” after the composer Aaron Copland). Hacked versions of Aaron quickly appeared with the MacOS 8 images replaced by images of the hacker’s creation. Landweber realized he had a cash cow and released Kaleidoscope, which could switch “schemes” on the fly.
Following the distributed and coordinated attacks on antispam service
providers over the last month, this article is intended to provide an
overview of one such attack, on email verification technology company
Bluebottle, and to highlight the core challenges faced by email
verification systems in handling email.
This article records our experiences with packaging an application for
many distributions and shows areas in which packagers, Linux
distributions, and developers can improve coordination for better and
easier distribution. We look at communication problems, packaging
errors, package dependencies, menu entries, and bug tracking systems.
Any veteran GNU/Linux user has, at one point or another, run across a
package that uses the autoconf/automake toolset. There is a lot to be
said in favor of this emerging standard. Running "./configure && make
&& make install" usually results in a working installation of whatever
package you are attempting to compile. The autoconf tools are also
portable to almost every *nix platform in existence, which generally
makes it easier to release a program for a large variety of systems.
However, despite these pluses, the auto* tools are a constant thorn in
the side of users and developers alike.
SpamAssassin has emerged as the most popular antispam tool in the
Open Source world. It has gained such momentum that it has even
crossed over into the commercial world as SpamKiller by Network
Associates, and other commercial products are also based on it. This
article is a short comparison of real-world results from two antispam
tools, SpamAssassin and Spastic.
The plethora of Free Software applications available today, none of
which works perfectly, is a problem that stands in the way of major
adoption of Linux on the desktop. In order to conquer the desktop, we
have to stand united.
Ever since my good old Pentium 166 days, I've liked to search for the
best possible optimizations, so that programs take maximum advantage of
the hardware's CPU cycles. If I have a nice piece of hardware, why not
run it at full power, using every little feature? Shouldn't we all try
to get the best results from the money invested in our machines?