
Future tools should do less building and more analyzing


2007-08-12 01:27:33 AM
I think that modern computer programmers would be
better off if we could do more good fact-based
analysis and less "wondering".
The latest programming tools have tended to be
more for building programs rather than for analyzing
programs. In this push to create tools for building
programs, we have also been overlooking significant
gaps in the tool capabilities (like the lack of
(a) a good variable type for designating a particular
time or doing arithmetic with time values, (b) fixed
and floating point decimal number variables that
could match our business calculations, and (c)
assignments that will warn us of loss of precision).
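To make point (b) concrete, here is a minimal console sketch using only
stock Delphi types (an illustration of the gap, not a proposed fix):
summing one cent a hundred times drifts in the binary Double type but
stays exact in the fixed-point Currency type.

  program BusinessMath;
  {$APPTYPE CONSOLE}
  uses SysUtils;
  var
    D: Double;
    C: Currency;
    I: Integer;
  begin
    D := 0;
    C := 0;
    for I := 1 to 100 do
    begin
      D := D + 0.01;  // 0.01 has no exact binary representation
      C := C + 0.01;  // Currency is a scaled 64-bit integer, 4 decimal places
    end;
    Writeln(Format('Double:   %.17f', [D]));  // typically 1.00000000000000007, not 1
    Writeln(Format('Currency: %m', [C]));     // exactly 1.00
  end.

And an assignment like D := SomeInt64 compiles without a murmur, even
though Int64 values above 2^53 silently lose precision in a Double --
exactly the kind of thing point (c) asks the compiler to flag.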
We need to get back to
(1) filling in the gaps in the capabilities of our
tools so that our tools could better fit the problem
and
(2) making better analysis tools that would tell us
why our programs are failing or might fail.
This could lead to a lot less wondering, especially
group wondering, more programmers who could solve
their own problems, and, who knows, maybe even a
newsgroup server, compiler, IDE, or TLB editor that
could diagnose its own problems.
JohnH, 2007-08-11
 
 

Re:Future tools should do less building and more analyzing

John Herbster writes:
Quote
I think that modern computer programmers would be
better off if we could do more good fact-based
analysis and less "wondering".

Generally, this is what profilers are for.
What I find wrong with development today is not the tools we have, but
the failure of IT shops to properly architect and scope an application
before development begins.
Things have surely changed since I began my career in a professional IT
environment. I remember the days when the IT managers had actually
worked in IT. Application managers had actually written programs
themselves, so they understood the fundamentals of proper SDLC. They
understood the nuances of application development because they had
actually been there and done that.
Network managers had actually been engineers and had actually built real
systems and understood what it meant to properly set up and spec out
things like latency issues, security, and system matrices.
Today, I meet precious few IT managers who know an array from a map, a
function from a procedure, a class hierarchy from a copybook or lib,
etc. Yet these same managers attempt to manage IT and IT projects (I
will include PMs here as well).
Therefore, once the basic business requirements are gathered, these
people believe the specs are adequate to begin development. By this I
mean they stop at the BRD/FRD level and do not think about things like
architecting the actual programs with Sequence, Activity, State, and
Class diagrams. Of course, years ago UML was not used, but pick your
methodology here (flow charts, prototyping, pseudo code, etc.). They do
not want to spend the time doing these essential steps,
and many times will not even allow cross-group architecture (where one
group controls one part of the application and another group the other
part or parts of the application). All they seem to care about is
pumping out applications and coming up with some timeline estimate,
without doing the homework necessary to provide a realistic timeline.
65% or more of development issues could be and would be sorted out if,
as part of the SDLC process, time was properly allocated for the design
process before actual coding began. But when managers and PMs who
understand nothing about IT are running the show, they do not understand
why this needs to be done, and they view such activities as stalling or a
waste of development time and resources. IMHO, this is why 70% of
software projects run over time and over budget. It did not use to be
this way, and that of course is the basis of my opinion.
I remember the days when, for a typical 12-month project, we would spend
3 months gathering business requirements, 6 months architecting the
design, and only 3-4 months actually writing the code. Why such an
apparent discrepancy? Because we knew exactly what we were coding, why,
and how. We knew what functions, classes, and procedures we needed to
develop, what external APIs we were going to use and why. We had real
design specs, blueprints for the application, not just business
requirements, and we had developed these across all tiers where
integration was necessary. Sure, this process took time, but it saved in
actual development time and made maintenance so much easier. We rarely
went over time and over budget, and the resulting code was clean. QA and
UAT testing were dramatically reduced, since the application was coded
according to the specs to begin with. We did not have to discover what
the REAL application requirements were during the development process,
and the "gotchas" were kept to a bare minimum, since design flaws were
flushed out before coding actually began.
Having lived in both realms, I have to say I miss the "good ole days." I
am so sick and tired of having a BRD in front of me and then being
required to provide an estimate of how much time it will take to develop
X and how many resources I need, before I have a chance to do any true
architecture. It is much too common in today's IT shops and it means
more hours and more frustration. Unfortunately, too many of today's
developers do not know anything about properly planning and scoping out
projects, because they have never seen what would seem to be rational
steps to ensure success actually followed. It is not these developers'
fault either, since they have never worked under a system where this is
considered the norm rather than the exception.
We have better tools today than ever before, yet we have more buggy code
being released. Why? I remember when there were very few IDEs, yet
the code was cleaner, tighter, and better integrated. Fewer tools, but
better management and planning were the key.
The old adage is always true, yet is rapidly becoming a lost art, "Fail
to plan, plan to fail." Too many today are failing to plan and the
results are obvious.
 

Re:Future tools should do less building and more analyzing

Being an old-timer in this business myself, I couldn't agree more. Today's
software design often works as if a car manufacturer drew the new car on
the wall and let the mechanics start working. I too am sick of all these
"I need an application doing this and that, what does it cost and how
long does it take?" requests. But many IT shops also have this one underpaid
guy (see the link in John Jacobson's post "Atlas shrugged off") who works day
and night and manages to knit something together, sometimes good, sometimes bad.
Joe
 

Re:Future tools should do less building and more analyzing

Quote
The old adage is always true, yet is rapidly becoming a lost art, "Fail
to plan, plan to fail." Too many today are failing to plan and the
results are obvious.
I have to concur; this has fallen sort of "out of fashion". All the
time-saving and project-saving decisions are IME made (or not made) in the
very early days, by knowing not just where you want to go, but also how you
will get there.
There are even "methodologies" that focus on getting things done first, however
dirty they may be done, and then hoping everything can at a later stage be
refactored into stability/performance/whatever (yes, vanilla extreme programming,
I'm looking at you)... Of course, at the later stage there is no time for
cleanup, not least because "getting things done however dirty" isn't a
time-saver in itself.
Eric
 

Re:Future tools should do less building and more analyzing

Paul Nichols [TeamB] writes:
Quote

The old adage is always true, yet is rapidly becoming a lost art, "Fail
to plan, plan to fail." Too many today are failing to plan and the
results are obvious.
One thing about planning is that for smaller projects, it is possible to
reach the end without proper planning and have a decent outcome. But
very quickly, as the project size increases, the lack of planning will
become a major problem.
We have one of these large projects in a bad state and, as sad as it is,
the current managers and the young "architects" of the project don't
seem to understand the importance of the planning phase. It was skipped
altogether and they see now that it was a mistake, but they don't seem
to understand that it was a huge mistake and it may very well doom the
whole project.
And yes, we are using the very latest tools from our vendor. <Sigh>
David S.
 

Re:Future tools should do less building and more analyzing

"Eric Grange" <XXXX@XXXXX.COM>writes
Quote
>The old adage is always true, yet is rapidly becoming a lost art, "Fail
>to plan, plan to fail." Too many today are failing to plan and the
>results are obvious.

I have to concur; this has fallen sort of "out of fashion". All the
time-saving and project-saving decisions are IME made (or not made) in the
very early days, by knowing not just where you want to go, but also how
you'll get there.

There are even "methodologies" that focus on getting things done first,
however dirty they may be done, and then hoping everything can at a later
stage be refactored into stability/performance/whatever (yes, vanilla
extreme programming, I am looking at you)... Of course, at the later stage
there is no time for cleanup, not least because "getting things
done however dirty" isn't a time-saver in itself.

Totally agree, which is why "vanilla" Extreme Programming of the last decade
has been just another Emperor-with-no-clothes.
-d
 

Re:Future tools should do less building and more analyzing

David Smith writes:
Quote
Paul Nichols [TeamB] writes:
>
>The old adage is always true, yet is rapidly becoming a lost art,
>"Fail to plan, plan to fail." Too many today are failing to plan and
>the results are obvious.

One thing about planning is that for smaller projects, it is possible to
reach the end without proper planning and have a decent outcome. But
very quickly, as the project size increases, the lack of planning will
become a major problem.

We have one of these large projects in a bad state and, as sad as it is,
the current managers and the young "architects" of the project don't
seem to understand the importance of the planning phase. It was skipped
altogether and they see now that it was a mistake, but they don't seem
to understand that it was a huge mistake and it may very well doom the
whole project.

And yes, we are using the very latest tools from our vendor. <Sigh>

David S.
You are certainly not alone, David. On the last project I had (prior to
my new position), my team worked 60-80 hours a week for months. I wrote
documents, protested, etc. I tried to explain that the project was
woefully underplanned and not even remotely scoped properly. The
response back was "NO excuses, we only want to see results!"
I lost two people on this project (very good and irreplaceable
developers) as a result. I am surprised I did not lose more!
 

Re:Future tools should do less building and more analyzing

David Smith writes:
Quote
Paul Nichols [TeamB] writes:
>
>The old adage is always true, yet is rapidly becoming a lost art,
>"Fail to plan, plan to fail." Too many today are failing to plan and
>the results are obvious.

One thing about planning is that for smaller projects, it is possible to
reach the end without proper planning and have a decent outcome. But
very quickly, as the project size increases, the lack of planning will
become a major problem.

We have one of these large projects in a bad state and, as sad as it is,
the current managers and the young "architects" of the project don't
seem to understand the importance of the planning phase. It was skipped
altogether and they see now that it was a mistake, but they don't seem
to understand that it was a huge mistake and it may very well doom the
whole project.

I'm curious. Now that "it is a mistake" is known, how are they
going to solve the problem?
Also, pardon my ignorance, but what exactly is the 'planning phase'?
What does one plan for? How does one plan? I realize these questions
are probably answered during the academic phase in Computer Science,
but since I am not a Comp.Sci. graduate, I am curious as to how a
person such as I would be able to plan projects.
I have a project which has a function in the grand scheme
of things at work. I have the tools to complete this project. I'm
the only person developing said project. What exactly
do I need to plan, aside from the typical "which objects should
be developed to handle this"?
Edmund
 

Re:Future tools should do less building and more analyzing

Is this all about ALM or profiling code?
Edmund
 

Re:Future tools should do less building and more analyzing

"Ed" <XXXX@XXXXX.COM>wrote
Quote
Is this all about ALM or profiling code?
Edmund,
I intended it to be about writing and maintaining
code that better fits the problem and which could
be made more reliable. But apparently there is
still a lot of ALM interest around.
Rgds, JohnH
 

Re:Future tools should do less building and more analyzing

Ed writes:
Quote
Is this all about ALM or profiling code?
It's about first starting the brain before working.
ALM as it is nowadays is only an excuse not to use
the brain for what one got it for: thinking.
Martin
 

Re:Future tools should do less building and more analyzing

Ed writes:
Quote
David Smith writes:
>Paul Nichols [TeamB] writes:
>>
>
I'm curious. Now that "it is a mistake" is known, how are they
going to solve the problem?

That's the problem: there is no way to solve the issue if you do not
properly plan up front. The project scope is blown, timelines become
blurred and meaningless, and the developers are required to work
unreasonable hours, rushing to take the stench of dissatisfaction
off of the ones who made the bad decisions to start with.
You basically are left with the task of trying to design and redesign,
as well as constantly rewrite code, as you come to understand exactly
what you are working with and what the real requirements and expectations
were in the first place. The code gets messy and you are left with
spaghetti code that is refactored and refactored and refactored.
Quote
Also, pardon my ignorance, but what exactly is the 'planning phase'?
What does one plan for? How does one plan? I realize these questions
are probably answered during the academic phase in Computer Science,
but since I am not a Comp.Sci. graduate, I am curious as to how a
person such as I would be able to plan projects.

Well, it always depends on the project you are working on as to how
much planning you need to do, but basically the following paradigm will
work.
Planning Phase:
Step One: Talk with the unit desiring to have the application written.
This could be an internal business customer or an external entity. Of
course, if you are the person writing an application for distribution at
large, you are the customer. Write these requirements down, and
reiterate the requirements as you understand them. Question anything you
are not clear about, to make sure you understand exactly what the
customer is expecting the application to do and why. Generally I follow
a police investigative technique. What is the technique? Always ask the
following questions:
1. Who (who is it for, who is the intended audience?)
2. What (what are you writing, what does it do, what does it need to
   integrate with, what tools/languages/databases, OSes, app servers,
   etc. do I need?)
3. Where (where will the application be deployed? Will it be deployed
   as a server-based application, client/server, on Unix, Linux,
   Solaris, an app server, etc.?)
4. When (when does it need to be in service, when does all of the
   functionality need to be in place, can we separate the
   functionality into separate releases?)
5. How (how do I go about developing this app, what
   languages/tools/databases do we need to develop for, how do I
   integrate with other sources, how do I class this application, how
   do I develop and break out functionality, etc.?)
If you ask these questions and are diligent about answering all 5, you
will generally come out with the information you need.
Once you get the answers above, you start the formal design process.
Usually the above answers come in the form of Use Cases, either a Use
Case diagram or some other similar Use Case methodology (like bullet
points or flow charts). With databases, this will usually consist of an
ERD (Entity Relationship Diagram).
Once you have a system of Use Cases in a formal document, share this
with the business users or client(s). Make sure that they understand
what your understanding is, and add to or modify the initial
requirements as needed. If you are lucky, you will have a
Business Analyst or PM who will have already scoped this out for you, so
you may or may not be involved in the initial steps above. However,
even if I have a BA or PM who gathers the initial requirements
(usually in a Business Requirements Document, or BRD), I will usually put
this in Use Cases for my team.
If the business or customers have a specific time they need an
application, you may find that you have to trim the requirements and
employ a phased rollout approach. By this I mean that you set
application priorities where you define what the must-have
functionalities are in an initial release, with the understanding that added
functionality will come in subsequent releases.
Once these requirements are well documented and signed off on (very
important: if the customer/business does not sign off, you are inviting
and even encouraging scope creep), you start the process of modeling. How
involved the modeling process is will depend on the
complexity of the application itself.
Usually, I try to create an effective class model, modeling the base
classes first. For more complex parts of the application process, I may
use Sequence, Activity, or State diagrams. You do not necessarily have
to go the UML route, but with a good UML tool (like Togethersoft), you
can actually use your model to create the core code, which serves two
purposes.
(1) It is easier to find flaws when writing good models. Using UML for
this design process, you are not wasting effort as you would with
something like flow charts or even pseudo code (which may still be
necessary for complex parts), because the model can actually be used to
create the actual code itself.
(2) Your model becomes a self-documenting core code base. This is
extremely useful for integrating existing code into a new or
extended application.
Once these models are established, you can start handing them off to your
team. Now they have a good model and good requirements from which to
work. This modeling process needs to be performed not only for your own
application, but for all of the programs you will be working with or
integrating with as well.
For instance, if your application is integrating with an existing
database, you need to have a working ERD model. If you are integrating
with Web Services, you need good documentation explaining how these
services are supposed to work and how you call them, what the data types
are, etc. Sequence diagrams are great here, but if you do not have
these, at the least some documentation and the XSDs will come in handy.
Once you have undertaken this formal process, you can give a good
picture of how long and how many resources you will need to complete the
program. You should, during this process, flush out any
potential problem areas as well. For instance, if the business rules say
that I must calculate taxes using a Web Service for each state and
locale for an Order Entry System, then I should be able to see potential
problems where locales overlap. Take Bristol TN and Bristol VA
as an example. The city actually resides in two states and has two
locales. Taxes could be a problem to calculate unless the locale is
specifically identified and some lookup routine identifies which
Bristol we are dealing with to calculate taxes. This small detail
might be overlooked in an area where the development is far removed from
these types of scenarios. You could potentially code the application
without taking this into account and only discover at QA, UAT, or
Production that you have missed a major potential problem. Proper design
and discussion would probably have caught this using the Who, What,
Where, When, and How model of investigation.
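As a rough sketch of such a lookup routine (the record layout, field
names, and rate field here are invented purely for illustration), keying
on city plus state avoids the two-Bristols collision; SameText comes
from the SysUtils unit:

  type
    TTaxLocale = record
      City: string;
      State: string;   // two-letter code: 'TN', 'VA'
      Rate: Double;    // combined state + local rate (illustrative only)
    end;

  // Returns the index of the matching locale, or -1 if none matches.
  // Keying on City alone would make Bristol TN and Bristol VA collide.
  function FindLocale(const Locales: array of TTaxLocale;
    const City, State: string): Integer;
  var
    I: Integer;
  begin
    Result := -1;
    for I := Low(Locales) to High(Locales) do
      if SameText(Locales[I].City, City) and
         SameText(Locales[I].State, State) then
      begin
        Result := I;
        Break;
      end;
  end;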
Quote
I have a project which has a function in the grand scheme
of things at work. I have the tools to complete this project. I'm
the only person who is developing the said project. What exactly
do I need to plan aside for the typical "which objects should
be developed to handle this"?


Use the investigative methodology (Who, What, When, Where, and
How) and you should be able to determine the planning process. If you
are writing a brand new application that does not need to integrate
with any other app, you should still plan, but the planning is not as
critical once you get past the Use Case stage. However, if your
application is integrating with other applications and back-end systems,
you need to get with these groups and determine what type of planning
and documentation would be beneficial to you and to them.
Remember, the application you write today will probably need to be
expanded and updated for years. You may not be the one doing the
maintenance years from now or you may indeed be the one who hasn't even
thought about this code for years.
Hope this helps!!
 

Re:Future tools should do less building and more analyzing

Paul Nichols [TeamB] writes:
Quote

You basically are left with the task of trying to design and redesign,
as well as constantly rewrite code, as you come to understand exactly
what you are working with and what the real requirements and expectations
were in the first place. The code gets messy and you are left with
spaghetti code that is refactored and refactored and refactored.

I can see how this issue may not be resolvable.
Quote
Well, it always depends on the project you are working on as to how
much planning you need to do, but basically the following paradigm will
work.

Planning Phase:

Step One: Talk with the unit desiring to have the application written.
Had that talk once for a project. The unit desiring this application
apparently decided to change the format and/or requirements without
mentioning it to me. While I didn't need to rewrite the code from
scratch, I had to change it because the format changed.
Now, what makes me furious is that the idiots requiring
the program are actually expecting *me* to run it. Basically I
wasted *my* time creating some sort of UI when I could've
just slapped together a damn script and run it.
Quote
Hope this helps!!
Thanks Paul! Very much appreciated and definitely gives me
a starting point to understand how to do a project correctly.
Should also be part of a programming FAQ.
Edmund
 

Re:Future tools should do less building and more analyzing

Paul Nichols [TeamB] writes:
Quote
John Herbster writes:
>I think that modern computer programmers would be better off if we
>could do more good fact-based analysis and less "wondering".
>
Generally, this is what profilers are for.

What I find wrong with development today is not the tools we have,
but the failure of IT shops to properly architect and scope an
application before development begins.

Totally agree. It angers me when, as a developer, I am asked to make time
projections and resource estimations without having been made part of
the initial analysis and design process.
To tie in with John, I think his point is valid. Often a developer will
estimate, say, 6 weeks for development. During testing it is discovered
that (maybe) the numerical values are off. Now starts the debugging.
Even though the code and formulas might be correct, the lack of exact
numerical types, for instance, will be the cause of the error and not the
code itself. This can lead to *lengthy* fruitless debugging sessions if
the developer does not know this. Having tools to do a deeper analysis
can help shorten these fruitless debugging exercises.
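A tiny made-up example of the trap (standard Delphi; SameValue comes
from the Math unit): the formula below is perfectly correct, yet a naive
equality test reports a "bug":

  program CompareDemo;
  {$APPTYPE CONSOLE}
  uses Math;
  var
    A, B: Double;
  begin
    A := 0.1 + 0.2;                   // the formula is right...
    B := 0.3;
    Writeln(A = B);                   // FALSE: binary rounding, not a logic error
    Writeln(SameValue(A, B, 1E-12));  // TRUE: compare floats with a tolerance
  end.

A developer who does not know this can burn days hunting a defect that
is not in his code at all.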
--
 

Re:Future tools should do less building and more analyzing

John Herbster writes:
Quote
we have also been overlooking significant
gaps in the tool capabilities (like the lack of
(a) a good variable type for designating a particular
time or doing arithmetic with time values, (b) fixed
and floating point decimal number variables that
could match our business calculations, and (c)
assignments that will warn us of loss of precision).
All three are 'commercially hopeless'. As a tool producer, CodeGear
can't come now and say "we did it all wrong all those years, and BTW,
we have also helped YOU to do it all wrong..."
As for the floating point 'problem', I think you must start at
the education end - you *must* make people understand that computers
don't work with 'theoretical mathematical concepts' but with a limited
number of binary bits. The concept of limited precision must be taught,
not just mentioned once...
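A one-line demonstration does more for that lesson than any lecture
(the exact digits printed may vary with the runtime, but the point
stands):

  program TenthDemo;
  {$APPTYPE CONSOLE}
  uses SysUtils;
  var
    X: Double;
  begin
    X := 0.1;                       // looks exact in the source code...
    Writeln(Format('%.20f', [X]));  // ...prints roughly 0.10000000000000000555
    // 1/10 repeats forever in binary, just as 1/3 does in decimal,
    // so the nearest representable Double is stored instead.
  end.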
Both (a) and (b) can be solved partly with (community created?)
libraries, but the solution really needs to go into the product to
'force' everyone to use the same library. And Borland/CodeGear has
already shown us that they are happy to include community projects in
the product if they are good enough - just get the projects started,
finished, and verified, and they might well get into Delphi.
Quote
This could lead to a lot less wondering, especially
group wondering, more programmers that could solve
their own problems,
Assuming there is documentation clearly stating what the different
concepts are good for, where they should be used (and where they
shouldn't), etc.
Also assuming the average IDE user is able to read (and understand)
that documentation...
--
Anders Isaksson, Sweden
BlockCAD: web.telia.com/~u16122508/proglego.htm
Gallery: web.telia.com/~u16122508/gallery/index.htm