OK, we might have a difference of philosophy here, but at least this is what
I would do:
Lose the TTables. I know I have said so before, and I am not pulling your
leg. I really do mean it.
You say network traffic is an issue, and TTables are NOT helping at all.
Keep in mind that Cached Updates will NOT reduce network traffic, merely
save it all up for one huge bang. And if network traffic is an issue, I
would go for the "odd bit now and then" network traffic, rather than the
"once every now and then we'll jam it up good" approach.
I take it from earlier mails on this matter that you are writing an
input-intensive form, which removes the possibility of using "Add", "Edit"
and "Delete" buttons to prompt input in a separate form. And I think I
remember something about this being in a grid. If so, this is what I would
do.
Make a standard grid (a TStringGrid), NOT the database type (a TDBGrid).
(You get more control over formatting and the database stuff by doing it
manually.)
1 : set "MyGrid.RowCount:=5000;" or some other silly high number.
2 : Empty the grid.
3 : Fill it with appropriate data.
4 : Use the "OnSelectCell" to detect a row change. (Store the last row, if
current row <> last row, do point 5, if not, ignore, because the user is
still on the same row)
5 : Take the row the user was on and perform an update or insert into the
database. I would recommend using another grid with the visible propery set
to False, where you store a flag stating if the line has been inserted
before or if it is a new one, possibly with the ID generated by your
trigger. This way you save time on the database stuff, because you will
know if you need to update or insert. Update the invisible grid with flag
when having performed an insert.
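To make points 4 and 5 concrete, here is a minimal sketch. It assumes a
TStringGrid named MyGrid, a second (invisible) TStringGrid named FlagGrid,
and a form field FLastRow: Longint; the InsertRowIntoDatabase and
UpdateRowInDatabase helpers are hypothetical placeholders for your own
database code, not anything from your form:

procedure TMyForm.MyGridSelectCell(Sender: TObject; ACol, ARow: Longint;
  var CanSelect: Boolean);
begin
  // Point 4: only act when the user actually leaves a row.
  if ARow <> FLastRow then
  begin
    SaveRow(FLastRow);  // point 5: save the row the user just left
    FLastRow := ARow;
  end;
  CanSelect := True;
end;

procedure TMyForm.SaveRow(Row: Longint);
begin
  // Point 5: the invisible FlagGrid remembers whether this row has been
  // inserted before, so we know whether to insert or update.
  if FlagGrid.Cells[0, Row] = '' then
  begin
    InsertRowIntoDatabase(Row);         // hypothetical helper
    FlagGrid.Cells[0, Row] := 'SAVED';  // flag it (store the ID here too)
  end
  else
    UpdateRowInDatabase(Row);           // hypothetical helper
end;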
The trigger in itself is a common problem area. I am not familiar with
Interbase, as I use Oracle mostly, but in essence, if you set a field as
NOT NULL and then have the trigger fire on the "AfterInsert" event, you
will violate the database design: the row arrives with NULL in that field
before the trigger gets a chance to fill it in. You need the trigger on the
"BeforeInsert" event.
I would also consider dropping the trigger altogether, because bug testing
becomes more difficult. How about just doing a "select max(id_i_want) from
my_table for update", which will lock it while you get the next number? Or
use a stored procedure to do this. But anyway, make sure your trigger works
OK.
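If you go the "select ... for update" route, a minimal sketch could look
like this, assuming a TQuery named qryNextId hooked up to your database;
the table and column names are just the placeholders from above:

function TMyForm.GetNextId: Longint;
begin
  // The FOR UPDATE clause holds a lock while we read the current maximum,
  // so two users cannot grab the same number at the same time.
  qryNextId.SQL.Text := 'select max(id_i_want) from my_table for update';
  qryNextId.Open;
  try
    Result := qryNextId.Fields[0].AsInteger + 1;
  finally
    qryNextId.Close;
  end;
end;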
The above is one approach that works for me.
You could also of course turn off the CachedUpdates thingy on the TTable if
traffic is an issue, since as I said earlier, it only saves the traffic for
later rather than reducing it.
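At runtime that is a one-liner (it can equally be set to False in the
Object Inspector); MyTable is a placeholder name:

MyTable.CachedUpdates := False;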
And, if database response time while typing is an issue, add a TDatabase,
and then, when showing the screen, do a "MyDatabase.StartTransaction;".
Then, when the form is finished, do a "MyDatabase.Commit;", with a
"MyDatabase.Rollback;" (or a fix of the problem and another commit) in the
event of an exception.
This will reduce the time spent on each insert/update as you go along.
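A minimal sketch of that, assuming a TDatabase named MyDatabase already
pointed at your alias, using the standard form events:

procedure TMyForm.FormShow(Sender: TObject);
begin
  // One transaction spans the whole life of the form.
  MyDatabase.StartTransaction;
end;

procedure TMyForm.FormClose(Sender: TObject; var Action: TCloseAction);
begin
  try
    MyDatabase.Commit;
  except
    // Or fix the offending row and try the commit again, as suited.
    MyDatabase.Rollback;
    raise;
  end;
end;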
Hope this helps,
Roger
Philip Cain wrote in message <350c2bc0.645...@forums.borland.com>...
>Roger, Old Buddy,
>The ID is produced in a trigger in the database.
>On the form in question, the tables are not dynamically linked, either
>by a table join, table lookup or master/detail relationship. There is,
>of course, an implied master/detail relationship (one customer has
>many contacts) but I haven't made it explicit in the interface. What
>I'm trying to do is to use the UpdateRecord event to manually fill in
>the CustID value in the contacts table (I get it from the database if
>the customer is new, or from the customer record if the customer already
>exists.)
>My motivation for using cached updates is merely to get the job
>working. I haven't yet had the luxury to think about performance
>issues. I was forced into cached updates by the very issue that began
>the Tool Time thread. There, I was using TTables, as was my wont, to
>do a linked lookup. There is no TTable solution to that problem,
>apparently. Others suggested that I could do it with TQuery and
>TUpdateSQL tools and indeed I could. It worked but that combination
>cannot work without cached updates. At any rate, once having used that
>approach successfully, I followed my habit by continuing to use the
>approach that worked, but I reverted to TTable on the next form I had
>to build. There were no links or joins to deal with and TTable seems a
>more straightforward approach to simple table management. Cached
>updates seems to make sense here because the Customer schema is
>transaction rich for any customer and network traffic is an issue.
>I know that there are a lot of different approaches one can take with
>these data access tools. What I would really like is to know one
>coding approach that works in, say, 99% of the cases (this is pretty
>ordinary database management, after all) and just use it. Instead, I
>seem to come up short every time I make the slightest variance from a
>working model. So the requirements on a new form are a little
>different than another and so I vary the work to match it and the work
>fails. Pretty frustrating.
>If there's a standard or best-use approach, I'd like to know it.
>Phil Cain
>"Roger Arnesen" <w...@cares.kom> wrote:
>>Phillip, Old Friend,
>>Personally I do not use Cached Updates, but if you would like suggestions
>>for different approaches, I would be happy to provide them.
>>First off, some questions:
>>Exactly why are you using Cached Updates? Is it speed related or
>>validation/transaction related?
>>Is the ID generated by a trigger on the database or in code?
>>I assume that the TTable in question is linked to another TTable, with
>>the missing field as the key? If so, that could be causing your problem.
>>I would recommend updating directly as you go along. CachedUpdates are
>>basically a "FastTrack" approach to transaction control, and personally I
>>don't believe in free lunches, so you may have to do this the "proper"
>>way.
>>Roger