Automatically filling field

  • Thread starter: PF

Hi all,

I've posted this problem before, but I think I didn't explain
myself well.
The problem is that I have the following table:
[dt] [value]
01-01-03 100
02-01-03 110
03-01-03 30
04-01-03
05-01-03
06-01-03 50
.... ...
I want the blank values in the [value] field to be filled
with the last non-blank value (30 for 04-01-03 and 05-01-03).
Does anyone have any idea how to solve this?

Thanks in advance.

PF
 
Create a select query with the two fields mentioned, sorted on [dt]. Use this
query as the source for an update query. In the update query, set the Update
To for [value] to a user-defined function, passing the current value of
[value]. The function should have a variable declared as Static so that it
remembers its last value. If the value passed is Null, assign the static
variable's value to the function. If the value passed isn't Null, assign that
value to both the function and the static variable. The question becomes how
to reset the static variable if you run this more than once; otherwise, if
the first record of the next run has a Null value, it will pick up a value
from the previous run. Of course, it resets when the database is closed.

Now, for another item: it is highly recommended that you DON'T use reserved
words as names. You have a field named value, but Value is a reserved word
that is used to refer to the value of an item.

Example Update Query:
UPDATE Table1 SET Table1.[value] = RepeatMe([value]);

Example Function:
Public Function RepeatMe(varNumber As Variant) As Integer
    Static intNumber As Integer       ' remembers its value between calls
    If Not IsNull(varNumber) Then
        intNumber = varNumber         ' non-blank: remember it
    End If
    RepeatMe = intNumber              ' Null: return the last remembered value
End Function
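For illustration only, the forward-fill logic that RepeatMe implements can be sketched in Python (the list stands in for the [value] column already sorted on [dt], with None playing the role of Null):

```python
def forward_fill(values):
    """Replace each None with the most recent non-None value seen so far."""
    last = None          # plays the role of the Static variable in RepeatMe
    filled = []
    for v in values:
        if v is not None:
            last = v     # non-blank: remember it
        filled.append(last)
    return filled

# The sample data from the question (blanks on 04-01-03 and 05-01-03):
print(forward_fill([100, 110, 30, None, None, 50]))
# → [100, 110, 30, 30, 30, 50]
```

Note that, just like the Static variable, `last` starts out empty, so any leading blanks stay blank.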
 
Hi,



Trying to reformulate the presentation, I found some flaws in my initial proposition, so I hope I have tied up all the loose ends with the following.


Assume a table, KnownData, with two fields, When and Meter:

When        Meter
2003-01-01  100
2003-02-01  110
2003-03-01  30
2003-06-01  50



Assume a table, WantedWhen, with one field, When:

When
2003-01-02
2003-01-30
2003-04-01
2003-05-01
2003-05-30
2003-06-02



Note that my date_time setting is yyyy-mm-dd.


The following SQL statement:

SELECT x.When, Int(Sum(x.cyi))
FROM (SELECT w.When, Max(b.meter) * Exp(Sum(Log(Abs((w.when - a.when) / (b.when - a.when))))) AS cyi
      FROM KnownData AS a INNER JOIN KnownData AS b
           ON a.when <> b.when, WantedWhen AS w
      GROUP BY b.When, w.When) AS x
GROUP BY x.When;


produces an interpolation, with the following results:

Query32
when        Expr1001
2003-01-02  104
2003-01-30  116
2003-04-01  289
2003-05-01  385
2003-05-30  90
2003-06-02  73
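The formula the query evaluates can be sketched in Python for checking (dates converted to day offsets from 2003-01-01; the Abs( ) makes this a lightly modified Lagrange interpolation, and the truncation mimics Int( )):

```python
def lagrange_abs(known, x):
    """Sum over b of meter(b) * product over a <> b of |(x - a)/(b - a)|."""
    total = 0.0
    for bx, by in known:
        prod = 1.0
        for ax, _ in known:
            if ax != bx:
                prod *= abs((x - ax) / (bx - ax))
        total += by * prod
    return int(total)   # truncation, as Int() does for positive values

# KnownData as (day offset from 2003-01-01, Meter):
known = [(0, 100), (31, 110), (59, 30), (151, 50)]

print(lagrange_abs(known, 1))    # 2003-01-02 → 104
print(lagrange_abs(known, 29))   # 2003-01-30 → 116
```

Both values match the first two rows of the result table above.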



Now, the problem is that you can't interpolate back at the points you supplied, since Log( 0 ) would occur, and that is an error. On the other hand, you can interpolate at one second after, and that works fine:

Query32
when                 Expr1001
2003-01-01 00:00:01  100
2003-01-30           116
2003-04-01           289
2003-05-01           385
2003-05-30           90
2003-06-02           73



So, to keep things as simple as possible, the KnownData table should hold its date_time values at midnight, and the WantedWhen table should hold its values at 00:00:01. That avoids all the problems:


Query32
Expr1                Expr2
2003-01-01 00:00:01  100
2003-02-01 00:00:01  110
2003-03-01 00:00:01  30
2003-04-01 00:00:01  289
2003-05-01 00:00:01  385
2003-06-01 00:00:01  50



Note that the formula goes through all the "known data" exactly.


If entering the known data at midnight but interpolating at 00:00:01 is a convention you can live with, the query is no more complex than that. Otherwise, I would have to find a way to bypass the Log( 0 ) problem.


Observe that Lagrangian interpolation (well, here lightly modified with the Abs( )) can produce large fluctuations (here, is it "right" to bump up to 385 on 2003-05-01?), mainly if the sequence has a lot of ups and downs (100, 110 (up), 30 (down), 50 (up)). Sure, with more "known" points, the Lagrangian interpolation can be hoped to be smoother... but that is not always the case.



Hoping it may help,
Vanderghast, Access MVP
 
Hi,

Not terribly much more complex, finally:


SELECT x.When AS Expr1, Int(Sum(x.cyi)) AS Expr2
FROM (SELECT w.When, Max(b.meter) * Exp(Sum(IIf(w.when = a.when, 0,
           Log(Abs((w.when - a.when) / (b.when - a.when)))))) *
           Sgn(Min(Abs(w.when - a.when))) AS cyi
      FROM KnownData AS a INNER JOIN KnownData AS b
           ON a.when <> b.when, WantedWhen AS w
      GROUP BY b.When, w.When) AS x
GROUP BY x.When;




allows you to interpolate at the points you already used, exactly, without
having to add one second to them. Note that Sgn is a VBA function that
returns the sign of its argument, so Sgn(Min(Abs( ))) returns 0 if there
"was" a potential Log( 0 ), and 1 otherwise.
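The guarded formula can likewise be sketched in Python (day offsets again; the IIf skips the Log at a coincident point, and the Sgn(Min(Abs( ))) factor zeroes out every other term, so a known point is reproduced exactly):

```python
import math

def interpolate_guarded(known, x):
    """Guarded modified-Lagrange: exact at known points, no log(0)."""
    total = 0.0
    for bx, by in known:
        # Sgn(Min(Abs(x - a))): 0 when x coincides with some other known point
        min_abs = min(abs(x - ax) for ax, _ in known if ax != bx)
        sgn = 0 if min_abs == 0 else 1
        s = 0.0
        for ax, _ in known:
            if ax != bx and x != ax:   # IIf(w.when = a.when, 0, Log(...))
                s += math.log(abs((x - ax) / (bx - ax)))
        total += by * math.exp(s) * sgn
    return int(total)

known = [(0, 100), (31, 110), (59, 30), (151, 50)]
print(interpolate_guarded(known, 0))   # a known point → 100, exactly
print(interpolate_guarded(known, 1))   # → 104, as before
```

At a known point, the term for that point has every log ratio equal to Log(1) = 0 and so contributes the meter value itself, while every other term is killed by the sign factor.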




 
This would be kind of slow, but it should be fairly easy to understand.

UPDATE YourTable
SET [Value] = DLookup("[Value]", "YourTable",
    "dt = #" & DMax("dt", "YourTable", "dt < #" & [dt] & "#") & "#")
WHERE [Value] Is Null;
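As a rough Python sketch of what that per-row lookup does (dates as integers, None standing for Null): for each blank row, DMax finds the latest earlier date and DLookup copies that row's value. With two consecutive blanks, the second one looks up a row that is itself blank, so this approach can need more than one run, unlike the Static-variable function:

```python
def fill_once(rows):
    """One pass: for each blank row, copy the value from the latest
    earlier date, as DMax + DLookup would (using pre-update values)."""
    snapshot = dict(rows)              # values as they were before the update
    out = []
    for d, v in rows:
        if v is None:
            prior = [p for p in snapshot if p < d]
            v = snapshot[max(prior)] if prior else None
        out.append((d, v))
    return out

rows = [(1, 100), (2, 110), (3, 30), (4, None), (5, None), (6, 50)]
print(fill_once(rows))
# → [(1, 100), (2, 110), (3, 30), (4, 30), (5, None), (6, 50)]
```

Whether Access's UPDATE sees rows it has already changed within the same statement isn't something to rely on; the snapshot here assumes it does not, which is why row 5 stays blank until a second run.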
 