Sub: Help needed!!!

chiajack

Dear all,
I am having difficulty writing a macro for the following situation and
urgently need help with it. I would greatly appreciate any kind soul
out there who is willing to help me out. :)

I have two functions:

f = 1 + (1.4) * 2 / .005 * r
g = (1 / (1 - (1 / .005 * r))) ^ 2

I need to write a macro that incrementally calculates, and displays to 6
decimal places, the first positive value of r such that f - g < precision,
where precision is a variable for the user to input.

Thank you.
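
For concreteness, here is a minimal VBA sketch of the kind of macro being
asked for, assuming the two formulas exactly as written above and a fixed
step size of 0.000001; the step size, upper bound, and names are all
illustrative, not a definitive implementation:

Sub FindFirstR()
    ' Sketch: step r upward until f - g < precision, then display r to 6 decimals.
    Dim r As Double, f As Double, g As Double
    Dim precision As Double, stepSize As Double

    precision = CDbl(InputBox("Enter precision:"))   ' user-supplied tolerance
    stepSize = 0.000001                              ' assumed increment; adjust as needed

    r = stepSize
    Do While r < 1                                   ' arbitrary upper bound so the loop always ends
        If Abs(r - 0.005) > stepSize / 2 Then        ' skip the singularity at r = 0.005
            f = 1 + 1.4 * 2 / 0.005 * r
            g = (1 / (1 - (1 / 0.005 * r))) ^ 2
            If f - g < precision Then
                MsgBox "First r found: " & Format(r, "0.000000")
                Exit Sub
            End If
        End If
        r = r + stepSize
    Loop
    MsgBox "No value of r below the upper bound satisfied f - g < precision."
End Sub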
 
Are you sure you specified the problem correctly? I
get the following values:

r          f        g            f - g
0.000001   1.00056  1.00040012    0.00015988
0.00001    1.0056   1.004012032   0.001587968
0.00005    1.028    1.020304051   0.007695949
0.0001     1.056    1.04123282    0.01476718
0.0005     1.28     1.234567901   0.045432099
0.001      1.56     1.5625       -0.0025
0.003      2.68     6.25         -3.57
0.005      3.8      #DIV/0!       #DIV/0!
0.007      4.92     6.25         -1.33
0.009      6.04     1.5625        4.4775
0.01       6.6      1             5.6
0.1        57       0.002770083  56.99722992
0.5        281      0.00010203   280.999898

If these are correct, f - g is not well behaved around zero: r could be
about 0.000001, or it could lie somewhere in (0.007, 0.009).
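
For reference, a quick sketch (same assumed formulas) that prints f, g and
f - g to the Immediate window for a handful of trial values, enough to see
the sign changes described above; the trial values are illustrative:

Sub TabulateDifference()
    ' Sketch: print f, g and f - g for a few trial values of r (view with Ctrl+G).
    Dim trial As Variant, r As Double, f As Double, g As Double
    Debug.Print "r", "f", "g", "f - g"
    For Each trial In Array(0.000001, 0.0001, 0.001, 0.003, 0.007, 0.009, 0.01)
        r = CDbl(trial)
        f = 1 + 1.4 * 2 / 0.005 * r
        g = (1 / (1 - (1 / 0.005 * r))) ^ 2
        Debug.Print r, f, g, f - g
    Next trial
End Sub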

HTH,
Merjet
 