Wikipedia:Reference desk/Archives/Mathematics/2021 October 2
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
October 2
segmenting a general curve to approximate it with cubics
To approximate an arbitrary parametric curve in 3-space with line segments, as one does, I use the vector rejection of the second derivative on the first to estimate how soon the difference between the line and the curve will exceed the desired resolution; this determines the spacing of the sample points.
In various unrelated projects, I have the analogous problem of fitting cubic splines. What is the customary way to place the minimum number of knots for a desired accuracy? (A quick search finds answers for circles and ellipses, but I want a more general answer.) —Tamfang (talk) 19:28, 2 October 2021 (UTC)
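The line-segment heuristic in the question can be sketched in a few lines. This is my own reading of it, not code from the thread: the component of the second derivative orthogonal to the first (the vector rejection) measures how fast the curve bends away from its tangent line, and the chord deviates from the curve by roughly $|\mathrm{rej}|\,h^2/8$ over a parameter step $h$; the helix example and the factor 8 are my assumptions.

```python
import numpy as np

def segment_step(d1, d2, tol):
    # Vector rejection of the second derivative d2 on the first derivative d1:
    # the part of d2 orthogonal to the tangent direction.
    rej = d2 - (np.dot(d2, d1) / np.dot(d1, d1)) * d1
    r = np.linalg.norm(rej)
    # Chord-to-curve deviation over a step h is ~ r * h**2 / 8 at the midpoint,
    # so solve r * h**2 / 8 = tol for the step size.
    return np.sqrt(8.0 * tol / r) if r > 0 else np.inf

# Example: derivatives of the helix (cos t, sin t, t) at t = 0.
d1 = np.array([0.0, 1.0, 1.0])
d2 = np.array([-1.0, 0.0, 0.0])
h = segment_step(d1, d2, 1e-4)
```

Here `d2` is already orthogonal to `d1`, so the rejection is `d2` itself and the step comes out as $\sqrt{8\cdot 10^{-4}}$ in parameter units.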
- The error (supremum of the absolute difference between a continuous curve and the approximating spline) is $O(h^4)$, where $h$ is the distance between successive points. If $h$ is small enough, the error becomes proportional to $h^4$, so halving $h$ means reducing the error by a factor of 16. If you can determine the error $E$ for a given small value of $h$, and you want it to be below a tolerance $T$, you should choose a distance less than $h\,(T/E)^{1/4}$. --Lambiam 21:35, 2 October 2021 (UTC)
- I could express the ideal curve as a ~~Fourier~~ Taylor series around the current point, and read off a bound on $|f^{(4)}|$; but it would be pessimistic, because what matters is how much of the error is orthogonal to the curve. (I say "a ~~Fourier~~ Taylor series" singular because I'm working in the complex plane.) —Tamfang (talk) 23:49, 2 October 2021 (UTC)
- If you can express the target curve as a ~~Fourier~~ Taylor series, then, apparently, it is given in analytic form. Is it given in the form of a function $f\colon [0,1] \to \mathbb{C}$? If so, does the following make sense? Let $f(t)$ be the parametric representation of the target and $g(t)$ that of an approximating spline. Put $d(t) = g(t) - f(t)$. Express $|d(t)|^2$, the square of its modulus, as a ~~Fourier~~ Taylor series, of which the first few terms should vanish. This should give a reasonable estimate of the square of the orthogonal error. --Lambiam 11:53, 3 October 2021 (UTC)
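One way to see the $|d(t)|^2$ idea concretely (my own sketch, not from the thread): take $g$ to be the Hermite cubic matching $f$ and $f'$ at two nearby knots, so $d = f - g$ is $O(h^4)$ and its squared modulus scales like $h^8$. The unit-circle target in the complex plane is an assumed example:

```python
import numpy as np

def hermite_cubic(f, df, a, b):
    # Complex-valued cubic g on [a, b] matching f and f' at both endpoints
    # (standard cubic Hermite basis).
    h = b - a
    fa, fb, da, db = f(a), f(b), df(a), df(b)
    def g(t):
        s = (t - a) / h
        return ((2*s**3 - 3*s**2 + 1) * fa + (s**3 - 2*s**2 + s) * h * da
                + (-2*s**3 + 3*s**2) * fb + (s**3 - s**2) * h * db)
    return g

f  = lambda t: np.exp(2j * np.pi * t)            # target: unit circle in C
df = lambda t: 2j * np.pi * np.exp(2j * np.pi * t)

errs = []
for h in (0.1, 0.05):
    g = hermite_cubic(f, df, 0.0, h)
    t = np.linspace(0.0, h, 201)
    errs.append(np.max(np.abs(f(t) - g(t)) ** 2))   # max of |d(t)|^2
print(errs[0] / errs[1])   # |d|^2 is O(h^8): ratio near 2**8 = 256
```

Since $d = O(h^4)$, halving the knot spacing shrinks $|d|^2$ by about $2^8 = 256$.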
- (I meant a Taylor series. I do that sometimes.) Yes, that makes sense. —Tamfang (talk) 14:30, 3 October 2021 (UTC)
- I guess I'll use, for the step size, the fourth root of $T/|f^{(4)}(t_i)|$, where $T$ is the tolerance; which is more or less what you said. This does not promise a minimum number of knots, but I guess the overkill ratio won't exceed 2. —Tamfang (talk) 01:29, 4 October 2021 (UTC)
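The fourth-root rule can be sketched as a greedy knot-placement loop. Everything here is illustrative: `place_knots`, the tolerance, and the circle example are my choices, and the fourth derivative is assumed to be available in closed form:

```python
import numpy as np

def place_knots(f4, t0, t1, T):
    # Greedy walk over the parameter interval: at each knot choose the step
    # h = (T / |f''''(t)|)**(1/4), the fourth-root rule described above.
    knots = [t0]
    t = t0
    while t < t1:
        m = abs(f4(t))
        h = (T / m) ** 0.25 if m > 0 else (t1 - t)  # crude division-by-zero guard
        t = min(t + h, t1)                          # clip the final step
        knots.append(t)
    return knots

# Example: on the unit circle exp(2*pi*i*t), |f''''| = (2*pi)**4 everywhere,
# so the knots come out evenly spaced.
f4 = lambda t: (2j * np.pi) ** 4 * np.exp(2j * np.pi * t)
knots = place_knots(f4, 0.0, 1.0, 1e-6)
print(len(knots))
```

With a constant fourth derivative the rule reduces to uniform spacing; its value shows up only when the curve's "stiffness" varies along the parameter.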
- I assume $t_i$ denotes the value of the parameter $t$ at the $i$-th knot. The ratio will be very close to 2. But there is a very small chance that the fourth derivative purely accidentally almost vanishes there, so consider the "overkill" a safety feature; you can use an additional safety guard against such an eventuality by using $\sqrt[4]{T/\max(|f^{(4)}(t_i)|,\,|f^{(5)}(t_i)|)}$. --Lambiam 01:53, 4 October 2021 (UTC)
- Good idea, since a double root in $f^{(4)}$ is even less likely. Or rather, the smaller of $\sqrt[4]{T/|f^{(4)}(t_i)|}$ and $\sqrt[5]{T/|f^{(5)}(t_i)|}$. —Tamfang (talk) 03:20, 4 October 2021 (UTC)
- No, the error is still also $O(h^4)$ when the fourth derivative vanishes at the starting point, so there is no basis for taking a fifth root. Also, make sure you don't divide by zero. To guard against both derivatives almost vanishing – extremely unlikely unless the curve is a straight line, but still not impossible –, include 1.5 times the previous step size in the slate of candidates (by using $h' = \min\bigl(\sqrt[4]{T/|f^{(4)}|},\; 1.5\,h\bigr)$, where $h$ is the previous step size and $h'$ is the new step size). The only division-by-zero risk then remaining is in computing the first step: don't set $h_0 = 0$. --Lambiam 08:09, 4 October 2021 (UTC)
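The combined guard, as I read it, in a few lines (the function name and the numeric examples are mine): take the fourth-root candidate, but never let the step grow by more than 1.5× the previous one, so a vanishing fourth derivative cannot blow the step up.

```python
def next_step(m4, T, h_prev):
    # m4 = |f''''| at the current knot, T = tolerance, h_prev = previous step.
    # A vanishing m4 makes the fourth-root candidate infinite; the 1.5x cap
    # on the previous step then takes over.
    candidate = (T / m4) ** 0.25 if m4 > 0 else float("inf")
    return min(candidate, 1.5 * h_prev)

print(next_step(1558.5, 1e-6, 0.004))  # derivative dominates: ~0.005
print(next_step(0.0, 1e-6, 0.004))     # derivative vanished: capped at 1.5 * 0.004
```

The remaining edge case is the very first step, which has no `h_prev`; it must be seeded with some positive value.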
- I have no idea how to do what you are asking. But I might start by reading about Gaussian quadrature and seeing if anything there looks helpful. 2601:648:8202:350:0:0:0:1598 (talk) 06:01, 3 October 2021 (UTC)