Standard error of regression coefficients

shaunjawline shared this question 3 weeks ago
Needs Answer

Hiii guys....

Something that would be extremely useful, but which is difficult to find even in the literature, is a way to get the standard error of the coefficients/parameters determined by regression analysis.


Say you do a physics experiment. You gather data in a table and draw a graph. From the theory it is clear that a linear model is appropriate, so you use regression to find it.

Going through the books, I have found the standard error of the slope of the line to be

ErrorSlope = sqrt( SumSquaredErrors[list1, f] / ((Length[list1] - 2) * SXX[list1]) )
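
In case it helps, here is a minimal sketch of that same formula outside GeoGebra, written in Python with NumPy; the data, variable names and the use of np.polyfit are only for illustration and are not part of any GeoGebra command:

import numpy as np

# Made-up data, purely for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

n = len(x)
slope, intercept = np.polyfit(x, y, 1)      # least-squares line y = intercept + slope*x
residuals = y - (intercept + slope * x)

sse = np.sum(residuals ** 2)                # plays the role of SumSquaredErrors[list1, f]
sxx = np.sum((x - x.mean()) ** 2)           # plays the role of SXX[list1]

se_slope = np.sqrt(sse / ((n - 2) * sxx))   # ErrorSlope from the formula above
print(f"slope = {slope:.4f} +/- {se_slope:.4f}")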


This allows students to measure and reason about errors, margins, confidence intervals, etc. in a natural way. I am sure there are methods to determine the standard errors for ANY parameter in ANY regression model (bootstrap methods if nothing else; a rough sketch follows below). Finding them is unfortunately very difficult unless you studied statistics at university level before becoming a teacher, which at least I did not. However, finding these methods and implementing them in GeoGebra is an investment well worth the effort, for it would add to the power of GeoGebra and let it do even more that other packages cannot.
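
For the general case, a rough sketch of the bootstrap idea (again Python, with made-up data and an arbitrary polynomial model chosen just to show the principle): resample the data points with replacement, refit the model many times, and take the spread of each coefficient over the refits as its standard error.

import numpy as np

rng = np.random.default_rng(0)

def bootstrap_se(x, y, degree, n_boot=2000):
    """Bootstrap standard errors for the coefficients of a polynomial fit."""
    n = len(x)
    fits = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample the points with replacement
        fits.append(np.polyfit(x[idx], y[idx], degree))
    # The standard deviation of each coefficient across the refits estimates its standard error.
    return np.std(fits, axis=0, ddof=1)

# Made-up quadratic data, only for illustration.
x = np.linspace(0.0, 5.0, 30)
y = 1.5 * x**2 - 2.0 * x + 0.5 + rng.normal(0.0, 1.0, size=x.size)

print("bootstrap SEs (highest-degree coefficient first):", bootstrap_se(x, y, degree=2))

The appeal of this approach is that the same resampling loop works for any fitting routine, which is exactly what makes it attractive for arbitrary regression models.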


I imagine (guessing, really) that since this is mostly mathematics and not so much an interface question, this project could be given "suitable for beginners" status or be assigned to a GSoC student next time round.


Physics teachers around the world will thank you for this....

Thanks!
