Monday, April 30, 2018

gravity - What's the difference between proper acceleration and coordinate acceleration?


I'm not a physics student, but I need to understand these concepts in order to explain what exactly an accelerometer measures and does not measure, and why. Please do not explain what an accelerometer does: that isn't the direct purpose of this question. I'm not looking for all the mathematical equations so much as for intuition, but at the same time I'm not looking for oversimplifications.




If I understood correctly, proper acceleration is measured in units that are multiples of $g = 9.81\ \mathrm{m/s^2}$. It is not the same thing as coordinate acceleration, which depends on the choice of coordinate system (according to this Wikipedia article).
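To check my own reading, here is a minimal numerical sketch of what I think "coordinate acceleration depends on the coordinate system" means; the free-falling-elevator frame is my own made-up example, so please correct me if this is not the intended sense:

```python
import numpy as np

g = 9.81  # m/s^2

t = np.linspace(0.0, 2.0, 201)  # time samples, in seconds

# Height of a dropped object in the ground frame: released from rest.
x_ground = -0.5 * g * t**2

# The same object described from a frame that is itself in free fall,
# released at the same moment (a freely falling elevator): the object
# never moves relative to this frame.
x_falling_frame = x_ground - (-0.5 * g * t**2)

# Coordinate acceleration = second time derivative of position,
# computed numerically in each frame.
a_ground = np.gradient(np.gradient(x_ground, t), t)
a_falling = np.gradient(np.gradient(x_falling_frame, t), t)

print(a_ground[100])   # ~ -9.81: this frame says the object accelerates
print(a_falling[100])  # ~  0.0 : this frame says it does not
```

Same object, same motion, two different coordinate accelerations, which is what I understand by "frame-dependent".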




  1. Why does proper acceleration not depend on a coordinate system, while coordinate acceleration does?



    I understood that proper acceleration is measured from an inertial frame of reference, i.e. a frame of reference which is not accelerating. Is this why proper acceleration does not depend on a coordinate system, where "coordinate system" here really means a frame of reference?


    If that is the case, then I suppose coordinate acceleration is the acceleration measured from an arbitrary coordinate system (frame of reference).



  2. Would coordinate acceleration and proper acceleration be the same if coordinate acceleration were measured in an inertial frame of reference? (I try to make this concrete with a short derivation just below.)
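To make that sub-question concrete (assuming the Newtonian picture, and assuming that an accelerometer reads the non-gravitational force per unit mass, a definition I keep running into while reading): Newton's second law in an inertial frame gives

$$ m\,\vec a_{\text{coord}} = m\vec g + \vec F_{\text{other}} \quad\Longrightarrow\quad \vec a_{\text{proper}} \equiv \frac{\vec F_{\text{other}}}{m} = \vec a_{\text{coord}} - \vec g. $$

In free fall $\vec F_{\text{other}} = 0$, so the reading is $0g$ even though $\vec a_{\text{coord}} = \vec g$. If I read this right, the two need not agree even in an inertial frame once gravity is present, and they would only coincide where $\vec g = 0$.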




Apparently, gravity does not cause proper acceleration, since an accelerometer would measure $0g$ in free fall, i.e. when the only force acting upon the accelerometer is gravity (in case we consider gravity a force at all).


I think there's a misunderstanding here between what an accelerometer does and what proper acceleration actually is. I believe the accelerometer would detect $0g$ because it subtracts the acceleration due to gravity in its calculations...


Furthermore, an accelerometer would apparently detect $1g$ (upwards) on the surface of the Earth, because the surface of the Earth is pushing the accelerometer upwards.
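Putting the free-fall case and the surface case side by side, here is a tiny sketch under the same assumption as above (that the device reads specific force, i.e. coordinate acceleration minus gravitational acceleration; the function name is my own):

```python
import numpy as np

g = np.array([0.0, 0.0, -9.81])  # gravitational acceleration at the surface, m/s^2

def accelerometer_reading(a_coord):
    # My working assumption: the device reads the specific force,
    # i.e. the non-gravitational force per unit mass.
    return a_coord - g

# Free fall: the coordinate acceleration is exactly g.
print(accelerometer_reading(np.array([0.0, 0.0, -9.81])))  # [0. 0. 0.]   -> 0g

# At rest on the ground: the coordinate acceleration is zero.
print(accelerometer_reading(np.array([0.0, 0.0, 0.0])))    # [0. 0. 9.81] -> 1g, upwards
```

On this reading, the $1g$ at rest comes entirely from the normal force of the ground, which matches the "surface pushing the accelerometer upwards" explanation rather than any subtraction performed inside the device.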





  3. Why exactly does an accelerometer measure $1g$ on the surface of the Earth?




  4. Coordinate acceleration, from what I've seen, seems to be defined simply as the rate of change of velocity. Why isn't proper acceleration also defined as a rate of change of velocity?




  5. What is the fundamental difference between coordinate acceleration and proper acceleration (ideally without referring to an accelerometer, to avoid a circular definition, which would only cause confusion again)?
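For context, the one formula I did come across while reading, for one-dimensional motion in special relativity (I state it only in case it helps with questions 4 and 5, and I may be misreading it): the proper acceleration $\alpha$ and the coordinate acceleration $a = dv/dt$ measured in an inertial frame are said to be related by

$$ \alpha = \gamma^3 a, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, $$

so at everyday speeds ($\gamma \approx 1$) the two would coincide, which might be why the distinction never came up for me before.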






