Time is defined by the distance (in time) between two specific events... be this a vibration, a tick, or the emission of a neutron. When every known/measurable process experiences this effect, I'd certainly call that time dilation.
Persol, if you could bear with me a little on this, I will try to explain why I see a very important distinction between the time dilation and atomic slowing perspectives.
If we assume time dilation in the classical Einstein/Minkowski sense, this requires that time is not absolute, and that the NOW of our object is not uniquely definable.
If we assume just atomic slowing, then time can be deemed absolute and the NOW becomes uniform for all objects universally.
By taking the Einstein approach we have to declare absolute time obsolete, but by taking a simple atomic slowing approach, absolute time remains a valid proposition.
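(For reference on why the Einstein/Minkowski picture gives up a universal NOW: in standard SR the time coordinate mixes with position under a change of inertial frame. The textbook Lorentz transformation, for frames in relative motion at speed v along x, is quoted here only to make the contrast explicit, not as part of my argument:)

$$t' = \gamma\left(t - \frac{v x}{c^2}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}$$

Because t' depends on x as well as t, two events with the same t but different x in one frame generally get different t' in another frame, so there is no frame-independent NOW in that picture. The atomic slowing view, as I understand it, keeps one universal t and treats the rate change as a physical effect on the clock itself.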
If my clock were governed by the temperature of a glass of water placed outside, its tick rate would obviously fluctuate due to changes in the atomic rate of the water. Is this time dilation in the Einsteinian SR sense? I would suggest not.
If the velocity of a massive object causes its atomic rate to slow down, then why would this be any different to the temperature of my water?
We have enough data, I guess, to know that atomic slowing occurs with altitude [gravity] and velocity. I won't argue the validity of that data and will assume it is reasonably correct; however, I would argue against the notion that time slows when in fact it is only the atomic rate that slows.
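(For reference, the standard formulas those altitude and velocity measurements are usually compared against are the textbook SR and weak-field GR expressions, stated here just so the effects being discussed are explicit; v is the clock's speed and φ is the Newtonian gravitational potential, which is negative near a mass:)

$$\Delta\tau_{\text{moving}} = \Delta t\,\sqrt{1 - \frac{v^2}{c^2}}, \qquad \frac{d\tau}{dt} \approx 1 + \frac{\phi}{c^2}$$

In the often-quoted GPS case these work out to roughly +45 μs/day from the gravitational term (satellite clocks tick faster at altitude) and about -7 μs/day from the velocity term, for a net of roughly +38 μs/day, which is what the on-board clocks are observed (and pre-corrected) to do. Whether one then calls this "time dilating" or "atomic rates slowing" is exactly the interpretive question I'm raising.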
If one takes the atomic slowing approach, nearly 99% of the complications SR has created, all the thought experiments etc., are no longer needed.
The student of physics doesn't have to bend over backwards to accommodate a theory that is so counter to common sense or intuition that it gives people a big ache in the head.
OK... so maybe I should ask:
Why should I accept that time dilation exists when simple atomic slowing explains it more easily and more logically?