## The Theory of Relativity Whilst Traveling

9.Jun.10 at 20:38


Whenever I travel I notice some odd things happening with time. It happened again this trip, and as I was thinking about it I realized it has similarities to Einstein's Theory of Relativity, which says something about how time goes slower the faster one travels. Well, my theory has to do with how perceived time progresses at two different speeds during travel.

See, it only takes a few days of being abroad for me to start missing people that I left. Maybe it's because I generally travel alone and don't have my friends to share with anymore, but even before a week is up it feels as if it has been ages since I spoke with them. This makes for some awkward chat conversations about how the person is doing even though I said goodbye to their face four days ago!

The other side of the coin though is that the end of a trip comes up so quickly. I can see my two month trip already breezing past. I've been gone a week already!

So time travels slowly where I am not and fast where I am. It's been ages for the rest of the world and just the blink of an eye here. Though in actuality that is less the speed of time than the amount of time: more time passes where I am not and less where I am. But that gets into the weird physics of changing the standard speed of time, which I am neither smart enough nor willing enough to go into.

I'm trying to work out the mathematical formula for this phenomenon, and I think it must be something along these lines:

((D * L) / A) + A = T

Where D = Perceiver's Distance from location, L = Length of time to get there, A = Actual time passed, and T = Perceived amount of Time passed. Note, all time is in days.
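To make the numbers easy to play with, here's a quick Python sketch of the formula (the function and parameter names are mine, not anything official):

```python
def perceived_time(distance, travel_time, actual_days):
    """Perceived days elapsed: T = ((D * L) / A) + A.

    distance    -- D, the perceiver's distance from home (miles)
    travel_time -- L, length of time to get there, in days (hours / 24)
    actual_days -- A, actual days elapsed
    """
    return ((distance * travel_time) / actual_days) + actual_days

# Weekend trip from Eugene to Portland: 100 miles, 2-hour drive, 3 days away
print(round(perceived_time(100, 0.083, 3), 2))  # → 5.77
```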

This works great for a weekend trip from Eugene to Portland: ((100 * 0.083) / 3) + 3 = 5.77 days. This happens, of course, because you miss your friends having an awesome time over the weekend without you, so it feels like it's been nearly a week. (Note: the 2-hour drive time divided by 24 hours in a day gives 0.083.)

However, I'm having trouble when it comes to a trip partway across the country for a week. Say you're around 1200 miles away, as Eugene is from Denver, and it takes about 5 hours to fly between them (0.21 days). ((1200 * 0.21) / 7) + 7 = 43 days, which is a bit much. However, if we add a modifier to our division to bring things into perspective, it may help: ((D * L) / (A * 10)) + A = T. This gives us 10.6 days for the above trip. Very reasonable.
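The damped version is a one-line change on the sketch above; the factor of 10 is just the fudge factor from the paragraph, exposed as a parameter:

```python
def perceived_time_damped(distance, travel_time, actual_days, damping=10):
    """Damped perceived time: T = ((D * L) / (A * damping)) + A."""
    return ((distance * travel_time) / (actual_days * damping)) + actual_days

# Eugene to Denver: 1200 miles, ~5-hour flight (0.21 days), one week away
print(round(perceived_time_damped(1200, 0.21, 7), 1))  # → 10.6
```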

But as I'm doing this while I write, I notice a potential problem, which may not be a problem but the truth. For a trip like the one I'm on, you end up with (and this is very rough): ((6000 * 1.5) / (7 * 10)) + 7 = 135.57 days. But once I've been gone for a month, it is only perceived as 60 days. You eventually close in on a point where perceived time meets actual time: 90 days feels like 100 days. This could actually be true. However, 135 perceived days for 7 actual ones feels a bit extreme.
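That "perceived time meets actual time" claim is easy to check numerically: the gap between T and A is (D * L) / (A * 10), which shrinks as A grows. A rough sketch, using the trip's own rough numbers (6000 miles, 1.5 days of travel):

```python
def perceived_time(distance, travel_time, actual_days, damping=10):
    """Damped perceived time: T = ((D * L) / (A * damping)) + A."""
    return ((distance * travel_time) / (actual_days * damping)) + actual_days

# The gap between perceived and actual time narrows the longer I'm gone
for days in (7, 30, 90):
    t = perceived_time(6000, 1.5, days)
    print(f"{days} actual days feel like {t:.2f}")
# → 7 actual days feel like 135.57
# → 30 actual days feel like 60.00
# → 90 actual days feel like 100.00
```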

Obviously this theory needs some refining.
