What's the point of going outside? I know nature is supposed to be amazing and everything, and sure, trees and flowers are beautiful, but are they more beautiful than a bed and a cozy down comforter?
One of the hard things about living abroad in a beautiful country is that when you're feeling down or depressed, you're like, shouldn't I be happy? I thought all my problems went away when I moved to a different country.
You mean they fucking came with me? I'm going to have to do work on myself? Like in therapy? I can't just look outside at the sun shining in my backyard and be made whole. I'm in a different country, for Christ's sake. Doesn't that mean anything?