I have never believed in signs. I enjoy reading my horoscope, and a friend recently got me into reading Tarot, but those are all about narrative lenses: How do these arbitrary but supposedly specific sets of insights wrap around my story and reveal the parts that are obvious and the parts that are hidden?
But signs? The universe calling out just to you to tell you what to do, speaking through — what? Books and Instagram and a Broadway musical?