A parsec is a measure of distance. But, you are thinking, Nicole, don’t you astronomers already use the light year as a measure of distance? A light year is the distance that light travels in one year, so it is a convenient measure of distance to explain. (It’s about 5,878,630,000,000 miles, if you are curious.) However, astronomers actually use a unit of distance that is tied more directly to how we measure distances in the first place.
The name “parsec” is pretty self-explanatory once you know what two jargon-y words it’s made of: parallax and arcsecond. It is defined as the distance at which an object, seen from Earth, would show a parallax of one arcsecond.
Hold out your finger at arm’s length. No, really, do it. I’ll wait. I won’t laugh! Promise. Okay, now close one eye and note what your finger is in front of. Now, open that eye and close the other. Notice a difference? Compared to some background object, your finger has appeared to move, no matter how still you keep it. That is essentially parallax.
Now, take this system on a grander scale. Each one of your eyes was a point of view. Replace that with the orbit of the Earth, with one point of view 180 degrees around the orbit from the other. Your finger is now some star to which you would like to know the distance. The background of your room is the background of more distant stars and galaxies. A parallax observer takes pictures of their target star six months apart and measures the change in position against the background of stars that are too distant to show this effect. From that measurement, they can tell the distance thanks to a little geometry.
This picture is very much NOT to scale since the distance to any star is much, much greater than the size of the Earth’s orbit. So, we can use some handy approximations to calculate the distance. See that right triangle? We can pull out from our trigonometry background that
tan(p) = (distance from Earth to Sun) / (distance from Sun to star).
But for a tiny angle, tan(p) is essentially equal to p itself (measured in radians) — the “small angle approximation” — and using the average distance from the Earth to the Sun as our baseline (the Astronomical Unit, AU, approximately 93 million miles), we get
p = 1 AU / d
Measure p in arcseconds, pick just the right unit for d (this is exactly how the parsec is defined), and after a little switching around we get just
d = 1 / p
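That whole relation fits in a couple of lines of Python. Here’s a quick sketch — the 0.768-arcsecond parallax for the nearest star is an approximate value I’m supplying for illustration, not something measured in this post:

```python
def parallax_to_distance(p_arcsec):
    """Distance in parsecs from a parallax in arcseconds: d = 1 / p."""
    return 1.0 / p_arcsec

# A star with a parallax of exactly one arcsecond is, by definition,
# one parsec away.
print(parallax_to_distance(1.0))

# The nearest star's parallax is roughly 0.768 arcseconds (approximate),
# which lands near the 1.3 parsecs quoted below.
print(round(parallax_to_distance(0.768), 2))
```

Notice the inverse relationship: the *smaller* the parallax, the *farther* the star.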
The unit of measure for p is arcseconds, which is a tiny, tiny angle. If you break a circle into 360 wedges, each one is one degree. If you break that wedge into 60 wedges, each is an arcminute. Split one of those up 60 ways, and you get an arcsecond. Another way to see how tiny that is… hold up your index finger at arm’s length again. The width of your finger is about 1 degree. So an arcsecond is 1/3600 the width of your finger. Need I say… TINY! A star that shows a parallax of one arcsecond is, by definition, one parsec away.
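That tininess is exactly why the small angle approximation earlier is safe: at one arcsecond, tan(p) and p agree to better than one part in a billion. A quick numerical check, as a sketch:

```python
import math

# One arcsecond in radians: 1/3600 of a degree.
p = (1.0 / 3600.0) * math.pi / 180.0

# Relative error of the small-angle approximation tan(p) ~ p.
rel_err = abs(math.tan(p) - p) / p
print(rel_err < 1e-9)   # the approximation is excellent at this angle
```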
So how far is a parsec really? If a one-arcsecond measurement is so tiny, then a distance of one parsec must be really far away. And it is, on human scales: even the nearest star is 1.3 parsecs away. The center of our Milky Way galaxy is almost 8000 parsecs away, or 8 kiloparsecs*. Cygnus A, one of my favorite radio galaxies, is 230 million parsecs (or 230 megaparsecs) away. Yeah, this stuff is FAR!
How do parsecs compare to the more-easily-explained light year? Well, one parsec is approximately 3.26 light years. If you ask me the distance to some celestial object, if I know it at all, I probably know it in parsecs, and will make a quick and dirty calculation in my head to light years before answering. And like its cousin the light year, the parsec has been mistaken for a measure of time.
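That quick and dirty calculation is just one multiplication. Applied to the distances above (same back-of-the-envelope numbers as in the text):

```python
PC_TO_LY = 3.26  # approximate light years per parsec

distances_pc = {
    "nearest star": 1.3,
    "Milky Way center": 8000,
    "Cygnus A": 230e6,
}

# Convert each distance from parsecs to light years.
for name, d_pc in distances_pc.items():
    print(f"{name}: about {d_pc * PC_TO_LY:,.0f} light years")
```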
Now, before you go off on your own parsec-scale adventure, check out the newest Carnival of Space #145 at Crowlspace.
Have an astrophysics jargon suggestion? Email me, and I’ll try and include it!
*Edit: I should be a little more careful with my significant figures. We don’t know the distance all that accurately! Thanks @leifb