The History of the Pregnancy Test

If you go as far back as ancient Greece or Egypt, you’ll find evidence of attempts to predict whether a woman was pregnant. Ancient Egyptians used a rather peculiar method, watering bags of wheat and barley seeds with the urine of a woman believed to be pregnant. If the seeds germinated, she was thought to be with child. Hippocrates believed a woman could test for pregnancy by drinking a solution of water and honey, while others tried uroscopy, which relied heavily on visual diagnosis of the urine’s color.

Eventually, scientists discovered that the presence of human chorionic gonadotropin (hCG) was an indicator of pregnancy, and they began to engineer tests that looked specifically for hCG in urine. This work also overturned the earlier assumption that hCG was generated by the pituitary gland, proving that it is actually produced by the placenta.

A hormonal test introduced in the 1970s could induce menstruation in non-pregnant women over the course of a few days. The test was highly accurate, but it involved quite a bit of complexity.

Today, most women get their first indication of pregnancy from a test taken at home. These small sticks absorb a bit of urine and react to the presence of hCG. Accuracy is relatively high, at around 97%, though improper use can lower it to as little as 75%.

Today’s pregnancy tests are designed to be far simpler than those earlier methods, and they come with specific instructions on when to use them. Newer tests even attempt to measure the concentration of hCG in the urine, which indicates how far along someone might be.