With regard to the human eye, you guys are typically correct! Think of this mathematically (which tends to be easier for optics / lens problems). The focal length of a typical human eye is around 2 cm.
Now, let's say I'm looking at an object that is 20 m away (2000 cm). If we plug into our trusty lens equation (1/f = 1/i + 1/o), we get (1 / 2 cm) = (1 / i) + (1 / 2000 cm). We certainly could calculate this out exactly, but 1/2000 is an extremely small number. This relates to what many passages call a "distant object," which is an object so far away that the "1/o" term can be approximated as 0. For the human eye and its fairly tiny focal length, many (if not most) objects that we look at can be considered "distant."
Now that we've approximated 1/(2000 cm) as 0, the equation becomes 1/(2 cm) = 1/i, or more generally 1/f = 1/i, which simplifies to i = f. Since we want the image to form on the retina, this tells us that the focal length can be approximated as the distance from the lens to the retina (in other words, equal to the image distance).
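If you want to sanity-check the approximation numerically, here's a quick sketch using the numbers from the example above (f = 2 cm, o = 2000 cm):

```python
# Thin lens equation: 1/f = 1/i + 1/o
f = 2.0      # focal length of the eye (cm)
o = 2000.0   # object distance (cm)

# Exact image distance, solving the lens equation for i
i_exact = 1 / (1/f - 1/o)

# "Distant object" approximation: 1/o ~ 0, so i ~ f
i_approx = f

print(i_exact)   # about 2.002 cm
print(i_approx)  # 2.0 cm
```

The exact image distance differs from the focal length by only about 0.1%, which is why treating a 20 m object as "distant" is safe for the eye.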
Is this always correct? Not exactly. For one, if you're dealing with lenses other than the eye, the focal length may be much larger, so there's no guarantee that the object distance will be significantly larger than it, and you'd likely have to carry out the full calculation. Additionally, always keep in mind the conditions that prevent the image from landing on the retina in the first place: hyperopia (farsightedness) and myopia (nearsightedness).
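To see how the i ≈ f shortcut degrades when the focal length isn't tiny relative to the object distance, here's an illustrative sketch (the 50 cm focal length is a hypothetical camera-style lens, not from the example above):

```python
def image_distance(f, o):
    """Exact image distance from the thin lens equation 1/f = 1/i + 1/o."""
    return 1 / (1/f - 1/o)

f = 50.0  # hypothetical lens with a large focal length (cm)
for o in (100000.0, 1000.0, 200.0, 100.0):
    i = image_distance(f, o)
    err = abs(i - f) / i  # relative error of approximating i by f
    print(f"o = {o:>8.0f} cm -> i = {i:7.2f} cm, error of i ~ f: {err:.1%}")
```

As the object distance shrinks toward the focal length, the "distant object" approximation goes from a fraction of a percent off to wildly wrong, which is exactly why it only works when o >> f.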