As another example, bringing a new medication to the consumer market takes years of testing. On average, it takes 5-10 years from the conception of a new drug to its release on the marketplace. During that time, researchers first do preliminary studies; then (if successful) animal studies, typically on rats; then (if successful) phase 1 trials on humans, with very small sample sizes, to establish safety; then (if successful) phase 2 trials, with larger groups and more aggressive dosing, to test efficacy; then (if successful) phase 3 trials, large-scale studies that approximate practical clinical use.
Now, we could add more to this. Maybe they could study the effects in pigs before moving from rats to humans. Sometimes they do this already, but they don't have to. Maybe we could expand phase 4, the post-marketing surveillance that follows approval, into a more robust study of particular maladies.
But the point is that we have to draw the line somewhere. Yes, if drug companies had to do 20 years of testing instead of 5-10, there would probably be fewer medications that get through and are only later discovered to cause problems like heart disease from long-term use. 30 years would be safer still. And so forth.
The goal here isn't to say that longer, more thorough testing is automatically worse; it's only to show that there is a cost. Fewer drugs will be created if we insist on a 20-year test cycle; further, people who could have benefited from a drug go without it for the extra decade it spends in trials. If you want absolute certainty that a new drug, or food, or preservative cannot possibly cause any harm, then you will never let anything out of the trial phase, because there is always something else to check for, or a longer time horizon to consider.