Representing numbers as fractions turned out to not be nearly as big a deal as the original educators thought it would be.
The future turned out to be digital, with "controlled precision" via fixed-width floating point. Fractions are this infinitely precise parlor trick that never really gets to occur in the real world, not even for people whose profession is adding together millions of fractions, because each of your numbers typically has to fit in a data structure of, say, 32 or 64 bits in width to be stored on disk. Storing numbers as fractions turns out to be a giant waste of time, because it's only more useful on rare occasions.
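To see the storage problem concretely, here's a quick Python sketch using the standard fractions module (the harmonic sum is just an illustrative example, not anything from the post above): exact fractions keep growing as you add them, while a float stays 64 bits no matter what.

```python
from fractions import Fraction
import sys

# Sum 1/1 + 1/2 + ... + 1/n exactly as fractions,
# and again approximately as 64-bit floats.
n = 50
exact = sum(Fraction(1, k) for k in range(1, n + 1))
approx = sum(1.0 / k for k in range(1, n + 1))

print(approx)                           # ~4.4992, always fits in one 64-bit float
print(exact)                            # a huge numerator/denominator pair
print(exact.denominator.bit_length())   # bits needed just for the denominator;
                                        # already more than 64 at n = 50
print(sys.float_info.dig)               # ~15 significant decimal digits per float
```

The exact answer's denominator alone blows past 64 bits after a few dozen terms, which is why fixed-width floats and a chosen precision win in practice.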
So the idea of fractions stops being useful after the most basic meme fractions: a half, a third, a quarter, a fifth, and so on, or 2/3rds, and maybe a few more. Other than that, you're gonna have to be happy with a number carried to however many digits of precision you decide is sufficient for your problem.