I don’t mean this to sound rude or snarky, it’s genuine curiosity… but why do so many people do a health class with their kids? I’ve seen it everywhere from elementary up through high school. I feel like that’s just part of normal, everyday life stuff you’d teach your kids anyway. What do these books cover that kids wouldn’t learn just through life?