In your opinion:
What should schools teach that they don't?
Edit: I agree with many, many of you.
HOWEVER, a lot of you listed things that should be taught at home. Need to learn how to cook? Manage finances? Manners? These things should be learned at home, with schools building on what was already taught there. It's not the school's job to raise your child.