
Originally Posted by a completely inoffensive name
I just explained this. College is when most people actually start thinking about what kind of person they will be, not about what they want to learn. So they treat college as an orgy of experiences and "life learning" instead of a place to learn about the world around them. Unis market themselves in whatever way will get them more money. It just so happens that marketing themselves as a place to learn about yourself and obtain experiences gets them a lot more money than marketing themselves as a place of liberal education.