I need to rant.
Do you want to know what universities are? BUSINESSES. They aren't these wonderful places of learning where free ideas propagate and bloom, or these places of immense intellectual growth where blah blah blah. That kind of garbage belongs in the movies... it is not reality. Reality would tell you that universities are enterprises: they operate with the sole purpose of earning profits. It just so happens that their manner of earning those profits is to make you pay for your education. You give them money... they give you a piece of paper that increases your ability to get a job. Professors are their employees-- not glorious beacons of knowledge and freedom of thought, just people doing their jobs. You are the university's client, which in the industry of higher education is referred to as a student. Stop giving universities special treatment: treat them like you would any other corporation or place of business... because THAT IS WHAT THEY ARE.
Can we please just drop this overly romanticized crap about higher education? I'm sick of people insisting that universities are somehow special or are these magical, supercool places. They're businesses... nothing more... nothing less. They aren't anything like the movies told you they were. You need a reality check if you think otherwise.
Oh, and this applies to American universities. If you aren't American and don't go to an American university, then please don't tell me how "I'm wrong." I'm speaking only from what I know (which is American universities), and couldn't care less how this generalizes to other cultures. I don't mean to sound like a jerk with this statement, but I want to avoid the inevitable "Oh, well I go to a university in X country, and this isn't true here!".