Is it even worth teaching kids ethics anymore? In school, I mean. In my senior year of high school, which I finished up a few months ago, I was required to take a career course. One particular section was about being ethical. It made a remark to the effect of:
If you are told to do something you don't believe in, then don't do it. Even if you lose your job, at least you know you did what you believe in.
Really? My father did that. He was an electrical engineer for 38 years. He was always nice to people. He always 'showed up' bosses when they did something yucky. Now he is making 2/3 of what he used to make. His friends at Westinghouse? He helped them, and are they returning the favor? Nope.
How about the people at the last company he worked for, where he made his normal salary? Sure, that company's admin staff were fools, and he was right in a sense when he stood up to them right before the company went under. Then another company moved in and hired all the kiss-butt people... but not my father. So much for trying to be a "fair" engineer/manager.
He told me, "There is no ethics in business." He also warned me never to trust anyone and to be as ruthless as, if not more ruthless than, the people I'll work for once I'm out of college. I will do so, since I have a burning hatred for people who try to screw me over nowadays.
Does it pay to be ethical anymore? What do you think?
My opinion is clear. It is useless.