It's not new, I know. But I can't help wondering what has happened to responsibility. Everything is someone else's fault, or someone else's job. You don't have to work for your money; someone will give it to you. You don't have to be a good parent; someone else will clean up your mess.
What are we teaching people?
It struck me the other day that we are teaching people to be dumb. Yesterday I heard a commercial for one of those GPS devices that sits in your car and gives you directions. Don't even bother learning which way is north, or how to read a map. Just tell the magic box where you want to go and it will give you directions. Then there's that "OnStar" thing in cars. If you can't remember when to change your oil, they will tell you. Lock your keys inside, and they will unlock it for you.
Involved in an auto accident? They will call the police and give them your location; you don't even have to know where you are!
Whatever happened to taking care of yourself?