There was a family reunion today, and a lot of people in my family have young kids and hardly any money or insurance to pay for doctors. Someone brought up a good point: one of my town's hospitals won't take anyone unless they have insurance (and it's a Catholic hospital, no less), just to make sure the doctors get paid.
One way or another a doctor will be paid for his or her services, but the point is: isn't being a doctor about helping people and saving lives? Or is it just about keeping your patients alive so the money keeps coming in?
All I could say is that small practices tend to have doctors who actually care about their patients' lives, while the doctors in these big-building hospitals mostly seem to care about money.
So what do you guys think? Are doctors out to save lives or just to make money? Keep in mind, if doctors truly cared about saving lives, they couldn't care less about the money.