Never forget that doctors work for the insurance company, not the hospital. Healthcare in the US is a fuckin joke and we are the punchline.
In this case, this is a staff doctor: a person whose job is to review the medical necessity of procedures. So yeah, she literally works for the insurance company.
Also, doctors in hospitals are in antagonistic roles with insurers unless they work for an insurance-owned hospital (god, I wish that was a joke). I actually don't know what you're talking about here.
All of it. Our medical system needs a massive reevaluation: how we run it, how it generates money, who can benefit from the revenue, and who handles disciplining staff when issues arise. So all of it.
Why did I look that up?
Apparently not only are there insurance companies that own hospitals, but there are also hospitals creating their own insurance plans 🫠
One of my family members works for a major health insurance company and has had a similar experience. The people doing the reviews are understaffed, and the people with close medical knowledge of the patient's current situation don't actually make the final decisions. They give recommendations to the few overworked doctors, who are given ridiculously short deadlines and arbitrary pass/fail rules on whether to cover or not.