do auto insurance companies, in general, make money on auto accidents? if so, is it to the point that they would want drivers to have more accidents?
i've been considering this question for a while, and i assume it's true, given how often we (us americans) hear and see auto insurance ads. i usually assume that whenever money changes hands in any non-clandestine way (i.e., anything other than cash between private citizens), the individual loses out to big business...
would the mechanism of profit be raising an individual's rates after an accident? or is there some other way?
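to make that rate-increase mechanism concrete, here's a rough back-of-the-envelope sketch in python. every number in it is made up purely to frame the arithmetic i'm asking about, not pulled from any real insurer:

```python
# hypothetical "surcharge recovers the payout" arithmetic
# all figures below are made-up illustration values, not real insurer data

claim_payout = 8000      # hypothetical cost the insurer pays out for one at-fault accident
base_premium = 1500      # hypothetical annual premium before the accident
surcharge_rate = 0.40    # hypothetical 40% rate increase applied after the accident
surcharge_years = 3      # hypothetical number of years the surcharge stays on the policy

# extra premium the insurer collects from the surcharge alone
extra_premium = base_premium * surcharge_rate * surcharge_years

print(f"extra premium collected over {surcharge_years} years: ${extra_premium:,.0f}")
print(f"claim payout for the accident:                  ${claim_payout:,.0f}")
print(f"net to the insurer on this one accident:        ${extra_premium - claim_payout:,.0f}")
```

with these made-up numbers the surcharge alone ($1,800) doesn't come close to covering the payout ($8,000), which is basically what i'm trying to figure out: whether real surcharges are big enough or last long enough to flip that sign.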
this is like preventing your car from driving you to the bank so you can't rob it