Last year, I had snow tires put on my car. I never took the time to have them taken off, so here it is getting close to winter again, and I am wondering if my tires will be OK for the winter weather ahead. Is it absolutely necessary to take off snow tires every year? This is hard for me, as I have to lift them into the car myself, and then there is the added cost of having them put on and balanced. I am on disability, and it is difficult for me to lift anything over 10 pounds, let alone tires. Please advise.
I would not advise leaving snow tires on all year; in some states it is even illegal. The problem is that snow tires do not have good warm-weather traction. They are made for gripping snow, not smooth pavement, and their softer rubber wears down quickly on warm roads. Really, they should only be used in snowy conditions. So if you live in an area where it only snows occasionally, I would not put them on until they are absolutely needed.
Bob, The Auto Answer Man