Car Dealerships Are Not "Man Certified," So Why Be "Women Certified"?
More and more companies are popping up that “certify” automotive dealerships, tire dealers, collision centers, and the like to sell to women. I have been scratching my head over this one for a while now.
Have we gotten to the point where we need not just to train salespeople to sell cars to buyers, but to train them to sell to “women car buyers” specifically?
I applaud the “entrepreneurship” of it all, but I’m just not sure I’m ready to put my hands together for the effectiveness of the concept itself. Women, just like men, are not all the same, so selling cars (or anything else) comes down to asking key questions, building rapport, and the other fundamentals of the trade.
If we salespeople need to be trained and certified on “how to sell to women correctly,” why stop there? Should dealers also be certified on how to sell to gay people, African Americans, Hispanics, Asians, and Muslims?
Are women really so different that we now need to train salespeople specifically on how to sell a car to them? If I were a woman, I think I might feel a little offended. If I were a woman who wanted to be treated as an equal, I would really feel offended (I think)!
So, if I were back selling cars and a couple walked onto the lot, I would already know how important she is in the car-buying process. What would I need to be trained on, above and beyond product knowledge, treating people with respect, answering their questions, building rapport, and asking for the business?
There is also the branding part of it all. Dealerships display “women certified” logos and market to women with the promise that, when you come to our dealership, you will be greeted by (I’m guessing) a female salesperson who is not out to earn as much as she can for herself and her family? (I don’t know, just asking.)
When we had over 50 salespeople at my company, there were several women who placed in the top 10 every month. They earned big money and deserved every dime.
I guess maybe it is just the “feel” concept: women car buyers “feel” better working with a woman, even though she is on commission just as a man would be?
It has been said that women buy half the cars and influence the other half. That is not an exact stat, but I don’t think there is a car dealer out there today who doesn’t realize the incredible influence that women have.
I hope some women will chime in here and not get hung up on my sarcastic “Man Certified” title (you know how women can be… :-)