A while ago it was suggested that somehow, North American football "encouraged" violence against women. I questioned that assertion. As there have been no replies to my question, I'll take that as meaning everyone agrees that football does nothing of the sort. Thanks for the affirmation!

David Kehler
Information Systems Technologist
Canadian Centre on Disability Studies
Web Site: http://www.escape.ca/~ccds/
E-mail: [log in to unmask]