Have you ever thought about this: since you were old enough to walk, you were told not to touch or play with your genitalia, that it was bad. This has gone on all through the ages. Folks, this is not right. Your penis or your vagina is part of your organs of reproduction, the same way your ears are part of your ability to hear and your eyes to see. You don't keep your eyes or ears covered. They aren't anything to be ashamed of, so why are your genitals so taboo? Guys show off their biceps and are proud of them; if I had a 10 inch pecker I would be very proud of it and would want the whole world to know it. I just think it's wrong to teach children not to acknowledge what they were born with. If you ask some folks why they think that way, they say that God said so. Well, didn't God make us as we are in the beginning? And if He thinks it's bad, why didn't He change the pattern and do something He thought was right?
These are just my ramblings, but I would like to hear what others think.
Most of the people on the planet know that people fuck. It isn't any real secret. Almost everyone agrees that fucking is enjoyable.
But even on a lot of nude beaches, guys are not supposed to get erections. Hey, world, hard-ons happen. The world doesn't end if someone sees a hard cock.
I think that bare tits should be normal. Just like guys can go shirtless, so should women, and maybe the fascination with tits and nipples would be lessened.
Same with cocks and vaginas. I think we would all be better off if they were normalized.