This is a newer dilemma I'm facing as I redirect myself toward a life that makes sense for me.
I was born into a religion, told to go to church every Sunday, and told that a heaven and a hell existed. The proof: The Holy Bible. I never liked reading it, although there are a lot of interesting stories and characters I actually liked, and I only ever considered Jesus the most relevant part of the book. I've defended Christianity heavily and made sense of it for a lot of people who have tried to use it as a means for their twisted agendas, but now I'm really starting to question it.
I'm not becoming an atheist or anything like that. I do have faith and I believe in a higher power, but I don't believe in a personified God with rules and guidelines to follow in order to please "him" and praise him.
I believe that we weren't necessarily given a special purpose to live; we were just created, and for that alone we should be thankful and feel "blessed". I don't think that "God" wants us to be righteous or to be murderous. We were given the right and the will to do whatever the fukk we want, which is why there is suffering in the world. There is no such thing as God having mercy on you, because "he" can't feel bad for you. "He" shouldn't even be the pronoun for "it".
Most of you already know that religion is the Santa Claus for adults, used in politics for social engineering and, of course, for financial gain, or theft as I like to call it.
Even though it has a bunch of nonsensical rules that aren't even up to date, I feel that claiming to be a Christian is a lot easier than stepping away from it, in my personal situation. My parents, my mother especially, are really religious, and as far as they know I'm still a devout Christian as well. "Spouting" my thoughts seems like an extremely dangerous idea, even more so if I decided to write a book about it. There's hardly anything more influential to a Black Christian than their god, but if along the way we gradually start to get our people detached from this foreign religion, then maybe there can be drastic changes for us in the future.
TL;DR
Should I continue to be a delusional Christian, which is easy, or should I steer away from society's religion, which could possibly jeopardize relationships? There's nothing to gain if I do.