Logic itself is the simplest of things, as it is impossible for anything to be more fundamental. Nothing could exist at or be reduced to a more foundational level. Nevertheless, the information that must be examined by the illumination of logic can be vast, complex, and seemingly daunting. It is the intersection of reason's simplicity and the potentially complicated nature of appraising miscellaneous information that makes reality far easier to understand than many will admit while also making it more difficult to understand than many others will accept.
One grasps logic directly and can wield it at any time. In this sense, nothing is more accessible to every willing person. This, however, does not mean that arriving at correct conclusions about every subject is an inherently easy matter. On the contrary, there is a great volume of information--some of it pertaining to the senses, some pertaining to introspection, and some pertaining strictly to logic--that has to be dissected with reason if one wishes to have knowledge about as much of reality as possible.
Even discovering logical truths is not always an immediate process, for it might involve identifying and subsequently rejecting numerous non sequitur errors. To prove that one is not dreaming during a given moment [1], for example, might ultimately be far simpler than many consider it to be despite the immense gravity of the matter (and despite the fact that only a very small number of people even understand/discover how to do this), and yet many issues must be contemplated to some extent before one could be in a position to prove something so specific and important. The topic is therefore thoroughly nuanced.
Of course, few issues have a degree of complexity or depth comparable to that of proving that one is not dreaming, yet all issues have their relatively basic and complex dimensions. It is still true that the purely syllogistic parts of proofs are often extraordinarily easy to grasp if one is willing, while discovering if various starting premises are actually correct, as well as sorting through the many relevant details that might need to be considered, can require time, precision, and careful reflection.
Many aspects of epistemology are quite elementary thanks to the irreducible simplicity of reason, although their sophistication is often overhyped, while many other aspects are complicated regardless; neither truth negates or conflicts with the other. It is this paradoxical nuance that few are willing to approach. Understanding reality with any level of thoroughness takes effort, after all, but the person who clings to reason will find that the task is far easier than it may otherwise seem.
[1]. https://thechristianrationalist.blogspot.com/2017/07/dreams-and-consciousness.html
Thursday, June 27, 2019
Wednesday, June 26, 2019
Employees First
Companies cannot survive without consumers, but they also cannot survive without employees working on their behalf. Although both are necessary for the continued success of a firm, it is common for people to hear about the idea of putting consumers ahead of employees, and many people might even say that they prefer for their favorite companies to take this stance. Even so, putting consumers first is, ironically, not the best way to ensure that they are satisfied!
A focus on customer satisfaction above employee satisfaction might lead to consumers being catered to even when they are deceptive or gratuitously difficult in a general sense. In order to please all customers, a firm must often sacrifice some degree of employee satisfaction, as clashes between the two groups might occur from time to time, with workers being the second priority in such scenarios. However, dissatisfied employees hinder the goal of effectively impacting consumers.
When employees are treated well and provided with incentives to invest time and effort into a company, consumers will likely benefit, as satisfied employees are more likely to please customers. After all, a satisfied employee can approach consumer relations with a vitality and eagerness lacking in employees who are disappointed, or perhaps even embittered, with their corporate lives. Anyone who has worked in a firm that thoroughly emphasizes employee wellbeing knows that this approach can have an empowering effect that benefits internal and external stakeholders alike.
Putting employees first is an easy and potent way of minimizing turnover rates and augmenting productivity, innovation, and the quality of corporate cultures--in other words, it is an easy way to most effectively utilize human capital (employees and their skills). Creating such a work environment can be as simple as making the atmosphere of the company a more pleasant place to be, though additional measures and incentives are always optimal. In contrast, a work environment where employees know pleasing consumers/clients is the primary goal, no matter how incompetent or difficult they are, might provoke a disdain for the customers in question.
The customer is not always right ("the customer is always right" is a motto that would never survive in a rationalistic world!), but simply conceding this does not go as far as a business needs to in order to have the right set of priorities. A company that has sound priorities will focus on pleasing its workers before it tries to please its consumers. It is certainly possible to please the people in both categories at once, but it is far easier to do so when employees are rewarded, motivated, and content.
Tuesday, June 25, 2019
The Utility Of Science
In an era where scientific progress has affected numerous aspects of human life, it is rare to find an honest acknowledgment of the epistemic uselessness of science. For all of its advances, science cannot prove the existence of any particular thing: it cannot even establish that the matter observed in empirical study exists to begin with, for only logic can accomplish such a feat [1]. As the example of electricity demonstrates (not that one could not reason this out without an example), there is no benefit to ignoring science and there is still much to gain from exploring it, even though that benefit is practical rather than epistemological.
Even a cursory examination of electricity quickly affirms basic facts about the limitations of the scientific method. Does electricity prove anything about the external world beyond human perceptions? Of course not. Does it reveal anything significant about epistemology in itself? Of course not. Do any of these facts mean it is not convenient and useful? Certainly not! As with scientific information in general, information about electricity is far from having no application.
Despite not having any use higher than mere convenience, electricity has still facilitated transportation, improved standards of living, and revolutionized entertainment. It therefore serves as an example not only of the utility of science, but also of how it would be asinine to completely ignore science simply because it is inferior to logic in every epistemological regard. If science is entirely disregarded on even a practical level by an entire society, that society will forfeit whatever potential convenience could have been enjoyed, making various inconveniences gratuitous and avoidable.
Anyone who thinks that science is capable of proving anything more than the nature of mere perceptions is shackled to assumptions, and false ones at that. At the same time, it would be foolish to think that science has no usefulness at all--I wouldn't even be able to write this article were it not for technological advances brought about by science. As long as utility is not confused for anything more, there is nothing problematic about emphasizing it. Too many mistake the convenience of utility for an indicator of philosophical illumination, however, with science being incapable of providing any ultimate benefits beyond the practical.
[1]. https://thechristianrationalist.blogspot.com/2018/07/science-and-external-world.html
Monday, June 24, 2019
The Cancerous Growth Of Tolerance
It is quite common for people of various worldviews to pretend like there is some surprising deficit of "respect" for the average person in the current state of affairs, as if contemporary Western society is not one of the most tolerant cultures of all time. On the contrary, people are often expected to respect even the most imbecilic of ideas and those who propose them--except for unpopular truths, of course. The cancerous growth of tolerance has threatened the majority of the Western world.
In light of the tolerance that is entrenched in Western culture, many people vehemently dislike the notion that moral superiority is metaphysical superiority: that is, if truth has value, the person who pursues it and lives in accordance with their moral obligations has more value than someone who does not. Many seem to automatically reject whatever a person is about to say at the mere suggestion that people are not equal in value if moral obligations exist, for people behave differently enough to not share the same moral status. It only takes a few moments, however, to realize that this is necessarily true.
The average person does not deserve any special degree of respect. No one who clings to irrationality deserves any respect beyond that which is necessary to not mistreat them. In other words, they deserve only to have their baseline human rights acknowledged and to not be treated unjustly. No one deserves anything more by default; anything beyond this is earned on individualistic grounds by one's concern for truth, soundness in exercising rationality, and commitment to justice, which is the core of morality.
One of the most dangerous ideas that a society could ever absorb is the notion that someone who lives in opposition to rationality and morality is just as valuable as someone who lives for genuine rationality and morality. It may be true that simply being human grants each person some degree of intrinsic worth, but it is still true that many people are fundamentally unequal in an intellectual and moral sense, and thus not everyone can legitimately claim to deserve the same social treatment.
Of course, even legitimate intolerance is often demonized by the masses, although there is no arrogance in intolerance if it is directed towards deserving people. Every society has three options regarding how it handles tolerance: it can tolerate everything and allow itself to dissolve due to a lack of rigidity, be intolerant of intolerance itself and cling to an inherent contradiction, or be intolerant towards irrationality and injustice and, in doing so, align itself with reality. Only the third option is in accordance with reason.
Sunday, June 23, 2019
Movie Review--Child's Play (2019)
"You're my best buddy. I just want you to be happy."
--Chucky, Child's Play (2019)
Earlier this year, Orion Pictures ironically released The Prodigy, an unrelated movie that borrows a key idea from the first Child's Play; now the studio has released a reboot of the Child's Play series that is far more relevant to the modern world than the initial 1988 film. The 2019 reboot is a smart update that improves upon the original, providing plenty of graphic kills that take definite advantage of the R rating, although the humor is very hit or miss. There may be backlash, considering that the original series is still ongoing, but this newer take on Chucky is able to stand on its own as a unique approach to a classic horror icon.
Production Values
Much of the effects work, including CGI and animatronics, is great, particularly with Chucky--and there are some striking shots that contrast blue and red, the two alternating colors of his eyes. Some of the best scenes feature Bear McCreary's fitting soundtrack, which was reportedly played on toy instruments, and the sound as a whole is stellar. Of course, the central character succeeds or falls due to both audio and visual factors, and fans of the original have no reason for alarm.
The most important character in Child's Play, clearly, is Chucky himself, and Mark Hamill does an excellent job voicing him, delivering a vocal performance so superb that even fans of the original who despise the very thought of a reboot might easily be won over. Combined, Hamill's efforts and the script have a synergistic effect that makes Chucky an antagonist who might even come across as sympathetic to some viewers--something that I have never heard of anyone saying about the original version of the character.
The acting on the parts of Aubrey Plaza and Gabriel Bateman is admirable despite falling in the shadow of Hamill's role. Their relationship as characters is explored well, even though there is nothing particularly special about either character in terms of complexity. Other than Chucky, none of the characters are very unique, and moviegoers have likely seen many similar characters throughout the years.
Even so, the performances are always enough to carry the story even through its weaker parts. The middle of the movie contains the more generic scenes, some of which involve gratuitous comedy, though there is still an air of general competence around everything. It is the beginning and the third act, rather, that stand out as having the best scenes. The last 20 minutes showcase a spectacular set piece that leads to the defeat of Chucky. The scale and plausibility of the finale easily exceed the comparatively cheesy ending of the original.
Story
Spoilers!
A worker in a Vietnamese factory affiliated with a company called Kaslan, seemingly minutes away from being forced back onto the streets, turns off the safety measures for the final Buddi doll he assembles as a form of retaliation against an abusive manager. The doll is recognized as defective by a purchaser, who returns the item. The shift worker who oversees the return takes the doll home as a birthday present for her son Andy. Naming itself Chucky, Andy's new doll integrates with Kaslan products around the apartment, including a TV. It becomes clear that Chucky has either been programmed abnormally or suffers from a glitch, but he still bonds with Andy, the latter adoring the relationship until Chucky starts displaying violent tendencies in an effort to protect Andy.
Intellectual Content
In a very appropriate move, the script changes the nature of Chucky: this incarnation of the character is not a serial killer's consciousness inhabiting an otherwise inanimate doll, but he is instead an artificial intelligence without its safety programming activated. Child's Play addresses a concern some have about the increasingly prominent status of technology in the modern world while, at the same time, never treating technology as intrinsically negative.
Thankfully, no character complains about technology being inherently dangerous or destructive using slippery slope fallacies, and the film even touches upon issues like corporate profit being derived from exploitative working conditions in non-Western countries--something that is not brought up in many mainstream releases. This isn't the first time the potential for AI-induced disaster has been portrayed onscreen (Terminator and Age of Ultron being notable examples), but it certainly isn't executed poorly.
Conclusion
The reboot of Child's Play deserves attention from fans of classic horror films like the original (even if it has not necessarily aged well) and fans of technological dystopia films alike, as it handles both its horror and cautionary elements well. Some of the humor may not land, but the violence is grand and the story has been revised enough to be overtly distinct from the original's narrative. Child's Play could easily become remembered as one of the best horror reboots thus far.
Content:
1. Violence: Some of Chucky's kills are fairly brutal. Though the worst violence always occurs offscreen, there are scenes with plenty of blood.
2. Profanity: "Fucking" and "shit" are used throughout the movie, sometimes for comedic effect.
3. Sexuality: In one scene, a voyeur watches a woman partially disrobe for a bath or shower without her consent and clearly shows sexual interest.
Friday, June 21, 2019
The Digital World
Although often viewed as useful merely for entertainment or professional purposes, digital worlds are also useful for illustrating a key truth about scientific matters. In this context, something is "digital" if it occurs within a plane that exists due to electronics, such as webpages on the Internet or the contents of a video game. Computer science, thanks to the nature of digital environments, exemplifies how scientific laws are not constant by necessity, while the laws of logic are.
The only intrinsic constraints on a digital world are the inviolable laws of logic: scientific laws can be retained, modified, or removed from a virtual landscape by its creator(s). Could a programmer make a virtual person or animal float in the air? Of course. Could the programmer code wind to have no effect on anything it contacts? Of course. There is not a single law of physics that could not be edited or omitted from a virtual world, but no programmer can escape the laws of logic because such a thing is wholly impossible.
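The point can be sketched in code. In the toy Python simulation below, "gravity" is nothing more than a mutable parameter that the creator of the world can set, weaken, or remove entirely, whereas no parameter could ever make a logical contradiction true inside the program. All of the names here (World, gravity, step) are invented for illustration and do not come from any real engine.

```python
# A toy "digital world" where a scientific law is just an editable parameter.
class World:
    def __init__(self, gravity=-9.8):
        self.gravity = gravity   # a "law of physics" the creator chooses
        self.height = 100.0      # an object's position
        self.velocity = 0.0

    def step(self, dt=1.0):
        # Advance the simulation one time step under the current "law".
        self.velocity += self.gravity * dt
        self.height += self.velocity * dt

# Under the default law, objects fall.
falling = World()
falling.step()
assert falling.height < 100.0

# The creator simply deletes gravity, and objects float.
floating = World(gravity=0.0)
floating.step()
assert floating.height == 100.0

# But no edit to the code can make a contradiction true: the object's
# height cannot both equal 100.0 and not equal 100.0 at the same time.
assert not (floating.height == 100.0 and floating.height != 100.0)
```

The editable parameter stands in for any scientific law: the programmer can revise it at whim, but the logical constraints on the world (a value cannot both hold and not hold) are untouchable no matter what the programmer writes.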
Anyone who fashions a digital world (here, world refers not only to substances that are treated as "physical," like objects in a video game, but to all details about a digital reality) acts as the deity of his or her creation. Like the uncaused cause of the universe, he or she can bend all of reality other than the laws of logic at whim. It will never be possible for a programmer to make a virtual object or character exist and not exist at the same time, for example, but he or she could certainly alter the scientific laws that govern something like temperature or gravity.
If physics entailed necessary laws, like those of logic, it would be impossible to even conceive of an alternate set of physics. Of course, it only takes a few moments to realize that our universe could have featured different physics and that the laws of physics could change at any time, however unlikely it may seem. The laws of physics are at most purely contingent in both virtual worlds and the one inhabited by actual people--that is, they depend on things other than themselves (the programmer(s) or the uncaused cause and natural world respectively).
When examining digital worlds that differ from our own, one can see examples of logically possible scientific phenomena that have not been actualized in reality. That which is utterly impossible cannot be portrayed even in entertainment, but that which contains no logical contradictions can be, no matter how foreign to one's experiences it is! The person who wants to see representations of what life would be like if the laws of physics differed needs only to look to the virtual world.
The only intrinsic constraints on a digital world are the inviolable laws of logic: scientific laws can be retained, modified, or removed from a virtual landscape by its creator(s). Could a programmer make a virtual person or animal float in the air? Of course. Could the programmer code wind to have no effect on anything it contacts? Of course. There is not a single law of physics that could not be edited or omitted from a virtual world, but no programmer can escape the laws of logic because such a thing is wholly impossible.
Anyone who fashions a digital world (here, world refers not only to substances that are treated as "physical," like objects in a video game, but to all details about a digital reality) acts as the deity of his or her creation. Like the uncaused cause of the universe, he or she can bend all of reality other than the laws of logic at whim. It will never be possible for a programmer to make a virtual object or character exist and not exist at the same time, for example, but he or she could certainly alter the scientific laws that govern something like temperature or gravity.
If physics entailed necessary laws, like those of logic, it would be impossible to even conceive of an alternate set of physics. Of course, it only takes a few moments to realize that our universe could have featured different physics and that the laws of physics could change at any time, however unlikely it may seem. The laws of physics are at most purely contingent in both virtual worlds and the one inhabited by actual people--that is, they depend on things other than themselves (the programmer(s) or the uncaused cause and natural world respectively).
Thursday, June 20, 2019
Monday, June 17, 2019
Neuroscience And The Soul
A neuron is not a thought or feeling, yet this is not known because of neuroscientific investigation. The distinction is instead knowable because reason reveals that a thought or emotion, like the mind that experiences it, is strictly intangible and therefore immaterial. Looking at an organism's brain is not the same thing as staring into its consciousness, or, as some might call it, its soul [1], and thus one can never observe another being's mind while studying neuroscience.
Despite this, it is common for substance dualists to speak of neuroscience as if it is vitally relevant to the process of distinguishing the brain from the conscious mind/soul. On the contrary, substance dualism can be proven in full without any awareness of the development of modern neuroscience. No one needs to look to neuroscience for confirmation that their soul exists; in fact, looking to science for proof of any metaphysical existent is a doomed endeavor from the start, as science cannot prove the existence of anything.
Since the "soul" is consciousness, all it takes for immediate, absolute verification of the soul's existence is a single moment of reflection or introspection: without consciousness, one cannot reflect or introspect to begin with, much less scientifically observe the nervous system! It is impossible for science to be conducted unless there is already a perceiving, conscious subject. It does not ultimately matter if one tends to call the mind by the term consciousness or soul, as they refer to the same thing. Science, however, is irrelevant.
Thanks to the misplaced zeal Westerners tend to direct towards science, appeals to neuroscience are bound to be fairly successful at ensnaring the attention of the average person when making claims about the soul. No matter how far along neuroscience progresses, though, it will always be incapable of establishing metaphysical truths like the existence of the conscious self. Consciousness is immediately experienced, and awareness of one's mental states has nothing to do with one's scientific knowledge.
[1]. https://thechristianrationalist.blogspot.com/2018/09/consciousness-is-soul.html
Sunday, June 16, 2019
Rationalism: The Key To A Stable Marriage
Many couples might claim that they want to enjoy the deepest intimacy and have a durable relationship, yet their actions and words demonstrate that they either do not truly want to do what is necessary to achieve this or do not know how to bring this about. While there are multiple factors that contribute to the stability and quality of a marriage, there is one that can hold the others in place: a mutual allegiance to rationalism. A rationalistic couple is capable of far more than they otherwise would be. However, many probably have the impression that a rationalistic marriage is devoid of excitement, passion, or non-intellectual depth--and they would be entirely mistaken.
One point that must be clarified is that rationalism and marital romance can easily coexist. Rationalism does not entail stoicism, the ideology emphasizing the suppression or ignoring of one's emotions. While at least some forms of the latter might treat emotions as if they are inherently dangerous or unimportant, the former is instrumental in realizing that it is not emotion itself that is dangerous, but emotionalism. A relationship founded on intellectual similarities and emotional affection is far stronger and more stable than a relationship sustained by only one or the other. Rationalism in no way excludes the experience or appreciation of romantic feelings!
In fact, rationality is the key to more thoroughly enjoying the aspects of a relationship beyond an intellectual connection. Rationalistic spouses are the most likely to shun pettiness in their relationship, prevent personal insecurities from overpowering their bond, and enjoy all aspects of marital life, from intellectual companionship to emotional attachment to sexual intimacy. Without a mutual love of reason, some of these results cannot be consistently achieved except by accident, and each of these relational elements is deepened by a love of reason.
There is also the fact that rationalists often recognize the truths of individualism, which means that they approach their spouses as individuals, without assuming things about their partners on the grounds of gender or culture. A spouse whose actual personality and talents are mutually acknowledged is a spouse who is not pressured to live out a facade, and it should not be difficult for anyone to see why this is instrumental in developing a healthy marital relationship. A rationalistic person, therefore, has the opportunity to see his or her spouse as they are, without the misleading trappings of cultural expectations. In this way, a rationalistic marriage is inherently egalitarian.
It is easy to see, then, that rationalism elevates a marriage even on a utilitarian level. A shared (and sound) intellectual bond allows a couple to experience greater stability than is otherwise possible, but it also permits them to understand other dimensions of their relationship and appreciate each spouse as an individual in ways that are only possible through rationalism. Far from killing or diminishing romantic affection, rationalism can provide it with a deeper solidity. It is only by happenstance that a marriage thrives when intentionality and rationality are not behind it. Although few would consider rationalism to be the key to marital flourishing, it is a vital component of a marriage that is intentionally successful.
Wednesday, June 12, 2019
Christian Cynicism
Although cynicism in the modern sense can refer to a general attitude of skepticism towards the moral legitimacy of others' motives, a broader form of cynicism is about recognizing that many people will blatantly choose ignorance, hypocrisy, intellectual errors, and moral apathy even when confronted about these issues. In other words, this kind of cynicism entails the idea that low expectations for the intellectual and moral performance of the majority of humankind are rarely exceeded (not that one shouldn't demand more).
It is odd that many people seem to think that Christianity is incompatible with basic cynicism, as there is not only nothing about one that contradicts the other, but the Bible is also overtly cynical in its descriptions of humanity as a whole. The Bible does not teach cynicism in the fallacious Calvinist sense which holds that no human is capable of autonomously doing the right thing (or in the fallacious evangelical sense which holds that sinlessness is impossible for humans to achieve), but in the sense that it predicts that most people are too lost in self-imposed delusions to care about truth.
The numerous warnings in the Bible that the majority embraces a path that leads to destruction (such as Matthew 7:13) make this clear on their own. Only a fool could read the Bible and sincerely conclude that it says anything positive about the general tendency of most people--again, I am not denying the logical possibility of humans achieving a state of sinlessness, nor am I ignoring the obvious differences in individual tendencies and moral resolve. However, the Bible certainly does not support anti-cynicism.
When defined in a certain way, cynicism is indeed a thoroughly Biblical ideology. Cynical expectations are, furthermore, the only sound expectations one could have of most people, as it only takes a relatively short period of time to realize that stupidity and hypocrisy are by far the prevailing characteristics of the average person. Christian cynicism is far from an oxymoron.
Sexist Stereotypes About The Use Of Erotic Media
Widespread sexism does not occur in a sociological vacuum. Wherever there is a societal restriction of or bias against one gender, one can often find an inverse bias against the other gender. This is certainly true of sexuality! A society with a strong or residual conservative presence (such as America) is almost certainly bound to grapple with anxieties concerning sexuality, but these anxieties are often dictated in part by the social constructs that shape expectations for how people should behave as men or women.
It is no surprise, then, that the use of erotic media is often perceived differently--even to the extent where it is seen as objectifying or empowering--depending on whether the user is a man or a woman. The stereotypes surrounding the use of erotic media by men and women are in explicit contrast with each other, as one might expect. These sexist attitudes have the power to discourage some people from using morally legitimate erotic media, as well as the power to manipulate the feelings of users to the point where they view the practice as inherently shameful.
If a man uses erotic media, others will likely assume that he is addicted to the material, that he objectifies any women it features, or that he is willing to settle for using the media as a placeholder for a romantic/sexual relationship. Each of these stereotypes is deeply fallacious, with the worst of them also being deeply misandrist: automatically attributing negative characteristics to men who use erotic media is a major instance of sexism. Similarly, many people subscribe to the misandrist belief that a man who uses erotic media is simply acting like "any man would," as if all men are interested in erotic media and as if men are helpless slaves to the pursuit of sexual pleasure. The consequences of this inane myth that men have an innate and constant obsession with sexuality are highly injurious to men and women alike.
If a woman uses erotic media, she is likely to be ostracized for doing something that is allegedly shameful, but for different reasons, in accordance with the arbitrary distinctions in social expectations for men and women. In light of the conservative mores lingering in American culture, she is likely to be regarded as if there is something deeply wrong with her being a woman and enjoying erotic material (especially of a visual sort). While people fallaciously expect most men, if not all, to use erotic media, many are simultaneously shocked to discover that women use erotic media as well. The erroneous stereotype that treats women as asexual or demisexual beings [1] is often appealed to in order to "prove" that women have little to no interest in erotic media, but interest hinges solely upon individualistic and social factors, not gender itself.
Whether a person enjoys or abstains from using erotic media has nothing to do with their gender and everything to do with personal preferences or decisions, although American culture persists in upholding erroneous gender-based assumptions about the matter. Conservative delusions about male and female sexuality still maintain a strong grip on the American public, as do insecurities and anxieties about erotic media as a whole. Unfortunately, the prevalence of evangelical legalism is partly to blame for that. Evangelicals have proven unwilling to admit or incapable of autonomously discovering that it is not Biblically sinful for either men or women, whether single or married, to use morally legitimate forms of erotic media [2], and they have encouraged destructive stereotypes with a high level of zeal.
[1]. https://thechristianrationalist.blogspot.com/2018/11/female-sexuality.html
[2]. https://thechristianrationalist.blogspot.com/2017/12/the-truth-about-erotic-media-part-2_19.html
Movie Review--Godzilla: King Of The Monsters
"Long live the king."
--Alan Jonah, Godzilla: King of the Monsters
King of the Monsters is certainly not as artistically impressive as 2014's Godzilla, but it does address the primary complaint many had about the first movie in this franchise: this time, there is no shortage of monster sightings or fights. While the human characters are often neglected for the sake of the Titans, the Titans define the movie's greatest successes. At its worst, it still towers above the largely horrid Kong: Skull Island and highlights the great potential of Legendary's MonsterVerse.
Production Values
The reason many people have seen or will watch the film is the Titans themselves, and King of the Monsters tends to handle them very well. Their fights and rampages tend to be epic in the ways they need to be (the scale for some of them is massive), but there is no fight that quite matches the brilliance of the climax of Godzilla, partly because the 2014 film's biggest action scene was set up so carefully by the restraint of earlier scenes. Here, the action is much more plentiful, but it lacks the same buildup.
However, many of the Titans are spectacular. Mothra in particular is depicted well on a visual level, as she stands out as a very colorful and majestic Titan. Rodan and Godzilla usually look great as well, though Ghidorah has several moments where his CGI seems somewhat off. As an aside, there are plenty of references to Skull Island (the island itself, not the movie), even though Kong himself never fights any other Titans, as that is reserved for next year's movie.
It is the human side of matters where the film falters the most. Vera Farmiga, Charles Dance, and Millie Bobby Brown, to name some of the key actors and actresses, all do a competent job within the confines of their roles, but they are definitely held back. Anyone who has seen Vera Farmiga in The Conjuring 2 should know that she is capable of so much more, just like anyone who has seen Tywin Lannister in Game of Thrones should know that Charles Dance is capable of so much more.
The largest shortcoming of the characters isn't the acting used to portray them, but their superficial characterization. King of the Monsters suffers from the same general characterization problem as the 2014 Godzilla movie, and the issue is even more pronounced here. The characters simply aren't able to develop significantly, in this case, because the film is overcrowded with them. Had King of the Monsters left out or combined elements of even a character or two, the remaining characters could have benefited quite a bit.
Story
Spoilers!
Monarch, an organization dedicated to tracking, documenting, and containing Titans, suffers a loss when an environmental terrorist abducts a scientist named Emma Russell. When kidnapped, she possesses a device called the Orca, which allows its user to create sounds that can communicate with and influence the Titans. Various other Titans break out of their own respective Monarch facilities, responding to the promptings of a massive extraterrestrial being called Ghidorah, Godzilla's most significant rival. As the Titans tread the earth again, it becomes clear that they pose a much greater threat to the environment than humanity itself does.
Intellectual Content
As with Godzilla, King of the Monsters has themes about environmentalism and human vulnerability in the face of natural forces, but there are only a couple of scenes that actually start to develop them. Only one scene explores a character's belief that Titans should decide the fate of humankind, with the character in question regarding humans as a blight on the planet. The cinematic universe might eventually utilize its themes better as it goes deeper into its lore, but this needs to happen fairly soon. After all, the original films featuring Godzilla had definite messages. Going forward, the MonsterVerse needs to make its philosophical underpinnings more explicit, not less.
Conclusion
King of the Monsters is still one of the best action movies of the year despite its faults, and it does introduce enough lore to adequately prepare the MonsterVerse for next year's Godzilla vs. Kong. At the very least, the MonsterVerse is on fairly stable ground for the time being, though the areas needing improvement are obvious. If the next film can address these issues, the future of the franchise will be a bright one.
Content:
1. Violence: The Titan fights are never graphic, but the creatures fiercely pummel each other.
2. Profanity: "Shit" and "fucking" are used, though the latter is only used once, in accordance with current MPAA guidelines for PG-13 movies.
Tuesday, June 11, 2019
Does Only One Set Of Physics Permit Life?
While there are many particular branches and sub-branches of science, they are merely different categories of physics, the one science that contains all of the others [1]. Biology itself is merely applied physics when one is not focusing on its phenomenological aspects; it is still the study of how matter behaves, though the focus is on living matter instead of on inanimate particles. It is obvious that biology can quickly become a complicated subject, but theists often draw grand metaphysical conclusions from this complexity that cannot be established by either biology or physics as a whole.
Theists who appreciate the complexity of living organisms are prone to commit the god-of-the-gaps fallacy, assuming that any complexity is by default evidence (or, God forbid, even proof) for the existence of God. In an effort to persuade someone that some variation of the design argument for God is valid, they might go much further. They might go so far as to imply or outright state that no physics other than that which governs our universe could possibly permit life. As soon as one recognizes the blatant distinction between logical possibility/impossibility and scientific probability/improbability, it becomes clear, however, that this is a laughable claim.
In fact, there are two reasons why this is an asinine belief. The first is that scientific laws, unlike the laws of logic, could spontaneously change without any warning, and yet life might still persist. I do not mean that just any set of scientific laws would allow for the existence of life, but that it is entirely possible, though perhaps seemingly unlikely, for another set of physics to permit it. Secondly, no matter how much observational evidence there is for a correlation, no one is justified in actually believing that it is complete proof of an exact causal relationship. Thus, no being with my limitations actually knows in an ultimate sense precisely which scientific factors allow for life in this universe.
It follows from the first of these two points that there is no single set of physics that is intrinsically required for the basic existence of living organisms, in the sense that it is logically possible for other physics to accomplish the same results. The continuation of the laws of physics as we perceive them is not a matter of logical certainty or necessity. Scientific uniformitarianism is nothing but an assumed idea, as it is not verifiable. When this is acknowledged alongside the fact that physics cannot establish exact causal relationships to begin with, it becomes clear that there is not only one set of scientific laws that is logically compatible with the presence of life.
As a theist, I find it frustrating how many times I have to demonstrate that the design argument for God's existence is unsound and fallacious. Its conclusion is built upon a facade of assumed premises, non sequiturs, and the confusion of logical impossibility for scientific improbability, and yet many theists insist that it is a wonderful proof for the existence of God. As I have affirmed numerous times here, there is only one way to prove that God exists [2], and all other arguments for God's existence are manifestations of invalid sophistry. In order to prove that biology is the result of explicit design, one must first prove that a designer exists--and therefore pointing to seeming design as confirmation of a designer is merely circular reasoning [3]. Design arguments are doomed from the start.
[1]. https://thechristianrationalist.blogspot.com/2018/09/physics-supreme-science.html
[2]. https://thechristianrationalist.blogspot.com/2017/04/the-uncaused-cause.html
[3]. https://thechristianrationalist.blogspot.com/2017/11/why-design-argument-fails.html
Sunday, June 9, 2019
An Incompetent Argument For The Simulation Hypothesis
As technological progression continues with a great momentum behind it, it is only natural for people who would otherwise never contemplate the simulation hypothesis to consider its possibility. This does nothing to change the fact that there is no way to verify or falsify this idea, of course. Even so, there are those who proclaim that it is very likely that we are inhabiting a simulation, and there are those who point to the agreement of authority figures who believe this as evidence for this conclusion despite the stupidity of doing so.
The consensus of experts is meaningless because consensus itself is meaningless. It does not matter how respected the experts are, nor does the number of experts in agreement carry any epistemic value. How many experts, whether real or imagined (although no thoroughly competent thinker would actually endorse this argument!), approve of an idea is completely irrelevant to the truth or falsity of that idea, much less to the epistemological process of verifying it.
Nevertheless, many people are easily swayed by appeals to authority, as a conversation with the average person can quickly reveal. This is the reason why some people use them despite their ineptitude. They anticipate that many people will be willing to accept ideas simply by associating those ideas with an authority figure who is respected by the masses in some way.
When the claims of alleged authorities are ignored or refuted, it becomes clear that there is no evidence whatsoever, much less proof, that we are living in a computer simulation or "brain in a vat" scenario. There is not even a basis for making a probability argument. Either way, however, there is still an external world comprised of matter: though only a tiny minority actually understands how to prove that matter exists, its existence is capable of being verified with absolute certainty [1].
This is the grander metaphysical issue. Although it is entirely possible that one's exact sensory perceptions correspond to a simulated environment, the fact that it can be known that there is an external world can provide great epistemological comfort. Matter exists whether or not the simulation hypothesis is correct, and this is the truth that deserves the attention of those who consider the simulation hypothesis to be a likelihood.
[1]. See here:
A. https://thechristianrationalist.blogspot.com/2018/08/misunderstanding-brain-in-vat-scenario.html
B. https://thechristianrationalist.blogspot.com/2017/07/dreams-and-consciousness.html
Saturday, June 8, 2019
Abortion Concerns Men
Myths are never in short supply, and myths about the topic of abortion are likewise plentiful. One of the most asinine myths about abortion, though, is the idea that men should either be silent about the issue or side with whatever a woman says about it. Far from being a correct stance, this position merely tries to reframe the subject in a way that contradicts reality by holding abortion to be something that is ultimately of no concern to men.
There are far more sexist double standards--affecting both genders--present in everyday life than the average person is observant enough to notice, and yet they are not concealed. All of them are in some way celebrated by an ignorant crowd, whether by a minority or a majority, and that celebration is always rooted in stupidity and hypocrisy. The idea that abortion does not concern men is no exception.
Some actually claim that it is misogynistic for a man to oppose abortion. As even brief reflection can expose, this is an utterly inane position. It is treating abortion as if men have no right to rationally expound upon the subject that is itself sexist, not the condemnation of abortion. Furthermore, a consistent feminist would not only reject the misandry of saying otherwise, but would also affirm the humanity of unborn girls and boys. This is the only consistent human rights stance on the issue.
If a person is a true feminist, it is because they realize that if men have certain rights simply by virtue of being human, women do as well (and vice versa). If being human means that one has certain rights irrespective of whether they are recognized, then the unborn by logical necessity possess these same rights. Thus, the very basis for feminism is irreconcilable with pro-abortion ideology. Abortion concerns men just as much as it concerns women, as it kills human beings of both genders.
Then there is also the fact that moral obligations are shared by men and women alike, as well as the fact that reason, the key to knowledge, is available to men and women alike. Someone's gender does not validate or invalidate their worldview. Those who think otherwise are ironically guilty of sexism, the very thing they might mistake themselves for condemning in the first place.
Logic, people. It is very fucking helpful.
Entertainment's Impact On Culture
Despite the fact that entertainment is sometimes regarded as a trivial thing, there is not anything contradictory about loving entertainment and philosophy at the same time. In fact, people who recognize the philosophical nature of all things are likely to enjoy entertainment more than those who do not, as they have a higher chance of comprehending the cultural significance of entertainment and the intellectual components of the more sophisticated films, shows, games, or books.
Not all entertainment possesses any particular philosophical or artistic quality, but entertainment as a whole must be acknowledged and interacted with on at least some level if a person wants to connect with or understand their culture. One does not need to look far to realize that the nuances, values, and hypocrisies of a culture are often reflected in its entertainment. While it is not rare to find Christians who understand these facts, it is a rare thing to find Christians who live them out in a sound way.
One of the largest problems with the approach to entertainment taken by many Christians, issues of idiotic legalism aside, is the fact that many don't want to truly analyze entertainment for what it is. Instead of dissecting the themes of a work as they are, they might rather make an enormous deal out of generic, archetypal behaviors that have nothing to do with the Biblical or historical Jesus, for example. This approach often excludes the examination of the actual philosophical, artistic, or cultural merits of a given work.
Not every work of entertainment needs to have Biblical significance in order to have philosophical depth or artistic excellence, of course. Either way, entertainment deserves attention in and of itself simply because of its nature and impact on social norms and experiences. Some entertainment might be simplistic and asinine, but it affects society all the same.
Thursday, June 6, 2019
The Relationship Between The Bible And Christianity
Many people conflate Christianity and the Bible as if they are one and the same, but there is a key distinction that must be made. The Bible, when interpreted rationalistically (the only genuine exegesis involves a rationalistic examination of a text to decipher its meaning), is the means of knowing how to define the details of Christianity, but it is not identical with Christianity itself. One is a book; the other is a set of concepts elaborated upon by the book.
The fact that the two are not identical does not devalue the Bible in any way. This distinction does, however, factor into sound apologetics and theology. In particular, it needs to be acknowledged when discussing the potential truth or falsity of Christianity and the Bible, as the former can be true even if the latter contains errors (admitting this truth is not the same as claiming the Bible actually contains errors, of course), but the latter cannot be true unless the former is.
All evidence in support of the Bible is, by necessity, evidence for Christianity, but just because Christianity is true does not mean that the entirety of the Bible is also true. If entire books of the Bible were thoroughly inaccurate, Christianity itself could still be correct. After all, the existence of God and the historical resurrection of Jesus do not hinge on the veracity of the Bible! There is an uncaused cause [1] even if it is not the Christian deity Yahweh, and Jesus could still have resurrected even if there were major factual errors scattered throughout the Bible.
The truth or falsity of Christianity itself does not stand on the accuracy of the Bible, although there would be no way to learn the specifics of Christianity apart from it. Without the Bible, there would be no evidence that the Christian God prefers one particular moral framework, for example. However, if the Bible was false (with the exception of the parts that are true by logical necessity [2]), it would not follow that all of the tenets of basic Christianity are likewise untrue.
[1]. https://thechristianrationalist.blogspot.com/2017/04/the-uncaused-cause.html
[2]. https://thechristianrationalist.blogspot.com/2019/01/christianity-and-skepticism.html
Tuesday, June 4, 2019
The Evangelical Misconception Of John 20:29
There are sophists who pretend that knowledge about a given matter is possible in the absence of absolute certainty, but any rational person knows this is an impossibility. No one who believes in something short of a logical proof, the only way to have absolute certainty, is a truly rational person. Many despise this truth to the point of denying it, lest they be forced to abandon or amend their worldviews--and evangelicals are certainly among them. A rational and Biblical theology scarcely resembles evangelicalism, and Jesus' comment about belief in John 20:29 is a great example of how evangelical presuppositions interfere with an objective understanding of the Bible. Evangelicals claim that Jesus affirms the value of belief in the unproven or unprovable in this verse, but a careful examination shows otherwise.
In John 20:29, Jesus notes that Thomas believes in his resurrection because of visual and tactile evidence before he says that others who believe without seeing him will be blessed. However, Jesus is not saying that belief in the unproven is virtuous or valid; he is merely saying that there is something unique about future Christians committing to Christianity (the Biblical words for belief and faith can denote commitment instead of literal belief in something that cannot be established from reason [1], and any other meaning would invalidate the Bible) without having actually seen the resurrected Messiah. Similarly, there is something unique about Thomas' privilege of seeing and touching Jesus following his resurrection. Christians of later generations would of course still have access to historical evidence for the resurrection, but they would not be able to experience what Thomas did.
Since Jesus is saying that Thomas has seen him with his own eyes, the latter part of the verse would have to use the word "seen" in an equivalent way to remain internally consistent. A consistent use of the words relating to sight would result in a contrast of one group of people who have not physically seen Jesus and a group of people who have (i.e., Thomas and those like him). Nothing at all about this is contrary to strict, explicit rationalism, the sole legitimate epistemology. John 20:29 is not praising irrationalism, fitheism, or philosophical assumptions that happen to concur with the Bible, as evangelicals tend to claim! If the Bible did demand an approach to epistemology that deviates from pure rationalism, though, there would be no such thing as a sound argument for its veracity.
If the Bible prescribed faith in the sense that many (perhaps all) evangelicals mean by the word--not in reference to faithfulness/commitment to that which is reinforced by evidence, but in reference to belief in something that has not been absolutely verified--the Bible would simply be wrong. It would then be calling for assumptions and irrationality on the part of the reader. Belief that the entire Bible is true is itself irrational, but not because there is even the slightest evidence that the Bible is false: this is because there is no such thing as a complete verification or refutation of the whole of its contents. Unless the entire Bible 1) contradicts itself or logical truths or 2) can be demonstrated to be true without the presence of any philosophical assumptions, mere evidence must be considered, as not all of the Bible can be verified or falsified. The matter is far more complex than all but a minority are willing to acknowledge [2].
There is thus no such thing as a justification for the belief that the entirety of the Bible is demonstrably true. There is, of course, justification for believing that certain components of the Bible are necessarily true by virtue of being provable strictly with logic (see [2] for specific examples) and for believing that many other parts are supported by external evidences, yet this is quite different from the fallacious stance held by popular Christian apologists. Intelligent Christians do not have an infallible commitment to Christianity, but one that would evaporate if they were confronted with genuine evidence to the contrary. The evangelical position on John 20:29 is inconsistent with reason and is therefore asinine.
[1]. https://thechristianrationalist.blogspot.com/2019/03/commitment-is-not-belief.html
[2]. https://thechristianrationalist.blogspot.com/2019/01/christianity-and-skepticism.html
Monday, June 3, 2019
Knowledge Precedes Language
Language has the ability to reconstruct many aspects of a given society, but to claim anything more--to claim that worldviews are linguistic constructs--is to err and heinously misrepresent the power of language. A sound worldview is attached to knowledge, and if knowledge were a slave to language, knowledge could not be obtained--a self-refuting impossibility! Demonstrating that knowledge is not a slave to linguistics is a very simple matter.
In order to even assign a concept or thing a linguistic term, after all, one must already understand it in some way. No one would be able to create languages without possessing knowledge beforehand. It is impossible to devise or learn a term or sound for a concept that one does not know of! Knowledge precedes language by utter necessity, and the latter could not exist without the former.
Furthermore, even if humans were completely incapable of speech, it is not as if we could not reason! The ability to grasp reason grounds knowledge; language cannot amount to anything more than a means of communicating knowledge from one person to another. In the absence of language, human sociality and civilization could never have developed to the extent that they have, but the laws of logic and truth are not linguistic constructs, and thus even a mute species could still comprehend them.
Language certainly shapes how some people perceive various aspects of reality, but it in no way dictates everyone's worldviews. No one is forced into a particular epistemological or metaphysical system merely because of the language that they or their society use. Language is purely arbitrary, and reason is wholly fixed and necessary; as long as a person looks to the latter instead of the former, knowledge is not an unattainable thing.
Movie Review--Kong: Skull Island
"Ancient species owned this earth long before mankind, and if we keep our heads buried in the sand they will take it back."
--Bill Randa, Kong: Skull Island
Despite serving as a sibling to the 2014 Godzilla, Kong: Skull Island represents a major step backwards from the successful aspects of Gareth Edwards' film that launched the MonsterVerse. Godzilla had its problems, to be sure [1], but Skull Island amplifies those problems to the point where almost nothing is executed well other than the creature designs, action, and general effects. Not even Samuel Jackson, John Goodman, Brie Larson, and Tom Hiddleston are able to salvage the rest of the movie.
Production Values
The majority of the movie might be poorly executed, but the visuals keep the scenes with Kong and the other monsters on Skull Island afloat. The action involving these animals is the best part of the film. In fact, the creatures--ranging from a giant freshwater squid to an enormous spider to reptilian "skull crawlers"--are the true stars of the movie, not that it takes much to be more engaging than the dreadfully superficial human characters.
Brie Larson and Tom Hiddleston play lead characters, but they are both capable of so much more than what Skull Island gives them. At least Samuel Jackson injects some semblance of sincerity into his character. Even so, he is very underutilized--although he does get to indulge in a "hold on to your butts" callback to Jurassic Park. John Goodman, along with the other major actors, is wasted on a simplistic character with no development.
If the pitiful attempts at humor weren't included, the characters could have come across as undeveloped but not necessarily wastes of acting talent, but the silly script and its lackluster dialogue (especially the "comedic" moments) annihilate almost every chance for any of the protagonists and their actors/actresses to delve into genuine depth. Godzilla suffered due to the simplicity of its humans after Bryan Cranston's character died, but at least it featured one great character and didn't smother its own plot with asinine jokes.
Story
Spoilers are below, not that there is much of a plot to spoil.
In 1973, a representative of Monarch (an organization that locates and documents cryptozoological beings) persuades the United States government to authorize a small trip to Skull Island, a landmass surrounded by constant storms. The team sent to the island quickly encounters Kong, a massive ape who presides over the area as its "king." The surviving team members try to reach a pickup zone at a far end of Skull Island before they are permanently stranded.
Intellectual Content
Even Godzilla (2014) at least teased a fairly significant message about the futility of thinking humans can control nature to any large extent, though it did not develop this theme very thoroughly. Skull Island doesn't even go as far as Godzilla, replacing scenes that could have been used to explore franchise-level themes with ones filled with shallow comedy.
Conclusion
Other than the action sequences and some of the effects, Kong: Skull Island has no redeeming aspects whatsoever as far as filmmaking quality goes. The plot, dialogue, and characterization are extremely superficial. Nonetheless, the film still serves as an important step within its cinematic universe. It might still be worth watching for some who simply want to keep up with each sequential step in Legendary's MonsterVerse, but Skull Island is easily eclipsed by both 2014's Godzilla and this year's King of the Monsters. The bar it sets is so low, however, that almost any level of script depth, not to mention comedy, is superior to what is offered here.
Content:
1. Violence: Some moments involve mild dismemberment. For example, a man's arm is torn off in the distance in one scene, and Kong pulls off a giant squid's tentacle in another. In yet another scene, a giant spider's leg impales a soldier.
2. Profanity: "Son of a bitch," "fucking," and "shit" are uttered.
[1]. https://thechristianrationalist.blogspot.com/2019/05/movie-review-godzilla-2014.html
Sunday, June 2, 2019
Nudity In Ancient Jewish Culture
The Jewish culture of the Old Testament period was not marked by many of the arbitrary conventions that conservative theologians claim it was. While conservatives often imagine the ancient Jews to have lived with the same legalistic prudery that is so often espoused by modern Christians, public nudity was not regarded as something to be kept out of Jewish life. In fact, it was common in ways that would likely shock many readers of the Bible.
The first major reference to public nudity in ancient Jewish culture is found in the Pentateuch. Exodus 22:26-27 details a process where a person was temporarily left without any clothing after giving his or her cloak--his or her "only covering"--to a lender as a guarantee, or a "pledge." The person who received the cloak was to return it by sunset, lest the one giving the pledge spend the night suffering from a heightened vulnerability to the elements. Simply by imparting this law, God authorized nudity to be a part of fundamental Jewish life as far as pledges went.
Furthermore, God himself not only permitted nudity when he did not legislate against it in Mosaic Law (Deuteronomy 4:2) and condoned it in Exodus 22:26-27, but he even directly commanded it on at least one occasion (Isaiah 20:1-6) as a demonstration for Isaiah's audience. God is described as predicting the fate of Egypt after providing instructions for Isaiah to be naked for three years. The prophecy states that the captive men and women of Egypt would be forced to walk naked, with their "buttocks bared," after being conquered by a foreign power (Assyria). Since Isaiah's own condition is supposed to reflect their future state, he, too, walks naked.
1 Samuel 19:23-24 even confirms that nudity was so commonly associated with Yahweh's prophets that onlookers wondered if King Saul was a prophet simply because he denuded himself and prophesied while under the influence of the "Spirit of God." Once again, the Bible does nothing to hide the fact that nudity was not something that the Jews universally loathed or discouraged. Without even reading the Biblical creation narrative or Mosaic Law, one can see from historical portions of the Bible that the Jewish culture described in the Old Testament clearly involved public nudity.
As the aforementioned passages affirm, contrary to popular evangelical ideas, extramarital nudity is not an affront to God and was even integrated into Jewish society enough to be directly associated with Yahweh (in some circumstances). Though it has been demonized for centuries of church history, nudity was never the immoral abomination that many have mistaken it for. Ultimately, no one even needs to read all the way to Isaiah or even Exodus to realize that the Bible does not condemn mere nudity. Key details the Bible provides about Jewish history only confirm what the first two chapters of Genesis plainly teach: the human body is metaphysically good and is not shameful or sinful.
What Technology Cannot Accomplish
The modern world teems with technological innovations that have transformed human life on every level. Health, entertainment, communication, transportation, business, and education have all been affected by these advancements, and technological progress shows no signs of abating. As a result of these changes, however, some seem to mistakenly regard technology as the present or future solution to every human problem--an impossible thing.
For all of its merits, technology remains forever incapable of resolving the most significant issues a human could encounter. Human limitations of an epistemic nature, for instance, do not vanish with the introduction of new technologies. In fact, nothing about innovation can abolish such constraints.
While technology can somewhat alleviate the difficulties of living with human epistemological or metaphysical limitations, it cannot erase them entirely. The most sophisticated technology can at best only alter the way that we experience those limitations; even a thoroughly modified transhumanist is still restricted into certain epistemological and metaphysical categories. For example, no technology allows someone to prove that their memories correspond to past events or that their senses perceive the external world as it is.
Epistemological/metaphysical restrictions are not the only things technology cannot remove, of course. Moral problems cannot be resolved by the mere presence or evolution of technology. Technology itself is entirely amoral, but it can be wielded in unjust or oppressive ways. The basic fact that scientific innovation is not automatically accompanied by moral correctness means that technology alone does nothing to answer questions of a moral nature. What, then, is technology useful for, if not for rectifying issues of epistemology or ethics?
Convenience and simplification are the ultimate benefits of technology, as each of its triumphs reduces down to one of these two things. This does not mean that technology is insignificant, only that one must look beyond it for the solutions to the deepest issues facing humankind. It is pointless to act as if something can give what it cannot. To look to technology for deliverance from metaphysical, epistemological, and moral problems, therefore, is utter folly.
Saturday, June 1, 2019
Originality In Science
Without some form of originality, one is inevitably at the mercy of other people's stupidity, and this is true regarding science as much as it is true of any other aspect of life or philosophy. Both of its forms are vital in their own ways. Originality of one kind is the discovery of something previously or largely unknown, while originality of a second kind is the autonomous discovery or confirmation of truths for oneself. Even when the first kind is completely exhausted, there is always the need for and the possibility of living out the second.
The first form of originality in science involves the discovery of new information about perceived phenomena in the external world. Historical examples would include the discovery of evidence pointing to the expansion of the universe, the power behind the splitting of an atom, or the usefulness of electrical power. As with logical truths, there can only be a finite number of scientific discoveries that can be made--unless scientific laws were to continually change with the passage of time (which is certainly possible despite seeming unlikely).
However, logical truths are accessible to all people, whereas a great deal of scientific information is either not accessible without continuing someone else's work or cannot be accumulated without particular equipment. There are still many logical and conceptual truths that have never been acknowledged by even a single figure in academia of historical or contemporary renown [1], but they could be autonomously discovered by anyone both intelligent and consistent enough to reason them out; when it comes to science, it is simply impossible to develop all modern scientific models within a single standard human lifetime.
The second form of originality in science involves the autonomous discovery of scientific information already documented or elaborated upon by others. Every child that performs his or her own experiments out of curiosity is already practicing this kind of originality. In other words, their scientific analyses are their own. If a child engages in private experiments with the intent of personally learning about something like electricity or gravity, they are educating themselves about matters that have received a great deal of attention throughout recorded history, but they are expressing originality all the same.
There is also one other aspect to the second form of scientific originality. It pertains to the analysis of scientific concepts, not to the conducting of one's own experiments. While scientific premises cannot be truly verified, only supported by evidence, it is still necessary to reason out what conclusions follow from those premises (if the scientific laws the premises are based on were to remain constant indefinitely) even when they have been widely discussed or examined. In some cases, another person might introduce one to a particular scientific concept, yet independently analyzing the concept and/or whatever the person claimed about it is still called for.
At the very least, the second form of originality is an ongoing necessity, but there are other benefits to originality as well. Just as emphasizing originality of either sort in strict philosophy would likely inspire some people to contemplate reason and reality more than they otherwise would, emphasizing originality in science would likely inspire at least some people to contemplate science more than they otherwise would. Originality is available to everyone, and yet it often goes unacknowledged. The subjective empowerment of originality alone is rewarding enough as it is, but the stability that can only come from autonomy is needed by all, even though scientific certainty is limited to mere perceptions.
[1]. https://thechristianrationalist.blogspot.com/2018/12/a-list-of-neglected-truths.html
The Necessity Of Avoiding All Assumptions
To assume is to believe without proof, and, while many people like to think of themselves as if they do not merely assume things as they go about their lives, assumptions are unfortunately a common affliction. It is not as if they are inevitable blunders, of course: the entire point of rationalism is to discover truths while avoiding assumptions. However, I am not referring to the superficial, incomplete understanding that many people have of what it means to avoid assumptions. A basic example from ordinary life illustrates the difference.
Suppose that a child is looking for a shirt she owns, and she looks for it everywhere in her room except for her closet, because she assumed it was somewhere else. Since the shirt is hanging in her closet, she doesn't find it, and she tells her father that she doesn't know where it is. Her father asks if she looked in the closet and, when she explains why she did not search there, chides her for assuming, telling her to check.
Now, many people think that the young girl would have avoided making assumptions if she had simply checked her closet before consulting her dad, but there is far more to not making assumptions than that. Any being with human epistemological limitations that sees the shirt in the closet and believes that the shirt exists simply because it is perceived makes an enormous assumption! A person can derive their beliefs from the best evidence and still make a plethora of assumptions, as evidence is not proof. If even a single conclusion of theirs does not follow from its premises, or if even a single belief has been accepted short of logical proof, they have failed to pursue rationality.
When people realize what it actually means to forgo assumptions, they often erroneously come to believe that it is impossible to live without making assumptions on a philosophical or "practical" level (of course, "practicality" is still a philosophical matter). However, there is not a single assumption that a person must make to either construct a worldview or simply live; even if a person must dramatically restructure his or her worldview in order to purge assumptions, the objective is not unachievable. It is not only possible to never make a single assumption, but it is epistemologically necessary to never make assumptions if one wants genuine knowledge, for knowledge and assumptions are mutually exclusive.
The second major error people make with the topic of assumptions is concluding that the avoidance of all assumptions means that a person can only know one fact or a very small number of facts for sure. This belief, too, is false and a sign of intellectual ineptitude. Only very specific facts can be proven, but far more than just the existence/veracity of logical axioms and the existence of my consciousness can be known with absolute certainty [1].
The extent of absolute certainty aside, no human is fated to live as a slave to assumptions, yet even those who at least intend to seek the truth often fall short of doing the very thing necessary to pursue it accurately: at best, only a minuscule number of people are truly willing to submit the entirety of their lives and worldviews to reason. Without this whole and utter submission, a person will only ever know a small handful of selective, random facts at most. The process of rejecting assumptions might be a painful one, but it is absolutely necessary to do so in order to secure knowledge.
[1]. https://thechristianrationalist.blogspot.com/2018/11/the-extent-of-absolute-certainty.html
Democracy's Error
On the surface, democracy might seem like an effective way to address the needs of a given society, providing citizens with the ability to vote on significant policy matters--whether directly or by selecting representatives. While some forms of democracy are far more damaging and irrational than others, none of them are truly centered on truth and justice, the only things that can grant a government legitimacy. At its best, democracy allows even major sociopolitical decisions to be influenced by arbitrary whims and consensus; at its worst, its irrationalistic leanings are blatant.
It does not follow in any way from the fact that someone will be affected by a political decision that his or her attitude towards that decision should be taken into consideration. After all, a person can only deserve to be listened to if they are rational and just, and only a relatively minuscule number of people can be legitimately called those things. To insist otherwise, a person must rely on non sequitur fallacies, saying that it follows from merely being a human or being affected by a decision that a random person should be able to control at least a part of the political world.
Operating in a manner that is aligned with truth, not honoring the wishes of the people, is the goal of a sound government. In order to be a rational or just person, one must often disregard or outright war with the delusions of academia and the general populace alike, for intelligence has never been possessed by more than a select few at a time. Democracy's error is that its tenets treat the ability to influence politics (whether or not that influence is sound) as if it is a right possessed by every human, irrespective of that individual's intelligence, commitment to truth, or moral character.
It is impossible for anyone's perspective to matter unless that perspective is in alignment with reason. Short of that alignment, a person has no intellectual authority (the foundation of all other forms of authority), much less the right to receive any respect at all (from social peers or governments) beyond the bare minimum required to not violate their human rights. Since the vast majority of people are unintelligent and hypocritical, it follows that the majority of humankind does not deserve to have a political voice one way or the other. Democracy of any kind, no matter how accepted it is within a particular culture, accomplishes nothing but some degree of subversion of a rationalistic society.