Philosophical Fragments 1
By: Dr. Sam Vaknin
Philosophy is the attempt to enhance the traits we deem desirable and suppress the traits we view as unwanted (a matter of judgment) by getting better acquainted with the world around us (a matter of reality). An improvement in the world around us inevitably follows.
The sempiternal debate between idealists and realists demonstrates the difficulty inherent in this agenda: our perceptions (experiences, even thoughts) may be real enough (conform to an objective, observer-free reality), but we fail to communicate them efficaciously. This is because human language evolved to be useful, as an aid to survival.
Our discourse is, therefore, less concerned with conveying accurate information about the world as it is and more with purveying the kind of data that will guarantee our continued being, regardless of whether these data describe reality correctly or not. Often, precise information about the world may actually retard our chances of survival and we are much better off communicating messages and thoughts that are patently unrealistic. Hence the emergence of alternative languages such as mathematics (more precisely, logic and arithmetic), which we use to capture reality and to remove vagueness.
For his product to qualify as a philosophical theory, the practitioner of philosophy - the philosopher - must, therefore, meet a few tests:
1. To clearly define and enumerate the traits he seeks to enhance (or suppress) and to lucidly and unambiguously describe his ideal of the world
2. Not to fail the tests of every scientific theory (internal and external consistency, falsifiability, explanatory and predictive power, etc.)
These are mutually exclusive demands. Reality - even merely the intersubjective sort - does not yield to value judgments. Ideals, by definition, are unreal. Consequently, philosophy uneasily treads the ever-thinning lines separating it, on the one hand, from physics and, on the other hand, from religion.
The history of philosophy is the tale of attempts - mostly botched - to square this obstinate circle. In their desperate struggle to find meaning, philosophers resorted to increasingly arcane vocabularies and obscure systems of thought. This did nothing to endear philosophy to the man (and reader) in the post-Socratic agora.
The notions of historical fame, celebrity and notoriety are a mixed bag. Some people are famous during (all or part of) their lifetime and forgotten soon after. Others gain fame only centuries after their death. Still others are considered important figures in history yet are known only to a select few.
So, what makes a person and his biography famous or, even more important, of historical significance?
One possible taxonomy of famous personages is the following:
To be considered (and, thus, to become) a historical figure a person must satisfy at least condition B above. This, in itself, is a sufficient (though not a necessary) condition. Alternatively, a person may satisfy condition A above. Once more, this is a sufficient condition – though hardly a necessary one.
A person has two other ways to qualify:
He can either satisfy a combination of conditions A and C or meet the requirements of conditions B and C.
Historical stature is a direct descendant and derivative of the influence the historical figure has had over other people. This influence cannot remain potential – it must be actually wielded. Put differently, historical prominence is what we call an interaction between people in which one of them influences many others disproportionately.
You may have noticed that the above criteria lack a quantitative dimension. Yet, without a quantitative determinant they lose their qualifying power. Some kind of formula (in the quantitative sense) must be found in order to restore meaning to the above classes of fame and standing in history.
The creative person is often described as suffering from dysfunctional communication skills. Unable to communicate his thoughts (cognition) and his emotions (affect) normally, he resorts to the circumspect, highly convoluted and idiosyncratic form of communication known as Art (or Science, depending on his inclination and predilections).
But this cold, functional, phenomenological analysis fails to capture the spirit of the creative act. Nor does it amply account for our responses to acts of creation (ranging from enthusiasm to awe and from criticism to censorship). True, this range of responses characterizes everyday communications as well – but then it is imbued with much less energy, commitment, passion, and conviction. This is a classical case of quantity turned into quality.
The creative person provokes and evokes the Child in us by himself behaving as one. This rude violation of our social conventions and norms (the artist is, chronologically, an adult) shocks us into an utter loss of psychological defenses. This results in enlightenment: a sudden flood of insights, the release of hitherto suppressed emotions, memories and embryonic forms of cognition and affect. The artist probes our subconscious, both private and collective.
The preservation of human life is the ultimate value, a pillar of ethics and the foundation of all morality. This held true in most cultures and societies throughout history.
On first impression, the last sentence sounds patently wrong. We all know about human collectives that regarded human lives as dispensable, that murdered and tortured, that cleansed and annihilated whole populations in recurrent genocides. Surely, these defy the aforementioned statement?
Liberal philosophies claim that human life was treated as a prime value throughout the ages. Authoritarian regimes do not contest the over-riding importance of this value. Life is sacred, valuable, to be cherished and preserved. But, in totalitarian societies, it can be deferred, subsumed, subjected to higher goals, quantized, and, therefore, applied with differential rigor in the following circumstances:
There is an often missed distinction between Being the First, Being Original, and Being Innovative.
To determine that someone (or something) has been the first, we need to apply a temporal test. It should answer at least three questions: what exactly was done, when exactly was it done and was this ever done before.
To determine whether someone (or something) is original – a test of substance has to be applied. It should answer at least the following questions: what exactly was done and was exactly this ever done before.
To determine if someone (or something) is innovative, a practical test has to be applied. It should answer at least the following questions: what exactly was done, in which way was it done and was exactly this ever done before in exactly the same way.
Reviewing the tests above leads us to two conclusions:
Innovation helps in the conservation of resources and, therefore, in the delicate act of human survival. Being first demonstrates feasibility ("it is possible"). Being original expounds what is needed or what can be done. And being innovative reveals the practical aspect: how it should be done.
Society rewards these pathfinders with status and lavishes other tangible and intangible benefits upon them - mainly upon the Originators and the Innovators. The Firsts are often ignored because they do not directly open a new path – they merely demonstrate that such a path is there. The Originators and the Innovators are the ones who discover, expose, invent, put together, or verbalize something in a way which enables others to repeat the feat (really to reconstruct the process) with a lesser investment of effort and resources.
It is possible to be First and not be Original. This is because Being First is context dependent. For instance: had I travelled to a tribe in the Amazon forests and quoted a speech of Kennedy to them – I would hardly have been original but I would definitely have been the first to have done so in that context (of that particular tribe at that particular time). Popularizers of modern science and religious missionaries are all first at doing their thing - but they are not original. It is their audience which determines their First-ness – and history which proves their (lack of) originality.
Many of us reinvent the wheel. It is humanly impossible to be aware of all that was written and done by others before us. Unaware that we are neither first, original, nor innovative - we file patent applications, make "discoveries" in science, exploit (not so) "new" themes in the arts.
Society may judge us differently than we perceive ourselves to be - less original and innovative. Hence, perhaps, the syndrome of the "misunderstood genius". Admittedly, things are easier for those of us who use words as our raw material: there are so many permutations that the likelihood of not being first or innovative with words is minuscule. Hence the copyright laws.
Yet, since originality is measured by the substance of the created (idea) content, the chances of being original as well as first are slim. At most, we end up restating or re-phrasing old ideas. The situation is worse (and the tests more rigorous) when it comes to non-verbal fields of human endeavor, as any applicant for a patent can attest.
But then surely this is too severe! Don't we all stand on the shoulders of giants? Can one be original, first, even innovative without assimilating the experience of past generations? Can innovation occur in vacuum, discontinuously and disruptively? Isn't intellectual continuity a prerequisite?
True, a scientist innovates, explores, and discovers on the basis of (a limited and somewhat random) selection of previous explorations and research. He even uses equipment – to measure and perform other functions – that was invented by his predecessors. But progress and advance are conceivable without access to the treasure troves of the past. True again, the very concept of progress entails comparison with the past. But language, in this case, defies reality. Some innovation comes "out of the blue" with no "predecessors".
Scientific revolutions are not smooth evolutionary processes (even biological evolution is no longer considered a smooth affair). They are phase transitions, paradigmatic changes, jumps, fits and starts rather than orderly unfolding syllogisms (Kuhn: "The Structure of Scientific Revolutions").
There is very little continuity in quantum mechanics (or even in the Relativity Theories). There is even less in modern genetics and immunology. The notion of laboriously using building blocks to construct an ivory tower of science is not supported by the history of human knowledge. And what about the first human being who had a thought or invented a device – on what did he base himself and whose work did he continue?
Innovation is the father of new context. Original thoughts shape the human community and the firsts among us dictate the rules of the game. There is very little continuity in the discontinuous processes called invention and revolution. But our reactions to new things and adaptation to the new world in their wake essentially remain the same. It is there that continuity is to be found.
Consider videos that appear to be completely authentic but are actually forgeries: the heads of celebrities superimposed onto the bodies of porn stars amidst the scintillating action.
This raises the question: what is a copy and what is the original? This conundrum was raised as early as 1935 in Walter Benjamin's seminal essay "The Work of Art in the Age of Mechanical Reproduction".
Consider these mindbenders:
1. A brilliant geek invents a 3D printer which replicates flawlessly the Mona Lisa. Leonardo’s masterpiece and the copy spewed out by the machine are indistinguishable even under an electron microscope: they cannot be told apart. In which sense, therefore, is the artist’s Mona Lisa superior to or different from its identical clone?
2. An ancient letter unearthed in the archives of the Church in France proves beyond any doubt that the Mona Lisa was not painted by Leonardo da Vinci, but by an obscure apprentice of his. The painting’s value drops overnight even though it has undergone no physical or chemical transformation.
3. A world-renowned photographer uses the latest in digital photography equipment to shoot the Mona Lisa in a thought-provoking, fresh manner. The resulting oeuvre becomes a sensation overnight. He then proceeds to attach the photo to 15,000 e-mail messages and sends them to his entire voluminous address book. In which sense is the photo he shot more worthwhile than its numerous digital replicas?
Intuitively, we feel that Leonardo’s Mona Lisa is not the same as its clones and that its monetary value and intrinsic worth depend crucially on its provenance: its authorship, the historical background, and its proven “biography.” The concepts of originality and authenticity, therefore, have little to do with the work of art itself and everything to do with its context and pedigree.
The commonality of an experience, shared by unrelated individuals in precisely the same way, is thought to constitute proof of its veracity and objectivity. Something is assumed to be "out there" if it identically affects the minds of observers. A common experience, it is deduced, imparts information about the world as it is.
But a shared experience may be the exclusive outcome of the idiosyncrasies of the human mind. It may teach us more about the observers' brains and neural processes than about any independent, external "trigger". The information manifested in an experience common to many may pertain to the world, to the observers, or to the interaction between the world and said observers.
Thus, Unidentified Flying Objects (UFOs) have been observed by millions in different parts of the world at different times. Does this "prove" that they exist? No, it does not. This mass experience can be the result of the common wiring of the brains of human beings who respond to stimuli identically (by spotting a UFO). Or it can be some kind of shared psychosis.
Somehow, God seems to get his only sentient creations wrong most of the time: He repeatedly fails to gauge human psychology and invariably ends up being frustrated and enraged at his charges' shortsightedness, self-destructiveness, and disobedience. The Devil does a much better job of catering to the deep narcissistic strains of the human psyche. Satan is much more human than God; he is truly one of us. This abyss between good intentions and abysmal performance has rendered God a rather incompetent overseer of human affairs. Gradually but inexorably, his influence and reputation waned and Man took over – only to fail equally spectacularly.
The demise of the great secular religions - Communism, Fascism, Nazism - led to the resurgence of the classical religions (Islam, Christianity, Judaism, Hinduism), a phenomenon now dubbed "fundamentalism". These ancient thought-systems are all-encompassing, ideological, exclusive, and missionary.
They face the last remaining secular organizing principle - democratic liberalism. Yet, as opposed to the now-defunct non-religious alternatives, liberalism is hard to defeat for the following reasons:
I. It is cyclical and, therefore, sempiternal.
II. Recurrent failure is an integral and welcome phase in its development. Such breakdowns are believed to purge capitalism of its excesses. Additionally, innovation breeds "disruptive technologies" and "creative destruction".
III. Liberalism is not goal-orientated (unless one regards the platitudes about increasing wealth and welfare as "goals").
IV. It is pluralistic and, thus, tolerant and inclusive of other religions and ideologies (as long as they observe the rules of the game).
V. Democratic liberalism is adaptive, assimilative, and flexible. It is a "moving target". It is hard to destroy because it is a chameleon.
The renewed clash between religion and liberalism is likely to result in the emergence of a hybrid: liberal, democratic confessions with clear capitalistic hallmarks.
Governors are recalled in midterm ballot initiatives, presidents deposed through referenda - the voice of the people is increasingly heard above the din of politics as usual. Is this Swiss-like participatory, direct democracy - or nascent mob rule?
The wave of direct involvement of the masses in politics is fostered by a confluence of trends:
1. The emergence of a class of full-time, "professional" politicians who are qualified to do little else and whose personal standing in the community is low. These "politicos" are generally perceived to be incompetent, stupid, hypocritical, dishonest, bigoted, corrupt, and narcissistically self-interested. It is a powerful universal stereotype.
2. Enhanced transparency in all levels of government and growing accountability of politicians, political parties, governments, corporations, and institutions.
3. Wider and faster dissemination of information regarding bad governance, corruption, venality, cronyism, and nepotism. This leads to widespread paranoia among average citizens and distrust of all social institutions and structures.
4. More efficient mechanisms of mobilization (for instance, the Internet).
But is it the end of representative democracy as we know it?
Hopefully it is. "Democracy" has long been hijacked by plutocrats and bureaucrats. In between elections, they rule supreme, virtually unanswerable to the electorate. The same people circulate between the various branches of government, the legislature, the judiciary, and the world of business. This clubbish rendition of the democratic ideals is a travesty and a mockery. People power is the inevitable - though unwelcome - response.
The sentence A "all rabbits are black" is either True or False. It, therefore, has a wave function with two branches or two universes: one in which all rabbits are, indeed, black and one in which, not all rabbits are black (in other words, in which at least one rabbit is white).
It is impossible to prove the sentence "all rabbits are black" - but very easy to falsify or disprove it. It is enough to produce one white rabbit.
The sentence B "some rabbits are black" is, similarly, either True or False. It also has a wave function with two branches or two universes: one in which some rabbits are, indeed, black and one in which no rabbit is black (or, in other words, all rabbits are white).
The worlds described by the two sentences largely intersect. If True, sentence B is partly contained by sentence A, though to what extent we can never know. We can safely say that sentences A and B are asymptotically equivalent or asymptotically identical. In a world with one white rabbit and uncounted trillions of black rabbits, A and B are virtually indistinguishable.
Yet, despite this intersection, this common ground, sentence A reacts entirely differently to syllogistic transformation than sentence B.
Imagine a sentence C: "This is a white rabbit". It FALSIFIES sentence A ("All rabbits are black") but leaves UNAFFECTED sentence B ("Some rabbits are black"). These are diametrically opposed outcomes.
How can two sentences that are so similar react so differently to the same transformation?
Arithmetic, formal logic, and, by extension, mathematics and physics deal with proving identities in equations. Two plus two equals four: the left-hand side of the expression is identical to the right-hand side. That two potentially asymptotically identical sentences (such as A and B above) react so at odds to the same transforming sentence (C) is astounding.
We must, therefore, study the possibility that there is something special, a unique property, an idiosyncrasy, in sentences A, and/or B, and/or C, and/or in their conjunction. If we fail to find such distinguishing marks, we must learn why asymptotically identical sentences react so differently to the same test and what are the implications of this disturbing find.
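The asymmetry described above can be sketched with Python's built-in quantifiers: `all` models the universal sentence A and `any` the existential sentence B. This is only an illustrative toy, with a small sample standing in for "uncounted trillions" of rabbits; the function and variable names are not from the text.

```python
def sentence_a(rabbits):
    """'All rabbits are black' -- a universal claim."""
    return all(colour == "black" for colour in rabbits)

def sentence_b(rabbits):
    """'Some rabbits are black' -- an existential claim."""
    return any(colour == "black" for colour in rabbits)

# A world of overwhelmingly black rabbits plus one white rabbit --
# the white one plays the role of sentence C, "this is a white rabbit".
world = ["black"] * 10 + ["white"]

print(sentence_a(world))  # False -- a single counterexample falsifies A
print(sentence_b(world))  # True  -- B is left entirely unaffected
```

The single white rabbit flips the universal sentence to False while leaving the existential sentence True, which is precisely the "diametrically opposed outcomes" the text remarks upon.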
When we say: "The President is an important person" what exactly do we mean by that? Where does the President derive his importance from? Evidently, he loses a large portion of the quality of being important when he ceases to be the President. We can therefore conclude that one's personal importance is inextricably linked to one's functions and position, past and present.
Similarly, imagine the omnipotent CEO of a mighty Fortune 500 corporation. No doubt he is widely considered to be an important personage. But his importance depends on his performance (on market share gained or lost, for instance). Technological innovation could render products obsolete and cripple formerly thriving enterprises. As the firm withers, so does the importance of its CEO.
Importance, then, is not an absolute trait. It is a derivative of relatedness. In other words, it is an emergent phenomenon that arises out of webs of relationships and networks of interactions. Importance is context-dependent.
Consider the Mayor or Elder of a village in one of the less developed countries. He is clearly not that important and the extent of his influence is limited. But what if the village were to become the sole human habitation left standing following a nuclear holocaust? What if the denizens of said erstwhile inconsequential spot were to be the only survivors of such a conflagration? Clearly, such circumstances would render the Elder or Mayor of the village the most important man on Earth and his function the most coveted and crucial. As the context changes, so does one's importance.
X. Names of Collectives (Sets) versus Names of Individuals
Individuals are members of classes or sets (hereinafter referred to as “collectives”). Names of collectives are fundamentally different from names of individuals:
Individuals cannot own their names, collectives can and strive to possess their names and protect them against incursion and misuse. This is especially true in the case of brand names;
Individuals do not have exclusive names. When they do (tattooed numbers in Auschwitz; prison numbers) such exclusivity tends to be humiliating and dehumanizing. In contrast, collectives aspire to exclusivity on their names, although, in practice the enforcement of such self-imputed exclusivity may be fraught with difficulties (witness the name dispute between Macedonia and Greece). Collectives find name-exclusivity uplifting;
The names of individuals do not reveal the attributes of their bearers or referents, nor do they contain or convey any information regarding the traits or qualities of said bearers. The names of collectives come laden with context and history and, therefore, are infused with data regarding the collective. In a sense, the names of collectives are among their more dominant and prominent attributes. This intimate relationship between names, denotata, and connotata gives rise to stereotypes;
The names of individuals do not define their bearers or referents. The name of a collective is an integral part of its definition. It is impossible to construct a workable definition of a collective without including its name in the definition, whatever its nature (lexical, stipulative, or ostensive);
The name of the individual does not determine the individual. The individual’s name also has nothing to do with his or her traits, attributes, qualities, behavior patterns, and other extensive parameters of the person named. This is different where collectives are concerned: the name of a collective is an important element in the collective’s self-determination and usually the first act on the road to autonomy, independence, and differentiation.
The names of individuals are, ultimately, arbitrary and cannot be defined or explained, though they may possess semantic values. The names of collectives are always contextually “meaningful” and can always be defined;
The names of individuals are largely devoid of emotional content and provoke little or no emotional reaction in the listener. The names of collectives never fail to elicit and provoke emotional reactions;
Finally, individual names are very loosely interwoven with individual identities. In stark contrast, names of collectives are often synonymous with their identities: this is how close the relationship between the two is.
Judaism is the only monotheistic religion which expressly allots a crucial role in its rites and ceremonies to children, their predilections, and their pursuits. Children are positively encouraged and incentivized – often monetarily – to disrupt even the most solemn proceedings with questions (on Passover, during the Seder) or with raucous displays (on Purim, when they sound their rattles at every mention of Haman and other ill-wishers.)
This emphasis is a calculated gambit aimed at securing the loyalties of future generations of Jews even as religious rituals are rendered less “serious” and more ludic in nature. A nation decimated by recurrent culling of its adults was forced to “transfer power” to its youth as a mere survival strategy.
We sneeze in order to expel foreign bodies from the nasal mucosa. Respiratory infections are common causes of sneezing. A typical sneeze releases air that moves at speeds of 60-100 miles (100-160 kilometres) per hour and requires a considerable investment of energy. What are the evolutionary advantages of such apparently wasteful caloric profligacy? After all, a sneeze or a cough one tenth as powerful would suffice to expel foreign bodies.
It seems that Nature wants us to spread pathogens via these convulsive, spasmodic exhalations. It wants us to infect other people who are 1-4 meters away. Such infections serve to weed out the weak, the old, and the disabled and aid and abet in the survival of the fittest. Sneezes and coughs are “designed” to facilitate this process of eugenic elimination.
This mechanism makes even greater sense when we consider that people tended to spend time with their kin with whom they share genetic material. Immune weaknesses and susceptibilities to airborne illnesses run in families. Infecting the entire clan via sneezing and coughing is a great way of exposing and removing corrupt and mutated genes and defunct immune systems.
This material is copyrighted. Free, unrestricted use is allowed on a non-commercial basis.
The author's name and a link to this Website must be incorporated in any reproduction of the material for any use and by any means.
Write to me: email@example.com