Discussion in 'Hard Talk' started by Astroboy, Dec 17, 2007.
YouTube - The Truth about Mental Health Disorders - Psychology
YouTube - Psychology vs. Psychiatry
I am not able to listen to YouTube because my internet speed is 64 kbps. Any solution, other than changing my ISP?
Glad you are adding sound to the fury NamJap ji. Makes it more convincing. Happy New Year -- and will Skype you again.
Click here > Forer effect
Decision-making and behavioral biases
Many of these biases are studied for how they affect belief formation, business decisions, and scientific research.
Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, herd behaviour, and manias.
Base rate fallacy — ignoring available statistical data in favor of particulars
Bias blind spot — the tendency not to compensate for one's own cognitive biases.
Choice-supportive bias — the tendency to remember one's choices as better than they actually were.
Confirmation bias — the tendency to search for or interpret information in a way that confirms one's preconceptions.
Congruence bias — the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
Contrast effect — the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view.
Distinction bias - the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.
Endowment effect — "the fact that people often demand much more to give up an object than they would be willing to pay to acquire it".
Extreme aversion — the tendency to avoid extremes, being more likely to choose an option if it is the intermediate choice.
Focusing effect — a prediction bias that occurs when people place too much importance on one aspect of an event, causing errors in accurately predicting the utility of a future outcome.
Framing — using an overly narrow approach or description of the situation or issue. Also the framing effect — drawing different conclusions based on how data are presented.
Hyperbolic discounting — the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, the closer to the present both payoffs are.
Illusion of control — the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
Information bias — the tendency to seek information even when it cannot affect action.
Irrational escalation — the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
Loss aversion — "the disutility of giving up an object is greater than the utility associated with acquiring it". (see also sunk cost effects and Endowment effect).
Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
Moral credential effect — the tendency of a track record of non-prejudice to increase subsequent prejudice.
Need for closure — the need to reach a verdict in important matters; to have an answer and to escape the feeling of doubt and uncertainty. The personal context (time or social pressure) might increase this bias.
Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
Omission bias — The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
Outcome bias — the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
Planning fallacy — the tendency to underestimate task-completion times.
Post-purchase rationalization — the tendency to persuade oneself through rational argument that a purchase was a good value.
Pseudocertainty effect — the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
Reactance - the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
Selective perception — the tendency for expectations to affect perception.
Status quo bias — the tendency for people to like things to stay relatively the same (see also Loss aversion and Endowment effect).
Unit bias — the tendency to want to finish a given unit of a task or an item, with strong effects on the consumption of food in particular.
Von Restorff effect — the tendency for an item that "stands out like a sore thumb" to be more likely to be remembered than other items.
Zero-risk bias — preference for reducing a small risk to zero over a greater reduction in a larger risk.
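A few of the entries above come down to simple arithmetic. Hyperbolic discounting, for instance, is usually illustrated with the one-parameter model V = A / (1 + kD). The sketch below uses made-up dollar amounts, delays, and a discount rate k chosen purely for illustration; none of these numbers come from the thread.

```python
def hyperbolic_value(amount, delay, k=0.1):
    """Perceived present value of `amount` received after `delay`
    periods, under one-parameter hyperbolic discounting V = A / (1 + k*D)."""
    return amount / (1 + k * delay)

# $50 now vs. $100 in 12 months: the immediate option feels bigger...
now = hyperbolic_value(50, 0)        # 50.0
later = hyperbolic_value(100, 12)    # ~45.45

# ...but push both options 12 months further into the future and the
# preference reverses -- the signature of hyperbolic discounting.
near = hyperbolic_value(50, 12)      # ~22.73
far = hyperbolic_value(100, 24)      # ~29.41
```

The preference reversal (choosing the smaller-sooner reward only when it is immediate) is exactly what the definition above describes: the closer both payoffs are to the present, the stronger the pull of the earlier one.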
Biases in probability and belief
Many of these biases are studied for how they affect business and economic decisions and experimental research.
Ambiguity effect — the avoidance of options for which missing information makes the probability seem "unknown".
Anchoring — the tendency to rely too heavily, or "anchor," on a past reference or on one trait or piece of information when making decisions.
Attentional bias — neglect of relevant data when making judgments of a correlation or association.
Availability heuristic — estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples.
Availability cascade - a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true").
Clustering illusion — the tendency to see patterns where actually none exist.
Conjunction fallacy — the tendency to assume that specific conditions are more probable than general ones.
Gambler's fallacy — the tendency to assume that individual random events are influenced by previous random events. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads."
Hawthorne effect — refers to a phenomenon which is thought to occur when people observed during a research study temporarily change their behavior or performance (this can also be referred to as demand characteristics).
Hindsight bias — sometimes called the "I-knew-it-all-along" effect, the inclination to see past events as being predictable.
Illusory correlation — beliefs that inaccurately suppose a relationship between a certain type of action and an effect.
Ludic fallacy — the analysis of chance-related problems within the narrow frame of games, ignoring the complexity of reality and the non-Gaussian distribution of many real-world phenomena.
Neglect of prior base rates effect — the tendency to neglect known odds when reevaluating odds in light of weak evidence.
Observer-expectancy effect — when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
Optimism bias — the systematic tendency to be over-optimistic about the outcome of planned actions.
Overconfidence effect — the tendency to overestimate one's own abilities.
Positive outcome bias — the tendency in prediction to overestimate the probability of good things happening to oneself (see also wishful thinking, optimism bias, and valence effect).
Primacy effect — the tendency to weigh initial events more than subsequent events.
Recency effect — the tendency to weigh recent events more than earlier events (see also peak-end rule).
Regression toward the mean, disregarded — the tendency to expect extreme performance to continue rather than revert toward the average.
Reminiscence bump — the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
Rosy retrospection — the tendency to rate past events more positively than one actually rated them when the events occurred.
Stereotyping — expecting a member of a group to have certain characteristics without having actual information about that individual.
Subadditivity effect — the tendency to judge probability of the whole to be less than the probabilities of the parts.
Telescoping effect — the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.
Texas sharpshooter fallacy — the fallacy of selecting or adjusting a hypothesis after the data is collected, making it impossible to test the hypothesis fairly.
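The base-rate entries in the list above (base rate fallacy, neglect of prior base rates) reduce to Bayes' theorem. A small worked example with illustrative numbers — a test that is 99% sensitive, with a 5% false-positive rate, for a condition with 1% prevalence; the figures are the standard textbook kind, not from this thread:

```python
def posterior(prevalence, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' theorem."""
    true_pos = prevalence * sensitivity              # P(positive and has it)
    false_pos = (1 - prevalence) * false_positive_rate  # P(positive and doesn't)
    return true_pos / (true_pos + false_pos)

p = posterior(0.01, 0.99, 0.05)
# Despite the "99% accurate" test, a positive result means only about
# a 1-in-6 chance of actually having the condition -- the low base
# rate dominates, which is precisely what the base rate fallacy ignores.
print(round(p, 3))  # 0.167
```

Neglecting the 1% prevalence and reading the answer as "99%" is the fallacy the list describes.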
Social biases
Most of these biases are labeled as attributional biases.
Actor-observer bias — the tendency for explanations of other individuals' behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation (see also fundamental attribution error). However, this is coupled with the opposite tendency for the self in that explanations for our own behaviors overemphasize the influence of our situation and underemphasize the influence of our own personality.
Dunning-Kruger effect — "...when people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it. Instead, ...they are left with the mistaken impression that they are doing just fine."(see also Lake Wobegon effect, and overconfidence effect).
Egocentric bias — occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would.
Forer effect (aka Barnum effect) — the tendency to give high accuracy ratings to descriptions of one's personality that supposedly are tailored specifically to oneself, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
False consensus effect — the tendency for people to overestimate the degree to which others agree with them.
Fundamental attribution error — the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).
Halo effect — the tendency for a person's positive or negative traits to "spill over" from one area of their personality to another in others' perceptions of them (see also physical attractiveness stereotype).
Herd instinct — Common tendency to adopt the opinions and follow the behaviors of the majority to feel safer and to avoid conflict.
Illusion of asymmetric insight — people perceive their knowledge of their peers to surpass their peers' knowledge of them.
Illusion of transparency — people overestimate others' ability to know them, and they also overestimate their ability to know others.
Ingroup bias — the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
Just-world phenomenon — the tendency for people to believe that the world is "just" and therefore people "get what they deserve."
Lake Wobegon effect — the human tendency to report flattering beliefs about oneself and believe that one is above average (see also worse-than-average effect, and overconfidence effect).
Notational bias — a form of cultural bias in which a notation induces the appearance of a nonexistent natural law.
Outgroup homogeneity bias — individuals see members of their own group as being relatively more varied than members of other groups.
Projection bias — the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
Self-serving bias — the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).
Self-fulfilling prophecy — the tendency to engage in behaviors that elicit results which will (consciously or not) confirm our beliefs.
System justification — the tendency to defend and bolster the status quo, i.e. existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual and collective self-interest.
Trait ascription bias — the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.
This is a good entry, Namjap, because we can apply this information to ourselves, and in situations where another's decision is causing problems for a group. It is a nice, calm way to say: it looks to me as if you are not using all of the information available (framing). Why not look at this fact along with the others and see if you come to the same conclusion?
Rather than getting angry and storming off in a huff. Very good. We can use this here.
"Most of us miss out on life's big prizes. The Pulitzer. The Nobel.
Oscars. Tonys. Emmys. But we're all eligible for life's small pleasures. A pat on the back. A kiss behind the ear. A four-pound bass. A full moon. An empty parking space. A crackling fire. A great meal. A glorious sunset. Hot soup. Cold beer."
— Namrata (a friend of mine, who sent this mail)
YouTube - Rumi read by Coleman Barks
Psychology of Convincing Someone
Convincing someone to believe what you say is no more than convincing that someone either to accept a new idea or to update his knowledge or belief about an existing one. Not all people will accept your ideas to the same degree; however, there are rules that every person is subject to, and if used correctly they will increase your chances of convincing other people to believe what you say.
If the person you want to convince already has prior knowledge or experience of what you're trying to convince him of, then your primary goal is to shake his beliefs and prove him wrong, and only then present your own idea. If the person does not have a previous idea about the subject, you can start by presenting your own view right away.
Why Can't I Convince Other People?
Before learning how to convince someone to believe in something or to accept your idea, you should first know the reasons that generally make people oppose ideas:
Belief Conflict: If one of your friends told you that the earth does not in fact orbit the sun, what would be your response? There would be no way you could believe him since you already know that all eight planets orbit the sun, something which you have seen proven time and time again. You already have an opposing belief so the first obstacle facing your friend when trying to convince you is your own belief system.
Knowledge: The greater a person's knowledge about something, the harder it will be to convince him of something different. What do you think would happen if you tried to convince an astronomy professor that the sun is only 1,000 km away from the earth? He'll never believe you, because he already has deep knowledge of the subject and may have proven it scientifically himself. Thus the second obstacle to convincing people is their level of knowledge about what you're going to say. As you may have noticed, the first two obstacles (prior belief and knowledge) can be grouped under one thing: having another belief that contradicts yours.
Skeptics: Skeptics are people who doubt almost everything and everyone. They never accept anything unless they are truly sure of it. If you are dealing with a skeptical person, this will add further difficulty.
How to Convince Someone to Believe in Something
Based on the previous obstacles, we can come up with counter-techniques that can greatly increase the probability of success when convincing someone. Those techniques are:
Shaking His Existing Belief: The more assertive and confident you are while talking about your idea, the higher the chance of shaking the other person's belief about it (provided he does not have much knowledge of the subject). Speak confidently, use confident body language and gestures, and use a confident tone of voice, and you will find that the other person may start to doubt his own idea.
Undermine His Knowledge Base: Even if you are confident while talking, the other person's knowledge base can act as a barrier to your ideas. That's why convincing him that you know more than he does is more important than trying to convince him of your idea itself: if you manage to convince him that you know more, you become a trusted source for his subconscious mind, and it becomes much easier to program him (see subconscious mind programming for more information on this). You don't need magic to do this; you just have to be ready with proper documentation and clues. The clearer your evidence, the more you will be able to undermine his knowledge base and so convince him to see your point of view. (See the power of knowledge in negotiations.)
Provide Proof for the Skeptic: Contrary to common belief, skeptics can be made to believe in something new, provided you have clear evidence for your idea. The more clues you can provide to strengthen your argument, the less skeptical the other person will be and the more easily he will be convinced.
Program His Subconscious Mind: The subconscious mind can be programmed by repetition: the more a statement is repeated, the more it can shake an existing belief, provided that either the conscious mind is absent or the source of the idea is trusted. For more information on programming someone's mind, check out this guide. You can even program someone into falling in love with you; in my book, How to Make Someone Fall in Love with You, I pointed out how repeating certain words or phrases can result in making someone fall in love with you. It's no magic: beliefs are formed by repetition, and if you manage to repeat a certain belief enough times, the other person may actually start to believe it too.
Believing in Your Idea: Have you noticed that when a person really believes in an idea, he usually brings it to light? The entrepreneur who always believed his idea was worthwhile usually succeeds in building a very good business. The more you believe in your idea, the more confident and, most importantly, the more convincing you will be when talking about it.
How to Convince Someone to Believe in Anything | 2KnowMySelf
Programme 3 - Guilt and Shame
Claudia Hammond experiences the most uncomfortable emotion of all - Guilt. It’s what makes us do the right thing, and it’s what signals to us whether a person is trustworthy. But if guilt is the emotion that lubricates society, does its close relative, shame, do anything more than titillate society? Why do we so avidly scan newspapers for stories of another's shame? Is it simply to revel in the fact that it's not us on the front page?
Whereas we feel guilt for something we do, we feel shame for what we are - which is why the most commonly quoted reaction to it is: 'I simply wanted to disappear'.
Listen again to Programme 3
Programme 4 - Sadness
Why do we cry? Is it simply to communicate how we’re feeling, or does a good cry actually shed more than just tears? Professor William Frey argues that tears are a way of eliminating waste products from our body, which is why getting through a box of tissues while watching a weepie on TV on Sunday afternoons leaves us feeling a lot better!
Listen again to Programme 4
Programme 5 - Jealousy
The Emotional Rollercoaster takes its final, rather uncomfortable ride when Claudia Hammond meets the green-eyed monster - the emotion that eats you up from the inside - Jealousy. What is the evolutionary purpose of a feeling which has frequently led to murder and ruined lives? And could its close relative, Envy, be responsible for such social advances as trade unionism and the women's movement?
Listen again to Programme 5
BBC - Radio 4 - Emotional Rollercoaster - Series 1
Just sharing my views in this context, namjap ji
When we discuss emotions, you will find their basis in the following:
Ego, anger, greed, attachment, and lust. Every person has them, but to different degrees. All the reactions we notice arise from them, and how strong a reaction can be also depends on the level of these five in a given individual. Interestingly, they are also related to each other: for example, the level of ego decides the level of passion; however, passion, if fortified, usually negates ego and anger. Very interesting, isn't it? In psychology they go by different terms, but the substance remains the same. Many call the presence of these a measure of being human; in its extended form, the lack of such feelings, in the context of the civility desired in a society, also marks insanity. Above all, these five grow to significantly different levels in different environments. Sometimes psychologists, for the well-being of the individual, urge people not to bottle up such feelings because of their side effects; experiments in this sphere support the idea of catharsis. The spiritual path moulds the mind so that it does not become a slave to these five to the point where control over the mind is lost. Since these five exist at different levels for many reasons, people are taught to steer away from their intensity without using suppression techniques. That is why, in the spiritual field, all things that attract these five are graded as "Maya" and considered worthless in the pursuit of enlightenment.
YouTube - Conversational Hypnosis in Text Messages
Let me re-introduce myself in the same way as the speaker in this video. OK
I have been an educational psychologist since 1974, holding a Ph.D. in that field for 25 years, with significant formal training in learning and cognition (consciousness and altered states of consciousness, where hypnotic trance fits).
What do many years of controlled scientific study into hypnosis tell us? 1) Suggestibility and 2) depth of trance are two independent characteristics of hypnosis that vary widely from person to person. Notice that his method of verifying whether his audience "scratched" is to call them up and ask them. Then we are supposed to believe it because he said so?
Did anyone fall for this and check out his web site? Because that was the actual purpose of the video. Not to have you scratch your nose, but to have you drop a few dollars.
This individual Stephen Jones is a huckster and is talking rubbish.
In the '70s, psychodrama was one of the fads that was popular for a short time as a part of treatment. Any idea about its origin? I had a student in Brazil, whom I was teaching English, who called himself an expert in it.
Psychodrama is one of several therapies used in combination with the traditional talk therapy of psychoanalysis or the more modern psychodynamic therapies.
In psychodrama, an individual or group develops an ongoing story (a kind of play) that acts out a personal story and expresses very personal emotions through movement and action. The drama can be videotaped and then discussed between therapist and patient; the patient can discuss the experience through journaling; or the drama can be a topic for discussion in group therapy. The objective is to be able to assess one's feelings and perceptions as they define the emotional and interpersonal problems that hamper the individual.
Similar related therapies would include art therapy, music therapy, or dance therapy, just to name a few.
The real possibility for abuse with psychodrama is its use by non-certified practitioners. Used properly, it is combined with talk therapy under the care of a board-certified and licensed psychologist or clinical social worker with credentials in clinical psychology. Alternatively, a psychiatrist may collaborate with a licensed psychologist or clinical social worker trained in psychodrama.
The method is subject to abuse by people who are unlicensed, or who are licensed but are on ego-trips. It can be over-used or misused with a psychological condition that is not really a good candidate for psychodrama. And it can be misused as a panacea for all emotional/psychological problems when a particular client or patient needs a completely different type of treatment. For example, asking a paranoid schizophrenic to engage in psychodrama during a frank psychotic episode could be dangerous for that person and the world around him/her because the method could feed symptoms, whereas a change in medication could basically wipe the symptoms out.
Today everyone is an expert on everything. I expect that most of what I say will fall on deaf ears. So thanks for asking. It is important always to be cautious when thinking of modifying the thoughts and emotions of another human being.
Here is a link that is pretty concise and authoritative. It is from Psychodrama.org.uk
What is Psychodrama
Is Psychology Bullshit? Unlocking You 3, Mind Control Report Video