Thursday, August 13, 2009

The stage theories: are they all fiction?

I normally do not like to thrash articles or opinion pieces, but this article by Michael Shermer in Scientific American has to be dealt with, as it is masquerading as an authoritative debunking by one of the foremost skeptics in one of the most respected magazines. Yet it is low on science and facts and leans more towards opinion, bias and prejudice.

Shermer, judging from the article, seems to be generally antagonistic to stage theories, as he thinks they are mere narratives and not science. His method of discrediting stage theories is to lump all of them together (from Freud's theories to Kohlberg's), and then, by picking on one of them (the Kübler-Ross stages of grief), to try to discredit them all. This is a little surprising. While I too believe (and it is one of the prime themes of this blog) that most stage theories have something in common and follow a general pattern, I would be reluctant to club developmental stage theories, whose stages unfold while the child is growing, with stage theories like the stages of grief, in which no physical development is concurrent with the stage process; rather, the stages occur in adults who have faced a particular situation and are trying to cope with it. In the former case the children are growing, their brains are maturing, and there is a very real substrate that could give rise to distinctive stages; in the latter case the stages may be tied not so much to the development of neural tissue as to its plasticity. The question in the latter case would be: does the brain adapt to losses, like catastrophic news or the death of a loved one, by reorganizing a bit, and does that reorganization happen in phases or stages? The two issues of childhood development and adult plasticity are related, but they may be different too. With adult neurogenesis now becoming prominent, I won't be surprised if we find neural mechanisms for some of these adult stages too, like the stages of grief, but I would still keep the issues separate.

Second, assuming Shermer is right that at least the stage theory of grief, as proposed by Kübler-Ross, is incorrect, and also that it can be clubbed with the other stage theories, would it be proper to conclude that all stage theories are incorrect because one of them is false? It would be as if someone proposed a modular architecture of mind, different modules were proposed accordingly, and one of the proposed modules did not stand the scrutiny of time (let's say a module for golf-playing was not found in the brain); does that mean that all theories holding that the brain is organized modularly for at least some functions are wrong, and that all the other modules are proved non-existent? Maybe the grief-stages theory is wrong, but how can one generalize from that to all developmental stage theories, many of which have been validated extensively (like Piaget's), and go on a general rant against all things 'stages'?

Next, let me address another fallacy that Shermer commits: the causal analogy fallacy, the assumption that if two things are analogous then one must be causing the other, when in fact no directional inference can be drawn from the analogy. He asserts that humans are pattern-seeking, story-telling primates who like to explain away their experiences with stories or narratives, especially as these provide structure over unpredictable and chaotic happenings. Now, I am with Shermer up to this point, and this has been my thesis too; but then he takes a leap and says that this is the reason we come up with stage theories. Why 'stage' theories? Why not just theories? Any theory, in as much as it is an attempt to provide a framework for understanding and explication, is a potential narrative, and anyone who tries to come up with a theory is guilty of story-telling by extension. The leap he is making here is the assumption that story-telling is a 'stage' process and that a typical story follows a pattern: the unfolding of the plot in distinct stages.

Now, I agree with the leap Shermer is making here: a narrative is not just any continuous thread of yarn that the author spins; it normally involves discrete stages, and though I have not touched on this before, Christopher Booker's work delineating the basic story plots also deals with the five-stage unfolding of the plot in each of them. So I am not contesting that story-telling is basically a stage process, with distinct stages through which the protagonist passes or distinct stages of plot development; what I am contesting is the direction of causality. Is it because we have evidence of distinct stages in the lives of individuals, and, in general, evidence for the eight-fold or five-fold stages of development of various faculties, that our stories reflect distinct stages as they unfold and the monomyth has a distinct stage structure? Or is it because our stories have structures in the form of stages that the theories we develop also have stages? I believe that some theorizing in terms of stages may indeed be driven by our desire to compartmentalize everything into the eight or so basic stages and environmental adaptive problems we have encountered repeatedly, and which have become part of our mythical narrative structure; but, most parsimoniously, our mythical narrative structure is stage-bound because we have observed regularities in our development and life that can only be explained by resorting to discrete stages rather than to a concept of continuous incremental improvement/development/unfolding.

Before moving on, let me give a brief example of the power of stage theories and how they can be traced to neural mechanisms, jumping from the very macro phenomena I have been discussing to the very micro phenomenon of perception. Consider the visuomotor development of a child. Early in life there is a stage when oculomotor control is due mostly to subcortical regions like the superior colliculus, and the higher cortical regions are not much involved (they are not yet sufficiently developed/myelinated). The retina of the eye is such that the foveal region is underdeveloped; this combination means that infants are very good at orienting their eyes to moving targets in their peripheral vision, but poor at colour and form discrimination. Also, they can perform saccades first, the capability to make anti-saccades develops next, and the capacity to make smooth-pursuit movements comes later. There are distinct stages of oculomotor control that a child moves through, and this definitely affects its perception of the world (for example, one can discriminate based on form first and colour later, as the visual striate areas for these mature in that order). In short, there are strong anatomical, physiological and psychological substrates for most of the developmental stage theories.

Now let me address why Shermer, whom I normally admire, has taken this perverse position. It is because his Skeptic magazine recently published an article by Russell P. Friedman, executive director of the Grief Recovery Institute in Sherman Oaks, Calif. (www.grief-recovery.com), and John W. James, co-author of The Grief Recovery Handbook (HarperCollins, 1998), which tried to debunk an article published in JAMA that found support for the five-stage grief theory. That Skeptic article received a well-deserved thrashing from some reputed blogs (see this World of Psychology post that exposes many of the holes in Friedman and James' argument), so possibly, out of desperation, Shermer thought: why not settle the scores and expose all stage theories as pseudoscience? Unfortunately he fails miserably in defending his publication, and we have seen above why!

Now, let us come to the meat of the controversy: the Kübler-Ross stages-of-grief theory, for which the Yale group found evidence, and which the Skeptics didn't like and found worth criticizing. I have read both the original JAMA paper and the Skeptic article and see some merit on both sides. In fact, I even agree to an extent with the stance that Friedman et al. have taken, especially their decoupling of the stages of grief from the stages of a dying person/stages of adjustment to catastrophic news. Some excerpts:

IN 1969 THE PSYCHIATRIST ELIZABETH KÜBLER-ROSS wrote one of the most influential books in the history of psychology, On Death and Dying. It exposed the heartless treatment of terminally-ill patients prevalent at the time. On the positive side, it altered the care and treatment of dying people. On the negative side, it postulated the now-infamous five stages of dying—Denial, Anger, Bargaining, Depression, and Acceptance (DABDA), so annealed in culture that most people can recite them by heart. The stages allegedly represent what a dying person might experience upon learning he or she had a terminal illness. “Might” is the operative word, because Kübler-Ross repeatedly stipulated that a dying person might not go through all five stages, nor would they necessarily go through them in sequence. It would be reasonable to ask: if these conditions are this arbitrary, can they truly be called stages?

Many people have contested the validity of the stages of dying, but here we are more concerned with the supposed stages of grief which derived from the stages of dying.

During the 1970s, the DABDA model of stages of dying morphed into stages of grief, mostly because of their prominence in college-level sociology and psychology courses. The fact that Kübler-Ross’ theory of stages was specific to dying became obscured.

Prior to publication of her famous book, Kübler-Ross hypothesized the Five Stages of Receiving Catastrophic News, but in the text she renamed them the Five Stages of Dying or Five Stages of Death. That led to the later, improper shift to stages of grief. Had she stuck with the phrase catastrophic news, perhaps the mythology of stages wouldn’t have emerged and grievers wouldn’t be encouraged to try to fit their emotions into non-existent stages.


I wholeheartedly concur with the authors that it is not good to confuse the stages a dying person may go through on receiving catastrophic news of terminal illness with the grief stages that may follow once one has learned of a loss and is coping with it (the death of someone, the divorce of parents, etc.). In the first case the event of concern lies in the future, which would lead to different tactics than in the latter case, where the event has already occurred. Thus, as rightly pointed out by the authors, denial may make sense for dying people ('the diagnosis is incorrect, I am not going to die; I have no serious disease'); denial may not make sense for the loss of a loved one by death, as the event has already happened, and only a very disturbed person, unable to cope, would deny the factuality of the event (the death). But this is a lame point; in grief (equated with the loss of a loved one), the first stage can rightly be characterized as disbelief/dissociation/isolation, whereby one actively avoids all thoughts of the loved one's non-existence and comes up with feelings like 'I still cannot believe that my mother is no longer alive'. Similarly, my personal view is that while anger and an energetic search for alternatives may be the second-stage response to a catastrophic prospective forecast, the second-stage response to catastrophic news (news of the loss of a loved one) would be characterized more by an energized yearning for the lost one and an anger towards the unavoidable circumstances, and the world in general, that led to the loss.

The third stage is particularly problematic. For dying people it makes perfect sense to negotiate and bargain, as the event has not yet happened ('I'll stop sinning; take away the cancer'); but, as rightly pointed out by the authors, it doesn't make sense for events that have already happened. While many authoritative people have substituted yearning for the third stage in the case of grief, I would propose that we replace it with regret or guilt. I know this would be controversial, but the idea is a bargaining over past events, like 'God, why didn't you take my life instead of my young son's?'; it doesn't make sense, but it is a normal stage of grieving: looking for and desiring alternative bad outcomes ('I wish I were dead instead of him'). The other two stages, depression and acceptance, do not pose as many problems, so I'll leave them for now; suffice it to say that becoming depressed/disorganized and then recovering/becoming reorganized are normal stages that one would be expected to go through.

Let me now return to their criticism of Kübler-Ross. They first attack her, saying her evidence was anecdotal and based on personal feelings; then, instead of correcting this gross error by themselves providing statistical and methodological research results, they present anecdotal evidence based on their having helped thousands of grieving persons.

Second, they claim that these stage-based theories cause much harm; but I am not able to understand why a stage-based theory must cause harm, and, for all their good intentions, I think they are seriously confused here. On the one hand they claim (for example, in the depression section) that stages lead to complacency:

It is normal for grievers to experience a lowered level of emotional and physical energy, which is neither clinical depression nor a stage. But when people believe depression is a stage that defines their sad feelings, they become trapped by the belief that after the passage of some time the stage will magically end. While waiting for the depression to lift, they take no actions that might help them.


and on the other hand they claim that labeling something causes overreaction and overtreatment:

When medical or psychological professionals hear grievers diagnose themselves as depressed, they often reflexively confirm that diagnosis and prescribe treatment with psychotropic drugs. The pharmaceutical companies which manufacture those drugs have a vested interest in sustaining the idea that grief-related depression is clinical, so their marketing supports the continuation of that belief. The question of drug treatment for grief was addressed in the National Comorbidity Survey (published in the Archives of General Psychiatry, Vol. 64, April 2007). “Criteria For Depression Are Too Broad Researchers Say—Guidelines May Encompass Many Who Are Just Sad.” That headline trumpeted the survey’s results, which observed more than 8,000 subjects and revealed that as many as 25% of grieving people diagnosed as depressed and placed on antidepressant drugs, are not clinically depressed. The study indicated they would benefit far more from supportive therapies that could keep them from developing full-blown depression.

Now, I am not clear what the problem is: is it complacency, or too much concern and overtreatment? They keep repeating and hammering down this one argument: that stages do harm because they make people complacent that things will get better on their own and that no treatment is needed. I don't think that is a valid assumption. We all know that many things, like language, develop on their own, yet there are critical periods when interventions are necessary for proper language to develop; so too with grieving people: they will eventually recover, but they still need the support of friends and family, and all interventions, despite this being 'just a phase'. I don't think saying that something will statistically go away within a certain time period eases the effects one is feeling right now. An analogy may help. It is statistically true that, on average, within six months a person will get over his most recent breakup and perhaps start flirting again; that doesn't subtract from the hopelessness and feelings of futility he feels in the days just following the breakup, and most friends and family do provide support even though they know the phase will pass. The same is true for the stages of grief, and the concerns of the authors are ill-founded.

The concern of the authors that I did feel sympathetic to, though, was the stage concept being overused in therapy, with feelings like guilt being inadvertently implanted in clients by therapists.

Grieving parents who have had a troubled child commit suicide after years of therapy and drug and alcohol rehab, are often told, “You shouldn’t feel guilty, you did everything possible.” The problem is that they weren’t feeling guilty, they were probably feeling devastated and overwhelmed, among other feelings. Planting the word guilt on them, like planting any of the stage words, induces them to feel what others suggest. Tragically, those ideas keep them stuck and limit their access to more helpful ideas about dealing with their broken hearts.

Therapists have to be really careful here and not be guided by pre-existing notions of how the patient is feeling. They should listen to the client and, when in doubt, ask questions rather than implicitly suggest and assume things. That indeed is a real danger.

Lastly, the criticism regarding stages/common traits versus individual differences and uniqueness has to be dealt with. The claim that each person grieves uniquely is not a novel claim, nor do I find it lacking in evidence; it is tautological. But still, some common patterns can be elucidated and subsumed under stages. These stages are the 'normal' stages, with enough room for individual aberration. I think there has to be more tolerance and acceptance of the 'abnormal' in general: if someone directly accepts a loss and never feels any denial, he too is abnormal, but one we readily accept as a resilient person; the one who gets stuck at denial has to be shown greater care and hand-held through the remaining stages to come to acceptance.

In the end I would like to briefly touch on the Yale study that reignited this controversy. Here is the abstract of "An Empirical Examination of the Stage Theory of Grief" by Paul K. Maciejewski, PhD; Baohui Zhang, MS; Susan D. Block, MD; and Holly G. Prigerson, PhD.


Context The stage theory of grief remains a widely accepted model of bereavement adjustment still taught in medical schools, espoused by physicians, and applied in diverse contexts. Nevertheless, the stage theory of grief has previously not been tested empirically.

Objective To examine the relative magnitudes and patterns of change over time postloss of 5 grief indicators for consistency with the stage theory of grief.

Design, Setting, and Participants Longitudinal cohort study (Yale Bereavement Study) of 233 bereaved individuals living in Connecticut, with data collected between January 2000 and January 2003.

Main Outcome Measures Five rater-administered items assessing disbelief, yearning, anger, depression, and acceptance of the death from 1 to 24 months postloss.

Results Counter to stage theory, disbelief was not the initial, dominant grief indicator. Acceptance was the most frequently endorsed item and yearning was the dominant negative grief indicator from 1 to 24 months postloss. In models that take into account the rise and fall of psychological responses, once rescaled, disbelief decreased from an initial high at 1 month postloss, yearning peaked at 4 months postloss, anger peaked at 5 months postloss, and depression peaked at 6 months postloss. Acceptance increased throughout the study observation period. The 5 grief indicators achieved their respective maximum values in the sequence (disbelief, yearning, anger, depression, and acceptance) predicted by the stage theory of grief.

Conclusions Identification of the normal stages of grief following a death from natural causes enhances understanding of how the average person cognitively and emotionally processes the loss of a family member. Given that the negative grief indicators all peak within approximately 6 months postloss, those who score high on these indicators beyond 6 months postloss might benefit from further evaluation.


I believe they have been very honest with their data and analysis. They found peaks of denial, yearning, anger, depression and acceptance, in that order. I believe they could have clubbed anger and yearning together as the second stage, since this study dealt with stages of grief and not stages of dying, and should have introduced a new measure of regret/guilt; I predict that this new factor's peak would lie between the anger/yearning peak and the depression peak.
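As a rough illustration, the peak times reported in the JAMA abstract quoted above can be checked against the sequence predicted by stage theory. The numbers come from the abstract; the code itself is only my sketch, not the authors' analysis (acceptance rose throughout the 24-month window, so I treat the end of observation as its "peak"):

```python
# Months post-loss at which each rescaled grief indicator peaked,
# as reported in the JAMA abstract (Maciejewski et al., 2007).
peak_month = {
    "disbelief": 1,    # highest at 1 month, then declined
    "yearning": 4,
    "anger": 5,
    "depression": 6,
    "acceptance": 24,  # increased throughout the study period
}

# Sequence predicted by the stage theory of grief.
predicted_sequence = ["disbelief", "yearning", "anger", "depression", "acceptance"]

# Sort the indicators by when they peaked and compare.
observed_sequence = sorted(peak_month, key=peak_month.get)
print(observed_sequence == predicted_sequence)  # True: peaks follow the predicted order
```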





Thus, to summarize, my own theories of grief and dying (in the eight basic adaptive problems framework) are:

Stage theory of dying (same as Kübler-Ross):
  1. Denial: avoiding the predator; as the predator (death) cannot be avoided, it is denied!
  2. Anger/Searching: searching for resources; an energetic (and thus partly angry) effort to find a solution to the looming death; belief in pseudo-remedies, etc.
  3. Bargaining/Negotiating: forming alliances and friendships; making a pact with the devil... or with God... that 'just spare me this time and I will do whatever you want in future'.
  4. Depression: parental investment/bearing kids analogy; is it worth living/bringing more people into this world?
  5. Acceptance: helping kin analogy; humanity is myself: even if I die, I live on via others.

Stage theory of grief (any loss, especially the loss of a loved one):
  1. Disbelief: avoiding the predator (the loss). I can't believe the loss happened; let me not think about it.
  2. Anger/Yearning: energetic search for resources (reasons). Why did it happen to me? Can the memories and yearning substitute for the loved one?
  3. Bargaining/Regret/Guilt: forming alliances and friendships. Could this catastrophe be exchanged for another? Could I have died instead of him?
  4. Depression: parental investment/bearing kids analogy. Is it worth living/bringing more people into this world?
  5. Acceptance: helping kin analogy. Maybe I can substitute for the lost one with other significant others? Maybe I should be thankful that other significant persons are still there and only one loss has occurred.

Do let me know your thoughts on this issue. Being a researcher in the stages paradigm, I was obviously infuriated on seeing the Shermer article; others may have more balanced views. Do let me know via comments or email!

ResearchBlogging.org

Maciejewski, P. K., Zhang, B., Block, S. D., & Prigerson, H. G. (2007). An empirical examination of the stage theory of grief. JAMA, 297(7), 716-723.



The Certainty Bias [The Frontal Cortex]


Over at Mind Matters, I've got an interview with Dr. Robert Burton on the danger of certainty and its relevance during a presidential election:



LEHRER: To what extent does the certainty bias come into play during a presidential election? It seems like we all turn into such partisan hacks every four years, completely certain that our side is right.

BURTON: The present presidential debates and associated media commentary feel like laboratory confirmation that the involuntary feeling of certainty plays a greater role in decision-making than conscious contemplation and reason.



I suspect that retreat into absolute ideologies is accentuated during periods of confusion, lack of governmental direction, economic chaos and information overload. At bottom, we are pattern recognizers who seek escape from ambiguity and indecision. If a major brain function is to maintain mental homeostasis, it is understandable how stances of certainty can counteract anxiety and apprehension. Even though I know better, I find myself somewhat reassured (albeit temporarily) by absolute comments such as, 'the stock market always recovers,' even when I realize that this may be only wishful thinking.



Sadly, my cynical side also suspects that political advisors use this knowledge of the biology of certainty to actively manipulate public opinion. Nuance is abandoned in favor of absolutes.



Why are people so eager for certainty? I think part of the answer is revealed in an interesting Science paper by Colin Camerer and colleagues. His experiment revolved around a decision making game known as the Ellsberg paradox. Camerer imaged the brains of people while they placed bets on whether the next card drawn from a deck of twenty cards would be red or black. At first, the players were told how many red cards and black cards were in the deck, so that they could calculate the probability of the next card being a certain color. The next gamble was trickier: subjects were only told the total number of cards in the deck. They had no idea how many red or black cards the deck contained.



The first gamble corresponds to the theoretical ideal of economics: investors face a set of known risks, and are able to make a decision based upon a few simple mathematical calculations. We know what we don't know, and can easily compensate for our uncertainty. As expected, this wager led to the 'rational' parts of the brain becoming active, as subjects simply computed the odds. Unfortunately, this isn't how the real world works. In reality, our gambles are clouded by ignorance and ambiguity; we know something about what might happen, but not very much. When Camerer played this more realistic gambling game, the subjects' brains reacted very differently. With less information to go on, the players exhibited substantially more activity in the amygdala and in the orbitofrontal cortex, which is believed to modulate activity in the amygdala. In other words, we filled in the gaps of our knowledge with fear.
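The two gambles can be sketched in a few lines of code. This is only an illustrative model, not Camerer's actual task or analysis: the function names and the ambiguity-aversion parameter `alpha` are my own assumptions, following the common device of shading an ambiguous 50/50 belief toward the worst case:

```python
def known_deck_value(n_red, n_black, bet="red"):
    """Risk: the deck composition is known, so the win probability
    of a 1-unit bet is directly computable."""
    total = n_red + n_black
    p_win = n_red / total if bet == "red" else n_black / total
    return p_win  # expected value of a payoff of 1 if the bet wins

def ambiguous_deck_value(alpha=0.3):
    """Ambiguity: the composition is unknown. A standard way to model
    the observed aversion is to shade the symmetric 0.5 belief toward
    the worst case by a factor alpha (alpha=0 recovers risk neutrality)."""
    p_symmetric = 0.5  # the 'rational' belief absent any information
    p_worst = 0.0      # worst-case composition for the chosen colour
    return (1 - alpha) * p_symmetric + alpha * p_worst

print(known_deck_value(10, 10))  # 0.5: a fair, fully specified bet
print(ambiguous_deck_value())    # 0.35: the same bet feels worse under ambiguity
```

The point of the sketch is that both bets are objectively equivalent when the deck is symmetric, yet the ambiguous one is valued lower once the fear-driven worst-case weighting kicks in.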



I'd argue that it's this subtle stab of fear that creates our bias for certainty. Not knowing makes us uneasy, and we always try to minimize such negative feelings. As a result, we pretend that we have better intelligence about Iraqi WMD than we actually do, or we make believe that the subprime debt being bought and sold on Wall Street is really safe. In other words, we selectively interpret the facts until the uncertainty is removed.



Camerer also tested patients with lesioned orbitofrontal cortices. (These patients are unable to generate and detect emotions.) Sure enough, because these patients couldn't feel fear, their brains treated both decks equally. Their amygdalas weren't excited by ambiguity, and didn't lead them astray. Because of their debilitating brain injury, these patients behaved perfectly rationally. They exhibited no bias for certainty.



Obviously, it's difficult to reduce something as amorphous as 'uncertainty' to a few isolated brain regions. But I think Camerer is right to argue that his 'data suggests a general neural circuit responding to degrees of uncertainty, contrary to decision theory.'


The Only Two Secrets to Motivating Yourself You’ll Ever Need


I’ve written about motivation a bunch of times before here on Zen Habits, but the more I learn about it, the more I realize that motivation isn’t that complicated.


Sure, there are numerous tips that can help, numerous tactics and strategies I’ve used with success. But it really all boils down to two things.


And those two things are so deceptively simple that you might decide to stop reading after I name them: 1) make things enjoyable and 2) use positive public pressure. But read on for more on how to use those two things to motivate yourself for any goal.


It’s Motivation, Not Discipline


First let’s back up a little bit. A number of readers have emailed me about sticking to their goals — anything from exercise and eating right to being organized and productive to creating new habits — and have said they simply lack the discipline to stick with things for very long.


But what is discipline, really? It’s mostly an illusion, in my experience.


When people say that someone has “discipline”, as I’ve written about before, they really mean he has the motivation to stick to something.


In a previous post I used the example of someone in the military, a typical case of someone who is said to have discipline. This military man might get up super early, fix his bed neatly, go on an early-morning run, do a bunch of other exercises, and generally do a disciplined job throughout the day.


But is that just because he’s disciplined? I think it’s mostly because he’s in a situation where there’s public pressure (both positive and negative) to do all of the things listed above. If he doesn’t do them, he might get yelled at or demerited or look bad in front of his peers. If he does do them, he’s an exemplary soldier.


There’s also the fact that after a while, these things become pleasurable for him. He gets a sense of satisfaction out of staying in shape and keeping things neat. He enjoys the early morning. He feels good about being conscientious about his job.


So in the end, it’s not some vague quality (”discipline”) that allows him to stick to these habits, but rather the two secrets of motivation: positive public pressure and enjoyment.


What I Learned From My Experiences


Over the last few years, I’ve been experimenting with achieving various goals — from waking early to exercising to eliminating my debt and living frugally and simply and more. And what I’ve learned has repeatedly taught me that these two key motivation principles are all you need.


I’ve learned other things as well, but the more I stick to my goals, the more I realize that it’s these two themes that keep repeatedly surfacing. It’s almost eerie, actually. Just a few goals as illustration:



  • Marathon. Right now I’m training for my third marathon, in Honolulu this December. As I’ve stuck with the toughest marathon plan I’ve ever undertaken (last week my longer runs were 12 and 20 miles, and this week I’m doing 2 runs of 14 miles), I’ve marveled at my ability to keep at it. But it’s not hard to figure out why: I’ve publicly committed to doing this marathon — on this blog, on Twitter, and on Train For Humanity, where I’m raising money for humanitarian causes through my training (sponsor me here!). In addition to that, I’m really enjoying all the running!

  • Blogging. I’ve now been blogging for almost two years (I started in January 2007), making Zen Habits one of the longest-running projects I’ve ever stuck with. I’ve worked on many projects before, but they are usually completed within a year, if not within a few weeks or months. Anything longer is usually intimidating to me. But it hasn’t taken discipline to stick with blogging, not at all. It’s something I really enjoy, and there’s the added bonus of positive public pressure (that’s you, the readers) that has motivated me to stick with it.

  • Writing a book. A couple months ago, I finished the manuscript for my book, The Power of Less, that’s coming out at the end of this year. I will admit that I had some trouble writing this book, with the demands of publishing a blog (two blogs actually), training for my second marathon in March, and preparing for my wedding in June. I wasn’t always following my own advice (although in my defense I learned to segregate the different goals so I only concentrated on one at a time). But I did get the book done with both forms of motivation — pressure from my publisher to turn in the manuscript, and the enjoyment I got from writing the book once I was able to clear away distractions and focus on the writing.


I could go into many more examples of how I used these two forms of motivation, but you get the idea. Now let’s take a look at each one and how you can use them to your advantage.


Positive Public Pressure


While pressure is often seen as a bad thing (“I’m under too much pressure!”), used properly it can actually be a good thing. It’s important that pressure not be applied too negatively or at too high an intensity. Keep things positive and at a manageable intensity, and things will move along nicely.


Some examples of how to use positive public pressure to motivate yourself:



  • Tell all your co-workers you’re going to achieve a goal (“No sugary snacks this week!” or “I’m going to keep my email inbox completely empty”) and report to them regularly on your progress.

  • Email your family and friends and tell them about your goal and ask them to keep you accountable. Email them regular updates, and tell them about your progress when you see them.

  • Post your goal on your blog and post regular updates. It’s important that you not just post the goal but also stay accountable with the updates. Encourage people to ask you about your goal if you don’t report your progress.

  • Join an online forum related to your goal — I’ve done this when I quit smoking and also when I started running. Introduce yourself, make friends, tell them about your goal, ask for help when you need it, and report your successes and failures.

  • Write a regular column in a publication on your goal. I did this when I ran my first marathon, for my local newspaper. It created a lot of positive public pressure — everywhere I went, people would say, “Hey, you’re that marathon guy! How’s the running going?” Of course, not everyone can write a column for a newspaper, but you could do it for a group blog or a newsletter or some other type of publication.

  • Post your goal and a chart of your progress up in your office or other public place.

  • Post pictures of yourself each day. One guy did this and created a video of his progress — it was amazing to watch.


You get the idea. I’m sure you can come up with some ideas of your own.


Enjoy Your Goal Activity


You can motivate yourself to do something you don’t like to do, using positive public pressure as motivation. But if you really don’t enjoy it, you’ll only be able to keep it up for so long. And even if you could do it for months and years … is that something you’d want to do? If you don’t enjoy it, why do it for very long?


But, you might say, what if it’s something I really want to achieve but I don’t enjoy it? There are ways to find enjoyment in most things — the key is to focus on the enjoyable parts. Focus on the positive.


Here are some ways to use this motivational principle to your advantage:



  • Having trouble motivating yourself to write for your blog? Look for topics that excite you. If you find things that you’re passionate about, writing becomes easy.

  • Having trouble with a dissertation for graduate school? Maybe you’re not as passionate about the topic as you thought you were. Re-examine your dissertation topic and see if you can either re-energize yourself about it or find a new topic you can get excited about.

  • Having a hard time exercising? Find exercise that’s fun for you. If you don’t like running, try soccer or basketball or rowing. If you don’t like to lift weights, try doing some primal workouts where you flip logs and jump through tires. Go hiking. Walk with friends and talk the whole time.

  • Is eating healthy a challenge for you? Find healthy foods you love. Experiment with new recipes and have fun testing them out.

  • Is training for a marathon tough? Learn to enjoy the quiet of the early morning, the contemplative nature of running, or the beautiful nature that surrounds you. Or play some songs that pump you up. Or listen to interesting audiobooks as you run.


Find the enjoyable parts of any activity, and focus on those. In time, you can really learn to love something. Or, switch to something you love more and stick to that.


These two principles, especially when used together, can be powerful motivators. In fact, in most cases, they’re all the motivation I ever need.


If you really want more motivational tips, I’ve got a lot more here.


Have you used these motivational principles to achieve a goal? Let us know in the comments!










What does alcohol do to your brain?


In the United States and most European nations, the majority of people have used alcohol by young adulthood (Substance Abuse and Mental Health Services Administration, 2007). This blog entry will review what is known about how alcohol use may affect brain functioning long after the intoxicating effects have worn off.

About 50% of people who meet diagnostic criteria for alcoholism show some problems in thinking or memory (Oscar-Berman & Marinkovic, 2003). The ability to plan ahead, withhold responses, learn and hold information, and work with spatial information (such as following a map) are particularly affected (Fein, Torres, Price, & Di Sclafani, 2006; Sullivan, Deshmukh, De Rosa, Rosenbloom, & Pfefferbaum, 2005; Sullivan, Fama, Rosenbloom, & Pfefferbaum, 2002). Even 15-16 year-olds with heavy drinking histories have shown problems in the ability to recall information that was previously learned (Brown, Tapert, Granholm, & Delis, 2000). However, there is always the chicken and egg problem - what came first? It's possible that alcohol use doesn't actually cause these effects, but instead these problems may have been there before, and may in fact be a risk factor for developing alcohol abuse or dependence. What is important to keep in mind is that we have seen poorer performance over time among young people who continued a pattern of heavy alcohol use (Tapert, Granholm, Leedy, & Brown, 2002). Those who reported drinking so much at times that they experienced negative after-effects, or hangovers, were the most likely to go downhill over time, as compared to those who halted substance use (Tapert et al., 2002).


These findings are not just the results of individuals not trying hard enough on these tasks. The size and shape of brain structures are also abnormal in chronic heavy drinkers. The overall amount of gray matter (brain cells) and white matter (cabling between the cells) are reduced (Pfefferbaum et al., 1995), particularly in the frontal lobes, which are key parts of the brain for planning, withholding responses, making decisions, and regulating emotions. White matter is key for relaying information within the brain, and the coherence or quality of white matter tracts appears poorer in chronic heavy drinking adults (Pfefferbaum, Adalsteinsson, & Sullivan, 2006). In adolescent heavy drinkers, we have seen, on average, smaller sizes of the hippocampus (a key region for learning new information) and portions of the frontal lobes (Medina et al., 2008; Medina, Schweinsburg, Cohen-Zion, Nagel, & Tapert, 2007; Nagel, Schweinsburg, Phan, & Tapert, 2005). Further, our preliminary studies have suggested that white matter quality is poorer in adolescents consuming as little as 20 drinks per month than in non-drinkers.


Taken together, there are clear differences between chronic heavy drinkers and non-drinkers in how the brain works. It appears that the brain of a chronic drinker has to "work harder" to keep things in mind, such as remembering a phone number, an address where you need to go, or a shopping list. Although we see similar changes in the brains of adolescents with only 1-2 years of heavy drinking, it appears that the young brain can compensate for any subtle alcohol-related disturbances by working other brain regions a little harder (Tapert, Pulido, Paulus, Schuckit, & Burke, 2004). However, if heavy drinking continues, by young adulthood the brain may not be able to compensate as effectively, and performance may begin to decline (Tapert et al., 2001). In addition, the brains of adolescent heavy drinkers, but not those of individuals who rarely drink, spend much processing effort when looking at alcohol advertisements, relative to looking at non-alcohol beverage images (Tapert et al., 2003). Therefore, brains may become "sensitized" to processing alcohol-related information once you get involved in drinking.


The bottom line is that research shows clearly that chronic use of heavy levels of alcohol is associated with adverse effects on the brain. The bad news is, if you want to reach your maximum potential with the brain you have, you should limit alcohol use to moderate levels (that is ≤1 drink for females and ≤2 drinks for males per occasion). The good news is that for people in recovery from alcohol problems, many difficulties with concentration and memory will improve substantially in the first month of recovery, and even throughout continued recovery as long as you stay away from alcohol.


References
Brown, S. A., Tapert, S. F., Granholm, E., & Delis, D. C. (2000). Neurocognitive functioning of adolescents: Effects of protracted alcohol use. Alcoholism: Clinical and Experimental Research, 24, 164-171.
Fein, G., Torres, J., Price, L. J., & Di Sclafani, V. (2006). Cognitive performance in long-term abstinent alcoholic individuals. Alcohol Clin Exp Res, 30(9), 1538-1544.
Medina, K., McQueeny, T., Nagel, B., Hanson, K., Schweinsburg, A., & Tapert, S. F. (2008). Prefrontal cortex volumes in adolescents with alcohol use disorders: Unique gender effects. Alcoholism: Clinical and Experimental Research, 32, 386-394.
Medina, K. L., Schweinsburg, A. D., Cohen-Zion, M., Nagel, B. J., & Tapert, S. F. (2007). Effects of alcohol and combined marijuana and alcohol use during adolescence on hippocampal volume and asymmetry. Neurotoxicology & Teratology, 29, 141-152.
Nagel, B. J., Schweinsburg, A. D., Phan, V., & Tapert, S. F. (2005). Reduced hippocampal volume among adolescents with alcohol use disorders without psychiatric comorbidity. Psychiatry Research, 139(3), 181-190.
Oscar-Berman, M., & Marinkovic, K. (2003). Alcoholism and the brain: an overview. Alcohol Res Health, 27(2), 125-133.
Pfefferbaum, A., Adalsteinsson, E., & Sullivan, E. (2006). Supratentorial profile of white matter microstructural integrity in recovering alcoholic men and women. Biological Psychiatry, 59, 364-372.
Pfefferbaum, A., Sullivan, E., Mathalon, D., Shear, P., Rosenbloom, M., & Lim, K. (1995). Longitudinal changes in magnetic resonance imaging brain volumes in abstinent and relapsed alcoholics. Alcoholism: Clinical and Experimental Research, 19, 1177-1191.
Substance Abuse and Mental Health Services Administration. (2007). Results from the 2006 National Survey on Drug Use and Health: National Findings. Rockville, MD: Office of Applied Studies.
Sullivan, E. V., Deshmukh, A., De Rosa, E., Rosenbloom, M. J., & Pfefferbaum, A. (2005). Striatal and forebrain nuclei volumes: Contribution to motor function and working memory deficits in alcoholism. Biological Psychiatry, 57, 768-776.
Sullivan, E. V., Fama, R., Rosenbloom, M. J., & Pfefferbaum, A. (2002). A profile of neuropsychological deficits in alcoholic women. Neuropsychology, 16(1), 74-83.
Tapert, S. F., Brown, G. G., Kindermann, S., Cheung, E. H., Frank, L. R., & Brown, S. A. (2001). fMRI measurement of brain dysfunction in alcohol-dependent young women. Alcoholism: Clinical and Experimental Research, 25, 236-245.
Tapert, S. F., Cheung, E. H., Brown, G. G., Frank, L. R., Paulus, M. P., Schweinsburg, A. D., Meloy, M. J., & Brown, S. A. (2003). Neural response to alcohol stimuli in adolescents with alcohol use disorder. Arch Gen Psychiatry, 60, 727-735.
Tapert, S. F., Granholm, E., Leedy, N. G., & Brown, S. A. (2002). Substance use and withdrawal: Neuropsychological functioning over 8 years in youth. J Int Neuropsychol Soc, 8(7), 873-883.
Tapert, S. F., Pulido, C., Paulus, M. P., Schuckit, M. A., & Burke, C. (2004). Level of response to alcohol and brain response during visual working memory. J Stud Alcohol, 65(6), 692-700.



You are The Way You Value and Devalue


Self-value is a far more useful construction than self-esteem. The latter is, at best, a function of what you think about yourself -- mostly in comparison to others -- or, worse, a depiction of your ego. Value is more behavioral than conceptual, more about how you act toward what you value, including yourself, than how you regard it. To value something goes beyond regarding it as important; you also appreciate its qualities, while investing the time, energy, effort, and sacrifice necessary for its maintenance. If you value a da Vinci painting, you focus on its beauty and design more than the cracks in the paint, and, above all, you treat it well, making sure that it is maintained in ideal conditions of temperature and humidity. Similarly, people with self-value appreciate their better qualities (while trying to improve their lesser ones) and take care of their physical and psychological health, growth, and development.


Now here's the tricky part. People with high self-value necessarily value others.


Although hard to see in yourself, you can probably notice the following tendency in other people. When they value someone else, they value themselves more, i.e., they elevate their sense of well-being, appreciate their better qualities, and facilitate their health, growth, and development. When they devalue someone else, they devalue themselves - their sense of well-being deteriorates, they violate their basic humanity to some degree and become narrower and more rigid in perspective, all of which impair growth and development. In other words, when you value someone else you experience a state of value - vitality, meaning, and purpose (literally, your will to live increases) - and when you devalue someone else you experience a devalued state, wherein the will to live becomes less important than the will to dominate or at least be seen as right.


It's often hard to notice that you are in a devalued state, because devaluing others requires a certain amount of adrenalin, which creates a temporary feeling of power and certainty - you feel right (although you are more likely self-righteous), but it lasts only as long as the arousal lasts. To stay 'right,' you have to stay aroused, negative, and narrow in perspective: 'Every time I think of him I get pissed!' In contrast, when self-value is high, you can easily disagree with someone without feeling devalued and without devaluing.


The impulse to devalue others always signals a diminished sense of self, as you must be in a devalued state to devalue. That's why it's so hard to put someone down when you feel really good (your value investment is high) and equally hard to build yourself up when you feel resentful.


If you doubt the latter, think of things you say to yourself and others when resentful: 'I shouldn't have to put up with this; I deserve better, just look at all the good things I do....' When you value others, i.e., when your self-value is high, you do not think of what you have to put up with and you certainly don't feel the need to list the good things you do. Rather, when confronted with life or relationship challenges, you shift automatically into improve mode - you try to make bad situations better.


The great swindle of devaluing others is that it never puts you in touch with the most important things about you and, therefore, never raises personal value. On the contrary, its whole purpose is to make someone else's value seem lower than your own. If it works, you're both down; if it doesn't, you end up lower than where you started. In either case, your personal value remains low and dependent on downward comparison to those you devalue, creating a chronic state of powerlessness in regard to self-value. The motivation to gain temporary empowerment by devaluing others occurs more and more frequently, until it takes over your life. This could be what Oscar Wilde meant by, 'Criticism is the only reliable form of autobiography.'


Valuing others makes self-value soar. It also carries substantial social reward; showing value tends to invoke reciprocity and cooperation, while devaluing invites reciprocal devaluing and resistance. Worst of all, devaluing others makes us look for something to be cranky about, so the low-grade adrenalin can inflate our egos enough to get us through the day.



Reading Fluency vs. Reading for Speed


In California, our unit assessments, which align with the Open Court Reading series, aren’t from the publisher of Open Court but from the Sacramento County Office of Education.


The tests’ emphasis on timing students’ reading leads teachers to teach reading in a DIBELS-like fashion, with reading passages typed with word counts at the end of every line. While a certain amount of practice with these passages might reduce stress levels on the day of the tests, a steady diet of “fluency passages” will surely turn students off to reading for life.


In college did you ever try to read through one of your texts as fast as you could? How much of it would you remember if you did?


Aside from turning students off to reading, research has shown that techniques like Sustained Silent Reading (SSR) and Drop Everything and Read (DEAR) are less effective at increasing students’ fluency levels because students read independently and receive no feedback on their reading. More effective, then, would be partner reading or group reading, where students read with others. Fluency passages lend themselves to testing. Even if students have partners, their goal is usually to count each other’s mistakes and figure out how fast they’re reading (that’s the point of a fluency passage). If you have taught your students to offer corrective feedback when reading fluency passages, then why not teach them to offer corrective feedback when reading authentic literature or anything else that’s more interesting than a sheet of text with numbers?


While there are plenty of free fluency passages available (and again I say a little bit might be healthy if you continue to assess using the same format), I would strongly recommend that teachers use reader’s theater packets instead of fluency packets for daily fluency practice.


There are tons of free Reader’s Theater scripts available, and I’ve written before about how to use them. Reader’s Theater is generally more interesting, it demands that students read together, by nature it emphasizes prosody, and it encourages rereading for a genuine purpose.





20 Things You Didn't Know About... Genius

7) Many 19th- and 20th-century creative geniuses acquired a reputation for promiscuity. Examples include Richard Feynman, Albert Einstein, and Bertrand Russell. 8) One theory suggests that male geniuses are unusually endowed with enthusiasm for risk taking, which is notoriously testosterone-linked.

Baby Talk & Brain Waves

Researchers studying the brains of toddlers say the strength of their brain waves can indicate language ability. The research might lead to early identification of language impairment.




Evolutionary Psychology [Gene Expression]


Allen MacNeill of the Evolution List has a new weblog, Evolutionary Psychology. Check it out.


The Psychology of Happiness: 13 Steps to a Better Life


We think we know what will make us happy, but we don’t. Many of us believe that money will make us happy, but it won’t. Except for the very poor, money cannot buy happiness. Instead of dreaming of vast wealth, we should dream of close friends and healthy bodies and meaningful work.


The psychology of happiness

Several years ago, James Montier, a “global equity strategist”, took a break from investing in order to publish a brief overview of existing research into the psychology of happiness [PDF]. Montier learned that happiness comprises three components:



  • About 50% of individual happiness comes from a genetic set point. That is, we’re each predisposed to a certain level of happiness. Some of us are just naturally more inclined to be cheery than others.

  • About 10% of our happiness is due to our circumstances. Our age, race, gender, personal history, and, yes, wealth, only make up about one-tenth of our happiness.

  • The remaining 40% of an individual’s happiness seems to be derived from intentional activity, from “discrete actions or practices that people can choose to do”.


If we have no control over our genetic “happy point”, and if we have little control over our circumstances, then it makes sense to focus on those things that we can do to make ourselves happy. According to Montier’s paper, these activities include sex, exercise, sleep, and close relationships.


What does not bring happiness? Money, and the pursuit of happiness for its own sake. “A vast array of individuals seriously over-rate the importance of money in making themselves, and others, happy,” Montier writes. “Study after study from psychology shows that money doesn’t equal happiness.”




The happiness paradox

Writing in The Washington Post last June, Shankar Vedantam described recent research into this subject. If the United States is generally wealthier than it was thirty or forty years ago, then why aren’t people happier? Economist Richard Easterlin of the University of Southern California believes that part of the problem is the hedonic treadmill: once we reach a certain level of wealth, we want more. We’re never satisfied. From Vedantam’s article:



Easterlin attributes the phenomenon of happiness levels not keeping pace with economic gains to the fact that people’s desires and expectations change along with their material fortunes. Where an American in 1970 may have once dreamed about owning a house, he or she might now dream of owning two. Where people once dreamed of buying a new car, they now dream of buying a luxury model.


“People are wedded to the idea that more money will bring them more happiness,” Easterlin said. “When they think of the effects of more money, they are failing to factor in the fact that when they get more money they are going to want even more money. When they get more money, they are going to want a bigger house. They never have enough money, but what they do is sacrifice their family life and health to get more money.”


The irony is that health and the quality of personal relationships are among the most potent predictors of whether people report they are happy — and they are often the two things people sacrifice in their pursuit of greater wealth.


Why aren’t rich people happier? Perhaps it’s because many of them are workaholics, because they’re more focused on money than on the things that would bring them joy. A brief companion piece to The Washington Post story notes that researchers have found that “being wealthy is often a powerful predictor that people spend less time doing pleasurable things, and more time doing compulsory things and feeling stressed.”


In general, rich people aren’t much happier than those of us in the middle class. Yes, money can buy happiness if it elevates you from poverty, but beyond that the benefits are minimal. So why do so many people believe that money will make things better?


Stumbling on happiness

In 2006, Harvard psychology professor Daniel Gilbert published Stumbling on Happiness, a book about our inability to predict what will really make us happy. The following is a 22-minute video of a presentation Gilbert made at TED 2004, in which he compresses his ideas into bite-sized chunks:




Gilbert says that because humans can plan for the future, we naturally want to structure our lives in such a way that we are happy, both now and later. But how do we know what will make us happy? We don’t. In fact, we’re surprisingly bad at predicting what will bring us joy. Gilbert asks:


Which future would you prefer? One in which you win the lottery? Or one in which you become paraplegic? Which would make you happier? [...] A year after losing their legs, and a year after winning the lotto, lottery winners and paraplegics are equally happy with their lives.


The problem is impact bias, the tendency to overestimate the “hedonic impact” of future events. Put another way, the things that we think will make us happy usually don’t make us as happy as we think they will. Winning the lottery isn’t a panacea. Having an affair with your hot new co-worker won’t be as thrilling as you imagine. And losing a leg isn’t the end of the world.


It turns out that humans are able to synthesize happiness. Many people look outside themselves for fulfillment; they expect to find it in things, or in relationships, or in large bank accounts. But true happiness comes from within. True happiness comes when we learn to be content with what we have.




13 steps to a better life

What does all this mean to you? If money won’t bring you happiness, what will? How can you stop making yourself miserable and start learning to love life? According to my research, these are the thirteen actions most likely to encourage happiness:



  1. Don’t compare yourself to others. Financially, physically, and socially, comparing yourself to others is a trap. You will always have friends who have more money than you do, who can run faster than you can, who are more successful in their careers. Focus on your own life, on your own goals.

  2. Foster close relationships. People with five or more close friends are more apt to describe themselves as happy than those with fewer.

  3. Have sex. Sex, especially with someone you love, is consistently ranked as a top source of happiness. A long-term loving partnership goes hand-in-hand with this.

  4. Get regular exercise. There’s a strong tie between physical health and happiness. Anyone who has experienced a prolonged injury or illness knows just how emotionally devastating it can be. Eat right, exercise, and take care of your body. (And read Get Fit Slowly!)

  5. Obtain adequate sleep. Good sleep is an essential component of good health. When you’re not well-rested, your body and your mind do not operate at peak capacity. Your mood suffers. (Read more in my brief guide to better sleep.)

  6. Set and pursue goals. I believe that the road to wealth is paved with goals. More than that, the road to happiness is paved with goals. Continued self-improvement makes life more fulfilling.

  7. Find meaningful work. There are some who argue a job is just a job. I believe that fulfilling work is more than that — it’s a vocation. It can take decades to find the work you were meant to do. But when you find it, it can bring added meaning to your life.

  8. Join a group. Those who are members of a group, like a church congregation, experience greater happiness. But the group doesn’t have to be religious. Join a book group. Meet others for a Saturday morning bike ride. Sit in at the knitting circle down at the yarn shop.

  9. Don’t dwell on the past. I know a guy who beats himself up over mistakes he’s made before. Rather than concentrate on the present (or, better yet, on the future), he lets the past eat away at his happiness. Focus on the now.

  10. Embrace routine. Research shows that although we believe we want variety and choice, we’re actually happier with limited options. It’s not that we want no choice at all, just that we don’t want to be overwhelmed. Routines help limit choices. They’re comfortable and familiar and, used judiciously, they can make us happy.

  11. Practice moderation. Too much of a good thing is a bad thing. It’s okay to indulge yourself on occasion — just don’t let it get out of control. Addictions and compulsions can ruin lives.

  12. Be grateful. It’s no accident that so many self-help books encourage readers to practice gratitude. When we regularly take time to be thankful for the things we have, we appreciate them more. We’re less likely to take them for granted, and less likely to become jealous of others.

  13. Help others. Over and over again, studies have shown that altruism is one of the best ways to boost your happiness. Sure, volunteering at the local homeless shelter helps, but so too does just being nice in daily life.


Remember: True wealth is not about money. True wealth is about relationships, about good health, and about continued self-improvement.





Last month, we discussed whether it was more important to be rich or to be happy. “Money and happiness are not mutually exclusive,” many of you noted, and it’s true. You can have both. Or neither. It’s important to realize, however, that money is a less reliable source of happiness than what’s inside you.



25 Ed Tech Leaders to Follow

While I was at the MLTI Summer Institute, a great experience which I'll be writing about tomorrow, the Building Learning Communities conference was happening in Boston. I would have liked to go to BLC, but when comparing the cost of attendance versus having all of my expenses paid by presenting at MLTI, it was an easy decision. Hopefully, next year the two events will not conflict with each other.

The great thing about Twitter and social networks is that even though many of us were not able to go to BLC, we can still experience some of the learning that took place there. One such example is the posting on Slideshare of the presentation about ed tech leaders made by Lisa Thumann and Liz Davis. If you're looking for some new people to follow on Twitter or in your RSS reader, I highly recommend viewing this presentation.


The presentation contained references to many of the people that I have been following for a while, but it also had some new names that I hadn't seen before. I'll be adding those new names to my collection of RSS feeds and Twitter network.




Thursday, June 04, 2009

Playtime

The serious need for play

I get all excited any time I see an article regarding the beneficial effects of play. It is critical for young children, but I also think that it gets a little short-changed when it comes to adults, and certainly in the realm of education. Play, in education, is usually reserved for specific activities like recess, or gym, or possibly music or art. Play should be as central a part of the education process as it can possibly be. There are very few ways of learning that can surpass the efficiency and effectiveness of play.

Thursday, March 12, 2009

Do cultural brain differences exist

East meets west: How the brain unites us all - science-in-society - 10 March 2009 - New Scientist

Well, the short answer to the title of this post is: Yes. In the sense that all learning changes the structure of the brain at the cellular level (neurons linking to other neurons) - in that regard ALL brains are different and certainly brains from individuals within a culture will have similar learnings that are culturally defined and therefore "different" from other cultures.

However, the article tries to tackle the idea that the differences might be much more profound than just minor differences in general learning. The evidence they bring up, both pro and con, is quite interesting but inconclusive. Some anthropologists (whom I have linked to here previously) would argue that there are no significant cultural differences that could be ascribed to evolutionary divergence. They point to things like gender roles and perception across cultures as the basis for this thinking.

I don't buy any argument that goes so far as to say that thinking across vastly different cultures has become evolutionarily divergent; developmentally divergent would be a more accurate description.

Wednesday, February 25, 2009

Remote control brains?

Yep....remote controlled brains.

Obviously this is still research in the most rudimentary and basic stages, but the potential is kind of astounding. As with any basic research there are still huge hurdles to surmount before any of this becomes practical in any way, but hey....proof of concept is enough to get you excited, no?

Saturday, February 21, 2009

When Politics gets in the way of Science

Genetic Future : Should scientists study race and IQ?

I haven't read the referenced articles yet - and to be honest I'm quite sure I won't. I'm not sure I could stomach the "no we shouldn't" article.

I agree whole-heartedly with Mr. MacArthur's thoughts concerning the necessity of studying ALL aspects of a topic that can be effectively studied through scientific methodologies. I also agree that letting politics get in the way or simply eliminating scientific inquiry because the issue is highly socially charged is a huge mistake.

I feel compelled to mention that there are certain topics where preliminary scientific inquiry has emboldened one side of a debate to claim that there is scientific consensus that their side of the issue is the correct, or even the only rational or ethical, side to take (can anyone say global warming debate?). This is also a mistake, as it will stifle true scientific inquiry. Confirmation bias is one of the hardest obstacles to overcome in science; the more political or social stake is attached to an issue, the more value is given to finding "evidence" to support the "consensus" - and that sounds an awful lot like strengthening the effect of confirmation bias.

True, unbiased scientific inquiry is hard enough to do without having the hopes and expectations of hordes of political/social activists (and the influence/money they control) waiting in the wings to reward scientists who confirm their agenda or excoriate those who disagree with them. Unfortunately, that is the climate we work in and what we have to deal with. This situation exists under EVERY political regime - whether Republicans or Democrats are in charge, the only thing that changes is the agenda. The influence, and the consequences thereof, are still just as present.

Subliminal messaging

Subliminal messages really do affect your decisions - life - 14 February 2009 - New Scientist

Well, the overall result would seem not to confirm the long-standing belief that subliminal messages can convince you to do nearly anything, but at least with visual images our brains do seem to register things in the background while we aren't really paying attention.

Not quite the earth-shattering finding the title suggests, but an interesting one nevertheless.

Thursday, February 19, 2009

What is the source of happiness?

The New Science of Happiness - TIME

This is an article from a while back (2005) - but it is still worth looking at. I was reminded of this article through a hyperlink embedded in this article.

Monday, February 16, 2009

Listening to cells think/communicate

BBC NEWS | TODAY | Tom Feilden's blog | Do cells think...and is this what it sounds like?

This is a somewhat freaky sound recording, but be sure to pay attention to the caveat that the sound was "processed to sound like a human voice" - so essentially the signal characteristics of communicating neurons were heavily processed until they could simulate sound.

Still....that is very cool.

Friday, January 16, 2009

How to hallucinate

Check out this link: Boston.com - Ideas - Globe

Awesome. Although, we're not talking about the type of acid-trip hallucination you're probably thinking of. It would produce more of a "distortion" type of illusion that wouldn't be terribly dramatic.

Still - tripping out sans drugs.....could be a neat party trick.

Sunday, January 11, 2009

Picking college applicants based on politics

Dr. Helen: Should schools pick applicants based on their adherence to "social justice"?

This is a truly disturbing post, and an even more disturbing trend. Some of the comments are really worthwhile as well. "Diversity" usually boils down to where and to whom someone was born - whereas who a person actually becomes is a product of their personal decisions. So diversity, if people really meant what they wanted it to mean, would include not just skin color or nationality, but also religious, philosophical, and political ideology. If that were actually the case, just imagine how fun a class discussion would be with a truly diverse student body. That would be a dream come true (assuming the discussion stays civil).

There are some who would argue that personal ideology is accurately predicted by the ideology of a person's parents and/or environment. But that really is a straw-man argument. I can 100% guarantee that if you are born in Nigeria, you will be considered Nigerian. But if you are born to dyed-in-the-wool religious southern conservative parents, there is still a very real possibility that you could end up an atheistic, socialist left-winger. There is just no way to predict ideology with 100% accuracy.

That's the beauty of the kind of diversity mentioned. True diversity would span every aspect of humanity - not just nationality or skin color or even socioeconomic status.

Saturday, January 03, 2009