This has been a long-time interest of mine. Some anthropologists and sociologists say that culture has no effect on our brains. I call BS. Shared cultural experience shapes the kinds of cognitive bias we may have. If you're asking whether there is a difference in the structure or general function of brains, then yes - culture makes little difference. But on a cognitive level it surely makes a difference.
The big kerfuffle over whether or not we have free will centers on recent findings from various types of brain scans showing our brains making a decision (or indicating an action) before we are cognitively aware of that decision.
This article does a great job of explaining all of that:
The introductory paragraph of this article is what has my brain spinning right now. The author is entirely correct: every objective measure of an individual's intelligence is based entirely on that - the individual.
The ensuing research example only seems to highlight the idea. Does social interaction boost intelligence or creativity? I think he's on to something. Introspectively I completely agree with him, but I'd LOVE to see some experimental or observational data on the subject.
I wouldn't necessarily go so far as the article's title suggests, but now we have a better understanding of the route the drugs take to affect the neurons. If we really understood how they worked, we would be able to create drugs that are effective more than a meager 30% of the time.
This is not an article about the now thoroughly debunked "visual-auditory-kinesthetic" learning theories. It's more about how different techniques have greater or lesser effects on specific types of memory.
The marshmallow test is one of the more famous experiments in psychology, not for the experiment itself, which is rather uninteresting, but for the follow-up studies that tracked behavior over the long term.
One of the most fascinating subjects you'll ever study.
I would be doing a disservice by trying to describe this article - but for the sake of a headline: our visual receptors shouldn't be able to record the color magenta, and there might be some people out there with the genetic ability to see MORE colors than the average person.
There are variations of this same principle, such as Confirmation Bias and Hindsight Bias, but they all have the same root in our cognitive predisposition to look for patterns that fit our expectations.
Also, if you're not subscribed to "You Are Not So Smart" - you should be.
Mr. Carey does a good job of summarizing many of the leading theories and highlighting the history of where we have been, where we are, and where we are going. The only critique I have is that it is just that: summaries. As with most journalistic writing, he is obliged to stick with short sound-bites of information and therefore leaves a great deal of detail out of the narrative.
Nothing he presents should be regarded as "truth" or even as a dominant position. Still, the information causes you to think...and that's a good thing.
I have to say that this caught me a little off-guard. I teach some of these "wrong" techniques to my students! However, some of those assumptions were made based on prior studies, such as the one about the environment in which you study: prior research has shown that being tested in an environment similar to the one where the learning occurred helps boost recall. It is interesting that the article points out a subtle variation of that principle that, when considered, makes sense.
Excellent article that should be read by students, teachers, parents and administrators.
This is something I highlight in my class. It strikes right at the heart of the mind/body dilemma: how big of a role does our mind play? The placebo effect says lots, but current trends in medical care as well as psychological care (psychopharmacology, anyone?) are heading in the opposite direction.
And yet, in our technologically oriented society it is the ne plus ultra to declare yourself an effective multitasker. Heck, it's practically a requirement! Just look at nearly any job posting out there - they all say something like "must be good at multitasking." Maybe the new standard should be "must be good at focused attention" - you'd certainly get better cognitive results.
However, there is one thing I want to know: how did they know the mice were daydreaming? Or reviewing? I'm not doubting the results; I am genuinely interested to know the answer to questions like that.
As with most things brain-related, at first blush the idea of training your brain sounds perfectly reasonable, but there is quite a bit of research out there that says it is a non-starter at best. I'm still on the fence.
It's not specifically related to play, but the idea that "gaming" is not just a throwaway pastime is one I've been promoting for a long time. It's good to see research that shows how we can learn from gaming (an incredibly complex and difficult activity).
It seems there is a resurgence of interest in using "recreational" drugs as part of psychiatric medication. LSD, for example, was used in the 1950s and 60s as an aid to Freudian psychotherapy both here and in Europe.
Apparently, current experiments are finding that there are indeed practical uses - and in fact potential medical benefits - for psychoactive drugs in treating certain types of disorders.
Good news indeed! The caveat, though, is that the doodling was as "non-conscious" as possible - so distracted doodling is better than deliberately doodling specific designs drawn from one's imagination.
What we actually engage in could be better described as "switch-tasking". Cognitively speaking, our brains simply aren't wired to handle what most people define as "multitasking" - and only a very small minority of people are capable of efficient switch-tasking.
I wish everyone gave the kind of careful consideration he gives to the needs and outcomes of education. What he is describing is a model of cognitive education that has been theorized about, and put into limited practice in a number of places, since at least the 1960s (Jerome Bruner was one of the early advocates of this type of learning). There are even a few universities that use this model.
I'd love to see it implemented on a larger scale, but that is a fight against a culture that is resistant to change and a system that is entrenched and even more resistant to change.
It is absolutely no different from the steroid debate. We're not 100% sure of the side-effects of most of the psychoactive drugs out there, and they differ dramatically between individuals as well. How big a price are you willing to pay later for that extra edge now?
As with most things, it depends on the type of behavior you engage in. Fanatical religious behavior probably leads to unhealthy outcomes in most cases, whereas a moderate approach to religion probably leads to better overall health.
That moderation is, in fact, turning out to be tied to most of the health benefits: almost any activity, when taken to the extreme, leads to unhealthy behaviors or results. Moderation in all things seems to be a pretty healthy way to go.
While not technically about psychology, this does speak to psychological behaviors such as prejudice and racism. I always try to tell my students that between any two cultures there are FAR more similarities than differences. We do, however, focus on the obvious outward differences, which are generally superficial.
Most people look at guilt as a bad thing. Recent research points out that the development of guilt happens alongside other mechanisms of self-control and may, in fact, aid in the development of overall self-control.