Interview with Ken Robinson

7/9/11

Nothing you do is great.

For anyone who is progressing in knowledge or skill in any discipline, this is what I think. Take it or eat it.

Don't be happy with where you are in your pursuit of anything, whether it be the pursuit of knowledge, skill, or whatever you can think of. What I really mean is, don't be happy to the point of passivity. You've got to be content enough to keep actively progressing, but never so contented that it turns into conceit.

The more I learn, the more I learn I don't know. Learning is a trade off. You give up the thought of "Well, shoot, I know pretty much everything there is to know about web development" for the thought "I know nothing about this, but I'm glad I know that so I can actually seek out the subject. I need to get goin'." If ignorance is bliss, then basic understanding is saddening, increasing knowledge is humbling, and ultimate knowledge is bliss on steroids. If you're afraid of failure, you're competing with no one. If you aren't, then you're competing with everyone in your field. That's why it's so intimidating. Once you accept your ignorance, no matter how talented you actually are, and progress humbly, you're going to get somewhere above the others and you won't even notice it until you get there.


When Ben Folds was once asked what advice he has for new artists, he said:

"Same advice that I think would have been true as long as there have been artists at all--you put your craft and your art first, you learn as much as you can, and you keep striving to be better. I'm never comfortable with people in the studio or people that I work with who think that everything that they do is great...it always bothers me. I think we all ought to be trying to get better and that's really the main thrust of it"

As harsh as it sounds, nothing you do is great. And as you become greater, you'll still believe this because you won't realize how much you've progressed. Let your contentment and self-confidence come from the joy of progress and the acquisition of new knowledge instead of the old.

4/11/11

Falling

When a wise man makes an ostentatious remark, he dives from an elevation of humility into a pool of conceit and must climb back up, step by step, to reach it again. This is not all; for each time he dives, the elevation to which he must climb increases notably.

2/21/11

Get out of bed.

Have you ever had a real epiphany? That overpowering, liberating sense of something ethereal and barely tangible? If you have, you probably know what real creativity is. That epiphany that you hate, the thought that comes from nowhere; you try and try to figure it out, but you can't understand what you did to get it, or to deserve it, however you want to put it. Whether or not you believe it, if you know what I'm talking about, something always comes from it. And I have to be vague by saying 'something' because it can be as variable as any variable gets.

It's unconventional for me to get out of bed in the first quarter of the morning to write about nothing, and the normal part of me would deem doing so illogical. I don't agree. When an enigmatic cloud appears and hovers over your brain, get out of bed and think about it.

We live every day in normal mode, making normal reactions to normal occurrences that normally happen to normal people. The things we do are normal. I hate this, but most people either realize it and don't acknowledge it, don't know it's happening, or embrace the abnormal. Stop being normal and do something to change yourself. I have no idea what I'm talking about, but I'm still writing my thoughts. Yeah, maybe I won't learn anything; maybe I'm in my own little world conjecturing for self-aggrandizement, but it's better than doing the same thing every single day and night. It's better than thinking my erratic state of mind is just a byproduct of sleep deprivation.

Don't think "Oh great, I'll be spending Friday night alone this weekend." Take advantage of such a great opportunity! Build a freaking bird house, invent a sport, play an instrument you've never touched, read a spanish childrens' book, stay up all night doing something you've never done. Better yet, practice, study, or think about whatever ever it is that interests you until you can't bare to do it anymore. Be insane once in a while. Don't be normal. Be abnormal and love it. Plenty of people live normally.

1/22/11

How to Really Learn Something

     Some may argue that in the modern world people are becoming apathetic, seeking lower and lower forms of entertainment. In fact, the notion of a degenerating society is held by much of the world. Headlines in newspapers and on TV continually denigrate gaming culture and criticize those who seek idle entertainment because it's seen as unproductive junk. Everything Bad is Good For You by Steven Johnson is a fantastic rebuttal to this claim. Without going into too much depth, he makes a case for the cognitive benefits of engaging with what is commonly regarded as "junk" entertainment. He says our culture is actually becoming more and more complex, that people are learning a great deal from video games with challenging intellectual puzzles and TV shows with intricate plot lines and character relationships. Gamers and TV watchers may not realize how much training they are actually giving their brains. I recommend this book, especially if you disagree with Johnson's claim--as I did at first.

There are two tenets of learning I want to describe:
1. The importance of awareness
2. Learning to teach

1st:
     The things one normally associates with learning are easily identifiable. The proverbial image of a child reading a textbook probably comes to mind. Perhaps, though less often, an image of someone reading an article on a website such as Psychology Today comes to mind. The image of a kid playing games on the computer probably never does. It's important that we realize how often we acquire new information without consciously knowing it. You may ask: if we acquire the information anyway, why does it matter whether we realize it? I think it matters because as we develop a working hypothesis of how we acquire information, we learn more about our memory and retention, and thus become more educated in how to be educated. (One thing that isn't stressed enough in our education.) Once we learn how we process information, retaining bits and pieces of what we learn becomes much easier.

     For example, have you ever considered that while browsing Facebook, looking at trivial status updates and frivolous wall posts, you could be learning something about social relationships you could not learn otherwise? Have you ever considered that while you play The Sims you could be developing a working model of how a society should ideally run, or how lowering industrial tax rates can revive a rundown manufacturing district? Have you considered that while debugging frustrating computer problems, you could be developing applicable, real-life problem-solving skills and increasing your brain's capacity to handle complexity? Once we realize how much we're learning, it's easier to commit learned material to memory for easy access later.


2nd:
     Really learning something, knowing the ins and outs, and being able to discuss it in depth and with conviction takes a great amount of work. It's easy to fall under the assumption that those who have mastered some subject are just naturally adept at what they do and can do it without any difficulty. Read Outliers by Malcolm Gladwell for some excellent proof against this assumption. What takes a lot of work, but what I have found profoundly helpful in my learning, is doing it with the intent to teach. No, masters of subjects have not all done this--some may actually be horrible teachers--but it gives one a straightforward course for validating one's efforts and motivating oneself. I believe it is imperative to be active with what you are learning. You must deliberately and constantly cogitate on whatever it is. It is very beneficial to pretend as though you are giving someone a lecture on your subject, anticipate questions he might have, and develop thorough answers to those questions. Imagine a retort to your argument and formulate a rebuttal or counterpoint in your mind.

     The whole learn-to-teach thing makes for a nice snowball effect. Learning to teach effectively makes you a good teacher. And teaching, in itself, is an excellent learning method. Teaching and learning are two of the most inherent tendencies we have, so I think it's best to consciously tie them together.

12/4/10

Sway - The Irresistible Pull of Irrational Behavior

I recently finished Sway - The Irresistible Pull of Irrational Behavior by Ori Brafman and Rom Brafman. It's a great book that might not have given me very many immediate benefits, but surely some long-term ones. It discusses how humans are influenced, or 'swayed', by irrational thought. Three terms are brought up repeatedly in the book that offer a concise summary:

Loss aversion - the tendency to go to great lengths to avoid possible losses
This is the influence that leads people to stay committed to failing causes. Brafman tells a story of a professor who convinced his students to pay more than one hundred dollars for a twenty-dollar bill in an auction. In his auction, everybody was free to bid; there were only two rules.

1. Bids were to be made in $1 increments
2. The winner of the auction wins the bill, but the runner-up must still honor his or her bid, while receiving nothing in return.

The bidding starts out fast until it reaches $12 to $16, when people realize where it's going. Everyone except the two highest bidders drops out of the auction. Neither of them wants to be the one who paid good money for nothing. They play not to lose instead of playing to win. The commitment to a chosen path inspires additional bids, driving the price up and making the potential loss larger. So students keep bidding past twenty. It's an incredibly powerful force, and you can see how irrational it is.
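If you like thinking in code, here's a minimal sketch of that escalation in Python. The bidder logic, the starting bids, and the $100 give-up point are my own assumptions for illustration, not anything from the book:

BILL_VALUE = 20  # the twenty-dollar bill up for auction

def keep_bidding(my_standing_bid, current_high_bid, give_up_at=100):
    # Runner-up's reasoning: quitting now means eating my whole standing bid,
    # while topping the high bid by $1 at least gives me a shot at the $20.
    next_bid = current_high_bid + 1        # bids rise in $1 increments
    loss_if_quit = my_standing_bid         # runner-up pays and gets nothing back
    loss_if_win = next_bid - BILL_VALUE    # negative (a profit) early on
    return loss_if_win < loss_if_quit and next_bid <= give_up_at

high_bid, runner_up_bid = 16, 15           # roughly where everyone else drops out
while keep_bidding(runner_up_bid, high_bid):
    runner_up_bid, high_bid = high_bid, high_bid + 1   # runner-up tops the bid

print(f"Winner pays ${high_bid}; runner-up still owes ${runner_up_bid}")

At every single step, one more dollar looks cheaper than swallowing the standing bid, so the loop only ends when someone hits an arbitrary breaking point. That is exactly the trap.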

Value attribution - the inclination to imbue a person or thing with certain qualities based on perceived value
It's the force that leads people to assume intelligence or quality where they only perceive it. It's what makes us overlook remarkable things in plain packaging. We assume too much.
I'm sure you've heard about the subway experiment that a famous violinist, Joshua Bell, took part in. If not, read about it here. It's a perfect example of value attribution.

There's also something called the Pygmalion effect and the Golem effect, which describe how we take on positive and negative traits assigned to us by someone else, respectively. It's fascinating how this happens. When people treat us according to their perceived value of us, we actually take on the traits they assign. We are able to respond to the subtle hints and clues people give off while remaining entirely unconscious of it.

Diagnosis bias - blindness to all evidence that contradicts our initial assessment of a person or situation
Like value attribution, this is presumptuous. Under this bias, people disregard evidence that would make their assessment of someone or something more accurate: a doctor overlooking certain symptoms in favor of a simple diagnosis, for example.

I didn't ever want to put this book down. It's all written in a story/analysis format, which I love.
If only more people were aware of these subtle forces...

11/25/10

Postprandial Jabberwocky

I can't verbalize how relieved I am to have a week of respite from school. It's so liberating and enlightening to do what I want for a week, disregarding the lingering threat of the poetry project I have to complete next week. I've become fairly adept at pardoning my forgetfulness when it comes to schoolwork, and I'm beginning to see it as a lifelong skill. I'm not going to get anything done remembering how much work I have to do, so I might as well forget it and put my mind on other things.

Today was Thanksgiving, and a good one. While my kindred feasted upon eagerly prepared delectables at supper, I pondered quietly at the foot of the table, intermittently looking up from my book. Oftentimes it's more interesting to watch and listen to people in conversation than to take part in it. Anyway, as I listened, I realized what a significant role environment plays in what we think about and talk about. Increasingly evident, too, was the significant difference between teenager-adult conversation and adult-adult conversation.

Normally when you want to start a conversation, you say what's on your mind at the moment (after filtering it for relevancy and other things). If I see my friend and notice he has a weird looking haircut, I'll ask him about it. But sometimes what's on your mind isn't completely evident to you until you start talking. Conversation seems to prompt the subconscious more than merely thinking can. When people think about something deeply, and are unsure or interested about it, I think they hand it over to a part of their brain they can access later for inquiry. I realized this might be true when I listened to my parents and aunt and uncle talk. They talked about friends we have who have been very successful in raising their kids, dealing with hardships, earning a lot of money, etc. I'm sure that's not what was on the tip of their tongues as they initiated conversation. They somehow got to that subject through subconscious means, from the place where they store interesting information they want to think about or talk about some more.

Of course there are triggers, like certain words or ideas, that bring thoughts from long-term memory into working memory, but that's not all that has to happen for someone to put a certain thought into a conversation. The thought has to pass filters. (Filters that judge whether information is relevant to bring up, whether the thought might be offensive toward the listener, whether the listener will really care, how intelligent the listener is, etc.) I think one very interesting filter is the one that determines whether the listener will accurately discern what you say--in some cases, the filter that determines the listener's age. I don't mean to say that age correlates with accurate discernment, but to suggest that some device used in developing conversation filters a lot of what is said based on age. This filter is very necessary most of the time (you wouldn't want to discuss marital problems with a 10-year-old), but when it is applied, information that could have been put into the conversation, information that could have benefited the listener, is never relayed.

I'll use my experience today as an example. After dinner, my parents started discussing a family that has been very successful in raising kids, dealing with hardships, earning a lot of money, etc. They described the conditions and events that led to the family's success, intending to convey the interestingness of that success. The listeners, my aunt and uncle, were interested in the story and could relate to the way my parents felt. I like to think of this as a successful transaction of fascination. (I don't know what else to call it.) If my parents were having a conversation with only me, they wouldn't have said the same things about the family. I would have gotten an abridged version of the story. Their filters would have made the story more relevant to me and aimed toward my assumed, relatively lower capacity to discern.

I want to make the point that many filters used in conversation are very accurate and useful, but if the filter used to gauge the capacity to discern were made looser, so to speak, teenagers like me could benefit more from adult-teenager conversation. After all, if I hadn't heard the adult-adult conversation, I wouldn't have thought of any of this.

The best example of how inefficient this filter can be is manifest in baby-adult conversation. Sometimes parents feel the need to use elementary language to communicate with their child. Often it's very appropriate (the baby obviously wouldn't understand you if you were yelling at it to 'stop being disrespectful and unconcerned' instead of 'stop running around'), but little kids can pick up more than you'd think. Speaking intellectually, or holding an appropriate adult conversation in front of a child, can be a rewarding educational experience for her. Using words she might not know is only going to teach her new words.

It's not a very plausible proposal, but if adults could speak with a more efficient filter--that is, speaking to kids and teenagers as they would to adults (within reason)--more people would benefit and perhaps we'd all grow up faster.

I'm sure tons of people have thought of these same ideas. There are probably psychological terms for them. If anyone knows them, I'd love to know too. And for those people who read these crazy posts, tell me what you think the answers to the questions I ask are, because I don't know and can only speculate. I'd like to hear your thoughts, or read them.

11/2/10

Should Kids Be Bribed to Do Well in School? Well, first we have some things to consider...

I recently read a TIME news article with the title "Should Kids Be Bribed to Do Well in School?"
http://www.time.com/time/nation/article/0,8599,1978589-1,00.html

Economist Roland Fryer Jr. created an experiment to test whether giving money to students could improve grades and test scores. Fryer found that in some of the schools in which he performed the experiment, the monetary reward had no effect. In many others, however, monetary rewards led to either better grades or better standardized test scores at the end of the year. (You should read the whole thing; this is just the part I want to discuss.)

This data is really interesting considering the raft of research on motivation that says this shouldn't happen. For a long time people have thought that money is the ultimate incentive for great performance on a task, but that has been shown to be true only when the task calls for elementary skill. As soon as a task requires even rudimentary cognitive skill, a higher reward yields a poorer performance. As soon as some level of creativity is involved, a high reward can actually hurt performance. RSA Animate has a great video that explains this perfectly. (I put a link at the bottom of this post.)

This excerpt was actually in the article:

"The most damning criticism of Fryer came from psychologists like the University of Rochester's Edward Deci, who has spent his career studying motivation. Deci has found that money — like other tangible rewards — does not work very well to motivate people over the long term, particularly for tasks that involve creativity. In fact, there is a lot of evidence that rewards can have the perverse effect of making people perform worse.

A classic experiment in support of this hypothesis took place at a nursery school at Stanford University in the early 1970s. There, researchers divided 51 toddlers into groups. All the kids were asked to draw a picture with markers. But one group was told in advance that they would get a special reward — a certificate with a gold star and a red ribbon — in exchange for their work. The kids did the drawings, and the ones in the treatment group got their certificates.

A few weeks later, the researchers observed the children through a one-way mirror on a normal school day. They found that the kids who had received the award spent half as much time drawing for fun as those who had not been rewarded. The reward, it seemed, diminished the act of drawing. So instead of giving kids gold stars, Deci says, we should teach them to derive intrinsic pleasure from the task itself. "What we really want is for people to value the activity of learning," he says. People of all ages perform better and work harder if they are actually enjoying the work — not just the reward that comes later."


Deci says that money "does not work very well to motivate people over the long term, particularly for tasks that involve creativity." It seems Deci was assuming that the tasks Fryer was incentivizing involved creativity, or else he wouldn't have criticized Fryer. Deci is certainly right about motivation; he's one of many who have researched the topic for a long time. Fryer's research didn't contradict Deci's assertion; it built upon it in the other direction and reaffirmed what Dan Pink said: if the task involves straightforward, elementary skill, then monetary incentives DO lead to a better performance.

What I want to point out is that whether he knows it or chooses to recognize it or not, Fryer is proving that the tasks he's incentivizing in the schools are not creative tasks. If they were creative tasks, then the students' performances surely would not have improved. If anything, they would have gotten worse. The students in the experiments were definitely not doing anything they had any initiative to do on their own. They had to be given money to do it. You can't force a student to be creative. Yes, I know that Fryer probably wasn't trying to prove anything about creativity in schools, but he certainly did.

Perhaps there are people who believe that doing non-creative activities in school is the way to educate our kids--but those people are few and far between. (I've always wanted to say that.) For those people who know that creativity is the thing we need to allow and encourage in schools, here's some more proof that it's not happening.

People who are creative and curious are going to find a way to learn. It's irrefutable. Instead of letting kids take the first stair--creativity and curiosity--we're trying to make them take the second stair, which is the love of learning and the knowledge of how to learn. We have to let them take the first stair before we make them take the second.

"When researchers asked them how they could raise their scores, the kids mentioned test-taking strategies like reading the questions more carefully. But they didn't talk about the substantive work that leads to learning. 'No one said they were going to stay after class and talk to the teacher,' Fryer says. 'Not one.'"

10/30/10

School is a Breeding Ground for Cheaters

This article from Psychology Today is fantastic. It's very true.

These are just some snippets I found meaningful. Go here to read the whole article. I highly recommend it.

"They learn that their own questions and interests don't count. What counts are their abilities to provide the "correct" answers to questions that they did not ask and that do not interest them. And "correct" means the answers that the teachers or the test-producers are looking for, not answers that the students really understand to be correct."

"Students recognize that it would be impossible to delve deeply into their school subjects, even if they wanted to. Time does not permit it. They must follow the schedule set by the school curriculum. Moreover, many of them have become convinced that they must also engage in a certain number of formal extracurricular activities, to prove that they are the "well rounded" individuals that top colleges are seeking. Anyone who really allowed himself or herself to pursue a love of one subject would fail all the others."

"Teachers often say that if you cheat in school you are only cheating yourself, because you are shortcutting your own education. But that argument holds water only if what you would have learned by not cheating outweighs the value of whatever you did with the time you saved by cheating."

"One of the tragedies of our system of schooling is that it deflects students from discovering what they truly love and find worth doing for its own sake. Instead, it teaches them that life is a series of hoops that one must get through, by one means or another, and that success lies in others' judgments rather than in real, self-satisfying accomplishments."

10/25/10

Smart People

I'm sure there are a lot of people who think that those around them are dumb because they never talk about anything interesting. The truth is there are probably some people that really have nothing interesting to talk about most of the time, but I think many very smart people don't consider sharing what they know because it's so obvious to them, and instead of being something always on their mind, it becomes a part of them--so much so that they don't even consider it knowledge.

Say someone was watching you work a math problem and you multiplied 9 by 8 in your head. They ask how you got to 72 so quickly. You'd probably think, "9x8 is just 72..."
Our multiplication tables are so deeply ingrained in us that we don't really even consider them knowledge.

When someone learns about, gets better at, or practices something, they lose sight of what's common knowledge. And what tends to happen is they lose the incentive to try to share with those on a different level of understanding, because it's frustratingly futile.

Regard my cheesy drawing depicting competence on an artful level.


In short, the stick man on the upper level doesn't have the motive to descend to the man on the lower level using the dumb elevator. He'd rather hang around and wait for someone to come up a level to talk to him.
(I suppose a set of stairs would have been more befitting to convey my point, but I already closed Microsoft Paint.)

10/17/10

Sir Ken Robinson on Changing Education Paradigms

Here is a talk by Ken Robinson at the Royal Society for the encouragement of Arts (RSA), a rising organization that embodies much of TED’s vision.

10/14/10

Myopia

It scares me how little I know and how small we all are, but how much we think we know. Does anyone feel the same way? I hate to think that some people hide in a little cave of myopia, but myopia is everywhere. If you think about it, animosity, pretentiousness, conceit, anger, depression, and even happiness are somehow tied to myopia. If we could always see the other side of things, what would our emotions be like? If we're angry, are we failing to acknowledge the precariousness of the other person's situation? If we're always happy, are we missing a depressing side of everything? If we're depressed, are we missing all that is good? I like to think the last is true, and even if it's not, I choose to live by it, obviously, because it makes me happy. Perhaps our limited perspective lets us be motivated. Maybe we become ambitious because we can't see everything ahead of us. I think when I'm older I'll look back and think that things I tried were stupid, but I'll realize that if I hadn't done them, I wouldn't have gained the experience that led me to do whatever it is I am glad I did.

It seems that as the human mind learns more about the world around it, things lose value. Some things that you find entrancing lose their awe when you realize that there is so much more out there, and many times, you realize you can't get to it.

The question is whether we should remain ignorant and enjoy ourselves, or strive to be wise and acquire knowledge. Unfortunately, the only way to find out is by acquiring more knowledge and becoming wise. It's a strange paradox, but since the only way to know is to become wise, we should do so.

Steps we can take to get out of the cave of myopia:

1)
Be curious

Ask yourself what everyone might be thinking when people are expecting you to think your own thoughts. When you're thinking about something, think about why you are. Think about what led you to the thoughts you're having. Be skeptical of your thoughts. When you do something irrational, try to prove that it was rational. It will ultimately lead you to being more rational. When something is interesting to others and not to you, figure out why. Think about why the things around you work the way they do. Try to guess what people will say next, and try to figure out where their thoughts might be coming from or what their premise is.

2)
Be empirical

Don't be boring. Don't do the same thing every day. Take a different way home from school or work than you normally do, try driving slowly if you normally speed, and vice versa. Listen to music you normally don't. Read a book that doesn't look interesting to you. Talk to someone you don't know. Do something that makes you feel uncomfortable and get comfortable with it.

3) As banal as it sounds,
consider the golden rule. (Wikipedia's definition makes it seem less cliche and more applicable, so read that if you have trouble taking it seriously.)

I hate hearing that, and it hurts to type it, and you probably stopped reading for a second after reading it, but the real value behind it isn't as cliche as the phrase. Because we cannot live inside another's mind, we have to rely on our own experiences to judge whether how we act is justified and circumspect. Before disregarding this, think about it. The best way to work this into your instinctive, reflexive thought is this: when someone is angry with you for doing something, remember your thought process on that occasion. Make a conscious effort to commit to memory the defensive strategies you prepare in your mind to counter your offender. Then, the next time you are angry at someone, find this experience deep down in your brain, or at least the feelings from it, and consider them. If you do this enough, it becomes a reflex, and you'll find it increasingly easy to tolerate others. Make your anger rational. It's hard to do once you consider what the other person might be thinking.



If everyone would be a little more curious and empirical, so many problems with society would be solved right away. I'm sure of it.

I feel like I'm writing some kind of self-help book on anger management or something. I don't know what happened, but I still think these things work and make people more perceptive. I don't mean to sound like I know everything about being wise. I certainly don't; these are my attempts at explaining how I think myopia can be overcome. This is really more a way for me to sort all my jumbled thoughts.

10/11/10

What the weekend has taught me that school cannot...

Something peculiar happens nearly every weekend for me--something that never happens at school. Starting normally on Saturday night or Sunday morning, I find a deep curiosity for something, it escalates, and I pursue it until Monday morning. This weekend it was blind euphoria, last weekend it was cognitive dissonance. There's always something interesting to learn. The frustrating and increasingly evident fact is that these kinds of curiosities and interests are never provoked in a school setting. There's something about the way we've been taught for so long that discourages our own curiosity, and we don't even realize it. We get "educated" away from our curiosity in a school system that places right answers above questions and general intelligence above specific intelligence.

Today I learned how blue screens work, how ribbon microphones are manufactured, how ink is made, what clairvoyance has to do with pre-cognition and retro-cognition, what hot, cold, and warm reading are, that listening to an audio book at twice the normal speed is twice as productive, and a myriad of other things, and I got to talk to a very talented cellist about compositional efforts and music education. All of these things I attribute to productive education (I know, at least, that the conversation I had with the cellist will greatly influence how I deal with my future), but I did them all while I should have been reading the Odyssey and taking notes from my government book. I do suck at time management, and I do realize, teachers, that I could have done my homework and THEN done these things, but the reason I do this is that if I stifle my curiosity while it's at its highest point, I'll lose sense of how to achieve it and I'll lose my love of learning. I've learned that the best way to be curious is to never place anything above the need for curiosity. The reason I think students do not have high curiosity in school is because they are trained to ignore it. They're trained to get the "right answer" instead of questioning why the "wrong answer" is wrong.

A demonstrative scenario:

A teacher instructs a student to solve for the final velocity of a free-falling object on a test. The student arrives at the correct answer, yet the teacher marks it wrong because the student did not use the calculus-based method taught in class the day before.

The "educated student" -
Without questioning the context in which he's solving for the velocity, he works the problem logically, using the physics that makes the most sense to him. The student asks why he got points deducted and is told that he must use the "correct method." He quietly submits with "Okay" and goes back to his desk.

The curious student -
Asks why he is solving for the final velocity. He receives the overused excuse of "Because you need to know how." He seeks out a situation in which final velocity would need to be determined and ponders the benefits of solving for it. He then solves the problem using the physics that makes the most sense to him. He asks why he got points deducted and receives the same response as the 1st student. Instead of quietly submitting, though, he asks why the calculus-based method is more efficient than the physics-based method. He argues his case but considers the teacher's reasoning for why the calculus method is more efficient. He learns the calculus-based method and applies it to the next problem he solves. He then commits to memory the different situations in which each method should be used and the reasoning behind both, and considers their application in any other discipline that might concern him.
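For what it's worth, the two methods in the scenario really do land on the same answer; here's a quick worked version with numbers I made up for illustration. Take an object dropped from rest and falling for t = 3 seconds:

The kinematics route: v = v0 + g*t = 0 + (9.8 m/s^2)(3 s) ≈ 29.4 m/s.
The calculus route: with dv/dt = g and v(0) = 0, integrating gives v(t) = g*t, so v(3 s) ≈ 29.4 m/s.

Same number either way; the only thing the grade rewarded was which chain of reasoning the teacher happened to present the day before.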

The first problem with both situations is that the teacher marked the problem wrong based on the rationale that the method he teaches is what the students must follow.

The second problem is that the teacher gave no context for the problem, which leaves the students with no real-life application for it. In order to learn something, students need a reason to learn it. They need to want to learn it.

The third problem is that conventional education systems have way too many students like the 1st student. They have no impetus to determine why their teacher's method is the best, and they don't attach what they learn in school to their lives outside of school.

The fourth problem is that students are rewarded for following the 1st student's approach. Doing exactly what the teacher tells you to do will get you an 'A', but this approach is like reading a step-by-step instruction manual for playing Monopoly that tells you every property to buy, which properties to build on, when to build, which token to use, with whom to trade, and what to trade. There's no fun in that. If you played that way, you would be bored and never learn why what you did produced the results you got. The potential for adaptability is lost. You would lose interest very quickly and there would be no innovation in playing.

All these problems are inhibitors of curiosity. I believe that real learning cannot happen without curiosity.

Students are losing interest. They're losing curiosity because they're being educated out of it.

If students are educated in order to be able to govern and direct the future, how can they do so without innovation? Innovation is derived from curiosity. You cannot have innovation without it. We don't know what the future will hold, but the type of education that we have is based on the presumption that we do.


Update: As a result of staying up late to write this, I probably won't get up in time to read the SparkNotes of the Odyssey for the quiz in 8 hours. Too bad I missed out on Homer's story of the great Odysseus for an explanation of why I couldn't care less.

10/10/10

Sokanu: The Blog

Whoever has not visited this blog should do so. The creator of the blog got his inspiration from Ken Robinson just like I did. It's aimed at helping people find their passion and it has some excellent passages about education.

http://blog.sokanu.com/

Blind Euphoria

Two nights ago, after returning home from the "homecoming football game," which I did not actively watch, I got to contemplate the social paradigm I'll call "blind euphoria." It happens without fail at every large social event I've been to. And it seems that the best way to recognize this paradigm is by consciously placing yourself outside of all the excitement enveloping you. What seems to happen to attendants of these large social events is that, as a result of planning to enjoy their evening and completely immersing themselves in a certain effort (the football team, the band, the student section, the walk-around-and-look-cool people), they tend to lose their normal perception of propriety and of others' perceptions. When you accomplish something fantastic, and you are just euphoric, you believe that all eyes are on you, but you don't mind it; in fact, you enjoy it. You feel more significant to others than you actually are. And even the awareness that you are experiencing blinded emotions does little to mitigate your distorted sense of propriety. You still cannot help but have pride in yourself. But this is definitely not a bad thing, because if those around you share the same state of mind, atypically progressive things can happen.

When two people have a conversation with each other while both permitting blind euphoria, a new understanding of one another, or a new relationship not attainable in another setting, is created. In this kind of circumstance, people tend to act in ways they normally would not, like greeting someone they never talk to or shouting something personally hilarious but not publicly hilarious. Perhaps the reason that memorable experiences are associated with large social gatherings is that the energy normally put toward conducting oneself adequately and avoiding awkwardness is put toward living in the moment instead.

It's captivating to think: if there were some way to keep such an elevation of emotion in place, would it be as enjoyable? Would we actually experience euphoria constantly? Just like the question of "Is there light if there is no dark?", it could be argued that there would be no euphoria because there would be no absence of it, but imagine if there were. What would perennial euphoria be like? I think the benefit of having only temporary euphoria is having the depression that allows us to contemplate euphoria. That itself might prove to be a better alternative to experiencing euphoria perennially. There really isn't any application in questioning it, but it's an interesting concept that, if studied, leads to new realizations.

The most fascinating and enlightening thing for me to do is to step out of this poignant realm in which I find myself engaged and observe this blind euphoria happening. Seeing yourself objectively rather than subjectively is a door into a world of curiosity otherwise latent.