Yesterday and today, I taught my Fall semester classes for the first time. And in keeping with my desire to record my experience (for my reflection and your, um, entertainment), here’s something these “first classes” have made me realize: I have a really tough time not centering my courses around me. This will take some work.
Here’s some background. For the first day of my classes, I like to start right off with activity – no reading the syllabus at ’em. So, all three face-to-face courses (despite being different courses) followed the same format. Students got into randomly assigned groups of six and took turns telling stories about their most memorable classroom experience. While one told a three-minute story, the others used post-it notes to write down one or two words the story made them think about. Once everyone had told their story and written post-its about the other stories, they worked in groups to organize the group’s words into categories. (In one class, each group had to come up with two metaphors for what school is and use those to organize their words. The other classes had to organize the words by whether they fell under learning, motivation, or assessment.)
So, there was a lot of me resetting the online timer, walking around the room, and standing there. I’d interact with students and listen to stories being told, but mostly, the students ran the show. And I know that is how it should be. But here’s the thing: I noticed myself getting awfully fidgety during those three-minute (and, at the end, eight-minute) segments.
I think one reason I felt fidgety – like I should be DOING something – is that, subconsciously, I have picked up the message about teaching and learning that most of us do: good teaching means the teacher is doing something at all times and, maybe ideally, is the central focus of what goes on in the classroom. But when your class period is designed in a way where students can move and do things themselves – and where you don’t have to be the Sun in the room around whom student-planets must orbit – the teacher (well, at least me) will be left with the awkward feeling that, at that moment, he should be doing more.
Today, I watched a really intriguing Intelligence Squared debate on the motion “Smart Technology Is Making Us Dumb.” Nicholas Carr and Andrew Keen argued for the affirmative; David Weinberger and Genevieve Bell argued for the negative. For my part, I agreed most with Genevieve Bell, whose main point was that there is a lot more to the story than whether technology IS making us smart or stupid, mostly because there are a lot of smart technologies and a lot of ways we can use them. I agreed least with Andrew Keen, whose conservative perspective seemed to be that if we can find even a few ways “we” are using technology that he thinks aren’t substantive (Twitter seems to be his favorite example), then that settles the matter.
But one thing I noticed is that the debate really hinges on what we mean by “smart,” “dumb,” and, for that matter, “intelligence.” Carr seemed to tie these things to short-term memory and attention; Weinberger seemed to argue that all intelligence requires is having access to information. Here is a comment I wrote below the debate that may be interesting to share here. (And, of course, check out the debate to form your own conclusions.)
Warning: since you are reading this on a blog, Keen may accuse you of being stupider for it.
In less than a week, I jump in with both feet. As a college professor who loves teaching, I try to vary my pedagogy every year or so to keep things fresh. This summer, I’ve done a lot of research on Socratic pedagogy, and the more I’ve read (and talked with those who’ve used it), the more I’ve liked what I’ve heard. So, in less than a week, when the Fall semester starts, I will be taking the plunge and running one of my classes almost entirely Socratically. So that I can track (and let others in on) my experiences, I am going to blog about doing this throughout the semester (and maybe the upcoming school year).
What is Socratic pedagogy? Well, in brief, it is running the class in a highly discussion-driven way. While I generally keep lecture to a minimum anyway, this semester I will do as little lecturing as possible. The course will be centered around texts I have students read, and my role will largely be confined to crafting discussion questions that we can have conversations about in class. But even beyond that, it won’t resemble the traditional “discussions” we see in most classrooms, where the teacher stands at the front of the room and calls on students, who do their best to craft the answers the professor/teacher wants to hear. Socratic pedagogy generally demands that students face each other during discussion and talk to each other rather than to the teacher, while the teacher either sits and discusses with the students or stands outside the discussion, occasionally jumping in to move the conversation forward. Quite literally, Socratic pedagogy is about as student-led as things can get.
So, why am I so excited to try this? There are a few reasons.
Recently, Pittsburgh Steeler James Harrison made headlines by returning two participation trophies given to his children – student athletes. In so doing, he earned a lot of internet praise. Here was his original Instagram post explaining his decision:
I came home to find out that my boys received two trophies for nothing, participation trophies! While I am very proud of my boys for everything they do and will encourage them till the day I die, these trophies will be given back until they EARN a real trophy. I’m sorry I’m not sorry for believing that everything in life should be earned and I’m not about to raise two boys to be men by making them believe that they are entitled to something just because they tried their best…cause sometimes your best is not enough, and that should drive you to want to do better…not cry and whine until somebody gives you something to shut u up and keep you happy. #harrisonfamilyvalues
The only way I can describe my thoughts on this is that they are mixed. I’m definitely not as sure about Harrison’s decision as some seem to be. So let me explain a bit of why I am having trouble.
First, the whole message seems to be that these prizes were unearned. I’m not sure about that. A prize is earned when you meet the criterion (or criteria) for the prize. And in this case, that is exactly what the student athletes did. The prize was for anyone who showed up; the students showed up, so they met the criterion for the prize. Harrison (and others) may disagree that there should be a prize for showing up, or may think that showing up shouldn’t be sufficient to earn a prize. But that is a different argument altogether from saying that the prize was unearned.
Much proverbial ink has been spilled in the past year or so debating whether or not professors should ban computer technology in class. A lot of it comes down on the side of disallowing students from bringing or opening laptops, cell phones, and other potential distractions in class. I respectfully dissent from that viewpoint, and I’d like to offer some reasons why.
[First, a caveat. Since the context in which I teach may be different from others, I should explain my general situation. I teach in a College of Education, and the classes I teach generally don’t get larger than 30 students. Thus, I am able to do more activity-based learning than professors stuck with larger lecture-hall class sizes.]
The primary reason I am concerned about the efficacy of banning computer technology in class is that such bans assume that computer surfing in class is a cause, not a symptom, of inattention. That is, many professors seem to assume that banning computer technology will stop students from thinking about other things, like their friends or what they want to do after class (or will at least make it significantly harder). I am skeptical that this is how attention works. I think all of us 30-and-overs remember our K-12 experience, where we were unlikely to have computer technology at the ready, and that didn’t stop us from not paying attention. When we think about something other than what the professor wants us to think about, it is generally a sign that what we are thinking about is, to us, more engaging than what the professor is talking about. Computer technology may give us a way to act on our boredom or inattention, but disallowing computers in class is unlikely to alleviate that inattention.
Currently, I am reading a (so far) fantastic book on how to do Socratic (highly discussion-based) pedagogy, called Teaching With Your Mouth Shut. So, a few of my coming entries will be about that book. In this entry, I want to reflect on how some of the book’s message reflects what I call my ‘facebook philosophy of education.’ I developed this philosophy while working as a graduate assistant at the University of Delaware’s Center for Teaching and Assessment of Learning.
Let me explain. The dilemma the Center was having was that while they often advocated that professors do less lecture and more activity/discussion (“teaching with your mouth shut,” so to speak), professors would find that students often gave them lower evaluations when they did this, with comments about how the professor didn’t seem to do much teaching. (That led students to believe that the teacher who didn’t lecture was probably less competent than the teacher who did.)
My own observation has confirmed this a bit, and it is likely due to teachers and students falling for what I sometimes call the “labor theory of teaching and learning.” Like Karl Marx’s labor theory of value – where a thing’s value amounts to the labor used to create it – teachers and students often believe (wrongly, I think) that how much teaching or learning is going on can be surmised by looking at how much labor is visibly going on. This can be seen, for instance, when teachers believe that the more homework students do, the more they must be learning (and avoid assigning too little reading or homework for fear that the class will be too easy). It can also be seen when students equate teaching with lecture: if I don’t see the teacher up in front of the class lecturing, that means she just isn’t teaching. (Of course, we often get these ideas by absorbing what happened in our own schooling, where lecture often was what counted as teaching, and parents and teachers believed that a class’s workload was a proxy for how much learning went on in the course.)
If you’ve followed this blog, I am very sorry for the EXTREMELY extended absence. Frankly, as soon as I started work on my dissertation a few years ago, I found I didn’t have time to post and forgot all about the blog. 😦
I actually rediscovered my own blog (a strange thing to say!) sort of by accident. I was searching Google for anything written on Richard Taylor’s essay “In Praise of Wisdom” and came up with, well, the last post I’d written, from 2013. Coincidentally, I was thinking about re-entering the blogosphere, as I find it a fun way to get down and share my thoughts without investing in a more extended journal article.
Anyway, I’m back. And a lot has changed. I have my PhD and am now a Teaching Assistant Professor at East Carolina University, in their College of Education. There, I teach courses on learning and motivation theory (Learning, Motivation, and Assessment), diversity issues in the classroom (Intro to Diversity) and philosophy/history/sociology of education (Foundations of Education).
I’m excited about starting the blog again, and I hope you are at least willing to read some new posts. Some of my views have changed, some have strengthened, and I have acquired some new interests, like “extended” theories of mind (which I’ll probably be writing a bit about in short order).
Now, back to your regularly scheduled program!
What is philosophy for? What can it, and can it not, be expected to do? I have been thinking about these questions a lot lately. First, I will be teaching a class this fall to undergraduates regarding ethical and legal issues in education; I want to make sure I use philosophy to good effect and know I will have at least some students with (good, bad, or other) expectations for a philosophy course. Second, with all the emphasis on data-driven research, we philosophers of education (and other fields) sometimes feel like we’re on the defensive, having to justify ourselves in ways that other researchers don’t.
Well, recently I stumbled on a really interesting answer to the questions of what philosophy is for and what it can, and cannot, do. Richard Taylor’s essay “Dare to Be Wise” (Taylor 1968) has a bold, but satisfying, thesis that philosophy has taken a mistaken direction in questing for philosophic knowledge:
I shall maintain that there simply is no such thing as philosophical knowledge, nor any philosophical way to know anything, and defend the humble point that philosophy is, indeed, the love of wisdom (615).
I want to briefly rehearse Taylor’s argument before discussing why I see his view as a very ennobling one for philosophy. Briefly, in suggesting that philosophy is about wisdom rather than knowledge, Taylor holds that philosophy should not try to be like other disciplines, but should offer something distinctive that other disciplines can’t as adeptly provide. And, of course, I also happen to think Taylor’s argument is basically true.
Taylor starts with Socrates and the Greeks (Stoics, Epicureans). He suggests that the works they produced, and what they (likely) saw themselves as doing, offered wisdom rather than knowledge. Knowledge is the search for what can be demonstratively proved and is true in a factual sense. Wisdom is a deep acquaintance with a problem, sensitivity to its subtleties and parts, and (possibly) an acquaintance with possible rule-of-thumb-type answers. While this may be a bit of an oversimplification, think of Aristotle’s Nicomachean Ethics as the quest for moral wisdom and Kant’s Metaphysics of Morals as the quest for moral knowledge; in the former case, Aristotle thinks through some moral problems and reasons about some overall possible solutions that are subtle, flexible, and not considered to be ‘true’ in any provable sense. Kant, on the other hand, had it as his mission to discover via reason a moral imperative that could be proved every bit as true as a law of physics, and that was invariant to circumstance, social convention, etc. (Where Taylor may miss the mark about the Ancient Greeks is with Plato, who conceived of the philosopher as the one who could, via reason, attain the truth in the midst of those who saw only appearance.)
After a long, unintentional hiatus from posting on my blog, an unforeseen question has beckoned me back to write a post. The question (not really in my area of interest, but fascinating nonetheless) is this: what gives someone the right to be a (literary, cultural, social, etc) critic?
The question was posed on a Guardian Books Podcast (available from iTunes) called “Life, Death, and Literary Critics” (2/4/2011). Toward the end of the discussion about what literary (and other) critics do and their importance, one listener comment asked what exactly gives someone the right to be a critic.
The answer one critic gave – the only answer given on the show, which is a shame, as it seems wrong – was something like “knowledge of one’s subject.” Why does that seem wrong? To me, “knowledge of one’s subject” seems, at best, to be a necessary condition for being a critic, not a sufficient one. One may have knowledge of a subject but not be a good writer, or not have very good taste, and it seems to me that many would be reluctant to call that person a critic. It also seems to me that knowledge of one’s subject isn’t ALWAYS a necessary condition for being a critic: one can have only fair knowledge of one’s subject but really good taste and instincts, be a good writer, and be a critic, where someone with great knowledge of the subject, but lesser instincts or taste, would not.
To be honest, the obvious answer I recall practically blurting out during the podcast in response to “What gives someone the right to be a critic?” is…nothing – nothing except having the urge and follow-through to offer a critique. And if one is lucky (or offers a product that others find value in), one’s status as a critic will become stronger the more others appraise you to be a legitimate critic (using whatever criteria they want to use).
Part of the problem, I think, is that when we ask “What gives x the right to y?” we are really asking something like “Why is x entitled to y?” Indeed, that is the sense in which the listener seemed to be asking “What gives someone the right to be a critic?” So, if the question is whether anyone can be called entitled to be a critic, I think the answer is a pretty obvious “no.” Now, we can ask why James is entitled to be a teacher in the state of Maryland, or why Josephine is entitled to practice psychiatry in the state of Wisconsin, but in those cases, the answer is largely that they have jumped through the (justified or not) hoops that gave them the license which thereby “entitles” them to be a teacher or doctor. In fact, the word “entitle” is pretty much a legalistic term that means roughly “to have been given the title,” and that is precisely what a certification is – a title that grants an “entitlement.”
But a critic? There is no certification for that. One can be an English major, or a political science major, but whether one is entitled to be a critic doesn’t seem to be dependent on whether one has gotten a certain title as much as whether one’s writing performs the role of giving a critique (and whether others who read the work concur that the writing does that). So, no one is entitled to be a critic; one must earn the title in the way one earns the title “recording artist” or “poet.” One earns the title by performing the role that people in those categories perform.
But, we can object, not everyone who scrapes together the money to record their songs in a basement studio REALLY is a recording artist. Well, in a way that is correct and, in a way, incorrect. In a literal sense, they are a recording artist because they have recorded artistry, just as anyone who has collected baseball cards was, at that time, a baseball card collector. But whether the basement-studio singer is a SUCCESSFUL recording artist, or is acknowledged by listeners to be a good one, is another question – related but different.
Now is where I’ll suggest that maybe the listener’s question was phrased wrong: rather than “What gives someone the right to be a critic?” maybe the better question is “What conditions must someone meet to be considered a critic by others?” Not “What gives someone the right to be a recording artist?” (answer: enough money to record artistry) but “What conditions must someone meet to be considered a recording artist by others?” (answer: talent, good material, a product others want to listen to).
This is where I think it simply comes down to consensus. You are a critic if you offer a critique, and you are a critic to others when others consider your critiques worthy of being read and acknowledged as good critiques. I am sure this might drive many batty, as it is very relativistic, beauty-in-the-eye-of-the-beholder kind of stuff. And many will object that x may be considered a critic because they have a blog that is read by others, but they really don’t have a good grasp of what they critique, have poor taste, etc. So, am I saying they are REALLY a critic? Well, yes. Objections like this generally come down to saying: “Well, I judge their work to be unworthy and wish others would do the same. To me, they are not a critic, and I wish they were not held as a legitimate critic in others’ eyes either. If others saw it my way, used the criteria I use, or had the knowledge I have, they wouldn’t see that person as a critic. Therefore, they are wrong to see that person as a critic, and the person would not be seen as a critic but for the fans’ mistakes.”
But none of that erases the fact that if we asked of this blogosphere critic “What gives them the right to be a critic?” the answer would basically be that the fact that they offer a critique some people find useful makes them a critic to those people.
And honestly, I think that is the best we can do… unless we can find some really good sufficient conditions strong enough to trump my subjectivistic theory. If we could find an instance where, say, someone has millions of fans who view that person’s work as good criticism, but we came up with a theory of sufficient conditions for criticism strong enough to show that, despite being called a critic by millions, they are really not a critic at all, then my theory would be disproved. (But in reality, I think any such theory could be reduced to the theory’s inventor coming up with THEIR OWN standard for how they judge who is a worthy critic, arguing that everyone else should just adopt that same standard, and that anyone who doesn’t is wrong.)
So, I think it was a shame that the question “What gives someone the right to be a critic?” was badly answered. The answer given may have been intuitive to some critics, who really do not want their status as critics to be wholly dependent on a market process, and who want their work to be something more than products that must appeal to consumers before imparting any superior knowledge. But I just think my answer is more convincing.
Recently, there has been a plethora of books put out by journalists like Malcolm Gladwell, Stephen Dubner, Steven Levitt, and David Shenk dealing with scientific issues in a way accessible to the lay public. I’ve heard some folks call this the “Gladwellization” of information. Of course, this was meant as a pejorative, meaning something similar to the “simplification” of information. And in a sense, that is what these journalists are doing: taking highly specialized information from disciplines and writing books intended to explain it to, and interest, the lay public.
Instead of either praising or bemoaning the trend, I want to think about the benefits and costs of this new emerging breed of popular science writer. First, it should be noted that while the trend of writing science books for the lay public is not new, it has tended to be done by that rare breed of scientist who has a knack for distilling complex concepts into simple and readable prose: Gary Becker in economics, Stephen Jay Gould and Richard Dawkins in biology – all of these writers are bona fide experts in their fields as well as popular writers devoted to explaining their science to the masses.
Not many will argue that this is not a useful or necessary endeavor. Well, correct that: it is not necessary that the general public understand genetics or supply-side economics, but it is certainly good to provide the option to interested members of the lay public. After all, no one wants to see a world where only scientists understand what scientists do, and only economists can understand economics.
But these new journalists – Gladwell and the like – are just that: journalists. They are often writing about areas they do not hold degrees in, reading journal articles aimed at a technical audience, and (often without technical acumen) interpreting data for us. So, it raises a question: in what ways is this a good thing, and in what ways is it a bad thing?
To attempt an answer, it is necessary to understand that all of this is being done on account of the division of labor (or, we might say, the division of knowledge). The sciences, and most other fields we see nowadays, have advanced to a point where careers are devoted to them. That is a good thing in that it means we are advancing a great deal (because the more advanced the field and its knowledge, the more time initiates must spend mastering that field and its knowledge). But there is a downside here: it kinda leaves the general public – those whose careers are not devoted to that particular science – behind. (Even professionals in other closely related fields often don’t know the ‘inside’ information of fields around them because each sub-discipline now often rests on knowledge exclusive to those in that sub-discipline.)
So, the obvious good of having journalists write books distilling these often heady and complex fields for the lay public is that it, to the degree possible, disseminates previously monopolized information. In a sense, it lessens the barrier between disciplines and between the layperson and the expert. True, I don’t exactly need to understand molecular biology or chaos theory, but at least I can, in theory, get it if I want it. In keeping with the American (and European) tradition of egalitarianism, these journalists make sure that everyone’s got a shot at this information.
But here is where I see a subtle downside.