What the heck is research anyway?
December 21, 2011 10:35 PM Subscribe
Tell me about it. I'm going to be up all night with a proposal...
posted by mr_roboto at 10:40 PM on December 21, 2011
Plagiarize! Plagiarize! Why do you think God made your eyes?
...but please, to call it "research"!
posted by Chocolate Pickle at 10:40 PM on December 21, 2011 [4 favorites]
When I schedule my tasks I always add time for research which actually means look at pictures of cats on the Internet.
posted by Ad hominem at 10:47 PM on December 21, 2011 [5 favorites]
and many nights we drink bourbon worrying about...well everything.
posted by babbyʼ); Drop table users; -- at 10:47 PM on December 21, 2011 [2 favorites]
"Re-search" is when you keep looking for something in the same place because you can't believe it's not where you're sure you left it.
Dammit, it was right there just a minute ago...
posted by Greg_Ace at 10:51 PM on December 21, 2011 [2 favorites]
"Research is what I'm doing when I don't know what I'm doing."
(Wernher von Braun)
posted by philip-random at 11:02 PM on December 21, 2011
Research is what you're doing at 2 o'clock in the morning because without some furrow to work in, some hard knot of something to grind away at, your mind loops back and you begin to gnaw at your own self.
It's what you do instead of sitting upstairs and watching your sleeping child or crawling in between the warm sheets with your lover. It keeps your dog whining, your parents anxious, your colleagues expectant and your peers nervous.
It can't be rendered down into a few sentences that anyone can understand because it is the mind working at the edge of its own comprehension, a windswept cliff where words are the first things to take flight.
It's an itch of an intuition that keeps nagging you as you look, for the three-hundredth time, at the text you've been working on for half a decade. It's a turn of phrase that is so perfect you start to construct an argument that could sustain it. It's a flash of realization in the bathtub that keeps you working it over in your head until the water's gone cold. It's the joy of having a genuinely new thought and the sober realization that you are now obliged to work for months at midwifing it into the world.
In a world where most people are trying to sell something, steal something, hang on to something or pretend at having something, it is the making of things themselves.
posted by R. Schlock at 11:10 PM on December 21, 2011 [25 favorites]
A complex, long-winded treatment of a reasonably simple idea, such that most people who have the time and inclination to get through it won't learn anything new. I'm not sure who the audience for this article is supposed to be. Extremely bright children?
posted by Kwine at 11:24 PM on December 21, 2011 [1 favorite]
Or say people who don't know that much about university structure? It's pretty complex, and if you aren't involved in it, you just don't know what it entails. I found the article interesting; it's a window into a job that's very different from my own. One of my friends is a professor, and now I think I have a better understanding of what she does. I can understand why the author wrote it; nobody outside of my industry knows what my job is either. On one hand, it's hard to explain, but on the other, my career is a big part of my life, and it would be nice if the people I'm close to knew about it. Hmm, maybe I should write my own letter like this.
posted by 5_13_23_42_69_666 at 12:57 AM on December 22, 2011
He makes research sound so much simpler and easier (and less satisfying) than it really is, because he left out the failure modes.
(1) We can't pick the right question out of the vast miasma of our collective ignorance.
(2) We can't come up with a method that has any chance of actually answering the question, either because we picked the wrong question, or the right question is just too damn hard for the tools we have at hand. Back to step (1).
(3) We don't even know what tools we need to answer the question, or the tools are too expensive (relative to our level of funding) or require more people-hours than we have hours or people, or collecting the tools requires solving other problems that we didn't realize were still unsolved. Or upon reading the background literature, we discover that the question we came up with was completely answered in a 1954 paper in the Proceedings of the Steklov Institute, which is only available in Russian. Back to step (1).
(4) Our equipment breaks down; our assistants get sick, or get stuck in Iran waiting for a re-entry visa, or leave to work for Google; our IRB paperwork is lost in administrative limbo; our hard disk crashes and our offsite data backups are lost in a flood. The method we thought of just doesn't work the way we thought it would, because of some subtlety we didn't think of, which nobody talks about in the prior literature. Back to step (1).
(5) The data is too noisy to extract a signal one way or the other, either because there is no signal to extract (Wrong question! Back to step (1)!) or because the standard analysis tools aren't powerful enough to extract it, despite claims to the contrary in the literature, and we need to develop new tools, which requires solving some statistical problem we thought was already solved. Back to step (1).
(6) We get a stupid editor or a stupid reviewer who doesn't understand anything we've written, who questions our basic methods because they differ from those described in a famous 1954 paper in the Proceedings of the Steklov Institute (which is only available in Russian), who thinks the question is obvious or boring, who thinks our conclusions are "obviously wrong" without pointing out the flaw in our analysis, who either thinks the math is too formal or thinks the math is too hand-wavy, who insists on rejecting any paper for which source code is not freely downloadable, who rejects the paper because one of our assistants already posted a vague description of our experiment on his LiveJournal, or who is applying for a competing NIH grant. Or we find a smart reviewer who finds a major flaw in our methods and/or analysis, or who points out a beautiful English translation of that 1954 Steklov paper, or who insists that we actually write in coherent English. We don't get tenure. Meanwhile, someone else publishes a paper answering the same questions, using similar techniques, and arriving at the same conclusions, and wins the Nobel Prize. Back to Step (1).
posted by erniepan at 1:47 AM on December 22, 2011 [29 favorites]
I'm not sure who the audience for this article is supposed to be.
The answer is in the footnotes:
"The ideas described here would offend our colleagues who consider themselves “post modern” or “deconstructivists”. They don’t believe in replicable knowledge or anything roughly thought of as the scientific method. We mostly humor them as they eat their own departments and fields from the inside out and then we take their faculty lines for our own."
The audience is deans.
The message is: don't treat us like Those Unspecified People Over There. Treat us like the real, honest-to-heck Scientists that we are. I imagine that there are studies of this sort of mindset, but campus politics in a world of rapidly shrinking budgets provides a quicker explanation.
posted by GeorgeBickham at 2:11 AM on December 22, 2011 [6 favorites]
I used to describe what I did as "rip aside reality's veil and gaze upon its face." Of course, when I did this, I would preface it with "Time to...." and be doing an impression of "Fred the Baker" from the Dunkin' Donuts commercials of my youth.
posted by Kid Charlemagne at 3:34 AM on December 22, 2011
The data is too noisy to extract a signal...
Pssst, Fourier transform.
posted by Kid Charlemagne at 3:37 AM on December 22, 2011
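For anyone outside the field wondering what that quip points at: a Fourier transform re-expresses a time series as a sum of frequencies, so a periodic signal that is invisible in the raw samples can show up as one sharp spike in the spectrum. A minimal sketch in Python, with all specifics (the 5 Hz tone, the noise level, the sample rate) invented purely for illustration:

```python
import numpy as np

# Hypothetical example: a 5 Hz sine buried in noise twice its amplitude.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000, endpoint=False)     # 1 s sampled at 1 kHz
signal = np.sin(2 * np.pi * 5 * t)                  # 5 Hz tone
noisy = signal + 2.0 * rng.standard_normal(t.size)  # drown it in noise

spectrum = np.fft.rfft(noisy)                       # frequency domain
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
peak = freqs[np.argmax(np.abs(spectrum[1:])) + 1]   # skip the DC bin
print(f"dominant frequency: {peak:.1f} Hz")         # prints ~5.0 Hz
```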
Pssst, Fourier transform.
Principal component analysis.
posted by mr_roboto at 4:04 AM on December 22, 2011 [3 favorites]
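And mr_roboto's one-liner, similarly unpacked: principal component analysis rotates correlated, noisy measurements onto the directions of greatest variance, so a shared underlying factor ends up concentrated in the first component. Another minimal sketch on made-up synthetic data (the loadings 2, 1, -1 and the noise level are illustration values only):

```python
import numpy as np

# Hypothetical example: three noisy measurements all driven by one
# latent factor; PCA (computed via the SVD) recovers that shared axis.
rng = np.random.default_rng(1)
latent = rng.standard_normal((500, 1))              # one true factor
X = latent @ np.array([[2.0, 1.0, -1.0]])           # three correlated measurements
X = X + 0.3 * rng.standard_normal(X.shape)          # add measurement noise

Xc = X - X.mean(axis=0)                             # center each column
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)   # rows of Vt are the PCs
explained = S**2 / np.sum(S**2)                     # variance explained per PC
scores = Xc @ Vt[0]                                 # data projected onto PC1
print("variance explained by PC1:", round(float(explained[0]), 3))
```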
You all pay a part of our salary, either through your taxes or the generous support of your kid’s education, and therefore should know where your money goes.
...and that's why the papers resulting from all that taxpayer-funded research are freely available on the Internet.
No, wait a minute...
posted by ZenMasterThis at 5:04 AM on December 22, 2011 [5 favorites]
The ideas described here would offend our colleagues who consider themselves “post modern” or “deconstructivists”
They really wouldn't.
They don’t believe in replicable knowledge or anything roughly thought of as the scientific method.
They certainly do; they just don't tend to believe that these represent good models for most research in the humanities. In this respect they resemble researchers in the humanities who don't consider themselves "post modern" or "deconstructivists."
I mean, this guy is a psychologist! That's great science. But as far as replicability and the scientific method goes, it ain't physics. Which is fine by me. Is it fine by him?
posted by escabeche at 5:11 AM on December 22, 2011 [1 favorite]
A complex, long-winded treatment of a reasonably simple idea
The concept of scientific research is not a reasonably simple idea. Trained and respected scientists get it wrong all the time.
They certainly do; they just don't tend to believe that these represent good models for most research in the humanities.
If your results aren't replicable, that means they can only be observed in the particular instance when you studied them. Isn't that just reporting?
posted by LogicalDash at 5:24 AM on December 22, 2011
If your results aren't replicable, that means they can only be observed in the particular instance when you studied them. Isn't that just reporting?
Most people call what historians do "research." It's fine not to, if that's your bag. I'm just saying that this guy's answer to "What the heck is research anyway?" is not the standard one.
posted by escabeche at 5:41 AM on December 22, 2011
As a psychologist, perhaps he should study his footnote 4 more closely to see whether there are perhaps some hidden, irrational fears of his own there that do not come from knowledge but from a shallow misunderstanding, quite similar to that of his parents who don't know even the very basics of what it means to do research.
posted by Pyrogenesis at 5:46 AM on December 22, 2011
Perhaps something of a tangent, but... I do mostly philosophy-related stuff (although my MO is always: more proper science needed!), and when I was writing my master's thesis, the supervisors and all the rest were always pestering me about explicating the methodology I used. But what sort of a methodology do you have in philosophy? Because what I do is read a bunch of books and papers on a topic and then try to add to the discussion. That's pretty much it, but "I read some stuff and then opined about it" does not a method make.
So after endless demands about my "methodology" I made a special section for it in my thesis and just quoted the very first paragraph from Giorgio Agamben's The Signature of All Things: On Method:
"Anyone familiar with research in the human sciences knows that, contrary to common opinion, a reflection on method usually follows practical application, rather than preceding it. It is a matter, then, of ultimate or penultimate thoughts, to be discussed among friends and colleagues, which can legitimately be articulated only after extensive research."
I then submitted my thesis and called a bunch of friends, bought them some wine and beer, and we discussed "methodology" in the human sciences all night long.
posted by Pyrogenesis at 6:01 AM on December 22, 2011 [2 favorites]
Historical research usually is replicable; it's quite rare for one person to have access to all of the primary sources about a particular event, and rarer still to really use them all, so eventually someone else will do their own study of the same event using different sources. The facts that those studies agree upon are results that have been replicated. It's the same deal with paleontology.
posted by LogicalDash at 6:01 AM on December 22, 2011
(nb. just because it's replicable doesn't mean it'll ever be replicated. sometimes the data will just never be enough)
posted by LogicalDash at 6:03 AM on December 22, 2011
History is like astronomy and geology - built up from observations of past events. Apparently it's both impossible and unethical to go back in time and change things to run experiments.
The difference between a historian and a journalist is that we care about sample bias, previous research into that issue, and finding the best story (aka model) rather than the most interesting. Of course, there are also very few journalists writing about the process of enclosure in the 17th century or occupational structures in Britain in 1851. The long view is what I love most about history.
But back to the original link: I feel like it's an interesting post, but not really a great explanation of what research is for those who really don't know. I know what research is (I've worked in history and in psychology/epidemiology research support) and I wasn't following his points well. You have to keep things simple for relatives. If you work in cancer research, you are "trying to find a cure." If you work in psychology, your research is "finding out how our brains work," with a possible reference to cures if that is relevant. In history, it's "how and why things happened in the past, so we don't repeat the bad things." We don't have to tell them the truth: that just finding out stuff is interesting.
posted by jb at 7:21 AM on December 22, 2011 [2 favorites]
The answer is in the footnotes:
"The ideas described here would offend our colleagues who consider themselves “post modern” or “deconstructivists”. They don’t believe in replicable knowledge or anything roughly thought of as the scientific method. We mostly humor them as they eat their own departments and fields from the inside out and then we take their faculty lines for our own."
The audience is deans.
The message is: don't treat us like Those Unspecified People Over There. Treat us like the real, honest-to-heck Scientists that we are. I imagine that there are studies of this sort of mindset, but campus politics in a world of rapidly shrinking budgets provides a quicker explanation.
That sounds just about right, actually. Thanks.
Related story: Internet authors! If I have to click to read your footnote/endnote and then click the back button on my browser to get back to the main text, I'm not going to bother. Put it in a title tag so I can see it on hover, or put them in the right column like grantland does. But do something, so it doesn't take me two clicks per.
posted by Kwine at 7:58 AM on December 22, 2011
"The ideas described here would offend our colleagues who consider themselves “post modern” or “deconstructivists”. They don’t believe in replicable knowledge or anything roughly thought of as the scientific method. We mostly humor them as they eat their own departments and fields from the inside out and then we take their faculty lines for our own."
The audience is deans.
The message is: don't treat us like Those Unspecified People Over There. Treat us like the real, honest-to-heck Scientists that we are. I imagine that there are studies of this sort of mindset, but campus politics in a world of rapidly shrinking budgets provides a quicker explanation.
That sounds just about right, actually. Thanks.
Related story: Internet authors! If I have to click to read your footnote/endnote and then click the back button on my browser to get back to the main text, I'm not going to bother. Put it in a title tag so I can see it on hover, or put them in the right column like grantland does. But do something, so it doesn't take me two clicks per.
posted by Kwine at 7:58 AM on December 22, 2011
Being at a college that is more teaching-focused, I don't share the same feeling of teaching getting in the way of my love for research (although service definitely gets in the way of my teaching). Research is that thing, for me, that allows me to become a better teacher. It allows me to meet with my peers and get energized to come back and share those ideas with students. Research with students enables me to take a group of them to a conference each year to present their own work, and to help them in their grad school goals. Then there's research to see if that cool new thing that people are using in classes actually does what it's supposed to do. And teaching, well, that helps with research too. Every time I go through a semester I have ideas that come directly through discussions with my students.
posted by bizzyb at 8:07 AM on December 22, 2011
Ever since I started grad school I've thought that "research" was a horrible misnomer. "Re-search" is fine to describe going to the library and learning what people know about a topic. Going to the lab and *expanding* what people know about a topic should just be called "search".
posted by roystgnr at 9:56 AM on December 22, 2011
Is research still being 'underpinned'??? Because that's a sure marker of quality for me.
posted by sgt.serenity at 11:21 AM on December 22, 2011
roystgnr: I think "research" is accurate when we attempt to expand our knowledge, even with novel ventures. The physical laws are already there; we are simply discovering what they do in greater depth and detail.
posted by karmiolz at 12:08 PM on December 22, 2011
"The ideas described here would offend our colleagues who consider themselves “post modern” or “deconstructivists”. They don’t believe in replicable knowledge or anything roughly thought of as the scientific method. We mostly humor them as they eat their own departments and fields from the inside out and then we take their faculty lines for our own."
Sigh, not offended, just a bit over these kinds of lazy, somewhat inaccurate, schoolboy-level taunts.
Sometimes I think that the reason that (at least some) people who subscribe to the positivist world view have such an antipathy towards other forms of research practice and theoretical groundings, dismissed so glibly in this passage through the use of ever-so-scary scare quotes, has less to do with the merit or otherwise of such approaches, and more to do with what is at stake as a consequence of those approaches. (Though there is way too much cruft that is pretty easy to dismissively make fun of in these approaches, to be sure.)
For instance, post-modernism refers to quite a broad range of ideas and concepts, but one of the more significant ones is the critique of the modernist epistemological conception of universal truth. That's not to say that it isn't possible to observe or measure objective reality, just that we need to be very careful and reflexively aware about how our world views and research tools implicitly impact our perception of the world, both in everyday life and in the lofty world of research. Then of course we have the impacts of the socio-cultural world in which science takes place and from which it can't escape: for instance, the political and economic realities which help shape what kinds of research are prioritised through funding, or the ongoing impact of gender imbalances in the STEM fields.
Obviously this isn't meant as a rejection of the scientific method or of scientists, as good scientists are aware of such things and mindful of their impact on the development of knowledge; see, for example, the first two failure modes erniepan mentions above. But there do seem to be lazy positivists out there who would benefit from some self-reflection and a refresher in the philosophy of science; they might then realise that the humanities are not an antagonist towards or a refutation of the scientific method, but something that can complement, constructively challenge, and contribute to the broader undertaking of knowledge.
posted by Hello, I'm David McGahan at 1:04 PM on December 22, 2011 [2 favorites]
Thanks, David McGahan, for actually bothering to write out what I couldn't bring myself to, due to all my eye-rolling and exasperation. Richard Rorty put it best (substitute "relativism" for "postmodernism"):
'"Relativism" is the view that every belief on a certain topic, or perhaps about any topic, is as good as every other. No one holds this view. Except for the occasional cooperative freshman, one cannot find anybody who says that two incompatible opinions on an important topic are equally good. The philosophers who get called 'relativists' are those who say that the grounds for choosing between such opinions are less algorithmic than had been thought.'
posted by Pyrogenesis at 1:28 PM on December 22, 2011
'"Relativism" is the view that every belief on a certain topic, or perhaps about any topic, is as good as every other. No one holds this view. Except for the occasional cooperative freshman, one cannot find anybody who says that two incompatible opinions on an important topic are equally good. The philosophers who get called 'relativists' are those who say that the grounds for choosing between such opinions are less algorithmic than had been thought.'
posted by Pyrogenesis at 1:28 PM on December 22, 2011
jb: We don't have to tell them the truth
We don't?
posted by stebulus at 4:17 PM on December 22, 2011
The gov't won't pay for people to have fun, so researchers lie and say that they are bettering the world. Sure, they might also be bettering the world, but the main reason is that they are curious.
posted by jb at 6:04 PM on December 22, 2011
And if anybody asks me why I do math, that's what I tell them — it's fun, I'm curious, etc. If they ask what applications my work has, I start my answer with "I don't know".
Being honest about my work is as important to me as being honest in my work.
posted by stebulus at 8:59 PM on December 22, 2011
History is like astronomy and geology - built up from observations of past events. Apparently it's both impossible and unethical to go back in time and change things to run experiments.
I'm late in pointing it out, but the linked article doesn't say that research has to contain experiments. Statistical studies count. They might even count as "experiments" by some definitions.
posted by LogicalDash at 6:50 AM on December 28, 2011
That wasn't in response to the article, but in response to the general question of whether history is evidence-based without experiments. Statistical historical studies are evidence-based - as is all well-researched history - but definitely not experimental: they do not fulfil the requirements of experimental science. I'm a quantitative historian, but I recognize the limits of my research. Indeed, statistical history even has a serious problem with sampling: no other social science has to work with samples which have already been pre-selected by mice (or rather, left behind by them), let alone fires, floods, collapsing archives and widows who sell off court papers to the local baker for fuel (parchment is apparently a superior fuel for baking). There is a saying in pre-modern quantitative history: "There are lies, damned lies, damned damned lies, damned damned damned lies - and early modern statistics."
posted by jb at 11:37 AM on December 28, 2011
but in response to the general question of whether history is evidence based without experiments.
Who asked that?
posted by LogicalDash at 1:03 PM on December 28, 2011
I can't remember now, this thread was ages ago, and I don't feel like re-reading it. Point is, I wasn't responding to the article.
That said, the link/ post wasn't that clear on explaining research to non-researchers.
posted by jb at 10:46 PM on December 28, 2011