Carcinisation

Truthseeking about controversial things (status: indecisive rambling)

Suppose you ask me a question that is fraught (politically, or for some other reason), such as “Does the minimum wage do more good than harm?” or “Could a computer be conscious?” or “Is Lockheed Martin stock going to go up?”

(Not that you should ask me any of those things.)

What makes any particular answer I might give “objective” or “honest”? Or alternatively, what makes the process by which I arrive at my answer “objective” or “honest”?

Here are some prima facie plausible answers:

  1. The answer is objective & honest if it is, in fact, justified & true (faithfulness to facts).
  2. The answer is objective & honest if it fairly presents all reliable points of view, weighted by their reliability and by consensus of expert opinion (NPOV).
  3. The answer is objective & honest if it was arrived at by my best effort as an epistemic agent, either eliminating or acknowledging personal biases, conflicts of interest, etc., and steelmanning alternative positions (process objectivity).
    1. People who have read Hanson or Taleb may want to turn the “no conflicts of interest” part on its head and assert a need to make a bet or to have “skin in the game”; i.e., a real-world incentive aligned with truth-seeking.
    2. Some have suggested discussing updates on evidence (how much the evidence should shift one’s beliefs), rather than discussing posteriors directly; a sketch of the distinction follows the list.
  4. All possible answers are necessarily factional. “Objectivity” is not a coherent goal, but the most honest answer simply presents my factional view and reasons for it, while neither attempting to conceal which faction I belong to or the existence of other factional views, nor making special efforts to do them justice (factionalism a la Moldbug).
  5. The answer itself is the wrong level of analysis; you would be better off scrutinizing whether the answerer is an epistemically virtuous and responsible person (virtue epistemology).
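
To make 3.2 a little more concrete: in the odds form of Bayes’ theorem, the posterior bundles together the prior and the evidence, while the likelihood ratio isolates what the evidence itself contributes. (This is just my own illustration of the suggestion, not something the people making it necessarily spelled out this way.)

\[
\underbrace{\frac{P(H \mid E)}{P(\lnot H \mid E)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{P(H)}{P(\lnot H)}}_{\text{prior odds}}
\;\times\;
\underbrace{\frac{P(E \mid H)}{P(E \mid \lnot H)}}_{\text{likelihood ratio}}
\]

If I report only the left-hand side, you can’t tell how much of it is my prior and how much is the evidence I’ve seen; if I report the likelihood ratio instead, you can apply it to your own prior and arrive at your own posterior.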

I’m not really thrilled with any of these, but I don’t have a great alternative to offer up. (1) is charmingly simple, but too outcome-oriented; I don’t want to condemn a wrong answer that’s due to bad epistemic luck. (2) is exploded by the need to cash out “reliable” and “expert” and “consensus” in ways that aren’t blatantly factional. (3) is the position I am most attracted to (is that too obvious?). My problem with it is not its unattainability – after all, this definition is only meant to be an ideal to aim at. Rather, I fear that each additional attempt to eliminate bias represents another free variable for Rationalization to play with, in service of the Bottom Line. (4) seems unattractively defeatist or self-serving, and “denies the phenomenon” – I can remember occasions on which I believe people were genuinely objective and honest in their presentation of evidence/beliefs to me, although it’s hard to put my finger on what convinced me of this. (5) comes in second place, but I’m skeptical that people are reliably epistemically responsible from day to day or across domains.

What do you think? I suppose defining objectivity was just a jumping-off point (perhaps an excessively abstract one); I’m more interested in the conditions under which you are willing to trust somebody’s statements on a controversial question.