First off, awesome thread!! 
I remember a talk, I believe it was TED, in which rationality was explained from different viewpoints
using the example of teens actually being more rational in decision-making than adults, considering their
standpoint and how they reach the conclusions to their actions.
I also believe that distinguishing between rational belief and rational behavior is essential,
and I agree with your dissection of the two.
But it's also important to note, without moral judgement of its value, that a rational choice does not
have to be the right one, whether because of winning against the odds, subjective gain, unforeseeable
events that change the situation and outcome, or our own inability to grasp all factors.
b), which concerns behavior, can be unpacked in a similar way. If I smoke cigarettes, but I value my health and believe that cigarettes harm my health, then it is irrational for me to smoke cigarettes. To become rational I can stop smoking, discard my belief that cigarette smoke is harmful, or stop valuing my health.
Just to address the point with the cigarettes, you disregard addiction.
Rationality becomes very different in the light of physical conditions.
(Addiction specifically might have been in that same TED talk, or I saw it somewhere else.)
Now to the questions:
Do humans generally think and behave rationally or irrationally? If it's somewhere in between, when are we rational and when are we irrational?
It's probably true that most humans think they think and behave, and therefore decide, rationally.
I already made a thread about this (and I've started reading the papers - now I have more time),
which argues that this is mostly not true. These points, though, are based rather on irrational
belief or irrational evaluation of one's own capabilities - meaning the irrationality is unintended.
So, what might be closer to the question: do people act irrationally in spite of knowing better?
I would argue that this is frequently the case as well. Human emotion, needs, or hope cloud our
judgement often enough, even when we know better. If everyone were to reflect on
their choices, I'm sure many would reconsider.
We are rational when we act in accordance with our own belief system (which in itself needs to be
at least somewhat rational, or at least not repeatedly disproven) in a logical way. But what I think is
even more important is the rational behavior that grows out of the individual's set of needs and
perspective. And keep in mind that it might often be irrational, but nonetheless right, to do something.
I'd like to add that I think it's always rational to consider that one's beliefs, however rational they
might seem, might also be irrational or just plain untrue.
What is the role of emotions in rationality? Does emotion help or hinder us in achieving rationality? Or is the question of emotions irrelevant?
Emotion is not irrelevant, if only because of its important influence on everything.
Emotion neither helps nor hinders us in achieving rationality, the same way rationality neither
helps nor hinders us in achieving or enjoying emotion. They very well might, though; combining the two
might even seem like combining relativity with quantum mechanics, mainly because the subjective
value is impossible to grasp. Although in this context, emotion might as well be called subjectivity.
In my opinion, they actually go quite well together.
Is it rational or irrational to value events that happen in the present more than those same events if they were to occur in the future? In other words, to what extent should we rationally be concerned about the welfare of our future selves relative to our present selves?
That's a good question.
My personal belief, however rational it may be, is that it is the moment that counts, even though I
must admit that I am not always rationally living up to this philosophy. If it comes down to it, both
are needed. Planning for the future is different from worrying about the future, and the further into the
future the planning goes, the more irrational it gets, since unforeseeable events are rationally to be
expected.
Is logical consistency a necessary condition for rationality? Is it a sufficient condition?
Hm, in my understanding, logical consistency is a necessity for rationality.
Is it a sufficient condition? I don't know, sometimes maybe yes, sometimes maybe no.
Is it necessary for our beliefs about the world to correspond with objective reality? (Assuming that it is possible for us to know objective reality... let's save that discussion for another day.)
Well, I would argue that it isn't, since, even if there were an objective reality, at this point
we have no idea what it may be. I don't intend to go into the possibility of knowing
objective reality, or whether it exists, but I do think it's safe to say that what most of
us accept as objective reality is only part of the story.
But it's all we have, so I think it is at least necessary to consider it thoroughly.
(As addressed a little further down, I do think we are bound to our zeitgeist.)
Referring to the earlier distinction between beliefs, values, and attitudes: can values and/or attitudes be irrational?
They can be rational to some, while irrational to others.
To be truly rational is a very difficult, or more so an impossible, project.
I suppose that people who hold unchangeable beliefs already know their own irrationality,
since it is often argued that science is just as flawed (which might be true) as a way to deflect.
If a certain act or belief ultimately benefits a person, is that alone sufficient to call that act or belief rational? Or is it necessary for the chosen act or belief to be the best possible for the actor in terms of expected value? In other words, if there are two choices A and B, and A has the higher expected value according to the judge's personal knowledge, but in fact B will ultimately benefit the judge more if chosen, which is the more rational choice?
This is one of those cases in which you have to ask: rational for whom?
In my moral construct it is not rational, but who is to say that morality has any say here.
One's own rationality vs. the group's? Again, rational for whom? (Which is my point.)
Anything we experience, as rational as we make it out to be, is still in a way decoded
through our senses - we can't perceive otherwise. Perception is therefore a crucial point. While
someone may act rationally in that instance, it might have been irrational from an outside perspective.
That is why I think we sometimes (often, always?) have to take a step back and question our rationality.
Are the tenets of rationality universal or are they culturally and historically relative?
This also depends. It's rational to assume that they can only be as advanced as the rational
(meaning scientific) progress of the time. A simple thought experiment: it would be irrational
to think we know everything, so our concept of rationality is tied to what we have figured out
so far. We know that we have to keep searching, but from time to time we also need to make
decisions. Therefore it is undoubtedly related to culture and the historical status quo.
Does it matter whether people are strictly rational or not? How hard should we work to make our beliefs and decisions conform with rationality?
Belief only matters if people act on what they believe. As long as the morals of a belief system are
in accordance with those of the "universal, rational consensus" *cough* it doesn't make a difference.
And in my opinion, diversity is what makes everything wonderful and interesting. Everyone should
work hard to conform their beliefs to their personal rationality, in addition to living in the now and
being emotionally, subjectively fulfilled, but that's as far as it goes. I believe we should be true to
ourselves, but as soon as we cross the line into applying (or forcing) rationality (or anything, really)
on others, it loses its objective features.