And it uses these receptors on the heart that tell the heart to, one, beat or contract harder, with more force, which increases your stroke volume. And two, to beat faster, which increases your heart rate. But if you use these receptors too much, they start to down-regulate, or decrease in amount. So there are fewer of them, and fewer receptors means your heart won't respond as it did before.
So think of it like this. What if you're moving? And you call a bunch of your friends over to try to help you move your stuff. And you know, they're glad to come over and help you. And so you move your stuff and then you decide next week, oh, I want to move again. So you call them up again and say "hey, I need some more help". And they're like okay, I guess it's kinda weird, but I'm gonna come help you. And then they come. But then the next week you're like, oh, I'm gonna move again. And then you keep doing this week after week and those friends are gonna start to well one, question why you're moving so much.
And two, they're gonna stop maybe answering your phone calls. It's sorta the same thing with your sympathetic nervous system. When you activate it too much it's gonna start helping less and less. So the next big way we can compensate is by increasing this thing called preload. Now preload is defined as this pressure in the ventricles. So in this lower chamber, after it's filled but before it contracts.
When it's filled, the walls and the heart muscle cells are all stretched out because it's like filling up a water balloon. So you put the water in it and then the water causes that balloon to expand, right? So the more you fill in, the higher this pressure or preload. So as you keep filling, the walls stretch more and more and we get higher pressures or preload.
To do this, to fill more, to get more blood in those ventricles, your body releases specific hormones, like antidiuretic hormone (or ADH) or aldosterone, to increase this filling volume. So if the ventricles used to hold a certain volume of blood at the end of diastole, now maybe they hold a little more. This extra bit, about one tablespoon, might not seem like a lot, but it's enough to stretch the heart chamber and muscle just a little more. So that's great, but why does increasing the preload, or filling pressure, increase the force of contraction, and so your stroke volume?
Well, think of it like stretching a rubber band. The more you stretch the rubber band the more force it'll snap back with, right? Okay, so there's a lot going on here, right? So let's go one step at a time. So as you fill the ventricles with more blood it stretches those muscles out, right? And just like the rubber band did, when you stretch it out it contracts, or it snaps back with more force.
And when it contracts with more force, you get more stroke volume, you get more blood ejected. Two physiologists, Frank and Starling, saw this and summed it up in a law that says: as you increase your pressure, or your preload, you also increase your stroke volume. And that's the Frank-Starling law. Now, think back to that water balloon. What happens when you fill it? Well, it gets bigger, it expands. But this is passive, the water is forcing it to get bigger. Now think about filling up one of those glass flasks from Chemistry class.
I mean, I'm gonna be really extreme here just to make a point, but what's gonna happen when you fill it up is, it's not gonna get bigger, it's not gonna change shape, it's just gonna fill up all the way and then start to overflow and spill all over. That's because it's a lot less compliant. It's probably one of the least compliant things we can think of. It's the same with a heart that's stiffened with a bunch of fibrous connective tissue. It can't relax, it can't passively expand, and it can't fill completely.
So that's what's going on with diastolic heart failure, but how does it get like that and how do we get these enlarged and stiffened muscles? Well, just like systolic failure, it's a secondary disease which means that this growth and stiffening is caused by some kind of underlying disease that's been there before. The big one that we tend to understand the most is chronic hypertension or high blood pressure.
So when the pressure in your blood vessels goes up, they become harder to pump against, harder to pump into. This is kinda like blowing into a straw versus a big tube. Which one do you think it's gonna be harder to blow air through? It's probably the smaller one, right? Well, it's sort of like that for the heart, except the heart has to pump blood through these narrowed vessels, and this is way more difficult to do. So what does your heart do? Well, it bulks up. It gains muscle and it gets bigger, so it can pump against these high pressures.
Now, diet and diabetes can both contribute to higher blood pressure and hypertension. Those are definitely big risk factors for hypertension, and therefore, diastolic heart failure. The second underlying disease is aortic stenosis, and stenosis, from the systolic heart failure video, we know means a narrowed valve. Specifically, we're gonna talk about this valve right here, the aortic valve. Blood goes out from the left ventricle through that valve into an artery called the aorta. It's similar to hypertension: it's a lot harder to pump blood through this narrowed opening, as opposed to a valve that's opening all the way.
Well, the heart muscle again bulks up and gains muscle so it can try to pump harder through this smaller valve. Now, this is a little tricky though, right? Because we remember that this can also lead to systolic failure. Well, unfortunately, a lot of the mechanisms behind why in one case it might lead to this growth of muscle, like in diastolic heart failure, and in another case it might lead to this serious weakening of the muscle, like in systolic heart failure, are pretty complex, and honestly, a lot of these mechanisms are unknown and are still big areas of research.
Next up, we have cardiomyopathies, which means heart muscle diseases. Sometimes this term can be a little general, but for diastolic heart failure in particular, there are two that we're gonna focus on. The first one is hypertrophic cardiomyopathy, which we can kinda figure out from the name.
Hypertrophic, or hypertrophy, means muscle growth.

Analogies are commonly applied to problems where we do not possess determination rules, such as Example 8 (morphine and meperidine). In many, perhaps most, cases we are not even aware of all relevant factors, let alone in possession of a determination rule. Medical researchers conduct drug tests on animals without knowing all attributes that might be relevant to the effects of the drug.
Indeed, one of the main objectives of such testing is to guard against reactions unanticipated by theory. For cases such as animal testing, neither option seems realistic. Weitzenfeld proposes a variant of this approach, advancing the slightly more general thesis that analogical arguments are deductive arguments with a missing enthymematic premise.
That premise typically amounts to a determination rule. But Weitzenfeld does not insist that the missing premise should be background knowledge. Rather, he suggests that the missing premise is discovered and justified through independent processes. Enumeration amounts to examining source and target domains systematically. Surveillance is a matter of perceiving that the same determining structures are present in both domains.
But on the matter of acquiring this holistic grasp of similarity, Weitzenfeld concedes the difficulty. In the case of drug testing on animals, for example, we appeal to common evolutionary history and similar functional organization. If that is right, then we have come full circle. We have replaced our argument by analogy, which required no commitment to any generalization, with a valid deductive argument that has an extra premise that has to be supported with plausibility arguments.
That problem re-appears as the need to establish the plausibility of the determination rule, and that is at least as difficult as justifying the original analogical argument. Some philosophers have attempted to portray, and justify, analogical reasoning in terms of some simpler inductive argument pattern. The problem of providing a general justification for analogies then reduces to the problem of justifying that simpler inference pattern.
There have been three moderately popular versions of this strategy. The first treats analogical reasoning as generalization from a single case; the second treats it as a kind of sampling argument. A third type of inductive justification recognizes the argument from analogy as a distinctive form, but treats its past successes as evidence for future success.
Can such a simple analysis of analogical arguments succeed?
A single instance can, perhaps, lead to justified generalization. Even if we accept that there are such cases, the objection to understanding all analogical arguments as single-case induction should be obvious: Most analogical arguments will not meet the requisite conditions. We may not know that we are dealing with a natural kind or Aristotelian nature when we make the analogical argument; we may not know which properties are essential properties. Interpreting the argument from analogy as single-case induction is also counter-productive in another way.
Such a brutally simple analysis does nothing to advance the search for criteria that help us to distinguish between relevant and irrelevant similarities, and hence between good and bad analogical arguments. On the sampling conception of analogical arguments, acknowledged similarities between two domains are treated as statistically relevant evidence for further similarities. Mill saw no difficulty in using analogical reasoning to infer characteristics of newly discovered species of plants or animals, given our extensive knowledge of botany and zoology; his only restriction has to do with sample size.
But if the extent of unascertained properties of A and B is large, similarity in a small sample would not be a reliable guide; hence, Mill's dismissal of Reid's argument about life on other planets (Example 2). The sampling argument is presented in more explicit mathematical form by Harrod. The key idea is that the known properties of S (the source domain) may be considered a random sample of all of S's properties, random, that is, with respect to the attribute of also belonging to T (the target domain).
If the majority of known properties that belong to S also belong to T, then we should expect most other properties of S to belong to T, for it is unlikely that we would have come to know just the common properties. More precisely, this expectation rests on Harrod's fair sampling postulate. The sort of problem to which the relevant distribution standardly applies is drawing balls from an urn.
The binomial distribution gives the chance of drawing r black balls in n selections with replacement from an urn in which the proportion of black balls is p. There are grave difficulties with Harrod's and Mill's analyses. One obvious difficulty is the counting problem, noted earlier: how are we to count similarities and differences? The ratio of shared to total known properties varies dramatically according to how we do this. Should properties logically implied by other properties be counted?
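As an aside, the binomial chance just described for the urn model can be written out explicitly. Here is a minimal sketch in Python, with r, n, and p as defined above; the function name is my own, not part of the original text.

```python
from math import comb

def binomial_chance(r, n, p):
    """Chance of drawing exactly r black balls in n selections,
    with replacement, from an urn whose proportion of black balls is p."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

# For example, 3 black balls in 5 draws from a half-black urn:
print(binomial_chance(3, 5, 0.5))  # 0.3125
```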
Alternatively, should some properties be weighted more heavily than others? If we can add similarities or differences at will, then these arguments yield inconsistent results. Sampling arguments have no value without guidelines for counting similarities, or better, a theory of relevance. A second serious difficulty is the problem of bias: the paradigm of repeated selection from an urn seems totally inappropriate. In the case of the urn, the selection process is arranged so that the result of each choice is not influenced by the agent's intentions or purposes, or by prior choices.
There is good reason to believe that such samples are randomly selected. By contrast, the presentation of an analogical argument is always partisan. Bias enters into the initial representation of similarities and differences, and we have excellent reason to reject a distribution based on random sampling. Additional versions of the sampling approach have been developed as well. Liston suggests that, despite the mystery, physicists are entitled to use Pythagorean analogies on the basis of induction from past successes. Setting aside familiar worries about arguments from success, the real problem here is to determine what counts as a similar strategy.
In essence, that amounts to isolating the features of successful Pythagorean analogies. This strategy would face an additional problem even if we could find such a characterization.

An a priori approach traces the validity of a pattern of analogical reasoning, or of a particular analogical argument, to some broad and fundamental principle. The fundamental principle, in turn, is held up as self-evident, philosophically indispensable, or at the very least highly plausible.
Three such arguments will be considered here. The first is due to Keynes. Keynes's approach appeals to his famous Principle of the Limitation of Independent Variety. In the Keynesian universe, there is a set of generator properties with two defining characteristics. In practical terms, the measure of the amount of independent variety in a system is either the number of groups or the number of generators. Keynes makes one further assumption, and he also offers a simple characterization of the problem of justification.
An analogical argument is justified if knowledge of the positive analogy increases the logical probability of the conclusion. In terms of schema (4), P and Q are arbitrary finite conjunctions of properties. Unfortunately, Keynes's account falls short at precisely this point. His assumptions are too weak to provide justification for anything other than perfect analogies, the cases where there is no negative analogy at all.
This problem was noticed by Hesse. Those familiar with Carnap's theory of logical probability will recognize the difficulty: in assuming that the generator properties are independent, Keynes has settled on a Carnapian measure that permits no learning from experience. Hesse offers an alternative a priori approach, a refinement of Keynes's strategy.
Hesse develops a justification for analogical reasoning along Carnapian lines by adopting what she calls the Clustering Postulate: for any attributes P and Q, and for any finite set of objects a1, …, an having the attribute P, the prior probability of finding a proportion of these objects with attribute Q is skewed in favour of the proportions 1 and 0. The objections to such global postulates of uniformity are well-known (see Salmon), but even if we waive them, her argument fails.
The objection, which also applies to Keynes, is the familiar problem of failure to discriminate.

The third a priori defense is due to Bartha. In simplified form, his criteria require the existence of non-trivial positive analogy and no known critical disanalogy; the details depend upon a set of models reflecting the nature of the causal and logical relationships being transferred from the source to the target domain.
The scope of Bartha's argument is also limited to analogical arguments directed at establishing prima facie plausibility, rather than degree of probability. Bartha's argument rests on a principle of symmetry reasoning articulated by van Fraassen. There are two modalities here. Bartha argues that satisfaction of the criteria of the articulation model is sufficient to establish the modality in the antecedent. He further suggests that prima facie plausibility provides a reasonable reading of the modality in the consequent.
The argument is vulnerable to two major sorts of concerns. First, there are questions about the standing and interpretation of the symmetry principle on which it rests. Second, there remains a residual worry that this justification, like all the others, proves too much. The articulation model may be too vague or too permissive. Arguably, the most promising available defense of analogical reasoning may be found in its application to case law see Precedent and Analogy in Legal Reasoning.
Judicial decisions are based on the verdicts and reasoning that have governed relevantly similar cases, according to the doctrine of stare decisis (Levi; Llewellyn; Cross and Harris; Sunstein). In practice, of course, the situation is extremely complex. No two cases are identical. The ratio decidendi must be understood in the context of the facts of the original case, and there is considerable room for debate about its generality and its applicability to future cases.
If a consensus emerges that a past case was wrongly decided, later judgments will distinguish it from new cases, effectively restricting the scope of the ratio to the original case. The practice of following precedent can be justified by two main practical considerations. First, and above all, the practice is conservative: people need to be able to predict the actions of the courts and formulate plans accordingly. Stare decisis serves as a check against arbitrary judicial decisions.
Second, the practice is still reasonably progressive: careful judges distinguish bad decisions; new values and a new consensus can emerge in a series of decisions over time. In theory, then, stare decisis strikes a healthy balance between conservative and progressive social values. This justification is pragmatic. It pre-supposes a common set of social values, and links the use of analogical reasoning to optimal promotion of those values. Notice also that justification occurs at the level of the practice in general; individual analogical arguments sometimes go astray.
Analogy and Analogical Reasoning
A full examination of the nature and foundations of stare decisis is beyond the scope of this entry, but it is worth asking: is a parallel pragmatic justification available for analogical arguments in general? Bartha offers a preliminary attempt to provide such a justification by shifting from social values to epistemic values. The general idea is that reasoning by analogy is especially well suited to the attainment of a common set of epistemic goals or values. In simple terms, analogical reasoning, when it conforms to certain criteria, achieves an excellent (perhaps optimal) balance between the competing demands of stability and innovation.
It supports both conservative epistemic values, such as simplicity and coherence with existing belief, and progressive epistemic values, such as fruitfulness and theoretical unification (McMullin provides a classic list).
How do we combine analogical reasoning with other inferential processes? As a more manageable question, which is also an important special case: how do analogical arguments relate to confirmation? Confirmation, in a broad sense, is the process by which a scientific hypothesis receives inductive support on the basis of evidence (see the entries on evidence and Bayes' Theorem).
Confirmation may also signify the logical relationship of inductive support that obtains between a hypothesis H and a proposition E that expresses the relevant evidence. Can analogical arguments play a role, either in the process or in the logical relationship? Arguably yes to both, but this role has to be delineated carefully, and several obstacles remain in the way of a clear account.
Earlier sections of this entry advanced the claim that a good analogical argument can make a hypothesis plausible, either in the sense of prima facie plausibility or in justifying the assignment of an appreciable subjective probability or credence. On either version, a hypothesis derives inductive support from a credible analogy. Furthermore, the idea that analogical arguments can provide such support seems to be independent of one's theory of confirmation. But there are good reasons to reject the claim that analogies provide actual confirmation. In the first place, there is a logical difficulty. To appreciate this, let us concentrate on confirmation as a relationship between propositions.
Some propositions seem to help make it rational to believe other propositions. When our current confidence in E helps make rational our current confidence in H, we say that E confirms H. A Bayesian agent starts with an assignment of subjective probabilities to a class of propositions. Confirmation is understood as a three-place relation: E represents a proposition about accepted evidence, H stands for a hypothesis, K for background knowledge, and Pr for the agent's subjective probability function. To confirm H is to raise its conditional probability, relative to K.
The relation between these two probabilities is typically given by Bayes' Theorem (setting aside more complex forms of conditionalization). For Bayesians, it seems that an analogical argument cannot provide confirmation. In the first place, it is not clear that we can encapsulate the information contained in an analogical argument in a single proposition, E. Second, even if we can formulate a proposition E that expresses all of that information, it is not appropriate to treat it as evidence.
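The displayed formulas appear to have been lost from this excerpt; the following is the standard Bayesian statement, reconstructed here in the passage's own notation (Pr, E, H, K):

```latex
% E confirms H, relative to background K, iff conditioning on E raises H's probability:
\Pr(H \mid E \wedge K) > \Pr(H \mid K)

% Bayes' Theorem relates this posterior probability to the prior:
\Pr(H \mid E \wedge K) = \frac{\Pr(E \mid H \wedge K)\,\Pr(H \mid K)}{\Pr(E \mid K)}
```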
The information contained in E is already part of the background, K. According to the definition, then, we don't have confirmation; again, the definition of confirmation seems inapplicable. Although these observations are based on a Bayesian approach, similar problems exist for other accounts of confirmation. On an error-statistical approach, for instance, it is readily apparent that analogical arguments should not be regarded as evidence. If analogies don't provide inductive support via conditionalization, is there an alternative? Here we face a second difficulty, once again most easily stated within a Bayesian framework.
Van Fraassen has a well-known objection to any belief-updating rule other than conditionalization. This objection applies to any rule that allows us to boost credences when there is no new evidence.
Adopting any such rule would lead us to acknowledge as fair a system of bets that foreseeably leads to certain loss. This type of vulnerability provides good reason to reject such rules. If we think that good analogical arguments provide inductive support, it looks like we are vulnerable to van Fraassen's objection. There is a way to avoid these difficulties and to find a role for analogical arguments within Bayesian epistemology. If analogical reasoning influences prior probability assignments, it can provide inductive support while remaining formally distinct from confirmation, avoiding the first difficulty.
If it can provide this support in non-rule-based fashion, then we avoid the second difficulty, van Fraassen's objection. The cost of taking this route, however, is high. Analogical arguments are employed both for newly considered hypotheses and to shift existing opinion without marshalling new evidence. An orthodox Bayesian, such as de Finetti (de Finetti and Savage; de Finetti), might have no problem in allowing that analogies, in company with considerations of symmetry and other psychological predilections, play a causal role in influencing our prior probability assignments.
But how can analogies lead to rational belief change? As Hawthorne notes, some Bayesians simply accept that both initial assignments and ongoing revision of prior probabilities based on plausibility arguments can be rational, without offering any systematic account. The cost of admitting analogical reasoning into the Bayesian tent in this manner is to acknowledge a dark corner of the tent in which rationality operates without any clear rules. Of course, van Fraassen himself recognizes that belief change other than conditionalization can be rational, so long as three conditions are met.
The first is synchronic rationality. Second, as already noted, such changes cannot be based on any rule. The third constraint is Reflection. In fact, this final and controversial constraint offers a way to mitigate the darkness: a modicum of open-mindedness is required as a way of anticipating my possible future opinion, just as required by Reflection. Support from a good analogical argument provides sufficient justification for this type of open-mindedness. This position may be the common core of the ideas about analogy held by Herschel, Whewell and Campbell.
Significant challenges remain before this picture of how analogies fit into the framework of confirmation can be made perfectly clear. More generally, serious questions remain about how to make connections between prima facie plausibility and probabilistic credences. There now exists a wide variety of formal devices for representing uncertainty (Halpern). It may be that some non-probabilistic representation proves more suitable than probability for modeling analogical reasoning.
If so, the challenge will be to find a way that allows us to integrate prima facie plausibility reasoning with confirmation, for which a probabilistic representation appears indispensable.

According to Joseph Priestley, a pioneer in chemistry and electricity, analogy is our best guide in all philosophical investigations, and all discoveries which were not made by mere accident have been made by the help of it. For instance (Example 3), Darwin takes himself to be using an analogy between artificial and natural selection to argue for the plausibility of the latter. The method of ethnographic analogy is used to interpret the nonobservable behaviour of the ancient inhabitants of an archaeological site or ancient culture based on the similarity of their artifacts to those used by living peoples.
Specifically, this entry focuses on three central epistemological questions: What criteria should we use to evaluate analogical arguments? What philosophical justification can be provided for analogical inferences? How do analogical arguments fit into a broader inferential context? The entry draws on examples from geometry, the history of science, and legal reasoning. An analogical argument has the following form: S is similar to T in certain known respects.
S has some further feature Q. Let P stand for a list of accepted propositions P1, …, Pn about the source domain S. Then we refer to P as the positive analogy. Let A stand for propositions accepted as holding in S whose analogues fail in T, and B for propositions holding in T whose analogues fail in S; then we refer to A and B as the negative analogy. The neutral analogy consists of accepted propositions about S for which it is not known whether an analogue holds in T. The hypothetical analogy is simply the proposition Q in the neutral analogy that is the focus of our attention. A classic expression may be found in Mill's analysis of the argument from analogy in A System of Logic. Here are some of the most important commonly accepted criteria:
G1 The more similarities between two domains, the stronger the analogy.
G2 The more differences, the weaker the analogy.
G3 The greater the extent of our ignorance about the two domains, the weaker the analogy.
G4 The weaker the conclusion, the more plausible the analogy.
G5 Analogies involving causal relations are more plausible than those not involving causal relations.
G6 Structural analogies are stronger than those based on superficial similarities.
G7 The relevance of the similarities and differences to the conclusion, i.e., to the hypothetical analogy, must be taken into account.
G8 Multiple analogies supporting the same conclusion make the argument stronger.
The argument from example (paradeigma) is described in the Rhetoric and the Prior Analytics. This passage occurs in a work that offers advice for framing dialectical arguments when confronting a somewhat skeptical interlocutor (Topics b10–17). In Topics I 17, Aristotle states that any shared attribute contributes some degree of likeness (Topics a13). It is natural to ask when the degree of likeness between two things is sufficiently great to warrant inferring a further likeness.
The common principle is stated in the Meteorologica (a17). From this method of justification, we might conjecture that Aristotle believes that the important similarities are those that enter into such general causal principles. Summarizing, Aristotle's theory provides us with four important and influential criteria for the evaluation of analogical arguments: The strength of an analogy depends upon the number of similarities. Similarity reduces to identical properties and relations.
Good analogies derive from underlying common causes or general laws. A good analogical argument need not pre-suppose acquaintance with the underlying universal generalization. Hume makes the same point, though stated negatively, in his Dialogues Concerning Natural Religion.

Hesse offers a sharpened version of Aristotle's theory, specifically focused on analogical arguments in the sciences. She formulates three requirements that an analogical argument must satisfy in order to be acceptable. The first is the requirement of material analogy: the horizontal relations must include similarities between observable properties.
The second requirement is that the essential properties and causal relations of the source domain must not have been shown to be part of the negative analogy. (Her illustration here involves an equation that has the same mathematical form as Poiseuille's law for ideal fluids.) The third is the causal condition: Hesse requires that the hypothetical analogy, the feature transferred to the target domain, be causally related to the positive analogy. In Benjamin Franklin's Experiments, Franklin's hypothesis was based on a long list of properties common to the target (lightning) and source (electrical fluid in the laboratory).
Consider the sentence, "Gravitational attraction between the sun and a planet, and the fact that the mass of the sun is much greater than that of the planet, causes the planet to orbit the sun." Gentner represents this in structured, relational form. Gentner's Systematicity Principle states, roughly, that a predicate belonging to a mappable system of mutually interconnecting relationships is more likely to be imported into the target than an isolated predicate, and the structure-mapping program handles problems of just this type. The fundamental idea is that a good analogical argument must satisfy two conditions, the first being that there must be a clear connection, in the source domain, between the known similarities (the positive analogy) and the further similarity that is projected to hold in the target domain (the hypothetical analogy).
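To make the idea of a structured, relational representation concrete, here is a toy sketch in Python of the solar-system sentence, in the spirit of structure-mapping. The encoding, relation names, and depth measure are my own illustration, not Gentner's actual notation.

```python
# A toy relational representation of the solar-system sentence.
# Relations are nested tuples: (name, arg1, arg2, ...).

def rel(name, *args):
    """Build a relation as a nested tuple."""
    return (name, *args)

attracts = rel("ATTRACTS", "sun", "planet")
more_massive = rel("GREATER", rel("MASS", "sun"), rel("MASS", "planet"))
orbit = rel("REVOLVES-AROUND", "planet", "sun")

# Higher-order relation: the two facts jointly cause the orbiting.
cause = rel("CAUSE", rel("AND", attracts, more_massive), orbit)

def depth(expr):
    """Nesting depth; systematicity favors deeply interconnected systems."""
    if not isinstance(expr, tuple):
        return 0
    return 1 + max(depth(arg) for arg in expr[1:])

print(depth(cause))     # the causal system is deeply nested...
print(depth(attracts))  # ...while an isolated relation is shallow
```

On this toy measure, the causal structure scores higher than any isolated relation, which is the intuition behind preferring systems of interconnected relations when mapping source to target.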