The Firing Line Forums

Old July 18, 2013, 06:45 AM   #1
2damnold4this
Senior Member
 
Join Date: August 12, 2009
Location: Athens, Georgia
Posts: 2,526
Stand your ground/castle law studies

I am a supporter of stand your ground/castle laws, but there are a couple of studies that suggest stand your ground laws increase the homicide rate.

Here are the studies:

Texas A&M


Georgia State


What do you think?
2damnold4this is offline  
Old July 18, 2013, 10:24 AM   #2
Glenn E. Meyer
Senior Member
 
Join Date: November 17, 2000
Posts: 20,064
Thanks for the links. Please read them, folks, and discuss them without invective.

Glenn
__________________
NRA, TSRA, IDPA, NTI, Polite Soc. - Aux Armes, Citoyens
Glenn E. Meyer is offline  
Old July 18, 2013, 10:50 AM   #3
Jimboh247
Senior Member
 
Join Date: December 5, 2012
Location: Lexington, NC
Posts: 102
I did not realize that "castle" doctrine, in some states, extends outside one's own home.

I was always under the assumption that as long as I was in my house, or on my property, I did not have to retreat.

If given the choice of standing my ground in public or retreating to a safe area, I'd definitely retreat. If, and only if, the aggressor continued would I consider lethal force.

I understand "stand your ground" on your own property, not so much in public.
Jimboh247 is offline  
Old July 18, 2013, 11:19 AM   #4
Evan Thomas
Senior Member
 
Join Date: July 7, 2008
Location: Upper midwest
Posts: 5,631
Everyone who posts in this thread needs to read the sticky, "Duty to Retreat, 'Stand Your Ground', and Castle Doctrine," before doing so. That will clarify the meaning of SYG and castle doctrine, and save explanations here.
__________________
Never let anything mechanical know you're in a hurry.
Evan Thomas is offline  
Old July 18, 2013, 11:43 AM   #5
Jimro
Senior Member
 
Join Date: October 18, 2006
Posts: 7,097
The math in the Texas study is tortured.

The overall violent crime rate in Florida from 2005 to 2012, per 100k residents, was:

2005: 702.2
2006: 705.8
2007: 705.5
2008: 670.3
2009: 604.9
2010: 542.9
2011: 519.3
2012: 492.6

Clearly Castle laws do not have a negative impact on the total rate of violent crime.

As far as "murder" and "manslaughter" go, the numbers are all over the chart, and whether they increase or decrease depends on where you set the zero. Setting the zero at 2005, when Florida had the lowest number of homicides in a 20-year span, makes the following years show a growth in homicides.

However, if you set the zero at 1992 (1,191 homicides), there is a reduction in the total number of homicides by 2012 (1,009 homicides), despite a 5.5 million person increase in population. The authors failed to take historical variability into account as a confounding factor in their methodology, and the review board should have caught that.

So violent crime has been on the decline in Florida since 1992, while homicide numbers have fluctuated between a high of 1,191 and a low of 881: an average of 977 homicides per year, with a standard deviation of 115. However, those are total homicides for the state of Florida, not per 100k residents; the per-100k rate has gone down continually (save for a slight bump in one year) as the population grew faster than the crime rate.
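To make that arithmetic concrete, here is a minimal Python sketch of the per-100k normalization and the mean/standard deviation calculation. The yearly counts are only the few figures quoted above and the populations are rough approximations, so it won't reproduce the 977/115 figures exactly (those come from the full 1992-2012 series):
Code:
# Minimal sketch: per-100k normalization plus mean/std dev of yearly counts.
# Populations are rough approximations, not official census figures.
from statistics import mean, stdev

# (year, total_homicides, approximate_state_population)
data = [
    (1992, 1191, 13_800_000),
    (2005, 881, 17_900_000),
    (2012, 1009, 19_300_000),
]

counts = [h for _, h, _ in data]
print("mean homicides/year (sample years only):", round(mean(counts)))
print("std deviation (sample years only):", round(stdev(counts)))

for year, homicides, pop in data:
    rate = homicides / pop * 100_000   # homicides per 100k residents
    print(year, round(rate, 1), "per 100k")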

So, looking at the methodology makes me think the authors started with a conclusion, then twisted the math to give their argument a pseudo-scientific justification.

Jimro
__________________
Machine guns are awesome until you have to carry one.
Jimro is offline  
Old July 18, 2013, 11:46 AM   #6
Aguila Blanca
Staff
 
Join Date: September 25, 2008
Location: CONUS
Posts: 18,468
Jimboh and Jimro, please refer to Vanya's post #4.
Aguila Blanca is offline  
Old July 18, 2013, 11:58 AM   #7
overhead
Senior Member
 
Join Date: January 28, 2013
Location: Norfolk, VA
Posts: 182
I read the first study, and to me it appears they went to a lot of trouble just to end up with what amounts to an incredibly small sample size and the old "correlation does not equal causation" problem. They did a wonderful job of making it as complicated as possible, though. Of course, that is coming from a guy who barely squeaked by in statistics and could not stand econ classes.

The second study might be interesting, but I am not willing to pay $5 to read it, and the summary does not tell me much about their methods.

The premise of the first seems to be that the lowered "cost" of shooting someone in a self-defense situation (removing civil liability and making it easier to justify the use of deadly force in self-defense) would increase the number of people willing to shoot someone in a questionable self-defense situation instead of just "retreating" out of danger. I cannot imagine that, in the heat of the moment, people are weighing this sort of thing. If they are, I would suggest they are not likely under an imminent threat of great bodily harm or death. That being said, I am open to having my opinion changed, but the first study did not do it for me.
overhead is offline  
Old July 18, 2013, 12:44 PM   #8
kevinjmiller
Member
 
Join Date: April 15, 2013
Location: MA
Posts: 16
Just reading through the Texas A&M study (I didn't read the Georgia State one because it appeared to require a paid subscription), I noticed that the authors included statistics for burglary and robbery even though such crimes are not defined as part of SYG or even CD laws. Now, that might be because Texas is one of the few (only?) states that allows, by legislation and common law, the use of deadly force in defense of property, but one would think that competent, unbiased researchers would know this and factor it out accordingly. They did not: "To the extent that criminals respond to the higher actual or perceived risk that victims will use lethal force to protect themselves, we would expect these crimes [burglary, robbery, and aggravated assault] to decline after the adoption of castle doctrine." (4.2 Deterrence, p. 17)

To me this is, at best, a red herring, and more likely points to logical fallacies in the study. I did not delve into the myriad statistical details of this study (proof by intimidation?) to see how much the inclusion of burglary and robbery data corrupted other aspects of their conclusions, but I lean toward the same conclusions as Jimro and overhead: this study is flawed in composition, execution, and conclusion.
kevinjmiller is offline  
Old July 18, 2013, 01:21 PM   #9
2damnold4this
Senior Member
 
Join Date: August 12, 2009
Location: Athens, Georgia
Posts: 2,526
Jimro, there does seem to be a large variation in murder rates from year to year. Some of the SYG states showed drops in murder rates that beat the regional and national averages, but others lagged behind. The Texas A&M folks claim to have statistically significant results. I'm not good enough with math to tell.


Kevinjmiller, the Texas A&M folks checked to see if SYG laws might have a deterrent effect on burglary, robbery and aggravated assault. The reason they gave for checking for a deterrent was that some backers of SYG laws have claimed that these laws help prevent crime.

Overhead, it did seem to be a reach to claim that SYG laws were responsible for the increased murder rates claimed by the authors.


One thing the A&M folks mention is the possibility that the increase in murders consists of homicides that were justifiable but incorrectly classified in the FBI's UCR. They say they don't think it likely, but it is possible.


What I'd like to know is whether these "extra" homicides occurred in the home, at a place of business, or on the street.
2damnold4this is offline  
Old July 18, 2013, 01:48 PM   #10
Evan Thomas
Senior Member
 
Join Date: July 7, 2008
Location: Upper midwest
Posts: 5,631
I also don't intend to spend money to read the Georgia study.

The Texas study uses a definition of "castle doctrine" that's a bit different from what we're used to: they use "expanded castle doctrine" to refer to what we'd call "stand your ground" laws. (See table 1, p. 36, for a summary of laws in the states they studied.)

Be that as it may, a couple of posts seem to reflect a basic misunderstanding of the methodology of the Texas study. The authors are using results across states to compare expected and actual changes in crime rates, with adoption of castle doctrine laws as the independent variable. From the introduction:
. . .we primarily identify effects by comparing changes in castle doctrine states to other states in the same region of the country by including region-by-year fixed effects. Thus, the crucial identifying assumption is that in the absence of the castle doctrine laws, adopting states would have experienced changes in crime similar to non-adopting states in the same region of the country.
The graphs in Figure 1 make this comparison directly for experimental and control states. The data in Figure 2 are before-and-after (adoption of "castle doctrine" laws) comparisons within states, but they're comparing the differences from the control (non-castle-doctrine) states. They show consistent increases in those differences after the adoption of the new laws.
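For anyone unfamiliar with the approach, here is a minimal sketch of the difference-in-differences logic with made-up numbers; it is not the study's data, and it leaves out the region-by-year fixed effects and controls the authors add:
Code:
# Toy difference-in-differences calculation with made-up homicide rates
# (per 100k). This only illustrates the basic logic, not the study's
# actual regression model.

treated = {"pre": 5.0, "post": 5.6}   # states that adopted the law
control = {"pre": 5.2, "post": 5.3}   # same-region states that did not

change_treated = treated["post"] - treated["pre"]   # +0.6
change_control = control["post"] - control["pre"]   # +0.1

did_estimate = change_treated - change_control      # +0.5
print(f"difference-in-differences estimate: {did_estimate:+.1f} per 100k")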

It misses the point to critique the study on the basis of changes in crime rates within a particular state. The fact that one doesn't understand the methodology doesn't invalidate it.

As to robbery and burglary, it would be odd if they were excluded, given that muggings and other armed robberies, as well as most break-ins, are committed with the intent of stealing rather than committing mayhem on the victims; robbery and burglary rates are obvious dependent variables in a study of whether these laws deter crime.
__________________
Never let anything mechanical know you're in a hurry.

Last edited by Evan Thomas; July 18, 2013 at 02:20 PM.
Evan Thomas is offline  
Old July 18, 2013, 05:18 PM   #11
2damnold4this
Senior Member
 
Join Date: August 12, 2009
Location: Athens, Georgia
Posts: 2,526
The part that I have difficulty with is the math. We obviously have a lot of statistical noise with fluctuations of murder rates in states and regions over time. I don't have the math skills to check and see if the Texas A&M researchers are correct in their assertion that they got a statistically significant result. If they did find something significant, it could have implications in the coming debate over SYG laws. Maybe the NRC will check into the figures.


Perhaps someone who is a faculty member of a university can read the Georgia paper for free and report back.
2damnold4this is offline  
Old July 18, 2013, 06:50 PM   #12
Evan Thomas
Senior Member
 
Join Date: July 7, 2008
Location: Upper midwest
Posts: 5,631
Quote:
I don't have the math skills to check and see if the Texas A&M researchers are correct in their assertion that they got a statistically significant result.
Statistical significance isn't an "assertion"; it's a test of the data that's determined by the research design. Roughly, the significance level is the probability of seeing a difference as large as the observed one if it were due to random variation alone rather than to the independent variable. So if a difference is significant at the .01 level, random variation alone would produce a difference that large less than 1% of the time.

Table A1, p.42, gives the results, with significance levels, for the types of crime the authors studied before and after passage of various "castle doctrine" laws. Note that the study controlled for a number of possible confounding variables, including:
. . . policing and incarceration rates, welfare and public assistance spending, median income, poverty rate, unemployment rate, and demographics.
Statistical significance is a mathematical assessment of the difference between two (or more) sets of data; the meaning of "significance" in this context is a technical one. Whether the results are seen as important (for social policy decisions, for example) is a separate question.
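As a purely illustrative sketch (not the study's own test), a simple permutation check on made-up per-state changes shows how such a significance level is computed: shuffle the "adopting" labels many times and count how often chance alone produces a gap as large as the observed one.
Code:
# Illustrative permutation test on made-up per-state changes in homicide rate.
# A small p-value means a gap this large rarely arises from random labeling
# alone; it is not the probability that the law "caused" the change.
import random

adopting = [0.6, 0.4, 0.8, 0.5, 0.7]                    # hypothetical changes, adopting states
non_adopting = [0.1, 0.0, 0.3, -0.2, 0.2, 0.1, -0.1]    # hypothetical changes, control states

observed = sum(adopting) / len(adopting) - sum(non_adopting) / len(non_adopting)

pooled = adopting + non_adopting
n_adopt = len(adopting)
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = (sum(pooled[:n_adopt]) / n_adopt
            - sum(pooled[n_adopt:]) / (len(pooled) - n_adopt))
    if diff >= observed:
        extreme += 1

print(f"observed gap: {observed:.2f}, one-sided p-value: {extreme / trials:.3f}")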
__________________
Never let anything mechanical know you're in a hurry.
Evan Thomas is offline  
Old July 18, 2013, 10:49 PM   #13
2damnold4this
Senior Member
 
Join Date: August 12, 2009
Location: Athens, Georgia
Posts: 2,526
I guess I'm having trouble framing my question about the result. I understand what a statistically significant result means. Other folks have reached different results when looking at states that changed their castle/SYG laws (John Lott, looking at 1977 to 2005). What I'm asking is whether A&M plugged the numbers in right. If they did, why doesn't that jibe with Lott's earlier numbers?
2damnold4this is offline  
Old July 18, 2013, 10:52 PM   #14
2damnold4this
Senior Member
 
Join Date: August 12, 2009
Location: Athens, Georgia
Posts: 2,526
At the bottom of the linked page, there is a neat interactive map that shows homicide rates of different states that changed their laws in the 2000s.
2damnold4this is offline  
Old July 18, 2013, 11:19 PM   #15
Evan Thomas
Senior Member
 
Join Date: July 7, 2008
Location: Upper midwest
Posts: 5,631
From the article you linked above: "Starting with Florida in 2005, at least 24 states have adopted some variation of a stand-your-ground law."

As to the comparison with Lott's data, that would seem to be your answer right there. His data end in 2005, while Hoekstra and Cheng are analyzing data from states that changed their laws in 2005; their pre-change period for comparison includes data starting in 2000, but they're interested in data from the point at which Lott's study ends. So there's no reason their data should be consistent with his, other than wishful thinking.
__________________
Never let anything mechanical know you're in a hurry.

Last edited by Evan Thomas; July 19, 2013 at 07:40 PM.
Evan Thomas is offline  
Old July 19, 2013, 01:25 AM   #16
Jimro
Senior Member
 
Join Date: October 18, 2006
Posts: 7,097
Aguila Blanca, how does study methodology relate to Vanya's post #4? I recommend you get a copy of "Study a Study and Test a Test" (http://www.amazon.com/books/dp/0781774268) to help you understand that not all research papers are well written or well reviewed.

Quote:
As to the comparison with Lott's data, that would seem to be your answer right there. his data end in 2005, while Hoekstra and Cheng are analyzing data from states that changed their laws in 2005; their pre-change period for comparison includes data starting in 2000, but they're interested in data from the point at which Lott's study ends. So there's no reason their data should be consistent with his, other than wishful thinking
It all depends on where you set the zero point. The underlying data set will be the same, but setting a zero point at a particular place in time will determine whether you see a rise or fall in the results in any number of data sets.

For example, if you set the zero point for "global warming" at 1900, you can demonstrate global warming. If you set the zero point at the height of the Medieval Warm Period, then you show no warming.
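Here is a toy sketch of the zero-point effect using an invented yearly series (only the endpoints loosely echo the Florida numbers above): a trend fitted from 1992 slopes down, while one fitted from 2005 slopes up.
Code:
# Made-up yearly counts to show how the chosen baseline year flips the
# apparent trend. Purely illustrative; not actual Florida data.
import numpy as np

years = np.arange(1992, 2013)
counts = np.array([1191, 1150, 1120, 1080, 1030, 1000, 970, 940, 920, 900,
                   895, 890, 885, 881, 910, 960, 1000, 1010, 1005, 1008, 1009])

def slope(start_year):
    mask = years >= start_year
    return np.polyfit(years[mask], counts[mask], 1)[0]   # linear trend per year

print("trend from 1992:", round(slope(1992), 1), "homicides/year")  # negative
print("trend from 2005:", round(slope(2005), 1), "homicides/year")  # positive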

My previous analysis of the A&M study was to show that the researchers deliberately discounted normal variation in homicides in their methodology by artificially limiting the scope of their investigation. In order to come up with a real confidence value, they have to show that their results are not the product of normal variation.

Additionally, their method of "control" was any state that did not enact SYG laws, which includes a lot more states than any individual graph they showed. This means the control had a larger population to statistically normalize than the "experiment," which is bad design.

Jimro
__________________
Machine guns are awesome until you have to carry one.
Jimro is offline  
Old July 19, 2013, 06:24 AM   #17
2damnold4this
Senior Member
 
Join Date: August 12, 2009
Location: Athens, Georgia
Posts: 2,526
Quote:
As to the comparison with Lott's data, that would seem to be your answer right there. his data end in 2005, while Hoekstra and Cheng are analyzing data from states that changed their laws in 2005; their pre-change period for comparison includes data starting in 2000, but they're interested in data from the point at which Lott's study ends. So there's no reason their data should be consistent with his, other than wishful thinking.

We have two sets of data: one from 1977 to 2005 and the other from 2000 to 2010. The first shows that states that strengthened self-defense laws saw a decrease in murders, while the second shows that states that strengthened self-defense laws saw an increase. If the changes in murder rates were due to the changes in laws, why did one study show a negative change and the other a positive one? Does changing the law have a beneficial effect if it is done in certain years but a detrimental effect if it's changed in other years?
2damnold4this is offline  
Old July 19, 2013, 09:20 AM   #18
carguychris
Senior Member
 
Join Date: October 20, 2007
Location: Richardson, TX
Posts: 7,523
Quote:
...their method of "control" was any state that did not enact SYG laws, which includes a lot more states than any individual graph they showed. This means the control had a larger population to statistically normalize than the "experiment," which is bad design.
IMHO this is one of the two most readily apparent flaws in the study methodology, and it is related to the second most apparent flaw: singling out Florida.

The authors justify this because FL was the only state to enact the laws in 2005, but the FL data also seems to show the most mathematically neat surge in homicides after the law is passed, suggesting that the authors have fallen victim to the "Texas sharpshooter" fallacy.

http://en.wikipedia.org/wiki/Texas_sharpshooter_fallacy
__________________
"Smokey, this is not 'Nam. This is bowling. There are rules... MARK IT ZERO!!" - Walter Sobchak
carguychris is offline  
Old July 19, 2013, 09:37 AM   #19
Jimro
Senior Member
 
Join Date: October 18, 2006
Posts: 7,097
Quote:
We have two sets of data: one from 1977 to 2005 and the other from 2000 to 2010. The first shows that states that strengthened self-defense laws saw a decrease in murders, while the second shows that states that strengthened self-defense laws saw an increase. If the changes in murder rates were due to the changes in laws, why did one study show a negative change and the other a positive one? Does changing the law have a beneficial effect if it is done in certain years but a detrimental effect if it's changed in other years?
1977 to 2005 is 28 years of data.

2000 to 2010 is 10 years of data.

An intellectually rigorous person would give more weight to the conclusion that was derived from the larger data set. However, all these statistical studies must have very stringent controls to rule out confounding factors. Over the past 100 years crime has been "associated with" (meaning someone got two graphs with the same or inverse curve over the same time period) the economy, climate, population density, number of 1st gen immigrants, etc.

So saying that a specific law in a specific state caused a specific change in the homicide rate is a very risky argument to make if you use a large data set. For example, instead of doing the math in a regression analysis, they could have done a case comparison between two states of similar population size, density, and urban areas, and run a "controlled pair study."

A controlled pair study is used in medicine and other research, and is especially useful when you have a very small control population. It works particularly well with cities of similar size, density, and income distribution for analyzing the effects of laws (although state laws can become confounding factors).
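As a rough sketch of what selecting a "controlled pair" might look like, here is a toy nearest-match pick on population, density, and median income; all figures and city names are invented.
Code:
# Toy matched-pair selection: for a "treated" city, pick the control city
# closest on (population, density, median income). All figures are invented.
import math

treated = {"name": "City A", "pop": 850_000, "density": 3500, "income": 48_000}

candidates = [
    {"name": "City B", "pop": 900_000, "density": 3300, "income": 47_000},
    {"name": "City C", "pop": 400_000, "density": 5200, "income": 61_000},
    {"name": "City D", "pop": 1_600_000, "density": 2900, "income": 52_000},
]

def distance(a, b):
    # crude normalized Euclidean distance across the three covariates
    keys = ["pop", "density", "income"]
    return math.sqrt(sum(((a[k] - b[k]) / a[k]) ** 2 for k in keys))

best = min(candidates, key=lambda c: distance(treated, c))
print("matched control for", treated["name"], "->", best["name"])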

Jimro
__________________
Machine guns are awesome until you have to carry one.
Jimro is offline  
Old July 19, 2013, 01:33 PM   #20
2damnold4this
Senior Member
 
Join Date: August 12, 2009
Location: Athens, Georgia
Posts: 2,526
Here is a free link to the Georgia State study.
2damnold4this is offline  
Old July 19, 2013, 02:01 PM   #21
Evan Thomas
Senior Member
 
Join Date: July 7, 2008
Location: Upper midwest
Posts: 5,631
Quote:
Originally Posted by Jimro
1977 to 2005 is 28 years of data.

2000 to 2010 is 10 years of data.

An intellectually rigorous person would give more weight to the conclusion that was derived from the larger data set.
The first thing an intellectually rigorous person would consider is what a given study was designed to evaluate.

A comparison of the number of years of data is irrelevant, because they weren't studying the same thing as Lott. The whole point of the Texas A&M study was to examine the effects on crime rates of the liberalizations of castle doctrine that were passed in several states between 2005 and 2009. Lott was studying the effects of self-defense laws in general in a time frame that preceded that of the Texas A&M study. The laws in which Hoekstra and Cheng were interested had not been passed at that time.

As to the difference in the sizes of the experimental and control groups, which you raised in an earlier post, the difference-in-differences method handles that readily by using weighted averages. (See Table A1, for example, in which the regressions are weighted by state population.) It's not unusual in such studies for the size of the control group to exceed that of the experimental group by a factor of 10 (see, for example, this paper on using the method in health care research).
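A minimal sketch of the weighting idea, with invented per-state changes and populations: each state's change is weighted by its population, so one large control state and several small ones don't distort the average.
Code:
# Toy population-weighted average of per-state changes in homicide rate.
# Numbers are invented; the study's regressions weight by state population.
changes =     [0.5, -0.1, 0.2, 0.0]     # per-100k change in each control state
populations = [19e6, 1e6, 6e6, 3e6]     # hypothetical state populations

weighted_avg = sum(c * p for c, p in zip(changes, populations)) / sum(populations)
unweighted_avg = sum(changes) / len(changes)

print(f"weighted average change:   {weighted_avg:+.2f}")
print(f"unweighted average change: {unweighted_avg:+.2f}")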

Rather than attempting to find fault with the study because one doesn't like its conclusions, we might find it far more interesting to discuss the implications of such research from the perspective that its findings may be valid. If "stand your ground" laws indeed lead to increased homicide rates, that should interest us. As their conclusion notes, it's possible that the increase reflects a greater number of justifiable homicides:
A critical question is whether all the additional homicides that were reported as murders or non-negligent manslaughters could have been legally justified. Based on the results of various tests and exercises performed here, our view is that this is unlikely, albeit not impossible.
So it's an open question as to how much of the increase represents the lawful use of deadly force; there may be more meat here for proponents of "stand-your-ground" laws than for opponents.
.....
2damnold4this, thanks for the link.
__________________
Never let anything mechanical know you're in a hurry.

Last edited by Evan Thomas; July 19, 2013 at 07:32 PM. Reason: grammer. ;)
Evan Thomas is offline  
Old July 21, 2013, 01:43 AM   #22
Jimro
Senior Member
 
Join Date: October 18, 2006
Posts: 7,097
Quote:
A comparison of the number of years of data is irrelevant, because they weren't studying the same thing as Lott. The whole point of the Texas A&M study was to examine the effects on crime rates of the liberalizations of castle doctrine that were passed in several states between 2005 and 2009. Lott was studying the effects of self-defense laws in general in a time frame that preceded that of the Texas A&M study. The laws in which Hoekstra and Cheng were interested had not been passed at that time.
No, the whole point of the A&M study is in its title, which is a horrible title because it asks four questions.

Quote:
Does Strengthening Self-Defense Law Deter Crime or Escalate Violence?
Let me translate that into four questions:
Does strengthening self defense laws deter crime?
Does strengthening self defense laws have no effect on crime deterrence? (null hypothesis)
Does strengthening self defense laws escalate violence?
Does strengthening self defense laws have no effect on violence? (null hypothesis)

Then they focus solely on "homicide" instead of "crime." In Florida they focus solely on the increase in homicide, but not the dramatic decrease in overall violent crime.

And the larger data set is relevant because it gives an actual baseline of data that shows normal variation.

And here is why. Suppose you were asking, "Has the introduction of biotech corn caused an increase in water consumption?" If you didn't have a good multi-year baseline for a particular area showing average water use under different growing conditions, and instead chose a "control" group of different states with different growing conditions, and looked only at the years after biotech corn was introduced, you would have designed a poor experiment.

And that is what Hoekstra and Cheng did: they eliminated the baseline fluctuation that is freely available in the data sets they used in their study. So now their study is open to the criticism that they limited their data set to ignore "noise in the system" instead of accounting for it.

As far as particular state laws go, simply accounting for population density and urban distribution is only the beginning. Think about this: gun-friendly Vermont is in the "no stand your ground or castle doctrine" control group despite having some of the most gun-friendly laws in the country.

They limited their data set and limited the scope of "violence" to "homicide," and to me that is not good science, or good English.

Good science is observable and repeatable. Here the data set they used is freely available to the public, and I have already shown a different interpretation of it. I could repeat their work by following their steps exactly, but I could also refute it by using a larger portion of the same data set. Not good science.

In terms of having a small experimental group, remember the "vaccines cause autism" scare? That study was never replicated, suffered from a small sample size, and caused children across the world to get sick from preventable diseases. I'm very critical of bad science, as it has profound effects on policy.

Jimro
__________________
Machine guns are awesome until you have to carry one.
Jimro is offline  
Old July 21, 2013, 06:19 AM   #23
2damnold4this
Senior Member
 
Join Date: August 12, 2009
Location: Athens, Georgia
Posts: 2,526
The Georgia State study differed from the A&M study in several ways. One is that the GS study looked at laws that removed the duty to retreat separately from laws that changed things like civil liability or a presumption of innocence. Another is that it looked at race and gender. It's interesting that the GS study found an increase in murder rates associated with removal of the duty to retreat but not with the other changes. It's also interesting that the increase in murder rates was only for white males.
2damnold4this is offline  
Old July 21, 2013, 02:14 PM   #24
Jimro
Senior Member
 
Join Date: October 18, 2006
Posts: 7,097
Ok, just finished the first read through of the Georgia study.

Much more rigorous than the Texas A&M study.

However, their finding of 4.4 to 7.4 more homicides per month spread across 18 states is something I consider a non-issue.

Also, it should be noted that Texas and Louisiana had some trouble with the weather in 2005, which caused social upheaval outside the "statistical norm" for those states. There is no mention of "mass human migration in the event of New Orleans flooding" as a confounding factor.

I do appreciate that they showed the control rates both before and after, with similar curve patterns between the SYG and control populations. I also noticed that the curves diverged very little after the SYG zero point.

Jimro
__________________
Machine guns are awesome until you have to carry one.
Jimro is offline  
Old July 21, 2013, 02:51 PM   #25
johnelmore
Senior Member
 
Join Date: April 6, 2013
Posts: 456
I truly believe the violent crime rate is dependent on the age of the population. An older population is less violent than a younger one. America is getting older, so the violent crime rate is dropping. Half the population is over 40. The average age of a Floridian is 42.
johnelmore is offline  