Have you ever wondered about the effectiveness of dry fire practice?
Like many people these days, I have implemented a more dedicated dry fire program to augment my live fire training. In the process, I have discovered a few positives and a few negatives about dry fire effectiveness.
Does Dry Fire Really Work
First, let me tell you about my experiment. I already dry fire a lot, but to be honest I had never stopped to evaluate whether it has a positive effect on my live fire performance. It became obvious I would need a means to validate my belief that dry fire is helpful. I have talked about dry fire in the past, and I know it is heavily recommended to shooters of all skill levels. The real question is whether anyone has done anything to provide tangible results behind this recommendation. How do we know it really produces results?
What About Those With Skills Already
My approach was pretty simple, but not easy. As I mentioned, I currently dry fire practice a lot. What is a lot? How do I define a lot? It averages about 100 minutes a week. Family and travel can impact my weekly quota, but in general it's really close. One could argue I already perform too much dry fire to evaluate whether it has a good or bad effect. That could be true, but another element to consider is sustainment. Does dry fire work at sustaining your current skill level? Is it an effective replacement for the live fire training required to maintain your skills? With an already established routine I can say it did make things somewhat easier, but I still needed a method to measure and track my progress.
A Battery of Tests To Evaluate Dry Fire
There are a few commercially available dry fire tools you can purchase. Many of them claim to be the answer to your shooting problems, but at best they are only one piece of the puzzle. The question is how big a piece of the puzzle, and that will revolve around your perceived return on investment for dry fire effectiveness. For this experiment I purchased the Mantis X dry fire module. Then I needed a means to measure live fire performance. I could have picked a single shooting drill, but that would have been too narrow in scope, both for this experiment and in real life; no single shooting drill is an adequate measurement of overall skill. Instead, I used a battery of drills to test and evaluate on a much broader scale. I shot them all cold and recorded the scores before I started. I took all of them, seven in total, and averaged them for my overall score. Having so many measurements seemed good, but it also opened the door to a single poor performance dragging down the overall score. But isn't that why I'm dry firing in the first place?
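As a rough illustration of the scoring approach, the composite is just the mean of the cold runs. A minimal sketch in Python; the seven scores below are invented for illustration, not my actual data:

```python
# Minimal sketch of the composite-score idea: average a battery of
# cold drill scores into one overall number, so a single weak run is
# diluted but still visible. Scores here are hypothetical.

def overall_score(scores):
    """Average a battery of drill scores into one composite number."""
    return sum(scores) / len(scores)

# One hypothetical cold score per drill in the seven-drill battery.
cold_scores = [82.0, 75.5, 90.0, 68.0, 84.5, 77.0, 88.0]

baseline = overall_score(cold_scores)
# The one weak run (68.0) pulls the composite down without hiding it.
print(round(baseline, 2))
```

This mirrors the trade-off described above: the average smooths out noise across drills, but a single poor performance still lowers the overall score.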
Breaking Down My Experiment
In an effort not to skew my live fire results, I did not practice the selected tests during dry fire. Aside from that being very hard to do, I did not want to show bias toward these drills. I have been conducting this little experiment for a while now, and here are some initial observations. I say initial because I don't think I'm ready to conclude the experiment; I feel more time is needed to gain any truly useful information. However, here is what I can tell you now: yes, it does help. I started out establishing my baselines, then put my professional development on the back burner. I wanted any live fire training to be evaluated for this experiment. That was harder than expected for a lot of reasons. Performing demonstrations in classes is live fire practice, however I do it in a manner that is not 100% authentic, meaning I explain the drill as I'm shooting it or illustrate high points for students. Then there was the worry I would let my skills degrade too much. Since I had no idea if this would work, did I want to risk losing too much of my skill set? I feel, and the results show, that I'm at the very minimum breaking even.
Dry Fire Observations
After the baselines, I took a hiatus from my professional development for six months. The only live fire I completed was for this experiment. I performed dry fire only for the first three months. This gave the dry fire routine a chance to become established and, I hoped, allowed enough time for my live fire skills to be more authentically evaluated. For the last three months I went to the range to retest my baselines. My score for the first month was a 3% decrease from my baselines. Then in the second and third months I saw 5% and 4% increases respectively. Not the huge numbers I was expecting, but it does lend credibility to my notion that dry fire is a valuable tool for sustainment. I'm thinking I will do another three months of live fire retesting, then take the final three months off and perform dry fire only, to see if there is anything significant to report.
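The month-over-month comparison boils down to simple percent change from the baseline composite. A small sketch, using a hypothetical baseline of 80.0 and retest scores chosen only to reproduce the reported -3%, +5%, and +4% (the actual scores are not given here):

```python
# Percent change of each monthly retest composite vs. the baseline.
# Baseline and monthly scores are hypothetical, picked to land on the
# -3% / +5% / +4% figures reported in the post.

def percent_change(baseline, current):
    """Signed percent change of a retest score relative to baseline."""
    return (current - baseline) / baseline * 100.0

baseline = 80.0                # hypothetical baseline composite
monthly = [77.6, 84.0, 83.2]   # hypothetical monthly retest composites
changes = [round(percent_change(baseline, m), 1) for m in monthly]
print(changes)
```

Tracking the signed change rather than the raw score makes the sustainment question explicit: anything at or above zero means the dry-fire-only stretch at least held the line.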
Overall, I'm happy with these results, and while 5% may not be a huge return on my investment, it is at least safeguarding my investment. I look forward to seeing the results at the end of another six months.
4 thoughts on “Dry Fire Effectiveness”
A 5% increase in performance in six months? I can't think of any activity where that would not be considered a success. Also, you are an experienced shooter, so these gains were not "newbie" gains. It sounds like this was a significant success.
I think that using tools such as the Mantis was a big part of the success of your experiment. I doubt you would have had results as positive using conventional dry fire (i.e. working against a par timer), since the Mantis provides objective feedback on each repetition and a means to compare and record each session's performance. You should consider using other dry fire training aids, such as laser-based apps like LaserHit, LASR, ITarget, or Mantis Laser Academy; together with the Mantis X, these will give you feedback on each rep and hold you accountable for your training time. Without these tools, dry firing can degrade into mindlessly going through the motions if you are not careful.
I would be very interested in hearing what drills you chose for your benchmark and more details about the dry fire program you used. I will be looking forward to a follow-up article as you continue this experiment. Excellent post!
I agree. I wasn't sure what to expect, but I'm happy with the results. I'm not sure if it was the tool or the consistency; I'm more inclined to side with the consistency. Actually, the more I think about it, I believe the results might be attributed to the fact that I decided to start measuring at all. Anything of value must be measurable. Thanks.
“When performance is measured, performance improves. When performance is measured and reported back, the rate of improvement accelerates.” Pearson’s Law
It is hard to believe so many don't subscribe to this type of thinking. Thanks for sharing.