Multiple regression results help




Score: 2

For my first ever research paper I've run a hierarchical multiple linear regression with two predictors and one outcome variable; however, I don't understand my results. I found predictor A to be a significant predictor of my outcome variable on its own. However, when both predictors are in the model, predictor A is not significant; only predictor B is. How can this be if predictor A was significant in the first model? How does predictor B change the significance of predictor A?

Thank you!

Tags: multiple-regression, mlr

asked 9 hours ago by ummmm (new contributor)




















3 Answers


















Score: 1

Regression coefficients reflect the simultaneous effects of multiple predictors. If the two predictors are interdependent (i.e., correlated), the results can differ from those of single-predictor models.

answered 8 hours ago by IrishStat
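A small simulation makes this concrete. The sketch below (plain NumPy; the data-generating process, effect sizes, sample size, and seed are all invented for illustration) builds an outcome that depends only on B, makes A strongly correlated with B, and compares the t-statistic for A alone with its t-statistic once B is in the model:

```python
import numpy as np

def ols_t_stats(X, y):
    """OLS fit with an intercept; returns coefficients and t-statistics
    (intercept first)."""
    n = len(y)
    X = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - X.shape[1])            # residual variance
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return beta, beta / se

rng = np.random.default_rng(0)
n = 500
B = rng.normal(size=n)                # the predictor that "does the work"
A = B + 0.7 * rng.normal(size=n)      # A is correlated with B
y = B + rng.normal(size=n)            # the outcome depends only on B

_, t_alone = ols_t_stats(A.reshape(-1, 1), y)
_, t_joint = ols_t_stats(np.column_stack([A, B]), y)

print(f"t for A alone:   {t_alone[1]:.1f}")   # large: A looks significant
print(f"t for A given B: {t_joint[1]:.1f}")   # typically small: not significant
print(f"t for B given A: {t_joint[2]:.1f}")   # large: B stays significant
```

Because A is only a noisy proxy for B here, A predicts y well on its own but adds almost nothing once B is in the model, reproducing the pattern in the question.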




















Score: 1

The tests in multiple regression are "added last" tests: they test whether the model significantly improves when the extra variable is added to a regression that already contains all the other predictors.

In a model with no predictors, adding A improves the model, so the test of A is significant in the model with only A.

With A already in the model, adding B improves the model, so the test of B is significant in the model with A and B. But with B already in the model, adding A doesn't improve the model, so the test of A is not significant in the model with A and B. B is doing all the work that A would do, so adding A doesn't improve the model beyond B.

As @IrishStat mentioned, this can occur when A and B are correlated (positively or negatively) with each other. It's a fairly common occurrence in regression modeling. The conclusion you might draw is that A predicts the outcome when B is not in the model (i.e., unavailable), but after including B, A doesn't do much more to predict the outcome. Unfortunately, without more information about the causal structure of your variables, little more interpretation is available.

answered 7 hours ago by Noah




















Score: 0

To expand a little on @Noah's and @IrishStat's answers: in a multiple regression, the coefficient for each predictor is estimated to capture that variable's direct effect, using the variation unique to that variable and its correlation with the outcome variable, not the variation shared with the other predictors. (In technical terms, this is about the variances and covariances of these variables.) The less unique variation a predictor has, the less significant its estimate will tend to be.

So why, in your example, did predictor A become non-significant when B was added, rather than the reverse? It is likely because the proportion of predictor A's variance that it shares with predictor B is larger than the proportion of predictor B's variance that it shares with predictor A.

answered 31 mins ago by AlexK
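One way to see "unique variation" numerically is to decompose R²: a predictor's unique contribution is the drop in R² when it is removed from the full model. A sketch in NumPy (the simulated data are invented for illustration, matching the A/B setup above):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return 1 - (r @ r) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(2)
n = 500
B = rng.normal(size=n)
A = B + 0.7 * rng.normal(size=n)     # A shares most of its variance with B
y = B + rng.normal(size=n)

r2_full = r_squared(np.column_stack([A, B]), y)
unique_A = r2_full - r_squared(B.reshape(-1, 1), y)  # outcome variance only A explains
unique_B = r2_full - r_squared(A.reshape(-1, 1), y)  # outcome variance only B explains
print(f"unique to A: {unique_A:.3f}, unique to B: {unique_B:.3f}")
```

Here A's unique contribution is close to zero while B's is substantial, which is why A's coefficient loses significance in the joint model.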












