Just-in-time mvn test results in addition to final summary

























When running the mvn test command against a set of tests, Maven executes all of them, prints tracing output (which can be parsed at run time by another party), aggregates the results, and summarizes them at the end.



Is there a way, via the command line, to get results test by test during the execution, and still have the aggregated summary at the end? Why do we have to wait until it completes?



Our Continuous Integration environment could be adapted to get "just-in-time" test feedback, instead of waiting until the whole suite is done to find out whether any tests failed. This would save significant time, since developers could start investigating failing tests while the suite is still running.



By contrast, when the tests are run in Eclipse, we do get just-in-time test results, but integrating that into our environment is challenging. Any ideas on how to achieve the same through the command line?
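(A sketch of one possible command-line approach, under the assumption that Surefire's default reporters write one plain-text report file per test class into target/surefire-reports as each class finishes: a side process could poll that directory while mvn test runs and announce each class as soon as its report appears. The path and polling interval here are illustrative, not built-in Maven features.)

```python
import time
from pathlib import Path

def new_reports(seen, current):
    """Return the report file names present now but not seen before."""
    return current - seen

def watch(report_dir="target/surefire-reports", interval=1.0):
    # Poll the report directory while `mvn test` runs in another process.
    # Each new *.txt file signals that a test class has just completed,
    # giving just-in-time feedback before Maven's final summary.
    seen = set()
    directory = Path(report_dir)
    while True:
        current = ({p.name for p in directory.glob("*.txt")}
                   if directory.exists() else set())
        for name in sorted(new_reports(seen, current)):
            print(f"finished: {name}")
        seen |= current
        time.sleep(interval)
```

Run watch() in a background process (or a separate CI step) alongside the mvn test invocation; the Maven build itself is untouched, so the final aggregated summary still appears as usual.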










      maven unit-testing junit maven-surefire-plugin






asked Nov 15 '18 at 19:09 by Abdo
          1 Answer
During Surefire's execution, for each test class executed, you'll see lines like these:



          [INFO] Running com.example.MyTest
          [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in com.example.MyTest


You could catch the lines where Failures or Errors are greater than 0, and use them to alert developers to the issue immediately.
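The suggestion above can be sketched as a small filter that mvn test output is piped through: it passes every line along unchanged (so the final summary survives), but raises an alert on stderr the moment a per-class result line reports a failure or error. The regex targets the Surefire result format shown above; the script and any file names are illustrative.

```python
import re
import sys

# Matches Surefire per-class result lines, e.g.:
# [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in com.example.MyTest
RESULT_RE = re.compile(
    r"Tests run: (\d+), Failures: (\d+), Errors: (\d+), Skipped: (\d+).*- in (\S+)"
)

def parse_result(line):
    """Return (class_name, failures, errors) for a per-class result line, else None."""
    m = RESULT_RE.search(line)
    if not m:
        return None
    runs, failures, errors, skipped, name = m.groups()
    return (name, int(failures), int(errors))

def main():
    # Usage: mvn test | python watch_failures.py
    for line in sys.stdin:
        sys.stdout.write(line)  # keep the normal log and the final summary
        parsed = parse_result(line)
        if parsed and (parsed[1] > 0 or parsed[2] > 0):
            name, failures, errors = parsed
            print(f"*** FAILING: {name} (failures={failures}, errors={errors})",
                  file=sys.stderr)
```

A CI job could watch stderr (or replace the print with a notification hook) to ping developers as soon as a class fails, while the build itself runs to completion.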






answered Nov 16 '18 at 7:36 by Mureinik




























