How can I process the content of a CSV file as pipeline input in a PowerShell cmdlet?
I want to use a CSV file to feed the parameters of a PowerShell cmdlet:

    Role, email, fname, lname
    Admin, a@b.com, John, Smith

I want to process a cmdlet as follows:

    Import-Csv myFile | mycmdlet | Export-Csv myresults

I also want to be able to call the cmdlet like this:

    mycmdlet -role x -email j@b.com -fname John -lname Smith

and see a result as an object like:

    lname: "Smith"
    fname: "John"
    email: "j@b.com"
    role: "X"
    ResultData: "something else"

I didn't want to have to do this:

    Import-Csv X.txt | ForEach-Object { mycmdlet -email $_.email }

In PowerShell I wanted to do something like this:

    function global:Test-Pipeline {
    param(
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$role,
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$email,
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$fname,
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$lname
    )

        $result = New-Object psobject

        Process {
            foreach ($address in $email) {
                # do something with each role, email, fname, lname
                # add output to $result
            }
        }

        End {
            return $result
        }
    }

I'm sure this must be possible; how do I do it? Can it be done without having to process the CSV in the cmdlet?










share|improve this question




























    2















    I want to use a CSV file to feed the parameters of powershell cmdlet



    Role, email, fname, lname
    Admin, a@b.com, John, Smith


    I want to process a cmdlet as follows:



    import-csv myFile| mycmdlet | export-csv myresults


    I also want to be able to call the cmdlet like this



    mycmdlet -role x -email j@b.com -fname John -lname Smith


    and see a result as an object like:



    lname: "Smith"
    fname: "John"
    email: "j@b.com"
    role: "X"
    ResultData: "something else"


    I didn't want to have to do this:



    import-csv X.txt | foreach-object mycmdlet -email $_.email 


    In the powershell I wanted to do something line this:



    function global:test-Pipeline
    param(
    [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$role,
    [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$email,
    [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$fname,
    [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$lname )

    $result = new-object psobject

    process
    foreach($address in $email)

    Do something with each role, email, fname lname
    Add output to $result



    End
    return $result





    I'm sure this must be possible, how do i do it? Can it be done without having to process the CSV in the cmdlet?










    share|improve this question


























      2












      2








      2








      I want to use a CSV file to feed the parameters of powershell cmdlet



      Role, email, fname, lname
      Admin, a@b.com, John, Smith


      I want to process a cmdlet as follows:



      import-csv myFile| mycmdlet | export-csv myresults


      I also want to be able to call the cmdlet like this



      mycmdlet -role x -email j@b.com -fname John -lname Smith


      and see a result as an object like:



      lname: "Smith"
      fname: "John"
      email: "j@b.com"
      role: "X"
      ResultData: "something else"


      I didn't want to have to do this:



      import-csv X.txt | foreach-object mycmdlet -email $_.email 


      In the powershell I wanted to do something line this:



      function global:test-Pipeline
      param(
      [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$role,
      [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$email,
      [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$fname,
      [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$lname )

      $result = new-object psobject

      process
      foreach($address in $email)

      Do something with each role, email, fname lname
      Add output to $result



      End
      return $result





      I'm sure this must be possible, how do i do it? Can it be done without having to process the CSV in the cmdlet?










      share|improve this question
















      I want to use a CSV file to feed the parameters of powershell cmdlet



      Role, email, fname, lname
      Admin, a@b.com, John, Smith


      I want to process a cmdlet as follows:



      import-csv myFile| mycmdlet | export-csv myresults


      I also want to be able to call the cmdlet like this



      mycmdlet -role x -email j@b.com -fname John -lname Smith


      and see a result as an object like:



      lname: "Smith"
      fname: "John"
      email: "j@b.com"
      role: "X"
      ResultData: "something else"


      I didn't want to have to do this:



      import-csv X.txt | foreach-object mycmdlet -email $_.email 


      In the powershell I wanted to do something line this:



      function global:test-Pipeline
      param(
      [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$role,
      [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$email,
      [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$fname,
      [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$lname )

      $result = new-object psobject

      process
      foreach($address in $email)

      Do something with each role, email, fname lname
      Add output to $result



      End
      return $result





      I'm sure this must be possible, how do i do it? Can it be done without having to process the CSV in the cmdlet?







      powershell pipeline cmdlet






      share|improve this question















      share|improve this question













      share|improve this question




      share|improve this question








      edited Nov 14 '18 at 1:13







      James Moore

















      asked Nov 14 '18 at 1:08









      James MooreJames Moore

      242




      242






















1 Answer
Yes, you almost have it right. Your parameters should not use ValueFromPipeline, only ValueFromPipelineByPropertyName. They should be [String], not [String[]]. The reason is that you are going to get a single set of parameters, corresponding to a single input object, in each "pass".

You also don't need the End block here; it should all be done in Process.

    function Test-Pipeline {
    param(
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$role,
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$email,
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$fname,
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$lname
    )

        Process {
            New-Object PSObject -Property @{
                role = "~$role~"
                email = "mailto:$email"
                fname = [cultureinfo]::CurrentCulture.TextInfo.ToTitleCase($fname)
                lname = [cultureinfo]::CurrentCulture.TextInfo.ToTitleCase($lname)
            }
        }
    }

Usage:

    Import-Csv myData.csv | Test-Pipeline | Export-Csv transformedData.csv -NoTypeInformation

    Import-Csv myData.csv | Test-Pipeline -role "Override Role" | ConvertTo-Json

What's going on?

Import-Csv gives you one object for every row in the CSV, and each object has a property for each column in the CSV.

When you pipe that into another command, including your function, each individual object gets sent through one at a time.

You could accept the one object and process its properties in the command.

What you have here, using ValueFromPipelineByPropertyName, looks at the input object, if there is one, and if it contains a property that matches the name of the parameter (or any of its aliases), then the value of that property is bound to the parameter (unless you specified one on calling; that would override the input value).

Since you want objects again as a result, you create a new object in the Process block, which will be passed out through the pipeline to the next command.
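The "or any of its aliases" part is worth noting: if a CSV column name doesn't match a parameter name, an [Alias()] attribute on the parameter can still bind it by property name. A minimal sketch (the JobTitle column name here is a made-up example, not from the question's CSV):

    function Test-Alias {
    param(
        # A hypothetical CSV with a 'JobTitle' column instead of 'Role'
        # still binds to -role via the alias.
        [Parameter(ValueFromPipelineByPropertyName=$true)][Alias('JobTitle')][String]$role,
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$email
    )
        Process {
            New-Object PSObject -Property @{ role = $role; email = $email }
        }
    }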






answered Nov 14 '18 at 1:30 by briantist
          • As I'm understanding pipeline more - I think what I want is for the pipeline to be a custom object and make sure that all the properties needed in the custom object are defined and present in the pipeline in an optimized way

            – James Moore
            Nov 14 '18 at 13:08











          • @JamesMoore if that's the case it sounds like you want to define a class to describe this type. You can set your parameters to mandatory if they must all be present (and add validation attributes). Then instead of creating a PSObject you can create an object of your type to pass through instead. If you don't want your own class, you do most of that stuff anyway as it passes through.

            – briantist
            Nov 14 '18 at 22:08









