How can I process the content of a CSV file as pipeline input to a PowerShell cmdlet?
I want to use a CSV file to feed the parameters of a PowerShell cmdlet. The CSV looks like this:
Role, email, fname, lname
Admin, a@b.com, John, Smith
I want to run the cmdlet in a pipeline as follows:
import-csv myFile | mycmdlet | export-csv myresults
I also want to be able to call the cmdlet like this:
mycmdlet -role x -email j@b.com -fname John -lname Smith
and see a result as an object like:
lname: "Smith"
fname: "John"
email: "j@b.com"
role: "X"
ResultData: "something else"
I didn't want to have to do this:
import-csv X.txt | foreach-object { mycmdlet -email $_.email }
In PowerShell I wanted to do something like this:
function global:test-Pipeline {
    param(
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$role,
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$email,
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$fname,
        [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)][String]$lname
    )

    $result = New-Object PSObject

    Process {
        foreach ($address in $email) {
            # Do something with each role, email, fname, lname
            # Add output to $result
        }
    }

    End {
        return $result
    }
}
I'm sure this must be possible. How do I do it? Can it be done without having to process the CSV inside the cmdlet?
powershell pipeline cmdlet
1 Answer
Yes, you almost have it right. Your parameters should not use ValueFromPipeline, but should use ValueFromPipelineByPropertyName. They should be [String], not [String[]]. The reason is that you are going to get a single set of parameters, corresponding to a single input object, in each "pass".
You also don't need the End block here; it should all be done in Process.
function Test-Pipeline {
    param(
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$role,
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$email,
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$fname,
        [Parameter(ValueFromPipelineByPropertyName=$true)][String]$lname
    )

    Process {
        New-Object PSObject -Property @{
            role  = "~$role~"
            email = "mailto:$email"
            fname = [cultureinfo]::CurrentCulture.TextInfo.ToTitleCase($fname)
            lname = [cultureinfo]::CurrentCulture.TextInfo.ToTitleCase($lname)
        }
    }
}
Usage:
Import-Csv myData.csv | Test-Pipeline | Export-Csv transformedData.csv -NoTypeInformation
Import-Csv myData.csv | Test-Pipeline -role "Override Role" | ConvertTo-Json
What's going on?
Import-Csv gives you one object for every row in the CSV, and each object has a property for each column in the CSV.
When you pipe that into another command, including your function, each individual object gets sent through one at a time.
You could accept the one object, and process its properties in the command.
What you have here, using ValueFromPipelineByPropertyName, looks at the input object, if there is one, and if it contains a property that matches the name of the parameter (or any of its aliases), then the value of that property is bound to the parameter (unless you specified it explicitly when calling; that would override the input value).
Since you want objects again as a result, you create a new object in the Process block, which will be passed out through the pipeline to the next command.
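To make the property-name binding concrete, here is a small demonstration you could run against the function above; the $row object is hypothetical sample data, not part of the original answer:

# Hypothetical one-row, in-memory stand-in for the CSV
$row = [pscustomobject]@{ role = 'Admin'; email = 'a@b.com'; fname = 'john'; lname = 'smith' }

# The role, email, fname and lname properties bind by name to the matching parameters
$row | Test-Pipeline

# An explicitly supplied parameter takes precedence over the piped property
$row | Test-Pipeline -role 'Auditor'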
As I'm understanding the pipeline more, I think what I want is for the pipeline input to be a custom object, and to make sure that all the properties needed in the custom object are defined and present in the pipeline in an optimized way
– James Moore
Nov 14 '18 at 13:08
@JamesMoore if that's the case it sounds like you want to define a class to describe this type. You can set your parameters to mandatory if they must all be present (and add validation attributes). Then instead of creating a PSObject you can create an object of your type to pass through instead. If you don't want your own class, you can do most of that stuff anyway as it passes through.
– briantist
Nov 14 '18 at 22:08
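Following up on that suggestion, a minimal sketch of the class-based approach might look like the following; the UserRecord class name and the validation rules are assumptions for illustration, not something from the original thread:

# Hypothetical class describing one record (requires PowerShell 5.0+ for class support)
class UserRecord {
    [ValidateNotNullOrEmpty()][string]$role
    [ValidateNotNullOrEmpty()][string]$email
    [ValidateNotNullOrEmpty()][string]$fname
    [ValidateNotNullOrEmpty()][string]$lname
}

function Test-Pipeline {
    param(
        # Mandatory means every property must be present on the incoming object
        [Parameter(Mandatory=$true, ValueFromPipelineByPropertyName=$true)][String]$role,
        [Parameter(Mandatory=$true, ValueFromPipelineByPropertyName=$true)][String]$email,
        [Parameter(Mandatory=$true, ValueFromPipelineByPropertyName=$true)][String]$fname,
        [Parameter(Mandatory=$true, ValueFromPipelineByPropertyName=$true)][String]$lname
    )

    Process {
        # Emit a strongly typed object instead of a loose PSObject
        [UserRecord]@{
            role  = $role
            email = $email
            fname = $fname
            lname = $lname
        }
    }
}

With this, Import-Csv myData.csv | Test-Pipeline would emit UserRecord objects, and a row with a missing or empty column should fail parameter binding rather than slip through with an empty value.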