
Pipe complete array-objects instead of array items one at a time?


Short answer: use the unary array operator ,:

,$theArray | ForEach-Object { Write-Host $_ }
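To see the difference, you can compare how many objects actually travel down the pipeline (a small demonstration; the variable name is mine):

```powershell
$theArray = 1,2,3

# Without the unary comma, the array is enumerated: three pipeline objects.
($theArray | Measure-Object).Count    # 3

# With the unary comma, the wrapper array is enumerated instead, so the
# original array itself arrives as a single pipeline object.
(,$theArray | Measure-Object).Count   # 1
```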

Long answer: there is one thing you should understand about the @() operator: it always interprets its contents as a statement, even if the contents are just an expression. Consider this code:

$a='A','B','C'
$b=@($a;)
$c=@($b;)

I added an explicit end-of-statement marker ; here, although PowerShell allows you to omit it. $a is an array of three elements. What is the result of the $a; statement? $a is a collection, so the collection gets enumerated and each individual item is passed down the pipeline. The result of the $a; statement is therefore three elements written to the pipeline. @($a;) sees those three elements, not the original array, and creates a new array from them, so $b is an array of three elements. In the same way, $c is an array of the same three elements. So when you write @($collection) you create an array that copies the elements of $collection, rather than an array with $collection as its single element.
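A quick sketch of the contrast between @() and the unary comma (variable names are mine):

```powershell
$a = 'A','B','C'

# @($a) enumerates $a and rebuilds an array from the emitted items:
$b = @($a)
$b.Count        # 3 - a copy of the elements, not a nested array

# The unary comma wraps $a as the single element of a new array:
$c = ,$a
$c.Count        # 1
$c[0].Count     # 3 - the original array sits inside
```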


The comma character makes the data an array. In order to make the pipeline process your array as an array, instead of operating on each array element individually, you may also need to wrap the data in parentheses.

This is useful if you need to assess the status of multiple items in the array.

Using the following function:

function funTest {
    param (
        [parameter(Position=1, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
        [alias("Target")]
        [array]$Targets
        ) # end param
    begin {}
    process {
        $RandomSeed = $( 1..1000 | Get-Random )
        foreach ($Target in $Targets) {
            Write-Host "$RandomSeed - $Target"
            } # next target
        } # end process
    end {}
    } # end function

Consider the following examples:

Just wrapping your array in parentheses does not guarantee the function will process the array of values in one process call. In this example we see the random number change for each element in the array.

PS C:\> @(1,2,3,4,5) | funTest
153 - 1
87 - 2
96 - 3
96 - 4
986 - 5

Simply adding the leading comma also does not guarantee the function will process the array of values in one process call. In this example we again see the random number change for each element in the array.

PS C:\> , 1,2,3,4,5 | funTest
1000 - 1
84 - 2
813 - 3
156 - 4
928 - 5
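The reason the leading comma alone does not help is precedence: the unary comma binds more tightly than the binary comma, so only the first value gets wrapped (a small illustration of that parse, as I read it):

```powershell
$x = , 1,2,3,4,5     # parses as @( @(1), 2, 3, 4, 5 )
$x.Count             # 5 - the outer array still has five elements
$x[0].Count          # 1 - only the first element, @(1), was wrapped
```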

With the leading comma and the array of values in parentheses, we can see the random number stays the same, because the whole array reaches the process block in a single call and the function's internal foreach loop handles the elements.

PS C:\> , @( 1,2,3,4,5) | funTest
883 - 1
883 - 2
883 - 3
883 - 4
883 - 5


There's an old-school solution, if you don't mind that your process is a function.

Example: you want an array copied to the clipboard in a way that allows you to rebuild it on another system without any PSRemoting connectivity. So you want an array containing "A", "B", and "C" transmuted into the string

@("A","B","C")

...instead of a literal array.

So you build this (which isn't optimal for other reasons, but stay on topic):

# Serialize-List
param (
    [Parameter(Mandatory, ValueFromPipeline)]
    [string[]]$list
)
$output = "@(";
foreach ($element in $list)
{
    $output += "`"$element`","
}
$output = $output.Substring(0, $output.Length - 1)
$output += ")"
$output

and it works when you specify the array as a parameter directly:

Serialize-List $list
@("A","B","C")

...but not so much when you pass it through the pipeline:

$list | Serialize-List
@("C")

But refactor your function with begin, process, and end blocks:

# Serialize-List
param (
    [Parameter(Mandatory, ValueFromPipeline)]
    [string[]]$list
)
begin
{
    $output = "@(";
}
process
{
    foreach ($element in $list)
    {
        $output += "`"$element`","
    }
}
end
{
    $output = $output.Substring(0, $output.Length - 1)
    $output += ")"
    $output
}

...and you get the desired output both ways.

Serialize-List $list
@("A","B","C")

$list | Serialize-List
@("A","B","C")
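As an aside, the string-building can also be done with the -join operator instead of manual concatenation and Substring trimming; a sketch of that variant (Serialize-List2 is just an illustrative name):

```powershell
function Serialize-List2 {
    param (
        [Parameter(Mandatory, ValueFromPipeline)]
        [string[]]$list
    )
    begin   { $items = @() }
    process { $items += $list }   # accumulate every chunk the pipeline delivers
    end     {
        # Quote each item, join with commas, and wrap in @( ... )
        '@(' + (($items | ForEach-Object { "`"$_`"" }) -join ',') + ')'
    }
}
```

This produces the same @("A","B","C") output for both the parameter and pipeline cases, without the off-by-one trailing-comma handling.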