Output ("echo") a variable to a text file Output ("echo") a variable to a text file powershell powershell

Output ("echo") a variable to a text file


The simplest Hello World example...

$hello = "Hello World"$hello | Out-File c:\debug.txt


Note: The answer below is written from the perspective of Windows PowerShell.
However, it applies to the cross-platform PowerShell Core edition (v6+) as well, except that the latter, commendably, consistently defaults to BOM-less UTF-8 character encoding, which is the most widely compatible encoding across platforms and cultures.
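You can verify this encoding difference yourself by inspecting a file's first bytes; a minimal sketch (the file name t.txt is just an example):

'hü' > t.txt                               # write a string containing a non-ASCII character
Format-Hex t.txt | Select-Object -First 1  # inspect the file's first bytes
# Windows PowerShell: the file starts with FF FE, the UTF-16LE BOM.
# PowerShell Core (v6+): plain UTF-8 bytes, with no BOM.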


To complement bigtv's helpful answer with a more concise alternative and background information:

# > $file is effectively the same as | Out-File $file
# Objects are written the same way they display in the console.
# Default character encoding is UTF-16LE (mostly 2 bytes per char.), with BOM.
# Use Out-File -Encoding <name> to change the encoding.
$env:computername > $file

# Set-Content calls .ToString() on each object to output.
# Default character encoding is "ANSI" (culture-specific, single-byte).
# Use Set-Content -Encoding <name> to change the encoding.
# Use Set-Content rather than Add-Content; the latter is for *appending* to a file.
$env:computername | Set-Content $file

When outputting to a text file, you have 2 fundamental choices that use different object representations and, in Windows PowerShell (as opposed to PowerShell Core), also employ different default character encodings:

  • Out-File (or >) / Out-File -Append (or >>):

    • Suitable for output objects of any type, because PowerShell's default output formatting is applied to the output objects.

      • In other words: you get the same output as when printing to the console.
    • The default encoding, which can be changed with the -Encoding parameter, is Unicode, i.e., UTF-16LE, in which most characters are encoded as 2 bytes each. The advantage of a Unicode encoding such as UTF-16LE is that it is a global alphabet, capable of encoding all characters from all human languages.

      • In PSv5.1+, you can change the encoding used by > and >>, via the $PSDefaultParameterValues preference variable, taking advantage of the fact that > and >> are now effectively aliases of Out-File and Out-File -Append. To change to UTF-8, for instance, use:
        $PSDefaultParameterValues['Out-File:Encoding']='UTF8'
  • Set-Content / Add-Content:

    • For writing strings and instances of types known to have meaningful string representations, such as the .NET primitive data types (Booleans, integers, ...).

      • The .psobject.ToString() method is called on each output object, which results in meaningless representations for types that don't explicitly implement a meaningful string representation; [hashtable] instances are an example (see the contrasting example after this list):
        @{ one = 1 } | Set-Content t.txt writes literal System.Collections.Hashtable to t.txt, which is the result of @{ one = 1 }.ToString().
    • The default encoding, which can be changed with the -Encoding parameter, is Default, which is the system's "ANSI" code page, the single-byte, culture-specific legacy encoding for non-Unicode applications, most commonly Windows-1252.
      Note that the documentation currently incorrectly claims that ASCII is the default encoding.

    • Note that Add-Content's purpose is to append content to an existing file, and it is only equivalent to Set-Content if the target file doesn't exist yet.
      Furthermore, the default or specified encoding is blindly applied, irrespective of the file's existing contents' encoding.
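
The following sketch contrasts the two behaviors described above, using a [hashtable] as input (the file names are just examples):

$ht = @{ one = 1 }

$ht > of.txt               # Out-File: applies the default console formatting
Get-Content of.txt         # -> a Name/Value table, as shown in the console

$ht | Set-Content sc.txt   # Set-Content: calls .ToString() on the input
Get-Content sc.txt         # -> literal "System.Collections.Hashtable"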

Out-File / > / Set-Content / Add-Content all act culture-sensitively, i.e., they produce representations suitable for the current culture (locale), if available (though custom formatting data is free to define its own, culture-invariant representation; see Get-Help about_Format.ps1xml).
This contrasts with PowerShell's string expansion (string interpolation in double-quoted strings), which is culture-invariant; see this answer of mine.
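
To illustrate, here is a sketch that assumes the fr-FR culture (whose decimal mark is ",") is available; run it as a single statement, because the interactive console may reset the culture at each prompt:

& { [cultureinfo]::CurrentCulture = 'fr-FR'; 1.2 > t.txt; Get-Content t.txt; "$(1.2)" }
# -> 1,2   (file content: culture-sensitive formatting)
# -> 1.2   (string expansion: culture-invariant)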

As for performance: Since Set-Content doesn't have to apply default formatting to its input, it performs better.
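
A quick way to see this for yourself (a rough sketch, not a rigorous benchmark; the file names are just examples):

$lines = 1..100000 | ForEach-Object { "line $_" }
(Measure-Command { $lines | Out-File t1.txt }).TotalMilliseconds      # slower: applies formatting
(Measure-Command { $lines | Set-Content t2.txt }).TotalMilliseconds   # faster: mere .ToString() calls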


As for the OP's symptom with Add-Content:

Since $env:COMPUTERNAME cannot contain non-ASCII characters, Add-Content's output, using "ANSI" encoding, should not result in ? characters in the output, and the likeliest explanation is that the ? characters were part of the preexisting content of output file $file, to which Add-Content appended.
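
If so, removing the target file first should yield clean output; a sketch, with $file holding the target path as in the question:

Remove-Item $file -ErrorAction Ignore   # discard any preexisting (possibly mis-decoded) content
$env:computername | Add-Content $file   # now effectively the same as Set-Content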


After some trial and error, I found that

$computername = $env:computername

works to get a computer name, but sending $computername to a file via Add-Content doesn't work.

I also tried $computername.Value.

Instead, if I use

$computername = get-content env:computername

I can send it to a text file using

$computername | Out-File $file