
PowerShell locking File


This is likely caused by the Get-Content cmdlet holding a read lock on the file while Out-File tries to acquire a write lock on the same file. A similar question is here: Powershell: how do you read & write I/O within one pipeline?

So the solution would be one of:

${C:\Path\File.cs} = ${C:\Path\File.cs} | foreach {$_ -replace "document.getElementById", '$'}

or

${C:\Path\File.cs} = Get-Content C:\Path\File.cs | foreach {$_ -replace "document.getElementById", '$'}

or

$content = Get-Content C:\Path\File.cs | foreach {$_ -replace "document.getElementById", '$'}
$content | Set-Content C:\Path\File.cs

Basically you need to buffer the content of the file so that the read side can finish and release its handle (Get-Content completes), and only then flush the buffer back to the file (Set-Content, which needs the write lock).
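A common shorthand for the same buffering idea is to wrap Get-Content in parentheses: the parentheses force the whole file to be read into memory (and the handle released) before the rest of the pipeline runs, so Set-Content can write back to the same path. A minimal sketch, reusing the file and replacement from above:

```powershell
# Parentheses force Get-Content to finish reading (and close the file)
# before the pipeline continues, so the write back to the same path works.
(Get-Content C:\Path\File.cs) |
    foreach {$_ -replace "document.getElementById", '$'} |
    Set-Content C:\Path\File.cs
```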


The accepted answer worked for me for a single file operation, but when I did multiple Set-Content or Add-Content operations on the same file, I still got the "is being used by another process" error.

In the end I had to write to a temp file, then copy the temp file to the original file:

(Get-Content C:\Path\File.cs) | foreach {$_ -replace "document.getElementById", '$'} | Set-Content C:\Path\File.cs.temp
Copy-Item C:\Path\File.cs.temp C:\Path\File.cs
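A small extension of the same temp-file approach (the .temp suffix is just the one used above; Remove-Item is a standard cmdlet) cleans up the temp file once it has been copied back:

```powershell
# Write the transformed content to a temp file, then replace the original.
(Get-Content C:\Path\File.cs) |
    foreach {$_ -replace "document.getElementById", '$'} |
    Set-Content C:\Path\File.cs.temp

Copy-Item C:\Path\File.cs.temp C:\Path\File.cs

# Remove the temp file afterwards so it does not linger on disk.
Remove-Item C:\Path\File.cs.temp
```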


Personal experience: I had the "locked file syndrome" in one of my procedures. I found it was caused by a New-Object assignment on the file: I had never issued a Dispose() call on the object. I rewrote the offending code to dispose of the New-Object as soon as convenient, and the "locked file" syndrome was resolved. A learning event for me: always dispose of each New-Object!
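As an illustration of that lesson (System.IO.StreamReader is just one example of a disposable .NET object you might create with New-Object; the path is the one from the answers above), a try/finally block guarantees the handle is released even if the read fails:

```powershell
# A .NET object created via New-Object can hold the file open
# until Dispose() is called on it.
$reader = New-Object System.IO.StreamReader "C:\Path\File.cs"
try {
    $text = $reader.ReadToEnd()
}
finally {
    # Without this, the file stays locked until the object
    # is eventually garbage-collected.
    $reader.Dispose()
}
```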