PowerShell Performance Tips for Large Text Operations – Part 1: Reading Files
Posted in Windows PowerShell | 28/02/2015 04:08
In this post I want to share some performance tips for large text operations in PowerShell.
Test file: a 200 MB Microsoft IIS log with 424,390 lines.
1. First of all, we have to read the file :) Let's try our alternatives:
a. Native command: Get-Content
With this option, the script takes 13.3727013 seconds to read and loop through 424,390 lines.
But how about memory usage?
Get-Content loads the whole file into memory, so it's normal to see high memory usage.
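A minimal sketch of this test, assuming the log path and a Measure-Command wrapper for timing (both are my assumptions, not the original script):

```powershell
# Hypothetical path to the IIS log used in the test
$path = 'C:\Logs\u_ex150228.log'

(Measure-Command {
    # Get-Content reads the entire file into memory as an array of lines
    $lines = Get-Content $path
    foreach ($line in $lines) {
        # process each log line here
    }
}).TotalSeconds
```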
b. Using .Net method: [io.file]::ReadAllLines
With this option, the script takes 2.0082615 seconds to read and loop through 424,390 lines, which is dramatically faster than Get-Content.
Memory usage is lower than with Get-Content, but still high. I couldn't capture a screenshot, but CPU usage peaked at around 13%.
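A sketch of the same loop using the .NET method instead (path and timing wrapper are assumptions):

```powershell
# Hypothetical path to the IIS log used in the test
$path = 'C:\Logs\u_ex150228.log'

(Measure-Command {
    # ReadAllLines also loads the whole file, but via .NET it is much faster
    $lines = [System.IO.File]::ReadAllLines($path)
    foreach ($line in $lines) {
        # process each log line here
    }
}).TotalSeconds
```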
c. Using .Net method: System.IO.StreamReader
With this option, the script takes 1.7062244 seconds to read and loop through 424,390 lines. This seems to be the fastest method.
Memory usage is also very low, because StreamReader reads the file line by line, so PowerShell never holds the whole file in memory.
But CPU usage is still high; it will probably saturate one of the server's cores while running. That's something I can't help :)
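A sketch of the StreamReader approach; again, the log path and the Measure-Command wrapper are my assumptions:

```powershell
# Hypothetical path to the IIS log used in the test
$path = 'C:\Logs\u_ex150228.log'

(Measure-Command {
    $reader = New-Object System.IO.StreamReader($path)
    try {
        # ReadLine returns $null at end of file, so only one line is in memory at a time
        while (($line = $reader.ReadLine()) -ne $null) {
            # process each log line here
        }
    }
    finally {
        $reader.Close()
    }
}).TotalSeconds
```

Closing the reader in a finally block ensures the file handle is released even if the loop throws.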
In the next part, I'll show you text manipulation tips. See you.
Tags: powershell file operations, powershell large text file operations, PowerShell performance tips, powershell read large file, powershell read large text