
While handling a .NET exception in PowerShell, I ran into an annoying problem: the exception message contained newline characters that cluttered my output. The technique I used to work around it is to replace the newline characters with a space or a comma.

Replace newline characters with a space or comma

To get rid of the newline characters in the string stored in $myvar, you can replace them with a space.

$myvar.replace("`n"," ")

Similarly, to replace newlines with a comma, just use a comma instead of the space. In the same way, you can substitute any other character you want.

$myvar.replace("`n",",")

Well, hold on. I believe you know that different operating systems treat the newline character differently. Classic Mac OS treats \r (carriage return) as a newline, Unix/Linux uses \n (line feed), and our favorite Windows uses the \r\n pair. So it is always wise to replace both \r and \n in the raw string to get rid of all newlines.

$myvar.replace("`n",",").replace("`r",",")

Similarly, if you want to replace tabs, use the same trick.

$myvar.replace("`t",",")
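If you prefer a single operation, the -replace operator takes a regular expression, so both styles of line ending can be handled in one pass. This is a minimal sketch; $myvar is assumed to hold the multi-line text.

```powershell
# Collapse Windows (\r\n), Unix (\n) and classic Mac (\r) line endings into a comma
$myvar -replace "`r`n|`n|`r", ","

# Tabs can be handled the same way
$myvar -replace "`t", ","
```

Note that .Replace() does a literal string replacement, while -replace is regex-based, so special characters in the pattern would need escaping.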

Hope this helps…


I have used the foreach loop statement and the ForEach-Object cmdlet (via piping) many times, but I never cared to check the difference between them until I noticed a bug in one of my scripts that affected the script flow.

When I refer to the foreach loop statement and the ForEach-Object cmdlet throughout this post, I mean the below.

Foreach loop statement

foreach($i in 1..10) {            
    Write-Host $i            
}

Foreach-Object

"a","b","c","d" | Foreach-Object {            
    Write-host $_            
}

Before jumping to the bug I noticed in my code, let me briefly explain the difference between the foreach loop statement and the ForEach-Object cmdlet that we generally use via the pipeline. If you look at the aliases for the ForEach-Object cmdlet, you will see both foreach and %. That means foreach is nothing but an alias for ForEach-Object.

But wait, there is a difference depending on where you use it. If you use foreach after a pipeline, it is treated as an alias for the ForEach-Object cmdlet.

When you use foreach as a looping statement, the shell recognizes it as a loop control, the way it works in any other programming language.

Difference#1

Now you might ask: what is the exact difference? I would first start with the reasons Bruce Payette gives in the book PowerShell in Action. ForEach-Object is more efficient in terms of memory utilization but slower, while the foreach loop statement is faster but may use more memory, depending on the objects you are looping through. This is easy to understand: ForEach-Object is used via the pipeline, so objects are passed to it one at a time and nothing needs to be stored when you are processing huge data sets. With the foreach loop statement, the data must first be collected in a variable that you then loop through, which requires space in memory.

Difference#2

Ok, now let me come to the difference I noticed. ForEach-Object doesn't support the loop control statements break and continue, but the foreach loop statement does.

Look at the example below. It should print the combinations 1 : a, 1 : b, 1 : d (skipping c), and similarly for the other numbers. But you will notice that after 1 : b it proceeds to 2 : a without processing 1 : d. That means a continue statement inside ForEach-Object reaches the foreach loop statement that encloses the ForEach-Object call.

foreach($i in 1..3) {
    "a","b","c","d" | ForEach-Object {
        if($_ -eq "c") {
            continue
        }
        Write-Host "$i : $_"
    }
}
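If you want continue-like behavior inside ForEach-Object, return is the usual workaround: it ends the script block for the current pipeline object only and moves on to the next one. A minimal sketch of the same example, fixed:

```powershell
foreach($i in 1..3) {
    "a","b","c","d" | ForEach-Object {
        if($_ -eq "c") {
            return   # skips only the current object, like continue in a real loop
        }
        Write-Host "$i : $_"
    }
}
```

This version prints 1 : a, 1 : b, 1 : d, then continues with 2 : a and so on, as originally intended.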

Conclusion

So, my conclusion is that Microsoft should have avoided this ambiguity from the beginning by naming the ForEach-Object cmdlet Each-Object or something similar that doesn't collide with the foreach loop statement. Where there is a need for looping, I suggest using the foreach loop statement instead of ForEach-Object, because the latter is not actually a loop but a cmdlet that processes one object at a time as it arrives via the pipeline.

Hope my understanding is clear here. If anyone thinks my assumptions are wrong or incorrectly interpreted, please feel free to comment. I am open to learning.


I worked on print server queue permissions delegation today and learned a few things. I was trying to grant a new group the ability to manage printers on a Windows Server 2008 print server, but after adding the group to the print server's security permissions, it was not propagated to the print queues on that server. I was surprised, because the other permissions defined at the print server level had propagated to the print queues and were visible on each printer.

After trying a few things, I figured out that there is no permissions inheritance from the print server to its print queues. Permissions you define on the print server object in the Print Management console will not propagate to the print queues under that server. The point to note, however, is that a print queue copies the ACLs from the print server at creation time. So if I add a new group to the print server object's security properties, it will be applied to new printers created thereafter, but you have to add the group manually on the existing printers.

I am not happy with this behaviour in the Print Management console. Ideally, print queues should inherit permissions from the print server object. I searched a bit on this behaviour but couldn't find any details.

While I was trying to determine why it behaves like this, I came across a few useful resources for delegating printer permissions at a granular level. You might want one group of admins to only clear/pause/resume the queues and another group to change printer properties as well. That kind of granular permission can be granted.

Refer to this link http://technet.microsoft.com/en-us/library/ee524015%28v=ws.10%29.aspx for details on how to grant granular permissions to manage printers.


In today's article, let us see how to grant a user account full NTFS permissions on a list of files using PowerShell.

We know that the permissions of a file or folder can be read using the Get-Acl cmdlet. When run against a file or folder, it lists the permissions like below.

Tip: You might have noticed that I select the AccessToString property to display the permissions in a readable manner. If you select the Access property alone, it returns ACL objects that don't make much sense unless you interpret them.
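For reference, this is roughly the command behind that tip; the path C:\local\me.txt is just the sample file used later in this post.

```powershell
# Show the ACL of a file in a human-readable form
(Get-Acl -Path C:\local\me.txt).AccessToString
```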

If we want to grant a new user (say, mydomain\testuser1) FullControl permissions on the file, we first have to create the access rule that we want to add.

$rule=new-object System.Security.AccessControl.FileSystemAccessRule ("mydomain\testuser1","FullControl","Allow")

It is as simple as the above command. You can change the user name and the type of permissions to customize your access rule. Once your rule is ready, you need to read the existing ACL of the file. You can do that with the Get-Acl cmdlet, like below.

$acl = Get-ACL c:\local\me.txt

Once we have the ACL of the file, it is time to update it with the new access rule we created. That is done by calling the SetAccessRule method, as shown below.

$acl.SetAccessRule($rule)

Now we have the updated ACL, and we should apply it back to the file. That is done with the Set-Acl cmdlet.

Set-ACL -Path C:\local\me.txt -AclObject $acl

Below is a small PowerShell function, built from the above code, that grants any given user/group full control on a list of files or folders.

function Grant-userFullRights {            
 [cmdletbinding()]            
 param(            
 [Parameter(Mandatory=$true)]            
 [string[]]$Files,            
 [Parameter(Mandatory=$true)]            
 [string]$UserName            
 )            
 $rule=new-object System.Security.AccessControl.FileSystemAccessRule ($UserName,"FullControl","Allow")            

 foreach($File in $Files) {            
  if(Test-Path $File) {            
   try {            
    $acl = Get-ACL -Path $File -ErrorAction stop            
    $acl.SetAccessRule($rule)            
    Set-ACL -Path $File -ACLObject $acl -ErrorAction stop            
    Write-Host "Successfully set permissions on $File"            
   } catch {            
    Write-Warning "$File : Failed to set perms. Details : $_"            
    Continue            
   }            
  } else {            
   Write-Warning "$File : No such file found"            
   Continue            
  }            
 }            
}
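Calling the function looks like this; the file paths and account names are just sample values from this post.

```powershell
# Grant full control on a single file
Grant-userFullRights -Files "C:\local\me.txt" -UserName "SITARAM\Administrator"

# Or on several files at once
Grant-userFullRights -Files "C:\local\a.txt","C:\local\b.txt" -UserName "mydomain\testuser1"
```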

Output:

In the above output, I added a local account (SITARAM\Administrator) to the permissions of the C:\local\me.txt file.

Hope this helps and happy reading.


As promised in my previous post, here is the script that our group developed during Singapore PowerShell Saturday #008 event.

This script relies on the System.IO.Compression.FileSystem assembly to read the contents of a ZIP archive without extracting it to disk. The advantage of this approach is that it consumes no extra disk space, because it reads the zip contents directly without extraction.

This script can take multiple ZIP files as input and returns the output as PSObjects. This format is helpful if you want to feed the output to some other script/function/cmdlet. It also supports exporting the output to CSV via the -ExportCSVFileName parameter; just give the CSV file path where you want to store it.

Note that this script works only if you have .NET Framework 4.5 or above, because the ZipFile class used to read the zip contents was introduced in v4.5. In my next article I will share another script that has no dependency on the .NET version; the only downside I noticed there so far is that it cannot determine the compressed and uncompressed file sizes.

Code

            
[cmdletbinding()]
param(
    [Parameter(Mandatory=$true)]
    [string[]]$FileName,
    [String]$ExportCSVFileName
)

# Exit if the shell is running on a lower version of dotnet
$dotnetversion = [Environment]::Version
if(!($dotnetversion.Major -ge 4 -and $dotnetversion.Build -ge 30319)) {
    Write-Error "Microsoft .NET Framework 4.5 or above is not installed. Script exiting"
    exit(1)
}

# Import dotnet libraries
[Void][Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem')

$ObjArray = @()
foreach($zipfile in $FileName) {
    if(Test-Path $zipfile) {
        $RawFiles = [IO.Compression.ZipFile]::OpenRead($zipfile).Entries
        foreach($RawFile in $RawFiles) {
            $Object = New-Object -TypeName PSObject
            $Object | Add-Member -MemberType NoteProperty -Name FileName -Value $RawFile.Name
            $Object | Add-Member -MemberType NoteProperty -Name FullPath -Value $RawFile.FullName
            $Object | Add-Member -MemberType NoteProperty -Name CompressedLengthInKB -Value ($RawFile.CompressedLength/1KB).ToString("00")
            $Object | Add-Member -MemberType NoteProperty -Name UnCompressedLengthInKB -Value ($RawFile.Length/1KB).ToString("00")
            $Object | Add-Member -MemberType NoteProperty -Name FileExtn -Value ([System.IO.Path]::GetExtension($RawFile.FullName))
            $Object | Add-Member -MemberType NoteProperty -Name ZipFileName -Value $zipfile
            $ObjArray += $Object
            if(!$ExportCSVFileName) {
                $Object
            }
        }
    } else {
        Write-Warning "$zipfile : File path not found"
    }
}

# Export once, after all zip files are processed
if($ExportCSVFileName) {
    try {
        $ObjArray | Export-CSV -Path $ExportCSVFileName -NoTypeInformation
    } catch {
        Write-Error "Failed to export the output to CSV. Details : $_"
    }
}

Most of the code is self-explanatory. If you have any questions about it, please feel free to ask. I am more than happy to help.

Usage:
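The usage screenshot didn't survive here, but invocation looks like the following; Get-ZipContents.ps1 is a hypothetical name, i.e. whatever file you saved the script as, and the zip paths are examples.

```powershell
# List the contents of two archives on screen
.\Get-ZipContents.ps1 -FileName C:\temp\logs.zip, C:\temp\backup.zip

# Or write the combined report to a CSV file instead
.\Get-ZipContents.ps1 -FileName C:\temp\logs.zip -ExportCSVFileName C:\temp\zipreport.csv
```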

Output

Hope this helps and Happy learning…


Singapore Powershell Saturday #008 and My experience

Introduction

Ever since I started working in Singapore, I have wanted to attend the Singapore PowerShell user group to interact with experts in the community. I finally made it to PowerShell Saturday #008 today, and it was an awesome experience.

What was covered

Attending these kinds of events motivates me to learn (and share) more and more. With that motivation, I made it to the venue on time. After registration, I shook hands with the other participants and we introduced ourselves. The session started with Milton Goh's presentation on improving the efficiency of operations teams. It's not a technical presentation, but it gives you an idea of why you really should care about using PowerShell in your environment.

After a small break, Matt Hitchcock presented on important aspects to keep in mind while designing a PowerShell solution. The points he mentioned are very useful and can be adapted to any other programming language. The one I liked most, and want every system admin to know, is "Do not over-engineer". Simply put: don't try to perform every activity with PowerShell when you have other smart non-PowerShell tools around that do the task easily. A classic example: if you want to deploy a wallpaper to all computers in your company, use GPOs rather than building a PowerShell script from scratch. Sounds right, doesn't it?

While we are on this over-engineering topic, I want to highlight a point for my blog readers. During my day-to-day activities, I come across lots of requirements to gather information or make changes to large systems. In such situations, I ask myself how I could do it with PowerShell. It's not that I want to implement every solution in PowerShell, but I am looking for ways to explore doing more things with it; that will definitely come in handy some time or other. I still want you to remember the ground rule (don't over-engineer), but explore how you can do things the PowerShell way. Taking the previous example: you should deploy the wallpaper using GPOs, but I want you to explore how to modify the group policy using PowerShell to set the wallpaper settings. If you ever need to deploy multiple wallpapers to multiple OUs, that is when you should definitely automate GPO modifications with PowerShell. Hope you got my point.

After the presentations we were hungry, and yes, it was lunch time. I had a nice meal while chatting with the other folks at the session.

What we practiced?

Now it was time for the practical. We were divided into two groups and seated in separate conference rooms. The task given to both groups was to develop a script to read the contents of a zip file. Folks in our team quickly searched the web and came up with a piece of .NET code to read the zip, and I quickly wrapped it in PowerShell in my regular style. In no time we had a script that can take multiple archive files as input and display what is inside them, along with properties of the files like actual size, compressed size, file extension, etc. I was happy that I could explain my part of the code to the other members of the team, and I hope they liked the way I explained it. The other team came up with similar functionality using COM objects, and it came out well too.

I will share the script that our group prepared in the next article, so stay tuned.

After completing the coding part, it was time to present what we did. The mentors gave me the opportunity to present, and I loved doing it. I hope the participants liked the way I presented it. The other group's presentation followed, and it was well applauded indeed.

At the end of the presentations, the mentors suggested a few enhancements to the code for better functionality and robustness. Of all of them, I liked Peter's (sorry, I forgot the full name) comment the most: if the user of your script has to make any changes to the code, then you haven't written the script the right way. Basically, he meant that nothing should be hardcoded in the script and everything should be parameterized. Thanks for the suggestion. I will remember it and implement it wherever possible.

Prizes

Now it was time to announce the prizes. I got a 1-year free subscription to Office 365 Home edition in the lucky draw, and also "The Phoenix Project" book. Look at the below tweet.

That ended the session; everyone looked excited to know how frequently these sessions would be conducted and asked a few other questions.

We all headed back home…

My sincere thanks to the organizers of this event; I look forward to participating in such events in the future as well.

Happy learning…


This morning I was working on PowerShell code that establishes a TCP connection to a remote server on a dynamic port. I had done some coding with the TcpClient class in the past (see Test Port connectivity using PowerShell), so I started with the below command. Here I am trying to connect to myhost.domain.com on port 80000; for some unknown reason the remote system uses a very high port for communication.

$Socket = New-Object Net.Sockets.TcpClient("myhost.domain.com",80000)

After executing the above command I got the error.

new-object : Exception calling ".ctor" with "2" argument(s): "Specified argument was out of the range of valid values. Parameter name: port"

The error message is quite straightforward: the port I gave is higher than what the TcpClient class can take. I did some research and found that 65535 (0x0000FFFF) is the highest port to which I can establish a connection using the TcpClient class.

Now I was stuck! Using .NET capabilities from PowerShell is the only way to achieve stuff like this, but I had hit a limitation. Obviously I could not ask the owner of the remote server to change their port number, so I started searching for alternatives.

One thing that came to my mind is port forwarding. What if one of my servers accepts the connection from my PowerShell script on a lower port and forwards it to the actual remote server on port 80000? I quickly searched the internet and figured out the below command.

netsh interface portproxy add v4tov4 listenport=9090 listenaddress=10.10.10.100 connectport=80000 connectaddress=10.10.10.241

After this, I changed my PowerShell code to connect to 10.10.10.100 on port 9090 instead of myhost.domain.com on port 80000. It worked like a champ: I was able to make a connection to the real target server and read/write data through that socket.
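For completeness, the revised connection would look roughly like this; the address and port are the sample values from the netsh command above.

```powershell
# Connect to the port proxy, which relays the traffic to the real server
$Socket = New-Object Net.Sockets.TcpClient("10.10.10.100", 9090)
$Socket.Connected   # True if the proxy accepted the connection
```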

Felt really wonderful after finding this and eager to share this with my blog readers. Hope you will find it useful.


In this post, I will demonstrate how to query the list of computers in a WSUS group using PowerShell.

Introduction

We all know that WSUS is a patching tool and that we can create custom groups inside it to manage approvals and organize computers better. When you have various groups in the WSUS console, how do you find out which computers are currently members of a given WSUS group? The PowerShell script described in this article will help you query this information.

Connect to WSUS server

First of all, we need to connect to the WSUS server from which you want to query the group information. The below code connects to a WSUS server, and $wsus stores a reference to it.

[void][reflection.assembly]::LoadWithPartialName("Microsoft.UpdateServices.Administration")            
$wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::getUpdateServer("SRVWSUS1",$False)

List all WSUS groups

Once connected to the server, we need to find out what groups are available on that particular WSUS server. This can be done by calling the GetComputerTargetGroups() method.

$wsus.GetComputerTargetGroups()

Get the WSUS group you want

Once you decide on the group you want to query, just filter the output of the above command. For example, if the group I am interested in is "America Computers", I can use something like the below to get it.

$mygroup = $wsus.GetComputerTargetGroups() | ? {$_.Name -eq "America Computers"}

List members of WSUS group

Once the group reference object is available, it is just a matter of querying the member names (aka targets) of the group by invoking the GetComputerTargets() method. The output contains a list of computer objects along with other properties like last sync time, last reported time, etc. The below command returns the FQDNs of all target computers.

$mygroup.GetComputerTargets() | select FullDomainName
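Putting the pieces together, the whole query can be sketched as below; SRVWSUS1 and the group name are the sample values used in this post.

```powershell
# Load the WSUS administration assembly and connect (default port, no SSL)
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.UpdateServices.Administration")
$wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer("SRVWSUS1", $False)

# Pick the group and list its members with their last sync time
$mygroup = $wsus.GetComputerTargetGroups() | Where-Object { $_.Name -eq "America Computers" }
$mygroup.GetComputerTargets() | Select-Object FullDomainName, LastSyncTime
```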

Hope this helps and happy learning.


In this post I will show you how to query Active Directory security group members and export them to CSV (or Excel) using PowerShell.

While there are a variety of ways to export group membership to Excel/CSV, the easiest method I found is the combination of the ActiveDirectory module cmdlets and the Export-CSV cmdlet. It exports the members' information to a CSV file, and it doesn't take much effort to convert that CSV into Excel format if you insist on XLS.

Since the code I am going to discuss uses the ActiveDirectory module, make sure you have at least one Domain Controller running the ADWS service; without it, the ActiveDirectory module will not work. ADWS comes by default with Domain Controllers running Windows Server 2008 R2 or above. For lower versions, you need to install it manually. You can download ADWS for Windows Server 2003 and Windows Server 2008 from http://www.microsoft.com/en-sg/download/details.aspx?id=2852

Ok, now let's jump to the coding section.

First of all, you need to import the ActiveDirectory module. This module is available only on Windows 7 and Windows Server 2008 R2 or above; it is not available on lower versions of the operating system.

Import-Module ActiveDirectory

Once the module is imported, we can query the group members using the Get-ADGroupMember cmdlet from this module.

For example,

Get-ADGroupMember -Id "Group Name"

This returns the list of objects that are members of the group; these can include user, computer, and group objects.

If you want to export this information to CSV, use the Export-CSV cmdlet.

Get-ADGroupMember -Id "Group Name" | Export-CSV c:\temp\GroupOutput.CSV -NoTypeInformation

The above command is self-explanatory. I used the -NoTypeInformation parameter of Export-CSV to suppress the type information of the objects it exports.
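A side note: Get-ADGroupMember returns only direct members by default. If you also want the members of nested groups, the cmdlet has a -Recursive switch, for example:

```powershell
# Expand nested groups and export every direct or indirect member
Get-ADGroupMember -Identity "Group Name" -Recursive |
    Export-CSV C:\temp\GroupOutput.CSV -NoTypeInformation
```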

If you want to export the group membership of multiple groups, try the below code.

Here the script reads the group list from the text file c:\temp\groups.txt and generates the output in c:\temp\GroupsInfo.CSV. The output has the group name in the first column.

$groups = Get-Content c:\temp\Groups.txt

foreach($Group in $Groups) {
    Get-ADGroupMember -Id $Group |
        Select-Object @{Expression={$Group};Label="Group Name"},* |
        Export-CSV c:\temp\GroupsInfo.CSV -NoTypeInformation -Append
}

Hope this helps and happy learning.


Whenever a mailbox is moved in an Exchange environment, a move request entry is created to track the progress of the move. The request is set to Completed once the move finishes. However, the entry is not deleted unless the Exchange admin removes it manually; if it is left behind, a later move attempt for the same mailbox fails, saying another request is already in the queue. So it is good practice to delete completed/failed requests, and there is no harm in removing them.

To view the list of move requests in completed state, try the below command from exchange shell.

Get-MoveRequest | ? {$_.Status -eq "Completed" }

If you want to view the move requests of a particular account, try the below. Here I am querying for the testuser2 mailbox.

Get-MoveRequest -Identity testuser2

Use the below command to clear all completed move requests in a single go. Note the Status filter; piping Get-MoveRequest straight into Remove-MoveRequest would remove every move request, including ones still in progress.

Get-MoveRequest | ? {$_.Status -eq "Completed"} | Remove-MoveRequest -Confirm:$false

Use the below command to clear the move requests of a given mailbox.

Get-MoveRequest -Identity testuser2 | Remove-MoveRequest -Confirm:$false

All these commands should be executed from the Exchange Management Shell.
