

A few months back, I needed to tell a bunch of machines to start a BITS transfer.  Sadly, BITS jobs cannot be started from a remote session.  So after trying a bunch of things, I came up with a workaround for getting the job done.

$ComputerName = "192.168.1.1"

$Session = New-PSSession -ComputerName $ComputerName -Credential (Get-Credential)

Invoke-Command -Session $Session -ScriptBlock {

    $start = (Get-Date).AddMinutes(1).ToString("HH:mm:ss")
    [string]$Result = schtasks /create /tn "Start Bits" /tr "$PSHome\powershell.exe -File C:\Start-Bits.ps1" /sc once /st $start /ru "username" /rp "password"

    $Result

}

While the transfer doesn’t start until one minute after you run the command, IT DOES START.  You can also replace -File with -Command when starting Powershell to do ad hoc stuff.
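The post doesn’t show the contents of C:\Start-Bits.ps1, so here is a minimal sketch of what such a script might contain.  The source URL, destination path, and display name are my own placeholders:

```powershell
# Hypothetical contents of C:\Start-Bits.ps1.  The original file is not
# shown in the post; the source, destination, and job name below are
# placeholders, not Shane's actual values.
Import-Module BitsTransfer

# Because the scheduled task runs in a local session, the BITS job starts
# without hitting the remote-session limitation described above.
Start-BitsTransfer -Source "http://server/files/payload.zip" `
                   -Destination "C:\Downloads\payload.zip" `
                   -DisplayName "Remote Started Transfer"
```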

Note:  This is a workaround and not an elegant solution.  You still won’t have access to Get-BitsTransfer.

-Shane

My boss came to me and wanted to clean up some of our SharePoint lists.  We have an “Archived” flag on all of our document libraries so we can create views that will filter out stuff that isn’t needed anymore.  The problem with this is that the list never shrinks.  I created a script for him to run that will download each of the files in the provided view and then delete them from SharePoint.  The ideal location to download the files to is somewhere that IT backs up regularly or a place that will be put into long-term storage.  As always, feedback is welcome.

Edit:  Added $WhatIf support

-Shane

[CmdletBinding()]
param(
    [Parameter(Mandatory=$True)]
    [string]$Site,
    [Parameter(Mandatory=$True)]
    [string]$ListName,
    [Parameter(Mandatory=$True)]
    [string]$ViewName,
    [Parameter(Mandatory=$True)]
    [string]$Destination,
    [Parameter(Mandatory=$True)]
    [string]$Username,
    [Parameter(Mandatory=$True)]
    [string]$Password,

    [switch]$WhatIf
)
 
$Pass = ConvertTo-SecureString -AsPlainText $Password -Force
$Cred = New-Object System.Management.Automation.PSCredential -ArgumentList $Username,$Pass
 
$proxy = New-WebServiceProxy -Uri "$Site/_vti_bin/Lists.asmx?WSDL" -Credential $Cred
$viewproxy = New-WebServiceProxy -Uri "$Site/_vti_bin/Views.asmx?WSDL" -Credential $Cred
$views = $viewproxy.GetViewCollection("$ListName")
$archiveView = $views.View | ? { $_.DisplayName -eq "$ViewName" }
 
$listitems = $proxy.GetListItems("$ListName",$($archiveView.Name),$null,$null,"50000",$null,"")
 
$WC = New-Object System.Net.WebClient
$Credential = New-Object System.Net.NetworkCredential -ArgumentList $Username,$Password
$WC.Credentials = $Credential
 
$listitems.data.row | % {

    Write-Host "Downloading $Site/$($_.ows_LinkFileName)"

    if($WhatIf -ne $True)
    {
        $WC.DownloadFile("$Site/$ListName/$($_.ows_LinkFileName)","$Destination\$($_.ows_LinkFileName)")
    }

    [xml]$delete = @"
<Batch OnError="Continue" PreCalc="TRUE"
ListVersion="0"
ViewName="$($archiveView.Name)">
   <Method ID="1" Cmd="Delete">
      <Field Name="ID">$($_.ows_ID)</Field>
      <Field Name="FileRef">$Site/$ListName/$($_.ows_LinkFileName)</Field>
   </Method>
</Batch>
"@

    Write-Host "Deleting $Site/$ListName/$($_.ows_LinkFileName) from SharePoint"

    if($WhatIf -ne $True)
    {
        $proxy.UpdateListItems("$ListName",$delete) | Out-Null
    }

}
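A sample invocation of the script above, assuming it is saved as Archive-ListItems.ps1.  The filename, site URL, paths, and credentials here are placeholders, not values from the post:

```powershell
# Dry run first: with -WhatIf the script prints what it would download and
# delete without touching the files or the list.
.\Archive-ListItems.ps1 -Site "http://sharepoint/sites/docs" `
    -ListName "Shared Documents" -ViewName "Archived" `
    -Destination "D:\Archive" `
    -Username "DOMAIN\svc_archive" -Password "P@ssw0rd" -WhatIf

# Once the output looks right, run it again without -WhatIf to archive for real.
```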

Put this together for a coworker to switch between Windows XP and Windows 7.  It was a “quickie” solution, so feedback is welcome to polish it up.

Note:  You will need to place a copy of bcdedit.exe in the C:\Windows\System32 directory of your XP instance.

-Shane

[CmdletBinding()]
param(
 
    [Parameter(Mandatory=$True)]
    [ValidateSet('WIN7','WINXP')]
    [string]$OperatingSystem
 
)
 
[System.Collections.ArrayList]$BCDInfo = . "$env:SystemRoot\system32\bcdedit.exe" /v
 
$IDIndex = $BCDInfo.IndexOf("Windows Boot Loader") + 2
 
$Win7GUID = ($BCDInfo[$IDIndex].Split(" "))[-1]
 
. "$env:SystemRoot\system32\bcdedit.exe" /set "{ntldr}" description "Windows XP Pro"
. "$env:SystemRoot\system32\bcdedit.exe" /set "$Win7GUID" description "Windows 7"
. "$env:SystemRoot\system32\bcdedit.exe" /timeout 5
 
switch ($OperatingSystem) {
    'WIN7'    {
    
        . "$env:SystemRoot\system32\bcdedit.exe" /default "$Win7GUID"
        
        . "$env:SystemRoot\system32\bcdedit.exe" /displayorder "$Win7GUID" "{ntldr}"
        
        for($i=10;$i -gt 0;$i--)
        {
            Write-Host "Restarting in $i"
            Sleep 1
        }
        
        Restart-Computer -Force
    
    }
    'WINXP'  { 
        
        . "$env:SystemRoot\system32\bcdedit.exe" /default "{ntldr}"
        
        . "$env:SystemRoot\system32\bcdedit.exe" /displayorder "{ntldr}" "$Win7GUID"
        
        for($i=10;$i -gt 0;$i--)
        {
            Write-Host "Restarting in $i"
            Sleep 1
        }
        
        Restart-Computer -Force
    
    }
}
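Assuming the script above is saved as Switch-OS.ps1 (the filename is mine, not from the post), switching operating systems is then a one-liner from an elevated prompt:

```powershell
# Rewrites the boot entries, makes Windows XP the default, and restarts
# the machine after the 10-second countdown built into the script.
.\Switch-OS.ps1 -OperatingSystem WINXP
```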

When I first came to the company I work for now, I was fairly new to the world of QA.  I had come from 5 years as an IT Administrator and a couple years before that as a Technical Support Manager.  The concept of testing software was brand new, as was something else I had never ventured into: scripting.

One of my first tasks as a QA Engineer was to write a system-wide test that would take our system back to front and run a battery of tests against it.  This included 3 database boxes, several middleware machines, a .NET web app, several supporting test machines, a network appliance, and a client machine to simulate traffic through the network appliance.  Once the test was written, it was my job to run it.  This became my primary role during a development cycle.  Take all the components, install the latest builds and make sure they play nice together.  The first couple times were fine.  Then one by one, other tasks were put on my plate.  Eventually, the test I designed that took several days to run manually was becoming a pain in my backside due to time constraints.  I approached my boss and suggested we automate it.  Once approved, I took to the task of finding out how the heck I was going to get 30+ machines to work in concert with one another while at the same time testing our product.

One of my coworkers, who some of you know as qa_warrior, is in charge of automated testing and the build system.  At the time it was a myriad of Python, VBScript, batch, Perl, C#, and who knows what else.  I thought to myself, my goodness, do I have to learn all this just to automate something?!?  There had to be a better way.  At my previous job, I had heard of this revolutionary new scripting language that was supposed to blow the socks off of everyone: Monad.  Worth a look, I thought.

One book (Powershell In Action), tons of blogs, and a couple of Channel 9 interviews with Jeffrey Snover later, I was foaming at the mouth, ready to bring this newfound gem to my workplace.  I approached qa_warrior about Powershell with reserved excitement since I was fairly new there.  His response was “Just what I need, another scripting language to learn”.  I handed him the Powershell In Action book and left it at that.

Days later, he comes back to me.  “This stuff is REALLY cool!  You can do this, and this, and that, and it ties into pretty much everything we have already done!”  He was sold.  Soon after that he went on vacation.  Upon his return, he brought back the start of our new automation framework.  It first started out as a simple xml file with data we may need during a test, a few libraries, some Cmdlets and some scripts that had useful functions in them to tie everything together.

Over the past year and a half, we have been expanding, refining and evangelizing what we lovingly call “FPTestLab”.  What started with qa_warrior and myself, is now being worked on and used by the entire QA department.  It is also being spread to other areas of the company like technical support, data services, development, and IT.  Powershell is the mantra for many in our organization.  (Of course there are those of us that live in the Linux world that swear by Perl.  Looking at you John 😉 )

Today FPTestLab is tens of thousands (probably approaching hundreds of thousands soon) of lines of code that builds our product, deploys a test environment (again 30+ machines) through VMWare and PowerCLI, tests our product (we have developed our own testset/testcase/step/assert framework), reports on the results of tests and moves those results into SharePoint on a nightly basis.  We have over 500 test cases written in pure Powershell that plug into FPTestLab.  This may not sound like much, but bringing disparate systems together like Quality Center, VMWare, SharePoint, and our own systems on a complex network was only possible due to Powershell and its ability to consume just about anything we throw at it.

With the release of Powershell Version 2, we have begun the process of moving our complex testing framework over to use the new remoting, background jobs, self documentation, events, proxy functions and modules.  We first tried to make this move with CTP1.  Yeah, I know.  We were told not to by the Powershell team, but the promise of remoting was just too good to leave alone.  Unfortunately, it was not mature enough to sustain the workload we were throwing at it, so we had to go with nSoftware’s Powershell Server.  Good product for simple V1 remoting.

Over the past two weeks I have taken my testset that was written in V1 and moved the code over to V2.  The first milestone was “Does it run without code change in V2?”  Yes.  Only one issue with ambiguous methods, which is a known issue in the release notes.  The next milestone was to convert the remoting from nSoftware to Powershell V2 remoting.  Today I checked in my last code changes and got a 100% successful pass with 100% Powershell remoting.  0 issues.  This was a good day.

To more directly answer Jeffrey Snover and his request for what we have tested and on what platforms, here is my best attempt at an answer.

Platforms

  • Windows Server 2003 SP2
  • SQL Server 2000/2005
  • Windows XP SP3
  • VMWare vSphere 4.0
  • SharePoint 2007
  • Quality Center 9.x
  • Cruise Control

Powershell Features

  • Cmdlets
  • Advanced Functions
  • Inline documentation
  • Proxy functions
  • Remoting – I can’t begin to explain how much we use this.  24×7 traffic between boxes. Connections up for up to 16 hours sometimes.
  • Global functions, PSObjects, NoteProperties, and ScriptBlocks/Nested ScriptBlocks HEAVILY.
  • Cross domain authentication
  • HEAVY xml use
  • Tons of web requests through system.web.httprequest .NET objects
  • REST based web service calls
  • HEAVY PowerCLI use
  • Tons of in-house .NET libraries wrapped with Powershell
  • Decent WMI use
  • Use of sqlcmd for running .sql files and lots of calls through ADO.NET for straight queries
  • Some use of direct win32 calls through Powershell and PInvoke.
  • Calls to the SharePoint web services API
  • HEAVY use of global vs. script vs. local scope variables
  • Hashtables everywhere
  • MD5 hashing then web requests of URLs (This one drove one of our engineers crazy)

I’m sure I am forgetting stuff but it’s definitely something we are proud of.  Version 2 is opening up tons of new possibilities for us and so far has been rock solid with everything listed above on the XP/V2 combo.

Great work and thank you to the Powershell Team.  You have made our jobs tremendously more efficient and I would gladly put my badge on the table again for Powershell given the chance.

-Shane

Now that SharePoint 2010 has some details released, I have been looking around the web for information about what is coming.  One of the areas that interests me the most is what improvements are coming in the wiki features.  After using other wikis out there, I am accustomed to a certain level of functionality.  There are a couple of things missing in 2007 that I hope will make it into SharePoint 2010.
 
  1. Jump to anchor
    1. This makes large wiki pages usable and allows users to quickly find what they are looking for.  If you have a table that is quite large and broken up into sections, it helps to have a section listing at the top of the page and be able to jump down to a section by clicking on the link.  This could also be expanded as a way to autogenerate a TOC at the top of the page based on the anchor tags found throughout the page.
  2. Templates
    1. If you plan on creating a culture around using a wiki, you MUST have templates.  It really takes the wind out of someone’s sails when they have created a really nice looking page that could be used over and over, only to find out the sole way to duplicate this work is to copy and paste the code.  Being able to choose from a library of templates when creating a new page saves a lot of time and allows a company to come up with a standard for how a particular type of page is supposed to look.
  3. Ability to edit sections
    1. There comes a time when you will need to add more than one section to a wiki page.  Anytime you create a new header you are telling the user, this is a new section of text that stands alone.  In large pages, if you have to deal with editing the entire page to get to the section you want, it becomes cumbersome and confusing due to the amount of text you must sift through.  If you can edit just the section you are interested in, it helps the user focus on the changes at hand and not worry about the wall of text in front of them.
  4. Discussion for each page
    1. When users make changes to a wiki page, there may be questions about the edit or challenges to the content.  You don’t want to put this in the wiki content page itself, and it’s sometimes inconvenient to email or IM someone to resolve the changes.  To foster communication around each page in the wiki, it’s best to have a place to discuss the changes right at the user’s fingertips while on the page.  The Community Kit for SharePoint 2007 was a step in the right direction, but this really should be built into the product.
  5. Markup
    1. While having a WYSIWYG editor is nice, being able to quickly type something out can be just as useful.  I think it was said that this might not make it into 2010, but unless your editor is top notch (the 2007 editor is far from top notch), you really should support the ability to use and/or define a markup language.

I hope I have provided some useful feedback to the powers that be and hope to see some of the changes outlined here.

Shane

 

ConvertFrom-Markdown

A colleague at work came to me with a new idea for creating html documentation using lightweight text files and a simple markup language.  The project was called Markdown by John Gruber.

This was fine and all but I have been trying to find a way to do everything I come across in Powershell just to get practice in solving different problems using the language.

I quickly found a project that ported the Perl based project to .NET by Milan Negovan called Markdown.NET.  Using the binary from this project I simply created a wrapper around the core method, gave it some advanced function documentation and gave credit to the two parties involved.
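The wrapper itself isn’t shown in the post, but a minimal sketch of the idea might look like the following.  The DLL path, the anrControls namespace, and the Transform method name are my assumptions about the Markdown.NET binary, so verify them against the actual release:

```powershell
# Sketch of a ConvertFrom-Markdown advanced function wrapping the
# Markdown.NET binary.  The assembly path and class name are assumptions.
function ConvertFrom-Markdown {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$True, ValueFromPipeline=$True)]
        [string]$Text
    )
    begin {
        # Load the Markdown.NET assembly (adjust the path to your module folder)
        Add-Type -Path "C:\Modules\ConvertFrom-Markdown\Markdown.dll"
        $converter = New-Object anrControls.Markdown
    }
    process {
        # Transform() returns the HTML rendering of the markdown input
        $converter.Transform($Text)
    }
}
```

Usage would then be a simple pipeline, e.g. `Get-Content .\readme.txt | ConvertFrom-Markdown | Out-File .\readme.html`.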

The next step would be to port it over to purely Powershell but I really don’t feel like reinventing the wheel today 😉

The module .zip can be found here.  It requires CTP3 of Powershell V2.  Try it out.  If you find any bugs, have any comments or suggestions, please let me know.

-Shane

So far so good with Powershell Server from nSoftware.  Found a couple issues but for the most part it is handling just about everything we throw at it.  We rely heavily on remote runspaces and the performance seems to be better than remoting in CTP1 of Powershell.
 
The really nice thing about Powershell Server that I have been enjoying is the event logs.  Every command that is run in a remote runspace is logged which makes troubleshooting and debugging much easier.  We had a script that was throwing a strange error and when I went to check what was being run on the remote side I noticed that a variable wasn’t making it to the other side.  Come to find out the string being sent over was double quoted without escaping the variable by mistake.
 
Overall, I am pleased with Powershell Server.  Good features.  Good price.  Support has been responsive.  It is looking like this will get us by until V2 of Powershell goes gold.
 
Part 3 I will attempt to use CTP3 of Powershell over Powershell Server to overcome the lack of support for remoting on Windows XP and Windows Server 2003.
 
-Shane

On TweetGrid I noticed Jeffrey Snover ask someone to blog about what they would have liked to see in the latest drop of the Powershell ISE.  He didn’t ask me, but I’ll blog about it anyway. 😉

Wishlist:

  • Expand/Collapse of:
    • Comments
    • Functions
    • Regions
    • All
  • Import/Export/Customization of Environment Settings (much like Visual Studio)
    • I really really really like to code in this environment
    • Use Visual Studio format so they are interchangeable.
  • Indication of Opening/Closing Brackets,Parenthesis,Scriptblocks,etc
    • When the cursor is next to the opening, the closing is also highlighted.
  • Full XML editor support (Pulled from MSDN)
    • Design time well-formedness and validation errors.
    • Context-sensitive Intellisense.
    • Validation support for Schema, DTD, and XDR.
  • Intellisense, Intellisense, Intellisense
    • I think we all know what’s needed here.  Intellisense on everything.  The more the merrier.
  • Dockable panels
    • Current Runspace Variables
    • Properties

-Shane

In the previous blog, we had some issues with CTP1 and remote runspace stability.  While looking for alternatives, I remembered nSoftware’s Powershell Server.  After reading the info on their website, I decided to download it and try it out.

First off.  NetCmdlets CANNOT create runspaces on remote machines without Powershell Server.  The website is a bit misleading since they list the Cmdlets to connect to an SSH Runspace, but if the SSH Runspace service is not running on the remote machine, you get no joy.  I will give them the fact that they do say in their New-SSHRunspace description that you need Powershell Server.

Now the install.  Installation was a snap for both NetCmdlets and Powershell Server.  Powershell Server has a nice option to run as a service so it’s waiting for connections at startup.  I tried to set the service to interact with the desktop but the service won’t start once that option is checked.  Was worth a try.

I noticed some differences between the two products in that they return different object types depending on which Cmdlet is used to create the runspace.  New-SSHRunspace is one Cmdlet.

[screenshot: New-SSHRunspace output]

New-PowershellServerRunspace is the other Cmdlet.

[screenshot: New-PowershellServerRunspace output]

The first type of runspace (SSHRunspace) seems to return objects as their own homegrown ShellObject.

[screenshot: ShellObject results from the SSH runspace]

The next runspace type (PowershellServer) is more familiar to us, as it returns the process objects as they would normally appear in the Powershell Host.

[screenshot: standard process objects from the Powershell Server runspace]

As you can see, there are subtle differences between the two runspace Cmdlets but overall, this product is able to do everything I’ve thrown at it thus far.  More testing to come.

-Shane

WinRS and Early Bits

Played with WinRS and Powershell today.  Got most of it working but had some problems with authentication and remote machines.  I thought I was giving it the right syntax but it didn’t agree with me.  Gonna have to play with it some more soon.
 
Ok.  Now the reason I am playing with WinRS.  At work we decided to give Powershell V2 CTP1 a test drive.  Well, that test drive turned into "Wow, this thing is so cool we should create a framework around it!".  Did I not learn anything from Episode 21 of the Powerscripting Podcast about banking on early bits?  Totally the wrong move, and now there is a small issue that has come up that is keeping our automation framework from being rock solid.  For some reason Powershell crashes randomly when executing a remote command.  Not sure yet which command(s), but I think it’s something to do with the Write-Host Cmdlet, and it’s really the only thing holding up my project right now.  Note to self:  Listen to Jeffrey when he speaks.
 
-Shane