
A few months back, I needed to tell a bunch of machines to start a BITS transfer.  Sadly, BITS reported that it cannot be started remotely.  So after trying a bunch of stuff, I came up with a workaround for getting the job done.

$ComputerName = "192.168.1.1"

$Session = New-PSSession -ComputerName $ComputerName -Credential (Get-Credential)

Invoke-Command -Session $Session -ScriptBlock {

    $start = (Get-Date).AddMinutes(1).ToString("HH:mm:ss")
    [string]$Result = schtasks /create /tn "Start Bits" /tr "$PSHome\powershell.exe -File C:\Start-Bits.ps1" /sc once /st $start /ru "username" /rp "password"

    $Result

}

While the transfer doesn't start until one minute after you run the command, IT DOES START.  You can also replace -File with -Command when starting Powershell to do ad hoc stuff.

Note:  This is a workaround and not an elegant solution.  You still won’t have access to Get-BitsTransfer.
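For completeness, here is a minimal sketch of what C:\Start-Bits.ps1 might contain.  The source URL, destination path, and display name are placeholders of mine, not from the original setup:

```powershell
# Hypothetical Start-Bits.ps1 -- runs locally under the scheduled task,
# so the BITS job is created in a real local session, not a remote one.
Import-Module BitsTransfer   # PowerShell V2 ships the BITS cmdlets as a module

# Placeholders: swap in your real source and destination.
Start-BitsTransfer -Source "http://server/files/payload.zip" `
                   -Destination "C:\Downloads\payload.zip" `
                   -DisplayName "Remote BITS Kickoff"
```

A synchronous transfer like this keeps the script simple; an -Asynchronous job would also need a later Complete-BitsTransfer to finalize it.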

-Shane


My boss came to me and wanted to clean up some of our SharePoint lists.  We have an "Archived" flag on all of our document libraries so we can create views that will filter out stuff that isn't needed anymore.  The problem with this is the list will never shrink.  I created a script for him to run that will download each of the files in the provided view and then delete them from SharePoint.  The ideal location to download the files to is somewhere that IT will back up regularly or a place that will be put into long-term storage.  As always, feedback welcome.

Edit:  Added $WhatIf support

-Shane

[CmdletBinding()]
param(
    [Parameter(Mandatory=$True)]
    [string]$Site,
    [Parameter(Mandatory=$True)]
    [string]$ListName,
    [Parameter(Mandatory=$True)]
    [string]$ViewName,
    [Parameter(Mandatory=$True)]
    [string]$Destination,
    [Parameter(Mandatory=$True)]
    [string]$Username,
    [Parameter(Mandatory=$True)]
    [string]$Password,

    [switch]$WhatIf
)
 
$Pass = ConvertTo-SecureString -AsPlainText $Password -Force
$Cred = New-Object System.Management.Automation.PSCredential -ArgumentList $Username,$Pass
 
$proxy = New-WebServiceProxy -Uri "$Site/_vti_bin/Lists.asmx?WSDL" -Credential $Cred
$viewproxy = New-WebServiceProxy -Uri "$Site/_vti_bin/Views.asmx?WSDL" -Credential $Cred
$views = $viewproxy.GetViewCollection("$ListName")
$archiveView = $views.View | ? { $_.DisplayName -eq "$ViewName" }
 
$listitems = $proxy.GetListItems("$ListName",$($archiveView.Name),$null,$null,"50000",$null,"")
 
$WC = New-Object System.Net.WebClient
$Credential = New-Object System.Net.NetworkCredential -ArgumentList $Username,$Password
$WC.Credentials = $Credential
 
$listitems.data.row | % {

    Write-Host "Downloading $Site/$($_.ows_LinkFileName)"

    if($WhatIf -ne $True)
    {
        $WC.DownloadFile("$Site/$ListName/$($_.ows_LinkFileName)","$Destination\$($_.ows_LinkFileName)")
    }

    [xml]$delete = @"
<Batch OnError="Continue" PreCalc="TRUE"
ListVersion="0"
ViewName="$($archiveView.Name)">
   <Method ID="1" Cmd="Delete">
      <Field Name="ID">$($_.ows_ID)</Field>
      <Field Name="FileRef">$Site/$ListName/$($_.ows_LinkFileName)</Field>
   </Method>
</Batch>
"@

    Write-Host "Deleting $Site/$ListName/$($_.ows_LinkFileName) from SharePoint"

    if($WhatIf -ne $True)
    {
        $proxy.UpdateListItems("$ListName",$delete) | Out-Null
    }

}
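Assuming the script above is saved as Download-ArchivedFiles.ps1 (the file name and all parameter values here are mine, for illustration), a dry run with -WhatIf looks like this:

```powershell
# Hypothetical invocation -- adjust every value to your environment.
.\Download-ArchivedFiles.ps1 -Site "http://sharepoint/sites/docs" `
    -ListName "Shared Documents" -ViewName "Archived" `
    -Destination "D:\Archive" -Username "DOMAIN\user" -Password "pass" -WhatIf
```

With -WhatIf the script only prints what it would download and delete; drop the switch to actually move the files.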

Put this together for a coworker to switch between Windows XP and Windows 7.  It was a quick solution, so feedback is welcome to polish it up.

Note:  You will need to place a copy of bcdedit.exe in the C:\Windows\System32 directory of your XP instance.

-Shane

[CmdletBinding()]
param(
 
    [Parameter(Mandatory=$True)]
    [ValidateSet('WIN7','WINXP')]
    [string]$OperatingSystem
 
)
 
[System.Collections.ArrayList]$BCDInfo = . "$env:SystemRoot\system32\bcdedit.exe" /v
 
$IDIndex = $BCDInfo.IndexOf("Windows Boot Loader") + 2
 
$Win7GUID = ($BCDInfo[$IDIndex].Split(" "))[-1]
 
. "$env:SystemRoot\system32\bcdedit.exe" /set "{ntldr}" description "Windows XP Pro"
. "$env:SystemRoot\system32\bcdedit.exe" /set "$Win7GUID" description "Windows 7"
. "$env:SystemRoot\system32\bcdedit.exe" /timeout 5
 
switch ($OperatingSystem) {
    'WIN7'    {
    
        . "$env:SystemRoot\system32\bcdedit.exe" /default "$Win7GUID"
        
        . "$env:SystemRoot\system32\bcdedit.exe" /displayorder "$Win7GUID" "{ntldr}"
        
        for($i=10;$i -gt 0;$i--)
        {
            Write-Host "Restarting in $i"
            Sleep 1
        }
        
        Restart-Computer -Force
    
    }
    'WINXP'  { 
        
        . "$env:SystemRoot\system32\bcdedit.exe" /default "{ntldr}"
        
        . "$env:SystemRoot\system32\bcdedit.exe" /displayorder "{ntldr}" "$Win7GUID"
        
        for($i=10;$i -gt 0;$i--)
        {
            Write-Host "Restarting in $i"
            Sleep 1
        }
        
        Restart-Computer -Force
    
    }
}
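The GUID-parsing step near the top of the script can be exercised against captured bcdedit /v output without touching the boot store.  The sample lines below are illustrative, not from a real machine:

```powershell
# Simulate captured "bcdedit /v" output; the GUID is a made-up placeholder.
[System.Collections.ArrayList]$BCDInfo = @(
    "Windows Boot Manager",
    "--------------------",
    "identifier              {9dea862c-5cdd-4e70-acc1-f32b344d4795}",
    "",
    "Windows Boot Loader",
    "-------------------",
    "identifier              {deadbeef-0000-0000-0000-000000000000}"
)

# Same parsing logic as the script: the identifier line sits two lines
# below the "Windows Boot Loader" header, and the GUID is the last token.
$IDIndex  = $BCDInfo.IndexOf("Windows Boot Loader") + 2
$Win7GUID = ($BCDInfo[$IDIndex].Split(" "))[-1]

$Win7GUID   # -> {deadbeef-0000-0000-0000-000000000000}
```

Note the logic assumes /v output, where identifiers are shown as full GUIDs rather than well-known names like {current}.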
I will be presenting a talk this year at the Silicon Valley Code Camp 2010.  The session will be on Powershell Version 2 modules and how to build and work with them.  Come and learn the new way to automate and manage just about anything using Microsoft's new shell.
 
 

Code Camp is at Foothill College.  See the Silicon Valley Code Camp site for details and registration.

 
qawarrior, a co-worker of mine, is also presenting a session on Advanced Scripting for Powershell.  See it here:  http://nullreferenceproject.blogspot.com/2010/05/5th-silicon-valley-code-camp-oct-9th.html
 
 
-Shane
Just played with this a bit.  [ref] parameters work like they do in normal Powershell V1 functions, but I thought I would give an example using V2 advanced functions.
 

function Test {

    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$true)]
        [ref]$var
    )
    #Note the $var.Value.  $var = 5 will not work; it replaces the reference itself.
    $var.Value = 5

}

PS C:\Users\spowser> $var = 1
PS C:\Users\spowser> Test -var ([ref]$var)
PS C:\Users\spowser> $var
5

-Shane

 

Ran into a case today where I needed to know if -Verbose had been passed in as a parameter to an advanced function.  I initially thought I could simply do the following:

function Test
{

    [CmdletBinding()]
    param()
    if($Verbose)
    {
        $True
    }
    else
    {
        $False
    }
}

PS C:\Users\spowser> Test -Verbose
False

Hmmmm.  OK.  Powershell doesn't think -Verbose was used.  There has to be a way to get to the bound parameters.

Found this as the solution.

function Test
{

    [CmdletBinding()]
    param()
    if($PSBoundParameters["Verbose"].IsPresent)
    {
        $True
    }
    else
    {
        $False
    }
}

PS C:\Users\spowser> Test -Verbose
True

Much better 🙂
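One caveat: indexing $PSBoundParameters for a key that was never bound returns $null, so the check above only works because $null.IsPresent quietly evaluates to $null (which is falsy).  A variant that asks the question directly is ContainsKey:

```powershell
function Test
{
    [CmdletBinding()]
    param()
    # ContainsKey never dereferences a missing entry, so there is no
    # reliance on $null.IsPresent behaving as a falsy value.
    $PSBoundParameters.ContainsKey("Verbose")
}

Test -Verbose   # -> True
Test            # -> False
```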

Hope this helps.

-Shane

WSUS Client Issues

 
Ran into the following error today on a Windows XP SP3 machine when trying to get Windows updates working.
 
"The Background Intelligent Transfer Service service terminated with service-specific error 2147500053 (0x80004015)."
 
Created the following script to fix it.
 
 
-Shane

When I first came to the company I work for now, I was fairly new to the world of QA.  I had come from 5 years as an IT Administrator and a couple of years before that as a Technical Support Manager.  The concept of testing software was brand new, as was something else I had never ventured into: scripting.

One of my first tasks as a QA Engineer was to write a system-wide test that would take our system back to front and run a battery of tests against it.  This included 3 database boxes, several middleware machines, a .NET web app, several supporting test machines, a network appliance, and a client machine to simulate traffic through the network appliance.  Once the test was written, it was my job to run it.  This became my primary role during a development cycle.  Take all the components, install the latest builds and make sure they play nice together.  The first couple of times were fine.  Then one by one, other tasks were put on my plate.  Eventually, the test I designed that took several days to run manually was becoming a pain in my backside due to time constraints.  I approached my boss and suggested we automate it.  Once approved, I took to the task of finding out how the heck I was going to get 30+ machines to work in concert with one another while at the same time test our product.

One of my coworkers, who some of you know as qa_warrior, is in charge of automated testing and the build system.  At the time it was a myriad of Python, VBScript, batch, Perl, C#, and who knows what else.  I thought to myself, my goodness, do I have to learn all this just to automate something?!?  There has to be a better way.  At my previous job, I had heard of this revolutionary new scripting language that was supposed to blow the socks off of everyone.  Monad.  Worth a look, I thought.

One book (Powershell In Action), tons of blogs, and a couple of Channel 9 interviews with Jeffrey Snover later, I was foaming at the mouth ready to bring this new found gem to my workplace.  I approached qa_warrior with Powershell with reserved excitement since I was fairly new there.  His response was “Just what I need, another scripting language to learn”.  I handed him the Powershell In Action book and left it at that.

Days later, he comes back to me.  “This stuff is REALLY cool!  You can do this, and this, and that, and it ties into pretty much everything we have already done!”  He was sold.  Soon after that he went on vacation.  Upon his return, he brought back the start of our new automation framework.  It first started out as a simple xml file with data we may need during a test, a few libraries, some Cmdlets and some scripts that had useful functions in them to tie everything together.

Over the past year and a half, we have been expanding, refining and evangelizing what we lovingly call “FPTestLab”.  What started with qa_warrior and myself, is now being worked on and used by the entire QA department.  It is also being spread to other areas of the company like technical support, data services, development, and IT.  Powershell is the mantra for many in our organization.  (Of course there are those of us that live in the Linux world that swear by Perl.  Looking at you John 😉 )

Today FPTestLab is tens of thousands (probably approaching hundreds of thousands soon) of lines of code that builds our product, deploys a test environment (again 30+ machines) through VMWare and PowerCLI, tests our product (we have developed our own testset/testcase/step/assert framework), reports on the results of tests and moves those results into SharePoint on a nightly basis.  We have over 500 test cases written in pure Powershell that plug into FPTestLab.  This may not sound like much but bringing disparate systems together like Quality Center, VMWare, SharePoint, and our own systems on a complex network was only possible due to Powershell and its ability to consume just about anything we throw at it.

With the release of Powershell Version 2, we have begun the process of moving our complex testing framework over to use the new remoting, background jobs, self documentation, events, proxy functions and modules.  We first tried to make this move with CTP1.  Yeah, I know.  We were told not to by the Powershell team, but the promise of remoting was just too good to leave alone.  Unfortunately, it was not mature enough to sustain the workload we were throwing at it, so we had to go with nSoftware's Powershell Server.  Good product for simple V1 remoting.

Over the past two weeks I have taken my testset that was written in V1 and moved the code over to V2.  The first milestone was "Does it run without code change in V2?"  Yes.  Only one issue, with ambiguous methods, which is a known issue in the release notes.  The next milestone was converting the remoting from nSoftware to Powershell V2 remoting.  Today I checked in my last code changes and got a 100% successful pass with 100% Powershell remoting.  Zero issues.  This was a good day.

To more directly answer Jeffrey Snover's request for what we have tested and on what platforms, here is my best attempt at an answer.

Platforms

  • Windows Server 2003 SP2
  • SQL Server 2000/2005
  • Windows XP SP3
  • VMWare vSphere 4.0
  • SharePoint 2007
  • Quality Center 9.x
  • Cruise Control

Powershell Features

  • Cmdlets
  • Advanced Functions
  • Inline documentation
  • Proxy functions
  • Remoting – I can’t begin to explain how much we use this.  24×7 traffic between boxes. Connections up for up to 16 hours sometimes.
  • Global functions, PSObjects, NoteProperties, and ScriptBlocks/Nested ScriptBlocks HEAVILY.
  • Cross domain authentication
  • HEAVY xml use
  • Tons of web requests through system.web.httprequest .NET objects
  • REST based web service calls
  • HEAVY PowerCLI use
  • Tons of in-house .NET libraries wrapped with Powershell
  • Decent WMI use
  • Use of sqlcmd for running .sql files and lots of calls through ADO.NET for straight queries
  • Some use of direct win32 calls through Powershell and PInvoke.
  • Calls to the SharePoint web services API
  • HEAVY use of global vs. script vs. local scope variables
  • Hashtables everywhere
  • MD5 hashing then web requests of URLs (This one drove one of our engineers crazy)

I’m sure I am forgetting stuff but it’s definitely something we are proud of.  Version 2 is opening up tons of new possibilities for us and so far has been rock solid with everything listed above on the XP/V2 combo.

Great work and thank you to the Powershell Team.  You have made our jobs tremendously more efficient and I would gladly put my badge on the table again for Powershell given the chance.

-Shane

Now that SharePoint 2010 has some details released, I have been looking around the web for information about what is coming.  One of the areas that interests me the most is what improvements are coming in the wiki features.  After using other wikis out there, I am accustomed to a certain level of functionality.  There are a couple of things missing in 2007 that I hope will make it into SharePoint 2010.
 
  1. Jump to anchor
    1. This makes large wiki pages usable and allows users to quickly find what they are looking for.  If you have a table that is quite large and broken up into sections, it helps to have a section listing at the top of the page and be able to jump down to the section by clicking on the link.  This could also be expanded as a way to autogenerate a TOC at the top of the page based on the anchor tags found throughout the page.
  2. Templates
    1. If you plan on creating a culture around using a wiki, you MUST have templates.  It really takes the wind out of someone's sails when they have created a really nice-looking page that could be used over and over, only to find out that the only way to duplicate the work is to copy and paste the code.  Being able to choose from a library of templates when creating a new page saves a lot of time and allows a company to come up with a standard for how a particular type of page is supposed to look.
  3. Ability to edit sections
    1. There comes a time when you will need more than one section on a wiki page.  Any time you create a new header, you are telling the user that this is a new section of text that stands alone.  On large pages, if you have to edit the entire page to get to the section you want, it becomes cumbersome and confusing due to the amount of text you must sift through.  If you can edit just the section you are interested in, it helps the user focus on the changes at hand and not worry about the wall of text in front of them.
  4. Discussion for each page
    1. When users make changes to a wiki page, there may be questions about the edit or challenges to the content.  You don't want to put this in the wiki content page itself, and it's sometimes inconvenient to email or IM someone to resolve the changes.  To foster communication around each page in the wiki, it's best to have a place to discuss the changes right at the user's fingertips while on the page.  The Community Kit for SharePoint 2007 was a step in the right direction, but this really should be built into the product.
  5. Markup
    1. While having a WYSIWYG editor is nice, being able to quickly type something out can be just as useful.  I think it was said that this might not make it into 2010, but unless your editor is top notch (the 2007 editor is far from top notch), you should support the ability to use and/or define a markup language.

I hope I have provided some useful feedback to the powers that be and hope to see some of the changes outlined here.

Shane

 

Powershell REST Client

The past couple of days I have been working on a Powershell-based client for our REST web services.  While the client is specific to our business, the core logic behind it should apply to just about all REST-based web services.  I have tried to generalize (read: remove company stuff) the code, so hopefully everything still makes sense.

In our implementation of REST, we allow a person to either submit a GET request with query string parameters or attach some XML to the request and submit as a POST.  Our response will always be some XML.  We handle it with 3 XML root types: Success, Error, and Data.  Success just lets the user know a particular POST was successful.  Error will give the user back an error code with a description.  Data will return an XML document with whatever data the user requested.  Eventually, I would think, a Success will also return the changes made, but hey, this is version 1.
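Given those three root types, dispatching on the response might look like this sketch.  The function name and any element details beyond the Success/Error/Data roots are my guesses, not the real schema:

```powershell
function Read-APIResponse {
    param([xml]$Response)

    # Dispatch on the name of the document's root element.
    switch ($Response.DocumentElement.Name) {
        "Success" { $true }
        "Error"   { throw "API error: $($Response.Error.InnerText)" }
        "Data"    { $Response.Data }   # hand the payload down the pipeline
        default   { throw "Unexpected root: $($Response.DocumentElement.Name)" }
    }
}

Read-APIResponse -Response ([xml]"<Success/>")   # -> True
```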

The core of each GET request is just making the request with the query strings defined.  The declaration of the core function with the parameters section is:

function Invoke-APICall {
    param(
        [string]$Username,
        [string]$Password,
        [string]$UserAgent = "Powershell API Client",
        [string]$URL,
        [xml]$XMLObject
        )

 

#Create a URI instance since the HttpWebRequest.Create method will escape the URL by default.
$URI = New-Object System.Uri($URL,$true)

#Create a request object using the URI
$request = [System.Net.HttpWebRequest]::Create($URI)

#Build up a nice User Agent
$request.UserAgent = $(
    "{0} (PowerShell {1}; .NET CLR {2}; {3})" -f $UserAgent,
    $(if($Host.Version){$Host.Version}else{"1.0"}),
    [Environment]::Version,
    [Environment]::OSVersion.ToString().Replace("Microsoft Windows ", "Win")
)

#Establish the credentials for the request
$creds = New-Object System.Net.NetworkCredential($Username,$Password)
$request.Credentials = $creds

$response = $request.GetResponse()

$reader = [IO.StreamReader] $response.GetResponseStream()

#Our response will always be xml except in the 404/401 case, so cast as such
[xml]$responseXML = $reader.ReadToEnd()

$reader.Close()

#Let others down the pipeline have fun with our xml object
Write-Output $responseXML

$response.Close()

 

To decide if this is a POST I use the following: 

if($XMLObject){ Do the POST stuff}else{Do the GET stuff}

And now the POST specific code.

$creds = New-Object System.Net.NetworkCredential($Username,$Password)
$request.Credentials = $creds

#Since this is a POST we need to set the method type
$request.Method = "POST"

#Set the Content Type as text/xml since the content will be a block of xml.
$request.ContentType = "text/xml"

#Create a new stream writer to write the xml to the request stream.
$stream = New-Object IO.StreamWriter $request.GetRequestStream()
$stream.AutoFlush = $True
$stream.Write($($XMLObject.psbase.OuterXML),0,$($XMLObject.psbase.OuterXml.Length))
$stream.Close()

#Make the request and get the response
$response = $request.GetResponse()

#Create a stream reader to read the response stream.
$reader = New-Object IO.StreamReader $response.GetResponseStream()

#Read the response and cast the response to XML
[xml]$responseXML = $reader.ReadToEnd()

#Dump the XML out to the pipeline for others to consume
Write-Output $responseXML

$response.Close()

 

There was one thing in the POST that I had trouble with at first.  When trying to execute the $stream.Write method, if I had set the $request.ContentLength property with $request.ContentLength = $XMLObject.psbase.OuterXML.Length, it would fail, stating the bytes being written were longer than the bytes specified.  So I removed the code to set $request.ContentLength and everything seemed to work fine.  Not sure how Joel Bennett got that to work, but it drove me nuts for a while.
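A likely explanation for that failure: ContentLength expects a byte count, while String.Length is a character count, and the StreamWriter encodes the XML (UTF-8 by default for streams), so the two diverge as soon as the document contains any non-ASCII characters:

```powershell
# A character count is not a byte count once encoding is involved.
$xmlText = "<note>Caf$([char]0xE9)</note>"   # the e-acute is 1 char, 2 UTF-8 bytes

$charCount = $xmlText.Length
$byteCount = [System.Text.Encoding]::UTF8.GetByteCount($xmlText)

"$charCount characters, $byteCount bytes"   # -> 17 characters, 18 bytes
```

Setting ContentLength to the byte count (or leaving it unset, as the post does) avoids the mismatch.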

So that just leaves the calling functions.  Here is a GET

function Invoke-APIGetStuffInGroup {
    param(
        [string]$Group,
        [string]$Username,
        [string]$Password,
        [string]$UserAgent = "Powershell API Client"
    )

    $Group = [System.Web.HttpUtility]::UrlEncode($Group)
    $URL = "https://api.website.com/api/1.0/Groups/GetStuff?group=$Group"
    Invoke-APICall -URL $URL -Username $Username -Password $Password -UserAgent $UserAgent

}

 

And a POST

function Invoke-APIAddStuffToGroups {
    param(
        [string]$Username,
        [string]$Password,
        [string]$UserAgent = "Powershell API Client"
    )

    $xml = #Create your XML Document here

    $URL = "https://api.website.com/api/1.0/Groups/AddStuff"

    Invoke-APICall -URL $URL -XMLObject $xml -Username $Username -Password $Password -UserAgent $UserAgent
}

 

One change I might make at some point is to require a SecureString for the password since clear text can be bad.
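One way to make that change without shuttling plain-text passwords around is to accept a PSCredential and recover the NetworkCredential only at the last moment.  A sketch (in practice the credential would come from Get-Credential; the in-memory SecureString here is just for demonstration):

```powershell
# Build a PSCredential from a SecureString (demonstration only --
# normally you would call Get-Credential instead).
$secure = ConvertTo-SecureString "P@ssw0rd" -AsPlainText -Force
$cred   = New-Object System.Management.Automation.PSCredential ("apiuser", $secure)

# GetNetworkCredential() exposes the plain password only where the
# HttpWebRequest actually needs it, e.g. $request.Credentials.
$net = $cred.GetNetworkCredential()
$net.UserName   # -> apiuser
```

The API functions would then take -Credential $cred instead of separate -Username/-Password strings.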

Hope this helps someone!

-Shane