January 2010 Archives

I know I said it before but hurry now and register! 

An absolutely FREE event on Windows 7 and Deployment is being held NEXT Monday (February 8th, 2010) at Microsoft Mississauga!

Details are as follows! If you’re interested, click HERE to go directly to the Microsoft site to register!  Just bring your interest and passion!

----------------------------------------------------------

Deployment Deep Dive on Windows 7 Community Tour-Mississauga

Event ID: 1032436872

 

February 8, 2010 6:30 PM - February 8, 2010 8:30 PM Eastern Time (US & Canada)
Welcome Time: 6:00 PM

Microsoft Canada Headquarters

MPR A/B/C
1950 Meadowvale Boulevard
Mississauga Ontario L5N 8L9
Canada

Event Overview

Are you running Windows XP?  Are you feeling the pressure of creating a deployment plan? Have no fear!  Leveraging learnings from two Windows 7 early adopters, this session will give you the skills you need to proceed with your own deployment.  The session will focus on free Windows 7 planning and deployment tools that customize operating system packages and automate network deployments seamlessly. We will dive right into:

  • How to use the Microsoft Assessment and Planning (MAP) tool to identify your current hardware and application inventory.
  • How to use the Windows Automated Install Kit (WAIK) to build a customized image for your organization.
  • How to use the Microsoft Deployment Toolkit (MDT) to build, deploy and maintain Windows installation images.
  • How to migrate the end user’s profile from their current installation to the Windows 7 installation using the User State Migration Tool (USMT).
  • How to integrate MDT and Windows Deployment Services (WDS) to perform Lite Touch installations of Windows 7.

Finally we will look at how we can leverage the various tools to solve any application compatibility issues you might encounter.  We will look at how you can overcome common obstacles using the Application Compatibility Toolkit (ACT), or larger obstacles using Windows XP Mode and Microsoft Enterprise Desktop Virtualization (MED-V) and even how you can leverage Application Virtualization (App-V) to streamline application deployment and ensure all your applications work!

Please note: there is no cost to attend this event.

Powershell

Oh my love affair with Powershell gets overwhelming some days.  It really is a troublesome little mistress.

For every time somebody comes up to me and asks “Can this be done in Powershell?” the answer is almost invariably “YES!”

Sometimes the answer takes a bit of thought and bit of effort.

But what I love is that once you design the answer, it is almost ALWAYS independent of your Infrastructure or Development Environment!  You can take those solutions with you anywhere!

Today we had a question.  “Can’t we just query the SQL servers to see what instances we have?”

I’m not an SQL guy.  I can’t even pretend to be.  I can install it, I can navigate it, I can drop tables and make messes.

But I’m not a SQL guru.

But I DO know that SQL Server 2008 Management Tools have a Powershell Snapin.  It sits INSIDE the Management Studio but it’s there.
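(If you’d rather have that provider in a plain Powershell console instead of only inside the Management Studio mini-shell, the 2008 management tools also register snap-ins you can load by hand.  A minimal sketch, assuming the stock SQL 2008 snap-in names:)

ADD-PSSNAPIN SqlServerProviderSnapin100
ADD-PSSNAPIN SqlServerCmdletSnapin100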

It’s a very dead simple command for Powershell people.  Just use the SQL Provider and good old “GET-CHILDITEM”

Yup.

GET-CHILDITEM SQLSERVER:\SQL\SERVERNAME\DEFAULT\Databases

 

Where “SERVERNAME” is the name of a SQL server.  And you can navigate to the different servers this way too, by just changing the name.

Now the problem I ran into is if you don’t have credentials, you can’t connect (DUH).  I couldn’t find the answer yet on how to pass alternate credentials to the SQLProvider in Powershell.  But I DID discover that the database list on a SQL Server (like a LOT of information in Windows) can be obtained by good old WMI.

 

So the alternate method you can ALSO use to show the Databases on a SQL Server is to run a GET-WMIOBJECT against “win32_perfformatteddata_mssqlserver_sqlserverdatabases” (*Yes, it’s a mouthful*)

 

GET-WMIOBJECT win32_perfformatteddata_mssqlserver_sqlserverdatabases

 

Need the list from a foreign computer?  Just drop in the IP address or resolvable name!

 

GET-WMIOBJECT win32_perfformatteddata_mssqlserver_sqlserverdatabases -computer SERVERNAME

 

And of course like everything else in Powershell, passing credentials will validate you if necessary. 

 

GET-WMIOBJECT win32_perfformatteddata_mssqlserver_sqlserverdatabases -computer SERVERNAME -credential DOMAIN\Username

 

Now you’d like that to be readable, because WMI usually gives a LOT of information we don’t need, so just format the output to a table and pick what you need.

 

GET-WMIOBJECT win32_perfformatteddata_mssqlserver_sqlserverdatabases -computer SERVERNAME -credential DOMAIN\Username | format-table Name

See?  And I didn’t even trip on a single “Table” in the “process”.

Yes my worst Pun of the day.

 

Powershell, Enjoy the good life

Sean
The Energized Tech

Sometimes we ITPros do blunt and stupid things.   Sometimes we do them without thinking.

There are RIGHT and WRONG ways to delete USERPROFILE data on a workstation.

In the Pre-Vista/7 days you could just go into “Documents and Settings” and DELETE the old user data and all was good.  DONE!

Found something out in Vista and Windows 7.

*AHEM* – Don’t do that.  Or *IF* you do, make sure you delete the reference to the user from the registry.  There is a spot in Windows 7 and Vista that says “User Profile for JOHNSMITH is HERE!”

If you do it the old way, just look for an entry under

HKEY_LOCAL_MACHINE\Software\Microsoft\Windows NT\CurrentVersion\ProfileList

You’ll find a series of keys; under each one is an entry called “ProfileImagePath”.  That will contain the path on the hard drive of where the User profile was.  If you do NOT delete that key, you’ll find out the hard way that IF “JOHNSMITH” ever tries to log into that machine again?  He’ll keep getting logged in with a TEMP profile, which is … IRRITATING!

OR If you’re really “Lazy” (AHEM EFFICIENT) you can run a Script with Powershell like THIS to get rid of the offending key. :)

----- Profile Cleaner -----

$OLDUSERID='John.Smith'
$REGPROF='REGISTRY::HKLM\Software\Microsoft\Windows NT\CurrentVersion\ProfileList'
$PROFLIST=GET-CHILDITEM $REGPROF
FOREACH ( $ITEM in $PROFLIST )

{

# Each key's ProfileImagePath holds the on-disk path of that user's profile
$userid=(get-itemproperty registry::$ITEM).ProfileImagePath
# Match the user we want gone; drop the -whatif once you trust it
if ($userid -like "*$OLDUSERID*") { $userid; remove-item registry::$ITEM -whatif }

}

----- Profile Cleaner -----

 

Ok that’s not such a lazy script but I just couldn’t RESIST using Powershell to do something :)

Cheers all and clean up your mess the right way

Sean
the Energized Tech

Powershell

One of the nice things in Powershell is that Microsoft has made it so easy to translate information that is sitting in the GUI into useful scriptable data or even just a simple report.

Case in point.  If you want to show what your DPM server is monitoring or backing up, do you…

 

a) Make lots of screen shots of the configurations and keep them in a binder

or

b) Have Powershell do the work for you in a repeatable manner?

 

Well although I LIKE wasting lots of Color paper on the Company Color Laser Printer, I also prefer being efficient.   Sometimes being “efficient” means sitting down for about 30 minutes to figure out how to do it.   But once done you have a solid repeatable process.

 

So I decided I wanted a way to be “Efficient”: at a moment’s notice, dump that configuration into something useful that could be read at any time.

Now a poke for the DPM team.  Beautiful job putting in Powershell, bad job documenting the Cmdlets.   They DO work but the examples were a little buggy.

 

But here’s what we’re interested in: two nice CmdLets in DPM called “GET-PROTECTIONGROUP” and “GET-DATASOURCE”.

The first CmdLet will allow us to list all the available protection Groups in a particular DPM server.

 

Get-ProtectionGroup -DPMServerName MYDPMSERVER

Name                     Protection method
----                     -----------------
ProtectedFileServers     Short-term using disk
MailServers              Short-term using disk
 

That’s all fine, but really I wanted to be able to show my boss, at any point in time, WHAT we’re backing up.  Makes him happier.  I like having a happy boss.  That’s where “GET-DATASOURCE” comes into play.  Its job is to show you the “Source” of the “Data” being backed up.  (Well, maybe not, but that’s how *I* remember it) ;)

You just have to pipe in a “Protection Group” in the GET-DATASOURCE to get the list.

So I cheat and throw the results in a variable called $GROUPLIST and just do a FOREACH to pull out the lists for each ProtectionGroup.  If you only have one, you can simplify it.

 

$GROUPLIST=GET-PROTECTIONGROUP -DPMServerName MYDPMSERVER; foreach ( $MEMBER in $GROUPLIST ) { GET-DATASOURCE -ProtectionGroup $MEMBER }

 

And now you have a perfectly viewable and formattable list of what you’re backing up.   Where it gets NEAT is if you plug the results of a GET-DATASOURCE into GET-MEMBER, you can see what OTHER useful information shows up, like say Oooooo the STATUS of those Protection Groups?
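(A quick sketch of that poke-about; trust what GET-MEMBER actually lists over my memory for the property names:)

GET-DATASOURCE -ProtectionGroup $GROUPLIST[0] | GET-MEMBER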

 

I am saddened the GUI interface for DPM cannot sit on my PC.  I am brightened and cheered up DAILY when I discover just how MUCH of it is not needed once you start into the DPM Management Shell, with Powershell driving it.

 

Sean
The Energized Tech

Powershell

Dropping or adding a domain can normally be a headache.  Especially in the Enterprise, where who KNOWS what addresses were attached where.  One of the nastiest headaches in the past was trying to find out the default email address for a User.  In Exchange 5.5?  BAH!  Look at the objects one at a time, or get really good with vbScript and a bottle of Vodka!

But now in Powershell, life has changed for the better.

With ONE line!  ONE Line!  I can find all the users in a particular domain for the default email.  Distribution Groups *OR* Email addresses.  It doesn't matter.

One line.

All you have to use is the GET-MAILBOX or GET-DISTRIBUTIONGROUP Cmdlets.  Each one of those holds the field “PrimarySMTPAddress”.  But the nice part is that field is already a set of data.   So if you’re looking for a user whose email ends in a particular domain (say ABC.COM) just run a quick

 

GET-MAILBOX * | where { $_.PrimarySMTPAddress.Domain -eq 'abc.com' }

or

GET-DISTRIBUTIONGROUP * | where { $_.PrimarySMTPAddress.Domain -eq 'abc.com' }

But of course if you’re dealing with over 10,000 users you might want to limit it to the odd OU.  That’s quite easy.  Just add the -ORGANIZATIONALUNIT parameter to the Cmdlet like so, if we’re in the fictional domain of “FABRIKAM.COM” trying to find our Distribution List from a particular OU.

 

GET-MAILBOX * -Organizationalunit 'Fabrikam.com/Office/Redmond/Mailgroups' | where { $_.PrimarySMTPAddress.Domain -eq 'abc.com' }

or

GET-DISTRIBUTIONGROUP * -Organizationalunit 'Fabrikam.com/Office/Redmond/Mailgroups' | where { $_.PrimarySMTPAddress.Domain -eq 'abc.com' }

 

And of course you can get really fancy too once you have this information.   You could just as EASILY modify that same Cmdlet and just SWITCH those users to the new email domain.  No muss.  No fuss!
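(A hedged sketch of that switch; the new xyz.com domain and the alias-based addresses are just examples, and SET-MAILBOX may make you turn off the email address policy first, hence the extra parameter:)

GET-MAILBOX * | where { $_.PrimarySMTPAddress.Domain -eq 'abc.com' } | foreach { SET-MAILBOX $_.Identity -PrimarySmtpAddress "$($_.Alias)@xyz.com" -EmailAddressPolicyEnabled $false }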

 

Powershell, It really is so Easy to use!

 

Sean
The Energized Tech

I had to deploy a new DC at work to take advantage of some of the management capabilities supplied in Server 2008 R2 (Powershell Active Directory Modules) but wanted this DC to be Special.

Yep “Special”.  I wanted a REAL DC.  CORE.  Secure.  Restricted.  

Bet ya thought it was going to be tricky too didn’t you?

Well it isn’t

Thanks to the fact we now have a built-in utility in the Core edition called “Sconfig”, a LOT of the nasty stuff is easily taken care of.

You can assign those static IP addresses to your network cards, name your machine, allow management and reboot without any stress on the CORE version of Server 2008 R2.

But Making that  a DC?  I’ll bet you thought I’d pull the last of my hairs out with that one.

Nope.  I *DID* in fact already set up a NEW Domain on a single Core box in my test Environment just under two years ago.   So I DID remember the command line structure really wasn’t nasty.  It takes a little typing, but thanks to this article on Technet all the information you need is there.

The beautiful part is the process automatically calls up the OCSETUP and gets the Active Directory binaries installed at the same time.   And unlike PREVIOUS Core versions you can extend POWERSHELL to it for better management and Remote CmdLets and Shell sessions.

OH.

the COMMAND LINE.

You were waiting for the command line.  Sorry!

dcpromo /ReplicaorNewDomain:Replica /Password:* /UserDomain:"CONTOSO.LOCAL"
/Username:"AdministratorName" /AutoConfigDNS:Yes /DomainNetBiosName:"CONTOSO.LOCAL"
/ParentDomainDNSName:"CONTOSO.LOCAL" /ReplicaDomainDNSName:"CONTOSO.LOCAL"
/SafeModeAdminPassword:"StupidSecretPasswordYouCanTRemember1"

This is for the fictitious Domain of CONTOSO.LOCAL. You’ll have to tie the lines together of course.  And it also sets your “SAFEMODE” password.

If you’d like to automate this process?  There are two EXCELLENT articles written by Mitch Garvis and Steve Syfuhs on the subject.

Now if you’re adding this Server 2008 R2 to an EXISTING Domain that ISN’T already running a Server 2008 R2 DC, you’ll have to run a /FORESTPREP and /DOMAINPREP in that order from the ADPREP provided under the SUPPORT folder on the server media, as sketched below.   There are additional objects and Schema that need to be added into Active Directory.
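(A minimal sketch; the drive letter is just an example for wherever the R2 media is mounted:)

D:\support\adprep\adprep.exe /forestprep
D:\support\adprep\adprep.exe /domainprep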

But that takes very little time.  

The best part is if you have Windows 7 and the new RSAT tools.  This configuration allows you to immediately leverage ADWS (Active Directory Web Services) so you can use the new Administrative Center on your A/D.  Life for Admins gets a LOT nicer and easier to work with.  Plus you get the EXTRA bonus of the Active Directory Modules for Powershell V2.  Creating and manipulating users in a SINGLE line!

Sweet!

Enjoy Server 2008 R2 and all it has to offer YOU!

Sean
The Energized Tech

I was searching on the internet for more bits about Powershell and ran into Chrissy LeMaire’s blog with the catch phrase “Talk Nerdy to Me”

Yes, once a phrase is in my head, it’s “Doomed”, and yes I do mean ID Software “Doomed”.   I contacted her and got permission first (Cuz that’s just what you should do) and VOILA!

Sung to “Talk Dirty to Me” by “Poison”

 

Talk Nerdy to Me

You know I saw it
It was Green and all Monochrome
My eyes wouldn't let me roam
Cuz I liked it
And you know I'm tellin' True
It got its hooks in you
I ain't lyin!

Oh a Commodore
A Commodore 64
On my parents' big living room floor
I got leg cramps
BBS'in all night long
Sidplayer crankin' out a song
As I geek out!

Y'know that I just

Bought a modem
In the K-mart store
It's Acoustic and
300 baud down to the core
In my basement
Peripherals all on the Floor
Wait for ya
To Talk Nerdy to Me

You know I'm tyin'
I'm tying up the Telephone
Long distance with a BlueBox tone
Cuz I'm phreakin'
Then I war-dial once again
To type to you my friend
In IRC now

Y'know that we'll be

Swappin' programs
Roms scattered all over the floor
Playin' Zelda
Chugging Jolt eyes buzzin' for more
In my basement
Just knock on my door
Quote "Python"
And Talk Nerdy to Me

Captain Kirk beam me and take me away!

Solo,

Cause y'know that we'll be

Fightin' Dragons
And tell Wizardry Lore
Readin' Usenet
Who ya kidding you're downloading porn!
In my bedroom
Gandalf is on my door
Speak Elvish
Or Talk Nerdy to me

In COBOL,
Talk Nerdy to me

I'm baudy,
Talk Nerdy to me

Powershell

So you’ve decided to take a step into the Good life.  Go cutting edge and use the new Active Directory Modules!

Excellent!  Good man!

Just a small piece to remember first.

You need a Server 2008 R2 DC.

“The whole Domain?”

No.  Just a single DC.   The new Active Directory services are web based, and you need at least ONE DC running the Server 2008 R2 SOFTWARE as a Domain Controller.

Once you have that in place you can utilize the new Cmdlets.  They’ll be on your controller of course.

But if you REALLY want to seize the day in Windows 7, get yourself the newest RSAT (Remote Server Administration Tools) and you’ll have all those new Active Directory Cmdlets on your workstation out of the box!

“But I don’t HAVE Windows 7!”

What?  No Windows 7? No problem!

Because with Powershell V2 there is a new feature that allows you to build modules!   So you can take those new Cmdlets sitting on your Server 2008 R2 controller, export them out to a module and import them onto your Powershell V2 workstation, even if it’s Windows XP!
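(A rough preview sketch of the idea, using V2’s implicit remoting; DC01 is just an example name for your 2008 R2 controller:)

$s = NEW-PSSESSION -ComputerName DC01
INVOKE-COMMAND -Session $s { IMPORT-MODULE ActiveDirectory }
EXPORT-PSSESSION -Session $s -Module ActiveDirectory -OutputModule RemoteAD
IMPORT-MODULE RemoteAD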

Did I just make a puddle of drool in front of your computer?

 

Good.  I’ll tell you how to do it tomorrow! :-)

 

Sean
the Energized Tech

I was banging my head on this one.  I’m no OCS expert.  I can set this sucker up but did I ever do a major *DOH!*

I deployed an OCS Edge server for Federation purposes, and immediately after, EVERY OCS client got the same stupid message just tickling at the top of the screen.

“Limited External Calling”

This was odd since I never configured connectivity to the phone system, or any other connectivity for that matter.

I validated, tested, used magic words, offered up some Developers to “Juju”, only to finally stumble across an article on “Experts Exchange” that pointed me down the right path.

I had some mismatched ports.

When I initially set it all up, I planned on authenticating my A/V on Port 12345 (example of course).  The DEFAULT on the Edge is 5062.  I DID configure the A/V authentication on Port 12345 on my Edge server for OCS, but on the EXTERNAL port.  The Internal was still set at “5062”.

So a quick change on the INTERNAL port to match what the regular OCS server was trying to communicate on, a restart of my services for A/V for good measure, and *BOOM* no more stupid Red Exclamation mark.

All that silliness, I knew it was something stupid.

Yeah. ME. *DOH!*

Sean
the Energized Tech

Powershell 

So we can create users in Active Directory with Powershell.   But the accounts created were lacking something: details, additional information.  So the trick is, what do we need to populate to make a “normal user”?  What fields are TYPICALLY populated when I do a “New User” in the Active Directory Users and Computers GUI interface?

Well let’s look at a new user called “John Smith” in the fictitious domain of “techdays.contoso.com”

[screenshot: New Object - User dialog]

 

Normally when we create a User, we’re typing in the First Name, Last Name, Display Name, User Logon Name (UPN) and the legacy “SAM” ID for Windows NT 4.0 legacy domains.    In the Powershell Scripts previously, all we did was create Active Directory objects for Users.  None of this was supplied, although in the Server 2008 R2 version it does assume at least the SAM.

So how do we extend this functionality in Powershell and have it match what a normal New User gets?  We just fill in the variables and pass them along to the Active Directory Objects.  Let’s try this in our make-believe land of “CONTOSO.LOCAL” with our favourite user “John Smith”.  Only now we’ll populate all the basic information he needs.

 

Using [ADSI] Accelerator

$FirstName="John"
$LastName="Smith"

$DOMAIN='@contoso.local'
$NEWUSERNAME=$Firstname+"."+$Lastname
$FULLNAME=$Firstname+" "+$Lastname
$SAM=$Firstname+"."+$Lastname
$UserLogonName=$Firstname+"."+$Lastname+$Domain

$Class="User"
$ObjectName="CN="+$NEWUSERNAME
$ADSI=[ADSI]"LDAP://cn=Users,dc=contoso,dc=local"
$User=$ADSI.create($Class, $ObjectName)
# (the CN/name comes from the create call above, so we don't Put it separately)
$User.Put("sAMAccountName", $SAM)
$User.Put("displayName",$FULLNAME)
$User.Put("givenName",$Firstname)
$User.Put("sn",$Lastname)
$User.Put("userPrincipalName",$UserLogonName)

$User.setInfo()

 

So it’s the same as last time, we just define more information and pass it along.  But look how easy it gets as we step into better-featured software, like Quest.

Using Quest Active Roles

$FirstName="John"
$LastName="Smith"

$DOMAIN='@contoso.local'
$NEWUSERNAME=$Firstname+"."+$Lastname
$FULLNAME=$Firstname+" "+$Lastname
$SAM=$Firstname+"."+$Lastname
$UserLogonName=$Firstname+"."+$Lastname+$Domain

NEW-QADUSER -name $NEWUSERNAME -ParentContainer 'CN=users,DC=contoso,DC=local' -samAccountName $SAM -Firstname $Firstname -Lastname $Lastname -userprincipalname $UserLogonName -displayname $FULLNAME

 

The preparation is almost identical, but the execution is just ONE LINE.   It’s this ease of use I love.  But let’s jump to the future now: the new Active Directory Modules in Server 2008 R2.

Using Active Directory Server 2008 R2

$FirstName="John"
$LastName="Smith"

$DOMAIN='@contoso.local'
$NEWUSERNAME=$Firstname+"."+$Lastname
$FULLNAME=$Firstname+" "+$Lastname
$SAM=$Firstname+"."+$Lastname
$UserLogonName=$Firstname+"."+$Lastname+$Domain

NEW-ADUSER -name $NEWUSERNAME -path 'CN=Users,DC=contoso,DC=local' -samaccountname $SAM -Givenname $Firstname -surname $Lastname -userprincipalname $UserLogonName -displayname $FULLNAME

 

There you have it.   With a few minor changes it’s almost identical to the Quest version of the Commandlet.  But it’s a native part of the Server 2008 R2 infrastructure.

 

Remember that in ALL cases the method of passing information and variables to the Cmdlets (no matter what version you choose) is identical, and this can just as easily be scripted into an interactive session or used to import multiple objects at the same time.

Next time we’ll show you how to take these SAME techniques and turn them into full Active users with passwords.

 

Powershell: It’s so Easy, and it’s FREE.

 

Sean
The Energized Tech

Powershell

Powershell and Active Directory is a marriage of pure Power.  But one of my initial stumbling blocks was learning WHERE to find that information and what particular names to call up in Active Directory.

I’ve found three tricks that worked for me.   And you can use whatever you want, it’s what works BEST for YOU.

The first one is using Active Directory Users and Computers.  I turn on the “Advanced Features” option.  Once I do this, I can double click on a User or Object in Active Directory and get a new tab: the “Attribute Editor”.

 

[screenshots: Advanced Features option and the Attribute Editor tab]

 

This gives the ability to view and edit ALL of the fields associated with an Object within Active Directory.  Most of the fields are revealed on your General Tabs.  But this shows us the ACTUAL field names in Active Directory for those objects.  For our purposes with Powershell, we can browse down this list to see the NAMES of the INDIVIDUAL attributes we can use and edit in Powershell.

 

The second one I’ve used, which isn’t too nasty, is very similar to the first: using ADSIEdit to manage the users and objects, you go into the properties of an Object (IE: a User) to see all the information and names associated with that object.

 

[screenshot: ADSIEdit object properties]

 

The third I use is a really simple cheat in Powershell.   Using the Quest Active Roles snap-in, get the info on a user and EXPORT-CSV all the properties from the object from there.

 

GET-QADUSER john.smith -includeallproperties | EXPORT-CSV 'C:\Powershell Scripts\UserDetails.csv'

I then use my spreadsheet of choice to see not only the names I can WORK with but also the TYPES of data they contain, in a SAFE and NON-DESTRUCTIVE environment.

 

[screenshot: exported user properties in a spreadsheet]

 

Note this pumps out a LOT of information, including all the non-stamped properties (IE: Exchange details for a non-Exchange-enabled User), but it’s FANTASTIC for comparing TWO users (IE: one with a certain feature enabled like OCS and another DISABLED from that feature) to see what changes you need to make.  Also, once you know the types of information AVAILABLE at a low level in Active Directory, it’s very easy to build queries on information like when a user was created, who is locked out, etc.

The really cool part is when you learn how to see the objects directly, you can make CmdLets or manipulate objects that don’t presently have Powershell CmdLets (Like Live Communications Server 2005 or Exchange Server 2003) since most of their job when a user is “Enabled” with that feature is to populate fields in Active Directory.  In the case of Exchange 2003, RUS takes over and checks A/D for the object information and passes that along to the address book.

 

Active Directory and Powershell – Two tools when leveraged together gives you more Power than you ever imagined!

 

Sean
The Energized Tech

The other weekend I threw in Windows Deployment Services on my test environment.  It worked beautifully.  I didn’t even have to try!

 

And then I decided to lock down the server with the Security Configuration Wizard.  I erred.  I forgot to save the original configuration.  My error.

 

So my beautifully running WDS suddenly failed.   Something got “locked down” or a service “disabled” in the process.  Enter me jumping up and down yelling magic words at the computer gods.

 

No errors in the eventlog. Everything seemed to BE right.  The rules on the firewall were enabled.  Remote PXE boots were grabbing an IP address but no sign of a Boot file.

In fact when I cranked up the logging on the Server 2008 firewall, the attempts to pull a PXE boot were getting through.  Nothing blocked.

So…errr…… huh?

In desperation, I removed and reinstalled the instance of WDS.  The logic BEING that in the install, it should re-enable the needed component or service.

 

Turns out, that worked.  Worked the first time.     No funkiness or reconfiguration needed either.   I ran through the Wizard, chose not to install images, and everything was back working the way it was.

Next time I’ll do it again and lock it down on purpose and track the “Before and After”.  I’m placing the odds on a service.

 

But I did learn one good thing from my stupidity.  Powershell.  If you want to document your list of services and their current state BEFORE you do something dumb like play with your only Production/Test environment WITHOUT backup?

In Powershell run

 

GET-SERVICE | EXPORT-CSV C:\MYSERVICES.CSV

 

That will give you a nice CSV file of ALL your services that you can edit, view and search in Excel.  So when things go stupid after?  You can run the same command again in Powershell and compare the results SIDE by SIDE.  And probably just re-enable the needed service.
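(A minimal sketch of that side-by-side: snapshot again after the damage and diff the two files with COMPARE-OBJECT.  The file names are just the ones from this example:)

GET-SERVICE | EXPORT-CSV C:\MYSERVICES-AFTER.CSV
COMPARE-OBJECT (IMPORT-CSV C:\MYSERVICES.CSV) (IMPORT-CSV C:\MYSERVICES-AFTER.CSV) -Property Name, Status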

 

Me dumb dumb, me break own server.  But me fix.

 

Sean
the Energized Tech

Powershell

As I learn more about Powershell, I really appreciate the Power it holds.

One of those Powers is the Hashtable.  A variable of indescribable power.  I’ll try to bring you new users in with this.  It’s really cool. :)

Normally in Powershell you just define a variable like this.

 

$FIRSTNAME='Johnny'

or

$NUMBEROFCPUS=6

 

Or possibly you’ve done an Array such as

 

$Weekdays="Sunday","Monday","Tuesday","Wednesday","Thursday","Friday","Saturday"

 

A Hashtable is a lot like an Array, and yet completely NOTHING like an array.   Normally with an Array I can do this.

 

$Weekdays[3]

 

and get the result of

 

Wednesday

 

But a Hashtable has one extra piece to it.  I can Group various TYPES of information into one list.  Alphabetic, Numeric, SecureString… If you can store it in Powershell, you can store it in a Hashtable.  It’s almost a MINI database when you’re done.  So normally I might have a few arrays like this.

$Divisions="Accounting","Manufacturing","Infrastructure","Development"

$Positions="Manager","Assistant","Bookkeeper"

$Payscale=100000,75000,50000,25000,10000

 

But as a hashtable I can group all of this possibly dissimilar information together as one Group.   It’s done by using the @{} as part of the variable assignment and assigning a variable WITHIN that.  So I could rewrite all of this as.

 

$EmployeeDetails=@{Divisions="Accounting","Manufacturing","Infrastructure","Development"}

$EmployeeDetails=$EmployeeDetails+@{Positions="Manager","Assistant","Bookkeeper"}

$EmployeeDetails=$EmployeeDetails+@{Payscale=100000,75000,50000,25000,10000}

 

And done like this I can now access them as

$EmployeeDetails.Divisions[3]

or

$EmployeeDetails.Payscale

 

The situations where you may or may not want to utilize a Hashtable are up to you.  But it’s just another way information is accessed.   Even if you don’t heavily use this, understanding HOW the information is created and accessed can help YOU further in learning how to navigate data through Powershell.

 

“Tabled for your consideration…”

Sean
The Energized Tech

Powershell

What we’re going to do for the next while is show you how you can manage Active Directory in Powershell.   I’m not going to go “deep dive”, but I AM going to at least show you the basics.  And there are generally three common methods: using the [ADSI] Accelerator within Powershell V1, using the Quest Active Roles Management add-on, or using the new Active Directory Modules in Server 2008 R2.

The [ADSI] method I personally find to be the most complex BUT also the most compatible.   It requires no third party software and uses native features you can leverage in Active Directory.

Using Quest Active Roles I find is the easiest of the three, but because it’s a third party solution you may have a harder time convincing management to implement it into the environment (Even if it IS free).

The third is Server 2008 R2 Active Directory Modules.   This is built in if you have a Server 2008 R2 computer as at least ONE of your domain controllers and will involve a change to your Active Directory schema.  It is not as EASY as Quest but is FAR easier than the [ADSI] Accelerators.  Plus it’s a native component to the Microsoft environment and part of the built in management tools.  Once it’s implemented, you don’t really need to sell it to anybody.

 

So let’s start with a new user.   We’re going to assume a domain of “CONTOSO.LOCAL” and a user whose name is “John Smith” (Sorry John, I know we’re abusing your name and you should be getting royalties for it, but I was lazy … )

 

In all cases the Username will take on the format of “John.Smith”, and the accounts are created disabled in A/D, in the Default Users container, with no assigned Password.

 

Using [ADSI] Accelerator

$FirstName="John"
$LastName="Smith"

$NEWUSERNAME=$Firstname+"."+$Lastname

$Class="User"
$ObjectName="CN="+$NEWUSERNAME
$ADSI=[ADSI]"LDAP://cn=Users,dc=contoso,dc=local"
$User=$ADSI.create($Class, $ObjectName)
$User.Put("sAMAccountName", $NEWUSERNAME)
$User.setInfo()

 

With the [ADSI] Accelerator there is a lot of power going on here, but it is VERY daunting for even seasoned Admins.  But it does NOT require any change to your infrastructure.  Just Powershell.

 

Using Quest Active Roles

$FirstName="John"
$LastName="Smith"

$NEWUSERNAME=$Firstname+"."+$Lastname

NEW-QADUSER -name $NEWUSERNAME -ParentContainer 'CN=users,DC=contoso,DC=local' -samAccountName $NEWUSERNAME

 

Definitely not quite as daunting as the [ADSI] method.  But it requires Quest Active Roles to be installed to use it.   Still, EASE of use has increased!

 

Using Active Directory Server 2008 R2

NEW-ADUSER $NEWUSERNAME -path 'CN=Users,DC=contoso,DC=local'

 

Pretty simple, AND a small but nice feature in the Server 2008 R2 commandlets: you don’t have to specify the information for SAM (it is picked up by default), AND if I DON’T specify the Path to create the object in, it will DEFAULT to the default “USERS” container in your Active Directory.

 

In all cases, this user does not have the UPN (ID@domain.com) defined, an email address, or various other details.  It is also a disabled object in Active Directory.    But with Powershell, in all cases this information can be set very easily.   We’ll look at that next time with all three CmdLets.

 

Powershell: It’s so Easy and it’s FREE!

Sean
The Energized Tech

I thought I smelt something in the air.  Something slicing through these icy Canadian nights.

I don’t know when or where.  But I can smell it in the air.

ENERGIZE IT 2010

Watch for it

And be there, to get ENERGIZED!

image

Here’s an easy one but a stickler for you.

Had a Word 2007 document today that contained “VBA” and “Macros”.

It was pretty funny since I helped design it and there were none in it.  The user didn’t even KNOW how to create a Macro.

Thump thump thump goes the impatient foot trying to solve this.

As Maxwell Smart, Secret Agent 86 would say “Would you believe …”

File Extension.

Somehow it got saved as “DOCM” which is a “Word Macro-Enabled Document”

image

Why do you care?

Well if your security is pretty standard and you’re not automatically trusting Macros and VBA (A good thing) you’ll get a warning saying “Hey I don’t trust this” and certain things like Protecting the document don’t work.

It drove me bananas when I found out it just got renamed somehow.

So just rename it without that extension if you’re certain or re-save it as a .DOTX (Template) or .DOCX (Word 2007 Document).  It will give you a nasty warning about “Losing all your precious VBA and Macro code” which is fine since it was never there to begin with.

 

Go figure! :)

 

Sean
The Energized Tech

Caught this one on Twitter and thought I’d echo it along.

I’ve noticed ever since Microsoft introduced the “Cached Email address” feature in Outlook 2003 and 2007, users have a nasty habit of treating it as an “Address Book”.

Where this is a problem for ITPros is when they LOSE that little file (when you migrate a computer); their world is over (And thus yours).

But not to fear.  It’s just a single file.  It’s easy to migrate, and just as easy to pull the data into something much more useful, a CSV file!

All you have to do is dig for the NK2 file for the user.  Typically it’s in the same folder structure as your Outlook data.   Its naming will match the profile name assigned to the particular Outlook user.  By default it is “OUTLOOK.NK2”, but if you have multiple profiles in Outlook it could be “OUTLOOK.NK2”, “MARY.NK2” or “MAIL.NK2” or something appropriate.

Just copy the file to the new location it needs to go to and replace the current .NK2.   Only a restart of Outlook is needed for the user to see their Nicknames (Cached email addresses).
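(A minimal sketch of that copy; the XP-era paths and the “mary” profile are just examples, so adjust to your own layout:)

COPY-ITEM '\\OLDPC\C$\Documents and Settings\mary\Application Data\Microsoft\Outlook\OUTLOOK.NK2' 'C:\Documents and Settings\mary\Application Data\Microsoft\Outlook\OUTLOOK.NK2'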

But personally I prefer this alternate solution.  Pull the data out.  I found one user with about a hundred “Really important world ending addresses” in this silly file.   Storing them in “Contacts” just never dawned on them.

So it’s good to have “Plan B” up your sleeve.  A free Utility called NK2.INFO.  It views and edits the silly things and EVEN allows an export to CSV or Tab delimited Text file.

Guess what it costs?

*zip* Zero Free!

Works great too.  But if you ever bump into the programmer and you’ve used it at least ONCE?

Buy them lunch or a beer and tell them thanks.  Because if you EVER need it?

You’ll want to.

 

Sean
The Energized Tech

Powershell

I keep harping on and On and ON about Powershell.   The reason for it is REALLY simple.

With very little extra work on my part, my job just gets EASIER!

I remember VIVIDLY, less than a year ago, having to create users in the GUI.  It worked but it was inconsistent in the results.  You had to remember to “Do this” or “Don’t do that” or “Don’t forget this permission” and “Don’t forget to logoff the server when you’re done.”

I clearly remember managing users in a Small Business environment.   Where each server would have its own ID and password, no Trusts, and getting new Admins to FOLLOW our personal standard for clients. AIGH.  It’s human nature that nobody wants to listen when they think they know better.

But with Powershell.  It’s beautiful.  The odd task that I don’t do daily? Yeah.  I’ll jump into the GUI.  

If I’ve repeated it even ONCE though, look into Powershell.   The scripts just MAKE SENSE.   It’s easy to repeat and manipulate tasks.

Heck UNLOCKING A USER ACCOUNT IS A COMMAND!
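(With the Quest snap-in from the other posts loaded, it really is one line; john.smith is an example account:)

UNLOCK-QADUSER john.smith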

So yes, I get crazy when I talk about Powershell.   Powershell has changed my life for the better.   It is easy, secure, flexible and oh… the most important bit.

IT’S FREE!


Try it now and unleash YOUR inner Kraken!

Sean
The Energized Tech

Let’s just say you have a running OCS server and you have a newly created child Domain.  And for whatever reason you don’t want to Deploy another OCS 2007 server.  Cost alone is reason enough.

But you’d like to leverage that server? You can.  You can enable OTHER Child domains for use in your OCS server.   You can even assign them the SAME SIP extensions for login as your regular users (provided there are no naming Conflicts).

There’s only ONE thing you have to do.  I almost missed it too.  And it was pretty obvious.

Just like in Microsoft Exchange you have to PREP the Forest and PREP the Domain.   But since your forest is already prepped, there is only one thing left to do.

PREP that child domain for OCS.  Drop the OCS Media onto a DC controlling your child domain; the media will automatically detect the Domain has not been prepped, and indicate that in the OCS 2007 setup list.

You can still enable the users for OCS independent of this action.   But UNTIL you prep that Child Domain, no user IN that Domain will be able to log in to OCS.

That’s it folks! Nothing more than that. 

 

Sean
The Energized Tech

In Exchange Server 2007, it is incredibly easy to grant access to a mailbox via Powershell or the Management Console.    Just click on the mailbox in question in the console, and select “Manage Full Access Permission” or “Manage Send As Permission” depending on what it is you need to do.

However, if you have a division that requires that this is the ONLY mailbox they work from (but each user still has to have their own ID and password for SOX/PCI compliance issues), you have a stumbling block.   When you login as that user, Outlook 2007 will AUTOMATICALLY populate that user’s name and details into the Outlook 2007 profile.

It’s supposed to!  That’s what makes it easier to use.   But if you need this process to automatically be a Common mailbox, you’ll find that it’s just WAY too easy to set up.

You don’t need to mail-enable the users at all.  Just type in ONE little entry in Active Directory for the users, as sketched below.    Change the “Email” field in Active Directory Users and Computers to the Email address of the common mailbox.
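(If you’d rather stamp that field from Powershell, a hedged sketch with the Quest snap-in; the names are examples:)

SET-QADUSER john.smith -ObjectAttributes @{mail='commonbox@contoso.local'}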

That’s it.  Nothing more.  (Well that and GRANT the User or Group necessary rights to the mailbox as is needed of course!)

But when the user logs in to Outlook 2007 for the first time, it will query the email address from Active Directory and pull down the settings from Exchange Server 2007 for that address.

Nothing more to it.  Sorry.   It really *IS* easy!

Sean
The Energized Tech

When I first got into doing a large deployment, I started with MDT 2008, since it’s a great way to build the image.   The environment I was deploying into was Server 2008 and soon I discovered Windows Deployment Services.  This is the component that allows you to create network based boots, PXE Enabled setups.

And honestly the first time I got into it, I started at it backwards.  I presumed that I needed MDT to use WDS.  No it’s the other way around.

Windows Deployment Services is the HEART of the system.  You start by installing it.   MDT is a TOOL that makes the images that WDS CAN use, but WDS can run all by itself, importing its own images.

 

I found that out this weekend.  I finally updated my main box to Server 2008 R2 (another post in itself, that was too easy!) and decided to start REALLY learning all about Server 2008 R2 and its infrastructure. (Especially POWERSHELL).

When I did my server update I used the USB key.  I forgot to keep an ISO file handy or burn to DVD.

 

As a result of this we run into my second problem.  I IMMEDIATELY wanted to start playing with Server 2008 R2 Core in my Hyper-V environment, but since Hyper-V does not allow USB passthru and my install media WAS a USB key: a stumbling block.  Thus why I got into WDS.

In Server 2008 R2 (and Server 2008) it’s an easy solution.  Just add in the ROLE “Windows Deployment Services”.   There are two additional selections, the “Deployment Server” and “Transport Server”.   The Full WDS requires BOTH a “Deployment Server” and “Transport Server”.   If you only install the “Transport Server” the server will only be good for Multicasting.  Think of the “Transport Server” as the “CORE” version of WDS.

[screenshot: Add Roles wizard, WDS role services]

 

Once it’s added in, it’s a few clicks to have that Media deployable over the Network.   You’ll have to run through a few clicks of course.   Just enter the new console under Server Manager or Administrative Tools for “Windows Deployment Services”, right click on the Server in question and choose “Configure Server”

 

[screenshot: WDS console, Configure Server]

 

The Step by Step Wizard is pretty self explanatory with its questions.  Where do I store the Images?  Which machines are allowed to use this server for a PXE boot?

Then of course you’ll get asked to install some images into the Repository.   Even if you just have a USB key with the install media it will do fine.   The media does NOT need to be bootable for this to work.  You’ll get prompted to give a name to describe the new Group for the Images.  This way you can look in the Console and see Server, Workstations, Legacy, Utilities all broken down in a sensible manner.   Really not much to just get going!

Once completed you’ll have a loaded repository of images including boot files.  At this point you should be able to enable the Boot from PXE / LAN option in the network ROM and install a system over the wire.  On some systems you can select a boot menu; if you can, choose “Network”.   You’ll find now you can start the computer and hit F12 by default to install over the network.

[screenshots: PXE network boot]

If for no other reason, I found this immensely handy when I had a legacy server that couldn’t read DVD media but was up to spec for Server 2008.    Being able to install over the LAN was a tremendous bonus in that situation.   But what’s really nice about this is I can keep a SINGLE repository of all my images.   And when a newer updated version comes out, I don’t have to redo everything, I just remove the older version of the O/S and replace it with the new!

 

Technology, my greatest friend :)

 

Sean
The Energized Tech

Powershell

Ok I didn’t sing this… YET! (Need to find a Karaoke Version of the tune…)

 

Sung to “WAITING” by Green Day, for the Powershell Community

"Shelling"

I've been
Lost in a Land of
Mice and clickin' the GUI
Brain Dead
From the Monotony

POWER
Shell Came out of nowhere
Blue bird turned into software
sitting
happy right on my screen

I GET-IT
Easy to use the
Verbs and
all of the OBJECTS
Flowing
Smoothly inside my screen

Now I just can't wait to get on my machine

BOOT UP!

Modules and all the Snapins
Piping
I see you clappin'
Smilin'
At your new PS1

No more
wastin' the day now
Work is
Feelin' like play now
One liners
who knew it was this fun?

Now I just can't wait to get to work on my machine

BOOT UP!

Stand up thank that Jeffrey Snover

(Shey hey hey)

Now I just can't wait to get to work on my machine

BOOT UP!

Stand up thank that Jeffrey Snover

I've been
Lost in a Land of
Mice and clickin' the GUI
Brain Dead
From the Monotony

Done now
Not even tryin'
Watch me
As I go flyin'
No VB
holding me back at all

AT ALLLLL.....

Stand up thank that Jeffrey Snover

Powershell

The other day I built a little script that showed the free space on a computer in Powershell with the ability to email a warning if it crossed over a margin.

Now the nice thing with Powershell, with a MINOR change, I can now Query Active Directory for a list of computers, all or specific by OU or any other method I choose and have a system MONITOR those servers for free space.  And I don’t even need to touch the servers themselves.

So right now, let’s pretend I am a smart Admin and keep all my servers in their own private OU, since I control my Server updates manually and they have their own configuration TOTALLY different from my workstations.   I’ve put them under the OU Computers/Servers.  This script also requires Powershell V2 if you wish to use the “SEND-MAILMESSAGE” option.

---------------- Monitor Remote C Drive --------------------

# Get a list of computers from Active Directory

$Computerlist=GET-QADCOMPUTER -searchroot 'Contoso.local/Computers/Servers'

Foreach ($Server in $Computerlist) {

# Change C: to whichever letter you'd like monitored

$driveinfo=get-wmiobject win32_volume -computername $Server.name | where { $_.driveletter -eq 'C:' } | select-object freespace, capacity, drivetype, driveletter

# 10 percent 10/100

$Percent=.10

# XXX % of maximum space is our warning level

$WarningLevel=$driveinfo.capacity *  $Percent

if ($driveinfo.freespace -lt $WarningLevel)

{

$Free=$driveinfo.freespace
$Servername=$Server.name

$Emailfrom="$Servername <$Servername@contoso.local>"
$EmailSubject="$Servername is running low on space"
$Emailbody="Hello Mr. Admin.  This is Server $Servername...I have $free bytes leftover."

send-mailmessage -from $Emailfrom -to "NetworkAdmin <networkadmin@contoso.local>" -subject $Emailsubject -body $Emailbody -priority High -dno onSuccess, onFailure -smtpServer smtp.contoso.local

}

}

---------------- Monitor Remote C Drive --------------------

There, wasn’t that easy?  Mind you, it’s only monitoring the C: drive; next time I’ll look at a script that monitors all the drives.  It’s not as robust as SCOM, but it has one thing SCOM doesn’t.

It’s FREE.

Seize Powershell and Seize the Day!

Sean
the Energized Tech

Powershell

In the newest version of Powershell we have the Powershell ISE (Integrated Scripting Environment) which gives you a great free way to test and edit your scripts.   But I stumbled across something just today.

Yes, we can DEFINITELY chalk this up to I didn’t “Read The Finelyprinted Manual”, but the Powershell ISE has its OWN Profile!  I ran into this when every time I went to test my Quest scripts the Snapin wasn’t there.

It makes sense.  The test environment SHOULD be clean.  But I need mine to match my “production” Powershell setup.

And just like your regular profile it’s easy to access.  

Within the Powershell ISE launch a

NOTEPAD $PROFILE

Which of course launches the Windows Notepad so you can actually create and edit the Profile (if you need to).  It’s best to keep it clean.  I just added the Quest Cmdlets to mine, as shown below, so I can now edit my regular scripts without having to keep typing in an ADD-PSSNAPIN.
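(The one line in mine, using the Quest snap-in name from the other posts:)

ADD-PSSNAPIN Quest.ActiveRoles.ADManagement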

It sits in the same location as your standard personal Powershell profile when it’s saved, but its name is Microsoft.PowershellISE_profile.ps1

 

Empowering the many with Powershell

Sean
The Energized Tech

Want a quick and EASY way of pulling down the MAC addresses on your Network?

You can of course do it with Powershell and WMI, but how about all workstations using DHCP?  As long as they have logged in recently, all you need to do on the DHCP server in Windows is Right Click on the “Address Leases” under your Scope

[screenshot: DHCP console, Address Leases]

and choose Export List.   It will let you export that list to a Tab Delimited file.  The list will contain not only the IP address but ALSO the Machine name and the ASSOCIATED MAC ADDRESS!

This is important for the previous post.  If you have that MAC Address handy?  There’s your “Wake on Lan” list.

 

Quick, dirty and FREE

 

Sean
The Energized Tech

Powershell

I was digging about for a Wake on Lan Utility.  How many times I thought to myself “If I could just send a pulse, that’s all this server needs to start”.

Most modern day systems have this feature too.   I started digging and finding this utility and that and then, for a lark started searching for WMI and Powershell solutions.  I really do not need this often, but something I could carry on a USB key that could be typed would be preferred.

Right near the top came this hit from " /\/\o\/\/" the Powershell Guy

The script is dead simple and EXACTLY what I needed.   And the fact that it could work in Powershell was even BETTER!

Just save it under some name ending in .PS1 (I called mine WAKELAN.PS1) and when you need to use it just type

.\WAKELAN.PS1 00:00:00:00:00:00

Where 00:00:00:00:00:00 is the MAC Address of the network card in question.
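(And in case that link ever goes stale, the guts of a magic packet are small enough to sketch from scratch.  This is my own rough take, NOT /\/\o\/\/'s script, with an example MAC:)

$MAC='00:11:22:33:44:55'
$MACBYTES=$MAC -split ':' | foreach { [byte]"0x$_" }
# Magic packet = six 0xFF bytes followed by the MAC repeated 16 times
$PACKET=[byte[]]((,0xFF * 6) + ($MACBYTES * 16))
$UDP=New-Object System.Net.Sockets.UdpClient
$UDP.EnableBroadcast=$TRUE
$UDP.Connect([System.Net.IPAddress]::Broadcast, 9)
[void]$UDP.Send($PACKET, $PACKET.Length)
$UDP.Close()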

Now in MY case I want this to Power up any computer remotely to ensure they’re live for Automatic updates, or if I need to maintain a system in a locked office.  But having Wake on Lan is also great if you’re a consultant who accidentally shut off a system remotely.  As long as you have the Wake on Lan feature enabled in the systems and some way to get in (VPN, Terminal Services)?  You can get in and get it powered up.

Thank you “/\/\o\/\/" the Powershell Guy” – Your script made my day!

Sean
The Energized Tech

Powershell

Here’s a quick one from the forums.  I think I did this before but it’s interesting to relook at things.  

A lot of things can be written in one line in Powershell.  There’s the whole “POWER” part there.

In Active Directory there is a field called “LastLogonTimeStamp” which usually contains when somebody last logged into a computer.    This is important especially if you’d like to know which computers HAVEN’T been logged into.  Say there’s a whole pile of old computer accounts in A/D?  Here is the easy way to tell.

Using GET-QADCOMPUTER normally you don’t get this field.  But if you run the following command

 

GET-QADCOMPUTER -includeallproperties | get-member

 

You can see EVERY property that can be queried from the computer accounts.  I personally do it like this

 

GET-QADCOMPUTER -includeallproperties | export-csv C:\computerdata.csv

 

I like that for one big reason.  I can open it up in Excel and not ONLY does it show me the properties but what they contain.  So you can get a really good idea what you need to query.  So if I want to get a list of computers with the LastLogonTimestamp I just execute this command

 

GET-QADCOMPUTER -IncludedProperties LastLogonTimeStamp

 

Now all we need to do is decide how stale the computer accounts should be.   Let’s say 90 days?  Comparing that date with the current date:

 

$date=get-date; $days=90; get-qadcomputer -IncludedProperties LastLogonTimeStamp | where { ($date-$_.LastLogonTimeStamp).Days -gt $days }

 

There you have it.  Simple, useful.  One line.  Powerful.  That’s Powershell.

Abuse the Power :)

Sean
The Energized Tech

Powershell

Ok, it’s not one line, but here’s what I have just done to Powershell in my profile.  I created a function called “ChangeDomain”.  It’s not a fancy function, it just builds TWO global variables, one called $DOMAIN and the other called $CREDS.

The nice part is it’s VERY easy to add alternate domains to the list, and very easy to use and NON-DESTRUCTIVE! And with the Quest Active Roles Management shell it’s incredibly easy to use these new variables.

So I can do this now

CHANGEDOMAIN

 

and it will give me a list of Domains like this

[screenshot: numbered list of domains]

Key in a number from 0 to whatever to choose and let it populate the variables.

And now you can run any Quest Cmdlet like this

GET-QADUSER -service $domain -credential $creds

 

And it will connect and work on the foreign domain.   Yes, you could just type it all in.  But the important part of this little function is it now allows you to make managing multiple Domains EASIER for techs, by just giving them a predefined list of accounts and Domain IPs and showing them some basic stuff from Quest.

 

And isn’t that the important part?  Making life easier on ourselves?

Sean
The Energized Tech

----------------------- Begin Global function changedomain ----------------------------------------

function global:ChangeDomain ( $PreChoice ) {

# Check whether Quest Active Roles Snapin is loaded.  If not, load it. 
# If it can’t the whole function should do nothing
#

$STATUSLOADED=$FALSE

$SNAPIN='Quest.Activeroles.ADManagement'

ADD-PSSNAPIN $SNAPIN -erroraction SilentlyContinue

IF ((GET-PSSNAPIN $SNAPIN -erroraction SilentlyContinue) -eq $NULL)
     {

          WRITE-HOST 'This Script requires Quest ActiveRoles Management Shell'
          WRITE-HOST 'Which can be downloaded free from http://www.quest.com/powershell/activeroles-server.aspx'
     }
ELSE
    {
          # If it DID, Flag Status as GOOD
          #
          $STATUSLOADED=$TRUE
     }

IF ($STATUSLOADED)

{

#
# Defined list of Domains
# First field is Domain name, Second field is the FQDN or NETBIOS name of a DC
# that domain, Third Field is the IP address of the DC, Fourth field is an account
# with Domain Admin rights to work in the Domain
#
# The variable just uses the IP address right now but you can modify it to
# use the fqdn/netbios name.   Whatever you are comfortable with
#

$DomainList = ("SMITH","dc.fqdn.local","192.168.1.5","Administrator"),("JOHNANDJOHN","dc.fqdn.local","192.168.1.5","Administrator")
$DomainList += ,("BIGLAWFIRM","dc.fqdn.local","192.168.1.5","Administrator")
$DomainList += ,("TEST","dc.fqdn.local","192.168.1.5","Administrator")
$DomainList += ,("SOMEBODYINC","dc.fqdn.local","192.168.1.5","Administrator")

Do
{
$DONE=$FALSE
$CountDomains=0

WRITE-HOST 'Which Domain are we managing Today?'
WRITE-HOST '-----------------------------------'
WRITE-HOST ''

FOREACH ($Domain in $DomainList)
{
Write-Host $CountDomains, $Domain[0]
$CountDomains++
}

$Choice=READ-HOST "( 0 - $($CountDomains-1) )"
IF ($Choice -ge 0 -and $Choice -lt $CountDomains)
    { $DONE=$TRUE }
ELSE
    { $DONE=$FALSE ; Write-Host $Choice; Write-Host $CountDomains; Write-Host 'Please Make a Correct Selection' }
}
# When the $DONE Variable contains a Boolean $TRUE the loop ends.

Until ($DONE)

# Using $GLOBAL creates a variable that can be accessed globally, not just locally

# Creates variable $CREDS as a GLOBAL variable

$GLOBAL:CREDS=Get-Credential -Credential ($DomainList[$Choice][0]+'\'+$DomainList[$Choice][3])

# Creates variable $Domain as a GLOBAL variable

$GLOBAL:Domain=$Domainlist[$Choice][2]+':389'

Connect-QADSERVICE ($Domainlist[$Choice][2]+':389') -credential $CREDS

}

}

------------------------------------------------------------------------------------------------------

I was banging my head against a wall.  Clean install of an OCS 2007R2 Edge server.   IT WAS WORKING!

And then, as part of my testing, it appeared that it was no longer activated and WOULDN’T reactivate.

I was muttering a lot of “magic words” over this

So I drilled down through the Activation Log and got this wonderful cryptic error number.

Failure [0xC3EC78D8] Lots of gibberish…

So I checked all the usual suspects.  Time, requirements…. check check and TRIPLE check!

So fine.  Let’s “Bing” that error.   And right up at the WAAAAAAY top was a beautiful article from John Gilham at Gilham Consulting regarding this very issue.

So I checked, and SURE enough KB974571 HAD made its way into my system.  Removed it and rebooted, and my OCS EDGE is now sharp again!

Thanks Mr. Gilham!

Powershell

One of the COOLEST features that came in version 2 of Powershell was the ability to run jobs in the background.   This is incredibly important because normally in old “Classic” Powershell with a large “FOREACH” loop each task would run and would go on to the next task only when the previous one was done.  

So if at one point you hit a big snag, things would slow right down until that “snag” finished its work.

But not with background jobs.

SUPER easy to work with too!

So here we have job that will waste a LOT of time.

foreach ( $i in 1..50) {

get-childitem c:\ -recurse

}

 

It is a completely pointless script in that it just gets a directory of Drive C.  (Well not just a directory, EVERYTHING in fact).  That will waste a lot of time.  And the next time it goes to run, it CAN’T until it finishes the FIRST time!

But this could also be a task that sometimes is busy, and sometimes gets done quickly.   But they’re tied to the same rules.  It’s a big line up of things in a queue.

But enter Background Jobs!  Life gets amazingly, over-the-top better.  With Background Jobs everything gets queued up and runs in parallel.  So although one job might be very busy, the other jobs can at least get a shot at doing their work.  You’re still limited by CPU and Processing power, but at least you’re no longer limited to the RUN, WAIT, RUN, WAIT, RUN, WAIT game.

Now you’re running them all and the Big jobs will still process while the little ones can go through.

This same script could be run like this using START-JOB

foreach ( $i in 1..50) {

start-job { get-childitem c:\ -recurse }

}

You’ll note when you run it this way all 50 “Get-ChildItems” are running at once (which will max out your CPU), but they are all working in parallel.

If you want to check on the status of the background jobs just run the GET-JOB CmdLet.  You can even just list jobs that are in progress.  You can use STOP-JOB to stop various jobs to balance CPU time if needed, or REMOVE-JOB to get them out of the QUEUE.

There’s also RECEIVE-JOB, which gives you the normal piped output of a job so you can do something with it once it’s done.
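Here’s a minimal sketch of that whole flow (the variable names are just for illustration):

# Kick off a background job and keep a reference to it
$job = START-JOB { get-childitem c:\ -recurse }

# List only the jobs that are still working away
GET-JOB | where { $_.State -eq 'Running' }

# Wait for our job to finish, grab its piped output, then clear it from the queue
WAIT-JOB $job
$results = RECEIVE-JOB $job
REMOVE-JOB $job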

 

But it’s beautiful to work with and easy to use.  And a free benefit of using Powershell V2.

One word

Sweeeeeeeet!

Sean
The Energized Tech

Powershell

For a good nine years of my life I worked in the field as a Technician and Consultant for a company in Toronto called Around the Clock I.T. Solutions Inc.   I will put this name out there since a) they’re a great bunch of guys, b) I like to throw a little free advertising out to friends, and c) some of the best experience I ever got as a field tech came from there.

So please listen now.  I’ve been there and KNOW first hand what it’s like to be in the Small Business area and even Mid Size.  My present position has me touching the Enterprise. It’s “pretty cool”, to understate it.

One of the mistakes I used to make was doing ALL my management on the server.  With Small Business Server 2003 that was typical, since all of the tools were sitting on the server and they worked beautifully.

So when I got into Powershell, I “Assumed”, like many Administrators, that it HAD to be on the server to use it.   I “Assumed” that because Powershell V1 (the original version) couldn’t do “Remoting”, it had to run all the scripts on the server.

But that is COMPLETELY the biggest load of hooey ever.

 

Yes, it’s definitely FASTER on the server.   No question.   But what I found is I can leverage almost 99.999999999% of Powershell from my workstation using Version 1.   And version 2 has some great features if you have it on both the server and Client side.   But that doesn’t stop you from using it to manage almost ANY server environment in Active Directory.

If you have the Administrative account and credentials for a Domain, including the Server name / IP address?  You can manage that server with Powershell.

I keep harping on and on about the Quest Active Roles Management Shell.  I do it for a reason: because I’m not a Developer and I need to get my work done easily.   Quest does that with this free Snapin.   It does it easily.   It DOES IT ON MY WORKSTATION!

 

All you NEED to manage your server environment from your workstation with Powershell is

a) Connectivity to the Server or Server environment (physical or wireless or VPN)

b) The needed snapins for the components you’re trying to manage (Exchange, Active Directory, etc.)

c) Credentials for the Server (domain Admin membership, Enterprise Admin membership or whatever rights you SHOULD have)

d) POWERSHELL, any version although since Version 2 is out for all versions of Windows as low as Windows XP?  Get it.  It’s free. :)

 

Powershell does NOT need to be installed on the Server, Powershell does not need to extend, touch, change, alter or in way hurt Active Directory.  

Powershell is AS SAFE or AS DANGEROUS as Active Directory Users and Computers in the wrong hands.  It is also restricted by the same Policies as Active Directory Users and Computers.   

 

To get those snapins, you need only acquire the media for Exchange 2007, SQL Server 2008, or the application in question and install the “Management Components”.  Sometimes it will want to install “Shell components”.  Those are typically the Powershell Snapins and components you need.

I can state this with conviction.   I have migrated users from my workstation between separate domains.   The script works the same on the workstation as it does on the server.  I can query user accounts for old stale systems from my workstation the same as on the server.    I can do ALL of this WITHOUT putting Powershell on the server.   I say this for a reason.
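If you want to see just how little it takes, here’s a minimal sketch (the DC name is made up; the cmdlets are the Quest ones I keep harping about):

# On the WORKSTATION: load the free Quest snapin
ADD-PSSNAPIN Quest.Activeroles.ADManagement

# Prompt for Domain Admin credentials for the target domain
$CRED=GET-CREDENTIAL

# Point the cmdlets at a DC in that domain and do real work
CONNECT-QADSERVICE -service 'dc.contoso.local:389' -credential $CRED
GET-QADUSER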

 

If you are an Administrator in an Enterprise network and you are restricted about changes to the server (which I can fully understand) but CAN install components on your workstation (as long as it’s cleared by Corporate Policy) start trying Powershell.  It’s a component in Windows 7.   It’s a free update from Microsoft for all current versions of Windows.

And it will change your life for the better.

 

Sean
The Energized Tech

Powershell

Ok. Bad Pun.  I’ll pay for that later.

This seems obvious and it really isn’t difficult.  But like a lot of new people, I found if Powershell did something IMMEDIATE to make my job easier?  I tend to use it more.

How about resetting Active Directory passwords?  And as a bonus, forcing that password to be reset when the user logs in?

It’s a simple task in Powershell but it should be shared.  As always I use the Quest Active Roles Management Shell.   It also doesn’t care if you’re running Powershell V1 or V2.

---------- Reset password -----------------

$alias=READ-HOST 'UserID to Reset'
$password=READ-HOST 'Temporary Password' -assecurestring

UNLOCK-QADUSER $alias
SET-QADUSER $alias -password $password -userMustChangePassword $TRUE

---------- Reset password ----------------

So it’s really simple.  Now if you DON’T want to force the user to change their password at next logon?  Change the $TRUE to read $FALSE

But to really make it useful, I want it permanently in my shell environment.  There are just certain things I prefer at my fingertips ALL the time.  So I made this into a simple function and added it directly to my profile.

In Powershell, the path to YOUR Powershell profile is in $PROFILE.  I prefer using NOTEPAD to edit mine. Just type in

NOTEPAD $PROFILE

To make this as a function useful all the time add these lines near the top or bottom of your profile

function global:ResetPW ( $Alias, $tempPassword ) {

$password=convertto-securestring -asplaintext -force $temppassword
UNLOCK-QADUSER $alias
SET-QADUSER $alias -password $password -userMustChangePassword $TRUE
}

Now what you can do (the next time you load Powershell) is type

resetpw username password

Where username is the name of the user you are resetting and password is the temporary (or new) permanent password.   If you have password complexity rules it will of course have to abide by them.

Enjoy Powershell and save time!

Sean
The Energized Tech

Powershell

One of the great features of Powershell is its raw power.

However it DOES have limitations.  With great power comes great processing.  I caught a posting in a discussion group about finding the size of a folder structure.  And you can do it in Powershell, but the user needed to do it on hundreds of servers and systems.  And Powershell CAN do it but, being over the LAN, it’s slow.

But who says you ONLY use Powershell to do the tasks?  Sometimes there are established methods that already JUST WORK!  And you can use them and have the file manipulation features of Powershell do the muscle work.

What I figured out (and It seems to work very fast) is using a Good old fashioned “DIR /S > INFO.TXT” command.

Yes “DOS” or “CMD.EXE”.  “Old School” 

The reason I was thinking of this is that on an Enterprise scale you’re not going to “just get permission” to throw Powershell onto hundreds of servers.  But he needs the information easily and FAST.

Well CMD.EXE is stock.  And Powershell can CERTAINLY manipulate a simple text file with a GET-CONTENT.   And the location of the Directory size is always constant.

 

So here you have it… Just an idea…  What do you think?

---------------------------------------

Here's the example.  You'd have to incorporate this into your Server FOREACH but here's the idea.

Each server nightly or weekly run the following task.

DIR "C:\Documents And Settings\" /s > C:\FOLDER\INFO.TXT

Then you want to copy the file locally somehow; that will speed up the processing of the INFO.TXT file.  The second-to-last line of this outputted text file has the size of the folder structure.

copy \\SERVER\C$\FOLDER\INFO.TXT C:\LOCALFOLDER\ (or whatever fits your environment; the trick is to have the data LOCAL to work with, not over the LAN)

---------------------------------------

Then pull that TEXT file into a Variable in Powershell

$DATA=GET-CONTENT C:\LOCALFOLDER\INFO.TXT

And the following three lines figure out the size of the text file, count back two lines, and figure out where the number starts (constant) and ends (variable).

# Total lines in Text file

$TotalLines=$DATA.Count

# How big the text field is

$Outputsize=($Data[$TotalLines-2]).Length

# the spot that contains the text data representing the total dir size

$DirSize=($Data[$TotalLines-2]).substring(25, ($Outputsize-31))

--------------------------------

$Dirsize is now a variable with the needed data.  Most of the processing (producing the folder structure) happens on the back end servers.   The user just needs a way to pull those files into the local system temporarily to pull the last lines out.  It could now be printed easily.
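Pulled together, the per-server piece might look something like this rough sketch (server and folder names are placeholders, and the substring offsets are the same ones from above, which assume the default English DIR output):

---------------------------------------

# For one server: grab the pre-generated INFO.TXT locally, then parse it
COPY \\SERVER\C$\FOLDER\INFO.TXT C:\LOCALFOLDER\

$DATA=GET-CONTENT C:\LOCALFOLDER\INFO.TXT
$TotalLines=$DATA.Count
$Outputsize=($Data[$TotalLines-2]).Length

# The second-to-last line of DIR /S output holds the total size of the structure
$DirSize=($Data[$TotalLines-2]).substring(25, ($Outputsize-31))

WRITE-HOST 'SERVER total:' $DirSize

---------------------------------------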

 

See?

 

Sometimes it shouldn’t be done in just Powershell, no matter HOW much I love it

Sean
The Energized Tech

“Auld Blue Lines”


Powershell

My Contribution to the New Year and the Powershell Community.

I redid the Lyrics to “Auld Lang Syne”

"Auld Blue Lines"

Should Old O/S's be forgot
And lost in data lines
Should Old O/S's be forgot
And techniques lost in time

A call for Power echoes out
A Shell that's oh so fine
We'll pipe away our objects now
And do it in one line

Within Exchange and DPM
And Scom and SQL too
We'll script away the task we need
Within a screen of blue

We do it all in Powershell
In lines of Powershell
In .NET objects flowing free
We code in Powershell

 

Happy 2010 Microsoft and the Powershell Community

From Sean
the Energized Tech

Powershell

I’m not a very creative person when it comes to making up fake names.   Thus when I need to “invent some users” for a Demo environment I usually end up with “John Smith”, “Mary Smith”, “Joe Bloe” and I start running out of ideas pretty darn quick.

Tomorrow I’m doing a small presentation at our User Group on Powershell and I wanted to actually have a REAL looking Active Directory.  

The populating part is easy.  But I wanted a REALLY good list of names.  And I wanted to be able to add to that list without TRYING.

Thus, Powershell

I made two simple text files.   Firstname.csv and Lastname.csv

First row in “Firstname.csv” was called “Firstname” and the first row in “Lastname.csv” was called “Lastname” (See! I told you I wasn’t THAT kind of creative!)

In the “Firstname.csv” text file was a list of (Gee can you guess?) Firstnames.  I found a few lists online and used those for samples.  Mine were French!

And (drum roll please) “Lastname.csv” was a list of last names.

Ahem.  Here’s the fun part

A nice little script that pulls that data together (regardless of size) and produces a brand NEW CSV file called “DomainUsers.CSV” which is perfect for making whatever you want.  All your fake John, Joe, Mary, Frank Smiths ET AL!

And all of my files are in a folder called “C:\Powershell Scripts'”

----------- RandomUsers.PS1 ---------------------

# Import list of Last names from Simple Text file

$lastname=import-csv 'C:\Powershell Scripts\lastname.csv'

# Import list of First names Simple Text file

$firstname=import-csv 'C:\Powershell Scripts\firstname.csv'

# How many names to generate

$totalnames=10

# the Header for our new CSV file

$firstline='Firstname,Lastname'

# Create a file called “DomainUsers.csv”

# By the way, the script is DEAD simple.  It doesn’t check for the existence

# of the file before hand.   So play with it how you want :)

Add-content -path 'C:\Powershell Scripts\DomainUsers.csv' -value $firstline

# Go through and Generate some names

foreach ( $namecounter in 1..$totalnames )
{

    # Pick a random first and Last name
    # (Get-Random's -max is exclusive, so use the full Count to include the last entry)

    $lastnamenumber=(get-random -min 0 -max ($lastname.Count))
    $firstnamenumber=(get-random -min 0 -max ($firstname.Count))

    $FakeName=($firstname[$firstnamenumber].Firstname+','+$lastname[$lastnamenumber].Lastname)

    # Echo the New name to the Screen

    write-host $fakename

    # and write to the File

    add-content -path 'C:\Powershell Scripts\DomainUsers.csv' -value $fakename
}

------------------ RandomUsers.ps1 --------------------

And that’s it!

All that baby does is generate a list of names from random First and Random Last names which I can now use to import into Demo Active Directories or other environments as sample data!
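And if you want to take it one step further, here’s a rough sketch of feeding that CSV into a demo domain with the Quest cmdlets (the OU path is made up, and New-QADUser’s parameters are worth double-checking against your snapin version):

# Read the generated names back in and create a demo account for each
$users=import-csv 'C:\Powershell Scripts\DomainUsers.csv'

foreach ( $user in $users )
{
    # 'contoso.local/Demo' is a made-up OU path; point this at your own demo OU
    new-qaduser -name ($user.Firstname+' '+$user.Lastname) -parentcontainer 'contoso.local/Demo'
}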

Save your Brain!  Use Powershell!

Sean
The Energized Tech

Powershell

Powershell is a lot like a big box of Legos.   There are so many Cmdlets and Variables in the mix, and as you start to play you discover something about them: each piece has SO many more pieces already attached to it.  Because each Property can use several Methods, and after Methods are applied you have new Properties to work with; sometimes OTHER properties buried within.

It’s a lot like that whole “mirror facing the mirror” bit except it produces more usefulness each and every time.

So let’s look at a built-in Variable that you might overlook all the time.  One that is a perfect example of pieces within the Rabbit Hole.

 

$HOST

 

This is more than a variable.   It’s your connection under the hood to your console.   It not only tells you information about how the Date/Time is formatted, but also the location of the cursor on the screen and what color it is.  As you dig into the methods, you can call the very sequences that CmdLets call up.

For example, the Cmdlet you call up to read input from the console READ-HOST, can also be expressed from the $HOST variable as

 

$HOST.UI.ReadLine()

 

More importantly to you, as somebody working in a Powershell environment, this is your method to adjust color and cursor position.   You can tell from this variable how BIG the actual console has been defined as (Columns by rows).   If you were to get creative, you could use this to write a VERY nice non GUI menu system.

 

$HOST.UI.RawUI contains some of the most important attributes of all.  These are the ones you’re going to want to play with a LOT if you get heavily into the console.  Just typing it will give you the list.

 

[screenshot: the list of $HOST.UI.RawUI properties]

 

To read any one of these, just put a “.” between the variable and the particular property.   Note: just like in many parts of Powershell, as you start opening up the little box, more pieces show up.

So although typing

 

$HOST.UI.RawUI.WindowTitle

 

Will show just my present Window Title “The Wild and Wonderful World of Sprinkles” (normally it’s “Windows Powershell” but as you can see I was playing already) typing in

 

$HOST.UI.RawUI.CursorPosition

 

[screenshot: the output of $HOST.UI.RawUI.CursorPosition]

 

Will show two more variables on the screen (in this case the current Column and Row of the Cursor in the Console), and again just tack on a “.” to step deeper into the Rabbit Hole.

 

$HOST.UI.RawUI.CursorPosition.X

$HOST.UI.RawUI.CursorPosition.Y

 

And to SET any of these values, just assign to them.  One catch: CursorPosition hands you back a COPY of the coordinates, so grab it, change it, and assign the whole thing back

$POS=$HOST.UI.RawUI.CursorPosition
$POS.X=10
$HOST.UI.RawUI.CursorPosition=$POS

Of course in the live console it’s a little useless, since the minute you hit Enter the prompt redraws at a new location anyway.  But within a script where you want to output data to a user or put a Warning in the middle of the console, this is where it gains power.   Colors are even easier!

 

$HOST.UI.RawUI.ForegroundColor

$HOST.UI.RawUI.BackgroundColor

 

Will show you what it is, and to change it, just assign either one a string value from the console’s color list: "Black","DarkBlue","DarkGreen","DarkCyan","DarkRed","DarkMagenta","DarkYellow","Gray","DarkGray","Blue","Green","Cyan","Red","Magenta","Yellow" or "White".

 

$HOST.UI.RawUI.ForegroundColor="Yellow"

 

Has now just changed the cursor and any newly typed letters to Yellow.   It will look weird with the new letters one color and the old letters another until you execute a CLEAR-HOST.
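As a tiny sketch of where this pays off inside a script (the position and message are purely illustrative):

# Remember where we are, jump to column 10 of the top row, paint a warning, come back
$RAWUI=$HOST.UI.RawUI
$OLDPOS=$RAWUI.CursorPosition
$OLDCOLOR=$RAWUI.ForegroundColor

$POS=$RAWUI.CursorPosition
$POS.X=10
$POS.Y=0
$RAWUI.CursorPosition=$POS
$RAWUI.ForegroundColor="Red"
WRITE-HOST 'WARNING!'

# Put everything back the way we found it
$RAWUI.CursorPosition=$OLDPOS
$RAWUI.ForegroundColor=$OLDCOLOR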

But as I said, this is a big box of “Legos” inside.  Poke about, have fun.  Discover what YOU can do with Powershell.

 

It’s never the same for two people.

Sean
The Energized Tech

Powershell

The whole point of Powershell is to make your life, to be blunt, EASIER and more efficient.

One of the nice, simple and easy to work with features of Powershell is what is referred to as a “Function”.  (Yes, you can picture some “Austin Powers Dr. Evil” finger quotes as I said that.)

A “Function” is a lot like a small script to repeat a task you might do on a regular basis.   It can be part of a script too.  Its job is, simply put, to perform a “Function”.

What is a function?  It could be as complex as a script, but really doesn’t need to be.

Really if you find yourself repeating a Sequence of commands over and over and over?  A lot of that might become a simple function.

Let’s just say you have a script like this… A simple one…

 

---------- sillyscript.ps1 --------------

$INFO=READ-HOST 'Gimme some input:'

foreach ($i in 1..20)
{
WRITE-HOST $INFO
}

$INFO=READ-HOST 'Gimme more input:'

foreach ($i in 1..20)
{
WRITE-HOST $INFO
}

$INFO=READ-HOST 'Gimme even more input:'

foreach ($i in 1..20)
{
WRITE-HOST $INFO
}

 

---------------- sillyscript.ps1 -------------------

 

Of course this is a completely pointless script.  Ask for input, write that input on the screen 20 times.   It’s almost like something from the Commodore PET!

 

But you CAN see we are repeating ourselves.   What we can do is take most of this and turn it into a function.   A function is just another script block, but you’re giving it a name and telling it what it can receive, and possibly return.

 

So we can write the For loop as a function and just simply call it up.  You just place the definition for the function at the beginning of the script and call it up whenever it’s needed.  Our new sillyscript.ps1 will look like THIS instead. 

--------------- sillyscriptv2.ps1 -----------------

function loopyprinter ($DATA)

{


foreach ($i in 1..20)
{
WRITE-HOST $DATA
}

}

$info=READ-HOST 'Gimme some input:'

loopyprinter($info)

$info=READ-HOST 'Gimme more input:'

loopyprinter($info)

$info=READ-HOST 'Gimme even more input:'

loopyprinter($info)

 

---------------sillyscriptv2.ps1 --------------------------

 

Of course this script only serves one point: to show you how a function can simplify a script by controlling repeated code, and of course just how pointless printing your name on the screen 20 times is.

 

But you can see, it’s NOT difficult.  In fact, it’s absolutely painless, even if my script is absolutely pointless :)

 

Sean
The Energized Tech

 

It really pays to follow people on Twitter.   Mary Jo Foley has just posted an article detailing a HOTTER than HADES offer being brought out by Microsoft to encourage many Businesses to update to Windows 7 if they’re presently running XP.  I won’t bother watering down her words or even trying to kill the message.

 

Just go Here and read it now!  My eyeballs are rolling on the desk right now so it’s hard for me to type!

 

Mary Jo Foley: “Microsoft offers Windows XP, Office XP users 50 percent discount to encourage upgrades”

 

WOW! I’m floored!

One of the biggest challenges I always had in the field, when upgrading workstations in an environment that previously had no DC setup, nothing but workstations, was migrating data and configuration with the least amount of disruption.

In the field you’re paid by the hour, so if a problem happens you get paid for it.   But you also want the client to be happy at the end.   You need to get as much of the old setup swapped over so you’re not digging about for passwords, IDs, cookies, etc. etc. etc.

Because on the smaller sites the word “Documentation” is about as common as “Network Administrator”

One of the nicest features in Windows 7 is Easy Transfer.   It’s known by a more common name on the Enterprise level as the “User State Migration Tool”.   Oh, by the way… it’s free.

In Windows 7 I can actually run a Wizard that will allow me, as an Administrator, to gather all the settings in the user’s Registry, files, data, EVERYTHING, and move it somewhere else.   The beautiful parts about this process are 1) it is Non Destructive, it DOES NOT MOVE THE DATA, and 2) I don’t need to know the UserID’s password to perform this process, just Administrative credentials.

 

Now what it’s DESIGNED for is to move from one computer to another.   What it CAN do (just as easily) is move from one configuration (locally logged in user) to another (Domain User).

When you start up Easy Transfer it will normally ask you where to save the data.  Run the Wizard as normal and back up the data.

Once the process is complete, reboot the system and log in as a Domain User.  Start typing Easy Transfer in your Search bar, right click and “Run as Administrator”.  Re-run the Wizard, specifying this is the New PC, and browse to the location where you stored the Migration Data file.   Before completing the Wizard and allowing the data to migrate back, you’ll see an easily missed option on the lower right side that says “Advanced”.

When you select Advanced you’ll see it prompting for the old and new profile.  By default it will recreate the account on the new system.   You can choose ANY profile already set up on that computer (including a Domain user ID) as the destination.

Allow the process to finish the Migration and you’re done.  You may have to log off and then log on to take on the new settings.

 

But in the field I didn’t always have the luxury of having an extra hard drive to play with.  Sometimes space was scarce.   So in this scenario we can cheat a little and just grab the Configuration, and use a cut and paste for the data (since it’s on the same hard drive).

So what you’re going to do in Easy Transfer, after it runs its initial scan, is to DESELECT whatever you’re not moving.  Typically Music, Documents, etc.  (Since this transfer occurs on the same hard drive, we don’t NEED to replicate the data; just move it.)

[screenshots: deselecting items in the Easy Transfer wizard]

Of course you may want to DESELECT other profiles that don’t need to switch to domain membership.  Some local accounts might be a component of Visual Studio and don’t need any data migration or configuration change since they are a part of the computer.   And again, we’re concerned with migrating the configuration to the “new side”.  I usually finish the migration like normal, seize ownership of the old folder from the user profile and just do a cut and paste from the “old” Documents to the new Documents.   It’s a lot faster (although less safe) that way if you don’t have the option of an external hard drive.

Of course, if the amount of data/configuration fits the present free space on the physical drive, I say go for it and back it all up anyhow.  We don’t trust Murphy.

 

But it does work.  I’ve used it in the past and, as I’m typing this article, I just used it to switch my own laptop’s local user data onto my newly set up domain Controller.  All of my email, IDs and passwords carried over with no issues, as well as all my blogging data.

 

Remember, if you’re stuck?  There’s almost always a solution.

Sean
The Energized Tech

Powershell

Caught this one in the Technet forums.  How do you monitor for drive space and Notify?

This is where I absolutely fall head over heels in love with Powershell.   It’s so EASY to write a way to monitor and list the information!  It’s FREE MONITORING SOFTWARE with a BUILT IN Emailing system!

So let’s say you want to know about the hard drives in your computer.    All of that information is available via WMI in the WIN32_VOLUME class.

So running a GET-WMIOBJECT WIN32_VOLUME will show you all the volumes attached: CD-ROM drives, EISA partitions, etc.

But do we care about CDrom drive free space?

Nope.  But It’s not difficult to get what we want from WMI with Powershell.

 

GET-WMIOBJECT WIN32_VOLUME | select-object freespace, capacity, drivetype, driveletter

 

Will show us all the details we’re looking for.  And the cool part is the drive letter is labelled EXACTLY the way it should be: “C:” “D:” “Z:”.
“freespace” is in raw bytes, as is the “capacity” property, but that’s fine.    You can always format something like that.  The important part is to get the information.

And again you can say your freespace cutoff is 100 bytes, 100 kilobytes, 20 gigabytes or just have it factored in as a percentage of the free space.

 

$driveinfo=get-wmiobject win32_volume | where { $_.driveletter -eq 'C:' } | select-object freespace, capacity, drivetype, driveletter
$Percent=.10
$WarningLevel=$driveinfo.capacity *  $Percent
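And if you want those raw bytes human-readable along the way, Powershell’s built-in size constants make it a one-liner (just a formatting aside, not part of the script below):

# 1GB is a built-in Powershell constant; {0:N2} formats to two decimal places
"{0:N2} GB free of {1:N2} GB" -f ($driveinfo.freespace / 1GB), ($driveinfo.capacity / 1GB)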

 

Now in Powershell V2 you have the ability to send mail straight from the script.  It’s a built-in command that only requires an SMTP server to point to, and it can attach files as well.  All in a SINGLE line!


send-mailmessage -from "Server01 <server01@contoso.local>" -to "MyAdministrator <networkAdmin@contoso.local>" -subject "I'm all out of space" -body "I'm all out of space, I'm so lost without you, please get me some more now, the users are crying...I have $($driveinfo.freespace) bytes leftover! Hurry!" -priority High -dno onSuccess, onFailure -smtpServer smtp.contoso.local

 

The message can be whatever YOU want in the body, same of course for the addresses.   Just typing a HELP SEND-MAILMESSAGE –EXAMPLES in Powershell gives you some dead simple ones to work from, covering almost all emailing examples!

With no effort we now have a basic script to monitor free space on a hard drive and to email somebody when it gets low.   You don’t need SCOM or expensive monitoring software to do this.  Just Powershell V2 (Free) and a little time! And it works ALL the WAY DOWN TO XP/Server 2003!

------------------- FreeDriveSpaceWarning.PS1 ----------------------------

# Change C: to whichever letter you'd like monitored

$driveinfo=get-wmiobject win32_volume | where { $_.driveletter -eq 'C:' } | select-object freespace, capacity, drivetype, driveletter

# 10 percent 10/100

$Percent=.10

# XXX % of maximum space is our warning level
$WarningLevel=$driveinfo.capacity *  $Percent

if ($driveinfo.freespace -lt $WarningLevel)
{
send-mailmessage -from "Server01 <server01@contoso.local>" -to "MyAdministrator <networkAdmin@contoso.local>" -subject "I'm all out of space" -body "I'm all out of space, I'm so lost without you, please get me some more now, the users are crying...I have $($driveinfo.freespace) bytes leftover! Hurry!" -priority High -dno onSuccess, onFailure -smtpServer smtp.contoso.local
}

------------------- FreeDriveSpaceWarning.PS1 ----------------------------

There you go.  

Free for the Holidays! :)

Sean
The Energized Tech

Hello World 2010!


I bid farewell to 2009, an interesting year for myself and my family on so many levels.  

There were things I wish that did not happen in 2009 that did, and there are things I wish DID happen that did not.

And there are things that happened to me in 2009 I wouldn’t have traded anything for.

So the new decade is upon us. In these changing times, I offer up a toast to all.

May we
The many and the confused
Find a common goal and solution
To bring us all to a brighter day

That our children and their children
Will remember us all as one
As a people that finally looked past
our differences

and Discovered what we all have in Common

Our families

Happy New Year all!  Happy 2010

Your Friend

Sean
The Energized Tech

Oh what a treat to find floating on Twitter today.  I am about the biggest fan of the Hitchhiker’s Guide to the Galaxy ever.  Douglas Adams was an absolute wonder to the mind; reading his books brought Humor and Science Fiction together on a new level.

 

Somebody found an actual performance of “Marvin the Paranoid Android” singing “Paranoid Android”

 

If you enjoyed the books, you’ll love this!

“Life, don’t talk to me about life.  Pain running up and down my diodes…”

 

Ok I FINALLY got a copy of this online.

If any of you don’t know, I was granted the opportunity to speak at Techdays in Calgary as well as in Toronto.   I’ve never been so floored.

While in Calgary I did a presentation on the Windows Recovery Environment as well as Free Tools for IT Professionals (co-presented with Rodney Buike of Microsoft Canada).  I worked with and met a lot of people whom I looked up to not only as heroes but as role models.

And then the unexpected happened.   Not only did I get to see Adam Carter of “The Edge” but I was interviewed after doing the most tempting of all tasks.

I touched the Microphone from “The Edge”.  Wow!  That was cool. 

[photo: the microphone from “The Edge”]

Only to have myself interviewed.

Here it is folks.    A former “Funny Guy” whose passions went a little overboard.   Yes, I still talk fast :)

And just keep this in mind.   This can be YOU too.  Take that chance and try.

We all are imperfect.   But we can all share.  And look where it might take you.

 

 

And to Microsoft, all the MVP’s, the Community?  Thanks for a GREAT YEAR!

Sean
The Energized Tech

 


Powershell

Last time we played with having a script become a module and passing through variables from the command line. 

Which is nice.   VERY nice.  “Life is good” Nice.

And a lot of the time, the system will just “let it run” if you forget to type in variables.  It might not do anything actually USEFUL but it will run.

But we’re going to take the last module we made and give it some “fallbacks”.   Plug in information IN CASE you forget.  Or the next admin does.

Having Defaults is a good thing you know. :-)

So here’s our last module.

---------------------------------------------

New-Module -scriptblock {

function ListUsers ($startdate, $enddate, $ou)

{

connect-qadservice -service 'contoso.local'

get-qaduser -searchroot "contoso.local/$ou" |  where { ($_.Creationdate -ge $startdate) -and ($_.Creationdate -le $enddate) } |  select-Object Name

}

Export-ModuleMember -function ListUsers

}

--------------------------------------------

Now, “what if” I forgot to type in either date?  The end result is a big load of nothing, unless you had a user created on the EXACT DATE you ran this.   So, pretty useless.

Now there are very fancy ways to do this, but the point of any of my posts is for new Admins to get a FEEL for Powershell and learn from there.  And honestly? I’m not that good.  So I go the lazy way.

So for my first check and balance, I want to make sure the End date, if it’s not supplied, is assumed to be at LEAST today.

And with a little “IF statement” that’s fixed.

IF ( $enddate -eq $NULL) { $enddate=get-date; write-host 'Ending Date not supplied, Today Assumed' }

And of course, it would make sense to have the Startdate in this particular module to be something BEFORE the End Date, again otherwise, useless results.   But I want to make sure that information is supplied.  If it isn’t, I’m going to tell the silly Administrator and stop the module there and then.

IF ( $startdate -gt $enddate) { write-host 'The Starting Date MUST be BEFORE the Ending Date'}

In fact with this in mind, I’m going to make sure the module never runs if this information is not supplied by dropping in an “ELSE” that incorporates the rest of the script.

Now there is one REALLY important detail I almost forgot.   You see, unless you TELL Powershell that the information you’re passing in is a DATE, it’s going to take that nice little ‘4/20/2009’ and figure it out like math!  *oops*.  So your “date” will end up being some stupid amount like ‘9.95520159283226E-05’.

So to fix that, we’re going to tell the module that certain variables are to be treated as a date/time.   And that is done by prefixing the variable with [datetime].

Yes, so if you just “skipped off” after Post # 2?  Your ummmm module?  Not working no good :)

So there’s a lot more we can do here but we now have a script that looks like this.

---------------------------------------------

New-Module -scriptblock {

function ListUsers ([datetime]$startdate, [datetime]$enddate, $ou)

{

IF ( $enddate -eq $NULL) { $enddate=get-date; write-host 'Ending Date not supplied, Today Assumed' }
IF ( $ou -eq $NULL) { $ou='Users'; write-host 'Organization Unit not supplied, Assuming default Users container' }
IF ( $startdate -eq $NULL) { $startdate=$enddate; write-host 'Starting Date not supplied, Today Assumed' }

IF ( $startdate -gt $enddate) { write-host 'The Starting Date MUST be BEFORE the Ending Date'}
ELSE
{

connect-qadservice -service 'contoso.local'

get-qaduser -searchroot "contoso.local/$ou" |  where { ($_.Creationdate -ge $startdate) -and ($_.Creationdate -le $enddate) } |  select-Object Name

}
}

Export-ModuleMember -function ListUsers

}

--------------------------------------------
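So now (the dates are made up, purely for illustration) a call that leaves off the OU falls back gracefully instead of failing:

LISTUSERS '8/1/2009' '10/31/2009'

Organization Unit not supplied, Assuming default Users container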

Next time I’ll touch on something the example in the Powershell help showed us: how we can add a pseudo “help line” to our module.  There are a few ways of doing that.

But until next time, Keep on Scriptin’!

Sean
The Energized Tech

Powershell

Ok, it’s not one line, but here’s what I have just done to Powershell in my profile.  I created a function called “ChangeDomain”.  It’s not a fancy function; it just builds TWO global variables, one called $DOMAIN and the other called $CREDS.

The nice part is it’s VERY easy to add alternate domains to the list, and very easy to use and NON-DESTRUCTIVE! And with the Quest Active Roles Management shell it’s incredibly easy to use these new variables.

So I can do this now

CHANGEDOMAIN

 

and it will give me a list of Domains like this

[screenshot: the numbered list of domains]

Key in a number from 0 to whatever to choose and let it populate the variables.

And now you can run any Quest Cmdlet like this

GET-QADUSER -service $domain -credential $creds

 

And it will connect and work on the foreign domain.   Yes, you could just type it all in.  But the important part of this little function is that it makes managing multiple Domains EASIER for techs, by just giving them a predefined list of accounts and Domain IPs and showing them some basic stuff from Quest.

 

And isn’t that the important part?  Making life easier on ourselves?

Sean
The Energized Tech

----------------------- Begin Global function changedomain ----------------------------------------

function global:ChangeDomain ( $PreChoice ) {

# Check whether Quest Active Roles Snapin is loaded.  If not, load it. 
# If it can’t the whole function should do nothing
#

$STATUSLOADED=$FALSE

$SNAPIN='Quest.Activeroles.ADManagement'

ADD-PSSNAPIN $SNAPIN -erroraction SilentlyContinue

IF ((GET-PSSNAPIN $SNAPIN -erroraction SilentlyContinue) -eq $NULL)
     {

          WRITE-HOST 'This Script requires Quest ActiveRoles Management Shell'
          WRITE-HOST 'Which can be downloaded free from http://www.quest.com/powershell/activeroles-server.aspx'
     }
ELSE
    {
          # If it DID, Flag Status as GOOD
          #
          $STATUSLOADED=$TRUE
     }

IF ($STATUSLOADED)

{

#
# Defined list of Domains
# First field is Domain name, Second field is the FQDN or NETBIOS name of a DC
# for that domain, Third field is the IP address of the DC, Fourth field is an account
# with Domain Admin rights to work in the Domain
#
# The variable just uses the IP address right now but you can modify it to
# use the fqdn/netbios name.   Whatever you are comfortable with
#

$DomainList = ("SMITH","dc.fqdn.local","192.168.1.5","Administrator"),("JOHNANDJOHN","dc.fqdn.local","192.168.1.5","Administrator")
$DomainList += ,("BIGLAWFIRM","dc.fqdn.local","192.168.1.5","Administrator")
$DomainList += ,("TEST","dc.fqdn.local","192.168.1.5","Administrator")
$DomainList += ,("SOMEBODYINC","dc.fqdn.local","192.168.1.5","Administrator")

Do
{
$DONE=$FALSE
$CountDomains=0

WRITE-HOST 'Which Domain are we managing Today?'
WRITE-HOST '-----------------------------------'
WRITE-HOST ''

FOREACH ($Domain in $DomainList)
{
Write-Host $CountDomains, $Domain[0]
$CountDomains++
}

# Cast to [int] so the range check below compares numbers, not strings
[int]$Choice=READ-HOST ('( 0 - '+($CountDomains-1)+' )')
IF ($Choice -ge 0 -and $Choice -lt $CountDomains)
    { $DONE=$TRUE }
ELSE
    { $DONE=$FALSE ; Write-Host $Choice; Write-Host $CountDomains; Write-Host 'Please Make a Correct Selection' }
}
# When the $DONE Variable contains a Boolean $TRUE the loop ends.

Until ($DONE)

# Using $GLOBAL creates a variable that can be accessed globally, not just locally

# Creates variable $CREDS as a GLOBAL variable

$GLOBAL:CREDS=Get-Credential -Credential ($DomainList[$Choice][0]+'\'+$DomainList[$Choice][3])

# Creates variable $Domain as a GLOBAL variable

$GLOBAL:Domain=$Domainlist[$Choice][2]+':389'

Connect-QADSERVICE ($Domainlist[$Choice][2]+':389') -credential $CREDS

}

}

------------------------------------------------------------------------------------------------------

Not my favorite device on the planet, but I have to use it.  And it works OK most of the time (between having to re-flash the silly firmware).  But it has a trackball.

A little trackball to move the mouse pointer. A teeeeeeeeny weeeeeeny little trackball.  And in the last year I’ve had to swap out about a dozen for sticking and jamming on different units.  On the weekend at Christmas, mine got stuck REALLY good.  The unit was rendered useless.   It would only move “left and right” unless I beat on it.

Now I could go online and order one or wait for three days until I got back in the office and “harvest” one from a bad unit.

But rumor was they were fixable.   That, just like the older mice, they stuck for the normal reasons.

DIRT! FLUFF! “Gunk”.  Little aliens living underneath.

So it was time to play operation and disassemble this little pest.

On the Blackberry Curve there is a small plastic Circlet protecting the Mouseball.   Gently (OH SO GENTLY) pop that sucker free with a small flat blade screwdriver, a butterknife or your fingernail.  Whatever is handy.

Just be careful, since there are some little clips that are needed to hold this down.  Damage these and you will lose your ball.  (You in the back, stop laughing!)

You can flip the unit over at this point and the ball should just fall out in the white plastic case.

Now on EACH side there is a very Very VERY tiny little metal clip.  You will need a steak knife or a VERY fine tip to very gently nudge these free.   They are just clips.  They don’t need to be forced to get off the assembly but they DO need to be moved very delicately in order to NOT misshape the frame.  You will need to re-assemble this later.

Once the enclosure is off, you (if it’s like mine was!) will probably see some “Special padding”

It’s not “Special padding”.  Who would you PAY to put that in there?

No it’s lint, or dirt, or “something” that shouldn’t be there.

The only thing you should find inside is a somewhat dirty little ball and four tiny little plastic rollers with a micro-small black magnet on the end of each.

Yes.  They are magnets.  I found out when I put them on the desk and they all just STUCK TOGETHER.   They’re not terribly strong, but like magnets, they like to stick to screwdrivers.  Well because they ARE magnets. (duh!)

At this point you can remove each little roller carefully and look for dirt or cracks.  If it’s dirt, clean it with rubbing alcohol.  If it’s a crack, give up and go buy a new ball for $30 online.

Once you’ve removed all the offending material, just carefully put it ALL back together the way you found it.  The metal frame can only line up one way on the bottom so DON’T FORCE IT.

It WILL require a little gentle nudging to get it to all go back together.  Just a little.    And it MUST oh it MUST MUST MUST be PERFECTLY clicked back together. 

If it isn’t, then one of the sides isn’t “quite” snapped down.  You’ll find the sucker rolls “this way and that” but not quite the “other ways”.   (In short, it ain’t workin’ right!)

If you do it right, you’ve got a working (or more working) ball.  

And if not, well it was probably pooched to start off with :)

 

Good luck and don’t panic!

Sean
The Energized Tech

Powershell

Others may already know this.   But this HAS to be shared!

When you use the Quest Active Roles Management shell for Powershell you get a bunch of really cool Cmdlets that make life as an Administrator beautiful.

Up to this point if I wanted to manage a different domain I have been doing this

$CRED=GET-CREDENTIAL

(A window appears and I type in the DOMAIN\UserID and password for the Domain I’m trying to Administer.)

CONNECT-QADSERVICE -service 'fqdn.contoso.local:389' -credential $CRED

(Connect to foreign Domain DC)

GET-QADUSER

(or OTHER Quest Cmdlet)

 

My problem has been that I’m writing a script / module that can just “switch domains” so I can run the Cmdlets without a headache.  Because I need to switch domains on the fly.  I also want to have OTHER admins (independent of me) use this same module to their own advantage.

Now, I learned all about_scopes in Powershell, but when I did a “CONNECT-QADSERVICE” in the module, it would work but immediately drop back to whatever the local scope was before the module.

Then I found out I can just connect to the DC DIRECTLY in the GET-QADUSER CmdLet!

GET-QADUSER -service 'fqdn.contoso.local:389' -credential $CRED

 

I’ll post the module later on today so you can see what I was after! 

Oh this is COOL!

Sean
The Energized Tech

Wow! This made my day!

I stumbled across a silly problem in DPM 2007 a few months back (a Service Pack naming issue with the 64-bit and 32-bit versions) and blogged about it, thinking “If only this helps ONE person, it’ll be worth it!”

 

Well what do you know?

A few months after the blog posting, somebody stumbled across the same problem, did a search and found my blog post.    And it helped them solve the problem a little faster!

 

So for all of you out there thinking that “Nobody will benefit from your blog post” or that “Mine isn’t good enough to be of use to anybody”

“Bzzzzzzzzzzzzzzzzzttt!!!!”

 

Wrong answer.  Post it.  Even if it’s “I found that doing this in the Colour Green makes things crash less for me” POST IT!

 

You might just discover that others share your problems, and sometimes you’ll have a piece of the solution.

Or the whole solution, without even knowing it.

 

Sean
The Energized Tech

Powershell

We’re going a little deeper into turning a script into a module.  And again, don’t scare yourself off!  They are easy to deal with!

So here today, I’ll go the other way around.   I’m going to take a simple script with a few built-in Variables and modify it into a module where I can type IN those variables every time.

Here’s one I use on a regular basis.   It’s a simple thing that uses the Quest Active Roles cmdlets and pulls up a list of users created in a specific date range under a particular OU.

-----------------------------------------------

$startdate='8/1/2009'
$enddate='10/31/2009'

connect-qadservice -service 'contoso.local'

get-qaduser -searchroot 'contoso.local/Users' |  where { ($_.Creationdate -ge $startdate) -and ($_.Creationdate -le $enddate) } |  select-Object Name

-------------------------------------------------

Not a very fancy script, but one I find handy.     It looks at users in a particular OU in Active Directory, and lists those created within a particular date range.

Now, rather than going into the script EACH and EVERY time, wouldn’t it make more sense to just turn it into a useful command and TYPE in what you wanted?

That’s a module.

So all we have to do is specify the variables that will PASS THROUGH from the command line.   Last time it was the variable called ‘$Name’ that was sent to be echoed to the screen.  Yes, it was a lame but simple example.   This time we’re going to pass THREE variables to the module, $startdate, $enddate, and $ou, so that we can simply call up our module and have it as a useful regular command in the Shell environment.

So this script as a module looks like *THIS*

---------------------------------------------

New-Module -scriptblock {

function ListUsers ($startdate, $enddate, $ou)

{

connect-qadservice -service 'contoso.local'

get-qaduser -searchroot "contoso.local/$ou" |  where { ($_.Creationdate -ge $startdate) -and ($_.Creationdate -le $enddate) } |  select-Object Name

}

Export-ModuleMember -function ListUsers

}

--------------------------------------------

As you can see, all we’ve done is take the variables we want to pass through from the command line, put them into a function, and then Export that function at the end.   That’s it.

Once you execute this script you will have a new command called “LISTUSERS” that you can run from the Powershell environment.  Note that, unlike a shell script, you do NOT have to specify a path to a filename to make it execute.   And you can easily add in the required parameters.   And just like a shell script, you can check the input values and verify that they meet your syntax or whatever you choose.
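So now (the dates and OU here are just an illustration) you can simply type

LISTUSERS '8/1/2009' '10/31/2009' 'Users'

and get the same list the original script produced, with whatever range you like.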

Note: the command name “LISTUSERS” is derived directly from the original function name, because we “exported” that function.

Next time we’ll play a little deeper, like showing how to have the variables default, and EVEN how to have your module check for dependencies (like its needed snapins) before running.

'Til Next Time

Sean
The Energized Tech


It’s been an interesting year for this overly wired IT Pro!

I’ve been involved with a user group and helped deploy our new website.   I’ve been diving into Powershell and talking about it so much my face is turning Blue!

(and with Powershell THAT is a good thing!)

I got to be involved in the full scale deployment of one of the most INTERESTING and COMPLEX hardware / software setups of my entire life, got through a nasty thing with the Taxman, and spoke at one of the biggest conferences here in Canada. TWICE!

I flew on a plane for the first time ever, and even did a Demo in front of a crowd that was acknowledged by a REAL guy at Microsoft as one of the best he’s ever seen.

AND Boy was I blushing after that!

Who knows what life holds for any of us next year, but I will offer this to all of you out there.

If you take the chances and make those mistakes, you will learn and things WILL get better, even on a small level.  

And each and every one of us, in our own SMALL way can add up into a greater solution that none of us can foresee; If we only take that chance.

So in 2010, let’s look forward to the future and what it holds.  Let’s see how each and every one of us, in that small way, can shape it and brighten it for a better tomorrow!

A Happy Holidays to all in HOWEVER you celebrate them :)

Sean
The Energized Tech

Powershell

Ok let’s take a crack at this feature in Powershell V2 called Modules.

Modules really aren’t all that difficult when you get down to it.   It’s very much like taking a script and loading it into memory. 

Well maybe not exactly like that but the cool part is when you turn that script into a module (as long as it’s loaded) you don’t have to keep specifying the path to the script, and you could even pop it into your Profile file (since it’s just text) and add functions that make YOUR admin life better without trying.

So how hard is it?

Would you like the fact that the Powershell team GIVES you a simple sample module in the Help file?

In Example 3 (I like this one the best) under the “New-Module” help they give you this example

 

------------------ Sample from Microsoft Powershell ISE Help System – Example 3 – New-Module -------------------------------

C:\PS>New-Module -scriptblock {$SayHelloHelp="Type 'SayHello', a space, and a name."; function SayHello ($name) { "Hello, $name" }; Export-ModuleMember -function SayHello -Variable SayHelloHelp}

C:\PS> $SayHelloHelp

Type 'SayHello', a space, and a name.

C:\PS> SayHello Jeffrey

Hello, Jeffrey

Description
-----------

This command uses the Export-ModuleMember cmdlet to export a variable into the current session. Without the Export-ModuleMember command, only the function is exported. The output shows that both the variable and the function were exported into the session.

------------------ Sample from Microsoft Powershell ISE Help System – Example 3 – New-Module -------------------------------

 

The nice thing about this example is I can EASILY break it down so it looks like a script and you can VERY easily see how to reverse the process.

 

The same module can also look like this, and it is the EXACT same code.  I just haven’t left it as one line. Remember, a semicolon ‘;’ is used to TIE statements together on a single line.  You DON’T have to have all the code on ONE line just to make the module work.  And honestly, if I’m writing it, I like it broken up.  It makes more sense to my eyes, and to SOMEBODY ELSE’S eyes too!

 

--------------------------------------------------------------------------------

# BEFORE
#
# New-Module -scriptblock {$SayHelloHelp="Type 'SayHello', a space, and a name."; function SayHello ($name) { "Hello, $name" }; Export-ModuleMember -function SayHello -Variable SayHelloHelp}
#

# AFTER
#

New-Module -scriptblock {

$SayHelloHelp="Type 'SayHello', a space, and a name."

function SayHello ($name)

{
"Hello, $name"
}

Export-ModuleMember -function SayHello -Variable SayHelloHelp

}

---------------------------------

 

And this module if it was just a script would look like THIS

 

---------------------------------

# This is a function

function SayHello ($name)

{
"Hello, $name"
}

# this is the script part that will use the function

$somename='EnergizedTech'

SayHello $somename

SayHello 'EnergizedTech'

-----------------------------------

 

So really all you’re doing is taking a script and where you want to send variables on a regular basis, instead of assigning them IN the script or typing in literals, you’re just typing them on the command line.

 

We’ll take a slightly deeper look at how you can take this and customize this to your own needs next time.

 

Happy Holidays.  Don’t Script and Drive!

Sean
The Energized Tech

 

Start Date/Time: February 8, 2010 6:00 PM
End Date/Time: February 8, 2010 8:30 PM
Location: Microsoft Canada HQ, Mississauga
Description:


I am excited to announce to the members of the ITPro Toronto UG that Microsoft is coming to town to host a session on Monday, February 8th, 2010, entitled “Deployment Deep Dive on Windows 7”. You are invited! This exclusive event is hosted by Microsoft user groups across Canada. We’re opening our doors to other passionate technical professionals and I want to give you the first opportunity to register for the event. REGISTER NOW to attend.

User Group Event- Deployment Deep Dive on Windows 7
WELCOME TIME: 6:00pm-6:30pm
EVENT START TIME: 6:30-8:30 pm
LOCATION:
Microsoft Canada Headquarters

MPR A/B/C

1950 Meadowvale Boulevard
Mississauga Ontario L5N 8L9
Canada

SESSION OVERVIEW: Are you running Windows XP?  Are you feeling the pressure of creating a deployment plan? Have no fear!  Leveraging learnings from two Windows 7 early adopters, this session will give you the skills you need to proceed with your own deployment.  The session will focus on free Windows 7 deployment planning and deployment tools that customize operating system packages and automate deployment planning and network deployments seamlessly. We will dive right into:

  • How to use the Microsoft Assessment and Planning (MAP) tool to identify your current hardware and application inventory.
  • How to use the Windows Automated Install Kit (WAIK) to build a customized image for your organization.
  • How to use the Microsoft Deployment Toolkit (MDT) to build, deploy and maintain Windows installation images.
  • How to migrate the end users profile from their current installation to the Windows 7 installation using the User State Migration Toolkit (USMT).
  • How to integrate MDT and Windows Deployment Services (WDS) to perform Lite Touch installations of Windows 7.

Finally we will look at how we can leverage the various tools to solve any application compatibility issues you might encounter.  We will look at how you can overcome common obstacles using the Application Compatibility Toolkit (ACT), or larger obstacles using Windows XP Mode and Microsoft Enterprise Desktop Virtualization (MED-V) and even how you can leverage Application Virtualization (App-V) to streamline application deployment and ensure all your applications work!

I look forward to seeing you at this event.

Sincerely,

Russell Onizuka

ITPro Toronto UG

www.ITProToronto.ca

 

Register Now

 

Powershell

A very good friend I’ve been in contact with on Twitter, a die hard Mac/Linux guy who works in Windows, sent me a message.

“Need a vBscript to Query Active Directory for old accounts and disable them”

At which point I immediately pointed out “Powershell” as the quicker and more direct solution.

He in return sent back the specs to what he needed.

  • Go through and disable accounts over XXX days
  • Go through and delete accounts older than yyy days
  • Accounts which are flagged with key words in the “Description” field should NOT be deleted

This, in Powershell with the Quest Active Roles cmdlets, is like a walk in the park.   Really, it IS!

In Active Directory, there is a field, “LastLogon”, which holds the last time you successfully logged in.  And we can very easily compare that with the current date.

Quest has a cmdlet built right in to disable users.  It’s just

DISABLE-QADUSER identity

Where identity is the name you’re disabling in Active Directory

You can also EASILY remove objects in Active Directory with the Quest Cmdlets.  Remember, everything (OUs, computers, users) is just an Object in A/D, so to pull that little miracle off just run a

REMOVE-QADOBJECT identity

Where identity is the name of the object you are removing.  Careful with this CmdLet.  I ALWAYS run it (and any other destructive ones) with a -WHATIF parameter to make sure it’s going to do what I want.  If you mistype, you can delete a LOT of things easily.

In the Active Directory Notes field (typically under the “Telephones” tab) we’re going to put the word ***OVERRIDE*** to indicate the account NEVER gets deleted automatically (unless we remove that word from the Notes field).

Also in Active Directory, under the Description, we’re adding the words “On Leave Until” to indicate the user is gone for a specified period of time.   This will allow us to have a script that can look and shut things down without us doing anything.

In this rendition I’ve left in the -WHATIF parameter to cover the bases of “Murphy”.

It also echoes to the screen the status of Each account that is potentially going to be deleted or not.  Accounts to be disabled are just disabled without notice

Let me know how this works and be VERY VERY VERY VERY careful using it.  I highly recommend practicing this on a TEST domain or TEST OU at the VERY minimal before going production.

A special shout out goes to my buddy @moldor in Australia, without whom this script would NOT have existed.  He created the need and the requirements.  If you’re looking for a great Mac/Linux guy who’ll even work with Windows in Australia?  Give him a shout!

Sean
The Energized Tech

----------------------------------------------- AutoCleanADUsers-Stale.PS1 -------------------------------------

# Uses Quest Active Roles
# Free to download http://www.quest.com/powershell/activeroles-server.aspx
#
# Get all users in that have not logged on within
# XXX days in "Active Directory" and Disable them
#
# Get the Current Date
#
$COMPAREDATE=GET-DATE
#
# Number of Days to check back.  90 days
#
$NumberDays=90
#
# Number of Days to check for REALLY Stale accounts
# Our sample here is taking "OldAccounts" and pumping up
# 30 more days.  Therefore 120 days old accounts that haven't
# logged in should be purged
#
$DeleteDate=$NumberDays+30
#
# We have certain "Override fields" that bypass a delete
# happening.  If the "Notes" field in A/D contains the
# EXACT Override phrase ANYWHERE (in this case it is the
# word ***OVERRIDE*** and it IS case sensitive)
# The account will NEVER be deleted (unless of course you remove
# the word from the Notes field)
#
$OverRide='***OVERRIDE***'
#
# The other override field is if
# the OnLeave details are in the Description
# Field in A/D.  this allows for a User who is
# Not gone (IE: Contractor / Student) but may
# Return to have the account disabled and
# Left alone until they return.  The words here are
# simple On Leave Until and can be ANYWHERE in the
# Description Field in A/D
#
$OnLeave='On Leave Until'
#
# Organizational Unit to search – This is in the fictional domain of
# ‘Contoso.local’ in the OU of Users under the Business OU on the Root
# of the Contoso A/D
#
$OU='Contoso.local/Business/Users'
#
# Get all users not active within the specified range and disable the accounts in Active Directory
#
# We store them away as a variable since we're going to examine the list a few times.
#
$LISTOFACCOUNTS=GET-QADUSER -SearchRoot $OU | where { $_.LastLogon.AddDays($NumberDays) -lt $COMPAREDATE }
#
# Any account not logged in within the short range gets Disabled in AD
#
$LISTOFACCOUNTS | DISABLE-QADUSER -whatif
#
# Pull up a new list.   Really old accounts
#
$LISTOFPOTENTIALDELETES=$LISTOFACCOUNTS | where { $_.LastLogon.AddDays($DeleteDate) -lt $COMPAREDATE }
#
# Secondary compare is more interesting.  If the accounts are VERY stale, they get deleted UNLESS special keywords
# are in place
#
#
#
FOREACH ($USER in $LISTOFPOTENTIALDELETES)
{
    IF (($USER.Notes -notlike ('*'+$OVERRIDE+'*')) -and ($USER.Description -notlike ('*'+$OnLeave+'*')))
    {
        REMOVE-QADOBJECT $USER.Name -whatif
        WRITE-HOST $USER.Name 'Deleted'
    }
    ELSEIF ($USER.Notes -like ('*'+$OVERRIDE+'*'))
    {
        WRITE-HOST $USER.Name 'Not removed due to Administrative Override'
    }
    ELSE
    {
        WRITE-HOST $USER.Name 'Not removed - Presently on Leave'
    }
}

----------------------------------------------- AutoCleanADUsers-Stale.PS1 -------------------------------------


Here’s one that drove me nuts and I finally decided “I’ve had enough! I’m going to sit down and tell others how to avoid this problem!”

When you work with Powershell and various applications you have one BIG Bane to deal with.

“Powershell Snapins”

They are the STRENGTH and the WEAKNESS of your Powershell scripts.

Strength because of the abilities they give you and Weakness because unless you have a Shell like I do where all the Snapins are just LOADED up and ready to go?   You’ll get lots of Red errors as the Script tries to execute CmdLets it doesn’t know about yet because the darn Snapin didn’t get loaded up for that session.

And honestly, loading up a Pile of UnNeeded snapins just Bogs things down.

And the solution is very simple.  Add a few lines to the top of your script to CHECK for the Snapins you need and try to add them in first.  Of course we should document the heck out of our scripts anyhow, but shouldn’t the SCRIPT be doing the work for us?

So at the top of my scripts that use the Quest Active Roles Management CmdLets I’m going to be adding these few lines.  Their job is simple.  Check for the Snapin; if it’s not there, load it; if it is, carry on; and if it can’t BE loaded (say you errrrr don’t actually HAVE the management Snapin?) throw an error to notify the person running the script.

 

So all you have to do is run

ADD-PSSNAPIN NameOfSnapin -erroraction SilentlyContinue

 

This tries to add the Snapin first.  The “-erroraction SilentlyContinue” will hide those nasty Red letters if things went bad and it’s not there to load.

 

GET-PSSNAPIN NameOfSnapin -erroraction SilentlyContinue

and we can check the status of the snapin.  If it loaded ok (We had the module) we’ll get results, otherwise the results are $NULL and we can inform the user.

 

So now my Scripts (since they rely on the Quest Active Roles Management Powershell Snapin) will look like THIS, to ensure that no matter what machine I’m on, it will check, load and warn before I run the script and see a kabillion red errors because I forgot to type something.

 

It might seem cumbersome, but if you get into this habit your scripts will be a lot more portable and friendlier for other admins to use.

And REMEMBER!  You DON’T have to have Powershell and your Management tools on the SERVER just to use them!  Almost EVERY SINGLE CMDLET will manage from the Workstation, albeit just a bit slower.

 

Rock on with Powershell

 

Sean
The Energized Tech

------------------------- BeginningOfGenericScript.PS1-------------------------------------------------

#
# Add in Snapins Before launching Script.  I’ve written this, so you could run a pile of these checks
# and until it’s all ‘GOOD’ the script will quickly fail out.
#
#
# Variable to remember the $STATUS of your modules loaded

$STATUSLOADED=$FALSE

# The name of whatever snapin being dealt with

$SNAPIN='Quest.ActiveRoles.ADManagement'

# Try to add the Snapin and be VEWY QWIET if it DOESN'T load
#
ADD-PSSNAPIN $SNAPIN -erroraction SilentlyContinue
#
# Check to see if it successfully loaded
#
IF ((GET-PSSNAPIN $SNAPIN -erroraction SilentlyContinue) -eq $NULL)
    {
        # If not loaded - Notify user with required details
        #
        WRITE-HOST 'This Script requires Quest ActiveRoles Management Shell'
        WRITE-HOST 'Which can be downloaded free from http://www.quest.com/powershell/activeroles-server.aspx'
    }
ELSE
    {
        # If it DID, Flag Status as GOOD
        #
        $STATUSLOADED=$TRUE
    }

# And we treat the WHOLE script as an "IF" to make it work only "IF"
# the required modules have been loaded!
#
IF ($STATUSLOADED)
{
    # Beginning of your normal script here
}

------------------------- BeginningOfGenericScript.PS1-------------------------------------------------


Here’s my Holiday gift to all of you.

I’m a Network Administrator and was a Field technician for far too many years.  We hate wasting time.   

With Powershell and the Quest Active Roles you can easily have a system sitting on your laptop, not attached to any client domain so that you can Administer MULTIPLE domains with a minor change in this file.

Now this is just a VERY basic menu based system I threw together.  It’s not fancy, it’s not pretty.  It gives you a list of domains, pre-populates the User information when you log in, and connects to the Foreign Domain.

You can manually run commandlets inside that shell at this point *OR* you *COULD* write your own and add to this little system.

 

It’s yours to have and to work with. Happy Holidays.

Sean
The Energized Tech

 

----------------------- MultiDomainMenu.PS1 --------------------------------------------------------
#
# Requires Quest Active Roles http://www.quest.com/powershell/activeroles-server.aspx
# For connecting to Foreign domain
#
#
# A Simple menu system for managing multiple domains
# this is the top half.  The fields in each section should make sense
# Field #1 is the name of the NT domain
# Field #2 is the name of the Domain Controller
# Field #3 is the IP address of the Domain Controller
# Field #4 is an account that has credentials in the Domain specified in field #1
# 
# If you need to add more domains just copy this line below and replace the "Field#" with
# the appropriate data and add it to the end of the last entry of $DomainList
#
# $DomainList += ,("FAKEDOMAIN","FAKEDOMAINCONTROLLER","0.0.0.0","SomeFakeAdministratorAccount")
#
$DomainList = ("CONTOSO","DC.Contoso.local","10.0.0.5","Administrator"),("FABRIKAM","SERVER","192.168.1.25","TECH")
$DomainList += ,("JONES","MRJONESSERVER","192.168.0.5","Administrator")
$DomainList += ,("FAKEDOMAIN","FAKEDOMAINCONTROLLER","0.0.0.0","SomeFakeAdministratorAccount")
#
# Very simple menu system, nothing fancy. List the Domains.   
#
Do
{
# Boolean Variable (True/False) to set when the loop is done
$DONE=$FALSE

# There is probably a more direct way, but I cheated and told
# the Loop to just Count up my Domains. :{P
$CountDomains=0

# Echo the information to the screen

WRITE-HOST 'Which Domain are we managing Today?'
WRITE-HOST '-----------------------------------'
WRITE-HOST ''


FOREACH ($Domain in $DomainList)
{
Write-Host $CountDomains, $Domain[0]
$CountDomains++
}

# Get the Domain number to work with.  The First is always '0'
#
# Don't finish until you've got a correct entry
#
$Choice=[int](READ-HOST ('( 0 - '+($CountDomains-1)+' )'))
IF ($Choice -ge 0 -and $Choice -lt $CountDomains)
    { $DONE=$TRUE }
ELSE
    { $DONE=$FALSE ; Write-Host 'Please Make a Correct Selection' }
}
# When the $DONE Variable contains a Boolean $TRUE the loop ends.
Until ($DONE)

#
# We're going to pre-populate the "Credential box" but not the password
# The $Choice was from the prompt before this.  $Choice will reflect which
# member of $domainlist you're working on.
# 
# Accessing $DomainList[0][2] will access the FIRST member of the Domain list
# and the 3rd property of that record.  The '+' allows us to join two pieces together
# to let them be treated as one.  Putting the information within a set of ( ) keeps
# it Cleaner to view but also tells Powershell that "Everything here is MINE, don't mix
# it up with the rest of this line!" 
#
$CREDS=Get-Credential -Credential ($DomainList[$Choice][0]+'\'+$DomainList[$Choice][3])
#
# Connect to the Foreign domain with the Supplied credentials
#
Connect-QADSERVICE ($Domainlist[$Choice][2]+':389') -credential $CREDS
#
# Everything below here is up to you.  Take note if all you want is to have this script prompt
# for Credentials and connect you to the foreign Domain?  DONE! :)
# You can manually run GET-QADUSER and various commandlets manually at this point
#
# Future goodies to be added by Powershell people :)
----------------------- MultiDomainMenu.PS1 --------------------------------------------------------

At one point or another you WILL (not might, WILL) have a virtual machine on Hyper-V that needs to get data into it, and you will NOT have network access.

It could be a demo environment.  It might be an unsupported configuration you’re playing with.   Some test environment. 

Maybe the machine needs an update to recognize the Integration services.  Any number of things.

And you’re stuck.

“Grumble grumble grumble”

No you’re not.

A good friend showed me one of the simplest and nicest utilities ever called “Folder2ISO”.  It’s a bit tricky to find, but when you have a copy it’s beautiful.  And TINY.

Folder2ISO creates a .ISO file from a folder structure.  You can of course burn that file, but more importantly in Hyper-V (and many Virtual Environments) you can just mount that sucker.   Ka WHAMMO! You can throw your data in there and life is good.

But guess what?  If you have Server 2008R2 as the host, life gets better.  Why?

Server 2008R2 and Windows7 can directly mount a VHD file.  

All you need to do is enter DiskMgmt.msc (Disk Management)


At the top of the “Action” menu you have two new options, “Create VHD” and “Attach VHD”.  Choosing “Attach VHD” allows you to browse to a VHD file and mount it as a regular drive.


What I like about this is that by checking “Read-only” I can play it safe.  Let’s say I want to get data OUT of that environment (or ANY virtual environment) without hurting the source?  Yes, that “Murphy” character is hiding around every corner with a big surprise when we’re not careful.
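If you’d rather skip the GUI, DISKPART on Windows 7 / Server 2008R2 can do the same attach from the command line (the path below is made up, and READONLY is that same safety checkbox):

DISKPART
SELECT VDISK FILE="E:\VirtualMachines\TestBox.vhd"
ATTACH VDISK READONLY

And a DETACH VDISK lets it go when you’re finished.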

But once it’s mounted I can see the data as any other drive and move goodies in and out.


Oh did I mention the best part?  **** FREE FEATURE **** :)

Season’s Greetings all

Sean
The Energized Tech


One of the most IRRITATING and FRUSTRATING things I had working in the field was dealing with multiple Domains.

You’d have so many clients with their own Active Directory setups and each time you’d have to go directly to the Small Business Server to manage it, or try to have a machine dedicated to that purpose.   Plus you might have tools you’d want to copy to the server to help in that process.

A real pain in the you know what.

But along came Powershell and the Quest ActiveRoles Management Shell, and my life changed for the better.

You see, Powershell can connect to a Domain Controller on its own terms; you just need to validate against it once before you do so.  Which makes things far far easier.

For example, here we have a small simple script that lets you connect to a Domain Controller called ‘dc.contoso.local’ and unlock a user account called John.Smith.

------------------------------------------------------

# Note: uncomment the line below if you need to add in the Snapin for Quest ActiveRoles
# (Software is installed but not part of default Powershell Profile)
#
# ADD-PSSNAPIN Quest.ActiveRoles.ADManagement
#
# Popup and get Credentials for Domain
#
$CREDS=GET-CREDENTIAL
#
# Connect to Domain Controller dc.contoso.local with
# Provided Credentials
#
Connect-QADSERVICE 'dc.contoso.local:389' -credential $CREDS
#
# Now that you are in the domain, unlock the User
#
UNLOCK-QADUSER john.smith

------------------------------------------------------------

 

In truth this is a VERY simple script, but the point you need to understand is that it takes nothing to edit the name of the DC to be an IP address of a different domain on a different site.

The “GET-CREDENTIAL” commandlet does not know or even CARE about the name of the other domain.  All it does is ask you to provide the credentials.  These credentials take the standard format of a UPN or the traditional DOMAIN\Username.  It simply packages them up as a credential object.
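(You can even pre-fill the username in that popup box; the domain and account below are just examples.)

$CREDS=GET-CREDENTIAL 'CONTOSO\Administrator'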

Once the CONNECT-QADSERVICE Commandlet talks to the other domain controller it provides those credentials through the –credential parameter.

Active Directory on THAT Domain takes that information, processes it and accepts it (because ID and password and provided Domain are valid) or not.  There is no magic to it!

Once you have done this process you (as long as that shell is open) are managing THAT domain.   Or at least you are as far as the Quest Active Roles are concerned.

Even if your computer is a member of DOMAIN “ABC.FABRIKAM.COM” and you’re trying to manage the “CONTOSO.LOCAL” domain, it doesn’t matter!  You can create, delete, unlock or Query the Active Directory for whatever information you need.  

 

Isn’t it nice when life is simpler?

 

Sean
The Energized Tech

I stumbled into an interesting problem.   Well it’s not a problem but it’s something I realized I should look at.

I got to deal with some servers with a LOT of ram (ok, 32gig in one and 72gig in the other) which is well… A LOT.

One of the problems I noticed was this WHOPPER of a pagefile on each one, since naturally the system generates a PageFile based upon your physical ram.

And that’s fine since the guys who designed the O/S have to assume WORST CASE ENTERPRISE USE of that system.

But I thought, “How do I tell what’s REALLY being used?  Does the Pagefile REALLY need all of that space on the disk?”

You can.  And it’s surprisingly very easy.

This article on Technet goes into tuning for Server 2003 and Windows XP with 64bit versions and with some minor changes (VERY minor) it works for 64 Bit versions of Vista, Server 2008, Windows 7 and Server 2008R2.

It’s something we need to look at as technicians and Administrators as we get into computers with larger than 4 gigabyte physical ram installs, which are getting more common as memory gets cheaper.

The article references “Performance Logs and Alerts” which has been replaced by the “Performance and Reliability” monitor in Vista/Server 2008 and the “Performance Monitor” in Windows 7/Server 2008R2.  But the technique is identical.

Now in the article, you can choose whichever method you’re most comfortable with.  I prefer #2 since it pretty much shows me the “Worst case scenario” and very quickly.  That’s what I’m interested in.  The scenario of “Do I REALLY need to have a 72 gigabyte pagefile wasting valuable storage space?”

So under Administrative tools when you start up the new version of “Performance Monitor” you’ll see a window which really does look the same as its older counterpart.


Next we need to right-click on the “Performance Monitor” tab and choose “Properties” to bring up the Properties window.


So at this point you can just follow the instructions.  Mind you, this is a live view of the situation at hand, but if you let it run, it should meet the need.

So click on “Data”, then “Add”.  In the list choose “PROCESS”.  In Windows 7 you will have to click on the little arrow to the right of Process to expand the counter list.


From this expanded list you must choose the counter called “Page File Bytes Peak”.

Once you have that selected click the “ADD” button to bring the counter under “Added Counters”.


Click “OK” which will bring you back to the main configuration window.  A few more clicks and you’re done.  Click on the “Graph” tab next.

Change the “View” from “Line” to “Report” to just give you a summarized version of the data.

Once done you’ll have a view in the Window that reflects the most Pagefile ram that has been used (the “Peak”), which means your file shouldn’t need to be too much larger than that.


If the maximum is close to what you have and things are working fine?  Leave it alone of course.  But if you have say oh, a wasted 32 gig+ page file and you find the system never gets closer than 8 gig of that, you can probably go into your settings and trim it down by 20 gig.
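And because I’m me, here’s the same peak counter pulled from Powershell instead of the GUI (a quick sketch; it needs Powershell 2.0’s GET-COUNTER, and “_Total” rolls every process together):

# Grab the peak Pagefile bytes across all processes and show it in MB
$PEAK=(GET-COUNTER '\Process(_Total)\Page File Bytes Peak').CounterSamples[0].CookedValue
WRITE-HOST ('Peak Pagefile usage: {0:N0} MB' -f ($PEAK/1MB))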

If this is a server, you should seriously run this test for a much longer time than it took to read this blog and definitely take all three methods into account.  But it should help reclaim wasted space and, I think, even help performance somewhat.

Take it easy all

Sean
The Energized Tech

Did you ever just sit back for a moment and THINK how computers have ADDED to the English Language and changed it?

I don’t mean so much the “Hey I have a Spellchecker and I can actually type without knowing how to spell” part, but more the new words and slang terms that have sprung from computer people.

If you came up to somebody 15 years ago and said “I coastered one” they would have thought you went on a wild and wonderful ride!  And possibly displayed your choice of eating afterwards!  But no!  You were of course referring to “Failing to properly burn a CD and turning it into nothing better than a ‘coffee coaster’”.  And those of us in the field remember the term well!  For those first generation CD Burners were not only expensive, but so was the media.

They sucked, they were unreliable but OH SO CUTTING EDGE!  And when you burned a Disc, you were probably burning on a $35 blank.  And if you FAILED to burn it properly, you just couldn’t have the heart to throw it out; I mean after all, it WAS expensive.

So you used them for “Nerdy Coffee Coasters”

And “Bricking” something.  I had a user look at me as if I was nuts.  “Bricking” is a term very familiar to IT Pros (not so much devs): after applying a needed update to a manufacturer’s hardware, upon restart, it became useless.  As useless as a …. well …. BRICK.  The devices that were the victims of these “Bricks” were often small too.  Cell Phones, MP3 players.  So you often did have a useless plastic Brick.

What about a “Bug in the System”?  There’s an all too common term.  People accept it as a common term now.  Hard to realize its origins some days!  Back in the old ancient days of early computers, a system problem was traced down one day to a dead moth stuck in a relay!

And how about some of our more common acronyms? PEBKAC, ID10T errors, FUBAR, LOL and all the terms most people take for granted!  I still remember when “Smilies” were a secret joke shared amongst us nerdy guys.

NOW THE DARN THINGS ARE IN ADVERTISING! :O

Downloading, Ripping, Logging In, Geeking out, Getting Nerdy!  Who ever thought that being a “GEEK” about something would be considered “Trendy” or “K001” or the terms we used would become a part of everyday life.

Oh yes, that other bit.  “1337 Speak!” (“Leet Speak”)

Typing in Letter / Number combinations to form words that look so bizarre to normal humans.  Using extended members of the ASCII character set to decorate our usernames.  People accept this (especially kids!) as normal!

Now new terms are popping into the language.  “Twittering”,”Accepting a Friend”,”Social Networking”.  It IS quite hard to believe that barely two years ago NONE of these words would have been spoken by Executives at ANY Fortune 500 company.

Now if you don’t speak it, you may as well get ready for the nuthouse. :P

 

W@t a w1ld 1nrfl w0rld :)

Sean
The Energized Tech


If you’re following best practice and running your Exchange Management from anything OTHER than your server, you might have stumbled into a small problem when you go to use the “Message Tracking” tool.  It doesn’t work.

The Reason is simple.  The Generated Script from the console “ASSUMES” (You in the back, keep the dirty joke to yourself) you’re running it on the server.

But the provided CmdLet still works if you copy it and drop it into a Powershell Session and add ONE little option.  -SERVER EXCHANGESERVERNAME --- and of course correct the time… Grrrrrrr……

Yep.  That easy

So in this Window on a remote workstation we want to look at all the received emails from 5:02pm to 5:12pm on my Exchange Server.


But Of course once I run it I get THIS on my screen.

‘Exchange Server “STUPIDWORKSTATIONNAME.Contoso.local” was not found.’

 

That’s because the developer “Assumed” we run ALL of our management tasks on the server……*SMACK*

 

We don’t, or at least I avoid it like the plague.  You see, if you DON’T do it on the server you CAN’T accidentally do things like oh……reboot the server.  Not easily at least…not by accident anyhow.

 

So to make this work all I have to do is click on “Go Back” and copy that commandlet from the Window where it says “Exchange Management Shell Command”

 

get-messagetrackinglog -EventID "RECEIVE" -Start "12/18/2009 05:02:00" -End "12/18/2009 05:12:00"

 

And just add on “-SERVER NAMEOFMYEXCHANGESERVER”

 

get-messagetrackinglog -EventID "RECEIVE" -Start "12/18/2009 05:02:00" -End "12/18/2009 05:12:00" -Server energizedtechmail.contoso.local

 

The other booboo I will point out to save you some headaches is that the generated script produces the wrong time to query.  It doesn’t account for the “PM” part (the times are in 24-hour format) if you’re trying to examine emails after 12:00 noon.  So just edit those times appropriately.

 

get-messagetrackinglog -EventID "RECEIVE" -Start "12/18/2009 17:02:00" -End "12/18/2009 17:12:00" -Server energizedtechmail.contoso.local

 

This will now correctly give you a listing of all received Emails on that particular Exchange server from 5:02pm to 5:12pm.  Remotely, from a workstation that has the Exchange Management tools.

And all this without getting out of your chair.

 

Be Lazy, Be Efficient, but just be SMART :)

 

Sean
The Energized Tech

Finally got the Beta of Office 2010 installed this week onto the computers and heard rumor of a new feature in Powerpoint.

In the newest Powerpoint 2010 you can publish your presentations to a website, and hand remote users a hyperlink to view the presentation!

And the use of this feature was so incredibly easy I couldn’t believe it!

Just start up Powerpoint and load up a presentation.


Then in the new Powerpoint 2010 just click on “FILE” and choose the new “SHARE” option and choose “Broadcast Slide Show” clicking on the large button to the right.


It will take you through a very simple wizard which will connect to the Office Live service on live.com.   You will need an account on here to try it out.  But the procedure is very simple.   Click on “Start Broadcast”, key in your credentials (only needed once generally per session). It will upload your presentation and present you with a Hyperlink to a temporary location on Live.com that you can send or give to people to allow them to view your presentation.


With this procedure complete, just click on “Start Broadcast” and people worldwide can view your presentation easily.


No credentials are required and they will see a synchronized view of your presentation.   Small businesses can easily take advantage of this (or consultants) for their clients.   No fancy WebEx or funky remote software is needed on the client side which is the beautiful part.  It’s all run in the backend on Live.com.


And that’s just ONE small beautiful piece of the new Office 2010.  It’s a huge bag of goodies.  We’ll touch on another really nice feature in Outlook 2010 next time.

 

Sean
The Energized Tech

Stumbled across an interesting problem.   The solution is easy, just it was a minor thing I ran across and thought to pass it along.

If you’re running Server 2008R2 as a Guest in a “Classic Hyper-V Environment” (Original Server 2008 all patched up) you might stumble into some “oddness”

The Virtual machine might just “disappear” from the system.  WMI errors.  

Just “Weird stuff”

Now the machine isn’t actually GONE; the Child process is alive and running, but it either disappears from the Hyper-V console completely until you reboot or restart services, or it becomes unmanageable.

The solution is easy.

If you’re hosting a Server 2008R2 environment, make sure the parent is Server 2008R2.  It seems obvious, but I think the Guest components pre-installed with Server 2008R2 are sending some “mixed signals” back to Hyper-V R1 and it just simply doesn’t like them.

I would suspect the same with Windows 7.

In theory, I think if you could pull out the R2 Guest components and put in the R1 version, Hyper-V might be happier, but just getting everything to the newest version for the hosting environment is a lot better.

 

Cheers and keep those thoughts Virtual

Sean
The Energized Tech


Hmm a problem. 

You’ve got a Hyper-V box where people got sloppy and left all these stupid ISO files mounted in Virtual DVD rom drives everywhere wasting space.

Do you get angry?

Do you get vengeful?

Do you get out a big can of “Whoopass” and go off your rocker at these silly fools?

Nope.

You just whip out Powershell, point it at your folder of Virtual Machines and “look”.  Presuming of course the Virtual Machines were named with some level of consistency :)

 

GET-CHILDITEM E:\VirtualMachineFolder -recurse -include *.ISO

 

There.  You’re done.   And nobody had to go off the deep end to solve that one! Now just go in and remove them puppies and clean up the mess.
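In fact the cleanup itself is the same line piped into REMOVE-ITEM.  Run it with -WhatIf first (of course!), and make sure the ISOs are dismounted from the VMs before the real pass:

GET-CHILDITEM E:\VirtualMachineFolder -recurse -include *.ISO | REMOVE-ITEM -WhatIf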

 

I errrrrrr…. cleaned up 20 gig of ISO files I left “sitting about” :P

Sean
The Energized Tech


Last time we updated the Tree script from BSONPOSH and www.get-admin.com

This time I took my version and went, well, nuts.

I decided to try and make the lights “twinkle” by building an array of “Light locations” and selecting randomly those targets to be on or off.   I also decided to have two new characters be the twinkly lights.   And more colours.

And for fun?  A Marquee.

So what we have is a two fold solution.

A nice sizeable tree in Powershell, and errr, a great burn-in utility.

You see, it’s coded really badly and eats CPU :)

But it LOOKS NICE!

 


Enjoy and Happy Holidays

Sean
The Energized Tech

---------------------------------------------------------------

# PowershellTree-V-whatever.PS1

#Clear the Screen

clear-host

#Move it all down the line

write-host

# Assign two special characters into an array for potential "LIGHTS"

$starchar=[char][byte]15, "*"

# Number of rows Deep for the Tree- from 2 to whatever fits on the screen, after 30 it gets funky

$Rows = 30

# These variables are for the Bouncing Marquee at the bottom
# Column, number of Columns to move (relative to size of tree)
# and Direction it will move

$BottomRow=$Rows+4
$BottomColumn=0
$BottomMaxCol=($Rows)
$Direction=1

# Standard console Colours
# Just for fun I added in all the possible Console Foreground Colors
# Delete whichever ones you don't like

$colors = "DarkRed","DarkBlue","DarkCyan","DarkMagenta","DarkYellow","Gray","DarkGray","Blue","Green","Cyan","Red","Magenta","Yellow","White"

# Get where the Cursor was

$oldpos = $host.ui.RawUI.CursorPosition

# BsponPosh’s ORIGINAL Tree building Algorithm :)
# None of this would be possible if it weren’t for him

    Foreach ($r in ($rows..1)){
        write-host $(" " * $r) -NoNewline
        1..((($rows -$r) * 2)+1) | %{
                write-Host "*" -ForegroundColor Darkgreen  -nonewline
       }
        write-host ""
    }           

    # trunk

# A slight change, an extra row on the stump of the tree
# and Red (Trying to make it look like a brown trunk)

    write-host $("{0}***" -f (' ' * ($Rows -1) ))  -ForegroundColor DarkRed
    write-host $("{0}***" -f (' ' * ($Rows -1) ))  -ForegroundColor DarkRed
    write-host $("{0}***" -f (' ' * ($Rows -1) ))  -ForegroundColor DarkRed

$host.ui.RawUI.CursorPosition = $oldpos

# New Addins by Sean “The Energized Tech” Kearney

# Compute the possible number of stars in tree (Number of Rows Squared)

$numberstars=[math]::pow($Rows,2)

# Number of lights to give to tree.  35 percent of the number of green stars here.  You pick

$numberlights=$numberstars *.35

# Initialize an array to remember all the “Star Locations”

for ($i = 0; $i -lt $numberlights; $i++)
{
$Starlocation+=@($host.ui.Rawui.CursorPosition)
}

# Probably redundant, but just in case, remember where the  heck I am!

$oldpos = $host.ui.RawUI.CursorPosition

# New change.  Create an Array of positions to place lights on and off

foreach ($light in ($numberlights..1))
    {
    # Pick a Random Row

    $row=(get-random -min 1 -max (($Rows)+1))

    # Pick a Random Column – Note The Column Position is
    # Relative to the Row vs Number of Rows

    $column=($Rows-$row)+(get-random -min 1 -max ($row*2))

    #Grab the current position and store that away in a $Temp Variable
    $temppos=$host.ui.rawui.CursorPosition

    # Now Build new location of X,Y into $HOST
    $temppos.x=$column
    $temppos.y=$row

    # Store this away for later
    $Starlocation[(($light)-1)]=$temppos

    # (The actual light updates happen in the endless loop below)
    }
# Repeat this OVER and OVER and OVER and OVER

while($true)

{

# Now we just pull those stars up and flip em back
# on or off randomly, six at a time

for ($light=1; $light -lt 7; $light++)
    {
    # Set cursor to random location within Predefined "Star Location Array"

    $host.ui.RawUI.CursorPosition=($Starlocation | get-random)
    # Pick a random number between 1 and 1000
    # if 500 or higher, turn it to a light
    # Else turn it off

    $flip=get-random -min 1 -max 1000
    if ($flip -gt 500)
        {

        # Write a Random "star character" on the screen
    # in a Random Foreground Color from defined sets

    write-Host ($starchar | get-random) -ForegroundColor  ($colors | get-random) -nonewline
    }
    else
    {
    write-host "*" -Foregroundcolor DarkGreen -nonewline
    }
    }

# Remember where we are

$temppos=$oldpos

# Set a position for the row and Column

$oldpos.X=$BottomColumn
$oldpos.Y=$BottomRow

# update the console

$host.ui.Rawui.CursorPosition=$oldpos

# Bump up the column position based upon direction

$BottomColumn=$BottomColumn+$Direction

# Ok this was a BAD way to do it but it works for
# Christmas.   If we hit the right side change
# Direction to backwards.  If we hit the left side
# Change direction forwards

If ($BottomColumn -gt $Rows)
{ $Direction=-1 }

If ($BottomColumn -lt 1)
{ $Direction=1 }

# Print greeting.  Space must be before and after to avoid
# Trails.  Output in Random Colour

write-host " Happy Holidays Powershell " -ForegroundColor  ($colors | get-random)

# End of the loop, keep doin' it and go "loopy!"

}


 

I have heard so much about Azure, about its ability to run in the cloud, how even ITPros can build an Azure website.

But more importantly about its ability to be controlled and SCRIPTED from Powershell!

I have been scrambling about like a little gerbil trying to find the darn CmdLets! 

I thought they came with the demo kit… Nooooooo.

I thought they were installed with the SDK… Noooooooooo…

I searched all over my hard drive.  COULDN’T find a reference.   But it turns out I shouldn’t have. 

You see Azure is STILL not released and the CmdLets are of course still in development.  But that doesn’t mean you can’t get them in the pre-release stage.  Just a quick download, you see.

Just head down to

HTTP://CODE.MSDN.MICROSOFT.COM/AZURECMDLETS to download them.

It will require a little prep time.  They are after all STILL packed as code and need to be compiled.  But you don’t need to be a Developer in order to do this.  You DO need to download a copy of at least Visual Studio Express 2008 though.  But being “Express” it’s free, and good to play with if you’re mucking about with Azure and Clouds.

I should point out as well I compiled mine with Visual Web Developer 2010 Express Beta with no surprises.  In fact the installer did all the compiling so I really didn’t need to be a Dev at all for this step.  It followed a very simple process.

It opened up a file explorer box with a pretty obvious “STARTHERE.CMD” on the screen, I followed the dead simple Step by Step instructions, clickity clickity click and WHAMMO!  “Azure Cmdlets”

The compile was quick on my little Centrino based laptop, nothing too painful and the only complaint I had was the installer was INSISTENT on dropping my “EXECUTION POLICY” in Powershell to “UnRestricted”.

Ahem…”UnRestricted” for people using Powershell is akin to running about in a parking lot full of broken glass with your shoes and socks off without the streetlights.

But it’s a short term issue and you can turn it back to your previous state of Nirvana.
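(Flipping it back is a one-liner.  RemoteSigned is MY comfort zone; substitute whatever policy you had before.)

SET-EXECUTIONPOLICY RemoteSigned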

Once compiled you get introduced to some new Kids on the Block (there I go with those puns again), or really, CmdLets.

My only other complaint (being this is pre-release version, I’ll not soap box) was the fact it didn’t automatically create a nice little “Powershell Azure Prompt” which I don’t bother with anyhow since I add my Snapins to my main window as I need.   But it didn’t immediately say something like “GO THIS WAY FOR CMDLETS” which would have been nice.

But again, PRE-RELEASE! Not polished!  So the fact that it compiles and works without a hitch I forgive :D

To fire up the new Snapin in Powershell just type

ADD-PSSNAPIN AzureManagementToolsSnapin

And to view your new Azure tools fire up a

GET-COMMAND -PSSNAPIN AzureManagementToolsSnapin

Which yields this box of rockin’ Shelly goodness.


So can you already see the power sitting here?  The ability to get the status of your applications and automatically (via Powershell Script) adjust your settings in the CLOUD from your system in the office if need be?

I’ll be diving a bit more into Azure now.  I may not code well (or at all to be honest) but I’m going to learn what I can about Powershell and Azure.  You see, rumor has it, an ITPro and a DEV might join forces as one in an Azure presentation.

 

With my head in the clouds

 

Sean
The Energized Tech
”The Dark Side of the Force has Cookies” :D

Boy did I feel like an idiot today.

I had an XPS file from Vista.  Needed to view it.  NEEDED.

I tried downloading the viewers from Microsoft, nope won’t install.   No 64bit stuff available. 

Then my brain rebooted.  It’s in Vista, it HAS to be in Windows 7 and by Proxy, in some form in Server 2008R2.

DUUUUUUH Yep!

Just take a run down into the Server and go to Add Features, drill right to the bottom


Yep there it is, just check off that box, click on Next and then Install.


Soon you can view XPS files on Server 2008R2. 
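And since this IS me, the same install from Powershell on Server 2008R2 should be just two lines (a sketch; I believe the feature name is XPS-Viewer, and GET-WINDOWSFEATURE *XPS* will confirm it):

IMPORT-MODULE ServerManager
ADD-WINDOWSFEATURE XPS-Viewer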

And Nooooooo this is not meant for a REAL Production server, but nice if you’re running Server 2008R2 on a Workstation.  You know, just cuz.

And yeah, you better license it, or be running it on a Test environment.

 

Sean
the Energized Tech

The Holidays are upon us, and like many I’m already worried about what THIS person wants and what THAT person wants and ohhhh, will I be able to pull it off, will I fall asleep beside the dog under the tree.

Will I step into an “unwanted surprise” from Reindeer?

Things like that.

But with all kidding aside, it is important to remember, that we can all find something to hang onto and be thankful for.  To also remember there are others worse off.

And during these times, it is SO easy to just ignore the problems of others and just focus on ones self.  I know.  I’ve done it.  I feel like a complete heel too.  

So this year (it seems stupid) I remind myself.  You do have to do that sometimes.   Remember how bad it was before, and no matter how bad you think YOU have it, somebody else is hurting worse.

And so think about giving.   $5, $10, a friendly phone call, a song, a Hot Coffee, an uplifting email.

Give whatever you can, however you can.

If you can afford yourself a cold beer?  You can give.

You don’t have to go broke in doing it, or stand up and make yourself noticed.   But you can do it.   You can give even if you have nothing to give.

You can give time.  Soup Kitchens, playing Santa for some kids, or maybe just helping your neighbour, who might be too busy watching her kids and working, by shoveling snow.

Heck even just “turning down your radio” and giving a little peace. 

And if you have something to give?  And you feel that little pinch at your side that says “Maybe I should?”

Go ahead!  Dive in.

For no other reason than this.  

“Just cuz.”

 

Cheers to a Thousand Years

Sean
The Energized Tech
”Happy Holidays”

Continuing on with our step-by-step into the Windows Recovery Environment: the previous times we created CDrom versions of the Windows Recovery Environment (WinRE).  One was produced directly from the Windows 7 Backup / Restore environment (with the click of a button) and one was an ISO image created from scratch.

I should also mention that the same procedure that was used to create the ISO image can be used to customize and expand on that environment, but we’ll deal with a bit of that next time.

This time we’re going to show you how to put that environment on a UFD (USB Flash Drive) or memory key.  

Now because it’s all based on the same core as Windows 7, the procedure is identical to putting Windows 7 or Server 2008R2 media on a bootable key.

Go into a Command prompt (CMD.EXE) with Administrative rights.

Once there plug in your MEMORY KEY.

*** WARNING *** THIS PROCEDURE WILL ERASE YOUR KEY ***
******* IF THERE IS IMPORTANT DATA, BACK IT UP FIRST *******
***** NOTE THE BIG FLASHING WARNING SIGNS BEFORE ******
****** PROCEEDING FURTHER!  THIS MEANS YOU *****

Start up DISKPART from within that Command prompt.  This is your command line based partitioning software

DISKPART

You’ll need to execute a LIST DISK to see which drives are attached, and more importantly which number has been assigned to yours.  Typically your removable is probably “Disk 1” or “Disk 2”; you can usually tell by the physical size.  That’s odd.  What’s a 16MB Compact Flash doing here?


So ignoring the plight of my useless 16mb Flash, we need to change the focus of the Disk Partitioning software to that particular drive with SELECT DISK 2 (in my case “Disk 2” is the 8gb memory key).


Now that we’re looking at Disk 2 we need to *** ERASE IT *** with the CLEAN command

CLEAN

Then Create a Partition

CREATE PARTITION PRIMARY

Once done, we need to format it

FORMAT FS=FAT32 QUICK

Then set it as Active

ACTIVE

and of course, once done, Assign a drive letter to it.

ASSIGN
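If you do this often, you can drop the whole sequence into a text file and feed it to DISKPART /S (the SELECT DISK 2 below is MY key; run LIST DISK first and change it to yours!):

REM --- USBPrep.txt ---
SELECT DISK 2
CLEAN
CREATE PARTITION PRIMARY
FORMAT FS=FAT32 QUICK
ACTIVE
ASSIGN

Then run DISKPART /S USBPrep.txt and it all happens in one shot.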

Good!  Now here is the bit you’re going to love.

Take that Windows Recovery Environment that you’ve burned to CD, Select the Contents and copy it to the USB key.

That’s right.  This is the exact same process as you use to make a Windows 7 Bootable disc.  It also applies if you have the “Desktop Application Recovery Environment” handy on disc or even Windows PE.
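From the command line that copy is one line as well (D: being your CD and F: being the freshly assigned key; adjust the letters to your system):

XCOPY D:\*.* F:\ /S /E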

There you go!  A free Christmas Present

 

Sean
The Energized Tech

Ok I’m finally really comfortable with MDT 2010.  Actually that’s an incorrect statement. 

I now understand Windows PE and the process of deployment (mostly) under MDT and WDS.  So maybe I already was pretty comfortable with making the image; I just initially didn’t understand how it got from “here” to “there”.

 

So I’m looking at a new challenge.  And a true challenge it is.

 

I’m going to learn how to do a ZTI with Systems Center Configuration Manager 2007. 

A “ZTI” or “Zero Touch Installation” is the Nirvana of images and imaging.  The idea behind ZTI is you define EVERYTHING at the back end (Computer name, Mac Address, Software deployment, you name it!) and if you need to blank out a computer you just go “ZAPPO!”

The end result as an Administrator?

“My laptop is broken.”

Press the magic PXE boot button on the laptop and life is magically better. 

Ok, it’s not an “EASY” button.  Things still have to install.   But imagine a completely CLEAN install of the O/S where you’re not questioning the process of the imaging because it’s doing exactly what you would do by shoveling in disks.   Only a lot more efficient.    So if the machine goes “Boom” after, bad hardware therefore replace or warranty.  Don’t FIDDLE!

The sucky part of ZTI is time.  You have to spend time and REALLY set it up.   The pay off is the amount of time you NEVER EVER EVER EVER have to waste when a new machine needs to be installed or an old one needs re-imaging.  And changing the O/S (Since it’s an extension of the images in MDT) is just as easy.

So I’m going to play and install and LEARN.  And oh yes, I’m a gonna BLOOOOOOOG about this.   My headaches, the magic words I will mutter and the elation when it’s finally working!

So get ready for a  ride folks.  I’m expecting a little “Road Rash” along the way

 

Sean
The Energized Tech

Are you afraid of presenting?  Standing in front of a room of 200 people?  20 or maybe 20,000?

I can tell you this.  I am.  Even after presenting at ITPro Toronto various times and at Techdays I still get the willies!

Why?  

I can tell you this.  Once you start going INTO the presentation, you’re usually ok.   Once you get a demo going, you feel better.   But why the fear?

Because you might mess up?

All people mess up.  I’ve seen presenters who have been doing this time and time and time and TIME again mess up.

So don’t worry, it will happen.  Just roll with it. 

A good friend (and great presenter) gave some of the best advice I could ever get.

“……They’ll forgive you for being nervous, and talking a bit fast.  They won’t forgive if you don’t know your stuff, so just show you know your stuff…”

and it’s true.   If you ask a room of 200 people (especially to those you’re presenting to) how many of them are afraid or nervous to do what you just did, you’ll find the bulk of them (if not all) will yield an emphatic “YES!” to that question.

Because we’re all going to be afraid of failing.

But let me put this one thought in your head.  A simple one.

Even the greats started and failed at one point, and even the greats still fail on stage.

Why?

Because we’re human.  So when “IT” happens on stage (Yes, the unWANTED unBEARable //FAIL\\), roll with it.  Be human.  Talk about something else, crack a joke with the audience, get them on your side, because you’ll find for the most part they are.

And most of all, be yourself.   Bring your brains.  You’ll find you present most effectively when you combine professionalism with your natural ability, whatever it is.

Unless you’re a professional wrestler doing a talk, in which case giving the hecklers a “Body Slam” just ain’t gonna help ;)

 

Cheers all and take that chance
Sean
The Energized Tech

Completely un IT Related.  Just a thing that caught me off guard and made me say “Has the world gone mad?”

When you look up at a subway wall from a MAJOR Transit provider, offering safety tips on how to use the Escalator. 

Yes really.

Safety tips on “HOW TO USE AN ESCALATOR”.  I had to give my head a shake.  Several in fact.  And douse it with a large bucket of cold water.

 

“ESCALATOR SAFETY…”

Over and over the words echo inside my head.

“ESCALATOR SAFETY…”

 

First question, just when, WhenWHEN did this become a problem?

At what point did a NATIONAL EMERGENCY occur that people just COULD NOT FIGURE OUT how to go UP and DOWN the escalator in *SAFETY* ?

When did this crisis occur?  Was I asleep on way too many news reports?  Did I doze off during the night the President and the Prime Minister and World Leaders broke news of this major problem?  That People ALL ACROSS THE WORLD had neglected to read their “Escalator Safety Manuals”?

Come to think of it, I don’t remember reading mine.  Now that I mention it I never had one. *ACK*

Should I be worried?

All those years in K-mart and Woolworth was I SKIRTING the EDGE of disaster by NOT knowing how to be “safe” on an Escalator?  I’m just shaking in my boots!

All those daring years of my youth bolting up the wrong direction, DARING to stand on until the last step, facing the DAEMON that was the Escalator monster.  I’m surprised I’m not dead already!  Because I didn’t have ANY Escalator safety tips to save me!

I feel so comforted now, that when I go onto a subway, there is now a large friendly sign and gleaming pamphlets everywhere cautioning me of the dangers of the escalator.  Life is so much better.  I had no idea how dangerous it was!  The terrors it offered!  The deadly panic it creates!

 

“THE ESCALATOR!”

 

Yes I can see it now, the new TV Horror movie of the week.

“Don’t look now…It’s in the Mall.  It’s in Office Buildings!  Rolling in Subways!”

Yes this Tuesday Night, “THE ESCALATOR! AIAIGHAIHGHAIHGIHGAHGAHI!!!!! HIDE YOUR SHOELACES!!!”

 

I’m so relieved now that I’ve been alerted of this danger!  I think now that the Elevator is SO much less of a threat, I’ll play in it now.  

I mean it’s OBVIOUS the Elevator is so much safer than the ESCALATOR!  It doesn’t have ANY safety campaign launched at all!

 

Going mad with the world. :)
Sean
The Complete Loony Tech

I’m sitting down in a Second Cup with a coffee before my 70-403.

Am I studying?

No.

I have it in my head at this point there are only two options, pass or fail.   I either know the material enough (and I am confident) or am dead wrong.

So rather than worrying, I’m sitting down, reading something that has nothing to do with the exam.  Twitter, email, web pages.

I am relaxing.

And of course I have a Technet Tshirt on underneath, like it’ll make a difference but we all wear our own “mojo” to these things ;)

So it’s got me wondering, what is YOUR little ritual (if any) before an exam or test?

Comment here or send me details at sean@energizeit.ca.  I’ll pool the responses together and post them without names.  It could be interesting to see what “Tiki’s” we carry or what we do across the board to prepare for tests.

Cheers and see you after I pass
Sean
The Energized Tech

BTW.  I try to carry in whatever confidence I can, pass or fail. 

So I’m writing an MCTS exam tomorrow on Configuring SCVMM 2008.  

The problem with this exam is there are no preparatory materials other than field experience (which I have) and an online course (which I’ve taken).

There is a classroom course, but in my case, I think it would go into areas that I’ve already touched on.  

So I need to play with SCVMM 2008 and Hyper-V in an environment that is NOT work (so I don’t make an error in the corporate network) and one where I can freely change pieces on the fly.

So I have a single computer.  Not a network of machines.  I do not have a house full of the latest and greatest.  I’m like you.  I’ve got kids to feed and I’m allowed, at great discretion, to own one really good thing.

I have ONE (count ‘em folks) ONE 64bit computer.  It’s got the Hypervisor on the CPU and 8 gigs of ram.  This was a machine that started out with 2 gigs, and as I could get $50 here and $50 there, the additional ram was added over time.  The machine is about a year and a half old with a single 500 gig hard drive.

So to practice for my 70-403 exam I’ve gone to Technet Direct (so if you just went to Techdays in Canada, you have that subscription) and have downloaded Systems Center Virtual Machine Manager 2008 (not the R2 version, because this is for 70-403), Server 2008, Systems Center Operations Manager 2007 and Windows Vista.

So this weekend (and last) I did nothing but play with creating machines, templates, hardware profiles, deployment, storing into Libraries.   I setup Virtual Server 2005 under a Windows Vista Computer in Hyper-V to “manage” it with SCVMM.  I played with the Powershell scripts and learned how to deploy multiple machines with ONE LINE.

I obtained free software from StarFire to give me a Virtual SAN device by setting up a Virtual machine with their software and emulating a SAN.  I think I even setup a Cluster (single node) in my house.

And the cost of the hardware to do all this?  You’re in for about $500.  Yeah, that’s it.  For the computer chip, ram and harddrive.  And the thing is once you have this all setup, you can really play with Hyper-V and various operating systems.  Why, it’s even a breeze to setup multiple domains and test Swing migrations.

So tomorrow morning I’m going to write my 70-403.   And when you see my next post on Twitter, my next blog post it will simply be one word I hope.

*PASS*

Sean
The Energized Tech

It’s interesting to realize just how many free Virtualization Solutions are out there now.    VMware, Sun and Microsoft all have free ones for you to use.   And they all work nicely.

You’ve got Virtual PC 2007 if you’re running Windows Vista, Windows Virtual PC as part of Windows 7, and Hyper-V Server 2008, all from Microsoft and all Free.

There’s Sun VirtualBox, a very nice solution that runs on 32bit Windows 7 systems that do not have Hardware Acceleration.  Granted it does not support VHDs, but if you’re trying to put together a demo system and don’t have 64bit virtualization available, it’s not a bad option.

And of course there’s even some free solutions from VMware out there.  Oddly the only Virtual player I can’t see is Apple.

But the nicest part I like about Virtualization on the cheap is that it allows the little guys to put together multiple computer networks in their basement (albeit, maybe a bit slow) without having a LOT of expensive network equipment.  It lets people PLAY without redesigning their whole system or trying to figure out a Multiple boot across three different vendors.

Not that that isn’t COOL (it is), but it means MORE people can learn about technology and bigger networks without the bigger budget or space constraints a real physical network puts on them.

And that’s just nice for all of us.  The environment included.

Put your thoughts into a Virtual Place and keep them efficient.

Sean
The Energized Tech

So last time we looked and saw a way to create a standardized workstation for deployment on SCVMM 2008.   Of course I say “Workstation” when it really could even be a server, a web server, an additional file server for DFS.  It could be anything.

But now we’re going to see the REAL power of SCVMM 2008.   Powershell commandlets.

As with all the current Microsoft technology, the GUI is really just a system that’s building a Powershell script and running that script.   That’s it.   And initially when you look at the SCVMM 2008 script for deployment your reaction will be the same as mine.

“Where do I start with this thing?”

In actual fact there is only ONE line you need to worry about if you’re deploying, the rest are building variables and content for that one line.    They all do need to be there for that line to work, but you don’t actually need to fiddle with them.

The important line is right at the bottom of the generated script


The line you see that starts with “NEW-VM” is what you’re interested in from that script.  That’s it!  Just do a quick “SAVE AS” in Notepad, give it a name (say NEWWorkstation.PS1) and you’re off to the races.

So in the self generated script I have this in my final line

New-VM -Template $Template -Name "Workstation01" -Description "" -VMHost $VMHost -Path "C:\VirtualMachines" -JobGroup 972411ef-4233-4e96-8e52-b95e472dd7fd -RunAsynchronously -Owner "TECHDAYS\Administrator" -HardwareProfile $HardwareProfile -ComputerName "*" -FullName "Techdays" -OrgName "Techdays" -TimeZone 35 -AnswerFile $null -OperatingSystem $OperatingSystem -RunAsSystem -StartAction AlwaysAutoTurnOnVM -DelayStart 0 -StopAction SaveVM

 

For the uninitiated, pick your eyeballs off the ground, sit down.  It’s not scary.  It’s WORDY but not scary.  There’s only really three spots you need to be concerned with.

The sections marked “-Name”, “-Description” and “-ComputerName”.

Now you could manually edit these fields each time to get what you wanted, but well to be honest, that kills the luster of Powershell.  The computer should work for YOU.

So here’s a very simple approach.  Use a “READ-HOST” and have it prompt you for the needed information.  And really, what changes from one standard workstation to the next?  The name and possibly the description.

So at the very top of the script or even just before the “NEW-VM” line place these commands

$ComputerName=READ-HOST 'Key in name of Workstation'
$Description=READ-HOST 'Type in the Description'

And then change where it says -Name "Workstation01", -Description "" and -ComputerName "*" to read

-Name $Computername, -Description $Description and -ComputerName $Computername

So the line now reads as

New-VM -Template $Template -Name $Computername -Description $Description -VMHost $VMHost -Path "C:\VirtualMachines" -JobGroup 972411ef-4233-4e96-8e52-b95e472dd7fd -RunAsynchronously -Owner "TECHDAYS\Administrator" -HardwareProfile $HardwareProfile -ComputerName $Computername -FullName "Techdays" -OrgName "Techdays" -TimeZone 35 -AnswerFile $null -OperatingSystem $OperatingSystem -RunAsSystem -StartAction AlwaysAutoTurnOnVM -DelayStart 0 -StopAction SaveVM

Of course some parts of this line will reflect your particular configuration, but those THREE are the ones you need to change to have this self generated script be repeatable to your needs.

Next time we’ll take a look at using this same script to deploy multiple machines at the same time with a very minor change.

 

Script on!
Sean
The Energized Tech

This is a simple and kind of obvious thing but we all forget about our local libraries.

I’ll be honest.   The last time I went near a library was back in High School.  I had a job since then so I was buying my own books, and honestly didn’t need it.  So I forgot.

It’s a damn shame too, because lately, like with many of us, my funds are tight.   And I’ll be honest.  I don’t have internet right now.

*Pause* *Shock*

“An ITPRO WITHOUT INTERNET?!?”

Hey it happens.   But this is where we get to supporting our libraries.   Did you know many libraries now have free internet access on computers?  Ok NOT a complete shock but MANY (because laptops are common) have Wireless internet access.  Free with your library card.  

So you can sit in a quiet place, with no real noise, and concentrate on your work.  I am finding more often the local library is becoming that quiet place to blog about something, clear my thoughts.  Type and just think for a bit.

Had I thought about this a year or so ago, I wouldn’t have donated all my old MCSE books to Goodwill.  I took a look in the computer section of the local library, and do you know how many computer books they had?

Two, maybe three.  That’s it.

So here’s how I’m going to start, and you can do it too.  If you’re looking at that big cool shelf full of books you rarely ever look at?   Full of material about Operating Systems or older servers and applications? 

Take those extra books you might not need.   Because honestly, we don’t need them.   Bring them to your local branch.   You’ll be pleasantly surprised how much they are going to benefit from them.   Because no matter how little you think you have?  Don’t forget.  There’s always that kid who doesn’t even have a job that would LOVE to read up on some of this.  

So next week, I’m going to start bringing in those extra manuals I bought for the office, the SQL 2005 course I never finished, even “Powershell step by Step” and deliver to my local branch.

Because I *DO* remember vividly what it was like wishing I could get books and the learning material.  And you can help others too so easily in that manner.

Yeah, it’s cool and nerdy to have a shelf full of books.  But if they’re sitting there wasting space not being read, what use are they?   Give them to a library and let those books be enjoyed, let the knowledge grow, let people prosper.

Support your local Library in any way you can.

Sean
The Energized Tech

Finally got the Beta of Office 2010 installed this week onto the computers and heard rumor of a new feature in Powerpoint.

In the newest Powerpoint 2010 you can publish your presentations to a website, and hand remote users a hyperlink to view the presentation!

And the use of this feature was so incredibly easy I couldn’t believe it!

Just start up Powerpoint and load up a presentation.

image

Then in the new Powerpoint 2010 just click on “FILE” and choose the new “SHARE” option and choose “Broadcast Slide Show” clicking on the large button to the right.

image

It will take you through a very simple wizard which will connect to the Office Live service on live.com.   You will need an account on here to try it out.  But the procedure is very simple.   Click on “Start Broadcast”, key in your credentials (only needed once generally per session). It will upload your presentation and present you with a Hyperlink to a temporary location on Live.com that you can send or give to people to allow them to view your presentation.


It will upload your presentation and present you with a hyperlink to a temporary location on Live.com that you can send or give to people to allow them to view your presentation.  With that complete, just click on "Start Broadcast" and people worldwide can view your presentation easily.


No credentials are required on their side, and they will see a synchronized view of your presentation.  Small businesses (or consultants) can easily take advantage of this for their clients.  No fancy WebEx or funky remote software is needed on the client side, which is the beautiful part.  It's all run on the back end on Live.com.


And that's just ONE small beautiful piece of the new Office 2010.  It's a huge bag of goodies.  We'll touch on another really nice feature in Outlook 2010 next time.

 

Sean
The Energized Tech

System Center Virtual Machine Manager is a work of beauty.  It truly is.

First off you have to understand that I love my job.  I love technology.  But I really hate wasting time.

I truly do.  I’d rather sit down during dental surgery chewing on Tinfoil while scratching my fingers on a chalkboard than waste time.

And SCVMM 2008 eliminates that.  And I’m going to help YOU eliminate wasted time.  By showing you how easy it is to deploy systems in SCVMM 2008.

First step?  Create a system in Hyper-V.  Set up that puppy the way you want it.  Of course, don't join it to a domain (at least for this purpose).

Set up the virtualized machine, install the applications, get it just the way you want it.  Install your Integration Services for Hyper-V.  So far so good?  Got your install key in?

Then here's where it gets easy.

Right click on the machine in question and choose "Clone".  This will re-create that system as a duplicate you can "mess with".  In our case, a system we can Sysprep and turn into a template.


It will run you through a simple wizard, prompting for a new machine name and defaulting to the old name with "-Clone" appended.  You can choose any name you like, of course.  The power is yours.

Next you want to create a "Template" in System Center VMM 2008.  Click on "New Template".  You'll get a warning about "Destruction and Damage" to the machine you're about to make a template of (and a great disturbance in 'The Force', the moon crashing into the sun, yada yada…).  That's OK.  We're working with a Clone, and well, that's why we sent in the Clones (oh that was bad, really bad, I'm so sorry………No… no I'm not… :P )


Run through the simple wizard, define the RAM and hardware settings you want this machine to have, and continue to the "Guest Operating System" screen.  Fill in a password and the install key for the operating system in question, and specify the settings if you need to join a domain.  Make sure as well that you have the right type of license key to match the operating system media (i.e. MAK, KMS or Retail).

Let the Wizard run through and you’ll have a new entry under “Templates”.

It will take some time as it runs a "SYSPREP" on the virtual machine in question, copies the VHDs and configuration over BITS, and sets you up in SCVMM 2008 with a new Template.

So if you named the computer in Templates as “Krusty the Clone Template”, when you want to create a new Virtual Machine you can choose that new system “Krusty the Clone Template”, deploy it through the wizard and away you go.  

The beautiful part is once you’ve gone through this process, you have a Powershell Script to work with.   What this does is give you a way to REPEAT all of this without all the headache.
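To give you a taste, here's a rough sketch of what the snap-in side of that looks like (the cmdlet and parameter names below are from memory of the SCVMM 2008 snap-in and may not match your build exactly, so treat this as an assumption and compare it against the script the wizard generates for you):

# Load the SCVMM 2008 snap-in and connect to your VMM server
Add-PSSnapin Microsoft.SystemCenter.VirtualMachineManager
Get-VMMServer -ComputerName "VMMSERVER"

# Grab the template we made and a host to deploy to, then create the VM
$template = Get-Template | where { $_.Name -eq "Krusty the Clone Template" }
$vmhost = Get-VMHost "HYPERVHOST"
New-VM -Template $template -Name "NEWVM01" -VMHost $vmhost -Path "D:\VMs"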

 

But we’ll touch on that next time around, let’s let you drool over the fact you can deploy without headache for the moment.

 

Cheers All, I’m so Virtually Out of Control
Sean
The Energized Tech


One of the biggest features that sets Powershell apart from all other scripting languages is the ability to ensure the code CAN be trusted.  By signing a script with a certificate you can ensure that scripts meant to run on a particular machine come only from that machine, or more particularly from within your department, division or company.

What stops most of us from doing this are usually cost (Certificates usually cost money) or just a lack of knowledge.

Well guess what?  We’re going to put that knowledge in your hands, and it DOESN’T have to cost anything.   You don’t even need a Domain or Certificate infrastructure just to USE this.

Because the tool is free, the instructions are free.   You can buy a certificate of course but if you’re a small business, you may not want to incur that cost to run scripts on a single server.

What do you need to do this?

The freely downloadable SDK for your version of Windows (I don’t think you need to download the entire kit) and Powershell

That’s it.   Oh and a few minutes time.

The instructions are sitting right inside Powershell too if you want to read up on them.   I found the easiest way was to just use the Powershell ISE Help System and search for “digital” or “signature” and you’ll see a reference to “about_signing”.   There’s your instructions.   But here’s the quick version.

Run these two commands, and when prompted for a password, key one in. 

makecert -n "CN=PowerShell Local Certificate Root" -a sha1 -eku 1.3.6.1.5.5.7.3.3 -r -sv root.pvk root.cer -ss Root -sr localMachine

makecert -pe -n "CN=PowerShell User" -ss MY -a sha1 -eku 1.3.6.1.5.5.7.3.3 -iv root.pvk -ic root.cer

To verify it was created correctly

get-childitem cert:\CurrentUser\my -codesigning

Once you know the cert is there, you can digitally sign your Powershell scripts.

$cert = @(Get-ChildItem cert:\CurrentUser\My -codesigning)[0]

Set-AuthenticodeSignature NAMEOFSCRIPT.PS1 $cert

Which will take the script called NAMEOFSCRIPT.PS1 and digitally sign it.  That’s it! 
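And if you want to double-check your handiwork, GET-AUTHENTICODESIGNATURE will show you the status of the signature on that script:

GET-AUTHENTICODESIGNATURE NAMEOFSCRIPT.PS1

A Status of "Valid" means the signature took.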

Now you can lock down execution of Powershell scripts in that environment

SET-EXECUTIONPOLICY -ExecutionPolicy AllSigned

You now have a server running scripts securely, and in such a way that unless the scripts are signed with a trusted certificate, they can't run.

And I wasn’t kidding either.  It WAS easy!

 

Sean
The Energized Tech


I had a good friend the other day ask the Community. 

"I want to learn Powershell.  What books should I read?  Where do I start?"

And the problem is that you will get 100 answers, all of them correct.

The problem is that Powershell, as its name implies, is incredibly POWERFUL.  And to a new person taking a look at the online community and the features it has, the sheer amount of available scripts written by people can be overwhelming.

And you may have the same reaction I had.  “Oh I could NEVER do ANYTHING like those scripts, I should just shut up so I don’t look stupid…”

And that’s the flaw.

The material online for Powershell all started somewhere.  Everybody knew NOTHING about it at one point or another.   And your best resource is those magic six words.

“I don’t know, I’ll go ask…”

What actually seems to work for me is something as simple as (and that’s how I started learning) “I have a need to repeat something en masse or for consistency.”

If it can be done in a Login Script, it gets done there.  If a GPO can do it, it gets done there.

If it has anything at all, on ANY level, to do with WMI, Active Directory, the Registry, manipulating files, dates or logs?  Powershell.

So what helped me learn Powershell was that magic thing that drives all ITPros and Devs.  “I NEED a solution.  I need one I can easily repeat and replicate with consistency.”

What I found worked best for me was to notice when I was "repeating something" and ask whether it could be scripted in any way.  The reason for this is that with an automated solution, the results are more consistent and faster.  So to learn Powershell, find a need you have.  Learn how to do that NEED in Powershell.  The solution is probably online.  And once you have it fulfilled, you may want to understand how that solution works, and why.

The syntax of the language is simple.  A "VERB-NOUN" structure.  Running "GET-COMMAND" will show you all the commands; "GET-ALIAS" will show you all the aliases.

That doesn't teach you Powershell.  What teaches you Powershell is just using it, for something simple.  Get comfortable with just ONE feature.  Even if it's just using "GET-CHILDITEM" to navigate the file system.  Get really comfortable with using that one feature with a "GET-MEMBER" to learn how to pull out Properties (information) and Methods (actions you can perform on the objects).
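For example, something as small as this (pointed at C:\ just for illustration) will list every property hanging off the file and folder objects GET-CHILDITEM hands back:

GET-CHILDITEM C:\ | GET-MEMBER -MemberType Property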

In short, Play with Powershell on a small level and get comfortable with it.  Because ALL of the fancy stuff, all the nifty stuff all works the same way.

And don’t be afraid to ask, nobody in the Powershell community thinks there are silly questions.

Because one day, we were all asking them ourselves.

 

Sean
The Energized Tech 


This might seem like such a simple command to be ecstatic about, that is unless you’ve ever tried to script emailing log files.

Oh, it's doable.  There are vbScripts that do it.  They're just, well… 'wordy'.

They're not horribly complex, to be honest, but it seemed to me at the time there had to be an easier way, since the vbScripts themselves weren't exactly in 'English'.

But now there is:

Powershell V2's new "SEND-MAILMESSAGE"

The syntax of the command is a little eye-popping, as with any piece of software at the command level.


Send-MailMessage [-To] <string[]> [-Subject] <string> -From <string> [[-Body] <string>] [[-SmtpServer] <string>] [-Attachments <string[]>] [-Bcc <string[]>] [-BodyAsHtml] [-Cc <string[]>] [-Credential <PSCredential>] [-DeliveryNotificationOption {None | OnSuccess | OnFailure | Delay | Never}] [-Encoding <Encoding>] [-Priority {Normal | Low | High}] [-UseSsl] [<CommonParameters>]

 

But really, it’s very easy.  

 

SEND-MAILMESSAGE -To johnqsmith@contoso.com -Subject 'BackupLogs' -From backup@contoso.com -SmtpServer 10.0.0.10

 

There, you've sent a simple (if empty) mail message in one line.  And read it back.  You don't have to be a developer to understand what that means!

SEND a MAILMESSAGE to johnqsmith@contoso.com from backup@contoso.com with a SUBJECT of 'BackupLogs' via the SMTP server at 10.0.0.10.

And if you need to tuck in an attachment, like those backup logs, just plug in the -Attachments parameter with the location of the file in question.

 

SEND-MAILMESSAGE -To johnqsmith@contoso.com -Subject 'BackupLogs' -From backup@contoso.com -SmtpServer 10.0.0.10 -Attachments 'C:\BackupLog\Logfile.log'
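And if you'd rather drop the log contents right into the body of the message instead, a quick sketch (assuming the log is plain text):

SEND-MAILMESSAGE -To johnqsmith@contoso.com -Subject 'BackupLogs' -From backup@contoso.com -SmtpServer 10.0.0.10 -Body (GET-CONTENT 'C:\BackupLog\Logfile.log' | OUT-STRING)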

 

That's the beauty of this: a command that just makes dead simple sense.  So there you have it, Powershell, the nicest "MAIL"man you've ever met.

Sean
The Energized Tech


It was brought to me the other day.  

“We have this file buried in an old server, only thing we know about it is the day it was created.  But we need it now…”

*** NOW ***

Ever had one of those?

Well, this was not an issue.  We did know the type of file it was; we just had to dig through 125,641 copies in hundreds of subfolders.  Sure, easy…

 

But it actually was easy, because I had Powershell.

 

Now I could have sat down and written some really cool script, but you don't have to get fancy with Powershell to get the job done.  And remember, you can ALWAYS refine it later.

So I just needed to put this down as I was thinking it:

 

“I want a list of ALL the files in the Archive.”

GET-CHILDITEM D:\BIGHONKINARCHIVE -recurse

 

“Ooops… wait a minute… I want a listing of all WORD documents in that structure.”

GET-CHILDITEM D:\BIGHONKINARCHIVE -include *.DOC -recurse

 

“Better…but actually just the ones made in 2003…”

GET-CHILDITEM D:\BIGHONKINARCHIVE -include *.DOC -recurse | where { $_.LastWriteTime.Year -eq '2003' }

 

“Ok this is nice smaller list, but really I want the stuff done in November 2003”

GET-CHILDITEM D:\BIGHONKINARCHIVE -include *.DOC -recurse | where { ($_.LastWriteTime.Year -eq '2003') -and ($_.LastWriteTime.Month -eq '11') }

 

“Ooooo, now could I just have the ones done on the 26th?”

GET-CHILDITEM D:\BIGHONKINARCHIVE -include *.DOC -recurse | where { ($_.LastWriteTime.Year -eq '2003') -and ($_.LastWriteTime.Month -eq '11') -and ($_.LastWriteTime.Day -eq '26') }

 

Now the first thing I should state is that this is NOT the most efficient way to do it.  But what it DOES show is that you can easily use Powershell as an amazing search tool to mine through your folders.  And more importantly, you can write it out as you're thinking it, the WAY you're thinking it.

You could even (with a little pipe) have Powershell examine the contents of those files and determine which ones might have the content you needed.
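For example, a rough sketch with SELECT-STRING (the word 'budget' here is just a stand-in for whatever you're hunting, and keep in mind the binary .DOC format makes plain-text matching hit and miss):

GET-CHILDITEM D:\BIGHONKINARCHIVE -include *.DOC -recurse | SELECT-STRING -Pattern 'budget' -List

The -List switch stops at the first match in each file, so you get one line per candidate document.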

 

Oh, some days I wonder: what was life like BEFORE Powershell?

Oh yes, right.  I forgot.

It was horrid.

 

Thanks Powershell!

Sean
The Energized Tech

Day One of Techdays_CA in Calgary


Up at 5:00am.  5:00am MOUNTAIN STANDARD TIME.

The day began actually at 4:48am with me up BEFORE the alarm clock.   The excitement was unbearable.

I was up running about the room, getting ready, and out the door before the alarm clock could even go off.


And in the early hours before 8:00am, while it was still dark, final preparations were still ongoing.  Volunteers scrambling about to make sure everything was set up, demos loaded, machines powered.  A last few internet connections to double check and batteries to plug in.

And the day began.  


The crowds began to move in, past the registration desk into the main dining area.  One thing I will say about Calgary: they REALLY know how to get things done!  I looked at the layout of the room and my jaw dropped.  Such an amazing dining area, including the Windows 7 area prepped with various renditions of Multi-Touch PCs from Dell and HP, and the new Ford Flex.


And truly the community showed its support and interest.  Each and every session at Techdays_CA Calgary was packed not only with information but with people intensely interested in that session.  It also reflected the intense desire of people wanting to know just how that technology COULD be (but not necessarily SHOULD be) leveraged.

And it showed one other small thing.

Common interests.  


I’m from Toronto (Much farther east) and you could hear people talking about similar problems and issues.   People were looking for answers, or in some cases pieces to the puzzle.    There were some things that might be unique to their particular area but for the most part, people were very curious about Windows 7, especially the newer UAC or Multitouch.   Many people genuinely didn’t know about the Media Center Extender and were very impressed when all the content was accessible from a single location.   People were very impressed that a computer in the house could interface in that manner.

I, for one, would VERY much like to see that technology extended to OEMs.  I think if the market were to have a pile of systems that could interact with people in that manner, the face of computing would change.  I personally can't wait to see "Project Natal" released for the Xbox 360.  I expect it to have issues (like first-generation voice recognition did), being a VERY new technology.  But sit and think for a moment.

A computer where YOU go to it, and *IT* immediately begins to work with YOU on YOUR terms, rather than the classic "type type, click click".  MultiTouch and "Project Natal" (if that were to extend to the PC world) would COMPLETELY change the face of computing as we know it.  Security, login, interaction.

Many didn't realize how many free tools Microsoft offers to the public to make their jobs easier; there were some *I* didn't know about!  And although we all understand the value of getting it for free, there is also greater value in having an Enterprise-level solution you can easily manage.

For a full TWO DAYS this continued.  Interest did not fall back; it continued.  Especially in the sessions at day's end that I was involved in.  In both Toronto and Calgary I personally noticed that the room was full each time with people tired from a long day but so intensely interested in learning something, they would stick it out to the end.

For my part I found many of us had a common nemesis, Mr. “Murphy S. Law, Attorney in Troublemaking”.   And we fight the unknown and unexpected daily.   All of us.


And of course Techdays_CA brought forth Community.  I ended up meeting a few new people and maybe (just maybe) inspiring a few new people to "try something different", even if that something is Microsoft technology OR, more importantly, stepping out and getting involved with the IT Community.

Getting involved not because Microsoft says you should, getting involved because it’s an extension of yourself and some of us are afraid to take chances and look foolish.

Well I’ll tell you one thing.

Take the chances, look foolish, make mistakes.   Because if you try and fail or try and succeed, you always learn something.  Take that to heart from one guy who spent his entire life afraid to try.  It is WORTH it on levels beyond describing to take that “dip in the pool”

And at Techdays_CA one thing I have learned is there truly *IS* always something more to learn.

Even as people

Sean
The Energized Tech


I’m going to start with something you should be aware of when working with Powershell.  You CAN access a lot of information remotely without Powershell Remoting. 

You can without question.

It's just a lot slower and not as powerful.  I can do a GET-EVENTLOG against a remote computer and get its event log.  But it's just SLOOOOOOOOOOWWWW!!!

With Powershell Remoting your life is a breeze!

And it’s REALLY easy to work with too!

First off both machines have to be running Powershell V2.

Machines receiving the “Remote Instructions” need to have Remoting enabled

And you need  a few minutes to play.  Yes it’s THAT easy.

There are two types of remoting I've started to play with.  One is more of a DIRECT interactive remote shell, and the other actually runs remote commands and lets you receive the data locally.

The second is the coolest!

So stage one.  Enabling Remoting on the “Remote System”

In an Elevated Powershell Prompt (Run as Administrator) execute the following command

Enable-PSRemoting

You will get prompted to allow it to run, since PS Remoting enables features and adjusts Firewall settings along the way.


Select "A" for Yes to All (I promise it won't hurt you) and let it run through.  You'll need to be connected to the network and running a Domain or Private network profile for it to work.  It will take a few moments, and then the machine is ready to accept remote Powershell connections.

But how to use them?

Ahh well THAT is the easiest bit.  Here’s the “Direct Console” method which effectively has you running commands locally but executing and processing remotely.

Just run:

$session = NEW-PSSESSION -computername REMOTECOMPUTERNAME

ENTER-PSSESSION $session

(ENTER-PSSESSION will also take -computername directly, but hanging onto the session object keeps things tidy.)

That's it!  You're now connected to that computer, running commands as if you were logged in at a normal Powershell prompt.

To exit just type

EXIT-PSSESSION

Now that wasn’t so difficult was it?

But HERE is the coolness.  It's the ONE feature I've been dying for: invoke a command remotely but have the results piped to you locally!

And what do you know?  You use INVOKE-COMMAND

So if you type

INVOKE-COMMAND -scriptblock { get-childitem } -computername REMOTECOMPUTERNAME

That will actually run a 'get-childitem' in the default context (the file system) on the computer called "REMOTECOMPUTERNAME".  And the really cool bit is that the results of that command can be saved and worked with (including piping) LOCALLY.
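For example (the C:\Logs folder here is just made up for illustration), you can park the remote results in a local variable and slice them up afterwards:

# Run the command remotely, keep the results locally
$files = INVOKE-COMMAND -scriptblock { GET-CHILDITEM C:\Logs } -computername REMOTECOMPUTERNAME

# Then work with them like any local objects
$files | SORT Length | SELECT -Last 5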

Can you see the potential here?  Get entire event logs from a DC, filter them for what you want and look at the results locally on your Excel spreadsheet!

So this example

INVOKE-COMMAND -scriptblock { GET-EVENTLOG -LogName 'Application' | where { $_.EntryType -eq 'Error' } } -computername REMOTECOMPUTERNAME

Will pull down the Application event log errors from that remote computer.  I can put that data DIRECTLY into an EXPORT-CSV via a pipe, or make a more specific script and have it filter for certain types of data.

But the important detail here is one thing: it is SO much faster and SO much more powerful.  Because of Powershell, it wouldn't take much to query Active Directory for a list of servers, query ALL the event logs, and pull down a nice fancy report from the result!
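As a rough sketch (leaning on the Quest cmdlets from earlier posts; the -OSName filter is an assumption about how your servers are named, so adjust to taste):

# Get the names of all Server-class computers from A/D
$servers = GET-QADCOMPUTER -OSName '*Server*' | FOREACH { $_.Name }

# Pull the 50 newest System log errors from each, landing the results in a local CSV
INVOKE-COMMAND -scriptblock { GET-EVENTLOG -LogName 'System' -EntryType 'Error' -Newest 50 } -computername $servers | EXPORT-CSV C:\ServerErrors.csv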

 

Powershell – I love You

Sean
The Energized Tech

My eyes are open and they have SEEN the LIGHT!

If you are wondering and humming and hawing about having an Open License with Microsoft and are unsure about Software Assurance?  I just found the nail to close the sale for you!

There is additional cost, and the fact that a new version “Might” get released isn’t enough for some people.  I can completely understand that.

But MDOP, the Microsoft Desktop Optimization Pack for Software Assurance, is MANNA from the gods!  And it's PART of Software Assurance!

OK, first off you get MED-V, which allows you to have virtualized legacy environments running seamlessly in the back end, so anything that is old and NEEDED can STILL run on current technology, solidly and SEAMLESSLY.  You get App-V as well.  These two environments alone, independent of Windows 7, can bring compatibility to levels you've never seen.

But to boot, you get ADVANCED Group Policy Management that lets you roll out and roll back different Group Policies, plus Asset Inventory and System Center Desktop Error Monitoring.  More power to monitor and extend your network's abilities in one package.

Any ONE of these would open up my eyes like giant pancakes.

But then I met "D.A.R.T." – the Microsoft Diagnostics and Recovery Toolset.

DART takes the Windows Recovery Environment (WinRE), extends its wings and MAKES IT FLY!  "D.A.R.T." builds an ISO file with more power than you could ever want.  And that Boot.WIM file it builds is ALL WinRE-based.

So guess what?  With a little clicking here and there (or just a handy MDT / WDS setup in the back end) you can boot into the GREATEST single tool to help systems repair and recover in any environment, Enterprise or Small Business.

DART contains a malware scanner, a strong registry editor, the ability to reset local Admin passwords, file undeletion, MBR repairs for the partition, hotfix removals and even a decent version of Explorer!

WinRE is a gorgeous piece of technology.  And it's free with Windows 7 and Server 2008 R2.  But adding DART to WinRE launches it like a rocket!

If you're considering Software Assurance?  Consider that the ability to undelete files, as a backup data-recovery plan, could ALONE more than pay for the value of Software Assurance, and be an even MORE guaranteed factor in ensuring maximum recoverable uptime in your environment!

So my workstation is now testing out DART.  The only drawback, with my workstation being Windows 7, is that it's so darn stable…

I don’t get to see things break.

MDOP.  Seriously look into it for your organization.

Sean
the Energized Tech


I had a question in our User Group during Powershell Script Club: could Powershell be used to show a list of all the computers in Active Directory?

Well I took a shot at it tonight.

With Quest Active Roles installed I ran a

GET-QADCOMPUTER | EXPORT-CSV C:\Computers.csv

and examined the headers from the CSV file in Excel to see what data was available.  I spotted about four date columns with what appears to be some duplication, but the useful ones seem to be "CreationDate" and "ModificationDate".

CreationDate, simply enough, seems to be the date the object got created in Active Directory.

ModificationDate seems to be the last time the object was touched in A/D by the computer for any reason.  It also seems to be a date field that is updated fairly regularly by a live system.

So the process is typical date-compare stuff.  Store the current date and compare the ModificationDate against it:

 

$COMPAREDATE=GET-DATE

GET-QADCOMPUTER | WHERE { $_.ModificationDate -gt $COMPAREDATE.AddDays(-90) }

 

Should give us a nice simple list of computers that are ACTIVE within the network in the last 90 days.

Now mind you, this is not a precise science.  I wouldn't just go ACTIVELY purging objects with this, but flip the comparison around and it can at least show you a list of Computer Objects you can probably purge from A/D.

 

And of course, if you pipe that flipped comparison into an EXPORT-CSV, you get a nice list in Excel to see and work with.

 

$COMPAREDATE=GET-DATE

GET-QADCOMPUTER | WHERE { $_.ModificationDate -lt $COMPAREDATE.AddDays(-90) } | EXPORT-CSV C:\IDLECOMPUTERS.CSV

 

Keep in mind as well: if you have Virtual Machines that don't normally get powered up and SIT for 90 days, they would also fall into this list.  That's why I said use it as a reference list.  DON'T, and I repeat DON'T, just pipe this into a REMOVE-QADOBJECT without a -WhatIf unless you're in a Test Domain.

Or unless you’re looking to get fired really quick.
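For the record, that safety net looks something like this (REMOVE-QADOBJECT being the Quest cmdlet; even with the -WhatIf, please keep this in a test domain):

$COMPAREDATE = GET-DATE

GET-QADCOMPUTER | WHERE { $_.ModificationDate -lt $COMPAREDATE.AddDays(-90) } | REMOVE-QADOBJECT -WhatIf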

 

Powershell

It just makes life EASIER!

 

Sean
The Energized Tech

 
Remembrance Day

So long ago
A time long forgotten
When madness
engulfed our world
 
Shouts of terror
Screams to Silence
So much anger
And cries of despair
 
Confusion and Chaos
And attempts to lay down order
Crimes of the Guilty
The Punishing of the Innocent
 
A time when the few
Stood up to the many
A world divided
A world united
 
Days long ago
When our Father's Fathers
And Mother's Mothers
Fought each other
And Side by Side
 
So that we
Their many children
Their unknown offspring
Could have today
 
To live
And breathe
With the Right
To say 'NO' to the darkness
 
And most importantly
To Remember
Their mistakes
Their Sacrifices
 
That we might stand today
And live to
learn from them
And Grow
 
So today
This Nov 11th
On that 11th hour
Pause for a moment
 
And Remember
What They did
And Gave up
For you today
 
Sean P. Kearney
Ran into a small stumbling block today on the Network. 
 
I had a user request to have an Add-In Disabled on the workstation.    No problem easily done.
 
Go to list of Add-ins and just "Clear the Box".
 
But I thought for a moment, I like being Laz..... I mean "Centrally Efficient" and controlling as much as possible from Group Policy.  Oddly enough (And maybe I'm missing it) I couldn't find a GPO to control Addins at all.   Either to deploy, or more importantly, to disable.
 
But if you're running Server 2008 with a newer Domain?  You've got a GREAT feature called Group Policy Preference Client Side Extensions you can leverage.  And it will work on Windows XP, Vista, Server 2003 and Windows 7, as long as you have the Group Policy Preference Client Side Extensions hotfix installed on the workstation.
 
For Windows XP          - Download Here
For Windows XP x64    - Download Here
For Windows Vista       - Download Here
For Windows Vista x64 - Download Here
For Server 2003           - Download Here
 
For Windows 7            - Built right in!
 
So if you have a Server 2008 environment with the newer Group Policy, there is an additional option now called "Preferences".  These DON'T negate the regular Group Policy options.  Think of them as what they are: Extensions.  Do GPOs the normal way, but there's a whole new way of tweaking very specific features like drive mappings and so on.
 
Or today, applying Registry settings!
 
So in my case I had to Disable the Office Live Addin.  Nothing wrong with it, but we just don't need it in our environment.   So we'd like it disabled.  
 
So under HKEY_LOCAL_MACHINE\Software\Microsoft\Office there are a series of keys for Word / Excel / Outlook etc. ( If you're on a 64bit workstation the actual location is HKEY_LOCAL_MACHINE\Software\Wow6432Node\Microsoft\Office\ )
 
Under each entry you will see an "Addins" key, with a subkey for each add-in.
 
 
The value to note here under each key is called "LoadBehavior".  If it has a value of '3' the add-in will load automatically; if it has a value of '2' it is "Disabled", although still present in the file system.  I like disabling rather than uninstalling; it's far easier to "flip the switch on" if needed.
 
And you could probably even use a script to do this too.  But with Group Policy Preference Client Side Extensions, I can push that Registry change to my workstations centrally.
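(If you'd rather test the change on one box first, a quick Powershell sketch does the same thing locally:

# Flip the Office Live Add-in in Word to disabled (2); use 3 to re-enable
# On 64-bit machines, substitute the Wow6432Node path mentioned below
Set-ItemProperty -Path 'HKLM:\Software\Microsoft\Office\Word\Addins\OLConnectorAddin.Connect' -Name 'LoadBehavior' -Value 2

But central management is the goal here.)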
 
I just go to the Computer Configuration in the Server 2008 GPO, pop into the Preferences there, and you'll see an entry that says "Registry".
 
 
Just right click on Registry, choose "New", then "Registry Item", and fill out the form as appropriate.
 
 
You can even browse to the key but for now we'll type the details in directly.
 
Hive is "HKEY_LOCAL_MACHINE"
Path is "Software\Microsoft\Office\Word\Addins\OLConnectorAddin.Connect" (That's the Key we're modifying)
Value Name is "LoadBehavior" (since that's the value we're changing)
Value Type is "REG_DWORD" (since that is the original value type, you can tell by viewing with REGEDIT)
Value Data is "2" (Normally for Addins it's '3' which is enabled, '2' is disabled, I found this out by changing the settings and looking at values after)
 
This will disable the Office Live Com Addin in Word.   If you need to do this for Excel?  Just change the Path to "Software\Microsoft\Office\Excel\Addins\OLConnectorAddin.Connect"
 
Also note, if you're working on a 64bit machine, and since Office is a 32bit app, it's path will be "SOFTWARE\Wow6432Node\Microsoft\Office\Word\Addins\OLConnectorAddin.Connect"
 
But you CAN control all these add-ins from Group Policy, thanks to Group Policy Preference Client Side Extensions.  It takes a tiny bit of thought, but it's worth the effort since you can now manage all of this for anything from 5 to 50,000 workstations EASILY!
 
I love Technology
 
Sean
The Energized Tech
 

The silence blows through the air like the sound of a thousand dead thermal fax machines.  The air is brimming with calm and quiet and horrid writing.

It is a night, a night ripe for terror!  It is Devils night, the night before Halloween.

A late night data entry clerk is sitting down, keying in mounds upon mounds of useless data into an oversized database.  A long night it seems.  Exceedingly long.

Our poor hapless victim ...er data entry clerk....has stepped up for a moment to go refill his can of "unnamed corporate caffeinated beverage so I can't be sued." He turns and inadvertently bumps the remainder of his drink onto the desk.

"EEEEP!" he shrieks in a high pitched girlish scream as he quickly tries to mop up the mess with his worn out sweaty shirt.

But it is too late.  The sweat and soda and whatever else was on the desk have oozed their way through to the computer below.  Crawling and melding its oozy, smelly, sugary way into the heart of the system.  The smell of BO and soda burns in the air along with an, as of yet, unnamed third and completely unidentifiable smell.

The smell is not relevant however.  What is, is the sound. Indistinguishable it sits in the background working up the spine of the living. Roaming throughout the room and omnipresent.  Like the smell of a thousand burning Kaypros, it hangs in the air.

"Wow, never seen THAT happen from a soda spill!" Our hapless vict.... Hopeless idiot states.  He runs out (Convenient timing eh?) to grab a towel to sop up the rest of the mess.  Quickly drying up the mess he heads out, hoping the mess will be blamed on somebody else.

Returning to the room, stupid the Data Entry Clerk views a completely indescribable sight, a sight which only responds with one sound as he is enveloped into nothingness.

*GLOMP!*

During the night, Charlie, the unwary and completely and utterly helpless janitor who just so happens to be wandering the hallways, hears the noise.  It is coming from the vents.  Suspecting it to be a rat or small child roaming aimlessly within the ducting system, he grabs a broomstick to shoo it off.  Opening the access port, he hears the noise.  Low and rumbling.  Deep and foreboding.  Calling out to potential Darwin Award winners everywhere.

And hungry, very hungry.  "More Brainz ... Brainzzzzzzz... more brainzzzz" it echoes in a slow dull rumbling sound.  

"Odd.   The heating vent isn't usually this chatty." mutters Charlie shaking his head POKEing the vent and jabbing with uncertainty, and stupidity.

It was at that point the broomstick suddenly disappeared into a puff of illogic.   Charlie looked inside.  Quickly disappearing himself into nowhere.  Swallowed up by bad spaghetti programming echoed from Basic 4.0 on Commodore Pets from ancient past.

*GLOMP!*

Unwitting readers now scratching their head at this point looked for clarification.   But clarification was not for them to be had.  Clarification requires documentation and clearly defined parameters.   On this night, the parameters were NULL and VOID and the variables random.

Only one thing would come to light.  (or dark).  The Living Undead Processes had been spawned.   sHell itself had opened up.   Nothing was safe anymore. (or any when).  Resources would shriek out in terror tonight, Users would cry in agony.   And writers of good horror stories would sob at this poorly written idea.

They roamed the halls.  They roamed paths, crawling through directories and subfolders.  They roamed random thoughts of the author seeking one thing.  Brains.   Big thick juicy brains.  With Ketchup.  And perhaps a side order of Flies.  

the Living Undead Processes, ripe with the taste of fresh Janitor ala Broomstick were looking for more brainz.  But being a late night, nobody else was really there.  And so they hid in a back hallway and suspended themselves until morning.  A scheduled task to awaken them at the appropriate hour.  And spawn the Daemons.

Time passed...

And passed further....

And yet further on...

Until... yes UNTIL.

Morning.  October 31st.

11:00am, deep into the work day, the living Undead Processes spawned free and began their Task of CONSUMING resources and Brainz.   Slowly, outwards from the vents, they seeped out.   The smell of brainz were fresh indeed.  Deep inside the IT Department four score of people deep into Product Development and one lone IT Pro.

But before diving deep into the meal, the Undead processes went for a little snack.

"Apples!" cried the leader as it spotted the Marketing Department.  Yes rows upon rows of delicious, electronic glowing Apples.  But no, not Macintoshes.  Classic Apples.  Running on Floppy disks no less. 

A roar of delight came from the Undead Processes as they dove into the circuit boards and began gnawing on the minimal storage of the floppies.  Gnawing and chewing, they almost didn't notice a small band of hapless marketing staff walking in from their trip to a "completely unnamed coffee shop" that MIGHT end in the name BUCKS.  ALMOST.

"Yes well Jenny, I personally believe that using a bit of mauve to top of the schema on the third la...."

The sentence never finished.

The Undead Processes launched from the ancient computers, routing through the wireless, completely indescribable to their victims.  Surprising them with both untested speed and surprising blocky graphics.

"BRAINZ! BRAINZ!"

The Helpless marketing staff never knew what hit them. Shrieks and screams of terror as in moments, they too began hosting the unstoppable entity known as the Undead Processes. 

*GLOMP!* *GLOMP!* *GLOMP!* *GLOMP!*

The moaning and staggering began.

"BrAiNzzzzzz!  bRAinnZZzzzz!!!!" slurred the Marketing staff and the Undead Processes now linked as one.   Old IEEE488 cables were fashioned to create a strange and horrifying pseudo network.   The Undead Processes were now, no longer trapped.   They began to spawn further creating multiples of themselves.

The roaming began slowly through the halls.

Into the hallways.  Working their way towards the secured domain known as IT.  "BrAAAIINNNZZ! BrAiiNnnnzzz!"

Along the way, they spotted terminals and users.   They consumed the minds of the users and attempted to access the network, seeking more resources along the way.   But with the weak credentials, they could gain nothing.  They needed IT.

They craved the unlimited and powerful resources held within the brains of IT.  

"Brainzzzzz!!" The growing army of Living Undead Processes, moaned out.  Now tied together with bits of old Arcnet, Token Ring and pieces of Zip drives.    Slurring and dragging horribly misspelled phrases and badly created slices of code. Spitting randomly created batch files.  Coughing Hollerith punch cards everywhere.

Down to the deepest levels,  Scarfing down Managers, Copy staff and people in the Call Center.   The Living Undead Processes grew.  A large and empty maw of nothingness taking over the entire office.  Devouring Fax numbers and Calendar entries as well. Brainz and Resources their diet.   With the occasional MP3 collection as a dessert.

*GLOMP!* *GLOMP!* *GLOMP!* *GLOMP!*

The Office slowly began to disappear.

Soon, they arrived at the entrance to the IT Department.

*Beep* (RED LIGHT)

*Beep* (RED LIGHT)

The access cards they had acquired were all useless.   Nobody had the rights to the Domain of IT except for IT.

But the Living Undead Processes were not stupid.  Having consumed enough people in Accounting, they knew where 36% of the IT budget went.  They knocked on the door.

"PIZZA DELIVERY! Free booze! Dancing girls!"

The door opened a crack, just enough to slip some code, or a hapless Developer out.  

And too late for our friend, *GLOMP*, he was consumed by the Living Undead Processes.   They licked their input channels in delight at this newly acquired resource.   It was good but still lacking something.   The taste was not quite right.

They tried the new Swipe Card. 

*Beep* (RED LIGHT)

*Beep* (RED LIGHT)

"PAAARRRRRRGGGGHHH!!!" came the sound of a former Nurse.  "It's a COOP Student!"

It was true.  IT sent a pawn to do its bidding.  A traditional tactic.  Good to keep Managers at bay.  Or, on Halloween, random armies of Living Undead Processes.

The Living Undead Processes formed a Pseudo Discussion Group and began looping through ideas.   PUSHing ideas into a huge STACK.

"We musssssst enter the IT Domain!  We neeeedddds their Braaaiiiinzzzz!!" Muttered the Undead Processes as one  with Unification.   "Trickssss them we willll..."

"Taunts them with baublesssss we mussssst."

"SCHWAAAAG! Gives 'dem SCHWAAAAG!"

They knocked again.

"Free TechNet Direct!  Free MSDN! Sign up now!"

Squeeeeeaaaak! (the door to the Holy chambers creaked open, two tiny blinking pairs of eager eyes looked out.)

*GLOMP! GLOMP!*

Soon two more hapless victims and their hardware were consumed, but still...

"PAIGH!  MORE CO-OP students!  Their tassssssste is lacking assss issssss their ressssourcesssssss….."

The former head of Accounting growled out....

"They weres inflatingsssss their budgetsssss I thinkssssss.... Unpaid staff!  These are smart onesssss.... Their Brainzz we must havveee..."

The nothingness echoed.  "The IT Department we must have...."

Meanwhile, the disappearance of three co-op students did not go unnoticed.  Although it was quite common for Co-Op students to disappear for hours at a time, they did not usually exit with a sound of crunching bones and *GLOMP* noises.

The lone IT Pro examined the security camera.  

Quickly he messaged the rest of the paid Development Team (all three of them) through his Communicator R2 application.

ITDEPT: Problem in Office.  Management and staff appear more "dead" that usual this morning.
CODELORDZ: Anything odd on the system last night?
ITDEPT: Checking logs.   Basement Data Entry system, spawned an overload.   Suspect leaking memory and one less Data Entry Clerk.
CODELORDZ: Any sign of anything unusual?
ITDEPT: Checking cams.  Oh great.  Caffeinated Beverage all over Terminal.  Missing Janitor too.  Ugh!
CODELORDZ: Big problem.  That means only one thing.  Undead Processes.  Lots of them.  And by the sounds, hungry.
ITDEPT: Undead Processes?  So that means no TASKKILL on this one.  
CODELORDZ: And unfortunately copying them to /DEV/NULL on that Linux router on the back corner won't kill them either.
ITDEPT: How do you kill an UNDEAD Process?
CODELORDZ: Normally you'd just try to shoot off the SOURCE Undead Process that spawned them all into an endless loop. That doesn't kill them, but it renders them useless.
ITDEPT: Why can't we do that?
CODELORDZ: Because the source code was eaten with that last Coop Student. WE errrr.... left it on Floppy disk.  :)
ITDEPT: :( Store all code on the TFS in future! So you can't kill an undead Process but you can keep it busy?
CODELORDZ: Yes, they just consume and consume resources, but they're not very Smart Processes.  They're like CP/M 2.2 applications.
ITDEPT: Ahhh, hang on..... I think I have an answer, it will involve a recursion routine... and perhaps another Co-Op student

They handed off a new laptop to Ernie, the newest and most eager of the Co-op students.  His task was simple.

“Bring this outside, and we’ll make you a Systems Administrator…”

Our hapless little Vict…. Voluntold's eyes lit up.

Out in the hallway, the Undead Processes had found a small cache of mice and were gnawing on the cables for nourishment.  "Braiinnzzz!  We want Braaiiiiinnnzzz!" They groaned hungrily eyeing the entrance to IT

...Squeeeeeeeaaaaaaakkkkk!...

The lone Co-op wandered outside with a DuoCore laptop, 16 GB of RAM and mirrored 1-Terabyte hard drives.  And an unlocked desktop!

“Woohooo! I’m a Systems Administrator now! YEAH! Woohoo! Wooo…..” cried the silly fool.

The scent of this large irresistible resource was too much for the Undead Processes.

"ARARARARRRAAAARRRRGGGHHHH!!" The processes leaped out and consumed the student *GLOMP* and dived in the laptop.

" I HAVEEEEEE HISSSS CREDENTIALLLLLLSSSSS!!!" The Source UnDead Process cried out!  It Haaaaaasssssss DOMAIN ADMIN RIGHTTSSSSS!!!!!"

Logging in, they drooled

USERID       Voluntold-CoOp
Password    Ub3rk001R@d
DOMAIN     IT

Diving in, they looked, "RESOURCES!!!! UNLIMITED POWER AND RESOURCESSSS!!! YESSSSSS!!!!" The Undead Processes roared out.  They consumed the recursion routine and began growing, growing phenomenally.

One by one they exited their victims, diving into their prey.   An Unprotected Domain.   Soon the floor was littered with the shells of many staff.  Now Brainless without Process or Purpose. 

The Domain Roared with CPU usage, Page files overloading with storage wasted.  They began to roam the LAN to seek out and find...

"NOTHINGS!!! THERE IS NOTHINGS!!!! TRICKS USSSS!"

IT smiled.  The domain, being virtual, smelled real and powerful, but was as useless as a webcam on a DOS 3.3 computer.

"NOOOOOO!!!!! " the screams of the Undead Processes roared in agony as the child partition of the Hyper-V Virtualization domain went to sleep.  Quickly the screaming of their WAV files vanished.

IT smiled as it quickly, quietly and simply scrambled the VHD files by recursively zipping them with Random passwords, destroying the Undead Processes.

ITDEPT: Make sure we put in a call for some extra Co-Ops in the morning.
CODELORDZ: :) Appears we finally found a good use for them, what about all those empty shells in the hall?  No Brainz, Dead to the World?
ITDEPT: Never fear.  Just sit them back down at their desks.  Nobody will be able to tell the difference. :)