Bulk Import Folders and Files in SharePoint Document Library | Part 1

This afternoon I was on a call with the SharePoint development team to discuss a few activities that are pretty straightforward for SharePoint IT professionals. However, we need to tell both IT Pros and developers about the activities we carry out in the farm; by doing this we simply avoid confusion. And, as always, a bulk upload of files and folders in any farm may cause one issue or another – list view thresholds, performance hits, a higher risk of uploading vulnerable files, and so on.

Our client came with roughly 5,000 folders holding approximately 9,000 files, and the requirement is to push them into a document library: each source folder should become a single folder in the library, with its files stored inside it. They appended a request: "Share the cmdlet so we can add more in future!"

Everything is easy if we build it ourselves! There is no out-of-the-box solution, and we can't simply drag and drop because of the known limitations of the document library. So, I built a small cmdlet in Visual C# which serves the purpose.

The folder structure is very simple: a main folder which contains N folders, and each of those folders may have some files. There are no folders nested underneath them.
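
If you want to try the cmdlet against a throwaway structure first, here is a quick sketch (the paths are hypothetical) that creates a main folder, a few child folders, and a couple of files in each – matching the layout described above:

1..3 | ForEach-Object {
    # Create Folder1..Folder3 under the main folder, each with two small text files.
    $folder = New-Item -Path "C:\Temp\MainFolder\Folder$_" -ItemType Directory -Force
    1..2 | ForEach-Object {
        New-Item -Path (Join-Path $folder.FullName "File$_.txt") -ItemType File -Force | Out-Null
    }
}

The cmdlet itself is below.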

using System;
using System.IO;
using System.Management.Automation;
using Microsoft.SharePoint.Client;
namespace SP2010Admin
{
    [Cmdlet(VerbsData.Import, "Document")]
    public class ImportDocument : Cmdlet
    {
        [Parameter(Mandatory = true)]
        public Uri SPUrl { get; set; }

        [Parameter(Mandatory = true)]
        public string SPDocLib { get; set; }

        [Parameter(Mandatory = true,
                   ValueFromPipeline = true,
                   ValueFromPipelineByPropertyName = true)]
        public string FolderPath { get; set; }

        protected override void ProcessRecord()
        {
            ClientContext ctx = new ClientContext(SPUrl);
            Web web = ctx.Web;
            ctx.Load(web);
            ctx.ExecuteQuery();

            List docLib = web.Lists.GetByTitle(SPDocLib);
            string[] filenames = Directory.GetFiles(FolderPath);
            DirectoryInfo dInfo = new DirectoryInfo(FolderPath);

            FolderCollection folders = docLib.RootFolder.Folders;
            ctx.Load(folders);
            ctx.ExecuteQuery();

            bool exists = false;

            // If a folder with the same name already exists, upload the files into it.
            foreach (Folder efolder in folders)
            {
                if (efolder.Name.Equals(dInfo.Name))
                {
                    foreach (string filename in filenames)
                    {
                        WriteObject("Uploading to existing folder: " + filename);
                        FileCreationInformation fci = new FileCreationInformation();
                        fci.Content = System.IO.File.ReadAllBytes(filename);
                        fci.Url = Path.GetFileName(filename);
                        efolder.Files.Add(fci);
                    }
                    exists = true;
                }
            }

            // Otherwise create the folder first, then upload the files into it.
            if (!exists)
            {
                Folder tFolder = folders.Add(dInfo.Name);
                foreach (string filename in filenames)
                {
                    WriteObject("Uploading file " + filename);
                    FileCreationInformation fci = new FileCreationInformation();
                    fci.Content = System.IO.File.ReadAllBytes(filename);
                    fci.Url = Path.GetFileName(filename);
                    tFolder.Files.Add(fci);
                }
            }

            // Commit the folder creation and all file uploads in one round trip.
            ctx.ExecuteQuery();
        }
    }
}

Usage:

Import-Module "c:\users\ChenV\documents\visual studio 2015\Projects\SP2010Admin\SP2010Admin\bin\Debug\SP2010Admin.dll" -Verbose
(Get-ChildItem "C:\Users\ChenV\ProjectFolder\Test\" | Where-Object { $_.PSIsContainer }).FullName | Import-Document -SPUrl http://www.contoso.com/web -SPDocLib "Doc Lib" -Verbose

In Part 2 we will discuss best practices and enhance this code further! Enjoy PowerShell 🙂 🙂 🙂

Thanks to the author Rahul Rashu on TechNet Wiki.

Get-SPOAppInfo | SharePoint Online | PowerShell

During a SharePoint Online discussion a question popped up: "How do we get information about all the installed applications in SharePoint Online?" A simple answer is the Get-SPOAppInfo cmdlet! But wait – that is only partially correct; read the documentation at https://technet.microsoft.com/en-us/library/fp161398.aspx. The parameter table lists both Name and ProductId as not required, yet in practice the cmdlet needs either a Name or a ProductId – so we can't use it to retrieve all the apps installed in a given tenant!

Issue:

So we can get installed app information only when we already know the app name or product ID.
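
For completeness, that usage looks roughly like this (a minimal sketch; it assumes you can connect to the tenant admin site with the SharePoint Online Management Shell, and the app name is a placeholder):

#Connect to the tenant admin site, then query a single app by its name.
Connect-SPOService -Url "https://contoso-admin.sharepoint.com" -Credential (Get-Credential)
Get-SPOAppInfo -Name "SomeApp"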

Enough! We are not going to use this. Let's use the Client Side Object Model in PowerShell and solve the issue.


Code:

Import-Module C:\SPPowerKit\Microsoft.SharePoint.Client.dll
Import-Module C:\SPPowerKit\Microsoft.SharePoint.Client.Runtime.dll
#Import-Module C:\SPPowerKit\Microsoft.SharePoint.Client.UserProfiles.dll
Import-Module C:\SPPowerKit\Microsoft.Online.SharePoint.Client.Tenant.dll
function Get-SPOAppInformation
{
    param(
    [Parameter(Mandatory=$true)]
    [string]$SPOUrl,

    [Parameter(Mandatory=$true)]
    [System.Management.Automation.CredentialAttribute()]$SPOCredential
    )

    $ClientContext = [Microsoft.SharePoint.Client.ClientContext]::new($SPOUrl)
    $ClientContext.Credentials = [Microsoft.SharePoint.Client.SharePointOnlineCredentials]::new($SPOCredential.UserName,$SPOCredential.Password)
    $Tenant = New-Object Microsoft.Online.SharePoint.TenantAdministration.Tenant -ArgumentList $ClientContext
    $Tenant.Context.Load($Tenant)
    $Tenant.Context.ExecuteQuery()
    $Appinfo = $Tenant.GetAppInfoByName([string]::Empty)
    $Tenant.Context.Load($Appinfo)
    $Tenant.Context.ExecuteQuery()
    $Appinfo
    $ClientContext.Dispose()
}

Get-SPOAppInformation -SPOUrl "https://contoso-admin.sharepoint.com" -SPOCredential "TenantAdmin@contoso.onmicrosoft.com"

To organize the output a bit more, we can use a PSObject!

Import-Module C:\SPPowerKit\Microsoft.SharePoint.Client.dll
Import-Module C:\SPPowerKit\Microsoft.SharePoint.Client.Runtime.dll
#Import-Module C:\SPPowerKit\Microsoft.SharePoint.Client.UserProfiles.dll
Import-Module C:\SPPowerKit\Microsoft.Online.SharePoint.Client.Tenant.dll
function Get-SPOAppInformation
{
    param(
    [Parameter(Mandatory=$true)]
    [string]$SPOUrl,

    [Parameter(Mandatory=$true)]
    [System.Management.Automation.CredentialAttribute()]$SPOCredential
    )

    $ClientContext = [Microsoft.SharePoint.Client.ClientContext]::new($SPOUrl)
    $ClientContext.Credentials = [Microsoft.SharePoint.Client.SharePointOnlineCredentials]::new($SPOCredential.UserName,$SPOCredential.Password)
    $Tenant = New-Object Microsoft.Online.SharePoint.TenantAdministration.Tenant -ArgumentList $ClientContext
    $Tenant.Context.Load($Tenant)
    $Tenant.Context.ExecuteQuery()
    $Appinfocollection = $Tenant.GetAppInfoByName([string]::Empty)
    $Tenant.Context.Load($Appinfocollection)
    $Tenant.Context.ExecuteQuery()
    foreach($Apps in $Appinfocollection)
    {
        $Results = New-Object psobject -Property ([Ordered]@{
        Name = $Apps.Name
        ProductID = $Apps.ProductID
        Source = $Apps.Source
        })
        $Results
    }
    $ClientContext.Dispose()
}

Get-SPOAppInformation -SPOUrl "https://contoso-admin.sharepoint.com" -SPOCredential "TenantAdmin@contoso.onmicrosoft.com"


SharePoint 2013 Foundation | Central Administration | File Not Found

Yet another day of SharePoint 2013 Foundation troubleshooting. I was building a SharePoint 2013 Foundation farm on Windows 2008 R2 Standard with Service Pack 1, and the same box hosts SQL Server 2012! Lesson learnt: "Do not do an installation on the fly!" 🙂

I was quite sure the box I chose for installing SharePoint 2013 Foundation met the software and hardware requirements. Indeed, scalability and performance were not considered at that point in time.

First and foremost, Symantec Antivirus blocked the configuration wizard at step 4.

Solution: Uninstalled the Symantec Antivirus Software and fixed the access issue.

Then we moved one step ahead and completed the setup, and I was a bit happy to see the configuration-successful window.

Then, while launching Central Administration, we got the glossy and silky error 🙂

Sorry, Something Went Wrong

File Not Found

We followed the steps below:

Turned on custom errors and the call stack in web.config – no luck! No clue why the browser still didn't show the detailed error message.

Removed Central Administration and recreated a new one – no luck!

Un-provisioned Central Administration and the Central Administration database using the code below – no luck!

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction 0
Function Remove-CentralAdmin
{
    $CA = Get-SPWebApplication -IncludeCentralAdministration | ? {$_.IsAdministrationWebApplication -eq $true}
    if($CA -eq $null)
    {
        "Central Administration is missing!"
        return
    }
    foreach($App in $CA)
    {
        # Unprovision the web application and its content databases, then delete it.
        $App.UnprovisionGlobally($true)
        foreach($DB in $App.ContentDatabases)
        {
            $DB.Unprovision()
        }
        $App.Delete()
    }
}

Remove-CentralAdmin

Checked the IIS host header and virtual directory settings – all looked good!

Finally, the below simple snippet saved my day!

Get-SPLogEvent -StartTime <DateTimeStamp> -EndTime <EndTimeStamp> | ? { $_.Correlation -eq "<CorrelationID>" -and
$_.Level -eq 'Unexpected' } | Select Message

Message:

System.IO.FileNotFoundException: Could not load file or assembly 'MySql.Web.v20, Version=6.9.3.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d' or one of its dependencies.

The system cannot find the file specified.

Simply Awesome!

Again, I modified the web.config custom errors and call stack settings, restarted the IIS Admin service, and bounced the server – just to keep things neat and clean! Bingo, it worked – this time the browser showed the same error message with full details.
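
By the way, the two web.config switches I keep flipping by hand (<customErrors mode="Off"> and <SafeMode CallStack="true">) can also be set with a few lines of PowerShell. This is a rough sketch only – the virtual directory path is a placeholder for whatever your Central Administration site uses, and back the file up first:

$webConfig = "C:\inetpub\wwwroot\wss\VirtualDirectories\<CA-Port>\web.config"

[xml]$config = Get-Content $webConfig
$config.configuration.SharePoint.SafeMode.CallStack = "true"      #show the full call stack
$config.configuration.'system.web'.customErrors.mode = "Off"      #show detailed errors instead of the friendly page
$config.Save($webConfig)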


Line number 287 in the system machine configuration file is shown below:

<providers>

<add name="MySqlSiteMapProvider" type="MySql.Web.SiteMap.MySqlSiteMapProvider, MySql.Web.v20, Version=6.9.3.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d" connectionStringName="LocalMySqlServer" applicationName="/" />

</providers>

Solution: Commented out this line!

<providers>

<!-- <add name="MySqlSiteMapProvider" type="MySql.Web.SiteMap.MySqlSiteMapProvider, MySql.Web.v20, Version=6.9.3.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d" connectionStringName="LocalMySqlServer" applicationName="/" /> -->

</providers>

Now Central Administration opened up fine 🙂

Keep SharePointing and enjoy PowerShell!

Show or Hide SharePoint List Using CSOM | C# | PowerShell

Disclaimer: This is a pretty old topic; I just remade it using CSOM and PowerShell at my client's workplace.

During a SharePoint due diligence many people suggested using SharePoint Manager, CAML Query Builder, and similar tools. It's good to keep tools handy, but gathering and executing the client's requirement matters more than any of them.

Environment: The client has WSS 3.0, MOSS 2007, and SharePoint 2010 platforms; they want to organize the structure and keep the contents consistent. The eventual goal is to move to upgraded versions.

During this big discussion the development team suggested writing a script which reads the web, site, list, and document library properties and sets the same values in each farm. Dude! Writing a script is easy, and deploying it is very easy as well. But before we do that, let's understand why the properties are set differently. Who made that decision? The business justification is much more important, so I suggested that the Power Users or SharePoint L1 team decide and deploy the solution.

How? Power Users like GUI-based solutions; they don't want to execute console applications or scripts. As usual we picked up a call, and I was surprised by the content manager's requirement – "We need a one-liner cmdlet so we can create lists and set their properties. It may be one or many!" It's that simple – PowerShell has reached the content managers as well 🙂 I was happy because I was going to meet their needs in a short span of time!

To kick off the demo in a catchy way, I showed them what a PowerShell one-liner looks like 🙂

Get-EvenNumbers -From 1 -To 10

It's that simple to use. I haven't added any validation to this function – it's very simple code that prints the even numbers using the range operator. The function code is below:

Function Get-EvenNumbers
{
    param(
    [Parameter(Mandatory = $true)]
    $From,
    [Parameter(Mandatory=$true)]
    $To
    )

    ($From..$To).ForEach({if($_ %2 -eq 0){$_}})
}
Get-EvenNumbers -From 1 -To 10

Solution for Content Managers:

Consume CSOM and build a binary module using a Class Library project 🙂 🙂 🙂 This time I used Visual Studio 2013, just for a change 🙂

using System;
using System.Net;
using System.Management.Automation;
using Microsoft.SharePoint.Client;

[Cmdlet(VerbsCommon.Set, "SPListVisibility")]
public class SetSPListVisibility : Cmdlet
{
    [Parameter(Mandatory = true)]
    public string SPUrl { get; set; }

    [Parameter(Mandatory = true)]
    public string SPListName { get; set; }

    // True shows the list; false hides it.
    [Parameter(Mandatory = true)]
    public bool ListVisibility { get; set; }

    protected override void ProcessRecord()
    {
        ClientContext ctx = new ClientContext(SPUrl);
        NetworkCredential credentials = CredentialCache.DefaultNetworkCredentials;
        ctx.Credentials = credentials;

        List list = ctx.Web.Lists.GetByTitle(SPListName);
        ctx.Load(list);

        // Hidden is the inverse of visibility.
        list.Hidden = !ListVisibility;
        list.Update();
        ctx.ExecuteQuery();
        ctx.Dispose();
    }
}

Usage (Just for Demo):

Import-Module C:\Scripts\Module\CSOM_Demo.Dll
#Hide the List
Set-SPListVisibility -SPUrl http://weburl -SPListName "PowerShell" -ListVisibility $false

#Show the List
Set-SPListVisibility -SPUrl http://weburl -SPListName "PowerShell" -ListVisibility $true

In my next blog I will share the complete module! It's that easy! Happy SharePointing and enjoy PowerShell 🙂 🙂 🙂

Reason: The content manager needs this kind of solution for reusability. Smart guy!

SharePoint Content Management using PowerShell | Dude! it’s a Due Diligence – Part 1

October 26, 2015: Due to SharePoint tasks I missed the MVP Summit 2015; my energy was drained by the disappointment! So, I engaged 16 hours of my time with the SharePoint team to perform a due diligence, and traveled in and around the Netherlands for consultations with clients, developers, and IT professionals.

Introduction: Though we already knew each other, we gathered at a party lounge for a friendly handshake and a bier 🙂 – Hertog Jan, a typical Dutch beer!


Not a huge amount when there are four of us to share it! – Cheers!!!

Challenge 1: Our security team will not allow us to install any third-party components or products not guaranteed by Microsoft. So, we need to write our own code to inventory the SharePoint farm(s). This covers MOSS 2007 & WSS 3.0, SharePoint 2010, and SharePoint 2013.

Challenge 2: The developer team is busy with SharePoint Online training; we can engage them only twice a week.

Challenge 3: We need to plot a blueprint of the SharePoint farm without third-party tools, and it needs to be shared with the IT managers before we design a plan for deployments, migrations, or transformations.

🙂 🙂 🙂 Let's keep the challenges aside; we may come across many more while working on the test and production environments. So, we decided to get back to work on October 27, 2015 with a fresh mind! Yeah, we agreed not to work at home and not even to Bing for solutions the day before. A fresh mind is most important for working independently!

Requirement 1: Get all the items from a large list – we can see 30,000+ items and we need a report in Excel.

I quickly showed the "Export to Excel" option in the ribbon 🙂 "Assumptions will not help at the workplace!" The client needs the large-list item reports both shared by email and uploaded to the SharePoint portal. If they are in the portal, why email them? To keep the customer satisfied we stopped asking questions, gathered our tasks, and took a cup of coffee back to our desks.


Oops! That's not a large list, it's a document library with Document Sets enabled! The complete web is built and deployed using custom code. Exploring it is not a challenge, but it needs to be handled with care: many columns are interconnected or mapped to multiple lists in the same web.

A CAML query is a SharePoint IT professional's best friend. Just three things we need to set – the Query, the ViewFields, and the ViewAttributes – and to do this we need a tool; indeed, that will be PowerShell. The mission begins now!


Dude, that's SharePoint 2010 hosted on Windows 2008 R2 Standard, running PowerShell 2.0 🙂 🙂 🙂 The developer team looked at me and laughed – do you remember this version? Can we raise a request to upgrade PowerShell? (Nope! PowerShell 4.0 needs some workarounds on a SharePoint box, and the PowerShell 5.0 preview is not a stable version.) Shall we do a timer job? (No! No! No! – Folks, don't forget MOSS 2007; we did a lot there without a management shell, and now we have a tool to manage with. Go and check your Visual Studio version and updates – I assume you may need one more bier!)

I dusted off my old SharePoint script file and modified it as required.

Step 1: Export – Items to CSV

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction 0
$spsite = Get-SPWeb -Identity http://www.contoso.com/web
$splist = $spsite.Lists["Large List"]
$spQuery = New-Object Microsoft.SharePoint.SPQuery
$camlquery = "<Where><Eq><FieldRef Name='FSObjType'></FieldRef><Value Type='Text'>1</Value></Eq></Where>"
$spQuery.Query = $camlQuery
$spQuery.ViewFields = [String]::Concat("<FieldRef Name='Project_Number'/><FieldRef Name='Created'/><FieldRef Name='Editor'/><FieldRef Name='Author'/><FieldRef Name='Number_of_downloads'/>")
$spQuery.ViewAttributes = [String]::Concat("Scope='RecursiveALL'")
$spQuery.ViewFieldsOnly = $true
$spListItems = $spList.GetItems($spQuery)
$collections = @()
Foreach($spListItem in $spListItems)
{
    $results = New-Object psobject -Property @{
    "Number" = $spListItem["Project_Number"]
    "Created By" = $spListItem["Author"].Split("#")[1]
    "Status" = $SPListItem.ModerationInformation.Status
    "Modified By" = ($spListItem["Editor"]).Split("#")[1]
    "Number of Downloads" = ($spListItem["Number_of_downloads"])
    }
    $collections += $results 
}
$collections | Export-Csv C:\temp\LargeList_Report.csv -NoTypeInformation

Step 2: Send it by email as an attachment and close the requirement!

$param = @{
    From = "From@contoso.com"
    To = "To@contoso.com"
    Subject = "Large List Report"
    SmtpServer = "SMTP Server"
    Attachments = "C:\temp\LargeList_Report.csv"
}
Send-MailMessage @param

Requirement 2: We have removed the existing approval workflow and lost track of the content-approved items – we can get them from the search results.

Yes, this time no SharePoint Management Shell – I kicked off Windows PowerShell 5.0 preview from my local machine and got the report. Lucky me! The SharePoint document library will not allow duplicate items 🙂 so the search results return one URL (a customized search page!), and the team didn't reset the search index 🙂 25% of the tasks completed – users can search and read content from the application.

The items found in the search results URL will be re-processed: set content approval back to approved 🙂

(Get-Content C:\Temp\Large_list.csv).Foreach({
(Invoke-WebRequest -Uri "http://www.contoso.com/web/Pages/SearchPage.aspx?k=$_" -UseDefaultCredentials | 
Select -ExpandProperty Links).href | Select -Last 1
})

Requirement 3: Approve all documents – no need to think about the workflow tasks which have already fired. So we concluded to change the moderation status to Approved for the items found in the search results.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction 0
$web = Get-SPWeb -Identity 'http://www.contoso.com/web'
$list = $web.Lists["Large List"]
$ids = Get-Content C:\Temp\Approved_Id.txt 
foreach($id in $ids)
{
    $item = $list.GetItemById($id)
    $item.ModerationInformation.Status = "Approved"
    "Approving Project {0} - Status: {1}" -F ($item["Project_Number"] , $item.ModerationInformation.Status)
    $item.Update()
}

Content managers did their sanity checks and confirmed all was good! So it was time for us to deploy the new (or upgraded) solution. I moved over to look at the developer team. Oops, what a pity – a laptop with 4 GB RAM running Visual Studio 2010 in the SharePoint development environment.


It's lunch time 🙂 🙂 🙂 and we have a few more fun things coming! Gained energy with a bitterbal.


We continued the discussion with the development team at lunch and assessed the challenges – we discussed buying some more time 🙁 (my mind voice: "I could have been at MVP Summit 2015!"). That was just a discussion, not a conclusion; after lunch we were back at our desks to continue the content management automation.

I requested the developers to ignore all existing 🙂 document library items and automation tasks, so they could develop an ASP.NET form mapped to the document library columns – Yayyyyyy! Now we have metadata columns in the library! – Need a coffee! 🙂

Requirement 4: Deploy the new solutions – yep, deploying a new WSP, a one-liner using PowerShell! The deployment went fine and the development team signed off (they will be back for production support as required), but the application didn't come up well!
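
For reference, the deployment mentioned above looks roughly like this (the WSP path, solution name, and web application URL are placeholders):

#Add the solution package to the farm solution store, then deploy it to the web application.
Add-SPSolution -LiteralPath "C:\Deploy\ContosoSolution.wsp"
Install-SPSolution -Identity "ContosoSolution.wsp" -WebApplication "http://www.contoso.com" -GACDeployment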

It’s time for us to show the Power of Shell 🙂 🙂 🙂

I expected the shiny error below 🙂 but I had to explain it to the IT managers as "sorry, we are encountering an unexpected error".


🙂 Woot! Go! Open up the SharePoint Management Shell and execute the piece of code below with the start and end time parameters – pipe it and filter by the correlation ID. It's that simple!

Start time: grab it from the error screenshot.

End time: just add 1 or 2 minutes (an assumption!).

Get-SPLogEvent -StartTime "05/11/2015 11:22:08" -EndTime "05/11/2015 11:23:08" | Where-Object 
{$_.Correlation -eq "8a343106-0d90-492f-a683-dd0fb59c0058"}

“Missing LCID for <Column>” 🙂 he he he. Missing English and French!

The new solution includes new fields which are set as mandatory; the managed term store now needs two more languages, English and French – Dutch is the default!

Okay, I had the service pack binaries, so I installed the English and French language packs to confirm the fix – voila! We were back on track – 45% of the tasks accomplished! Note: if you are on RTM, download the latest service pack, extract it, and delete any entries in the Updates folder dated 2013. That way you can install the SharePoint 2010 service packs on RTM and then upgrade to the current farm build version if required.

We have done only the initial phase of the work. A few more SharePoint 2010 IT professional tasks are shared below.

  1. Update all the missing fields.
    • How do we find the missing fields? We may have newly added columns 🙂 (see the sketch after this list).
  2. Read another list and update the document library.
    • Dude! It's a Note field, so we have HTML tags – export and import in PowerShell, or editing the document library in datasheet view, looks ugly 🙁
  3. Approve all the approved items in the document library.
    • After manipulating the items, approve an item if it was approved; pending items should remain pending 🙂 We haven't touched the workflow.
  4. Most of the information is rendered in the SharePoint portal by reading Active Directory.
    • A few users have left the organization – we need to update the author names with a common ID (this is interesting 🙂).
  5. What happened to Lync presence in the SharePoint portal? The Lync client was upgraded to Skype for Business. Need to investigate further – maybe a browser add-in!
    • Not my cup of tea, but I will still figure out the reason!
  6. Who created the item? Who modified it?
    • We sorted this out – but will discuss it again!
  7. Read the author's department and update it in a document library field (this is a very interesting task because we need to explore the User Profile Service Application).
  8. Complete the content automation – after all is good we will start the due diligence to migrate the farm to SharePoint 2013, or will think about a POC using SharePoint 2016 Technical Preview!
    • I am waiting! Not that easy, but possible!
  9. Meanwhile we need to upgrade the operating system of the MOSS 2007 box – since Windows Server 2003 is out of support we need to upgrade it to Windows 2008 R2 Standard, and it strictly needs to be 32-bit.
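
For task 1, here is a rough sketch of the kind of check we have in mind – the column names are made up, and on a genuinely large list you would page through the items rather than enumerate them all:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction 0
$web = Get-SPWeb -Identity 'http://www.contoso.com/web'
$list = $web.Lists["Large List"]
$newColumns = 'Department', 'Category'   #placeholders for the newly added mandatory fields

foreach ($item in $list.Items)
{
    foreach ($column in $newColumns)
    {
        if ([string]::IsNullOrEmpty($item[$column]))
        {
            "Item {0} is missing a value for '{1}'" -f $item.ID, $column
        }
    }
}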

More real-time scripts and more SharePoint developer and IT professional discussions about PowerShell will continue in the next blog 🙂

 

Retrieve Ticket Information – Using Service Desk Plus REST API and PowerShell – Part 1 – Get_Request

Recently I was reading the Service Desk Plus REST API article. Well, it's not really great for automation at this point in time, but it partially served my needs. I love automation, so I thought of spinning up PowerShell to do a few tasks which I perform on a daily basis with minimal tool permissions. Indeed, Service Desk Plus has a reporting feature and it can be automated (personally I'm not interested in deep-diving into it). Reporting automation in the tool depends on the server hardware/software capacity and the tool configuration; when mid-level managers and multiple technicians all configure reporting for their own needs, we may run into performance degradation. Many will run SQL queries to pull reports – but remember, after pulling the reports we still have to work in Excel to make them look good.

What can we do with the Service Desk Plus REST API?

  1. Request Operations [We will cover only this part in this blog!]
  2. Notes Related Operations
  3. Work log Related Operations
  4. Change Related Operations
  5. CMDB Related Operations
  6. Technician Related Operations

Let's start automating reporting. To begin with, ensure you have the prerequisites below:

  1. Know your server host name or tool URL and the port information – e.g. http://ServerName:<port>
  2. Keep the API documentation handy – we need to refer to it while coding.
  3. Contact your tool admin and get the API key – remember, if you are a tool admin you can't generate one for yourself with the same ID.

You can fetch the request numbers by querying your SQL DB using PowerShell – since there is enough code on the internet, we will not cover retrieving information from the SQL DB in detail, but a quick sketch follows.
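
The sketch below is minimal and makes assumptions – the server, database, table, and column names are placeholders, so verify them against your own Service Desk Plus database before using it:

$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=Localhost; Initial Catalog=servicedesk; Integrated Security=SSPI")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT WORKORDERID FROM WorkOrder"   #assumed table and column names
$reader = $cmd.ExecuteReader()
$ticketIds = while ($reader.Read()) { $reader["WORKORDERID"] }
$reader.Close()
$conn.Close()
$ticketIds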

My first attempt failed because I didn’t read the API documentation 🙂

Ay! It's doing something! Grab the API documentation and follow the instructions: we need to post the operation name, and we already have our API key from the tool admin. Now use PowerShell code like shown below.

$Params = @{OPERATION_NAME='GET_REQUEST';TECHNICIAN_KEY='ASK YOUR TOOL ADMIN';}
$ticketid = 'Get it manually for now'
$data = Invoke-RestMethod -Uri "http://localhost:80/sdpapi/request/$ticketid/" -Method Post -Body $Params

Okay! This gave me what I need 🙂 🙂 🙂 XML 🙂 🙂 🙂


To retrieve the status information, we need to look at the response code, like shown below.

$data.API.response.operation.result


Okay! Let's see the ticket information in PowerShell ISE 🙂 But how do we know that the information and status live in the result and details nodes? Simple – see the API documentation for the View Request operation.

Cool! We are good to fire the below PowerShell code.

$data.API.response.operation.Details.parameter

TaDa! The request details come back as a list of name/value parameters.


The Service Desk Plus documentation shows the time in a human-readable format, but this output is not – why? Well, I am not from the ManageEngine team so I have no answer, but as a system admin I can read it as epoch (Unix) time in milliseconds. So, a little more PowerShell code to convert this time format:

"Date and Time Format is {0}" -f (([datetime]'1/1/1970').AddSeconds([int](1439388404268 / 1000)))


So, I ended up converting it to Short Date String!

(([datetime]'1/1/1970').AddSeconds([int](1439388404268 / 1000)).ToShortDateString())
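
Since we will convert these timestamps again and again, I wrapped the conversion in a tiny helper function – my own addition, not part of the API:

function ConvertFrom-SdpEpochTime
{
    param(
        [Parameter(Mandatory = $true)]
        [long]$Milliseconds
    )

    #The API returns milliseconds since 1/1/1970 (UTC); convert to local time for display.
    ([datetime]'1/1/1970').AddMilliseconds($Milliseconds).ToLocalTime()
}

ConvertFrom-SdpEpochTime -Milliseconds 1439388404268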

That's it for now; in the next post we will see how to export multiple requests to XML for our automation demo! Enjoy PowerShell 🙂 🙂 🙂 🙂

Bangalore PowerShell User Group DSC Work Shop – July 18, 2015



  • Are you in INDIA?
  • Are you an IT Pro?
  • Are you a Dev?
  • Looking for Automation?
  • IT Manager looking for a bridge between IT Pro’s and Dev’s?
  • Want to learn or know about DEVOPS?

Just 1 day to go!

Bangalore PowerShell User Group has announced a workshop event to showcase DSC basics and advanced topics. Let's go declarative! @PSBUG has opened the gate – book your spot for the workshop and make use of it. Come and explore! Learn and share!

 

Query AD and Update SQL Express Table Using PowerShell

Requirement

Query AD [Active Directory] for computers and insert them into SQL Express.

Considerations

The requirement is just a one-liner, but we need to consider a lot before we deliver this to production.

  • Do we need all computers from AD, or just the enabled ones? Yeah! We need only enabled computers. If an admin disables/enables a computer or removes/adds a computer in AD, the SQL Express table should update accordingly.
  • Do we need only client operating systems? Yeah! This is for client OS reporting.

Summary

To summarize, the customer needs a solution which should do the below tasks.

  • Insert if a new computer is found in AD.
  • Drop the row if the computer is removed from AD.
  • Update the properties if any change is identified.

Solution

  • Query AD with a filter [Enabled -eq $true -and OperatingSystem -eq 'Windows 7 Enterprise'].
  • Update the row if the computer already exists.
  • Drop the row if a SQL Express row doesn't match AD [removed from AD but still exists in the SQL table].
  • Insert into the table if a new computer is found.

PowerShell Code

#Establish connection to SQL Express
$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=Localhost\SQLExpress; Initial Catalog=Employee; Integrated Security=SSPI")
$conn.Open()

#Query AD and update the SQL table
Get-ADComputer -Filter {(Enabled -eq $true) -and (OperatingSystem -eq 'Windows 7 Enterprise')} -Properties * | ForEach-Object {

    $Name            = $_.Name
    $OS              = $_.OperatingSystem
    $LastLogonDate   = $_.LastLogonDate
    $PasswordLastSet = $_.PasswordLastSet
    $Description     = $_.Description
    $DN              = $_.DistinguishedName
    $Enabled         = $_.Enabled

    $cmd = $conn.CreateCommand()
    #Delete-and-reinsert upsert: refresh the row if the computer already exists, insert it otherwise.
    #Dropping rows for computers that no longer exist in AD is handled as a separate cleanup step.
    $query = @"
IF EXISTS (SELECT * FROM ComputerInformation WHERE ComputerName = '$Name')
    DELETE FROM ComputerInformation WHERE ComputerName = '$Name'
INSERT INTO ComputerInformation VALUES ('$Name' , '$OS' , '$LastLogonDate' , '$PasswordLastSet' , '$Description' , '$DN' , '$Enabled')
"@
    $cmd.CommandText = $query
    $cmd.ExecuteNonQuery()
}

#Close the connection
$conn.Close()
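
To sanity-check the sync, here is a small read-back sketch against the same ComputerInformation table (same connection string as above):

$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=Localhost\SQLExpress; Initial Catalog=Employee; Integrated Security=SSPI")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT * FROM ComputerInformation"
$adapter = New-Object System.Data.SqlClient.SqlDataAdapter($cmd)
$table = New-Object System.Data.DataTable
[void]$adapter.Fill($table)
$conn.Close()
$table | Format-Table -AutoSize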

 

Exploring LYNC Contacts Using PowerShell

Exploring LYNC Contact folders using PowerShell.

Get-MailboxFolderStatistics displays information about all the folders in a user's mailbox, so using it we can explore the Lync contacts folder, which has the folder type 'QuickContacts'.

Get-MailboxFolderStatistics -Identity <MailboxID> | 
? {$_.FolderType -eq 'QuickContacts'}

Lync Contact Folders with 0 Entries

If we delete Lync contacts programmatically, the folder may appear as a GUID as well, like shown below.

Get-MailboxFolderStatistics -Identity <MailboxID> | 
Select Name , ItemsinFolder | ft -AutoSize

Underneath Contacts we can see two GUIDs – those are the Lync and GAL contact folders.

Refer to this article to remove Lync contact entries using PowerShell and EWS.

How to Explore all folders?
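
A simple way, using the same cmdlet: list every folder with its type and item count so the Lync/QuickContacts folders stand out (replace <MailboxID> as before):

Get-MailboxFolderStatistics -Identity <MailboxID> |
Select-Object Name, FolderType, ItemsInFolder |
Format-Table -AutoSize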

PowerShell Tip: String Manipulation using Regular Expression

Summary

  • Regular expressions are very powerful and the best way to play with strings.
  • Using regex in PowerShell is a great way to query log files, do pattern matching, and so on; however, we should use it only where applicable.

Example Code

#Given Name : Chendrayan Venkatesan
#Required Output: Venkatesan , Chendrayan
"Chendrayan Venkatesan" -replace "([a-z]+)\s([a-z]+)" ,'$2, $1' 

#Given Name : Chendrayan12345 Venkatesan
#Required Output: Venkatesan, Chendrayan
'Chendrayan12345 Venkatesan' -replace "\d+" -replace "([a-z]+)\s([a-z]+)" ,'$2, $1'
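
A small variation of the same replacement (my own addition) using named capture groups, which can be easier to read than positional $1/$2 references:

#Named capture groups: 'First' and 'Last' are just labels for the two word groups.
"Chendrayan Venkatesan" -replace "(?<First>[a-z]+)\s(?<Last>[a-z]+)", '${Last}, ${First}'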