Wednesday, April 30, 2025

Convert IST to CST/CDT in PowerShell – Handling Daylight Saving Time (DST)

Working across time zones can be tricky, especially when Daylight Saving Time (DST) is involved. If you’re dealing with automation, scheduling, or data transformation tasks, converting between Indian Standard Time (IST) and Central Time (CST/CDT) becomes essential.

In this post, I’ll walk you through a PowerShell function that smartly handles the conversion of an IST datetime to either CST (Central Standard Time) or CDT (Central Daylight Time), depending on the date and DST rules.

The PowerShell Function

Below is a reusable PowerShell function that converts IST to CST/CDT accurately by accounting for Daylight Saving Time.

function Convert-ISTToCSTorDST {
    param (
        [datetime]$ISTDate    # Input IST DateTime to be converted
    )

    # Central Time zone information
    $centralTimeZone = [System.TimeZoneInfo]::FindSystemTimeZoneById("Central Standard Time")

    # Convert IST to UTC first (IST is UTC +5:30)
    $utcDate = $ISTDate.AddHours(-5).AddMinutes(-30)

    # Check DST against the Central Time equivalent of that instant,
    # not against the IST wall-clock value
    $centralEquivalent = [System.TimeZoneInfo]::ConvertTimeFromUtc($utcDate, $centralTimeZone)
    $isDST = $centralTimeZone.IsDaylightSavingTime($centralEquivalent)

    # Convert UTC to CST or CDT
    if ($isDST) {
        # Daylight Saving Time (CDT) is UTC -5
        $convertedDate = $utcDate.AddHours(-5)
        Write-Host "Converted Date (CDT): $convertedDate"
    } else {
        # Standard Time (CST) is UTC -6
        $convertedDate = $utcDate.AddHours(-6)
        Write-Host "Converted Date (CST): $convertedDate"
    }

    return $convertedDate
}

Example Usage

$ISTDate = Get-Date "2025-03-15 10:00:00" # Example IST datetime
$convertedDate = Convert-ISTToCSTorDST -ISTDate $ISTDate
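# 15 Mar 2025 falls after the US spring-forward (9 Mar 2025), so this example reports CDT:
# roughly "Converted Date (CDT): 3/14/2025 11:30:00 PM" (exact formatting depends on your culture settings)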

How It Works
1. Finds the Central Time zone via its system identifier ("Central Standard Time").

2. Converts IST to UTC by subtracting 5 hours and 30 minutes.

3. Determines whether that instant falls in Daylight Saving Time for Central Time by calling .IsDaylightSavingTime() on its Central Time equivalent.

4. Adjusts from UTC to CDT (subtract 5 hours) or CST (subtract 6 hours) accordingly; a quick check of both branches is shown below.
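
To sanity-check both branches, call the function once with a date inside US Daylight Saving Time and once with a winter date. The expected labels below assume the standard US DST calendar (second Sunday in March through the first Sunday in November); the printed formatting follows your culture settings.

# Falls inside US Daylight Saving Time -> reported as CDT (UTC -5)
Convert-ISTToCSTorDST -ISTDate (Get-Date "2025-07-15 10:00:00")
# Converted Date (CDT): 7/14/2025 11:30:00 PM

# Falls outside Daylight Saving Time -> reported as CST (UTC -6)
Convert-ISTToCSTorDST -ISTDate (Get-Date "2025-01-15 10:00:00")
# Converted Date (CST): 1/14/2025 10:30:00 PM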

Bulk Import Excel Data to SharePoint List Using PowerShell and PnP

 

Managing large datasets in SharePoint can be tricky, especially when you're dealing with Excel files and need to avoid list view threshold issues. In this guide, I’ll walk you through a PowerShell script that efficiently imports data from Excel into a SharePoint Online list using PnP PowerShell — with batching support for performance.

Prerequisites

Make sure you have the following before running the script:

  • SharePoint Online site URL

  • Excel file with data properly formatted

  • PnP PowerShell module installed (Install-Module PnP.PowerShell)

  • ImportExcel module installed (Install-Module ImportExcel), since the script reads the workbook with Import-Excel

  • Appropriate SharePoint permissions

What the Script Does

  • Connects to your SharePoint site

  • Loads and reads an Excel file

  • Converts Excel date values

  • Batches records in groups (to avoid the 5000 item threshold)

  • Adds the items to your SharePoint list or library

  • Logs execution time

PowerShell Script

$siteUrl = "[Site Collection URL]"
Connect-PnPOnline -Url $siteUrl -UseWebLogin

# Capture the start time
$startTime = Get-Date

# Define lists and field mappings
$LibraryName = "[LibraryName]"
$excelFilePath = "[excelFilePathWithExtension]"

# Import data from the Excel file using Import-Excel
$excelData = Import-Excel -Path $excelFilePath

# Define the batch size (to avoid SharePoint's list view threshold of 5000 items)
$batchSize = 200

# Initialize a list to hold batches of items
$batch = @()

# Loop through each row in the Excel data
foreach ($row in $excelData) {
    # Convert the Excel serial date in OBSERVATION_DATE to a DateTime
    # (Excel counts days from 1900-01-01 and treats 1900 as a leap year, hence the -2 offset)
    $baseDate = [datetime]"1900-01-01"
    $convertedDate = $baseDate.AddDays($row.OBSERVATION_DATE - 2)

    # Map Excel data to SharePoint fields
    $Values = @{
        "Title"                     = $row.Title
        "SHIFT"                     = $row.SHIFT
        "OBSERVATION_DATE"         = $convertedDate
        "BEHAVIOR"                 = $row.BEHAVIOR
        "ROLE"                      = $row.ROLE
        "Behaviour_unsafeCondition" = if ($row.AT_RISK_BY_UNSAFE_COND -eq "N") { "No" } 
                                      elseif ($row.AT_RISK_BY_UNSAFE_COND -eq "Y") { "Yes" } 
                                      else { "" }
        "LOCATION_ID"               = $row.LOCATION_ID
        "AREA_ID"                   = $row.AREA_ID
        "SUB_AREA_ID"               = $row.SUB_AREA_ID
        "LINE_ID"                   = if ($null -eq $row.LINE_ID -or $row.LINE_ID -eq "") { $null } else { $row.LINE_ID }
        "CATEGORY_ID"               = $row.CATEGORY_ID
        "SUB_CATEGORY_ID"           = $row.SUB_CATEGORY_ID
    }

    $batch += $Values

    # Process the batch if it reaches the threshold
    if ($batch.Count -ge $batchSize) {
        foreach ($item in $batch) {
            Add-PnPListItem -List $LibraryName -Values $item -Verbose
        }
        Write-Host "Added $($batch.Count) items to $LibraryName"
        $batch = @()   # Reset the batch (Clear() on an array nulls the elements but keeps the count)
    }
}

# Add remaining items if any
if ($batch.Count -gt 0) {
    foreach ($item in $batch) {
        Add-PnPListItem -List $LibraryName -Values $item -Verbose
    }
    Write-Host "Added the last batch of $($batch.Count) items to $LibraryName"
}

# Calculate execution time
$endTime = Get-Date
$executionTime = $endTime - $startTime
Write-Host "Script executed in:" $executionTime

Key Benefits
  • Efficient batching: Processes the rows in chunks of 200 instead of one monolithic run, keeping large imports manageable (a variant that uses PnP's native request batching is sketched below).

  • Dynamic date conversion: Transforms Excel numeric dates into valid DateTime objects.

  • Safe and robust: Skips null values and handles conditional logic cleanly.
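
If you are on a recent PnP.PowerShell version, a further optimization is to let the module batch the requests instead of issuing one Add-PnPListItem call per row. The sketch below is a variant of the loop above, not the original script; it assumes the same $excelData, $LibraryName, and $Values mapping, and uses New-PnPBatch / Invoke-PnPBatch to send the queued adds in far fewer round trips.

# Variant sketch: queue the adds in a PnP batch instead of calling Add-PnPListItem per row
$pnpBatch = New-PnPBatch
$queued   = 0

foreach ($row in $excelData) {
    $Values = @{ "Title" = $row.Title }   # build the full field mapping as shown in the script above

    Add-PnPListItem -List $LibraryName -Values $Values -Batch $pnpBatch
    $queued++

    # Flush the queued adds every 200 items to keep each request payload reasonable
    if ($queued -ge 200) {
        Invoke-PnPBatch -Batch $pnpBatch
        $pnpBatch = New-PnPBatch
        $queued   = 0
    }
}

# Flush any remaining queued adds
if ($queued -gt 0) {
    Invoke-PnPBatch -Batch $pnpBatch
}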


Tuesday, April 29, 2025

How to Split a Large Excel File into Smaller Chunks Using PowerShell

Working with massive Excel files can be cumbersome—slow to open, hard to process, and error-prone in automation. If you’re dealing with a large dataset and need to split it into smaller, manageable files, PowerShell offers a powerful and efficient way to do it—especially with the help of the ImportExcel module.

In this guide, I’ll walk you through a simple script that takes a large Excel file and splits it into multiple smaller Excel files, each containing a defined number of records.

Requirements

  • PowerShell

  • ImportExcel module
    You can install it via PowerShell with:

          Install-Module -Name ImportExcel

PowerShell Script

# Import the ImportExcel module
Import-Module ImportExcel

# Path to the large Excel file
$excelFilePath = "[LocalFilePathWithFileExtension]"

# Define the chunk size (e.g., 10,000 records per chunk)
$chunkSize = 10000

# Read the Excel file
$excelData = Import-Excel -Path $excelFilePath

# Calculate how many chunks are needed
$totalRows = $excelData.Count
$chunkCount = [math]::Ceiling($totalRows / $chunkSize)

# Loop to split and save each chunk
for ($i = 0; $i -lt $chunkCount; $i++) {
    $startIndex = $i * $chunkSize
    $endIndex = [math]::Min(($startIndex + $chunkSize), $totalRows)

    # Extract the subset of data
    $chunkData = $excelData[$startIndex..($endIndex - 1)]

    # Define output file path
    $chunkFilePath = "[LocalFilePathwithFileLocation]chunk_$($i + 1).xlsx"

    # Export to a new Excel file
    $chunkData | Export-Excel -Path $chunkFilePath

    Write-Host "Saved chunk $($i + 1) to $chunkFilePath"
}

Write-Host "Splitting completed!"


What This Script Does

  1. Imports the Excel file using the Import-Excel cmdlet.

  2. Divides the data into smaller chunks (10,000 rows by default).

  3. Exports each chunk into a separate .xlsx file.

  4. Prints progress messages to the console for visibility. (An optional row-count sanity check is sketched below.)
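
As an optional sanity check (not part of the original script), you can re-import the generated chunk files and confirm that the combined row count matches the source. This assumes the chunks were written to the same output folder placeholder used above.

# Optional: verify that no rows were lost during the split
$chunkFiles = Get-ChildItem -Path "[LocalFilePathwithFileLocation]" -Filter "chunk_*.xlsx"
$exportedRows = ($chunkFiles | ForEach-Object { (Import-Excel -Path $_.FullName).Count } | Measure-Object -Sum).Sum
Write-Host "Source rows: $totalRows  Exported rows: $exportedRows"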


Why Use This Approach?

  • Avoids Excel file size limitations.

  • Ideal for batch processing or feeding smaller files into automated workflows.

  • Fast and memory-efficient compared to loading Excel in the GUI.


Customization Tips

  • Change $chunkSize to adjust how many records go into each file.

  • Modify $chunkFilePath to suit your file naming or storage structure.

  • Integrate this into a scheduled task or data pipeline for regular use; a minimal Task Scheduler example is sketched below.
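
For the last tip, here is a minimal sketch using the built-in ScheduledTasks cmdlets on Windows. The script path and task name are placeholders, so adjust them to your environment; registering a task may require an elevated session.

# Hypothetical example: run the saved split script every night at 2 AM
# (assumes the script above is saved as C:\Scripts\Split-LargeExcel.ps1 - adjust to your path)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Split-LargeExcel.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "Split-LargeExcel" -Action $action -Trigger $trigger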