Collect Office 365 Message Trace logs

This document explains how to ingest Office 365 Message Trace logs into Google Security Operations using multiple ingestion methods. An ingestion label identifies the parser that normalizes raw log data to structured UDM format. The information in this document applies to the parser with the OFFICE_365_MESSAGETRACE ingestion label.

Office 365 Message Trace is a Microsoft Exchange Online feature that tracks email messages as they travel through the Exchange Online mail flow pipeline. Message trace logs provide detailed information about each email message, including sender and recipient addresses, subject, delivery status, source and destination IP addresses, and message size. This data is essential for troubleshooting mail delivery issues, investigating suspicious email activity, and monitoring email traffic patterns.
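
As a hedged illustration of the record shape, the sketch below builds one trace record as compact JSON, the newline-delimited format the export scripts in this document send. The field names mirror a representative subset of the message trace output properties; the values are invented:

```python
import json

# Hypothetical message trace record; field names follow a representative
# subset of the Get-MessageTraceV2 output properties, values are invented.
record = {
    "MessageId": "<20240501123456.abcdef@contoso.com>",
    "Received": "2024-05-01T12:34:56Z",
    "SenderAddress": "alice@contoso.com",
    "RecipientAddress": "bob@example.com",
    "Subject": "Quarterly report",
    "Status": "Delivered",
    "FromIP": "203.0.113.10",
    "ToIP": "198.51.100.20",
    "Size": 48213,
}

# One compact JSON object per line (NDJSON) is the shape sent for ingestion.
line = json.dumps(record, separators=(",", ":"))
print(line)
```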

Message trace data is retrieved programmatically using the Get-MessageTraceV2 cmdlet in Exchange Online PowerShell. The Exchange Online PowerShell module (ExchangeOnlineManagement) is officially supported on Windows, macOS, and Linux with PowerShell 7.4.0 or later (and Windows PowerShell 5.1 on Windows).

Before you begin

Ensure that you have the following prerequisites:

  • A Google SecOps instance
  • A Microsoft 365 tenant with Exchange Online
  • A Global Administrator, Exchange Administrator, or Security Reader role in Microsoft 365
  • Access to Google Cloud Console (for API key creation, GCS, and webhook)
  • Permissions to create and manage feeds in Google SecOps
  • A dedicated Windows, macOS, or Linux server or workstation to run the export script:
    • Windows: Windows 10/11 or Windows Server 2016 or later with Windows PowerShell 5.1 (built-in) or PowerShell 7.4.0 or later
    • macOS: macOS 13 Ventura or later with PowerShell 7.4.0 or later
    • Linux: Debian 11+, Ubuntu 20.04+, RHEL 8+, or Fedora 36+ with PowerShell 7.4.0 or later
  • Network connectivity from the export machine to both Microsoft 365 and the Google SecOps ingestion endpoint

Set up the export environment

Office 365 Message Trace does not have a native push capability. You must run an export script on a dedicated Windows, macOS, or Linux server or workstation that retrieves message trace data from Exchange Online using PowerShell and sends it to Google SecOps.

Message trace data is available through Exchange Online PowerShell with the following retention limits:

  • Get-MessageTraceV2: Returns data from the last 10 days. Results are available immediately.
  • Start-HistoricalSearch: Returns data from 10 to 90 days ago. Results are prepared as downloadable CSV reports and can take several hours to complete.
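
For incremental collection, a scheduler typically slices the lookback period into fixed windows and clamps the start to the 10-day limit. A minimal sketch of that windowing logic (the one-hour window size and the helper name are illustrative, not part of any Microsoft or Google API):

```python
from datetime import datetime, timedelta

MAX_LOOKBACK = timedelta(days=10)  # Get-MessageTraceV2 retention limit

def hourly_windows(start: datetime, end: datetime):
    """Yield (window_start, window_end) pairs covering [start, end] in
    one-hour slices, clamped to the 10-day lookback from `end`."""
    cur = max(start, end - MAX_LOOKBACK)
    while cur < end:
        nxt = min(cur + timedelta(hours=1), end)
        yield cur, nxt
        cur = nxt
```

Windows older than 10 days fall outside this cmdlet's range and would need Start-HistoricalSearch instead.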

The following steps install PowerShell and the Exchange Online PowerShell module on your machine.

Install PowerShell

Windows

Windows PowerShell 5.1 is included with Windows 10, Windows 11, and Windows Server 2016 or later. No additional installation is required.

  • To verify your PowerShell version, open PowerShell and run:

    $PSVersionTable.PSVersion
    
  • Optionally, to install PowerShell 7 for improved performance, open Command Prompt or PowerShell as an administrator and run:

    winget install --id Microsoft.PowerShell --source winget
    

macOS

  1. Install Homebrew if not already installed.
  2. Open Terminal and run the following command:

    brew install powershell/tap/powershell
    
  3. Verify the installation by running:

    pwsh --version
    

The output should display PowerShell 7.x.x or later.

Linux (Debian/Ubuntu)

  1. Open a terminal with root or sudo privileges.
  2. Run the following commands:

    sudo apt-get update
    sudo apt-get install -y wget apt-transport-https software-properties-common
    source /etc/os-release
    wget -q "https://packages.microsoft.com/config/$ID/$VERSION_ID/packages-microsoft-prod.deb"
    sudo dpkg -i packages-microsoft-prod.deb
    rm packages-microsoft-prod.deb
    sudo apt-get update
    sudo apt-get install -y powershell
    
  3. Verify the installation by running:

    pwsh --version
    

Linux (RHEL/CentOS/Fedora)

  1. Open a terminal with root or sudo privileges.
  2. Register the Microsoft repository:

    curl "https://packages.microsoft.com/config/rhel/$(rpm -E %rhel)/prod.repo" | sudo tee /etc/yum.repos.d/microsoft.repo
    
  3. Install PowerShell:

    sudo dnf install -y powershell
    
  4. Verify the installation by running:

    pwsh --version
    

Install Exchange Online PowerShell module

After installing PowerShell, install the Exchange Online management module. This step is the same on all platforms.

  1. Launch PowerShell:

    • Windows: Open PowerShell or PowerShell 7 as an administrator
    • macOS and Linux: Open a terminal and run pwsh
  2. Run the following command:

    Install-Module -Name ExchangeOnlineManagement -Scope CurrentUser -Force
    
  3. If prompted to install from an untrusted repository (PSGallery), enter Y to confirm.

  4. Verify the installation by running:

    Import-Module ExchangeOnlineManagement
    Get-Module ExchangeOnlineManagement | Select-Object Name, Version
    

The output should display the module name and version number.

Set up certificate-based authentication (required for scheduled scripts)

The Connect-ExchangeOnline -UserPrincipalName method requires interactive sign-in and cannot be used in scheduled scripts. For unattended execution, set up certificate-based authentication with an Azure AD (Microsoft Entra ID) application.

Create a self-signed certificate

  • Windows (run in PowerShell as administrator):

    mkdir C:\Certs
    $cert = New-SelfSignedCertificate -Subject "CN=MessageTraceExport" -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -KeySpec KeyExchange -KeyLength 2048 -NotAfter (Get-Date).AddYears(2)
    $cert.Thumbprint
    Export-Certificate -Cert $cert -FilePath "C:\Certs\MessageTraceExport.cer"
    

Save the thumbprint value for use in later steps.

  • macOS and Linux (run in a terminal):

    mkdir -p /opt/certs
    openssl req -x509 -newkey rsa:2048 -keyout /opt/certs/messagetrace-key.pem -out /opt/certs/messagetrace-cert.pem -days 730 -nodes -subj "/CN=MessageTraceExport"
    openssl pkcs12 -export -out /opt/certs/messagetrace.pfx -inkey /opt/certs/messagetrace-key.pem -in /opt/certs/messagetrace-cert.pem -passout pass:
    
  • Get the certificate thumbprint:

    openssl x509 -in /opt/certs/messagetrace-cert.pem -noout -fingerprint -sha1 | sed 's/://g' | cut -d= -f2
    

Save the thumbprint value for use in later steps.
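
The thumbprint is simply the SHA-1 hash of the certificate's DER bytes. As a cross-check of the openssl output above, the following sketch computes it from a PEM file's contents (the helper name is illustrative):

```python
import base64
import hashlib

def pem_thumbprint(pem: str) -> str:
    """Return the SHA-1 thumbprint (uppercase hex, no colons) of the
    DER payload inside a PEM certificate block."""
    body = "".join(
        line for line in pem.strip().splitlines() if "-----" not in line
    )
    der = base64.b64decode(body)
    return hashlib.sha1(der).hexdigest().upper()
```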

Register the application in Microsoft Entra ID

  1. Sign in to the Microsoft Entra admin center.
  2. Go to Identity > Applications > App registrations.
  3. Click New registration.
  4. Provide the following configuration details:
    • Name: Enter MessageTraceExport
    • Supported account types: Select Accounts in this organizational directory only
  5. Click Register.
  6. On the application overview page, copy the Application (client) ID and save it.
  7. Copy the Directory (tenant) ID from the overview page and save it.

Upload the certificate to the application

  1. On the application page, go to Certificates & secrets > Certificates.
  2. Click Upload certificate.
  3. Upload the certificate file:
    • Windows: Upload C:\Certs\MessageTraceExport.cer
    • macOS and Linux: Upload /opt/certs/messagetrace-cert.pem
  4. Click Add.

Assign API permissions

  1. On the application page, go to API permissions.
  2. Click Add a permission.
  3. Select APIs my organization uses.
  4. Search for and select Office 365 Exchange Online.
  5. Select Application permissions.
  6. Expand Exchange and select Exchange.ManageAsApp.
  7. Click Add permissions.
  8. Click Grant admin consent for <your organization> and confirm.

Assign Exchange Administrator role

  1. In the Microsoft Entra admin center, go to Identity > Roles & admins > All roles.
  2. Search for and select Exchange Administrator.
  3. Click Add assignments.
  4. Click No member selected and search for the MessageTraceExport application.
  5. Select the application and click Next.
  6. Select Active as the assignment type.
  7. Click Assign.

Verify unattended connection

Launch PowerShell and run:

  • Windows:

    Connect-ExchangeOnline -CertificateThumbprint "<CERTIFICATE_THUMBPRINT>" -AppId "<APPLICATION_ID>" -Organization "yourdomain.onmicrosoft.com"
    Get-MessageTraceV2 -StartDate (Get-Date).AddHours(-1) -EndDate (Get-Date) -ResultSize 1
    Disconnect-ExchangeOnline -Confirm:$false
    
  • macOS and Linux (run pwsh first):

    Connect-ExchangeOnline -CertificateFilePath "/opt/certs/messagetrace.pfx" -AppId "<APPLICATION_ID>" -Organization "yourdomain.onmicrosoft.com"
    Get-MessageTraceV2 -StartDate (Get-Date).AddHours(-1) -EndDate (Get-Date) -ResultSize 1
    Disconnect-ExchangeOnline -Confirm:$false
    

Replace <CERTIFICATE_THUMBPRINT>, <APPLICATION_ID>, and yourdomain.onmicrosoft.com with your actual values. If the command returns message trace data (or an empty result set with no errors), the authentication is working correctly.

Choose your ingestion method

Google SecOps supports multiple ingestion methods for Office 365 Message Trace logs. Select the method that best fits your environment:

Ingestion Method        | Use Case                                    | Latency          | Setup Complexity
Webhook                 | Real-time push from scripts or applications | Seconds          | Low
Google Cloud Storage V2 | Batch export to GCS bucket                  | Minutes to hours | Medium

Option 1: Webhook ingestion

Use this method when you have a script or application that can send HTTP POST requests containing message trace data to Google SecOps.

Create webhook feed in Google SecOps

Create the feed

  1. Go to SIEM Settings > Feeds.
  2. Click Add New Feed.
  3. On the next page, click Configure a single feed.
  4. In the Feed name field, enter a name for the feed (for example, Office 365 Message Trace - Webhook).
  5. Select Webhook as the Source type.
  6. Select Office 365 Message Trace as the Log type.
  7. Click Next.
  8. Specify values for the following input parameters:
    • Split delimiter (optional): Enter \n to split newline-delimited JSON events
    • Asset namespace: The asset namespace
    • Ingestion labels: The label to be applied to the events from this feed
  9. Click Next.
  10. Review your new feed configuration in the Finalize screen, and then click Submit.

Generate and save secret key

After creating the feed, you must generate a secret key for authentication:

  1. On the feed details page, click Generate Secret Key. A dialog displays the secret key.
  2. Copy and save the secret key securely.

Get the feed endpoint URL

  1. Go to the Details tab of the feed.
  2. In the Endpoint Information section, copy the Feed endpoint URL.
  3. The URL format is:

    https://malachiteingestion-pa.googleapis.com/v2/unstructuredlogentries:batchCreate
    

    or

    https://<REGION>-malachiteingestion-pa.googleapis.com/v2/unstructuredlogentries:batchCreate
    
  4. Save this URL for the next steps.

  5. Click Done.
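
The export scripts later in this document pass the API key as a `key` query parameter and the feed secret in an `x-chronicle-auth` header. A sketch of how that request is assembled (placeholder values; nothing is actually sent):

```python
from urllib.parse import urlencode

ENDPOINT = "https://malachiteingestion-pa.googleapis.com/v2/unstructuredlogentries:batchCreate"
API_KEY = "example-api-key"        # placeholder: your Google Cloud API key
SECRET_KEY = "example-secret-key"  # placeholder: the feed's secret key

# The API key travels in the query string; the secret key in a header.
url = f"{ENDPOINT}?{urlencode({'key': API_KEY})}"
headers = {
    "Content-Type": "application/json",
    "x-chronicle-auth": SECRET_KEY,
}
```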

Create Google Cloud API key

Chronicle requires an API key for authentication. Create a restricted API key in the Google Cloud Console.

Create the API key

  1. Go to the Google Cloud Console Credentials page.
  2. Select your project (the project associated with your Chronicle instance).
  3. Click Create credentials > API key.
  4. An API key is created and displayed in a dialog.
  5. Click Edit API key to restrict the key.

Restrict the API key

  1. In the API key settings page:
    • Name: Enter a descriptive name (for example, Chronicle Webhook API Key - O365 Message Trace)
  2. Under API restrictions:
    1. Select Restrict key.
    2. In the Select APIs drop-down, search for and select Google SecOps API (or Chronicle API).
  3. Click Save.
  4. Copy the API key value from the API key field at the top of the page.
  5. Save the API key securely.

Create the webhook export script

Create the following PowerShell script on the machine where you set up the export environment. This script retrieves message trace data from Exchange Online and sends it to the Chronicle webhook endpoint.

Windows

  1. Create the script directory:

    mkdir C:\Scripts
    
  2. Create a file named C:\Scripts\messagetrace-webhook.ps1 with the following content:

    # Configuration
    $endpointUrl = "https://malachiteingestion-pa.googleapis.com/v2/unstructuredlogentries:batchCreate"
    $apiKey = "<API_KEY>"
    $secretKey = "<SECRET_KEY>"
    $certThumbprint = "<CERTIFICATE_THUMBPRINT>"
    $appId = "<APPLICATION_ID>"
    $organization = "yourdomain.onmicrosoft.com"
    
    # Log file
    $logFile = "C:\Logs\messagetrace-webhook.log"
    if (!(Test-Path "C:\Logs")) { New-Item -ItemType Directory -Path "C:\Logs" }
    
    try {
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Starting message trace export"
    
        # Connect to Exchange Online using certificate-based authentication
        Connect-ExchangeOnline -CertificateThumbprint $certThumbprint -AppId $appId -Organization $organization -ShowBanner:$false
    
        # Retrieve message trace data for the last hour (up to 10 days available)
        $startDate = (Get-Date).AddHours(-1)
        $endDate = Get-Date
        $messages = Get-MessageTraceV2 -StartDate $startDate -EndDate $endDate -ResultSize 5000
    
        if ($messages.Count -eq 0) {
            Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - No messages found in the time range"
            Disconnect-ExchangeOnline -Confirm:$false
            exit 0
        }
    
        # Convert to NDJSON (one JSON object per line)
        $ndjson = ($messages | ForEach-Object { $_ | ConvertTo-Json -Compress }) -join "`n"
    
        # Send to Chronicle webhook
        $headers = @{
            "Content-Type" = "application/json"
            "x-chronicle-auth" = $secretKey
        }
    
        $response = Invoke-RestMethod -Uri "$endpointUrl`?key=$apiKey" -Method Post -Headers $headers -Body $ndjson
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Successfully sent $($messages.Count) messages"
    }
    catch {
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Error: $_"
    }
    finally {
        Disconnect-ExchangeOnline -Confirm:$false -ErrorAction SilentlyContinue
    }
    
  3. Replace the placeholder values:

    • <API_KEY>: The Google Cloud API key
    • <SECRET_KEY>: The Chronicle webhook secret key
    • <CERTIFICATE_THUMBPRINT>: The certificate thumbprint from the authentication setup
    • <APPLICATION_ID>: The Azure AD application (client) ID
    • yourdomain.onmicrosoft.com: Your Microsoft 365 tenant domain

macOS and Linux

  1. Create the script directory:

    sudo mkdir -p /opt/scripts
    sudo mkdir -p /var/log/messagetrace
    
  2. Create a file named /opt/scripts/messagetrace-webhook.ps1 with the following content:

    # Configuration
    $endpointUrl = "https://malachiteingestion-pa.googleapis.com/v2/unstructuredlogentries:batchCreate"
    $apiKey = "<API_KEY>"
    $secretKey = "<SECRET_KEY>"
    $certFilePath = "/opt/certs/messagetrace.pfx"
    $appId = "<APPLICATION_ID>"
    $organization = "yourdomain.onmicrosoft.com"
    
    # Log file
    $logFile = "/var/log/messagetrace/messagetrace-webhook.log"
    
    try {
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Starting message trace export"
    
        # Connect to Exchange Online using certificate-based authentication
        Connect-ExchangeOnline -CertificateFilePath $certFilePath -AppId $appId -Organization $organization -ShowBanner:$false
    
        # Retrieve message trace data for the last hour (up to 10 days available)
        $startDate = (Get-Date).AddHours(-1)
        $endDate = Get-Date
        $messages = Get-MessageTraceV2 -StartDate $startDate -EndDate $endDate -ResultSize 5000
    
        if ($messages.Count -eq 0) {
            Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - No messages found in the time range"
            Disconnect-ExchangeOnline -Confirm:$false
            exit 0
        }
    
        # Convert to NDJSON (one JSON object per line)
        $ndjson = ($messages | ForEach-Object { $_ | ConvertTo-Json -Compress }) -join "`n"
    
        # Send to Chronicle webhook
        $headers = @{
            "Content-Type" = "application/json"
            "x-chronicle-auth" = $secretKey
        }
    
        $response = Invoke-RestMethod -Uri "$endpointUrl`?key=$apiKey" -Method Post -Headers $headers -Body $ndjson
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Successfully sent $($messages.Count) messages"
    }
    catch {
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Error: $_"
    }
    finally {
        Disconnect-ExchangeOnline -Confirm:$false -ErrorAction SilentlyContinue
    }
    
  3. Replace the placeholder values:

    • <API_KEY>: The Google Cloud API key
    • <SECRET_KEY>: The Chronicle webhook secret key
    • <APPLICATION_ID>: The Azure AD application (client) ID
    • yourdomain.onmicrosoft.com: Your Microsoft 365 tenant domain
  4. Set file permissions:

    sudo chmod 700 /opt/scripts/messagetrace-webhook.ps1
    

Schedule the export script

Configure the script to run automatically at regular intervals (for example, every hour).

Windows

  1. Open Task Scheduler: Press Win+R, type taskschd.msc, and press Enter.
  2. In the right pane, click Create Task.
  3. On the General tab:
    • Name: Enter MessageTrace Webhook Export
    • Select Run whether user is logged on or not
    • Select Run with highest privileges
  4. On the Triggers tab:
    1. Click New.
    2. Set Begin the task to On a schedule.
    3. Select Daily, set Start time to 12:00:00 AM (midnight, so it repeats on whole hours), and set Recur every 1 days.
    4. Select Repeat task every 1 hour for a duration of Indefinitely.
    5. Select Enabled.
    6. Click OK.
  5. On the Actions tab:

    1. Click New.
    2. Set Action to Start a program.
    3. In the Program/script field, enter:

      powershell.exe
      
    4. In the Add arguments field, enter:

      -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\messagetrace-webhook.ps1"
      
    5. Click OK.

  6. On the Settings tab:

    • Select Allow task to be run on demand
    • Clear the Stop the task if it runs longer than checkbox
  7. Click OK and enter the service account credentials when prompted.

macOS

  1. Create a launchd plist file at ~/Library/LaunchAgents/com.chronicle.messagetrace-webhook.plist. If Homebrew installed pwsh under /opt/homebrew/bin (Apple Silicon), adjust the pwsh path in ProgramArguments accordingly:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>Label</key>
        <string>com.chronicle.messagetrace-webhook</string>
        <key>ProgramArguments</key>
        <array>
            <string>/usr/local/bin/pwsh</string>
            <string>-NoProfile</string>
            <string>-File</string>
            <string>/opt/scripts/messagetrace-webhook.ps1</string>
        </array>
        <key>StartInterval</key>
        <integer>3600</integer>
        <key>StandardOutPath</key>
        <string>/var/log/messagetrace/launchd-stdout.log</string>
        <key>StandardErrorPath</key>
        <string>/var/log/messagetrace/launchd-stderr.log</string>
        <key>RunAtLoad</key>
        <true/>
    </dict>
    </plist>
    
  2. Load the job:

    launchctl load ~/Library/LaunchAgents/com.chronicle.messagetrace-webhook.plist
    
  3. Verify the job is loaded:

    launchctl list | grep messagetrace
    

Linux

  1. Open the crontab editor:

    crontab -e
    
  2. Add the following line to run the script every hour:

    0 * * * * /usr/bin/pwsh -NoProfile -File /opt/scripts/messagetrace-webhook.ps1 >> /var/log/messagetrace/cron.log 2>&1
    
  3. Save and exit the editor.

  4. Verify the cron job is registered:

    crontab -l
    

Verify log ingestion

  1. Run the script manually to test:

    Windows:

    powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\messagetrace-webhook.ps1"
    

    macOS and Linux:

    pwsh -NoProfile -File /opt/scripts/messagetrace-webhook.ps1
    
  2. Check the log file for errors:

    Windows:

    Get-Content "C:\Logs\messagetrace-webhook.log" -Tail 10
    

    macOS and Linux:

    tail -10 /var/log/messagetrace/messagetrace-webhook.log
    
  3. In Google SecOps, go to Search and verify that logs with metadata.log_type = "OFFICE_365_MESSAGETRACE" appear.

Option 2: Google Cloud Storage V2 ingestion

Use this method when your export script writes message trace logs to a Google Cloud Storage bucket.

Create GCS bucket

  1. Go to the Google Cloud Console.
  2. Select your project or create a new one.
  3. In the navigation menu, go to Cloud Storage > Buckets.
  4. Click Create bucket.
  5. Provide the following configuration details:

    • Name your bucket: Enter a globally unique name (for example, office365-messagetrace-logs)
    • Location type: Choose based on your needs (Region, Dual-region, Multi-region)
    • Location: Select the location (for example, us-central1)
    • Storage class: Standard (recommended for frequently accessed logs)
    • Access control: Uniform (recommended)
    • Protection tools: Optional: Enable object versioning or a retention policy
  6. Click Create.
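
The export scripts in the following sections write timestamped NDJSON files under a prefix in this bucket. A sketch of the object naming they use (the bucket name is the example from the steps above):

```python
from datetime import datetime

# Example bucket and prefix from the steps above.
BUCKET_PREFIX = "gs://office365-messagetrace-logs/messagetrace/"

def object_path(now: datetime) -> str:
    """Destination object for one export run, matching the scripts'
    messagetrace_<yyyyMMddHHmmss>.json pattern."""
    return f"{BUCKET_PREFIX}messagetrace_{now:%Y%m%d%H%M%S}.json"
```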

Install Google Cloud CLI

Install the Google Cloud CLI on the same machine where you set up the export environment. The gsutil command (included with the Google Cloud CLI) is used to upload files to the GCS bucket.

Windows

  1. Download the Google Cloud CLI installer (GoogleCloudSDKInstaller.exe).
  2. Run GoogleCloudSDKInstaller.exe. On the Installation Options screen, keep the defaults selected. Click Install. When the installation completes, keep Run gcloud init selected, and click Finish.
  3. Verify the installation by opening Command Prompt and running:

    gcloud --version
    gsutil --version
    

macOS

  1. Open Terminal and run the following command:

    brew install google-cloud-sdk
    
  2. Verify the installation:

    gcloud --version
    gsutil --version
    

Linux (Debian/Ubuntu)

  1. Open a terminal with root or sudo privileges.
  2. Run the following commands:

    sudo apt-get install -y apt-transport-https ca-certificates gnupg curl
    curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo gpg --dearmor -o /usr/share/keyrings/cloud.google.gpg
    echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] https://packages.cloud.google.com/apt cloud-sdk main" | sudo tee /etc/apt/sources.list.d/google-cloud-sdk.list
    sudo apt-get update
    sudo apt-get install -y google-cloud-cli
    
  3. Verify the installation:

    gcloud --version
    gsutil --version
    

Linux (RHEL/CentOS/Fedora)

  1. Open a terminal with root or sudo privileges.
  2. Create the repository file. The baseurl below targets EL9; for RHEL 8-based distributions, replace el9 with el8:

    sudo tee /etc/yum.repos.d/google-cloud-sdk.repo << 'EOF'
    [google-cloud-cli]
    name=Google Cloud CLI
    baseurl=https://packages.cloud.google.com/yum/repos/cloud-sdk-el9-x86_64
    enabled=1
    gpgcheck=1
    repo_gpgcheck=0
    gpgkey=https://packages.cloud.google.com/yum/doc/rpm-package-key.gpg
    EOF
    
  3. Install the Google Cloud CLI:

    sudo dnf install -y google-cloud-cli
    
  4. Verify the installation:

    gcloud --version
    gsutil --version
    

Authenticate Google Cloud CLI

  1. Run the following command to authenticate:

    gcloud auth login
    
  2. Set the project that contains your GCS bucket:

    gcloud config set project <PROJECT_ID>
    
    • Replace <PROJECT_ID> with your Google Cloud project ID.

For unattended (scheduled) use, authenticate with a service account key. This service account is used by the export script to upload files to the GCS bucket. It is separate from the Chronicle service account used to read from the bucket (configured in a later step).

  1. In the Google Cloud Console, go to IAM & Admin > Service Accounts and select your project.
  2. Click Create Service Account.
  3. In the Service account name field, enter a name (for example, chronicle-gcs-export).
  4. Click Create and Continue.
  5. In the Grant this service account access to project section, select the Storage Object Creator role.
  6. Click Done.
  7. In the service accounts list, click the service account you created.
  8. Go to the Keys tab.
  9. Click Add Key > Create new key.
  10. Select JSON and click Create.
  11. Save the downloaded JSON key file:
    • Windows: C:\Certs\gcs-service-account.json
    • macOS and Linux: /opt/certs/gcs-service-account.json
  12. Activate the service account:

    gcloud auth activate-service-account --key-file=<PATH_TO_KEY_FILE>
    

Create the GCS export script

Windows

  1. Create a file named C:\Scripts\messagetrace-gcs.ps1 with the following content:

    # Configuration
    $certThumbprint = "<CERTIFICATE_THUMBPRINT>"
    $appId = "<APPLICATION_ID>"
    $organization = "yourdomain.onmicrosoft.com"
    $gcsBucket = "gs://office365-messagetrace-logs/messagetrace/"
    $localLogDir = "C:\Logs\MessageTrace"
    
    # Log file
    $logFile = "C:\Logs\messagetrace-gcs.log"
    if (!(Test-Path $localLogDir)) { New-Item -ItemType Directory -Path $localLogDir }
    
    try {
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Starting message trace export to GCS"
    
        # Connect to Exchange Online using certificate-based authentication
        Connect-ExchangeOnline -CertificateThumbprint $certThumbprint -AppId $appId -Organization $organization -ShowBanner:$false
    
        # Retrieve message trace data for the last hour (up to 10 days available)
        $startDate = (Get-Date).AddHours(-1)
        $endDate = Get-Date
        $messages = Get-MessageTraceV2 -StartDate $startDate -EndDate $endDate -ResultSize 5000
    
        if ($messages.Count -eq 0) {
            Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - No messages found in the time range"
            Disconnect-ExchangeOnline -Confirm:$false
            exit 0
        }
    
        # Convert to NDJSON (one JSON object per line)
        $ndjson = $messages | ForEach-Object { $_ | ConvertTo-Json -Compress }
        $fileName = "messagetrace_$(Get-Date -Format 'yyyyMMddHHmmss').json"
        $filePath = Join-Path $localLogDir $fileName
        $ndjson | Out-File -FilePath $filePath -Encoding UTF8
    
        # Upload to GCS
        gsutil cp $filePath $gcsBucket
    
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Uploaded $fileName with $($messages.Count) messages"
    
        # Remove local file after successful upload
        Remove-Item $filePath
    }
    catch {
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Error: $_"
    }
    finally {
        Disconnect-ExchangeOnline -Confirm:$false -ErrorAction SilentlyContinue
    }
    
  2. Replace the placeholder values:

    • <CERTIFICATE_THUMBPRINT>: The certificate thumbprint from the authentication setup
    • <APPLICATION_ID>: The Azure AD application (client) ID
    • yourdomain.onmicrosoft.com: Your Microsoft 365 tenant domain
    • office365-messagetrace-logs: Your GCS bucket name

macOS and Linux

  1. Create a file named /opt/scripts/messagetrace-gcs.ps1 with the following content:

    # Configuration
    $certFilePath = "/opt/certs/messagetrace.pfx"
    $appId = "<APPLICATION_ID>"
    $organization = "yourdomain.onmicrosoft.com"
    $gcsBucket = "gs://office365-messagetrace-logs/messagetrace/"
    $localLogDir = "/var/log/messagetrace/export"
    
    # Log file
    $logFile = "/var/log/messagetrace/messagetrace-gcs.log"
    if (!(Test-Path $localLogDir)) { New-Item -ItemType Directory -Path $localLogDir }
    
    try {
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Starting message trace export to GCS"
    
        # Connect to Exchange Online using certificate-based authentication
        Connect-ExchangeOnline -CertificateFilePath $certFilePath -AppId $appId -Organization $organization -ShowBanner:$false
    
        # Retrieve message trace data for the last hour (up to 10 days available)
        $startDate = (Get-Date).AddHours(-1)
        $endDate = Get-Date
        $messages = Get-MessageTraceV2 -StartDate $startDate -EndDate $endDate -ResultSize 5000
    
        if ($messages.Count -eq 0) {
            Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - No messages found in the time range"
            Disconnect-ExchangeOnline -Confirm:$false
            exit 0
        }
    
        # Convert to NDJSON (one JSON object per line)
        $ndjson = $messages | ForEach-Object { $_ | ConvertTo-Json -Compress }
        $fileName = "messagetrace_$(Get-Date -Format 'yyyyMMddHHmmss').json"
        $filePath = Join-Path $localLogDir $fileName
        $ndjson | Out-File -FilePath $filePath -Encoding UTF8
    
        # Upload to GCS
        gsutil cp $filePath $gcsBucket
    
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Uploaded $fileName with $($messages.Count) messages"
    
        # Remove local file after successful upload
        Remove-Item $filePath
    }
    catch {
        Add-Content -Path $logFile -Value "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Error: $_"
    }
    finally {
        Disconnect-ExchangeOnline -Confirm:$false -ErrorAction SilentlyContinue
    }
    
  2. Replace the placeholder values:

    • <APPLICATION_ID>: The Azure AD application (client) ID
    • yourdomain.onmicrosoft.com: Your Microsoft 365 tenant domain
    • office365-messagetrace-logs: Your GCS bucket name
  3. Set file permissions:

    sudo chmod 700 /opt/scripts/messagetrace-gcs.ps1
    

Schedule the GCS export script

Windows

  1. Open Task Scheduler: Press Win+R, type taskschd.msc, and press Enter.
  2. In the right pane, click Create Task.
  3. On the General tab:
    • Name: Enter MessageTrace GCS Export
    • Select Run whether user is logged on or not
    • Select Run with highest privileges
  4. On the Triggers tab:
    1. Click New.
    2. Set Begin the task to On a schedule.
    3. Select Daily, set Start time to 12:00:00 AM (midnight, so it repeats on whole hours), and set Recur every 1 days.
    4. Select Repeat task every 1 hour for a duration of Indefinitely.
    5. Select Enabled.
    6. Click OK.
  5. On the Actions tab:

    1. Click New.
    2. Set Action to Start a program.
    3. In the Program/script field, enter:

      powershell.exe
      
    4. In the Add arguments field, enter:

      -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\messagetrace-gcs.ps1"
      
    5. Click OK.

  6. On the Settings tab:

    • Select Allow task to be run on demand
    • Clear the Stop the task if it runs longer than checkbox
  7. Click OK and enter the service account credentials when prompted.

macOS

  1. Create a launchd plist file at ~/Library/LaunchAgents/com.chronicle.messagetrace-gcs.plist. If Homebrew installed pwsh under /opt/homebrew/bin (Apple Silicon), adjust the pwsh path in ProgramArguments accordingly:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>Label</key>
        <string>com.chronicle.messagetrace-gcs</string>
        <key>ProgramArguments</key>
        <array>
            <string>/usr/local/bin/pwsh</string>
            <string>-NoProfile</string>
            <string>-File</string>
            <string>/opt/scripts/messagetrace-gcs.ps1</string>
        </array>
        <key>StartInterval</key>
        <integer>3600</integer>
        <key>StandardOutPath</key>
        <string>/var/log/messagetrace/launchd-gcs-stdout.log</string>
        <key>StandardErrorPath</key>
        <string>/var/log/messagetrace/launchd-gcs-stderr.log</string>
        <key>RunAtLoad</key>
        <true/>
    </dict>
    </plist>
    
  2. Load the job:

    launchctl load ~/Library/LaunchAgents/com.chronicle.messagetrace-gcs.plist
    
  3. Verify the job is loaded:

    launchctl list | grep messagetrace
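
The plist above writes its output to /var/log/messagetrace, which does not exist by default; a sketch of the preparation steps, assuming the paths used in the plist (note that a LaunchAgent only runs while the user is logged in; for an always-on machine, a LaunchDaemon in /Library/LaunchDaemons may be more appropriate):

```
# Create the log directory referenced by StandardOutPath/StandardErrorPath.
sudo mkdir -p /var/log/messagetrace
sudo chown "$(id -un)" /var/log/messagetrace

# Validate the plist syntax before loading it (plutil is built into macOS).
plutil -lint ~/Library/LaunchAgents/com.chronicle.messagetrace-gcs.plist
```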
    

Linux

  1. Open the crontab editor:

    crontab -e
    
  2. Add the following line to run the script every hour:

    0 * * * * /usr/bin/pwsh -NoProfile -File /opt/scripts/messagetrace-gcs.ps1 >> /var/log/messagetrace/cron-gcs.log 2>&1
    
  3. Save and exit the editor.

  4. Verify the cron job is registered:

    crontab -l
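
If an hourly run can occasionally take longer than an hour (for example, during a backfill), overlapping executions may interfere with each other. A common variant of the crontab entry above wraps the script in `flock` (from util-linux) so a new run exits immediately while the previous one still holds the lock; the lock-file path below is arbitrary:

```
# Variant of the hourly entry: flock -n skips the run if the previous one is still active.
0 * * * * /usr/bin/flock -n /var/lock/messagetrace-gcs.lock /usr/bin/pwsh -NoProfile -File /opt/scripts/messagetrace-gcs.ps1 >> /var/log/messagetrace/cron-gcs.log 2>&1
```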
    

Verify GCS upload

  1. Run the script manually to test:

    Windows:

    powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\messagetrace-gcs.ps1"
    

    macOS and Linux:

    pwsh -NoProfile -File /opt/scripts/messagetrace-gcs.ps1
    
  2. Verify the file was uploaded to the GCS bucket:

    gsutil ls gs://office365-messagetrace-logs/messagetrace/
    
  3. Check the log file for errors:

    Windows:

    Get-Content "C:\Logs\messagetrace-gcs.log" -Tail 10
    

    macOS and Linux:

    tail -10 /var/log/messagetrace/messagetrace-gcs.log
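
After scheduled runs have started, you can also confirm that new objects keep arriving by listing the bucket with sizes and timestamps. If the export script names files by timestamp, the lexicographically last entries are the newest:

```
# List object size, creation time, and URL; show the last few entries.
gsutil ls -l gs://office365-messagetrace-logs/messagetrace/ | tail -5
```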
    

Get the Google SecOps service account

  1. Go to SIEM Settings > Feeds.
  2. Click Add New Feed.
  3. Click Configure a single feed.
  4. In the Feed name field, enter a name for the feed (for example, Office 365 Message Trace - GCS).
  5. Select Google Cloud Storage V2 as the Source type.
  6. Select Office 365 Message Trace as the Log type.
  7. Click Get Service Account. A unique service account email address is displayed, for example:

    chronicle-12345678@chronicle-gcp-prod.iam.gserviceaccount.com
    
  8. Copy this email address for use in the next step.

Grant IAM permissions

  1. Go to Cloud Storage > Buckets.
  2. Select your bucket name.
  3. Go to the Permissions tab.
  4. Click Grant access.
  5. Provide the following configuration details:
    • Add principals: Paste the Google SecOps service account email
    • Assign roles: Select Storage Object Viewer
  6. Click Save.
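The same grant can be made from the command line. This sketch assumes the example bucket name used elsewhere in this document and the example service account address from the previous step; substitute your own values:

```
# Grant the Google SecOps service account read access to objects in the bucket.
gcloud storage buckets add-iam-policy-binding gs://office365-messagetrace-logs \
  --member="serviceAccount:chronicle-12345678@chronicle-gcp-prod.iam.gserviceaccount.com" \
  --role="roles/storage.objectViewer"
```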

Configure the Google SecOps feed for GCS

  1. Continue from the feed creation page (or go to SIEM Settings > Feeds > Add New Feed).
  2. Click Next.
  3. Specify values for the following input parameters:

    • Storage bucket URL: Enter the GCS bucket URI with the prefix path:

      gs://office365-messagetrace-logs/messagetrace/
      
    • Source deletion option: Select the deletion option according to your preference:
      • Never: Never deletes any files after transfers (recommended for testing)
      • Delete transferred files: Deletes files after successful transfer
      • Delete transferred files and empty directories: Deletes files and empty directories after successful transfer
    • Maximum File Age: Include files modified within the last specified number of days (the default is 180 days)
    • Asset namespace: The asset namespace
    • Ingestion labels: The label to be applied to the events from this feed
  4. Click Next.

  5. Review your new feed configuration in the Finalize screen, and then click Submit.

UDM mapping table

The following table lists the log fields of the OFFICE_365_MESSAGETRACE log type and their corresponding UDM fields.

| Log Field | UDM Mapping | Logic |
| --- | --- | --- |
| Index, __metadata.type, __metadata.id, StartDate, EndDate, operationName, Tenant, properties.SenderMailFromDomain, properties.DeliveryAction, properties.UserLevelAction, properties.UserLevelPolicy, properties.DeliveryLocation, properties.Connectors, properties.OrgLevelAction, properties.OrgLevelPolicy, properties.AdditionalFields, properties.UrlLocation | additional.fields | Merged with labels created from each source field |
| properties.AuthenticationDetails | extensions.auth.auth_details | Value copied directly |
| Received, time | metadata.event_timestamp | Parsed as a timestamp from Received using the formats yyyy-MM-ddTHH:mm:ss.SSSSSSS, dd/MMM/yyyy HH:mm:ss, ISO after appending Z, or unix_ms from grok Date(%{Received}); otherwise from time as ISO |
| (none) | metadata.event_type | Set to "EMAIL_TRANSACTION" |
| tenantId | metadata.product_deployment_id | Value copied directly |
| category | metadata.product_event_type | Value copied directly |
| MessageTraceId | metadata.product_log_id | Value copied directly |
| properties.EmailDirection | network.direction | Set to INBOUND if Inbound, OUTBOUND if Outbound, UNKNOWN_DIRECTION otherwise |
| properties.SenderFromAddress, SenderAddress | network.email.from | Set to properties.SenderFromAddress if it matches an email regex, then overwritten by SenderAddress if not empty and not '<>' |
| MessageId | network.email.mail_id | Value copied, with < and > removed using gsub |
| properties.Subject, Subject | network.email.subject | Merged from properties.Subject and Subject |
| properties.RecipientEmailAddress, RecipientAddress | network.email.to | Merged from properties.RecipientEmailAddress and RecipientAddress (if not empty and not '<>') |
| Size | network.received_bytes | Converted to uinteger |
| properties.SenderFromDomain, Organization | principal.administrative_domain | Set to properties.SenderFromDomain if not empty, then overwritten by Organization if not empty |
| FromIP | principal.ip | Value copied directly |
| properties.FileType | principal.process.file.file_type | Set to FILE_TYPE_PNG if it matches 'png' |
| properties.FileName | principal.process.file.names | Merged from properties.FileName |
| properties.SHA256 | principal.process.file.sha256 | Value copied directly |
| properties.FileSize | principal.process.file.size | Converted to uinteger |
| properties.SenderMailFromDomain, properties.UserLevelAction, properties.UserLevelPolicy | principal.user.attribute.labels | Merged with labels from properties.SenderMailFromDomain, UserLevelAction, and UserLevelPolicy |
| properties.SenderFromAddress | principal.user.email_addresses | Merged from properties.SenderFromAddress |
| properties.SenderObjectId | principal.user.product_object_id | Value copied directly |
| properties.SenderDisplayName | principal.user.user_display_name | Value copied directly |
| Status | security_result.action | Set to ALLOW if in [Resolved, Delivered], BLOCK if Failed |
| Status | security_result.action_details | Value copied directly |
| properties.ThreatTypes | security_result.category | Set to MAIL_PHISHING if Phish, else UNKNOWN_CATEGORY |
| properties.ConfidenceLevel | security_result.confidence | Set to HIGH_CONFIDENCE if malicious, MEDIUM_CONFIDENCE if suspicious |
| properties.ConfidenceLevel | security_result.confidence_details | Value copied directly |
| properties.DetectionMethods | security_result.detection_fields | detection_method extracted using grok, merged as a map with key 'Detection Method' |
| properties.DetectionMethods | security_result.rule_name | detection_method extracted using grok, set directly |
| properties.ThreatNames | security_result.threat_name | Value copied directly if not empty or null |
| properties.UrlDomain | target.asset.hostname | Value copied directly |
| ToIP | target.asset.ip | Value copied directly |
| properties.UrlDomain | target.hostname | Value copied directly |
| ToIP | target.ip | Value copied directly |
| properties.Url | target.url | Value copied directly |
| properties.RecipientEmailAddress | target.user.email_addresses | Merged from properties.RecipientEmailAddress |
| properties.RecipientObjectId | target.user.product_object_id | Value copied directly |
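Two of the mappings above are easy to misread, so here is a small illustrative shell sketch (not part of the parser itself) showing the MessageId angle-bracket stripping and the Status-to-action logic, using hypothetical sample values:

```shell
# Illustration only; the actual parser performs these transformations internally.

# MessageId -> network.email.mail_id: the '<' and '>' characters are removed.
msgid='<abc123@contoso.com>'
mail_id=$(printf '%s' "$msgid" | tr -d '<>')
echo "$mail_id"    # abc123@contoso.com

# Status -> security_result.action: Resolved/Delivered => ALLOW, Failed => BLOCK.
status='Delivered'
case "$status" in
  Resolved|Delivered) action='ALLOW' ;;
  Failed)             action='BLOCK' ;;
esac
echo "$action"     # ALLOW
```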

Need more help? Get answers from Community members and Google SecOps professionals.