
How We Cut 40 Hours of Manual Data Entry With Microsoft Graph API and n8n

Last year we started working with a mid-sized professional services firm. Thirty people, busy enough to be growing, small enough that nobody had stopped to ask whether the way they were working actually made sense.

Their operations team was spending about 40 hours every week on data entry. Not complex work. Not work that required judgement. Just moving information from one place to another: pulling data from emails, copying it into spreadsheets, updating records in their CRM, forwarding summaries to the right people.

Forty hours. That is one full-time person doing nothing but transferring data between systems that could, with the right setup, talk to each other directly.

The secret was not hiring someone faster at copy-paste. It was building agents that did it without being asked.

What was actually happening

The firm received a steady stream of structured information by email every day. Client updates, form submissions, approval confirmations, status notifications from external platforms. Each email contained data that needed to go somewhere specific.

The process looked like this. Someone on the operations team opened the email. They read it. They opened a spreadsheet. They typed the relevant values into the right columns. They opened the CRM. They found the right record. They updated it. They sent a summary to the relevant colleague. They moved to the next email.

There were around 80 to 100 of these emails per week. Each one took 15 to 25 minutes. The team was competent and careful. They made very few mistakes. They also had no time to do anything else.

The question was not whether they were doing it badly. The question was whether it needed to be done by a human at all.

The stack we chose

Two tools did most of the work.

n8n for the automation layer. If you have not used it: n8n is a workflow automation platform that lets you connect services, write logic, handle errors, and run sequences without building a full application. It is self-hostable, which mattered to this client for data reasons, and it handles the orchestration that would otherwise require a developer writing custom integration code.

Microsoft Graph API for the Microsoft 365 layer. The client lived in Outlook, Excel, and SharePoint. Graph API is Microsoft's unified API for accessing all of those services programmatically. Emails, calendar events, Excel workbooks, SharePoint lists, Teams messages, OneDrive files — all accessible through the same authentication model and the same endpoint structure.
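To make the "same endpoint structure" point concrete, here is a small sketch of how those resources map onto Graph paths (the helper function and placeholder IDs are ours, not part of the client's workflow; the paths follow the Graph v1.0 reference):

```javascript
// Every Microsoft 365 resource hangs off the same base URL and version,
// so one token and one HTTP client cover mail, files, sites, and more.
const GRAPH = 'https://graph.microsoft.com/v1.0';

function graphUrl(resource, ids) {
  const paths = {
    messages:   `/users/${ids.user}/messages`,                          // Outlook mail
    events:     `/users/${ids.user}/events`,                            // Calendar
    workbook:   `/sites/${ids.site}/drive/items/${ids.item}/workbook`,  // Excel
    listItems:  `/sites/${ids.site}/lists/${ids.list}/items`,           // SharePoint lists
    channelMsg: `/teams/${ids.team}/channels/${ids.channel}/messages`,  // Teams
  };
  return GRAPH + paths[resource];
}

console.log(graphUrl('messages', { user: 'ops@example.com' }));
// → https://graph.microsoft.com/v1.0/users/ops@example.com/messages
```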

The combination meant we could read emails as they arrived, extract the data we needed, write it directly to the right Excel workbook and SharePoint list, update the CRM record via its own API, and send a formatted Teams notification to the right person. No human in the loop for the standard cases.

How the authentication worked

This is usually where people get stuck with Graph API, so it is worth being specific.

Graph API uses OAuth 2.0 with Azure Active Directory. For unattended automation (no human clicking "allow" each time), you register an application in Azure AD and use the client credentials flow. The app gets its own identity with specific permissions granted by an admin.

1. Register app in Azure AD
   Azure Portal → App Registrations → New Registration
   Name: n8n-automation
   Supported account types: Single tenant

2. Add API permissions
   Microsoft Graph → Application permissions (not delegated)
   Mail.ReadWrite   (Read alone is not enough; the workflow also moves messages between folders)
   Files.ReadWrite.All
   Sites.ReadWrite.All
   User.Read.All

3. Grant admin consent
   (Required for application permissions — a Global Admin clicks "Grant admin consent")

4. Create a client secret
   Certificates & Secrets → New client secret
   Copy the value immediately — it is not shown again

5. Note your credentials
   Tenant ID:     found in Azure AD → Overview
   Client ID:     found in App Registration → Overview
   Client Secret: the value you just copied

In n8n, store these as credentials under Credentials → Microsoft Graph API (or as generic HTTP header credentials if using HTTP Request nodes directly).
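The exchange behind the client credentials flow is a single POST to the token endpoint. A minimal sketch of building that request (the helper name is ours; the endpoint and parameter names come from the Microsoft identity platform token documentation):

```javascript
// Build the client credentials token request. Exchanging it with any HTTP
// client returns JSON whose access_token field authorises Graph calls.
function buildTokenRequest(tenantId, clientId, clientSecret) {
  return {
    method: 'POST',
    url: `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`,
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      grant_type:    'client_credentials',
      client_id:     clientId,
      client_secret: clientSecret,
      // .default means "every application permission this app was granted"
      scope: 'https://graph.microsoft.com/.default',
    }).toString(),
  };
}
```

Because application permissions carry no user context, the resulting token acts tenant-wide within whatever the admin consented to; that is why the admin-consent step above cannot be skipped.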

The n8n workflow in detail

The workflow has four stages. Here is what each one does and why we built it the way we did.

Stage 1: Email trigger and initial filter

n8n polls the shared Outlook mailbox every 5 minutes using the Microsoft Graph API node. The trigger reads new emails and immediately filters on two conditions: the sender domain must be in an approved list, and the subject line must match one of a defined set of patterns.

Everything that does not match passes through to a separate branch that simply moves the email to a "Manual Review" folder in Outlook. The automation never discards anything. It routes things it understands and flags everything else for a human.

// n8n Code node: Filter and classify incoming email

const email = $json;

const approvedDomains = [
  'trustedpartner.com',
  'clientportal.io',
  'notifications.platform.com'
];

const subjectPatterns = [
  { pattern: /^Update: (.+) — Status Change/i,    type: 'status_update'    },
  { pattern: /^Form submission: (.+)$/i,           type: 'form_submission'  },
  { pattern: /^Approval confirmed: (.+)$/i,        type: 'approval'         },
  { pattern: /^Monthly report: (.+) ready$/i,      type: 'report_ready'     },
];

const senderDomain = email.from.split('@')[1]?.toLowerCase() || '';
const isApprovedSender = approvedDomains.some(d => senderDomain.endsWith(d));

let emailType = null;
let matchedData = null;

if (isApprovedSender) {
  for (const { pattern, type } of subjectPatterns) {
    const match = email.subject.match(pattern);
    if (match) {
      emailType = type;
      matchedData = match[1];
      break;
    }
  }
}

return [{
  json: {
    ...email,
    classified:   emailType !== null,
    emailType:    emailType,
    matchedData:  matchedData,
    shouldRoute:  !isApprovedSender || emailType === null,
  }
}];
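The "Manual Review" routing on the other branch comes down to one Graph call per email. A sketch of how that request can be built (the helper and its parameters are our illustration; the /move endpoint and destinationId body follow the Graph mail API):

```javascript
// Build the Graph request that files an unmatched email into the
// "Manual Review" mail folder instead of discarding it.
function buildMoveRequest(mailboxUser, messageId, folderId, token) {
  return {
    method: 'POST',
    url: `https://graph.microsoft.com/v1.0/users/${mailboxUser}/messages/${messageId}/move`,
    headers: {
      'Authorization': `Bearer ${token}`,
      'Content-Type':  'application/json',
    },
    // destinationId is the target mailFolder's ID (or a well-known name)
    body: JSON.stringify({ destinationId: folderId }),
  };
}
```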

Stage 2: Body parsing and data extraction

For emails that pass the filter, the body needs to be parsed. The emails from their external platforms came in a consistent structured format — essentially a labelled list of fields and values. We wrote a parser that pulled the relevant fields out regardless of minor formatting variations between senders.

// n8n Code node: Parse email body into structured data

const body = $json.emailBody || '';
const emailType = $json.emailType;

function extractFields(text) {
  const lines = text.split('\n').map(l => l.trim()).filter(Boolean);
  const fields = {};

  for (const line of lines) {
    // Handles "Field Name: Value" and "Field Name — Value"
    const match = line.match(/^([^:—\n]{2,40})\s*[:—]\s*(.+)$/);
    if (match) {
      const key = match[1].trim()
        .toLowerCase()
        .replace(/\s+/g, '_')
        .replace(/[^a-z0-9_]/g, '');
      fields[key] = match[2].trim();
    }
  }

  return fields;
}

const parsed = extractFields(body);

// Normalise date fields
const dateFields = ['date', 'submitted_on', 'effective_date', 'due_date'];
for (const field of dateFields) {
  if (parsed[field]) {
    const d = new Date(parsed[field]);
    if (!isNaN(d)) {
      parsed[field] = d.toISOString().split('T')[0];
    }
  }
}

return [{
  json: {
    emailType,
    fields:    parsed,
    parsedAt:  new Date().toISOString(),
    messageId: $json.messageId,
  }
}];

Stage 3: Write to Excel and SharePoint

This is where Graph API does the heavy lifting. The workflow writes the extracted data directly to a specific row in the right Excel workbook on SharePoint. No downloading, no uploading. Direct writes through the API.

// n8n HTTP Request node configuration for writing to Excel via Graph API
// (configured as a Code node here for clarity)

const tenantId    = $env.AZURE_TENANT_ID;
const clientId    = $env.AZURE_CLIENT_ID;
const clientSecret = $env.AZURE_CLIENT_SECRET;

// Step 1: Get access token
const tokenResp = await $helpers.httpRequest({
  method: 'POST',
  url:    `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`,
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body:   new URLSearchParams({
    grant_type:    'client_credentials',
    client_id:     clientId,
    client_secret: clientSecret,
    scope:         'https://graph.microsoft.com/.default',
  }).toString(),
});

const token = JSON.parse(tokenResp.body).access_token;

const siteId     = $env.SHAREPOINT_SITE_ID;
const workbookId = $env.EXCEL_WORKBOOK_ID;
const sheetName  = 'Operations Log';

const fields  = $('Parse Email Body').first().json.fields;
const rowData = [
  new Date().toISOString().split('T')[0],   // Date logged
  fields.reference_number  || '',
  fields.client_name        || '',
  fields.status             || '',
  fields.amount             || '',
  fields.effective_date     || '',
  fields.submitted_by       || '',
  $('Parse Email Body').first().json.emailType,
];

// Step 2: Append row to Excel table
await $helpers.httpRequest({
  method: 'POST',
  url:    `https://graph.microsoft.com/v1.0/sites/${siteId}/drive/items/${workbookId}/workbook/tables/OperationsTable/rows/add`,
  headers: {
    'Authorization': `Bearer ${token}`,
    'Content-Type':  'application/json',
  },
  body: JSON.stringify({ values: [rowData] }),
});

// Step 3: Update SharePoint list item if reference number exists
if (fields.reference_number) {
  const listId = $env.SHAREPOINT_LIST_ID;

  // Find existing list item by reference number.
  // Note: $filter on a list column requires the column to be indexed,
  // or the Prefer header below to allow a non-indexed query.
  const searchResp = await $helpers.httpRequest({
    method: 'GET',
    url:    `https://graph.microsoft.com/v1.0/sites/${siteId}/lists/${listId}/items?$filter=fields/ReferenceNumber eq '${fields.reference_number}'&$expand=fields`,
    headers: {
      'Authorization': `Bearer ${token}`,
      'Prefer':        'HonorNonIndexedQueriesWarningMayFailRandomly',
    },
  });

  const items = JSON.parse(searchResp.body).value;

  if (items.length > 0) {
    const itemId = items[0].id;

    // Patch the existing record
    await $helpers.httpRequest({
      method: 'PATCH',
      url:    `https://graph.microsoft.com/v1.0/sites/${siteId}/lists/${listId}/items/${itemId}/fields`,
      headers: {
        'Authorization': `Bearer ${token}`,
        'Content-Type':  'application/json',
      },
      body: JSON.stringify({
        Status:          fields.status         || '',
        LastUpdated:     new Date().toISOString(),
        EffectiveDate:   fields.effective_date  || '',
        ProcessedByAI:   true,
      }),
    });
  }
}

// Pass the token along so the notification stage can reuse it
return [{ json: { success: true, reference: fields.reference_number, accessToken: token } }];

Stage 4: Teams notification to the right person

The last step sends a formatted Teams message to whoever needs to know. Not a broadcast. A targeted message to the specific person or channel responsible for that email type. Different email categories route to different people.

// n8n HTTP Request node: Send Teams notification via Graph API

const fields    = $('Parse Email Body').first().json.fields;
const emailType = $('Parse Email Body').first().json.emailType;
const token     = $('Write to Excel').first().json.accessToken;

// Map email type to the destination Teams channel ID (or chat ID for 1:1 messages)
const routingMap = {
  'status_update':   { type: 'channel', id: $env.TEAMS_CHANNEL_OPERATIONS },
  'form_submission': { type: 'channel', id: $env.TEAMS_CHANNEL_INTAKE     },
  'approval':        { type: 'user',    id: $env.TEAMS_USER_APPROVALS      },
  'report_ready':    { type: 'channel', id: $env.TEAMS_CHANNEL_REPORTS     },
};

const destination = routingMap[emailType];
if (!destination) return [{ json: { skipped: true } }];

const messageBody = {
  body: {
    contentType: 'html',
    content: `
      <p><strong>Automated update logged</strong></p>
      <p>Reference: <strong>${fields.reference_number || 'N/A'}</strong></p>
      <p>Type: ${emailType.replace(/_/g, ' ')}</p>
      <p>Status: ${fields.status || 'Not specified'}</p>
      <p>Effective: ${fields.effective_date || 'N/A'}</p>
      <p style="color: #888; font-size: 12px;">
        This was logged automatically. 
        Check the Operations Log in SharePoint for full details.
      </p>
    `.trim(),
  },
};

// Note: sending Teams messages needs delegated permissions
// (ChannelMessage.Send for channels, Chat.ReadWrite for chats); app-only
// tokens are limited to migration imports. Channel messages post to
// /teams/{team-id}/channels/{channel-id}/messages, where TEAMS_TEAM_ID is
// the hosting team's ID. A 1:1 message posts to /chats/{chat-id}/messages,
// so the 'user' route expects a pre-created chat ID, not a bare user ID.
const endpoint = destination.type === 'channel'
  ? `https://graph.microsoft.com/v1.0/teams/${$env.TEAMS_TEAM_ID}/channels/${destination.id}/messages`
  : `https://graph.microsoft.com/v1.0/chats/${destination.id}/messages`;

await $helpers.httpRequest({
  method: 'POST',
  url:    endpoint,
  headers: {
    'Authorization': `Bearer ${token}`,
    'Content-Type':  'application/json',
  },
  body: JSON.stringify(messageBody),
});

return [{ json: { notified: destination.type, id: destination.id } }];

Error handling: the part that actually matters

Automation that fails silently is worse than no automation, because at least with manual work someone notices when something goes wrong.

Every error path in this workflow does two things. It moves the original email to a specific "Processing Failed" folder in Outlook with a tag showing which stage failed. It sends a Teams message to the operations team with the error detail and the original email reference so they can handle it manually.

// n8n Error Trigger node: attached to every workflow

const error    = $json.error;
const workflow = $json.workflow;

// Move email to failed folder via Graph API
const token = await getAccessToken(); // helper function

await $helpers.httpRequest({
  method: 'POST',
  url:    `https://graph.microsoft.com/v1.0/users/${$env.MAILBOX_USER}/messages/${workflow.lastEmailId}/move`,
  headers: {
    'Authorization': `Bearer ${token}`,
    'Content-Type':  'application/json',
  },
  body: JSON.stringify({ destinationId: $env.OUTLOOK_FAILED_FOLDER_ID }),
});

// Notify the team
await $helpers.httpRequest({
  method: 'POST',
  url:    `https://graph.microsoft.com/v1.0/teams/${$env.TEAMS_TEAM_ID}/channels/${$env.TEAMS_CHANNEL_ALERTS}/messages`,
  headers: {
    'Authorization': `Bearer ${token}`,
    'Content-Type':  'application/json',
  },
  body: JSON.stringify({
    body: {
      contentType: 'text',
      content: `Automation failed at stage: ${error.stage}\nEmail: ${workflow.subject || 'unknown'}\nError: ${error.message}\nAction: Check Processing Failed folder in Outlook.`,
    },
  }),
});

return [{ json: { errorHandled: true } }];

What the numbers looked like after four weeks

The workflow ran for a month before anyone on the team changed what they were doing. We wanted to see how it behaved in production before drawing conclusions.

Metric                              Before        After
Emails processed per week           85 avg        85 avg
Handled by automation               0%            78%
Manual time per week                40 hrs        9 hrs
Errors requiring rework             3 to 5/week   0 to 1/week
Average processing time per email   18 mins       40 seconds

The 22% that still required human handling were emails that did not match the expected format, contained ambiguous data, or fell into edge cases the classification had not seen before. Each of those was handled faster because the email was already categorised and in the right folder.

The 40-second figure for automated emails is the end-to-end time from the email arriving to the SharePoint record being updated and the Teams notification sent.

What this was not

This was not an AI project in the headline sense. There was no large language model making decisions. The intelligence in the system was in the design of the classification rules, the field extraction logic, and the routing map. Those were built by understanding the specific workflow this team actually had, not by applying a general-purpose AI to a vague problem.

That is worth saying because a lot of automation conversations start with "we should use AI" when what is actually needed is a well-designed integration that connects systems that already hold the right data. The Microsoft Graph API gave us access to everything in the Microsoft 365 environment. n8n gave us the orchestration layer to connect it. Those two things, built around the specific shape of this team's work, were enough.

The agents here are not AI agents in the current buzzword sense. They are automation agents. Software that watches for specific inputs, applies rules, writes to specific places, and notifies specific people. The outcome, 40 hours returned to the team, came from that clarity of design, not from the sophistication of the technology.

If your team is in a similar situation

The pattern described above applies to a lot of organisations that run on Microsoft 365. If your operations team is spending significant time moving structured data between emails, Excel files, SharePoint, and internal systems, the ingredients for this kind of automation are almost certainly already present in your stack.

The work is in the analysis: mapping exactly what comes in, what format it arrives in, where it needs to go, and who needs to know about it. Once that is documented clearly, the implementation is straightforward.

If you are trying to figure out whether this is applicable to your situation or where to start, write to us at sales@bithost.in or visit www.bithost.in. We are happy to look at what you have and tell you honestly whether it is a good fit.

ZHOST March 20, 2026