I’ve previously written about how to import data from Excel into Microsoft Lists. In this post, we will switch things up and export list data into Excel using Power Automate. In our scenario, we will export a subset of the list data into a date-named spreadsheet that is stored in Microsoft Teams daily.
Excel Preparation
Our solution requires an empty Excel file, formatted as a table, with columns that match our list schema. This file will serve as a template for the export process and will be named Template.xlsx. Take note of the table name within your template file.

Be sure to take note of the SharePoint site URL associated with the Teams site.
Hint: From the Files tab, click Open in SharePoint and copy the site URL.

Flow Build
Create a new Flow from our list by clicking Integrate > Power Automate > See your flows.

Click New Flow > Scheduled cloud flow.

Supply a Flow name, e.g. “Export List to Excel”, set the recurrence to “Day” and click Create.

Create an Initialize variable action and set:
- Name to “varFileName”
- Type to “String”
- Value to the following expression:
concat('IssueTracker-', utcNow('yyyy-MM-dd'), '.xlsx')
Note: This expression generates the file name based on static text (“IssueTracker-“) and the current date. Change the static text as you see fit.
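As a rough illustration, the expression behaves like the following Python sketch (the “IssueTracker-” prefix is just the example used here):

```python
from datetime import datetime, timezone

def build_file_name(prefix: str = "IssueTracker-") -> str:
    """Mimic concat('IssueTracker-', utcNow('yyyy-MM-dd'), '.xlsx')."""
    # utcNow('yyyy-MM-dd') formats the current UTC date, e.g. 2024-05-01
    return prefix + datetime.now(timezone.utc).strftime("%Y-%m-%d") + ".xlsx"

print(build_file_name())  # e.g. IssueTracker-2024-05-01.xlsx
```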

Add another Initialize variable action and set:
- Name to “varTableName”
- Type to “String”
- Value to the name of the table in the template file, e.g. “Table1”

Create a SharePoint Copy file action and set:
- Current Site Address to our site address noted above
- File to Copy: navigate to “/Shared Documents/General/Template.xlsx”
Note: “General” is the name of the Teams channel and can be changed to the folder name that corresponds with your desired Teams channel
- Destination Site Address to our site address
- Destination Folder to “/Shared Documents/General”
- If another file is already there to “Copy with a new name”

Add a SharePoint Send an HTTP request to SharePoint action and set:
- Site Address to our site address
- Method to “POST”
- Uri to “_api/web/lists/GetByTitle('Documents')/items(@{outputs('Copy_file_-_Template.xlsx')?['body/ItemId']})/validateUpdateListItem”, where ItemId comes from the Copy file action
- Body to:
{
  "formValues": [
    {
      "FieldName": "FileLeafRef",
      "FieldValue": "@{variables('varFileName')}"
    }
  ]
}
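The body this action sends is plain JSON that renames the copied file by updating its FileLeafRef (file name) field. A small sketch that builds the equivalent payload (the file name shown is an example value of varFileName):

```python
import json

def rename_body(new_name: str) -> str:
    """Build the validateUpdateListItem body that sets FileLeafRef (the file name)."""
    payload = {"formValues": [{"FieldName": "FileLeafRef", "FieldValue": new_name}]}
    return json.dumps(payload)

print(rename_body("IssueTracker-2024-05-01.xlsx"))
```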

Create a new SharePoint Get items action and set:
- Site Address to our site address
- List Name to our target list, e.g. “Issue Tracker”
- Filter Query to
Status ne 'Complete'
Note: The filter query should match your requirements. In my scenario, I am only exporting active issues.
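Other filter queries follow the same OData syntax. For example, assuming Priority is a choice column, exporting only high-priority open issues would look like:

Status ne 'Complete' and Priority eq 'High'

Keep in mind the filter uses internal column names, which can differ from the display names shown in the list.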

Create a new Excel Add a row into a table action and set:
- Location to our site address
- Document Library to Documents
- File to the following expression:
concat('/General/', variables('varFileName'))
Note: The “General” folder is hardcoded in the expression and can be changed as needed
- Table to “varTableName”
- Row to:
Pattern:
{
"Excel column name 1": List column name 1,
"Excel column name 2": List column name 2
}
Example:
{
"Title": @{items('Apply_to_each')?['Title']},
"Issue description": @{items('Apply_to_each')?['Description']},
"Priority": @{items('Apply_to_each')?['Priority/Value']},
"Status": @{items('Apply_to_each')?['Status/Value']},
"Date reported": @{items('Apply_to_each')?['DateReported']},
"Due Date": @{items('Apply_to_each')?['Duedate']},
"Person or group the issue is assigned to": @{items('Apply_to_each')?['Assignedto/DisplayName']},
"Issue source": @{items('Apply_to_each')?['IssueSource']},
"Issue logged by": @{items('Apply_to_each')?['Issueloggedby/DisplayName']}
}
Note: Column names should be updated to match your Excel table schema. Power Automate automatically wraps this action in an Apply to each loop when you insert dynamic content from the Get items action.
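Conceptually, each Apply to each iteration maps one list item to one Excel row. A minimal Python sketch of that mapping, using a few field names from the example schema above (the nested lookups mirror how choice columns expose a /Value property):

```python
def item_to_row(item: dict) -> dict:
    """Map one Get items record to the Excel table's columns (example schema)."""
    return {
        "Title": item.get("Title"),
        # Choice columns come back as objects; the display text is under "Value"
        "Priority": (item.get("Priority") or {}).get("Value"),
        "Status": (item.get("Status") or {}).get("Value"),
    }

sample = {"Title": "Printer offline", "Priority": {"Value": "High"}, "Status": {"Value": "Active"}}
print(item_to_row(sample))
```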

My completed Flow looks like the image below.

Save and run your Flow. If all goes well, your list rows will be added to a new Excel file with the current date in the filename.

Manually exporting list data is easy enough, but automating the process can take a bit more work than expected. This Flow pattern can be extended to send the Excel file to users or to integrate with other apps and services.
Thanks for reading.
NY