Quick Flows - Persist the output of a step in the SharePoint repository via connector

Hi.

Scenario
I am creating a flow with two simple steps:

  • It reads Markdown (.md) files previously made available in “Knowledge” (the integration used was SharePoint), processes them, and produces a concise summary as a .txt file;

  • It uploads that summary back to SharePoint, into a directory separate from the input source.
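For clarity, here is a plain-Python equivalent of what the two steps are meant to do, outside the Quick Flows UI. Everything here is illustrative: `summarize()` is a placeholder for the AI step, and local file I/O stands in for the SharePoint connector.

```python
from datetime import date
from pathlib import Path

def summarize(markdown: str) -> str:
    # Placeholder for the "AI Responses" step: just keep each document's
    # first line, stripped of Markdown heading markers.
    first_line = markdown.strip().splitlines()[0] if markdown.strip() else ""
    return first_line.lstrip("# ").strip()

def daily_summary(md_dir: Path, out_dir: Path, day: date) -> Path:
    """Step 1: read the .md files; step 2: write the summary as .txt."""
    lines = [summarize(p.read_text(encoding="utf-8"))
             for p in sorted(md_dir.glob("*.md"))]
    out_dir.mkdir(parents=True, exist_ok=True)
    out_path = out_dir / f"{day.isoformat()}-resumo-diario.txt"
    out_path.write_text("\n".join(lines), encoding="utf-8")
    return out_path
```

The output filename pattern (`YYYY-MM-DD-resumo-diario.txt`) matches the one shown in the error message below.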

Implementation
To address this scenario, I used the “AI Responses / Quick Suite Data” steps to generate the content, and chained the result through “Actions / Application Actions” for persistence.

Result
When testing the flow manually (User interaction), everything works perfectly.

Problem
When I configure a trigger for periodic execution, step 1 generates the content correctly and passes it to step 2, but step 2 fails to complete the upload, showing this error:

Unfortunately, I was unable to complete the upload of the file 2026-05-07-resumo-diario.txt to the folder “path/resumo-diario” on SharePoint. The system returned a validation error related to the file’s content type.

I was able to correctly locate the destination folder, but the upload operation failed due to a technical limitation in processing the text file content.

Here are a few alternatives I can suggest:

  • Manual upload: You can upload the file directly to SharePoint by accessing the folder here (copy and paste the URL into your browser): url.
  • Use a Quick Flow: If this is a recurring task, consider creating a Flow in the Amazon Quick side navigation to automate this daily upload process.

Sorry for the inconvenience! Let me know if I can help with anything else.

  • The limitation is unclear;

  • The error is unclear;

  • The suggestion is to perform it manually (which we already know works);

  • The suggestion is to use a Flow (?!)
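For reference on the content-type angle the error mentions: when uploading a .txt file directly through Microsoft Graph's simple-upload endpoint (one way SharePoint uploads are done outside the connector), sending an explicit `Content-Type` header avoids ambiguity about the body. A minimal sketch that only builds the request, with the site ID, folder, and token as placeholders:

```python
from urllib.request import Request

GRAPH = "https://graph.microsoft.com/v1.0"

def build_upload_request(site_id: str, folder: str, filename: str,
                         content: bytes, token: str) -> Request:
    """Build (but do not send) a Graph simple-upload PUT request.

    The explicit Content-Type header declares the body as plain text,
    which is exactly the piece of metadata the connector's error
    complains about.
    """
    url = f"{GRAPH}/sites/{site_id}/drive/root:/{folder}/{filename}:/content"
    return Request(
        url,
        data=content,
        method="PUT",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "text/plain; charset=utf-8",
        },
    )
```

This is not the connector's internal mechanism, just a way to show that the destination and file type are unambiguous when stated explicitly.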

Does anyone have any ideas? Technically, it should work.

Hi @raphael_moreira ,

That is a very odd scenario, and from how I am reading it, you are not wrong to expect this to work. The only difference between the two runs is how the flow is triggered (manually vs. on a schedule), and that by itself should not change how the flow is carried out. This leads me to believe something is happening under the hood that causes the error only for scheduled runs. I would suggest creating a support ticket so the support team can investigate the cause further. Please refer to this resource on how to create one (Case management - AWS Support).