Dataset loads weirdly (S3 bucket with CSVs)


I’ve created a manifest JSON file pointing to my S3 bucket of CSVs (the bucket is organised into folders: year > month > day).

When I point to the whole bucket, the first few rows come back as gibberish. If I skip those rows, I lose my header.

However, if I point to randomly selected months, it can either load fine (automatically skipping the first 10 rows and returning the data with headers) or come back as gibberish again.
Stranger still: when I selected April and November together, the data loaded fine with the 10-row skip, but April on its own was gibberish.

Does anyone know why this is happening?
The engineering team have checked that the schema is fine.

Hmmm, are you sure you are using URI prefixes?

Hi Max,

Thanks for your reply.
Would an incorrect URI prefix cause the data to load oddly? I would have thought it would just refuse to connect.
My manifest file is below; the data can load in without any issues, but the folder combination that works seems random.

    "fileLocations": [
            "URIPrefixes": [
                "<Bucket Name>/"
    "globalUploadSettings": {
        "format": "CSV",
        "delimiter": ",",
        "containsHeader": "true"

I was thinking that you might have URIs instead of URI prefixes.
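For reference, the two forms look like this in a manifest (bucket and path names here are placeholders): `"URIs"` lists exact files, while `"URIPrefixes"` pulls in everything under a folder.

```json
{
    "fileLocations": [
        { "URIs": ["s3://my-bucket/2023/04/01/data.csv"] },
        { "URIPrefixes": ["s3://my-bucket/2023/"] }
    ]
}
```

With a prefix, any non-CSV object sitting under that folder gets picked up too.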

Another question: is there any other data in the folder that isn’t in this format?

There is a _SUCCESS file in the folder.
When the Python script creates a CSV successfully, it writes a _SUCCESS marker file alongside it.
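One quick way to check whether a prefix contains anything besides CSVs is to list the keys and filter them. This is only a sketch: `keys` stands in for a real S3 listing (e.g. a paginated `list_objects_v2` call via boto3); the filter itself is the point.

```python
def loadable_csv_keys(keys):
    """Keep only .csv objects; drop _SUCCESS markers and any other files."""
    return [k for k in keys if k.lower().endswith(".csv")]

# Example keys as they might appear under a year/month/day layout
keys = [
    "2023/04/01/part-00000.csv",
    "2023/04/01/_SUCCESS",
    "2023/04/02/part-00000.csv",
]
print(loadable_csv_keys(keys))
# → ['2023/04/01/part-00000.csv', '2023/04/02/part-00000.csv']
```

Anything the filter drops is a candidate for what the dataset is choking on.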

That might be the issue. I would look at opening a case with AWS QuickSight support.

I wonder if there is a way in the manifest file to exclude certain file types like that. I don’t know if you can, and I couldn’t find any documentation for that specific use case.
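If the manifest can’t exclude files, one workaround (a hedged sketch, not a confirmed fix) is to copy only the .csv objects into a separate "clean" prefix and point the manifest there. The bucket and prefix names below are placeholders; the key-mapping logic is split out so it can be checked without touching S3.

```python
def clean_key(key, src_prefix="raw/", dst_prefix="clean/"):
    """Map a source object key to its destination under the clean prefix."""
    assert key.startswith(src_prefix)
    return dst_prefix + key[len(src_prefix):]

def copy_csvs(bucket, src_prefix="raw/", dst_prefix="clean/"):
    """Copy every .csv object under src_prefix to dst_prefix,
    skipping _SUCCESS markers and anything else that isn't a CSV."""
    import boto3  # deferred so clean_key above is testable offline
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=src_prefix):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            if key.lower().endswith(".csv"):
                s3.copy_object(
                    Bucket=bucket,
                    Key=clean_key(key, src_prefix, dst_prefix),
                    CopySource={"Bucket": bucket, "Key": key},
                )
```

You would then set `"URIPrefixes"` in the manifest to the clean prefix instead of the raw one.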

Here are the steps to open a support case. If your company has someone who manages your AWS account, you might not have direct access to AWS Support and will need to raise an internal ticket with your IT team or whoever manages your AWS account. They should be able to open an AWS Support case on your behalf. Hope this helps!