How to override DataSources when using Folder event?


I’m using start_asset_bundle_export_job/start_asset_bundle_import_job triggered by the folder event(s).

When I add a dashboard that uses 2 datasets (CSV files), I see that the dashboard in the target account still points to the source account's data sources. In other words, the import job carries the source-account data source information over into the target account.

To fix that, I am trying to use OverrideParameters. With the CLI, I can unzip the exported .qs file and easily pick out the data source IDs from the /datasource folder, then pass those into OverrideParameters:

OverrideParameters={
    'DataSources': [
        {
            'DataSourceId': 'xxxx',
            'DataSourceParameters': {
                'S3Parameters': {
                    'ManifestFileLocation': {
                        'Bucket': stg_bucket_name,
                        'Key': 'xxxxxxxxxx'
                    }
                }
            }
        },
        {
            'DataSourceId': 'yyyyy',
            'DataSourceParameters': {
                'S3Parameters': {
                    'ManifestFileLocation': {
                        'Bucket': stg_bucket_name,
                        'Key': 'yyyyyyyyyyyyyy'
                    }
                }
            }
        }
    ]
}
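For context, this is roughly how that override gets passed into the import call. It is a minimal sketch only; the account ID, job ID, bundle location, and bucket name below are placeholders, not my real values.

# Minimal sketch of passing the DataSources override into the import job.
# All identifiers here are placeholders.
import boto3

qs = boto3.client('quicksight')
stg_bucket_name = 'my-target-manifest-bucket'  # placeholder target bucket

qs.start_asset_bundle_import_job(
    AwsAccountId='123456789012',                     # target account (placeholder)
    AssetBundleImportJobId='my-import-job',          # any unique job id (placeholder)
    AssetBundleImportSource={
        'S3Uri': 's3://my-bundle-bucket/export.qs'   # exported bundle (placeholder)
    },
    OverrideParameters={
        'DataSources': [
            {   # one entry per data source in the bundle
                'DataSourceId': 'xxxx',
                'DataSourceParameters': {
                    'S3Parameters': {
                        'ManifestFileLocation': {
                            'Bucket': stg_bucket_name,
                            'Key': 'xxxxxxxxxx'
                        }
                    }
                }
            }
        ]
    },
    FailureAction='ROLLBACK'
)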

But since I want to do this with the folder event, I am trying to find a better way to handle it in the Lambda code.

Especially getting the DataSourceId is the challenge. Can I get those IDs without unzipping the bundle, or do I have to implement the unzipping inside the Lambda as well?

Hi @tbdori, what resource are you using as the parameter for start-asset-bundle-export-job? Is it the dashboard or the analysis?

When exporting, I set IncludeAllDependencies=True. Ideally, I use 'Dashboard' to initiate the event so that all dependent assets are included. With a specific data source (especially the S3 file type), it looks like I need to unzip the bundle to find the data source ID, and I also need to override the bucket name to the target account's bucket so I can pass this information in as parameters.

I currently do that in Lambda: download the bundle to /tmp, unzip it, locate the data source files, read the IDs and bucket names, change them, and pass them in as parameters when I call the import. This is doable, but I wonder if there is a better approach. A simplified sketch of those steps is below.
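This is only a rough sketch of the steps I described, not my exact code. The helper name, the bucket/key arguments, and in particular the JSON field names inside the datasource/ files are assumptions; inspect one file from your own exported bundle and adjust the key names to match.

# Sketch: download the exported bundle to /tmp, unzip it, read the files under
# datasource/, and build the OverrideParameters for the import call.
import json
import os
import zipfile

import boto3

s3 = boto3.client('s3')

def build_datasource_overrides(export_bucket, export_key, target_bucket):
    # Download the exported .qs bundle (it is a ZIP archive) to Lambda's /tmp.
    local_zip = '/tmp/bundle.qs'
    s3.download_file(export_bucket, export_key, local_zip)

    extract_dir = '/tmp/bundle'
    with zipfile.ZipFile(local_zip) as zf:
        zf.extractall(extract_dir)

    overrides = []
    datasource_dir = os.path.join(extract_dir, 'datasource')
    for name in os.listdir(datasource_dir):
        with open(os.path.join(datasource_dir, name)) as f:
            doc = json.load(f)
        # Field names below are assumptions; confirm against a real bundle file.
        ds_id = doc.get('dataSourceId') or doc.get('DataSourceId')
        key = (doc.get('dataSourceParameters', {})
                  .get('s3Parameters', {})
                  .get('manifestFileLocation', {})
                  .get('key', ''))
        overrides.append({
            'DataSourceId': ds_id,
            'DataSourceParameters': {
                'S3Parameters': {
                    'ManifestFileLocation': {
                        'Bucket': target_bucket,  # repoint at the target account's bucket
                        'Key': key
                    }
                }
            }
        })
    return {'DataSources': overrides}

The returned dict can then be passed as OverrideParameters to start_asset_bundle_import_job, as in the earlier snippet.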