So, to get the exact value of the timestamp, I have to import the data as a string. What I did not realize is that QuickSight only has a "Date" option. Is there a way I can convert a string value into a timestamp or datetime value?
Any help will be greatly appreciated!
You can also change the data type of a column in the dataset.
Thank you @ErikG for the reference. I'll be sure to try this out.
@thomask, I actually tried this before posting, but here is the issue: you can change a string that is in a date-only format, but in my case I have timestamps in more of a datetime format, which doesn't seem to be supported when changing data types in Edit dataset. Thanks for the suggestion, though.
Hi @Wvelasqu - Welcome to the AWS QuickSight community, and thanks for posting the question. parseDate is a good option, as suggested by @ErikG; however, if you can send some sample data, it will be easier to guide you to the right solution.
Regards - Sanjeeb
Thanks for the welcome, @Sanjeeb2022,
and yes, the format I have is the following:
----> 2022-08-23 00:00:00.0 (currently in string format)
If I try the date conversion in Edit dataset, this is the error that comes up:
"We encountered an error trying to validate the date format. Please try again.
Provide the date format which represents this field. Formats are case sensitive.
For example, dd/MM/yyyy HH:mm:ss translates to 31/08/2017 23:59:59"
Of course, because the format of a timestamp is not the same as the format of a date field.
I do need to use those fields as filters later on, and for that I need the date field instead.
Hope this helps.
Agreed, parseDate() is the right way to go here. QuickSight calls the data type "Date," but it can also contain time information. Based on the one sample string value you shared, I think the format in QuickSight should be "yyyy-MM-dd HH:mm:ss.S". I'm not sure whether it will accept only one digit for milliseconds; if that doesn't work, try "yyyy-MM-dd HH:mm:ss.SSS" instead.
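For reference, a calculated field using that format would look roughly like this (the field name `last_update_str` is just a placeholder for your actual string column):

```
parseDate({last_update_str}, 'yyyy-MM-dd HH:mm:ss.S')
```

The second argument describes the format your string is already in, not the format you want displayed.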
but in this case, is the format I pass along with the column I want to parse the format I want it to become, or the format the timestamp already has?
In the formula you are providing the format of your actual data, not what you want it to be. You can control the display format in the field formatting properties itself (either from the list of fields, or from the field well where you are using the field in the visual).
Oh! I see, that was my confusion when I tried it.
Just a quick clarification while I process this: let's say I have 25 different columns as timestamps. Will I have to create another 25 calculated fields with parseDate (basically clones of the originals so I can manipulate the datetime values)?
Usually we would just convert the field type in the dataset itself. When you change it from string to date, it should ask you for the format of your existing values, so you don't need to write any calcs. Sometimes it doesn't like the format and won't let you convert (I don't have an explanation for when or why; different underlying data sources behave differently, and some can't convert), which is when I resort to creating a calc with parseDate, but typically I just convert the actual data type. If we are resorting to a calc, then yes, we need one for every date field.
That is what I did, actually. I made a lot of transformations in the dataset itself because of its complexity, and changing data types was one of them. However, when I did the ingestion into QuickSight, it did not recognize the data type. So, to avoid more issues (because I was getting them with other fields as well), I switched them all to strings so I could choose what to convert later, but then ran into this limitation in QuickSight. The main issue in my case is that other fields depend on those dates/timestamps.
Let's say a user changes one value in the ERP: it records the timestamp (and updates the "last update" timestamp in multiple places; I had to find multiple ways to capture that). But if I only get the date and not the time, it will show a lot of redundancy, because many changes can be made on the same day. Hopefully this little detail helps clarify. I'll try a different approach and then use parseDate, because otherwise there will be so many extra columns added to each dataset. I appreciate the time taken to respond.
Hi @Wvelasqu - Thanks for the details. @Jesse is right: please try parseDate with the actual format so that QuickSight is able to read it. On another note, if you have lots of transformations that depend on this format, can you also try to push these down to the ELT/ETL layer and do the reporting on the transformed data? Also, what is the source of the data; is it relational? If it is RDS or Redshift, you can also explore the custom SQL functionality, and some of the calculations can be done at the SQL layer before ingesting into QuickSight.
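As a sketch of the custom SQL approach, assuming a SQL Server-style source and a hypothetical string column `last_update_str` on a hypothetical table `my_table`, the conversion could be pushed down to the data layer like this:

```sql
-- TRY_CONVERT returns NULL instead of failing on unparseable values
SELECT TRY_CONVERT(datetime2, last_update_str) AS last_update_ts
FROM my_table;
```

Converting at the SQL layer means the column arrives in QuickSight already typed as a datetime, so no per-field parseDate calcs are needed.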
If we were able to answer your question, please mark @Jesse's reply as the solution so that it can help other community members.
Regards - Sanjeeb
Actually, @ErikG gave the reference for the solution, but I do thank @Jesse for elaborating on my other questions, which helped clarify which route to take.
And yes, Sanjeeb, that is what I'm currently doing: working directly with the datasets before ingesting them again into QuickSight, so I don't have to do so much manipulation with the dates. To answer your question about the database: I ingest the data from a SQL Server database (our source), which is in a kind of read-only mode, and then I do all the manipulations while bringing it over from the data lake to the data warehouse before ingesting it into QuickSight.
Thank you all for your help on this one.