Could you please help me find a solution to a problem?
I'm using Pentaho Kettle v6.1 for my data integration. The source system produces a JSON file whose size varies with the load on the source system, from 10 KB to 1 GB or more (in peak hours) per 15 minutes.
Could you please let me know whether it is feasible to create transformations/jobs in Pentaho to load a 1 GB file? If yes, what would be the best approach? If not, how can I achieve this requirement?