
Running a Spark Transformation through java.

Question asked by nathan steele on Aug 2, 2018

I am trying to run a simple Spark transformation (containing a Generate Rows step and a Write To Log step) using Kettle 8. This works fine for me in Spoon using the Spark run configuration. Now I would like to run this same Spark transformation from Java. I have included all the Kettle jars. The code below runs the .ktr as a normal (local-engine) transformation:


 public boolean execute(String ktrFile) throws KettleException {
        // Parse the .ktr file and build the transformation
        TransMeta transMeta = new TransMeta(ktrFile);
        Trans trans = new Trans(transMeta);

        // Run with the default (local) engine and block until it finishes
        trans.execute(null);
        trans.waitUntilFinished();

        if (trans.getErrors() > 0 || trans.isStopped()) {
            throw new RuntimeException("There were errors during transformation execution.");
        }
        return true;
    }

But could anyone tell me what changes I need to make so that I can execute the transformation as a Spark transformation (i.e. Engine: Spark, Protocol: http://, Spark host URL: x.x.x.x:53000)?


Simply put: how do I attach Spark's run configuration to a Trans at execution time?


Could someone please help me with this?