• How to read in a list of ints to be passed into SQL

    I'm a newbie. I need to load a list of ids (integers) from a file and then pass them into SQL. I can't work out how to do that, because manipulating the list of ints into a comma-separated list leaves me with ...
    maia zoggo
    last modified by maia zoggo
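    A minimal sketch for the question above, assuming plain JDBC and a text file with one integer id per line (the file name, table, and column are hypothetical). Rather than concatenating the ids into the SQL string, it builds one placeholder per id and binds them through a PreparedStatement:

      import java.nio.file.*;
      import java.sql.*;
      import java.util.*;
      import java.util.stream.*;

      public class IdListQuery {
          public static void main(String[] args) throws Exception {
              // Read one integer id per line from a text file (hypothetical path).
              List<Integer> ids = Files.readAllLines(Paths.get("ids.txt")).stream()
                      .map(String::trim).filter(s -> !s.isEmpty())
                      .map(Integer::valueOf)
                      .collect(Collectors.toList());

              // Build "?, ?, ?" with one placeholder per id (an empty list would need special handling).
              String placeholders = ids.stream().map(i -> "?").collect(Collectors.joining(", "));
              String sql = "SELECT * FROM my_table WHERE id IN (" + placeholders + ")";

              try (Connection con = DriverManager.getConnection("jdbc:...");   // connection URL omitted
                   PreparedStatement ps = con.prepareStatement(sql)) {
                  for (int i = 0; i < ids.size(); i++) {
                      ps.setInt(i + 1, ids.get(i));   // JDBC parameters are 1-based
                  }
                  try (ResultSet rs = ps.executeQuery()) {
                      while (rs.next()) {
                          System.out.println(rs.getInt("id"));
                      }
                  }
              }
          }
      }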
  • Pentaho JavaScript - Multi-line field parsing

    I have a multi-line field populated with some specific values that I need to parse and write to new fields. Field:   001 AIR CONDITIONING UNK TYPE UNKNOWN 30.0     005 EXTERIOR WALL FRB FRAME BRICK...
    vikas reddy
    last modified by vikas reddy
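    A minimal parsing sketch for the question above, assuming (hypothetically) that each line of the field consists of a three-digit code, a free-text description, and a trailing decimal value, as in the sample "001 AIR CONDITIONING UNK TYPE UNKNOWN 30.0"; the same regular expression could be used inside the Modified Java Script Value step or a User Defined Java Class:

      import java.util.regex.*;

      public class MultiLineFieldParser {
          // Assumed layout: 3-digit code, description text, trailing decimal value.
          private static final Pattern LINE =
                  Pattern.compile("^\\s*(\\d{3})\\s+(.+?)\\s+(\\d+(?:\\.\\d+)?)\\s*$");

          public static void main(String[] args) {
              String field = "001 AIR CONDITIONING UNK TYPE UNKNOWN 30.0";   // sample line from the post
              for (String line : field.split("\\r?\\n")) {
                  Matcher m = LINE.matcher(line);
                  if (m.matches()) {
                      System.out.printf("code=%s description=%s value=%s%n",
                              m.group(1), m.group(2), m.group(3));
                  }
              }
          }
      }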
  • Unable to launch Spoon client in CentOS 7

    I have just installed the Spoon client in a CentOS 7 virtual machine.  The splash screen appears when spoon.sh is executed.  However, a "buffer overflow" error is then thrown and the Spoon client does not sta...
    Kok Soon Oliver Tan
    last modified by Kok Soon Oliver Tan
  • CsvInput bug with multi-character delimiters

    I have submitted my code here: 7.1--fix the bug which is about multiple chars of delimiter by xiaoaowanghu · Pull Request #5810 · pentaho/pentaho-kettl… But I have only done some simple tests with my code. So wa...
    Xiaoao Wanghu
    last modified by Xiaoao Wanghu
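    The pull request itself is not reproduced here, but as an illustration of the underlying issue, this is a hedged sketch of splitting a CSV line on a multi-character delimiter in Java; Pattern.quote keeps regex metacharacters in the delimiter from being interpreted:

      import java.util.Arrays;
      import java.util.regex.Pattern;

      public class MultiCharDelimiterSplit {
          public static void main(String[] args) {
              String line = "a||b||c";     // hypothetical row using "||" as the delimiter
              String delimiter = "||";

              // A naive split("||") treats '|' as regex alternation and splits between every character;
              // quoting the delimiter splits on the literal two-character sequence instead.
              String[] fields = line.split(Pattern.quote(delimiter));
              System.out.println(Arrays.toString(fields));   // [a, b, c]
          }
      }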
  • mondrian.olap.MondrianException: Mondrian Error:Internal error: Query required more than 12 iterations

    Hi, I am getting the error "query required more than 12 iterations" in Mondrian. I have modified the property mondrian.rolap.evaluate.MaxEvalDepth in mondrian.properties to reduce the number of errors, but I'm still get...
    B241N7QV
    last modified by B241N7QV
  • Avoiding timing conditions with Text File Output step

    I am running into a timing condition between the "Text file output" step and another step (a custom HCP step).  The problem is that zero-length files are being written by the "HCP Put" step.  When obse...
    Clifford Grimm
    last modified by Clifford Grimm
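    Not the custom HCP step's code, but a general-purpose sketch of one way to avoid picking up files that are still being written: wait until the output file exists, is non-empty, and its size has stopped changing before the downstream step touches it (the path and poll timings are hypothetical):

      import java.io.File;

      public class WaitForStableFile {
          /** Returns true once the file is non-empty and its size is unchanged between polls. */
          static boolean waitForStableFile(File f, long pollMillis, int maxPolls) throws InterruptedException {
              long previousSize = -1;
              for (int i = 0; i < maxPolls; i++) {
                  long size = f.exists() ? f.length() : 0;
                  if (size > 0 && size == previousSize) {
                      return true;            // size stable across two consecutive polls
                  }
                  previousSize = size;
                  Thread.sleep(pollMillis);
              }
              return false;                   // gave up: still empty or still growing
          }

          public static void main(String[] args) throws Exception {
              File out = new File("output.txt");   // hypothetical file written by "Text file output"
              System.out.println(waitForStableFile(out, 1000, 30) ? "ready" : "not ready");
          }
      }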
  • Can Pentaho read .pdf files as input?

    Hi Team,   I have a requirement to read a .pdf file which we can't convert into .txt or any other format due to insufficient privileges. Can Pentaho read .pdf files? If yes, can you please suggest how?
    Nilesh Purohit
    last modified by Nilesh Purohit
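    PDI has no dedicated PDF input step; one common workaround, sketched here under the assumption that the Apache PDFBox 2.x jar is available on the classpath (for example inside a User Defined Java Class step), is to extract the text with PDFBox and emit it as a field:

      import java.io.File;
      import org.apache.pdfbox.pdmodel.PDDocument;
      import org.apache.pdfbox.text.PDFTextStripper;

      public class PdfTextExtract {
          public static void main(String[] args) throws Exception {
              // Hypothetical input file; requires the pdfbox jar on the classpath.
              try (PDDocument document = PDDocument.load(new File("input.pdf"))) {
                  String text = new PDFTextStripper().getText(document);
                  System.out.println(text);
              }
          }
      }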
  • Appending new data to existing table based on date

    Hi, I am pretty new to Kettle, so apologies if my question is too simple - I want to create an ETL process that will allow me to check which rows currently exist in the output table (max date, for example) and then them u...
    Alina Lot
    last modified by Alina Lot
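    A minimal JDBC sketch of the incremental pattern the question describes, with hypothetical table and column names: read the maximum load date from the target table, then copy only the newer source rows. In PDI the same idea is usually one Table Input feeding another, with the max date passed in as a parameter or variable:

      import java.sql.*;

      public class IncrementalLoad {
          public static void main(String[] args) throws Exception {
              try (Connection con = DriverManager.getConnection("jdbc:...")) {   // connection URL omitted
                  // 1. Find the newest date already loaded into the target table.
                  Timestamp maxDate;
                  try (Statement st = con.createStatement();
                       ResultSet rs = st.executeQuery("SELECT MAX(load_date) FROM target_table")) {
                      rs.next();
                      maxDate = rs.getTimestamp(1);   // null when the target is still empty
                  }

                  // 2. Append only source rows newer than that date.
                  String sql = "INSERT INTO target_table (id, load_date) "
                             + "SELECT id, load_date FROM source_table WHERE load_date > ?";
                  try (PreparedStatement ps = con.prepareStatement(sql)) {
                      ps.setTimestamp(1, maxDate != null ? maxDate : new Timestamp(0));
                      int inserted = ps.executeUpdate();
                      System.out.println("Rows appended: " + inserted);
                  }
              }
          }
      }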
  • Kerberos Authentication in CE

    Hi everyone!   We are using Data Integration CE to develop some jobs against a Cloudera cluster, and everything runs OK, but now the cluster is being integrated with Kerberos. How can we integrate our Data Integration...
    Landy Reyes
    last modified by Landy Reyes
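    PDI CE's own Kerberos wiring is not shown here, but as a rough sketch of what authenticating a Java client to a Kerberized Hadoop cluster involves (the principal, keytab path, and configuration values are hypothetical), the Hadoop UserGroupInformation API is typically used like this:

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.security.UserGroupInformation;

      public class KerberosLogin {
          public static void main(String[] args) throws Exception {
              Configuration conf = new Configuration();
              // Tell the Hadoop client libraries that the cluster expects Kerberos.
              conf.set("hadoop.security.authentication", "kerberos");
              UserGroupInformation.setConfiguration(conf);

              // Hypothetical principal and keytab; krb5.conf must also point at the right KDC.
              UserGroupInformation.loginUserFromKeytab("etl-user@EXAMPLE.COM",
                                                       "/etc/security/keytabs/etl-user.keytab");
              System.out.println("Logged in as: " + UserGroupInformation.getCurrentUser());
          }
      }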
  • Communications link failure during commit() with MySQL

    I have a problem with MySQL in PDI (Kettle). This error appears while reading information with the Table Input step. Even though all data is retrieved from the database successfully, the error appears and probably doesn't affect t...
    Alex Shumskiy
    last modified by Alex Shumskiy
  • PDI pentaho reporting output parameter issue

    Hi,     I'm using Pentaho Data Integration 8.1 and the Pentaho Reporting Output component inside a transformation. The report contains a simple integer parameter ${batchno} which works fine with the report desi...
    Narasimha Rao Ch
    last modified by Narasimha Rao Ch
  • Getting an out of memory error while processing SQL through Hive on Spark. I am running the transformation in Spoon with a JVM size of 4 GB.

    java.lang.OutOfMemoryError: Java heap space. The result set has only 150 rows. Sometimes the error is thrown on the Select Values step, sometimes on the Write to Log step.
    kuldeep singh
    last modified by kuldeep singh
  • Run this transformation in a clustered mode?

    Hi, in PDI 8, where do I find the "Run this transformation in a clustered mode?" option?   In PDI 6, the option is in the Advanced tab of the transformation job entry, but in PDI 8 it is not. I need this option to ...
    Ricardo Carrera
    last modified by Ricardo Carrera
  • How to get data from a REST API with pagination (example: www.abc.com/data?page=1, ...) when we do not know the number of pages?

    I have a REST API with pagination, and I would like to get all data by automatically passing the page number into the URL: www.abc.com/data?page=1 www.abc.com/data?page=2 .... www.abc.com/data?page=n   n = the ...
    William Nguyen
    last modified by William Nguyen
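    A minimal Java sketch of the paging loop for the question above, assuming (since the real stop condition is not given) that the API signals the last page by returning an empty JSON array; the base URL and stop condition are placeholders. In PDI the same loop is usually modelled as a job that increments a page variable until a "no more data" check fails:

      import java.net.URI;
      import java.net.http.*;

      public class PagedRestFetch {
          public static void main(String[] args) throws Exception {
              HttpClient client = HttpClient.newHttpClient();
              int page = 1;
              while (true) {
                  // Hypothetical endpoint; the page number is appended on each iteration.
                  HttpRequest request = HttpRequest.newBuilder(
                          URI.create("https://www.abc.com/data?page=" + page)).GET().build();
                  String body = client.send(request, HttpResponse.BodyHandlers.ofString()).body();

                  // Assumed stop condition: the API returns an empty array after the last page.
                  if (body == null || body.isBlank() || body.trim().equals("[]")) {
                      break;
                  }
                  System.out.println("Page " + page + ": " + body.length() + " bytes");
                  page++;
              }
          }
      }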
  • Job containing "Build Model" step works in Spoon but fails on Kitchen

    Hi, I am developing a PDI job that: executes a transformation that loads data into a PostgreSQL table; builds a model based on the Table Output step of the transformation using the auto modeler; publishes the model to t...
    Stefano Piva
    created by Stefano Piva
  • I have the following problem when trying to connect to an Oracle database:

    I have the following problem when trying to connect to an Oracle database:   Error connecting to database [DWH] :org.pentaho.di.core.exception.KettleDatabaseException: Error occurred while trying to connect to ...
    Giovani Stefani
    last modified by Giovani Stefani
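    When PDI reports a KettleDatabaseException like the one above, a quick way to separate driver/URL problems from PDI configuration is a bare JDBC test outside Spoon; this sketch uses the Oracle thin driver with placeholder host, service name, and credentials:

      import java.sql.Connection;
      import java.sql.DriverManager;

      public class OracleConnectionTest {
          public static void main(String[] args) {
              // Placeholder host/port/service and credentials; requires the ojdbc*.jar on the classpath.
              String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1";
              try (Connection con = DriverManager.getConnection(url, "dwh_user", "secret")) {
                  System.out.println("Connected: " + con.getMetaData().getDatabaseProductVersion());
              } catch (Exception e) {
                  // The raw SQLException usually pinpoints the cause (listener, service name, credentials).
                  e.printStackTrace();
              }
          }
      }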
  • Database lookup and Database join are driving me crazy

    Hi, I have a transformation where I use a Database Lookup or Database Join step to look up a value from my database. The problem is: it does not find the record in my table! I am 100% sure it is there, and if I run t...
    B241Q6DO
    last modified by B241Q6DO
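    One frequent cause of "the lookup cannot find a row I can see" is an invisible difference in the key (trailing spaces, case, or data type); a hedged diagnostic from plain JDBC, with hypothetical table, column, and key value, compares an exact match against a trimmed, case-insensitive one:

      import java.sql.*;

      public class LookupKeyCheck {
          public static void main(String[] args) throws Exception {
              String key = "ABC123";   // hypothetical key that the lookup step fails to find
              try (Connection con = DriverManager.getConnection("jdbc:...")) {   // connection URL omitted
                  String sql = "SELECT "
                             + "  SUM(CASE WHEN my_key = ? THEN 1 ELSE 0 END) AS exact_match, "
                             + "  SUM(CASE WHEN UPPER(TRIM(my_key)) = UPPER(TRIM(?)) THEN 1 ELSE 0 END) AS loose_match "
                             + "FROM my_table";
                  try (PreparedStatement ps = con.prepareStatement(sql)) {
                      ps.setString(1, key);
                      ps.setString(2, key);
                      try (ResultSet rs = ps.executeQuery()) {
                          rs.next();
                          // exact = 0 but loose > 0 points at whitespace or case differences in the key.
                          System.out.println("exact=" + rs.getInt(1) + " loose=" + rs.getInt(2));
                      }
                  }
              }
          }
      }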
  • PDI 8.1 not starting

    Hello All, I have PDI 8.1 downloaded and installed on my computer (community edition and trial enterprise edition).  It worked perfectly yesterday, and today when I try to start it, the command prompt pops up an...
    Zachary Pappas
    last modified by Zachary Pappas
  • How to pass the log database connection through variables in Kettle?

    Hi everyone, I would like to know if there is a proper way to pass the database connection variables so they can be used in the logging sections of both jobs and transformations.   Regards, Nicolas.
    Nicolas Soria
    last modified by Nicolas Soria
  • Problems reading CSV file

    Hi, I'm working with PDI version 7.1 and I'm trying to read a CSV file with the CSV File Input step, but I'm getting an error that I cannot handle. My CSV file is delimited by the character | (ASCII code 124) and at ...
    Luis Suarez
    last modified by Luis Suarez
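    Outside of the CSV File Input step itself, the pipe character is a common source of trouble because it is a regex metacharacter; a small Java sketch of splitting a "|"-delimited line, with a hypothetical row, shows the escaping that a naive split misses:

      import java.util.Arrays;

      public class PipeDelimitedSplit {
          public static void main(String[] args) {
              String line = "field1|field2|field3";   // hypothetical row delimited by ASCII 124

              // split("|") splits between every character, because '|' is regex alternation;
              // escaping it (or using Pattern.quote) splits on the literal pipe.
              String[] fields = line.split("\\|");
              System.out.println(Arrays.toString(fields));   // [field1, field2, field3]
          }
      }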