I installed Kylo version 0.10.1 with HDP version 3.1.0.0-78.
I had to change NiFi to use Spark2 (the version available in HDP 3.1).
Using the Kylo standard-ingest template, the process runs to the end without errors.
But at the end, only the _feed and _profile tables have records. The main table and the _valid and _invalid tables are empty.
I think the "Validate And Split Records" step (which uses ExecuteSparkJob) is not working, but I cannot find any error in the logs, only warnings.
I opened a Hive session and confirmed that _profile and _feed are OK, but there is no data in _valid, _invalid, or the main table.
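The checks I ran in the Hive session were along these lines (my_category and my_feed are placeholders for my actual category and feed names, following Kylo's standard-ingest table naming):

    -- counts come back non-zero for the _feed and _profile tables
    SELECT COUNT(*) FROM my_category.my_feed_feed;
    SELECT COUNT(*) FROM my_category.my_feed_profile;

    -- but all of these return 0 rows
    SELECT COUNT(*) FROM my_category.my_feed_valid;
    SELECT COUNT(*) FROM my_category.my_feed_invalid;
    SELECT COUNT(*) FROM my_category.my_feed;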
Looking at HDFS in HDP, the data files for the _valid table were created.
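To find where those files live, I looked up the table's HDFS location (again with placeholder names; the Location field in the output points to the directory that does contain the data files):

    DESCRIBE FORMATTED my_category.my_feed_valid;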
Attached are the Kylo and NiFi logs of a complete execution, along with screenshots of the execution and the profile.
Can you help me?