Hitachi Vantara Pentaho Community Forums

Thread: Kettle 4.2.0 stable posted on SourceForge

  1. #1
    Join Date
    Nov 1999
    Posts
    9,729

    Kettle 4.2.0 stable posted on SourceForge

    Dear friends,

    Get your new release over here on SourceForge.

    Here are some of the new things in this version:

    • The Excel Writer step offers advanced Excel output functionality to control the look and feel of your spreadsheets.
    • Graphical performance and progress feedback for transformations
    • The Google Analytics step allows you to download statistics from your Google Analytics account
    • The Pentaho Reporting Output step makes it possible for you to run your (parameterized) Pentaho reports in a transformation. It allows for easy report bursting of personalized reports.
    • The Automatic Documentation step generates (simple) documentation of your transformations and jobs using the Pentaho Reporting API.
    • The Get repository names step retrieves job and transformation information from your repositories.
    • The LDAP Writer step
    • The Ingres VectorWise (streaming) bulk loader step
    • The Greenplum (streaming) bulk loader step (for gpload)
    • The Talend Job Execution job entry
    • Health Level 7 (HL7): the HL7 Input step, plus the HL7 MLLP Input and HL7 MLLP Acknowledge job entries
    • The PGP File Encryption, Decryption & Validation job entries facilitate encryption and decryption of files using PGP.
    • The Single Threader step for parallel performance tuning of large transformations
    • Allow a job to be started at a job entry of your choice (continue after fixing an error)
    • The MongoDB Input step (including authentication)
    • The ElasticSearch bulk loader
    • The XML Input Stream (StAX) step reads huge XML files at optimal performance and with flat memory usage by flattening the structure of the data (see the StAX sketch after this list).
    • The Get ID from Slave Server step allows multi-host or clustered transformations to get globally unique integer IDs from a slave server: http://wiki.pentaho.com/display/EAI/...m+Slave+Server
    • Carte improvements:
      1. reserve next value range from a slave sequence service
      2. allow parallel (simultaneous) runs of clustered transformations
      3. list (reserved and free) socket reservations service
      4. new options in XML for configuring slave sequences
      5. allow time-out of stale objects using environment variable KETTLE_CARTE_OBJECT_TIMEOUT_MINUTES

    • Memory tuning of the logging back-end with KETTLE_MAX_LOGGING_REGISTRY_SIZE, KETTLE_MAX_JOB_ENTRIES_LOGGED and KETTLE_MAX_JOB_TRACKER_SIZE, allowing for flat memory usage for never-ending ETL in general and jobs specifically (see the kettle.properties sketch after this list).
    • Repository Import/Export
      1. Export at the repository folder level
      2. Export and Import with optional rule-based validations
      3. Import command line utility allows for rule-based (optional) import of lists of transformations, jobs and repository export files (see the example after this list): http://wiki.pentaho.com/display/EAI/...+Documentation

    • ETL Metadata Injection:
      1. Retrieval of rows of data from a step back to the “Metadata Injection” step
      2. Support for injection into the “Excel Input” step
      3. Support for injection into the “Row normaliser” step
      4. Support for injection into the “Row Denormaliser” step

    • The Multiway Merge Join step (experimental) allows any number of data sources to be joined on one or more keys, using an inner or a full outer join algorithm.
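
    For those curious about the streaming approach the XML Input Stream step builds on, here is a minimal sketch of StAX pull parsing in plain Java. It illustrates why memory stays flat (no DOM tree is ever built); the file name huge.xml is just a placeholder, and this is not the step's actual implementation:

      import java.io.FileInputStream;
      import javax.xml.stream.XMLInputFactory;
      import javax.xml.stream.XMLStreamConstants;
      import javax.xml.stream.XMLStreamReader;

      public class StaxSketch {
          public static void main(String[] args) throws Exception {
              XMLInputFactory factory = XMLInputFactory.newInstance();
              try (FileInputStream in = new FileInputStream("huge.xml")) {
                  XMLStreamReader reader = factory.createXMLStreamReader(in);
                  // Pull one event at a time: nothing accumulates in memory,
                  // so even a multi-gigabyte file parses with a flat footprint.
                  while (reader.hasNext()) {
                      if (reader.next() == XMLStreamConstants.START_ELEMENT) {
                          System.out.println("element: " + reader.getLocalName());
                      }
                  }
                  reader.close();
              }
          }
      }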

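    As mentioned in the list above, the new logging and Carte variables can be set in kettle.properties. The variable names come from this release; the values below are example assumptions only, to be tuned for your own load:

      # Cap the size of the central logging registry (example value)
      KETTLE_MAX_LOGGING_REGISTRY_SIZE=10000
      # Limit the number of job entry results kept in memory per job (example value)
      KETTLE_MAX_JOB_ENTRIES_LOGGED=1000
      # Limit the size of the job tracker (example value)
      KETTLE_MAX_JOB_TRACKER_SIZE=1000
      # Have Carte time out stale objects after this many minutes (example: one day)
      KETTLE_CARTE_OBJECT_TIMEOUT_MINUTES=1440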

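    And a rough example of invoking the import command line utility against a repository. The repository name, credentials and file names are placeholders, and the exact option names should be checked against the wiki page linked above:

      sh import.sh -rep=PRODUCTION -user=admin -pass=admin \
        -dir=/ -file=export.xml -rules=import-rules.xml \
        -comment="4.2.0 import" -replace=true -coe=false
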
    Enjoy,

    Matt

  2. #2
    Join Date
    Feb 2011
    Posts
    840


    Congrats on the new release, Matt! Already downloaded and using it =)
    Join us on IRC! =)

    Twitter / Google+ / Timezone: BRT-BRST
    BI Server & PDI 5.4 / MS SQL 2012 / Learning CDE & CTools
    Windows 8 64-bit / Java 7 (jdk1.8.0_75)

    Quote Originally Posted by gutlez
    PLEASE NOTE: No forum member is going to do your work for you. We will help you sort out how to do a specific part of the work, as best we can, in the timelines that our work will allow us.

    I'm no expert. Take my comments at your own risk.

  3. #3
    Join Date
    Nov 1999
    Posts
    9,729


    Thank you all for your continued support and encouragement.
    A special thank you also needs to go to the whole Pentaho QA team for spending weeks testing fixes and features.

  4. #4
    Join Date
    Nov 2010
    Posts
    10


    I have downloaded Kettle 4.2.0; there is Hadoop inside.

    Hi Matt, do you have a simple project which uses Hadoop (Hadoop File Output, Hadoop File Input, Hive, etc.)?

    I need it to test using Hadoop in Data Integration.

    I have used the reference from http://sandbox.pentaho.com/2011/04/c...doop-tutorial/ but it failed.

    I hope there is a simple sample job/transformation for the new Hadoop items in Pentaho Data Integration (Hadoop File Output, Hadoop File Input, HBase, Hive, etc.).

    Tks
