Andolasoft leverages Pentaho to deliver end-to-end data integration, analytics and data mining capabilities.
Pentaho offers a subscription-based enterprise edition that runs on-premises. We employ Kettle, Pentaho's free and open-source ETL (Extract, Transform, Load) tool, to gather data from various data sources and consolidate it into a single, unified location.
Pentaho provides powerful analytics capabilities that enable organizations to collect and refine different types of data, promoting more accurate and timely decision-making.
Solutions Delivered
Pentaho is a leading business intelligence tool that makes it possible for organizations to easily access, organize and analyze data, enabling businesses to derive useful information for strategic decision-making.
Pentaho is one of the leading business intelligence platforms in the world, and our developers are well versed in all of its different aspects. Our dedicated Pentaho developers have the technical experience and expertise to help you monetize your ideas.
We begin our Pentaho development by gathering and understanding your business objectives. After assessing and establishing those objectives, we move to the design and assembly stage, followed by implementation, testing and monitoring.
We work with you every step of the way, empowering you with the business intelligence strategies you need to gain a competitive market advantage.
You can run multiple copies of your output step and connect them to the preceding step, choosing 'Distribute rows' when you create the hop.
You can use the bulk-loading steps, which include the Vertica Bulk Loader, Oracle Bulk Loader, MySQL Bulk Loader, etc.
No; by default, the steps within a transformation run in parallel, while job entries run in sequence. Changing this would require an architecture change that might affect performance.
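As a rough illustration of that parallel architecture, the sketch below launches a transformation through PDI's Java API (assuming the pentaho-kettle core libraries are on the classpath; the file path is hypothetical): Trans.execute() starts every step as its own thread, and waitUntilFinished() waits for all of them.

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformation {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();                                  // initialize the Kettle runtime
        TransMeta meta = new TransMeta("/etl/load_customers.ktr"); // hypothetical transformation file
        Trans trans = new Trans(meta);
        trans.execute(null);                                       // starts all step threads in parallel
        trans.waitUntilFinished();                                 // blocks until every step thread completes
        if (trans.getErrors() > 0) {
            throw new RuntimeException("Transformation finished with errors");
        }
    }
}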
You can calculate aggregate functions over the whole dataset by leaving 'The fields that make up the group' table blank in the 'Group By' step.
No, we cannot form a loop inside the same transformation, but we can form a loop across transformations and jobs inside a job.
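By way of contrast with the transformation example above, a job (where such loops live) runs its entries one after another. A minimal sketch using the same PDI Java API, again with a hypothetical file path:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class RunJob {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();                                      // initialize the Kettle runtime
        JobMeta jobMeta = new JobMeta("/etl/nightly_load.kjb", null);  // hypothetical job file, no repository
        Job job = new Job(null, jobMeta);
        job.start();                                                   // job entries execute one after another
        job.waitUntilFinished();
        Result result = job.getResult();
        if (result.getNrErrors() > 0) {
            throw new RuntimeException("Job finished with errors");
        }
    }
}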
We can execute a prepared (parameterized) SQL statement directly against the database in the 'Database Join' step, for example SELECT name FROM customers WHERE customer_id = ? with the parameter supplied from the incoming stream, whereas we cannot do that in a regular join step, which only merges streams already flowing through the transformation.
The 'Call DB Procedure' step needs to be triggered by an incoming row. Use a 'Generate Rows' (row generator) step to generate, for example, one empty row and link it with a hop to the 'Call DB Procedure' step.
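What the step issues for each triggering row is roughly equivalent to an ordinary stored-procedure call over JDBC, as in the plain-JDBC sketch below (the connection URL, credentials and procedure name are hypothetical, and a suitable JDBC driver is assumed to be on the classpath):

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class CallProcedure {
    public static void main(String[] args) throws Exception {
        // Hypothetical MySQL connection and procedure name, purely for illustration.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/sales", "etl_user", "secret");
             CallableStatement cs = conn.prepareCall("{call refresh_sales_summary()}")) {
            cs.execute(); // one call per triggering row, which is what the single generated row provides
        }
    }
}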
Yes, you can use the 'Get System Info' step in a transformation to get the Pentaho (Kettle) version: in the 'Type' column, choose 'Kettle version'.
With every upgrade, Pentaho may enhance a step, remove a step, replace a step or add configuration options. Steps that are superseded in this way and are no longer recommended for use are called deprecated steps.
Dashboard metrics can be exported in Excel, CSV, PNG, and PDF.
Database tables, CSV files and SQL queries are the inputs used to create a data source.
An open-source framework that allows the creation of highly customizable dashboards on top of the Pentaho Business Intelligence server. It is based on web development standards such as CSS, HTML5 and JavaScript (leveraging commonly used frameworks like jQuery and Bootstrap).
Pentaho provides a flexible API with full documentation to integrate visualizations from third-party libraries such as D3 or Fusion Charts.
Standard data sources such as SQL, MDX, Pentaho Metadata, etc. are supported; in addition, Kettle transformations can be used on the fly, so there are no limits to where your data comes from or how source data is combined.
Yes, in Pentaho it is very easy to translate all resources (user console, menus, labels, etc., except the data displayed from the database) from English into other languages. Thirteen translations are currently available, and there is also an option to propose a new translation. The Pentaho language pack applies to both the Community and Enterprise Editions.
Windows, Android, iPhone / iPad, Mac
Relational databases, analytic databases, NoSQL databases, Hadoop and other data sources such as files, business applications, etc.
Many people in your organization spend time manually producing reports: downloading data into Excel, manipulating it and then emailing it out. This is costly and error-prone.
Pentaho Data Integration provides advanced clustering and partitioning capabilities that allow organizations to scale out their data integration deployments.
Data Ingestion, Manipulation and Integration; Enterprise and Ad Hoc Reporting; Data Discovery and Visualization; and Predictive Analytics.
The Pentaho Data Science Pack operationalizes analytical modeling and machine learning while allowing data scientists and developers to offload the labor of data preparation to Pentaho Data Integration.
It cannot be added directly in the Pentaho application. Download the schema file, update it using Schema Workbench, and then publish the updated schema file.
Central Authentication Service (CAS) and Integrated Windows Authentication (IWA) are supported.