Software Development and Testing at Jedox

Jedox software consists of many components, developed with diverse technologies in different product teams and seamlessly integrated into a single Jedox platform. Project teams are assembled as needed to work on specific aspects of each project, and team members may shift from one area to another to tackle a technological challenge defined by a specific business need.

As a framework for developing business applications with multiple front ends, Jedox offers users a great degree of freedom and flexibility in how the software is used. Combined with the rapid pace of development, this means that Quality Assurance at Jedox must cope with constant changes to the software and ensure that it meets and exceeds the desired quality in functionality, stability, and performance.

Thus, the testing process at Jedox makes use of multiple approaches, ranging from unit tests through API and integration tests to end-to-end tests, both automated and manual, executed on various platforms (Windows, Linux, 32-bit, 64-bit, etc.). The automated testing process uses both existing tools and self-developed software. The test-case base is constantly being enlarged, based on real-world experience, pragmatic test design, and functionality specifications.

The Jedox In-Memory DB (together with the Jedox Supervision Server) is tested daily by running hundreds of scripted tests against its HTTP API. Apache JMeter is used for this purpose, embedded in a self-designed framework that manages test execution. Functionality, performance, and regression tests are executed, and results are reported in a common, self-developed Test Monitor, which is also used by other automated test frameworks.
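To illustrate the flavor of such scripted HTTP API tests, here is a minimal sketch in Python. The port, endpoint path, session parameter, and latency budget are illustrative assumptions rather than the documented Jedox API; the actual tests run as JMeter plans inside the framework described above.

```python
import time
import requests

BASE_URL = "http://localhost:7777"  # hypothetical In-Memory DB address

def run_check(path: str, params: dict, expect_status: int = 200) -> float:
    """Issue one request, assert the HTTP status, and return the latency."""
    start = time.perf_counter()
    response = requests.get(f"{BASE_URL}{path}", params=params, timeout=10)
    elapsed = time.perf_counter() - start
    assert response.status_code == expect_status, (
        f"{path}: expected {expect_status}, got {response.status_code}")
    return elapsed

# One functionality check that doubles as a crude performance check.
latency = run_check("/server/databases", {"sid": "dummy-session-id"})
assert latency < 1.0, f"response took {latency:.2f}s, budget is 1.0s"
print(f"OK in {latency:.3f}s")
```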

The Jedox Integration Server is tested via a self-designed framework. During test execution, approximately 100 pre-defined Integrator projects (both lab-designed and real-world-derived) are executed, and the job execution process, the job results, and the performance are monitored. Results are reported in the aforementioned Test Monitor.
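A framework of this kind typically starts a job, polls its status, and records the outcome and runtime. The sketch below illustrates that loop; the /jobs endpoints and status values are hypothetical assumptions for illustration, since the actual Integration Server interface is not described here.

```python
import time
import requests

BASE_URL = "http://localhost:7775"  # hypothetical Integration Server address

def run_job(project: str, job: str, timeout_s: float = 600.0) -> dict:
    """Start a job, poll until it finishes, and return a result record."""
    start = time.perf_counter()
    resp = requests.post(f"{BASE_URL}/jobs", json={"project": project, "job": job})
    resp.raise_for_status()
    job_id = resp.json()["id"]
    status = "running"
    while status not in ("completed", "failed"):
        if time.perf_counter() - start > timeout_s:
            status = "timeout"
            break
        time.sleep(2)
        status = requests.get(f"{BASE_URL}/jobs/{job_id}").json()["status"]
    return {"project": project, "job": job, "status": status,
            "runtime_s": round(time.perf_counter() - start, 1)}

# The result record would then be reported to the Test Monitor.
print(run_job("sample_project", "default_load"))
```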

Jedox Web is also tested via a self-developed framework, which at its core uses the WebDriver API to simulate real-world usage of Jedox Web in a web browser. More than 10,000 test cases are executed daily on several execution environments, ranging from atomic functionality tests (for example, creating and using form elements, creating PDF documents, or executing tasks) to tests of complex business applications (such as the demo applications shipped with Jedox). Again, test results are reported in a web-based test monitor, and test execution events are reported in an automatic mailing sent to the development teams involved.
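As an example of an atomic functionality test, the following sketch uses the Selenium WebDriver bindings for Python to drive a browser through a login smoke test. The URL, form field names, and element IDs are illustrative assumptions about the Jedox Web UI, not its actual markup.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("http://localhost/ui/login/")  # hypothetical Jedox Web URL
    driver.find_element(By.NAME, "user").send_keys("admin")
    driver.find_element(By.NAME, "password").send_keys("secret")
    driver.find_element(By.ID, "login-button").click()
    # Wait until the main workbench appears, then verify the page title.
    WebDriverWait(driver, 30).until(
        EC.presence_of_element_located((By.ID, "workbench")))
    assert "Jedox" in driver.title
    print("login smoke test passed")
finally:
    driver.quit()
```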

The Jedox Excel Add-in is also tested automatically, in a framework that uses the Ranorex software tool for test execution. In addition, the Excel Add-in is tested in repeated cycles before each release, using pre-specified test plans for groups of functionality.

On top of regular automated test executions, specific manual tests are executed for out-of-line releases (e.g., patch releases), based on regression and pre-specified test plans. Finally, there is a formalized internal feedback loop that provides information on functionality and software defects from Jedox's own use of its software (“dogfooding”).

Features in preview

Some Jedox features are released "in preview". Features in preview have undergone testing and have passed quality thresholds, but they should not be used in production until they have reached "generally available" (GA) status. For features in preview, we target a maximum of two releases for maturation, i.e., a feature becomes GA at most two releases after its preview debut. Our intention is to get your feedback on these features before making them generally available and production-ready. In-preview features may undergo minor design changes during the maturation process.

For more information, see In-Preview Features in Jedox.

Updated June 7, 2024