The hexTraffic resource provides both a high-level overview and detailed views of the current and past states of a service provider’s network, and it serves a range of purposes.
On the technical side, it gives an overview of TC equipment, following occupancy and error rates of network nodes and connections. On the financial side, it tracks profitability and other financial performance indicators for internal as well as external services and cooperations. In marketing, it shows the effect of campaigns on traffic, with short- and long-term trends and responses. And data mining identifies new and hidden correlations in the data.
hexTraffic provides online access to the current state of the system and to short- and longer-term traffic trends. It serves as a monitor, an analysis tool and a report generator for checking invoices and comparing data with billing and cash-flow systems, and it provides billing data for interconnect billing.
Mediation has become a critical component of LCR systems, allowing individual applications, modules and components to communicate with each other.
Enter the hexLCR Mediation application, an adaptable, scalable and highly robust event record processor. The application is highly configurable: many aspects, from file formats to record-processing rules, can be easily modified, resulting in shorter implementation times and fast adaptation to operators’ evolving network infrastructures.
The system delivers the most value where multiple record formats must be normalized and custom-filtered to produce standardized feeds for other systems, such as billing, fraud and MIS.
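As a minimal sketch of what such normalization and filtering can look like, the following Python fragment decodes two hypothetical source formats into one common schema and applies a simple filter rule. The field names, formats and rule are illustrative assumptions, not hexLCR’s actual configuration:

```python
# Hypothetical decoders: each maps one source record format onto a
# common schema. Real mediation systems drive this from configuration.
def decode_switch_a(line):
    caller, callee, seconds = line.split(";")
    return {"caller": caller, "callee": callee, "duration_s": int(seconds)}

def decode_switch_b(line):
    caller, callee, tenths = line.split(",")
    # Format B reports call duration in tenths of a second.
    return {"caller": caller, "callee": callee, "duration_s": int(tenths) // 10}

DECODERS = {"A": decode_switch_a, "B": decode_switch_b}

def normalize(records):
    """Decode mixed-format records and drop zero-duration calls."""
    out = []
    for fmt, line in records:
        rec = DECODERS[fmt](line)
        if rec["duration_s"] > 0:   # example of a custom filter rule
            out.append(rec)
    return out

feed = normalize([("A", "1001;2002;30"), ("B", "1003,2004,0")])
```

Here the zero-duration record from format B is filtered out, and the remaining records form a single standardized feed regardless of their source format.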
The hexTraffic solution defines a multitude of aggregated tables that collect only the relevant, meaningful combinations of dimensions, producing small tables that can be accessed with simple ad-hoc queries.
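The idea of pre-aggregating only meaningful dimension combinations can be sketched as follows; the dimensions (day, route) and the toy CDR tuples are hypothetical, chosen only to show why such tables stay small and queryable:

```python
from collections import defaultdict

# Hypothetical raw CDR stream: (day, route, destination, duration_s).
cdrs = [
    ("2024-01-01", "intl", "DE", 120),
    ("2024-01-01", "intl", "FR", 60),
    ("2024-01-01", "local", "-", 30),
]

# One aggregated table per meaningful dimension combination; here,
# traffic per (day, route) -- far smaller than the raw CDR table.
per_day_route = defaultdict(lambda: {"calls": 0, "seconds": 0})
for day, route, dest, dur in cdrs:
    cell = per_day_route[(day, route)]
    cell["calls"] += 1
    cell["seconds"] += dur

# A simple ad-hoc query becomes a direct lookup:
intl = per_day_route[("2024-01-01", "intl")]
```

The lookup returns the two international calls totalling 180 seconds; queries against such a table never have to scan the raw CDRs.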
hexTraffic has been designed modularly, with easy configuration, automation and maximum functionality in mind. Its modules fall into three main functional areas, which can be adapted to the complexity of the existing IT infrastructure and to business requirements.
The data acquisition process primarily collects service usage CDRs (and other xDR files) and network element configuration files. Source data is collected by script-driven mediation processes, then passed through data decoding and finally inserted into the database and archived.
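The collect–decode–insert–archive sequence can be illustrated with a short Python sketch. The directory layout, file suffix and record format are assumptions made for the example, and the in-memory list stands in for the real database:

```python
import shutil
from pathlib import Path

def decode(line: str) -> dict:
    # Stub decoder for a hypothetical "caller,duration" record format.
    caller, duration = line.split(",")
    return {"caller": caller, "duration_s": int(duration)}

def acquire(incoming: Path, archive: Path, db: list):
    """Collect xDR files, decode each record, load into `db`, then archive.

    `db` is a plain list standing in for the real database insert.
    """
    archive.mkdir(parents=True, exist_ok=True)
    for f in sorted(incoming.glob("*.cdr")):
        for line in f.read_text().splitlines():
            db.append(decode(line))                     # decode + insert
        shutil.move(str(f), str(archive / f.name))      # archive afterwards
```

Archiving only after a file has been fully loaded keeps the pipeline restartable: any file still in the incoming directory simply has not been processed yet.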
Data processing is driven by a specially developed engine that executes scripts controlling a pool of modules, each performing a specific task: data validation, streaming and filtering, correlation, enrichment and basic aggregation of CDRs and other xDRs, plus aggregation of calculated KPIs, statistical parameters and more. The result is new, merged (processed) data and information used to build multidimensional data cubes (Data Domains).
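A script-driven engine of this kind can be sketched in a few lines: a "script" is an ordered list of module names, each mapped to a function over the record stream. The module names and their logic below are illustrative assumptions, not hexTraffic’s actual modules:

```python
# Hypothetical processing modules, each transforming a record stream.
def validate(recs):
    return [r for r in recs if "caller" in r and r.get("duration_s", -1) >= 0]

def enrich(recs):
    for r in recs:
        r["minutes"] = r["duration_s"] / 60   # derived KPI
    return recs

def aggregate(recs):
    return [{"total_minutes": sum(r["minutes"] for r in recs)}]

MODULES = {"validate": validate, "enrich": enrich, "aggregate": aggregate}

def run_script(script, recs):
    """Execute the named modules from the pool, in script order."""
    for name in script:
        recs = MODULES[name](recs)
    return recs

result = run_script(
    ["validate", "enrich", "aggregate"],
    [{"caller": "1001", "duration_s": 90},
     {"duration_s": 30}],                 # invalid: no caller field
)
```

The invalid record is dropped by validation, the survivor is enriched with a derived KPI, and aggregation emits the merged result that a data cube would be built from.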